Sample records for control QC measures

  1. A Total Quality-Control Plan with Right-Sized Statistical Quality-Control.

    PubMed

    Westgard, James O

    2017-03-01

    A new Clinical Laboratory Improvement Amendments option for risk-based quality-control (QC) plans became effective in January 2016. Called an Individualized QC Plan, this option requires the laboratory to perform a risk assessment, develop a QC plan, and implement a QC program to monitor ongoing performance of the QC plan. Difficulties in performing a risk assessment may limit the validity of an Individualized QC Plan. A better alternative is to develop a Total QC Plan including a right-sized statistical QC procedure to detect medically important errors. Westgard Sigma Rules provides a simple way to select the right control rules and the right number of control measurements. Copyright © 2016 Elsevier Inc. All rights reserved.
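
    As an illustration of the sigma-metric arithmetic behind Westgard Sigma Rules, the sketch below computes Sigma = (TEa − |bias|) / CV and maps it to a control-rule choice. The TEa, bias, and CV values and the exact thresholds are simplified assumptions for illustration, not figures from the paper.

    ```python
    # Hedged sketch: sigma-metric QC planning in the spirit of Westgard Sigma Rules.
    # TEa (allowable total error), bias, and CV are in percent; values are made up.

    def sigma_metric(tea: float, bias: float, cv: float) -> float:
        """Sigma = (TEa - |bias|) / CV."""
        return (tea - abs(bias)) / cv

    def suggest_sqc(sigma: float) -> str:
        """Simplified, assumed reading of a Sigma Rules-style selection chart."""
        if sigma >= 6:
            return "single 1:3s rule, N=2 control measurements"
        if sigma >= 5:
            return "1:3s/2:2s/R:4s multirule, N=2"
        if sigma >= 4:
            return "add 4:1s to the multirule, N=4"
        return "maximum multirule QC (N>=6) and consider method improvement"

    sigma = sigma_metric(tea=10.0, bias=1.5, cv=1.7)  # -> 5.0
    print(f"sigma = {sigma:.1f}: {suggest_sqc(sigma)}")
    ```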

  2. CHALLENGES IN SETTING UP QUALITY CONTROL IN DIAGNOSTIC RADIOLOGY FACILITIES IN NIGERIA.

    PubMed

    Inyang, S O; Egbe, N O; Ekpo, E

    2015-01-01

    The Nigerian Nuclear Regulatory Authority (NNRA) was established to regulate and control the use of radioactive and radiation-emitting sources in Nigeria. Quality control (QC) of diagnostic radiology equipment forms part of the fundamental requirements for the authorization of diagnostic radiology facilities in the country. Some quality control tests (output, exposure linearity and reproducibility) were measured on the x-ray machines in the facilities that took part in the study. A questionnaire was developed to evaluate the frequencies at which QC tests were conducted in the facilities and the challenges in setting up QC. Results show great variation in the values of the QC parameters measured. Inadequate cooperation by facility management, lack of QC equipment and insufficient staff form the major challenges in setting up QC in the facilities under study. The responses on the frequencies at which QC tests should be conducted did not correspond to the recommended standards, indicating that personnel were not familiar with QC implementation and may require further training on QC.
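
    The output, exposure linearity, and reproducibility tests mentioned above are usually scored with simple statistics. A minimal sketch follows; the measurements and the acceptance limits (0.10 for linearity, 0.05 for reproducibility) are common rules of thumb assumed for illustration, not values from this study.

    ```python
    import statistics

    # Hypothetical tube outputs (mGy/mAs) at successive mA stations.
    output_per_mas = [0.052, 0.054, 0.051, 0.053]
    # Hypothetical repeated exposures (mGy) at one fixed technique.
    repeats = [1.02, 1.00, 1.01, 0.99, 1.01]

    # Exposure linearity coefficient: (max - min) / (max + min).
    linearity = (max(output_per_mas) - min(output_per_mas)) / (
        max(output_per_mas) + min(output_per_mas))

    # Reproducibility: coefficient of variation of repeated exposures.
    cov = statistics.stdev(repeats) / statistics.mean(repeats)

    print(f"linearity = {linearity:.3f} (pass if <= 0.10), "
          f"COV = {cov:.3f} (pass if <= 0.05)")
    ```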

  3. QA/QC in the laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hood, F.C.

    1992-05-01

    Quality assurance and quality control (QA/QC) of analytical chemistry laboratory activities are essential to the validity and usefulness of resultant data. However, in themselves, conventional QA/QC measures will not always ensure that fraudulent data are not generated. Conventional QA/QC measures are based on the assumption that work will be done in good faith; to assure against fraudulent practices, QA/QC measures must be tailored to specific analysis protocols in anticipation of intentional misapplication of those protocols. Application of specific QA/QC measures to ensure against fraudulent practices results in an increased administrative burden being placed on the analytical process; accordingly, in keeping with graded QA philosophy, data quality objectives must be used to identify specific points of concern for special control to minimize the administrative impact.

  4. Analysis of quality control data of eight modern radiotherapy linear accelerators: the short- and long-term behaviours of the outputs and the reproducibility of quality control measurements

    NASA Astrophysics Data System (ADS)

    Kapanen, Mika; Tenhunen, Mikko; Hämäläinen, Tuomo; Sipilä, Petri; Parkkinen, Ritva; Järvinen, Hannu

    2006-07-01

    Quality control (QC) data of radiotherapy linear accelerators, collected by Helsinki University Central Hospital between the years 2000 and 2004, were analysed. The goal was to provide information for the evaluation and elaboration of QC of accelerator outputs and to propose a method for QC data analysis. Short- and long-term drifts in outputs were quantified by fitting empirical mathematical models to the QC measurements. Normally, long-term drifts were well modelled (to within ≤1%) by either a straight line or a single-exponential function. A drift of 2% occurred in 18 ± 12 months. The shortest drift times of only 2-3 months were observed for some new accelerators just after commissioning, but they stabilized during the first 2-3 years. The short-term reproducibility and the long-term stability of local constancy checks, carried out with a sealed plane-parallel ion chamber, were also estimated by fitting empirical models to the QC measurements. The reproducibility was 0.2-0.5% depending on the positioning practice of a device. Long-term instabilities of about 0.3%/month were observed for some checking devices. The reproducibility of local absorbed dose measurements was estimated to be about 0.5%. The proposed empirical model fitting of QC data facilitates the recognition of erroneous QC measurements and abnormal output behaviour, caused by malfunctions, offering a tool to improve dose control.
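
    A minimal sketch of the kind of empirical model fitting described, assuming a linear drift model and simulated monthly output data (the real study also used single-exponential fits):

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    rng = np.random.default_rng(1)
    t = np.arange(24)                                # months since baseline
    y = 0.11 * t + rng.normal(0, 0.15, t.size)       # simulated % output drift

    def linear(t, a, b):                             # straight-line drift model
        return a * t + b

    (a, b), _ = curve_fit(linear, t, y)
    months_to_2pct = (2.0 - b) / a                   # time until a 2% drift
    print(f"drift = {a:.2f} %/month; 2% reached after ~{months_to_2pct:.0f} months")
    ```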

  5. A Framework for a Quality Control System for Vendor/Processor Contracts.

    ERIC Educational Resources Information Center

    Advanced Technology, Inc., Reston, VA.

    A framework for monitoring quality control (QC) of processor contracts administered by the Department of Education's Office of Student Financial Assistance (OSFA) is presented and applied to the Pell Grant program. Guidelines for establishing QC measures and standards are included, and the uses of a sampling procedure in the QC system are…

  6. Interlaboratory quality control of total HIV-1 DNA load measurement for multicenter reservoir studies.

    PubMed

    Gantner, Pierre; Mélard, Adeline; Damond, Florence; Delaugerre, Constance; Dina, Julia; Gueudin, Marie; Maillard, Anne; Sauné, Karine; Rodallec, Audrey; Tuaillon, Edouard; Plantier, Jean-Christophe; Rouzioux, Christine; Avettand-Fenoel, Véronique

    2017-11-01

    Viral reservoirs represent an important barrier to HIV cure. Accurate markers of HIV reservoirs are needed to develop multicenter studies. The aim of this multicenter quality control (QC) was to evaluate the inter-laboratory reproducibility of total HIV-1-DNA quantification. Ten laboratories of the ANRS-AC11 working group participated by quantifying HIV-DNA with a real-time qPCR assay (Biocentric) in four samples (QCMD). Good reproducibility was found between laboratories (standard deviation ≤ 0.2 log10 copies/10⁶ PBMC) for the three positive QC that were correctly classified by each laboratory (QC1…
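
    The reproducibility criterion quoted above is simple to compute; a sketch with hypothetical per-laboratory results for one QC sample:

    ```python
    import numpy as np

    # One result per laboratory, log10 HIV-1 DNA copies per 10^6 PBMC (made up).
    labs = np.array([2.91, 3.05, 2.98, 3.10, 2.87, 3.02, 2.95, 3.08, 2.99, 3.01])

    sd = labs.std(ddof=1)   # inter-laboratory standard deviation
    print(f"inter-lab SD = {sd:.2f} log10 copies/10^6 PBMC; "
          f"reproducible: {sd <= 0.2}")
    ```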

  7. Levey-Jennings Analysis Uncovers Unsuspected Causes of Immunohistochemistry Stain Variability.

    PubMed

    Vani, Kodela; Sompuram, Seshi R; Naber, Stephen P; Goldsmith, Jeffrey D; Fulton, Regan; Bogen, Steven A

    Almost all clinical laboratory tests use objective, quantitative measures of quality control (QC), incorporating Levey-Jennings analysis and Westgard rules. Clinical immunohistochemistry (IHC) testing, in contrast, relies on subjective, qualitative QC review. The consequences of using Levey-Jennings analysis for QC assessment in clinical IHC testing are not known. To investigate this question, we conducted a 1- to 2-month pilot test wherein the QC for either human epidermal growth factor receptor 2 (HER-2) or progesterone receptor (PR) in 3 clinical IHC laboratories was quantified and analyzed with Levey-Jennings graphs. Moreover, conventional tissue controls were supplemented with a new QC comprised of HER-2 or PR peptide antigens coupled onto 8 μm glass beads. At institution 1, this more stringent analysis identified a decrease in the HER-2 tissue control that had escaped notice by subjective evaluation. The decrement was due to heterogeneity in the tissue control itself. At institution 2, we identified a 1-day sudden drop in the PR tissue control, also undetected by subjective evaluation, due to counterstain variability. At institution 3, a QC shift was identified, but only with 1 of 2 controls mounted on each slide. The QC shift was due to use of the instrument's selective reagent drop zones dispense feature. None of these events affected patient diagnoses. These case examples illustrate that subjective QC evaluation of tissue controls can detect gross assay failure but not subtle changes. The fact that QC issues arose from each site, and in only a pilot study, suggests that immunohistochemical stain variability may be an underappreciated problem.
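
    A minimal sketch of the Levey-Jennings/Westgard approach piloted here, applied to hypothetical stain-intensity readings of an on-slide control (only the 1:3s and 2:2s rules are shown):

    ```python
    import numpy as np

    def westgard_flags(values, mean, sd):
        """Flag QC values with the 1:3s rule (one value beyond +/-3 SD) and the
        2:2s rule (two consecutive values beyond the same +/-2 SD limit)."""
        z = (np.asarray(values, dtype=float) - mean) / sd
        flags = []
        for i, zi in enumerate(z):
            if abs(zi) > 3:
                flags.append((i, "1:3s"))
            if i and z[i - 1] > 2 and zi > 2:
                flags.append((i, "2:2s high"))
            if i and z[i - 1] < -2 and zi < -2:
                flags.append((i, "2:2s low"))
        return flags

    qc = [100, 98, 103, 101, 95, 93, 93, 107, 99, 88]   # hypothetical intensities
    print(westgard_flags(qc, mean=100, sd=3))           # [(6, '2:2s low'), (9, '1:3s')]
    ```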

  8. Data-quality measures for stakeholder-implemented watershed-monitoring programs

    USGS Publications Warehouse

    Greve, Adrienne I.

    2002-01-01

    Community-based watershed groups, many of which collect environmental data, have steadily increased in number over the last decade. The data generated by these programs are often underutilized due to uncertainty in the quality of data produced. The incorporation of data-quality measures into stakeholder monitoring programs lends statistical validity to data. Data-quality measures are divided into three steps: quality assurance, quality control, and quality assessment. The quality-assurance step attempts to control sources of error that cannot be directly quantified. This step is part of the design phase of a monitoring program and includes clearly defined, quantifiable objectives, sampling sites that meet the objectives, standardized protocols for sample collection, and standardized laboratory methods. Quality control (QC) is the collection of samples to assess the magnitude of error in a data set due to sampling, processing, transport, and analysis. In order to design a QC sampling program, a series of issues needs to be considered: (1) potential sources of error, (2) the type of QC samples, (3) inference space, (4) the number of QC samples, and (5) the distribution of the QC samples. Quality assessment is the process of evaluating quality-assurance measures and analyzing the QC data in order to interpret the environmental data. Quality assessment has two parts: one that is conducted on an ongoing basis as the monitoring program is running, and one that is conducted during the analysis of environmental data. The discussion of the data-quality measures is followed by an example of their application to a monitoring program in the Big Thompson River watershed of northern Colorado.
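
    Two of the quality-assessment computations implied above are routinely applied to QC samples; a sketch with hypothetical duplicate-pair and field-blank results:

    ```python
    def rpd(primary: float, duplicate: float) -> float:
        """Relative percent difference between a sample and its field duplicate."""
        return abs(primary - duplicate) / ((primary + duplicate) / 2) * 100

    # Hypothetical nitrate duplicate pairs (mg/L): sampling/processing error.
    pairs = [(1.20, 1.26), (0.85, 0.81), (2.10, 2.31)]
    print([round(rpd(a, b), 1) for a, b in pairs])      # [4.9, 4.8, 9.5]

    # Hypothetical field blanks (mg/L) checked against a reporting limit.
    blanks, limit = [0.02, 0.00, 0.05], 0.04
    print(sum(b > limit for b in blanks), "blank(s) above the reporting limit")
    ```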

  9. Quality Assurance and Control Considerations in Environmental Measurements and Monitoring

    NASA Astrophysics Data System (ADS)

    Sedlet, Jacob

    1982-06-01

    Quality assurance and quality control have become accepted as essential parts of all environmental surveillance, measurements, and monitoring programs, both nuclear and non-nuclear. The same principles and details apply to each. It is primarily the final measurement technique that differs. As the desire and need to measure smaller amounts of pollutants with greater accuracy have increased, it has been recognized that quality assurance and control programs are cost-effective in achieving the expected results. Quality assurance (QA) consists of all the actions necessary to provide confidence in the results. Quality control (QC) is a part of QA, and consists of those actions and activities that permit the control of the individual steps in the environmental program. The distinction between the two terms is not always clearly defined, but a sharp division is not necessary. The essential principle of QA and QC is a commitment to high quality results. The essential components of a QA and QC program are a complete, written procedures manual for all parts of the environmental program, the use of standard or validated procedures, participation in applicable interlaboratory comparison or QA programs, replicate analysis and measurement, training of personnel, and a means of auditing or checking that the QA and QC programs are properly conducted. These components are discussed below in some detail.

  10. Evaluation of Various Radar Data Quality Control Algorithms Based on Accumulated Radar Rainfall Statistics

    NASA Technical Reports Server (NTRS)

    Robinson, Michael; Steiner, Matthias; Wolff, David B.; Ferrier, Brad S.; Kessinger, Cathy; Einaudi, Franco (Technical Monitor)

    2000-01-01

    The primary function of the TRMM Ground Validation (GV) Program is to create GV rainfall products that provide basic validation of satellite-derived precipitation measurements for select primary sites. A fundamental and extremely important step in creating high-quality GV products is radar data quality control. Quality control (QC) processing of TRMM GV radar data is based on some automated procedures, but the current QC algorithm is not fully operational and requires significant human interaction to assure satisfactory results. Moreover, the TRMM GV QC algorithm, even with continuous manual tuning, still cannot completely remove all types of spurious echoes. In an attempt to improve the current operational radar data QC procedures of the TRMM GV effort, an intercomparison of several QC algorithms has been conducted. This presentation will demonstrate how various radar data QC algorithms affect accumulated radar rainfall products. In all, six different QC algorithms will be applied to two months of WSR-88D radar data from Melbourne, Florida. Daily, five-day, and monthly accumulated radar rainfall maps will be produced for each quality-controlled data set. The QC algorithms will be evaluated and compared based on their ability to remove spurious echoes without removing significant precipitation. Strengths and weaknesses of each algorithm will be assessed based on their ability to mitigate both erroneous additions and reductions in rainfall accumulation from spurious echo contamination and true precipitation removal, respectively. Contamination from individual spurious echo categories will be quantified to further diagnose the abilities of each radar QC algorithm. Finally, a cost-benefit analysis will be conducted to determine if a more automated QC algorithm is a viable alternative to the current, labor-intensive QC algorithm employed by TRMM GV.
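
    The evaluation criterion described, removing spurious echo while retaining true precipitation, can be scored per range bin once a reference classification exists. A sketch with simulated masks (the rates are invented, not results from this work):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 10_000                                    # simulated radar range bins
    is_precip = rng.random(n) < 0.3               # reference: true precipitation
    # Simulated QC keep/remove decision: keeps 98% of precip, 10% of clutter.
    kept = np.where(is_precip, rng.random(n) < 0.98, rng.random(n) < 0.10)

    removal = 1 - kept[~is_precip].mean()         # spurious echo removed
    retention = kept[is_precip].mean()            # true precipitation retained
    print(f"spurious removed: {removal:.1%}, precipitation retained: {retention:.1%}")
    ```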

  11. Introducing Quality Control in the Chemistry Teaching Laboratory Using Control Charts

    ERIC Educational Resources Information Center

    Schazmann, Benjamin; Regan, Fiona; Ross, Mary; Diamond, Dermot; Paull, Brett

    2009-01-01

    Quality control (QC) measures are less prevalent in teaching laboratories than commercial settings possibly owing to a lack of commercial incentives or teaching resources. This article focuses on the use of QC assessment in the analytical techniques of high performance liquid chromatography (HPLC) and ultraviolet-visible spectroscopy (UV-vis) at…

  12. Selecting Statistical Procedures for Quality Control Planning Based on Risk Management.

    PubMed

    Yago, Martín; Alcover, Silvia

    2016-07-01

    According to the traditional approach to statistical QC planning, the performance of QC procedures is assessed in terms of its probability of rejecting an analytical run that contains critical-size errors (PEDC). Recently, the maximum expected increase in the number of unacceptable patient results reported during the presence of an undetected out-of-control error condition [Max E(NUF)] has been proposed as an alternative QC performance measure because it is more related to the current introduction of risk management concepts for QC planning in the clinical laboratory. We used a statistical model to investigate the relationship between PEDC and Max E(NUF) for simple QC procedures widely used in clinical laboratories and to construct charts relating Max E(NUF) with the capability of the analytical process that allow for QC planning based on the risk of harm to a patient due to the report of erroneous results. A QC procedure shows nearly the same Max E(NUF) value when used for controlling analytical processes with the same capability, and there is a close relationship between PEDC and Max E(NUF) for simple QC procedures; therefore, the value of PEDC can be estimated from the value of Max E(NUF) and vice versa. QC procedures selected by their high PEDC value are also characterized by a low value for Max E(NUF). The PEDC value can be used for estimating the probability of patient harm, allowing for the selection of appropriate QC procedures in QC planning based on risk management. © 2016 American Association for Clinical Chemistry.
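
    As a sketch of the PEDC side of this relationship (not Parvin's full Max E(NUF) model), the power of a simple QC rule against the critical systematic shift can be computed directly; the rule, N, and the 1.65 z-term for a 5% maximum defect rate are standard assumptions:

    ```python
    from statistics import NormalDist

    phi = NormalDist().cdf

    def p_edc(delta_se: float, k: float = 3.0, n: int = 2) -> float:
        """P that a QC event (n controls, +/-k SD limits) detects a systematic
        shift of delta_se SDs: 1 - P(all controls within limits)."""
        p_in = phi(k - delta_se) - phi(-k - delta_se)
        return 1 - p_in ** n

    def critical_shift(sigma_metric: float) -> float:
        """Critical systematic error, in SDs, for a 5% maximum defect rate."""
        return sigma_metric - 1.65

    for sigma in (4, 5, 6):
        print(f"sigma={sigma}: PEDC = {p_edc(critical_shift(sigma)):.3f}")
    ```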

  13. Eight years of quality control in Bulgaria: impact on mammography practice.

    PubMed

    Avramova-Cholakova, S; Lilkov, G; Kaneva, M; Terziev, K; Nakov, I; Mutkurov, N; Kovacheva, D; Ivanova, M; Vasilev, D

    2015-07-01

    The requirements for quality control (QC) in diagnostic radiology were introduced in Bulgarian legislation in 2005. Hospital medical physicists and several private medical physics groups provide QC services to radiology departments. The aim of this study was to analyse data from QC tests in mammography and to investigate the impact of QC introduction on mammography practice in the country. The study was coordinated by the National Centre of Radiobiology and Radiation Protection. All medical physics services were requested to fill in standardised forms with information about the most important parameters routinely measured during QC. All QC service providers responded. Results demonstrated significant improvement of practice since the introduction of QC, with reduction of established deviations from 65 % during the first year to 7 % in the last year. Systems that did not meet the acceptability criteria were suspended from use. Performance of automatic exposure control and digital detectors is not regularly tested because of the absence of requirements in the legislation. The need for updated guidance and training of medical physicists to reflect the change in technology was demonstrated. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  14. 76 FR 67315 - Supplemental Nutrition Assistance Program: Quality Control Error Tolerance Threshold

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-01

    ...This direct final rule is amending the Quality Control (QC) review error threshold in our regulations from $25.00 to $50.00. The purpose for raising the QC error threshold is to make permanent the temporary threshold change that was required by the American Recovery and Reinvestment Act of 2009. This change does not have an impact on the public. The QC system measures the accuracy of the eligibility system for the Supplemental Nutrition Assistance Program (SNAP).

  15. Bulgarian experience in the establishment of reference dose levels and implementation of a quality control system in diagnostic radiology.

    PubMed

    Vassileva, J; Dimov, A; Slavchev, A; Karadjov, A

    2005-01-01

    Results from a Bulgarian patient dose survey in diagnostic radiology are presented. Reference levels for entrance surface dose (ESD) were 0.9 mGy for chest radiography (PA), 30 mGy for lumbar spine (Lat), 10 mGy for pelvis, 5 mGy for skull (AP), 3 mGy for skull (Lat) and 13 mGy for mammography. Quality control (QC) programmes were proposed for various areas of diagnostic radiology. Film processing QC warranted special attention. Proposed QC programmes included parameters to be tested, level of expertise needed and two action levels: remedial and suspension. Programmes were tested under clinical conditions to assess initial results and draw conclusions for further QC system development. On the basis of international experience, measurement protocols were developed for all parameters tested. QC equipment was provided as part of the PHARE project. A future problem for QC programme implementation may be the small number of medical physics experts in diagnostic radiology.

  16. Quality control quantification (QCQ): a tool to measure the value of quality control checks in radiation oncology.

    PubMed

    Ford, Eric C; Terezakis, Stephanie; Souranis, Annette; Harris, Kendra; Gay, Hiram; Mutic, Sasa

    2012-11-01

    To quantify the error-detection effectiveness of commonly used quality control (QC) measures. We analyzed incidents from 2007-2010 logged into voluntary in-house electronic incident learning systems at 2 academic radiation oncology clinics. None of the incidents resulted in patient harm. Each incident was graded for potential severity using the French Nuclear Safety Authority scoring scale; high potential severity incidents (score >3) were considered, along with a subset of 30 randomly chosen low severity incidents. Each report was evaluated to identify which of 15 common QC checks could have detected it. The effectiveness was calculated, defined as the percentage of incidents that each QC measure could detect, both for individual QC checks and for combinations of checks. In total, 4407 incidents were reported, 292 of which had high-potential severity. High- and low-severity incidents were detectable by 4.0 ± 2.3 (mean ± SD) and 2.6 ± 1.4 QC checks, respectively (P<.001). All individual checks were less than 50% sensitive with the exception of pretreatment plan review by a physicist (63%). An effectiveness of 97% was achieved with 7 checks used in combination and was not further improved with more checks. The combination of checks with the highest effectiveness includes physics plan review, physician plan review, Electronic Portal Imaging Device-based in vivo portal dosimetry, radiation therapist timeout, weekly physics chart check, the use of checklists, port films, and source-to-skin distance checks. Some commonly used QC checks such as pretreatment intensity modulated radiation therapy QA do not substantially add to the ability to detect errors in these data. The effectiveness of QC measures in radiation oncology depends sensitively on which checks are used and in which combinations. A small percentage of errors cannot be detected by any of the standard formal QC checks currently in broad use, suggesting that further improvements are needed. These data require confirmation with a broader incident-reporting database. Copyright © 2012 Elsevier Inc. All rights reserved.
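
    The combination analysis can be reproduced in miniature from an incident-by-check detection table: greedily add whichever check detects the most not-yet-covered incidents. The table below is invented for illustration; the paper's 15 checks and 4407 incidents would drop in the same way.

    ```python
    # check -> set of incident ids it could have detected (hypothetical data)
    detects = {
        "physics plan review":   {1, 2, 3, 5, 8},
        "physician plan review": {2, 4, 5, 9},
        "in vivo dosimetry":     {1, 6, 7},
        "therapist timeout":     {3, 7, 10},
    }
    incidents = set(range(1, 11))

    chosen, covered = [], set()
    while covered < incidents:
        best = max(detects, key=lambda c: len(detects[c] - covered))
        gain = detects[best] - covered
        if not gain:
            break             # remaining incidents are undetectable by any check
        chosen.append(best)
        covered |= gain

    print(chosen, f"effectiveness = {len(covered) / len(incidents):.0%}")
    ```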

  17. Planning Risk-Based SQC Schedules for Bracketed Operation of Continuous Production Analyzers.

    PubMed

    Westgard, James O; Bayat, Hassan; Westgard, Sten A

    2018-02-01

    To minimize patient risk, "bracketed" statistical quality control (SQC) is recommended in the new CLSI guidelines for SQC (C24-Ed4). Bracketed SQC requires that a QC event both precedes and follows (brackets) a group of patient samples. In optimizing a QC schedule, the frequency of QC or run size becomes an important planning consideration to maintain quality and also facilitate responsive reporting of results from continuous operation of high production analytic systems. Different plans for optimizing a bracketed SQC schedule were investigated on the basis of Parvin's model for patient risk and CLSI C24-Ed4's recommendations for establishing QC schedules. A Sigma-metric run size nomogram was used to evaluate different QC schedules for processes of different sigma performance. For high Sigma performance, an effective SQC approach is to employ a multistage QC procedure utilizing a "startup" design at the beginning of production and a "monitor" design periodically throughout production. Example QC schedules are illustrated for applications with measurement procedures having 6-σ, 5-σ, and 4-σ performance. Continuous production analyzers that demonstrate high σ performance can be effectively controlled with multistage SQC designs that employ a startup QC event followed by periodic monitoring or bracketing QC events. Such designs can be optimized to minimize the risk of harm to patients. © 2017 American Association for Clinical Chemistry.

  18. RNA-SeQC: RNA-seq metrics for quality control and process optimization.

    PubMed

    DeLuca, David S; Levin, Joshua Z; Sivachenko, Andrey; Fennell, Timothy; Nazaire, Marc-Danie; Williams, Chris; Reich, Michael; Winckler, Wendy; Getz, Gad

    2012-06-01

    RNA-seq, the application of next-generation sequencing to RNA, provides transcriptome-wide characterization of cellular activity. Assessment of sequencing performance and library quality is critical to the interpretation of RNA-seq data, yet few tools exist to address this issue. We introduce RNA-SeQC, a program which provides key measures of data quality. These metrics include yield, alignment and duplication rates; GC bias, rRNA content, regions of alignment (exon, intron and intragenic), continuity of coverage, 3'/5' bias and count of detectable transcripts, among others. The software provides multi-sample evaluation of library construction protocols, input materials and other experimental parameters. The modularity of the software enables pipeline integration and the routine monitoring of key measures of data quality such as the number of alignable reads, duplication rates and rRNA contamination. RNA-SeQC allows investigators to make informed decisions about sample inclusion in downstream analysis. In summary, RNA-SeQC provides quality control measures critical to experiment design, process optimization and downstream computational analysis. See www.genepattern.org to run online, or www.broadinstitute.org/rna-seqc/ for a command line tool.

  19. Rheological study of physical cross-linked quaternized cellulose hydrogels induced by β-glycerophosphate.

    PubMed

    You, Jun; Zhou, Jinping; Li, Qian; Zhang, Lina

    2012-03-20

    As a weak base, β-glycerophosphate (β-GP) was used to spontaneously initiate gelation of quaternized cellulose (QC) solutions at body temperature. The QC/β-GP solutions are flowable below or at room temperature but gel rapidly under physiological conditions. In order to clarify the sol-gel transition process of the QC/β-GP systems, the complex was investigated by dynamic viscoelastic measurements. The shear storage modulus (G') and loss modulus (G″) as a function of (1) concentration of β-GP (c(β-GP)), (2) concentration of QC (c(QC)), (3) degree of substitution (DS; i.e., the average number of substituted hydroxyl groups in the anhydroglucose unit) of QC, (4) viscosity-average molecular weight (M(η)) of QC, and (5) solvent medium were studied by the oscillatory rheology. The sol-gel transition temperature of QC/β-GP solutions decreased with an increase of c(QC) and c(β-GP), the M(η) of QC, and a decrease of the DS of QC and pH of the solvent. The sol-gel transition temperature and time could be easily controlled by adjusting the concentrations of QC and β-GP, M(η) and DS of QC, and the solvent medium. Gels formed after heating were irreversible; i.e., after cooling to lower temperature they could not be dissolved to become liquid again. The aggregation and entanglement of QC chains, electrostatic interaction, and hydrogen bonding between QC and β-GP were the main factors responsible for the irreversible sol-gel transition behavior of QC/β-GP systems.

  1. An introduction to statistical process control in research proteomics.

    PubMed

    Bramwell, David

    2013-12-16

    Statistical process control is a well-established and respected method which provides a general-purpose and consistent framework for monitoring and improving the quality of a process. It is routinely used in many industries where the quality of final products is critical and is often required in clinical diagnostic laboratories [1,2]. To date, the methodology has been little utilised in research proteomics. It has been shown to be capable of delivering quantitative QC procedures for qualitative clinical assays [3], making it an ideal methodology to apply to this area of biological research. To introduce statistical process control as an objective strategy for quality control and show how it could be used to benefit proteomics researchers and enhance the quality of the results they generate. We demonstrate that rules which provide basic quality control are easy to derive and implement and could have a major impact on data quality for many studies. Statistical process control is a powerful tool for investigating and improving proteomics research work-flows. The process of characterising measurement systems and defining control rules forces the exploration of key questions that can lead to significant improvements in performance. This work asserts that QC is essential to proteomics discovery experiments. Every experimenter must know the current capabilities of their measurement system and have an objective means for tracking and ensuring that performance. Proteomic analysis work-flows are complicated and multi-variate. QC is critical for clinical chemistry measurements and huge strides have been made in ensuring the quality and validity of results in clinical biochemistry labs. This work introduces some of these QC concepts and works to bridge their use from single analyte QC to applications in multi-analyte systems. This article is part of a Special Issue entitled: Standardization and Quality Control in Proteomics. Copyright © 2013 The Author. Published by Elsevier B.V. All rights reserved.

  2. From Field Notes to Data Portal - A Scalable Data QA/QC Framework for Tower Networks: Progress and Preliminary Results

    NASA Astrophysics Data System (ADS)

    Sturtevant, C.; Hackley, S.; Lee, R.; Holling, G.; Bonarrigo, S.

    2017-12-01

    Quality assurance and control (QA/QC) is one of the most important yet challenging aspects of producing research-quality data. Data quality issues are multi-faceted, including sensor malfunctions, unmet theoretical assumptions, and measurement interference from humans or the natural environment. Tower networks such as AmeriFlux, ICOS, and NEON continue to grow in size and sophistication, yet tools for robust, efficient, scalable QA/QC have lagged. Quality control remains a largely manual process heavily relying on visual inspection of data. In addition, notes of measurement interference are often recorded on paper without an explicit pathway to data flagging. As such, an increase in network size requires a near-proportional increase in personnel devoted to QA/QC, quickly stressing the human resources available. We present a scalable QA/QC framework in development for NEON that combines the efficiency and standardization of automated checks with the power and flexibility of human review. This framework includes fast-response monitoring of sensor health, a mobile application for electronically recording maintenance activities, traditional point-based automated quality flagging, and continuous monitoring of quality outcomes and longer-term holistic evaluations. This framework maintains the traceability of quality information along the entirety of the data generation pipeline, and explicitly links field reports of measurement interference to quality flagging. Preliminary results show that data quality can be effectively monitored and managed for a multitude of sites with a small group of QA/QC staff. Several components of this framework are open-source, including an R Shiny application for efficiently monitoring, synthesizing, and investigating data quality issues.

  3. Selecting Statistical Quality Control Procedures for Limiting the Impact of Increases in Analytical Random Error on Patient Safety.

    PubMed

    Yago, Martín

    2017-05-01

    QC planning based on risk management concepts can reduce the probability of harming patients due to an undetected out-of-control error condition. It does this by selecting appropriate QC procedures to decrease the number of erroneous results reported. The selection can be easily made by using published nomograms for simple QC rules when the out-of-control condition results in increased systematic error. However, increases in random error also occur frequently and are difficult to detect, which can result in erroneously reported patient results. A statistical model was used to construct charts for the 1ks and X̄/χ² rules. The charts relate the increase in the number of unacceptable patient results reported due to an increase in random error with the capability of the measurement procedure. They thus allow for QC planning based on the risk of patient harm due to the reporting of erroneous results. 1ks rules are simple, all-around rules. Their ability to deal with increases in within-run imprecision is minimally affected by the possible presence of significant, stable, between-run imprecision. X̄/χ² rules perform better when the number of controls analyzed during each QC event is increased to improve QC performance. Using nomograms simplifies the selection of statistical QC procedures to limit the number of erroneous patient results reported due to an increase in analytical random error. The selection largely depends on the presence or absence of stable between-run imprecision. © 2017 American Association for Clinical Chemistry.
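
    For the 1ks family the underlying probability is straightforward to sketch: if within-run imprecision inflates by a factor f, a control result falls outside ±k stable-SD limits with easily computed probability. The k, N, and f values below are assumptions for illustration; the X̄/χ² calculation needs the chi-square distribution and is not shown.

    ```python
    from statistics import NormalDist

    phi = NormalDist().cdf

    def p_reject(f: float, k: float = 2.5, n: int = 2) -> float:
        """P that a 1:ks QC event (n controls, limits at +/-k stable SDs)
        signals after random error inflates the SD by factor f."""
        p_in_one = phi(k / f) - phi(-k / f)
        return 1 - p_in_one ** n

    for f in (1.0, 1.5, 2.0, 3.0):
        print(f"f = {f}: P(reject) = {p_reject(f):.3f}")
    ```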

  4. Microbiological water methods: quality control measures for Federal Clean Water Act and Safe Drinking Water Act regulatory compliance.

    PubMed

    Root, Patsy; Hunt, Margo; Fjeld, Karla; Kundrat, Laurie

    2014-01-01

    Quality assurance (QA) and quality control (QC) data are required in order to have confidence in the results from analytical tests and the equipment used to produce those results. Some AOAC water methods include specific QA/QC procedures, frequencies, and acceptance criteria, but these are considered to be the minimum controls needed to perform a microbiological method successfully. Some regulatory programs, such as those at Code of Federal Regulations (CFR), Title 40, Part 136.7 for chemistry methods, require additional QA/QC measures beyond those listed in the method, which can also apply to microbiological methods. Essential QA/QC measures include sterility checks, reagent specificity and sensitivity checks, assessment of each analyst's capabilities, analysis of blind check samples, and evaluation of the presence of laboratory contamination and instrument calibration and checks. The details of these procedures, their performance frequency, and expected results are set out in this report as they apply to microbiological methods. The specific regulatory requirements of CFR Title 40 Part 136.7 for the Clean Water Act, the laboratory certification requirements of CFR Title 40 Part 141 for the Safe Drinking Water Act, and the International Organization for Standardization 17025 accreditation requirements under The NELAC Institute are also discussed.

  5. Quality Control (QC) System Development for the Pell Grant Program: A Conceptual Framework.

    ERIC Educational Resources Information Center

    Advanced Technology, Inc., Reston, VA.

    The objectives of the Pell Grant quality control (QC) system and the general definition of QC are considered. Attention is also directed to: the objectives of the Stage II Pell Grant QC system design and testing project, the approach used to develop the QC system, and the interface of the QC system and the Pell Grant delivery system. The…

  6. Statistical validation of reagent lot change in the clinical chemistry laboratory can confer insights on good clinical laboratory practice.

    PubMed

    Cho, Min-Chul; Kim, So Young; Jeong, Tae-Dong; Lee, Woochang; Chun, Sail; Min, Won-Ki

    2014-11-01

    Verification of a new reagent lot's suitability is necessary to ensure that results for patients' samples are consistent before and after reagent lot changes. A typical procedure is to measure results of some patients' samples along with quality control (QC) materials. In this study, the results of patients' samples and QC materials across reagent lot changes were analysed. In addition, a recommendation for adjusting the QC target range along with reagent lot changes was proposed. Patients' sample and QC material results of 360 reagent lot change events involving 61 analytes and eight instrument platforms were analysed. The between-lot differences for the patients' samples (ΔP) and the QC materials (ΔQC) were tested by Mann-Whitney U tests. The size of the between-lot differences in the QC data was calculated as multiples of standard deviation (SD). The ΔP and ΔQC values only differed significantly in 7.8% of the reagent lot change events. This frequency was not affected by the assay principle or the QC material source. One SD was proposed as the cutoff for maintaining the pre-existing target range after a reagent lot change. While non-commutable QC material results were infrequent in the present study, our data confirmed that QC materials have limited usefulness when assessing new reagent lots. Also, a 1 SD standard for establishing a new QC target range after a reagent lot change event was proposed. © The Author(s) 2014 Reprints and permissions: sagepub.co.uk/journalsPermissions.nav.
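
    A simplified sketch of the statistical machinery used: a Mann-Whitney U test across lots, plus the proposed 1-SD criterion for keeping the existing QC target range. The data and the assumed assay SD are hypothetical, and the paper actually compares patient-sample and QC-material between-lot differences (ΔP versus ΔQC) rather than raw lot-to-lot results.

    ```python
    import numpy as np
    from scipy.stats import mannwhitneyu

    # Hypothetical results for the same samples on the old and new reagent lots.
    old_lot = np.array([5.1, 4.8, 6.0, 5.5, 5.2, 4.9, 5.7, 5.4])
    new_lot = np.array([5.3, 5.0, 6.1, 5.6, 5.4, 5.0, 5.9, 5.5])

    stat, p = mannwhitneyu(old_lot, new_lot, alternative="two-sided")
    print(f"Mann-Whitney p = {p:.2f}")

    assay_sd = 0.25                                 # assumed stable assay SD
    shift_in_sd = abs(new_lot.mean() - old_lot.mean()) / assay_sd
    print("keep existing QC target range" if shift_in_sd <= 1
          else "establish a new QC target range")
    ```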

  7. Application of clinical assay quality control (QC) to multivariate proteomics data: a workflow exemplified by 2-DE QC.

    PubMed

    Jackson, David; Bramwell, David

    2013-12-16

    Proteomics technologies can be effective for the discovery and assay of protein forms altered with disease. However, few examples of successful biomarker discovery yet exist. Critical to addressing this is the widespread implementation of appropriate QC (quality control) methodology. Such QC should combine the rigour of clinical laboratory assays with a suitable treatment of the complexity of the proteome by targeting separate assignable causes of variation. We demonstrate an approach, metric and example workflow for users to develop such targeted QC rules systematically and objectively, using a publicly available plasma DIGE data set. Hierarchical clustering analysis of standard channels is first used to discover correlated groups of features corresponding to specific assignable sources of technical variation. These effects are then quantified using a statistical distance metric, and followed on control charts. This allows measurement of process drift and the detection of runs that outlie for any given effect. A known technical issue on originally rejected gels was detected validating this approach, and relevant novel effects were also detected and classified effectively. Our approach was effective for 2-DE QC. Whilst we demonstrated this in a retrospective DIGE experiment, the principles would apply to ongoing QC and other proteomic technologies. This work asserts that properly carried out QC is essential to proteomics discovery experiments. Its significance is that it provides one possible novel framework for applying such methods, with a particular consideration of how to handle the complexity of the proteome. It not only focusses on 2DE-based methodology but also demonstrates general principles. A combination of results and discussion based upon a publicly available data set is used to illustrate the approach and allows a structured discussion of factors that experimenters may wish to bear in mind in other situations. The demonstration is on retrospective data only for reasons of scope, but the principles applied are also important for ongoing QC, and this work serves as a step towards a later demonstration of that application. This article is part of a Special Issue entitled: Standardization and Quality Control in Proteomics. © 2013.

  8. ChronQC: a quality control monitoring system for clinical next generation sequencing.

    PubMed

    Tawari, Nilesh R; Seow, Justine Jia Wen; Perumal, Dharuman; Ow, Jack L; Ang, Shimin; Devasia, Arun George; Ng, Pauline C

    2018-05-15

    ChronQC is a quality control (QC) tracking system for clinical implementation of next-generation sequencing (NGS). ChronQC generates time series plots for various QC metrics to allow comparison of current runs to historical runs. ChronQC has multiple features for tracking QC data including Westgard rules for clinical validity, laboratory-defined thresholds and historical observations within a specified time period. Users can record their notes and corrective actions directly onto the plots for long-term recordkeeping. ChronQC facilitates regular monitoring of clinical NGS to enable adherence to high quality clinical standards. ChronQC is freely available on GitHub (https://github.com/nilesh-tawari/ChronQC), Docker (https://hub.docker.com/r/nileshtawari/chronqc/) and the Python Package Index. ChronQC is implemented in Python and runs on all common operating systems (Windows, Linux and Mac OS X). tawari.nilesh@gmail.com or pauline.c.ng@gmail.com. Supplementary data are available at Bioinformatics online.

  9. The importance of quality control in validating concentrations ...

    EPA Pesticide Factsheets

    A national-scale survey of 247 contaminants of emerging concern (CECs), including organic and inorganic chemical compounds, and microbial contaminants, was conducted in source and treated drinking water samples from 25 treatment plants across the United States. Multiple methods were used to determine these CECs, including six analytical methods to measure 174 pharmaceuticals, personal care products, and pesticides. A three-component quality assurance/quality control (QA/QC) program was designed for the subset of 174 CECs, which allowed us to assess and compare performances of the methods used. The three components included: 1) a common field QA/QC protocol and sample design, 2) individual investigator-developed method-specific QA/QC protocols, and 3) a suite of 46 method comparison analytes that were determined in two or more analytical methods. Overall method performance for the 174 organic chemical CECs was assessed by comparing spiked recoveries in reagent, source, and treated water over a two-year period. In addition to the 247 CECs reported in the larger drinking water study, another 48 pharmaceutical compounds measured did not consistently meet predetermined quality standards. Methodologies that did not seem suitable for these analytes are overviewed. The need to exclude analytes based on method performance demonstrates the importance of additional QA/QC protocols. This paper compares the method performance of six analytical methods used to measure 174 emerging…
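
    Spiked-recovery assessment of the kind described reduces to simple arithmetic; a sketch with hypothetical concentrations (the 70-130% window is a common acceptance range, not one stated in this work):

    ```python
    def percent_recovery(measured: float, background: float, spiked: float) -> float:
        """Recovery (%) = (measured - native background) / amount spiked * 100."""
        return (measured - background) / spiked * 100

    # One CEC in a spiked treated-water sample, ng/L (hypothetical values).
    rec = percent_recovery(measured=96.0, background=12.0, spiked=100.0)
    print(f"recovery = {rec:.0f}% (acceptable if within 70-130%)")
    ```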

  10. Quality control and assurance for validation of DOS/I measurements

    NASA Astrophysics Data System (ADS)

    Cerussi, Albert; Durkin, Amanda; Kwong, Richard; Quang, Timothy; Hill, Brian; Tromberg, Bruce J.; MacKinnon, Nick; Mantulin, William W.

    2010-02-01

    Ongoing multi-center clinical trials are crucial for Biophotonics to gain acceptance in medical imaging. In these trials, quality control (QC) and assurance (QA) are key to success and provide "data insurance". Quality control and assurance deal with standardization, validation, and compliance of procedures, materials and instrumentation. Specifically, QC/QA involves systematic assessment of testing materials, instrumentation performance, standard operating procedures, data logging, analysis, and reporting. QC and QA are important for FDA accreditation and acceptance by the clinical community. Our Biophotonics research in the Network for Translational Research in Optical Imaging (NTROI) program for breast cancer characterization focuses on QA/QC issues primarily related to the broadband Diffuse Optical Spectroscopy and Imaging (DOS/I) instrumentation, because this is an emerging technology with limited standardized QC/QA in place. In the multi-center trial environment, we implement QA/QC procedures: 1. Standardize and validate calibration standards and procedures. (DOS/I technology requires both frequency domain and spectral calibration procedures using tissue simulating phantoms and reflectance standards, respectively.) 2. Standardize and validate data acquisition, processing and visualization (optimize instrument software-EZDOS; centralize data processing) 3. Monitor, catalog and maintain instrument performance (document performance; modularize maintenance; integrate new technology) 4. Standardize and coordinate trial data entry (from individual sites) into centralized database 5. Monitor, audit and communicate all research procedures (database, teleconferences, training sessions) between participants ensuring "calibration". This manuscript describes our ongoing efforts, successes and challenges implementing these strategies.

  11. Proteomics Quality Control: Quality Control Software for MaxQuant Results.

    PubMed

    Bielow, Chris; Mastrobuoni, Guido; Kempa, Stefan

    2016-03-04

    Mass spectrometry-based proteomics coupled to liquid chromatography has matured into an automatized, high-throughput technology, producing data on the scale of multiple gigabytes per instrument per day. Consequently, an automated quality control (QC) and quality analysis (QA) capable of detecting measurement bias, verifying consistency, and avoiding propagation of error is paramount for instrument operators and scientists in charge of downstream analysis. We have developed an R-based QC pipeline called Proteomics Quality Control (PTXQC) for bottom-up LC-MS data generated by the MaxQuant software pipeline. PTXQC creates a QC report containing a comprehensive and powerful set of QC metrics, augmented with automated scoring functions. The automated scores are collated to create an overview heatmap at the beginning of the report, giving valuable guidance also to nonspecialists. Our software supports a wide range of experimental designs, including stable isotope labeling by amino acids in cell culture (SILAC), tandem mass tags (TMT), and label-free data. Furthermore, we introduce new metrics to score MaxQuant's Match-between-runs (MBR) functionality by which peptide identifications can be transferred across Raw files based on accurate retention time and m/z. Last but not least, PTXQC is easy to install and use and represents the first QC software capable of processing MaxQuant result tables. PTXQC is freely available at https://github.com/cbielow/PTXQC .

  12. SIMPATIQCO: a server-based software suite which facilitates monitoring the time course of LC-MS performance metrics on Orbitrap instruments.

    PubMed

    Pichler, Peter; Mazanek, Michael; Dusberger, Frederico; Weilnböck, Lisa; Huber, Christian G; Stingl, Christoph; Luider, Theo M; Straube, Werner L; Köcher, Thomas; Mechtler, Karl

    2012-11-02

    While the performance of liquid chromatography (LC) and mass spectrometry (MS) instrumentation continues to increase, applications such as analyses of complete or near-complete proteomes and quantitative studies require constant and optimal system performance. For this reason, research laboratories and core facilities alike are recommended to implement quality control (QC) measures as part of their routine workflows. Many laboratories perform sporadic quality control checks. However, successive and systematic longitudinal monitoring of system performance would be facilitated by dedicated automatic or semiautomatic software solutions that aid an effortless analysis and display of QC metrics over time. We present the software package SIMPATIQCO (SIMPle AuTomatIc Quality COntrol) designed for evaluation of data from LTQ Orbitrap, Q-Exactive, LTQ FT, and LTQ instruments. A centralized SIMPATIQCO server can process QC data from multiple instruments. The software calculates QC metrics supervising every step of data acquisition from LC and electrospray to MS. For each QC metric the software learns the range indicating adequate system performance from the uploaded data using robust statistics. Results are stored in a database and can be displayed in a comfortable manner from any computer in the laboratory via a web browser. QC data can be monitored for individual LC runs as well as plotted over time. SIMPATIQCO thus assists the longitudinal monitoring of important QC metrics such as peptide elution times, peak widths, intensities, total ion current (TIC) as well as sensitivity, and overall LC-MS system performance; in this way the software also helps identify potential problems. The SIMPATIQCO software package is available free of charge.
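
    The "learn an adequate range with robust statistics" idea can be sketched as a median ± k·MAD band per metric; this illustrates the concept only, not SIMPATIQCO's actual code, and the data and k = 3 are assumptions.

    ```python
    import numpy as np

    # Hypothetical historical values of one QC metric (median peak width, s).
    history = np.array([14.2, 13.8, 14.5, 14.0, 14.3, 13.9, 22.0, 14.1])

    med = np.median(history)
    mad = np.median(np.abs(history - med))    # robust spread estimate
    half_width = 3 * 1.4826 * mad             # 1.4826: MAD -> SD for normal data
    lo, hi = med - half_width, med + half_width

    new_run = 16.9
    print(f"adequate range = [{lo:.1f}, {hi:.1f}] s; new run OK: {lo <= new_run <= hi}")
    ```

    Note how the single aberrant historical run (22.0 s) barely widens the band; that insensitivity to past outliers is the point of using the median and MAD rather than the mean and SD.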

  13. The Development of Quality Control Genotyping Approaches: A Case Study Using Elite Maize Lines.

    PubMed

    Chen, Jiafa; Zavala, Cristian; Ortega, Noemi; Petroli, Cesar; Franco, Jorge; Burgueño, Juan; Costich, Denise E; Hearne, Sarah J

    2016-01-01

    Quality control (QC) of germplasm identity and purity is a critical component of breeding and conservation activities. SNP genotyping technologies and increased availability of markers provide the opportunity to employ genotyping as a low-cost and robust component of this QC. In the public sector, available low-cost SNP QC genotyping methods have been developed from a very limited panel of 1,000 to 1,500 markers without broad selection of the most informative SNPs. Selection of optimal SNPs and definition of appropriate germplasm sampling, in addition to platform selection, impact logistical and resource-use considerations for breeding and conservation applications when mainstreaming QC. In order to address these issues, we evaluated the selection and use of SNPs for QC applications from large DArTSeq data sets generated from CIMMYT maize inbred lines (CMLs). Two QC genotyping strategies were developed: the first is a "rapid QC", employing a small number of SNPs to identify potential mislabeling of seed packages or plots; the second is a "broad QC", employing a larger number of SNPs, used to identify each germplasm entry and to measure heterogeneity. The optimal marker selection strategies combined the selection of markers with high minor allele frequency, sampling of clustered SNPs in proportion to marker cluster distance and selecting markers that maintain a uniform genomic distribution. The rapid and broad QC SNP panels selected using this approach were further validated using blind test assessments of related re-generation samples. The influence of sampling within each line was evaluated. Sampling 192 individuals would result in close to a 100% probability of detecting a 5% contamination in the entry, and approximately a 98% probability of detecting a 2% contamination of the line. These results provide a framework for the establishment of QC genotyping. A comparison of financial and time costs for use of these approaches across different platforms is discussed, providing a framework for institutions involved in maize conservation and breeding to assess the resource-use effectiveness of QC genotyping. Application of these research findings, in combination with existing QC approaches, will ensure the regeneration, distribution and use in breeding of true-to-type inbred germplasm. These findings also provide an effective approach to optimize SNP selection for QC genotyping in other species.
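
    The quoted detection probabilities follow from the standard binomial argument; a short sketch that reproduces them:

    ```python
    # P(detect) = 1 - (1 - c)^n for contamination frequency c and sample size n.
    def p_detect(n: int, c: float) -> float:
        return 1 - (1 - c) ** n

    print(f"{p_detect(192, 0.05):.4%}")   # ~99.99%: 5% contamination, 192 plants
    print(f"{p_detect(192, 0.02):.1%}")   # ~97.9%: 2% contamination
    ```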

  14. Maintaining High Quality Data and Consistency Across a Diverse Flux Network: The Ameriflux QA/QC Technical Team

    NASA Astrophysics Data System (ADS)

    Chan, S.; Billesbach, D. P.; Hanson, C. V.; Biraud, S.

    2014-12-01

    The AmeriFlux quality assurance and quality control (QA/QC) technical team conducts short-term (<2 weeks) intercomparisons using a portable eddy covariance system (PECS) to maintain high-quality data observations and data consistency across the AmeriFlux network (http://ameriflux.lbl.gov/). Site intercomparisons identify discrepancies between the in situ and portable measurements and calculated fluxes. Findings are jointly discussed by the site staff and the QA/QC team to improve the in situ observations. Despite the relatively short duration of an individual site intercomparison, the accumulated record of all site visits (numbering over 100 since 2002) is a unique dataset. The ability to deploy redundant sensors provides a rare opportunity to identify, quantify, and understand uncertainties in eddy covariance and ancillary measurements. We present a few specific case studies from QA/QC site visits to highlight and share new and relevant findings related to eddy covariance instrumentation and operation.

  17. The Quality Control Algorithms Used in the Creation of NASA Kennedy Space Center Lightning Protection System Towers Meteorological Database

    NASA Technical Reports Server (NTRS)

    Orcutt, John M.; Brenton, James C.

    2016-01-01

    An accurate database of meteorological data is essential for designing any aerospace vehicle and for preparing launch commit criteria. Meteorological instrumentation was recently placed on the three Lightning Protection System (LPS) towers at Kennedy Space Center (KSC) launch complex 39B (LC-39B), providing a unique meteorological dataset at the launch complex over an extensive altitude range. Data records of temperature, dew point, relative humidity, wind speed, and wind direction are produced at 40, 78, 116, and 139 m at each tower. The Marshall Space Flight Center Natural Environments Branch (EV44) received an archive consisting of one-minute averaged measurements for the period of record of January 2011 - April 2015. However, before the received database could be used, EV44 needed to remove any erroneous data through a comprehensive quality control (QC) process. The QC process applied to the LPS towers' meteorological data is similar to other QC processes developed by EV44, which were used in the creation of meteorological databases for other towers at KSC, but it has been modified specifically for use with the LPS tower database. The QC process first includes a check of each individual sensor, removing any unrealistic data and checking the temporal consistency of each variable. Next, data from all three sensors at each height are checked against each other, checked against climatology, and checked for sensors that erroneously report a constant value. Then, a vertical consistency check of each variable at each tower is completed. Last, the upwind sensor at each level is selected to minimize the influence of the towers and other structures at LC-39B on the measurements; the selection process for the upwind sensor implemented a study of tower-induced turbulence. This paper describes in detail the QC process, QC results, and the attributes of the LPS towers meteorological database.
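
    The first two steps, a gross-limit check and a temporal-consistency check, can be sketched in a few lines; the limits and step threshold below are hypothetical placeholders, not EV44's values:

```python
import numpy as np

def qc_tower_variable(values, lo, hi, max_step):
    """Flag unrealistic values and temporal-consistency failures.

    values   : 1-D array of one-minute averages for a single sensor
    lo, hi   : physically plausible limits for the variable
    max_step : largest allowed minute-to-minute change
    Returns a boolean mask, True where the sample is suspect.
    """
    values = np.asarray(values, dtype=float)
    bad = (values < lo) | (values > hi)              # gross-limit check
    step = np.abs(np.diff(values, prepend=values[0]))
    bad |= step > max_step                           # temporal consistency
    return bad

# Example: temperature in deg C with hypothetical limits
temps = np.array([24.1, 24.2, 24.2, 39.9, 24.3, 24.4])
print(qc_tower_variable(temps, lo=-10, hi=45, max_step=5.0))
# [False False False  True  True False]
```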

  18. Comparison of Different Matrices as Potential Quality Control Samples for Neurochemical Dementia Diagnostics.

    PubMed

    Lelental, Natalia; Brandner, Sebastian; Kofanova, Olga; Blennow, Kaj; Zetterberg, Henrik; Andreasson, Ulf; Engelborghs, Sebastiaan; Mroczko, Barbara; Gabryelewicz, Tomasz; Teunissen, Charlotte; Mollenhauer, Brit; Parnetti, Lucilla; Chiasserini, Davide; Molinuevo, Jose Luis; Perret-Liaudet, Armand; Verbeek, Marcel M; Andreasen, Niels; Brosseron, Frederic; Bahl, Justyna M C; Herukka, Sanna-Kaisa; Hausner, Lucrezia; Frölich, Lutz; Labonte, Anne; Poirier, Judes; Miller, Anne-Marie; Zilka, Norbert; Kovacech, Branislav; Urbani, Andrea; Suardi, Silvia; Oliveira, Catarina; Baldeiras, Ines; Dubois, Bruno; Rot, Uros; Lehmann, Sylvain; Skinningsrud, Anders; Betsou, Fay; Wiltfang, Jens; Gkatzima, Olymbia; Winblad, Bengt; Buchfelder, Michael; Kornhuber, Johannes; Lewczuk, Piotr

    2016-03-01

    Assay-vendor independent quality control (QC) samples for neurochemical dementia diagnostics (NDD) biomarkers are so far commercially unavailable. This requires that NDD laboratories prepare their own QC samples, for example by pooling leftover cerebrospinal fluid (CSF) samples. Our objective was to prepare and test alternative matrices for QC samples that could facilitate intra- and inter-laboratory QC of the NDD biomarkers. Three matrices were validated in this study: (A) human pooled CSF, (B) Aβ peptides spiked into human prediluted plasma, and (C) Aβ peptides spiked into a solution of bovine serum albumin in phosphate-buffered saline. All matrices were also tested after supplementation with an antibacterial agent (sodium azide). We analyzed short- and long-term stability of the biomarkers with ELISA and chemiluminescence (Fujirebio Europe, MSD, IBL International), and performed an inter-laboratory variability study. NDD biomarkers turned out to be stable in almost all samples stored at the tested conditions for up to 14 days, as well as in samples stored deep-frozen (at -80°C) for up to one year. Sodium azide did not influence biomarker stability. Inter-center variability of the samples sent at room temperature (pooled CSF, freeze-dried CSF, and four artificial matrices) was comparable to the results obtained on deep-frozen samples in other large-scale projects. Our results suggest that it is possible to replace self-made, CSF-based QC samples with large-scale volumes of QC materials prepared with artificial peptides and matrices. This would greatly facilitate intra- and inter-laboratory QC schedules for NDD measurements.

  19. Quality Control for Scoring Tests Administered in Continuous Mode: An NCME Instructional Module

    ERIC Educational Resources Information Center

    Allalouf, Avi; Gutentag, Tony; Baumer, Michal

    2017-01-01

    Quality control (QC) in testing is paramount. QC procedures for tests can be divided into two types. The first type, one that has been well researched, is QC for tests administered to large population groups on few administration dates using a small set of test forms (e.g., large-scale assessment). The second type is QC for tests, usually…

  20. Impact of dose calibrators quality control programme in Argentina

    NASA Astrophysics Data System (ADS)

    Furnari, J. C.; de Cabrejas, M. L.; del C. Rotta, M.; Iglicki, F. A.; Milá, M. I.; Magnavacca, C.; Dima, J. C.; Rodríguez Pasqués, R. H.

    1992-02-01

    The national Quality Control (QC) programme for radionuclide calibrators started 12 years ago. Accuracy and the implementation of a QC programme were evaluated over all these years at 95 nuclear medicine laboratories where dose calibrators were in use. During all that time, the Metrology Group of CNEA has distributed 137Cs sealed sources to check stability and has been performing periodic "checking rounds" and postal surveys using unknown samples (external quality control). An account of the results of both methods is presented. At present, more than 65% of the dose calibrators measure activities with an error of less than 10%.

  1. From field notes to data portal - An operational QA/QC framework for tower networks

    NASA Astrophysics Data System (ADS)

    Sturtevant, C.; Hackley, S.; Meehan, T.; Roberti, J. A.; Holling, G.; Bonarrigo, S.

    2016-12-01

    Quality assurance and control (QA/QC) is one of the most important yet challenging aspects of producing research-quality data. This is especially so for environmental sensor networks collecting numerous high-frequency measurement streams at distributed sites. Here, the quality issues are multi-faceted, including sensor malfunctions, unmet theoretical assumptions, and measurement interference from the natural environment. To complicate matters, there are often multiple personnel managing different sites or different steps in the data flow. For large, centrally managed sensor networks such as NEON, the separation of field and processing duties is extreme. Tower networks such as AmeriFlux, ICOS, and NEON continue to grow in size and sophistication, yet tools for robust, efficient, scalable QA/QC have lagged. Quality control remains a largely manual process relying on visual inspection of the data. In addition, notes of observed measurement interference or visible problems are often recorded on paper without an explicit pathway to data flagging during processing. As such, an increase in network size requires a near-proportional increase in personnel devoted to QA/QC, quickly stressing the human resources available. There is a need for a scalable, operational QA/QC framework that combines the efficiency and standardization of automated tests with the power and flexibility of visual checks, and that includes an efficient communication pathway from field personnel to data processors to end users. Here we propose such a framework and an accompanying set of tools in development, including a mobile application template for recording tower maintenance and an R/Shiny application for efficiently monitoring and synthesizing data quality issues. This framework seeks to incorporate lessons learned from the AmeriFlux community and provide tools to aid continued network advancements.

  2. jqcML: an open-source java API for mass spectrometry quality control data in the qcML format.

    PubMed

    Bittremieux, Wout; Kelchtermans, Pieter; Valkenborg, Dirk; Martens, Lennart; Laukens, Kris

    2014-07-03

    The awareness that systematic quality control is an essential factor to enable the growth of proteomics into a mature analytical discipline has increased over the past few years. To this aim, a controlled vocabulary and document structure have recently been proposed by Walzer et al. to store and disseminate quality-control metrics for mass-spectrometry-based proteomics experiments, called qcML. To facilitate the adoption of this standardized quality control routine, we introduce jqcML, a Java application programming interface (API) for the qcML data format. First, jqcML provides a complete object model to represent qcML data. Second, jqcML provides the ability to read, write, and work in a uniform manner with qcML data from different sources, including the XML-based qcML file format and the relational database qcDB. Interaction with the XML-based file format is obtained through the Java Architecture for XML Binding (JAXB), while generic database functionality is obtained by the Java Persistence API (JPA). jqcML is released as open-source software under the permissive Apache 2.0 license and can be downloaded from https://bitbucket.org/proteinspector/jqcml .

  3. Use of Six Sigma Worksheets for assessment of internal and external failure costs associated with candidate quality control rules for an ADVIA 120 hematology analyzer.

    PubMed

    Cian, Francesco; Villiers, Elisabeth; Archer, Joy; Pitorri, Francesca; Freeman, Kathleen

    2014-06-01

    Quality control (QC) validation is an essential tool in the total quality management of a veterinary clinical pathology laboratory. Cost analysis can be a valuable technique to help identify an appropriate QC procedure for the laboratory, although this has never been reported in veterinary medicine. The aim of this study was to determine the applicability of the Six Sigma Quality Cost Worksheets to the evaluation of possible candidate QC rules identified by QC validation. Three months of internal QC records were analyzed. EZ Rules 3 software was used to evaluate candidate QC procedures, and the costs associated with the application of different QC rules were calculated using the Six Sigma Quality Cost Worksheets. The costs associated with the current and the candidate QC rules were compared, and the amount of cost savings was calculated. There was a significant saving when the candidate 1-2.5s, n = 3 rule was applied instead of the currently utilized 1-2s, n = 3 rule. The savings were 75% per year (£ 8232.5) based on re-evaluating all of the patient samples in addition to the controls, and 72% per year (£ 822.4) based on re-analyzing only the control materials. The savings were also shown to change accordingly with the number of samples analyzed and with the number of daily QC procedures performed. These calculations demonstrate the importance of selecting an appropriate QC procedure, and the usefulness of the Six Sigma Cost Worksheets in determining the most cost-effective rule(s) when several candidate rules are identified by QC validation. © 2014 American Society for Veterinary Clinical Pathology and European Society for Veterinary Clinical Pathology.
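
    Much of the cost difference between a 1-2s and a 1-2.5s rule comes from their false-rejection rates, which can be computed directly under standard Gaussian assumptions. A minimal sketch (rule notation as above; the function is ours):

```python
from scipy.stats import norm

def p_false_reject(limit_sd: float, n: int) -> float:
    """Probability that a run of n in-control observations violates a
    1-ks rule (reject if any control exceeds +/- limit_sd SD)."""
    p_single = 2 * norm.sf(limit_sd)   # P(|z| > limit) for one control
    return 1 - (1 - p_single) ** n

for k in (2.0, 2.5):
    print(f"1-{k}s, n=3: P(false reject) = {p_false_reject(k, 3):.3f}")
# 1-2.0s, n=3: P(false reject) ~ 0.130
# 1-2.5s, n=3: P(false reject) ~ 0.037
```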

  4. Quality Control Algorithms for the Kennedy Space Center 50-Megahertz Doppler Radar Wind Profiler Winds Database

    NASA Technical Reports Server (NTRS)

    Barbre, Robert E., Jr.

    2012-01-01

    This paper presents the process used by the Marshall Space Flight Center Natural Environments Branch (EV44) to quality control (QC) data from the Kennedy Space Center's 50-MHz Doppler Radar Wind Profiler (DRWP) for use in vehicle wind loads and steering commands. The database has been built to mitigate limitations of the currently archived databases from weather balloons. The DRWP database contains wind measurements from approximately 2.7-18.6 km altitude at roughly five-minute intervals for the August 1997 to December 2009 period of record, and the extensive QC process was designed to remove spurious data arising from various atmospheric and non-atmospheric artifacts. The QC process is largely based on DRWP literature, but two new algorithms have been developed to remove data contaminated by convection and by excessive first-guess propagations from the Median Filter First Guess Algorithm. In addition to describing the automated and manual QC process in detail, this paper describes the extent of the data retained. Roughly 58% of all possible wind observations exist in the database, with approximately 100 times as many complete profile sets existing relative to the EV44 balloon databases. This increased sample of near-continuous wind profile measurements may help increase launch availability by reducing the uncertainty of wind changes during launch countdown.
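
    The median-filter first-guess idea, comparing each observation against a running median and rejecting large departures, can be sketched as follows (the window size and rejection threshold are hypothetical, not the DRWP algorithm's tuned values):

```python
import numpy as np

def flag_against_first_guess(profile, window=5, max_dev=8.0):
    """Flag wind observations that deviate strongly from a running-median
    first guess, loosely mirroring a median-filter-based QC step.

    profile : 1-D array of wind components (m/s) vs. altitude
    max_dev : hypothetical rejection threshold in m/s
    """
    profile = np.asarray(profile, dtype=float)
    half = window // 2
    padded = np.pad(profile, half, mode="edge")
    first_guess = np.array([np.median(padded[i:i + window])
                            for i in range(profile.size)])
    return np.abs(profile - first_guess) > max_dev

u = np.array([5.0, 5.5, 6.0, 30.0, 6.5, 7.0, 7.2])
print(flag_against_first_guess(u))  # only the 30 m/s spike is flagged
```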

  5. Quality Control Practices for Chemistry and Immunochemistry in a Cohort of 21 Large Academic Medical Centers.

    PubMed

    Rosenbaum, Matthew W; Flood, James G; Melanson, Stacy E F; Baumann, Nikola A; Marzinke, Mark A; Rai, Alex J; Hayden, Joshua; Wu, Alan H B; Ladror, Megan; Lifshitz, Mark S; Scott, Mitchell G; Peck-Palmer, Octavia M; Bowen, Raffick; Babic, Nikolina; Sobhani, Kimia; Giacherio, Donald; Bocsi, Gregary T; Herman, Daniel S; Wang, Ping; Toffaletti, John; Handel, Elizabeth; Kelly, Kathleen A; Albeiroti, Sami; Wang, Sihe; Zimmer, Melissa; Driver, Brandon; Yi, Xin; Wilburn, Clayton; Lewandrowski, Kent B

    2018-05-29

    In the United States, minimum standards for quality control (QC) are specified in federal law under the Clinical Laboratory Improvement Amendments and their revisions. Beyond meeting this required standard, laboratories have flexibility to determine their overall QC program. We surveyed chemistry and immunochemistry QC procedures at 21 clinical laboratories within leading academic medical centers to assess whether standardized QC practices exist for chemistry and immunochemistry testing. We observed significant variation and unexpected similarities in practice across laboratories, including in QC frequency, cutoffs, number of levels analyzed, and other features. This variation in practice indicates that an opportunity exists to establish an evidence-based approach to QC that can be generalized across institutions.

  6. New insight into the comparative power of quality-control rules that use control observations within a single analytical run.

    PubMed

    Parvin, C A

    1993-03-01

    The error detection characteristics of quality-control (QC) rules that use control observations within a single analytical run are investigated. Unlike the evaluation of QC rules that span multiple analytical runs, most of the fundamental results regarding the performance of QC rules applied within a single analytical run can be obtained from statistical theory, without the need for simulation studies. The case of two control observations per run is investigated for ease of graphical display, but the conclusions can be extended to more than two control observations per run. Results are summarized in a graphical format that offers many interesting insights into the relations among the various QC rules. The graphs provide heuristic support to the theoretical conclusions that no QC rule is best under all error conditions, but the multirule that combines the mean rule and a within-run standard deviation rule offers an attractive compromise.
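
    As the abstract notes, within-run rule performance follows from statistical theory rather than simulation. For example, the power of a mean rule against a systematic error of Δ SDs with n control observations is a closed-form Gaussian expression; a sketch (the 1% false-rejection rate is an illustrative choice):

```python
from scipy.stats import norm

def power_mean_rule(shift_sd: float, n: int, alpha: float = 0.01) -> float:
    """Power of a within-run mean rule with n control observations.

    Reject when |sqrt(n) * mean(z)| > c, with c set for a false-rejection
    rate alpha; shift_sd is the systematic error in SD units.
    """
    c = norm.isf(alpha / 2)
    shifted = shift_sd * n ** 0.5
    return norm.sf(c - shifted) + norm.cdf(-c - shifted)

for delta in (1.0, 2.0, 3.0):
    print(f"SE = {delta} SD: power = {power_mean_rule(delta, n=2):.2f}")
# power rises from ~0.12 at 1 SD to ~0.95 at 3 SD
```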

  7. Quality control for federal clean water act and safe drinking water act regulatory compliance.

    PubMed

    Askew, Ed

    2013-01-01

    QC sample results are required in order to have confidence in the results from analytical tests. Some of the AOAC water methods include specific QC procedures, frequencies, and acceptance criteria. These are considered to be the minimum controls needed to perform the method successfully. Some regulatory programs, such as those in 40 CFR Part 136.7, require additional QC or have alternative acceptance limits. Essential QC measures include method calibration, reagent standardization, assessment of each analyst's capabilities, analysis of blind check samples, determination of the method's sensitivity (method detection limit or quantification limit), and daily evaluation of bias, precision, and the presence of laboratory contamination or other analytical interference. The details of these procedures, their performance frequency, and expected ranges of results are set out in this manuscript. The specific regulatory requirements of 40 CFR Part 136.7 for the Clean Water Act, the laboratory certification requirements of 40 CFR Part 141 for the Safe Drinking Water Act, and the ISO 17025 accreditation requirements under The NELAC Institute are listed.
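
    The classic single-laboratory method detection limit is the Student's t quantile (99%, n-1 degrees of freedom) times the standard deviation of at least seven low-level spiked replicates. A sketch with hypothetical replicate data:

```python
import statistics
from scipy.stats import t

def method_detection_limit(replicates):
    """Classic single-laboratory MDL: Student's t (99%, n-1 df) times the
    standard deviation of low-level spiked replicates (>= 7 replicates)."""
    n = len(replicates)
    s = statistics.stdev(replicates)
    return t.ppf(0.99, n - 1) * s

spikes = [0.051, 0.048, 0.053, 0.050, 0.047, 0.052, 0.049]  # mg/L, hypothetical
print(f"MDL = {method_detection_limit(spikes):.4f} mg/L")
```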

  8. Quality controls for wind measurement of a 1290-MHz boundary layer profiler under strong wind conditions.

    PubMed

    Liu, Zhao; Zheng, Chaorong; Wu, Yue

    2017-09-01

    Wind profilers have been widely adopted to observe wind field information in the atmosphere for different purposes, but the accuracy of their observations is limited by various noise sources and disturbances and hence needs further improvement. In this paper, data measured under strong wind conditions using a 1290-MHz boundary layer profiler (BLP) are quality controlled via a composite quality control (QC) procedure proposed by the authors. Then, through comparison with data measured by radiosonde flights (balloon observations), the critical thresholds in the composite QC procedure, including the consensus average threshold T1 and the vertical shear threshold T3, are systematically discussed, and the performance of the BLP operated under precipitation is evaluated. It is found that, to ensure high accuracy and a high data collectable rate, the optimal range of subsets is determined to be 4 m/s. Although the number of data rejected by the combined algorithm of vertical shear examination and small median test is quite limited, the algorithm proves quite useful for recognizing outliers with large discrepancies, and the optimal wind shear threshold T3 can be recommended as 5 m/s per 100 m. During patchy precipitation, the quality of data measured by the four oblique beams (using the DBS measuring technique) can still be ensured. After the BLP data are quality controlled by the composite QC procedure, the output shows good agreement with the balloon observations.
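
    A vertical shear screen with the recommended threshold is straightforward; the sketch below checks speed shear between adjacent range gates (the profile values are made up, and real implementations typically use vector shear):

```python
import numpy as np

def shear_flags(speeds, heights, max_shear=5.0):
    """Flag transitions whose speed shear exceeds max_shear (m/s)
    per 100 m, the threshold recommended above."""
    speeds = np.asarray(speeds, float)
    heights = np.asarray(heights, float)
    shear = np.abs(np.diff(speeds)) / (np.diff(heights) / 100.0)
    return shear > max_shear  # one flag per adjacent-level transition

spd = np.array([12.0, 13.0, 21.0, 14.0])     # m/s, hypothetical profile
hgt = np.array([100.0, 200.0, 300.0, 400.0])  # m
print(shear_flags(spd, hgt))  # [False  True  True]
```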

  9. Automated evaluation of electronic discharge notes to assess quality of care for cardiovascular diseases using Medical Language Extraction and Encoding System (MedLEE)

    PubMed Central

    Lin, Jou-Wei; Yang, Chen-Wei

    2010-01-01

    The objective of this study was to develop and validate an automated acquisition system to assess quality of care (QC) measures for cardiovascular diseases. This system, combining searching and retrieval algorithms, was designed to extract QC measures from electronic discharge notes and to estimate the attainment rates relative to the current standards of care. It was developed on patients with ST-segment elevation myocardial infarction and tested on patients with unstable angina/non-ST-segment elevation myocardial infarction, two diseases sharing almost the same QC measures. The system was able to reach reasonable agreement (κ value) with medical experts, from 0.65 (early reperfusion rate) to 0.97 (β-blockers and lipid-lowering agents before discharge), for different QC measures in the test set, and was then applied to evaluate QC in patients who underwent coronary artery bypass grafting surgery. The results validated a new tool to reliably extract QC measures for cardiovascular diseases. PMID:20442141
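
    Agreement between the automated system and expert chart review is quantified with Cohen's kappa, which corrects raw agreement for chance. A toy computation (the labels are fabricated for illustration):

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical chart-review comparison: 1 = QC measure attained,
# 0 = not attained, as judged by the automated system vs. an expert.
automated = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1, 0, 1]
expert    = [1, 1, 0, 1, 1, 1, 1, 0, 1, 0, 0, 1]
print(f"kappa = {cohen_kappa_score(automated, expert):.2f}")  # ~0.62
```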

  10. An overview of quality control practices in Ontario with particular reference to cholesterol analysis.

    PubMed

    Krishnan, S; Webb, S; Henderson, A R; Cheung, C M; Nazir, D J; Richardson, H

    1999-03-01

    The Laboratory Proficiency Testing Program (LPTP) assesses the analytical performance of all licensed laboratories in Ontario. The LPTP Enzymes, Cardiac Markers, and Lipids Committee conducted a "Patterns of Practice" survey to assess the in-house quality control (QC) practices of laboratories in Ontario, using cholesterol as the QC paradigm. The survey was questionnaire-based, seeking information on statistical calculations, software rules, review processes, data retention, and so on. Copies of the in-house cholesterol QC graphs were requested. A total of 120 of 210 laboratories were randomly chosen to receive the questionnaires during 1995 and 1996; 115 laboratories responded, although some did not answer all questions. The majority calculate means and standard deviations (SD) every month, using anywhere from 4 to >100 data points. 65% use a fixed mean and SD, while 17% use means calculated from the previous month. A few use a floating or cumulative mean, and some laboratories that do not use fixed means use a fixed SD. About 90% use some form of statistical quality control rules; the most common rules used to detect random error are 1(3s)/R4s, while 2(2s)/4(1s)/10x are used for systematic errors. About 20% did not assay any QC at levels >5.5 mmol/L. Quality control data are reviewed daily (technologists), weekly and monthly (supervisors/directors). Most laboratories retain their QC records for up to 3 years on paper and magnetic media. On some QC graphs the mean and SD, QC product lot number, or reference to action logs are not apparent. Quality control practices in Ontario are, therefore, disappointing. Improvement is required in the use of clinically appropriate concentrations of QC material and in documentation on QC graphs.
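
    The rule families named in the survey are straightforward to evaluate on a series of control z-scores. A compact sketch (our own implementation of the commonly cited definitions; production packages also handle across-run and across-level logic):

```python
import numpy as np

def westgard_flags(z):
    """Evaluate common Westgard rules on control z-scores (deviations
    from the target mean in SD units): 1-3s, R-4s, 2-2s, 4-1s, 10-x."""
    z = np.asarray(z, float)
    flags = {}
    flags["1-3s"] = bool(np.any(np.abs(z) > 3))
    flags["R-4s"] = bool(np.any(np.abs(np.diff(z)) > 4))  # range rule
    flags["2-2s"] = any((z[i] > 2 and z[i+1] > 2) or
                        (z[i] < -2 and z[i+1] < -2)
                        for i in range(len(z) - 1))
    flags["4-1s"] = any(np.all(z[i:i+4] > 1) or np.all(z[i:i+4] < -1)
                        for i in range(len(z) - 3))
    flags["10-x"] = any(np.all(z[i:i+10] > 0) or np.all(z[i:i+10] < 0)
                        for i in range(len(z) - 9))
    return flags

controls = [0.4, 1.2, 2.3, 2.1, -0.5, 1.1, 0.9, 1.3, 1.2, 1.4, 1.6, 0.2]
print(westgard_flags(controls))  # flags 2-2s and 4-1s for this series
```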

  11. Micelles based on methoxy poly(ethylene glycol)-cholesterol conjugate for controlled and targeted drug delivery of a poorly water soluble drug.

    PubMed

    Li, Junming; He, Zhiyao; Yu, Shui; Li, Shuangzhi; Ma, Qing; Yu, Yiyi; Zhang, Jialin; Li, Rui; Zheng, Yu; He, Gu; Song, Xiangrong

    2012-10-01

    In this study, quercetin (QC) with cancer chemoprevention effect and anticancer potential was loaded into polymeric micelles of methoxy poly(ethylene glycol)-cholesterol conjugate (mPEG-Chol) in order to increase its water solubility. mPEG-Chol with a low critical micelle concentration (CMC) value (4.0 × 10⁻⁷ M to 13 × 10⁻⁷ M) was first synthesized, involving two steps of chemical modification of cholesterol by esterification, and QC was then incorporated into mPEG-Chol micelles by a self-assembly method. After the process parameters were optimized, QC-loaded micelles had high drug loading (3.66%) and entrapment efficiency (93.51%) and a nano-sized diameter (116 nm). DSC analysis demonstrated that QC had been incorporated non-covalently into the micelles and existed as an amorphous state or a solid solution in the polymeric matrix. A freeze-dried formulation with the addition of 1% (w/v) mannitol as cryoprotectant was successfully developed for the long-term storage of QC-loaded micelles. Compared to free QC, QC-loaded micelles released QC more slowly. Moreover, the release of QC from micelles was slightly faster in PBS at pH 5 than in PBS at pH 7.4, which implied that QC-loaded micelles might be pH-sensitive and could thereby selectively deliver QC to tumor tissue, reducing unwanted side effects. Therefore, mPEG-Chol is a promising micellar vector for the controlled and targeted drug delivery of QC to tumors, and QC-loaded micelles are worth further investigation as a potential formulation for cancer chemoprevention and treatment.

  12. Technical Note: Independent component analysis for quality assurance in functional MRI.

    PubMed

    Astrakas, Loukas G; Kallistis, Nikolaos S; Kalef-Ezra, John A

    2016-02-01

    Independent component analysis (ICA) is an established method of analyzing human functional MRI (fMRI) data. Here, an ICA-based fMRI quality control (QC) tool for use with a commercial phantom was developed and applied. In an attempt to assess the performance of the tool relative to preexisting alternatives, it was used seven weeks before and eight weeks after repair of a faulty gradient amplifier of a non-state-of-the-art MRI unit. More specifically, its performance was compared with the AAPM Report No. 100 acceptance testing and quality assurance protocol and two fMRI QC protocols, proposed by Friedman et al. ["Report on a multicenter fMRI quality assurance protocol," J. Magn. Reson. Imaging 23, 827-839 (2006)] and Stocker et al. ["Automated quality assurance routines for fMRI data applied to a multicenter study," Hum. Brain Mapp. 25, 237-246 (2005)], respectively. The easily developed and applied ICA-based QC protocol provided fMRI QC indices and maps equally sensitive to fMRI instabilities as the indices and maps of the other established protocols. The ICA fMRI QC indices were highly correlated with indices of other fMRI QC protocols and in some cases theoretically related to them. Three or four independent components with slowly varying time series are detected under normal conditions. ICA applied to phantom measurements is an easy and efficient tool for fMRI QC. Additionally, it can protect against misinterpretation of artifact components as human brain activations. Evaluating fMRI QC indices in the central region of a phantom is not always the optimal choice.
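
    The core idea, decomposing phantom voxel time series into independent components such as slow drift and periodic interference, can be demonstrated on synthetic data with scikit-learn's FastICA (the mixing model and noise level below are fabricated):

```python
import numpy as np
from sklearn.decomposition import FastICA

# Synthetic "phantom" time series: a slow drift and a sinusoidal
# interference source mixed across 50 voxels (hypothetical data).
rng = np.random.default_rng(1)
t = np.linspace(0, 1, 200)
sources = np.c_[0.5 * t, np.sin(2 * np.pi * 12 * t)]  # drift, interference
mixing = rng.normal(size=(2, 50))
voxels = sources @ mixing + 0.02 * rng.normal(size=(200, 50))

ica = FastICA(n_components=2, random_state=0)
components = ica.fit_transform(voxels)   # recovered IC time courses
print(components.shape)                  # (200, 2): drift + interference
```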

  13. A comprehensive quality control workflow for paired tumor-normal NGS experiments.

    PubMed

    Schroeder, Christopher M; Hilke, Franz J; Löffler, Markus W; Bitzer, Michael; Lenz, Florian; Sturm, Marc

    2017-06-01

    Quality control (QC) is an important part of all NGS data analysis stages. Many available tools calculate QC metrics from different analysis steps of single-sample experiments (raw reads, mapped reads, and variant lists). Multi-sample experiments, such as sequencing of tumor-normal pairs, require additional QC metrics to ensure the validity of results, and these multi-sample QC metrics still lack standardization. We therefore suggest a new workflow for QC of DNA sequencing of tumor-normal pairs. With this workflow, well-known single-sample QC metrics and additional metrics specific to tumor-normal pairs can be calculated. The segmentation into different tools offers high flexibility and allows reuse for other purposes. All tools produce qcML, a generic XML format for QC of -omics experiments. qcML uses quality metrics defined in an ontology, which was adapted for NGS. All QC tools are implemented in C++ and run both under Linux and Windows. Plotting requires Python 2.7 and matplotlib. The software is available under the 'GNU General Public License version 2' as part of the ngs-bits project: https://github.com/imgag/ngs-bits. christopher.schroeder@med.uni-tuebingen.de. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com

  14. Lean Six Sigma in Health Care: Improving Utilization and Reducing Waste.

    PubMed

    Almorsy, Lamia; Khalifa, Mohamed

    2016-01-01

    Healthcare costs have been increasing worldwide, mainly due to overutilization of resources. The savings potentially achievable from systematic, comprehensive, and cooperative reduction in waste are far higher than those from more direct and blunter cuts in care and coverage. At King Faisal Specialist Hospital and Research Center, inappropriate use and overutilization of the glucose test strips used for whole-blood glucose determination with glucometers were observed. The hospital implemented a project to improve their utilization. Using the Six Sigma DMAIC approach (Define, Measure, Analyze, Improve, and Control), an efficient practice was put in place, including updating the related internal policies and procedures and properly implementing an effective users' training and competency check-off program. This resulted in decreasing unnecessary Quality Control (QC) runs from 13% to 4%, decreasing failed QC runs from 14% to 7%, and lowering the QC-to-patient-testing ratio from 24/76 to 19/81.

  15. The importance of quality control in validating concentrations of contaminants of emerging concern in source and treated drinking water samples.

    PubMed

    Batt, Angela L; Furlong, Edward T; Mash, Heath E; Glassmeyer, Susan T; Kolpin, Dana W

    2017-02-01

    A national-scale survey of 247 contaminants of emerging concern (CECs), including organic and inorganic chemical compounds and microbial contaminants, was conducted in source and treated drinking water samples from 25 treatment plants across the United States. Multiple methods were used to determine these CECs, including six analytical methods to measure 174 pharmaceuticals, personal care products, and pesticides. A three-component quality assurance/quality control (QA/QC) program was designed for the subset of 174 CECs, which allowed us to assess and compare the performance of the methods used. The three components included: 1) a common field QA/QC protocol and sample design, 2) individual investigator-developed method-specific QA/QC protocols, and 3) a suite of 46 method comparison analytes that were determined in two or more analytical methods. Overall method performance for the 174 organic chemical CECs was assessed by comparing spiked recoveries in reagent, source, and treated water over a two-year period. In addition to the 247 CECs reported in the larger drinking water study, another 48 pharmaceutical compounds measured did not consistently meet predetermined quality standards; methodologies that did not seem suitable for these analytes are summarized. The need to exclude analytes based on method performance demonstrates the importance of additional QA/QC protocols. Published by Elsevier B.V.

  16. Automated quality control in a file-based broadcasting workflow

    NASA Astrophysics Data System (ADS)

    Zhang, Lina

    2014-04-01

    Benefiting from the development of information and internet technologies, television broadcasting is transforming from inefficient tape-based production and distribution to integrated file-based workflows. However, no matter how many changes have taken place, successful broadcasting still depends on the ability to deliver a consistently high-quality signal to the audience. After the transition from tape to file, traditional methods of manual quality control (QC) become inadequate, subjective, and inefficient. Based on China Central Television's fully file-based workflow at its new site, this paper introduces an automated quality control test system for accurate detection of hidden problems in media content. It discusses the system framework and workflow control when automated QC is added, puts forward a QC criterion, and presents QC software that follows this criterion. It also reports experiments on QC speed using parallel processing and distributed computing. The performance of the test system shows that the adoption of automated QC can make production effective and efficient and help the station achieve a competitive advantage in the media market.

  17. Quality assurance and quality control for thermal/optical analysis of aerosol samples for organic and elemental carbon.

    PubMed

    Chow, Judith C; Watson, John G; Robles, Jerome; Wang, Xiaoliang; Chen, L-W Antony; Trimble, Dana L; Kohl, Steven D; Tropp, Richard J; Fung, Kochy K

    2011-12-01

    Accurate, precise, and valid organic and elemental carbon (OC and EC, respectively) measurements require more effort than the routine analysis of ambient aerosol and source samples. This paper documents the quality assurance (QA) and quality control (QC) procedures that should be implemented to ensure consistency of OC and EC measurements. Prior to field sampling, the appropriate filter substrate must be selected and tested for sampling effectiveness. Unexposed filters are pre-fired to remove contaminants and acceptance tested. After sampling, filters must be stored in the laboratory in clean, labeled containers under refrigeration (<4 °C) to minimize loss of semi-volatile OC. QA activities include participation in laboratory accreditation programs, external system audits, and interlaboratory comparisons. For thermal/optical carbon analyses, periodic QC tests include calibration of the flame ionization detector with different types of carbon standards, thermogram inspection, replicate analyses, quantification of trace oxygen concentrations (<100 ppmv) in the helium atmosphere, and calibration of the sample temperature sensor. These established QA/QC procedures are applicable to aerosol sampling and analysis for carbon and other chemical components.

  18. Autonomous Quality Control of Joint Orientation Measured with Inertial Sensors.

    PubMed

    Lebel, Karina; Boissy, Patrick; Nguyen, Hung; Duval, Christian

    2016-07-05

    Clinical mobility assessment is traditionally performed in laboratories using complex and expensive equipment. The low accessibility of such equipment, combined with the emerging trend to assess mobility in a free-living environment, creates a need for body-worn sensors (e.g., inertial measurement units, IMUs) that are capable of measuring the complexity of motor performance using meaningful measurements, such as joint orientation. However, the accuracy of joint orientation estimates using IMUs may be affected by the environment, the joint tracked, the type of motion performed, and velocity. This study investigates a quality control (QC) process to assess the quality of orientation data based on features extracted from the raw inertial sensors' signals. Joint orientation (trunk, hip, knee, ankle) of twenty participants was acquired by an optical motion capture system and IMUs during a variety of tasks (sit, sit-to-stand transition, walking, turning) performed under varying conditions (speed, environment). An artificial neural network was used to classify good and bad sequences of joint orientation with a sensitivity and a specificity above 83%. This study confirms the possibility of performing QC on IMU joint orientation data based on raw signal features. This innovative QC approach may be of particular interest in a big data context, such as for remote monitoring of patients' mobility.
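
    The classification step can be sketched with a small feed-forward network on a feature table; everything below (features, labels, layer size) is synthetic and illustrative, not the study's trained model:

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

# Hypothetical feature table: one row per recorded sequence, with
# features extracted from raw IMU signals (e.g., mean gyro magnitude,
# acceleration variance, dominant frequency), and a 0/1 quality label.
rng = np.random.default_rng(2)
X = rng.normal(size=(400, 3))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=400) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
clf.fit(X_tr, y_tr)
print(f"holdout accuracy: {clf.score(X_te, y_te):.2f}")
```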

  19. Development and prospective evaluation of an automated software system for quality control of quantitative 99mTc-MAG3 renal studies.

    PubMed

    Folks, Russell D; Garcia, Ernest V; Taylor, Andrew T

    2007-03-01

    Quantitative nuclear renography has numerous potential sources of error. We previously reported the initial development of a computer software module for comprehensively addressing the issue of quality control (QC) in the analysis of radionuclide renal images. The objective of this study was to prospectively test the QC software. The QC software works in conjunction with standard quantitative renal image analysis using a renal quantification program. The software saves a text file that summarizes QC findings as possible errors in user-entered values, calculated values that may be unreliable because of the patient's clinical condition, and problems relating to acquisition or processing. To test the QC software, a technologist not involved in software development processed 83 consecutive nontransplant clinical studies. The QC findings of the software were then tabulated. QC events were defined as technical (study descriptors that were out of range or were entered and then changed, unusually sized or positioned regions of interest, or missing frames in the dynamic image set) or clinical (calculated functional values judged to be erroneous or unreliable). Technical QC events were identified in 36 (43%) of 83 studies. Clinical QC events were identified in 37 (45%) of 83 studies. Specific QC events included starting the camera after the bolus had reached the kidney, dose infiltration, oversubtraction of background activity, and missing frames in the dynamic image set. QC software has been developed to automatically verify user input, monitor calculation of renal functional parameters, summarize QC findings, and flag potentially unreliable values for the nuclear medicine physician. Incorporation of automated QC features into commercial or local renal software can reduce errors and improve technologist performance and should improve the efficiency and accuracy of image interpretation.

  20. Quality Control in Primary Schools: Progress from 2001-2006

    ERIC Educational Resources Information Center

    Hofman, Roelande H.; de Boom, Jan; Hofman, W. H. Adriaan

    2010-01-01

    This article presents findings of research into the quality control (QC) of schools from 2001-2006. In 2001 several targets for QC were set and the progress of 939 primary schools is presented. Furthermore, using cluster analysis, schools are classified into four QC-types that differ in their focus on school (self) evaluation and school…

  1. Ongoing quality control in digital radiography: Report of AAPM Imaging Physics Committee Task Group 151.

    PubMed

    Jones, A Kyle; Heintz, Philip; Geiser, William; Goldman, Lee; Jerjian, Khachig; Martin, Melissa; Peck, Donald; Pfeiffer, Douglas; Ranger, Nicole; Yorkston, John

    2015-11-01

    Quality control (QC) in medical imaging is an ongoing process and not just a series of infrequent evaluations of medical imaging equipment. The QC process involves designing and implementing a QC program, collecting and analyzing data, investigating results that are outside the acceptance levels for the QC program, and taking corrective action to bring these results back to an acceptable level. The QC process involves key personnel in the imaging department, including the radiologist, radiologic technologist, and the qualified medical physicist (QMP). The QMP performs detailed equipment evaluations and helps with oversight of the QC program, while the radiologic technologist is responsible for the day-to-day operation of the QC program. The continued need for ongoing QC in digital radiography has been highlighted in the scientific literature. The charge of this task group was to recommend consistency tests designed to be performed by a medical physicist or a radiologic technologist under the direction of a medical physicist to identify problems with an imaging system that need further evaluation by a medical physicist, including a fault tree to define actions that need to be taken when certain fault conditions are identified. The focus of this final report is the ongoing QC process, including rejected image analysis, exposure analysis, and artifact identification. These QC tasks are vital for the optimal operation of a department performing digital radiography.

  2. Ongoing quality control in digital radiography: Report of AAPM Imaging Physics Committee Task Group 151

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jones, A. Kyle, E-mail: kyle.jones@mdanderson.org; Geiser, William; Heintz, Philip

    Quality control (QC) in medical imaging is an ongoing process and not just a series of infrequent evaluations of medical imaging equipment. The QC process involves designing and implementing a QC program, collecting and analyzing data, investigating results that are outside the acceptance levels for the QC program, and taking corrective action to bring these results back to an acceptable level. The QC process involves key personnel in the imaging department, including the radiologist, radiologic technologist, and the qualified medical physicist (QMP). The QMP performs detailed equipment evaluations and helps with oversight of the QC program, while the radiologic technologist is responsible for the day-to-day operation of the QC program. The continued need for ongoing QC in digital radiography has been highlighted in the scientific literature. The charge of this task group was to recommend consistency tests designed to be performed by a medical physicist or a radiologic technologist under the direction of a medical physicist to identify problems with an imaging system that need further evaluation by a medical physicist, including a fault tree to define actions that need to be taken when certain fault conditions are identified. The focus of this final report is the ongoing QC process, including rejected image analysis, exposure analysis, and artifact identification. These QC tasks are vital for the optimal operation of a department performing digital radiography.

  3. Aircraft Measurements for Understanding Air-Sea Coupling and Improving Coupled Model Predictions Over the Indian Ocean

    DTIC Science & Technology

    2012-09-30

    Briefing for aircraft operations in Diego Garcia; reports posted on the EOL field catalog in realtime (http://catalog.eol.ucar.edu/cgi-bin/dynamo/report...index); • Dropsonde data processing on all P3 flights and realtime QC/reporting to GTS; and • Science summary of aircraft missions posted on EOL ... data analysis, worked with EOL on data quality control (QC), participated in the DYNAMO Sounding Workshop at EOL/NCAR from 6-7 February 2012

  4. An Automated Treatment Plan Quality Control Tool for Intensity-Modulated Radiation Therapy Using a Voxel-Weighting Factor-Based Re-Optimization Algorithm.

    PubMed

    Song, Ting; Li, Nan; Zarepisheh, Masoud; Li, Yongbao; Gautier, Quentin; Zhou, Linghong; Mell, Loren; Jiang, Steve; Cerviño, Laura

    2016-01-01

    Intensity-modulated radiation therapy (IMRT) currently plays an important role in radiotherapy, but its treatment plan quality can vary significantly among institutions and planners. Treatment plan quality control (QC) is a necessary component for individual clinics to ensure that patients receive treatments with high therapeutic gain ratios. The voxel-weighting factor-based plan re-optimization mechanism has been proven able to explore a larger Pareto surface (solution domain) and therefore increase the possibility of finding an optimal treatment plan. In this study, we incorporated additional modules into an in-house developed voxel weighting factor-based re-optimization algorithm, which was enhanced into a highly automated and accurate IMRT plan QC tool (TPS-QC tool). After importing an under-assessment plan, the TPS-QC tool was able to generate a QC report within 2 minutes. This QC report contains the plan quality determination as well as information supporting the determination. Finally, IMRT plan quality can be controlled by approving quality-passed plans and replacing quality-failed plans using the TPS-QC tool. The feasibility and accuracy of the proposed TPS-QC tool were evaluated using 25 clinically approved cervical cancer patient IMRT plans and 5 manually created poor-quality IMRT plans. The results showed high consistency between the QC report quality determinations and the actual plan quality. In the 25 clinically approved cases that the TPS-QC tool identified as passed, a greater difference could be observed for dosimetric endpoints for organs at risk (OAR) than for planning target volume (PTV), implying that better dose sparing could be achieved in OAR than in PTV. The dose-volume histogram (DVH) curves of the TPS-QC tool re-optimized plans satisfied the dosimetric criteria more frequently than did the under-assessment plans, and the criteria for unsatisfied dosimetric endpoints in the 5 poor-quality plans could typically be satisfied when the TPS-QC tool generated re-optimized plans, without sacrificing other dosimetric endpoints. Beyond its feasibility and accuracy, the proposed TPS-QC tool is also user-friendly and easy to operate, both necessary characteristics for clinical use.
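
    Plan quality determinations of this kind ultimately reduce to checking dosimetric endpoints, such as a V(D) limit, against a structure's voxel doses. A toy check (the endpoint, limit, and dose array are invented for illustration):

```python
import numpy as np

def dvh_endpoint_ok(dose, volume_fraction_limit, dose_threshold):
    """Check a V(D) <= limit dosimetric endpoint on a voxel dose array.

    dose : 1-D array of voxel doses (Gy) for one structure
    Returns (V_D, passed), where V_D is the fraction of voxels
    receiving at least dose_threshold Gy.
    """
    dose = np.asarray(dose, float)
    v_d = float(np.mean(dose >= dose_threshold))
    return v_d, v_d <= volume_fraction_limit

# Hypothetical OAR check: require V20Gy <= 30%
oar_dose = np.random.default_rng(3).uniform(0, 45, size=10_000)
v20, ok = dvh_endpoint_ok(oar_dose, 0.30, 20.0)
print(f"V20 = {v20:.1%}, pass = {ok}")  # fails in this synthetic case
```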

  5. Improvement of the quality of work in a biochemistry laboratory via measurement system analysis.

    PubMed

    Chen, Ming-Shu; Liao, Chen-Mao; Wu, Ming-Hsun; Lin, Chih-Ming

    2016-10-31

    Adequate and continuous monitoring of operational variations can effectively reduce uncertainty and enhance the quality of laboratory reports. This study applied the evaluation rules of the measurement system analysis (MSA) method to estimate the quality of work conducted in a biochemistry laboratory. Using the gauge repeatability & reproducibility (GR&R) approach, variations in quality control (QC) data among medical technicians in conducting measurements of five biochemical items, namely serum glucose (GLU), aspartate aminotransferase (AST), uric acid (UA), sodium (Na), and chloride (Cl), were evaluated. The measurements of the five biochemical items showed different levels of variance among the different technicians, with the variance in GLU measurements being higher than that for the other four items. The precision-to-tolerance (P/T) ratios for Na, Cl, and GLU were all above 0.5, implying inadequate gauge capability. The product variation contribution of Na was large (75.45% and 31.24% at the normal and abnormal QC levels, respectively), which showed that the impact of insufficient usage of reagents could not be excluded. With regard to reproducibility, high contributions of variation (more than 30%) were found for the selected items; these high operator variation levels implied that the possibility of inadequate gauge capacity could not be excluded. The analysis of variance (ANOVA) of GR&R showed that the operator variations in GLU measurements were significant (F=5.296, P=0.001 at the normal level and F=3.399, P=0.015 at the abnormal level). In addition to operator variations, product variations of Na were also significant at both QC levels. The heterogeneity of variance among the five technicians showed significant differences for the Na and Cl measurements at the normal QC level. The accuracy of QC for the five technicians was identified for further operational improvement. This study revealed that MSA can be used to evaluate product and personnel errors and to improve the quality of work in a biochemistry laboratory through proper corrective actions.
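
    The P/T ratio compares the measurement system's spread with the tolerance width; the usual convention takes six measurement-system SDs over (USL - LSL), with values above roughly 0.3 considered marginal and, as in this study, above 0.5 inadequate. A sketch with invented glucose numbers:

```python
def precision_to_tolerance(sigma_ms: float, lsl: float, usl: float) -> float:
    """P/T ratio: 6 * measurement-system SD over the tolerance width.
    (Some texts use 5.15 instead of 6; the 0.5 cutoff follows the study.)"""
    return 6.0 * sigma_ms / (usl - lsl)

# Hypothetical glucose QC example: measurement SD of 1.8 mg/dL against
# an allowable range of 90-110 mg/dL.
print(f"P/T = {precision_to_tolerance(1.8, 90.0, 110.0):.2f}")  # 0.54
```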

  6. Adjustment of pesticide concentrations for temporal changes in analytical recovery, 1992–2010

    USGS Publications Warehouse

    Martin, Jeffrey D.; Eberle, Michael

    2011-01-01

    Recovery is the proportion of a target analyte that is quantified by an analytical method and is a primary indicator of the analytical bias of a measurement. Recovery is measured by analysis of quality-control (QC) water samples that have known amounts of target analytes added ("spiked" QC samples). For pesticides, recovery is the measured amount of pesticide in the spiked QC sample expressed as a percentage of the amount spiked, ideally 100 percent. Temporal changes in recovery have the potential to adversely affect time-trend analysis of pesticide concentrations by introducing trends in apparent environmental concentrations that are caused by trends in performance of the analytical method rather than by trends in pesticide use or other environmental conditions. This report presents data and models related to the recovery of 44 pesticides and 8 pesticide degradates (hereafter referred to as "pesticides") that were selected for a national analysis of time trends in pesticide concentrations in streams. Water samples were analyzed for these pesticides from 1992 through 2010 by gas chromatography/mass spectrometry. Recovery was measured by analysis of pesticide-spiked QC water samples. Models of recovery, based on robust, locally weighted scatterplot smooths (lowess smooths) of matrix spikes, were developed separately for groundwater and stream-water samples. The models of recovery can be used to adjust concentrations of pesticides measured in groundwater or stream-water samples to 100 percent recovery to compensate for temporal changes in the performance (bias) of the analytical method.
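
    Recovery-adjusting an environmental concentration amounts to dividing by the modeled recovery fraction for the sample's date, where the model is a lowess smooth of spiked-QC recoveries over time. A sketch using statsmodels' lowess with fabricated recovery data:

```python
import numpy as np
from statsmodels.nonparametric.smoothers_lowess import lowess

# Hypothetical spiked-QC record: decimal years vs. measured recovery (%).
years = np.array([1992.5, 1995.0, 1998.2, 2001.7, 2004.1, 2007.9, 2010.3])
recovery = np.array([96.0, 91.0, 84.0, 80.0, 83.0, 88.0, 90.0])

# Smooth recovery over time (robust lowess), then adjust an environmental
# concentration measured in 1998 to 100% recovery.
smoothed = lowess(recovery, years, frac=0.6, it=3)  # columns: x, fitted y
rec_1998 = np.interp(1998.0, smoothed[:, 0], smoothed[:, 1])
measured = 0.12  # ug/L, hypothetical stream-water detection
adjusted = measured * 100.0 / rec_1998
print(f"recovery ~ {rec_1998:.1f}%, adjusted = {adjusted:.3f} ug/L")
```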

  7. Building a Quality Controlled Database of Meteorological Data from NASA Kennedy Space Center and the United States Air Force's Eastern Range

    NASA Technical Reports Server (NTRS)

    Brenton, James C.; Barbre. Robert E., Jr.; Decker, Ryan K.; Orcutt, John M.

    2018-01-01

    The National Aeronautics and Space Administration's (NASA) Marshall Space Flight Center (MSFC) Natural Environments Branch (EV44) has provided atmospheric databases and analysis in support of space vehicle design and day-of-launch operations for NASA and commercial launch vehicle programs launching from the NASA Kennedy Space Center (KSC), co-located on the United States Air Force's Eastern Range (ER) at the Cape Canaveral Air Force Station. The ER complex is one of the most heavily instrumented sites in the United States, with over 31 towers measuring various atmospheric parameters on a continuous basis. An inherent challenge with large sets of data is ensuring that erroneous data are removed from databases and thus excluded from launch vehicle design analyses. EV44 has put forth great effort in developing quality control (QC) procedures for individual meteorological instruments; however, no standard QC procedures for all databases currently exist, resulting in QC databases that have inconsistencies in variables, methodologies, and periods of record. The goal of this activity is to use the previous efforts by EV44 to develop a standardized set of QC procedures from which to build meteorological databases from KSC and the ER, while maintaining open communication with end users from the launch community to develop ways to improve, adapt, and grow the QC database. Details of the QC procedures will be described. As the rate of launches increases with additional launch vehicle programs, it is becoming more important that weather databases are continually updated and checked for data quality before use in launch vehicle design and certification analyses.

  8. Multileaf collimator performance monitoring and improvement using semiautomated quality control testing and statistical process control.

    PubMed

    Létourneau, Daniel; Wang, An; Amin, Md Nurul; Pearce, Jim; McNiven, Andrea; Keller, Harald; Norrlinger, Bernhard; Jaffray, David A

    2014-12-01

    High-quality radiation therapy using highly conformal dose distributions and image-guided techniques requires optimum machine delivery performance. In this work, a monitoring system for multileaf collimator (MLC) performance, integrating semiautomated MLC quality control (QC) tests and statistical process control tools, was developed. The MLC performance monitoring system was used for almost a year on two commercially available MLC models. Control charts were used to establish MLC performance and assess test frequency required to achieve a given level of performance. MLC-related interlocks and servicing events were recorded during the monitoring period and were investigated as indicators of MLC performance variations. The QC test developed as part of the MLC performance monitoring system uses 2D megavoltage images (acquired using an electronic portal imaging device) of 23 fields to determine the location of the leaves with respect to the radiation isocenter. The precision of the MLC performance monitoring QC test and the MLC itself was assessed by detecting the MLC leaf positions on 127 megavoltage images of a static field. After initial calibration, the MLC performance monitoring QC test was performed 3-4 times/week over a period of 10-11 months to monitor positional accuracy of individual leaves for two different MLC models. Analysis of test results was performed using individuals control charts per leaf with control limits computed based on the measurements as well as two sets of specifications of ± 0.5 and ± 1 mm. Out-of-specification and out-of-control leaves were automatically flagged by the monitoring system and reviewed monthly by physicists. MLC-related interlocks reported by the linear accelerator and servicing events were recorded to help identify potential causes of nonrandom MLC leaf positioning variations. The precision of the MLC performance monitoring QC test and the MLC itself was within ± 0.22 mm for most MLC leaves and the majority of the apparent leaf motion was attributed to beam spot displacements between irradiations. The MLC QC test was performed 193 and 162 times over the monitoring period for the studied units and recalibration had to be repeated up to three times on one of these units. For both units, rate of MLC interlocks was moderately associated with MLC servicing events. The strongest association with the MLC performance was observed between the MLC servicing events and the total number of out-of-control leaves. The average elapsed time for which the number of out-of-specification or out-of-control leaves was within a given performance threshold was computed and used to assess adequacy of MLC test frequency. A MLC performance monitoring system has been developed and implemented to acquire high-quality QC data at high frequency. This is enabled by the relatively short acquisition time for the images and automatic image analysis. The monitoring system was also used to record and track the rate of MLC-related interlocks and servicing events. MLC performances for two commercially available MLC models have been assessed and the results support monthly test frequency for widely accepted ± 1 mm specifications. Higher QC test frequency is however required to maintain tighter specification and in-control behavior.
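
    Individuals (I-MR) control charts of the kind used per leaf derive their limits from the average moving range; a sketch with invented leaf-offset data (2.66 is the standard I-chart constant, 3/d2 with d2 = 1.128):

```python
import numpy as np

def individuals_chart_limits(x):
    """Individuals (X) control chart limits from the average moving range:
    center +/- 2.66 * mean(|x[i] - x[i-1]|)."""
    x = np.asarray(x, float)
    mr_bar = np.mean(np.abs(np.diff(x)))
    center = x.mean()
    return center - 2.66 * mr_bar, center, center + 2.66 * mr_bar

# Hypothetical leaf-position offsets (mm) from repeated QC tests of one leaf
offsets = np.array([0.05, -0.02, 0.08, 0.01, -0.04, 0.03, 0.06, -0.01])
lcl, center, ucl = individuals_chart_limits(offsets)
print(f"LCL={lcl:.3f}  center={center:.3f}  UCL={ucl:.3f} (mm)")
```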

  9. AfterQC: automatic filtering, trimming, error removing and quality control for fastq data.

    PubMed

    Chen, Shifu; Huang, Tanxiao; Zhou, Yanqing; Han, Yue; Xu, Mingyan; Gu, Jia

    2017-03-14

    Some applications, especially clinical applications requiring highly accurate sequencing data, have to contend with the problems caused by unavoidable sequencing errors. Several tools have been proposed to profile sequencing quality, but few of them can quantify or correct sequencing errors. This unmet requirement motivated us to develop AfterQC, a tool with functions to profile sequencing errors and correct most of them, plus highly automated quality control and data filtering features. Unlike most tools, AfterQC analyses the overlapping of paired sequences for pair-end sequencing data. Based on overlapping analysis, AfterQC can detect and cut adapters, and furthermore it provides a novel function to correct wrong bases in the overlapping regions. Another new feature is the detection and visualisation of sequencing bubbles, which are commonly found on flowcell lanes and may cause sequencing errors. Besides normal per-cycle quality and base content plotting, AfterQC also provides features like polyX filtering (a polyX being a long sub-sequence of the same base X), automatic trimming, and k-mer-based strand bias profiling. For each single FastQ file or pair of FastQ files, AfterQC filters out bad reads, detects and eliminates sequencer bubble effects, trims reads at front and tail, detects sequencing errors and corrects some of them, and finally outputs clean data and generates HTML reports with interactive figures. AfterQC can run in batch mode with multiprocess support; it can run with a single FastQ file, a single pair of FastQ files (for pair-end sequencing), or a folder in which all included FastQ files are processed automatically. Based on overlapping analysis, AfterQC can estimate the sequencing error rate and profile the error transform distribution. The results of our error profiling tests show that the error distribution is highly platform dependent. Much more than just another quality control (QC) tool, AfterQC is able to perform quality control, data filtering, error profiling, and base correction automatically. Experimental results show that AfterQC can help to eliminate sequencing errors in pair-end sequencing data to provide much cleaner outputs, and consequently help to reduce false-positive variants, especially low-frequency somatic mutations. While providing rich configurable options, AfterQC can detect and set all options automatically and requires no arguments in most cases.
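
    As one concrete example of this kind of filtering, a polyX screen drops reads that contain an overly long run of a single base. The sketch below is a toy stand-in, not AfterQC's actual implementation (the file paths and run-length cutoff are arbitrary):

```python
def poly_x_run(seq: str) -> int:
    """Length of the longest run of a single repeated base in seq."""
    if not seq:
        return 0
    longest = run = 1
    for prev, cur in zip(seq, seq[1:]):
        run = run + 1 if cur == prev else 1
        longest = max(longest, run)
    return longest

def filter_fastq(path: str, out_path: str, max_poly_x: int = 10) -> None:
    """Drop reads containing a polyX run longer than max_poly_x."""
    with open(path) as fin, open(out_path, "w") as fout:
        while True:
            record = [fin.readline() for _ in range(4)]  # FASTQ: 4 lines/read
            if not record[0]:
                break
            if poly_x_run(record[1].strip()) <= max_poly_x:
                fout.writelines(record)
```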

  10. [Development of quality assurance/quality control web system in radiotherapy].

    PubMed

    Okamoto, Hiroyuki; Mochizuki, Toshihiko; Yokoyama, Kazutoshi; Wakita, Akihisa; Nakamura, Satoshi; Ueki, Heihachi; Shiozawa, Keiko; Sasaki, Koji; Fuse, Masashi; Abe, Yoshihisa; Itami, Jun

    2013-12-01

    Our purpose is to develop a QA/QC (quality assurance/quality control) web system using HTML (HyperText Markup Language) and the server-side scripting language PHP (Hypertext Preprocessor), which can be a useful tool for sharing information about QA/QC in radiotherapy. The system proposed in this study can easily be built in one's own institute, because HTML is easy to work with. There are two desired functions in a QA/QC web system: (i) to review the results of QA/QC for a radiotherapy machine, together with the manuals and reports necessary for routinely performing radiotherapy; by disclosing the results, transparency can be maintained; (ii) to present the institute's own QA/QC protocol using pictures and movies relating to QA/QC, for simplicity's sake, which can also serve as an educational tool for junior radiation technologists and medical physicists. By using this system, not only administrators but also all staff involved in radiotherapy can obtain information about the condition and accuracy of treatment machines through the QA/QC web system.

  11. Instrument Quality Control.

    PubMed

    Jayakody, Chatura; Hull-Ryde, Emily A

    2016-01-01

    Well-defined quality control (QC) processes are used to determine whether a certain procedure or action conforms to a widely accepted standard and/or set of guidelines, and are important components of any laboratory quality assurance program (Popa-Burke et al., J Biomol Screen 14: 1017-1030, 2009). In this chapter, we describe QC procedures useful for monitoring the accuracy and precision of laboratory instrumentation, most notably automated liquid dispensers. Two techniques, gravimetric QC and photometric QC, are highlighted in this chapter. When used together, these simple techniques provide a robust process for evaluating liquid handler accuracy and precision, and critically underpin high-quality research programs.
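    As a simple worked example of the gravimetric technique mentioned above, the sketch below (hypothetical function and data; water density assumed to be 1.0 mg/µL) converts replicate dispense weights to volumes and reports accuracy as the percent deviation of the mean volume from target, and precision as the CV% of the replicates.

```python
import numpy as np

def gravimetric_qc(masses_mg, target_ul, density_mg_per_ul=1.0):
    """Accuracy and precision of a liquid dispenser from replicate weighings.

    masses_mg: weights of n replicate dispenses; volumes are inferred from
    the liquid density. Accuracy = % deviation of mean volume from target;
    precision = CV% of the replicate volumes.
    """
    vols = np.asarray(masses_mg, dtype=float) / density_mg_per_ul
    accuracy = 100.0 * (vols.mean() - target_ul) / target_ul
    cv = 100.0 * vols.std(ddof=1) / vols.mean()
    return {"accuracy_pct": float(accuracy), "cv_pct": float(cv)}

# Ten replicate 50 uL dispenses weighed on an analytical balance
print(gravimetric_qc([49.6, 50.2, 49.9, 50.1, 49.8,
                      50.0, 49.7, 50.3, 49.9, 50.0], target_ul=50.0))
```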

  12. Wireless multipoint communication for optical sensors in the industrial environment using the new Bluetooth standard

    NASA Astrophysics Data System (ADS)

    Hussmann, Stephan; Lau, Wing Y.; Chu, Terry; Grothof, Markus

    2003-07-01

    Traditionally, the measuring and monitoring systems of manufacturing industries use sensors, computers and screens for quality control (Q.C.). The acquired information is fed back to the control room by wires, which - for obvious reasons - are not suitable in many environments. This paper describes a method to solve this problem by employing the new Bluetooth technology to set up a completely new system in which a totally wireless solution is feasible. This new Q.C. system allows several line scan cameras to be connected at once to a graphical user interface (GUI) that can monitor the production process. There are many Bluetooth devices available on the market, such as cell-phones, headsets, printers, PDAs, etc.; however, the application detailed here is a novel implementation in the industrial Q.C. area. This paper gives further details about the Bluetooth standard and why it is used (network topologies, host controller interface, data rates, etc.), the Bluetooth implementation in the microcontroller of the line scan camera, and the GUI and its features.

  13. Building a QC Database of Meteorological Data from NASA KSC and the United States Air Force's Eastern Range

    NASA Technical Reports Server (NTRS)

    Brenton, J. C.; Barbre, R. E.; Decker, R. K.; Orcutt, J. M.

    2018-01-01

    The National Aeronautics and Space Administration's (NASA) Marshall Space Flight Center (MSFC) Natural Environments Branch (EV44) provides atmospheric databases and analysis in support of space vehicle design and day-of-launch operations for NASA and commercial launch vehicle programs launching from the NASA Kennedy Space Center (KSC), co-located on the United States Air Force's Eastern Range (ER) at the Cape Canaveral Air Force Station. The ER complex is one of the most heavily instrumented sites in the United States, with over 31 towers measuring various atmospheric parameters on a continuous basis. An inherent challenge with large datasets is ensuring that erroneous data are removed from databases, and thus excluded from launch vehicle design analyses. EV44 has put forth great effort in developing quality control (QC) procedures for individual meteorological instruments; however, no standard QC procedures for all databases currently exist, resulting in QC databases with inconsistencies in variables, development methodologies, and periods of record. The goal of this activity is to use the previous efforts to develop a standardized set of QC procedures from which to build meteorological databases from KSC and the ER, while maintaining open communication with end users from the launch community to develop ways to improve, adapt and grow the QC database. Details of the QC procedures will be described. As the rate of launches increases with additional launch vehicle programs, it is becoming increasingly important that weather databases are continually updated and checked for data quality before use in launch vehicle design and certification analyses.

  14. Critical outlook and trends for environmental reference materials at the Measurements & Testing Generic Activity (European Commission).

    PubMed

    Quevauviller, P; Bennink, D; Bøwadt, S

    2001-05-01

    It is now well recognised that the quality control (QC) of all types of analyses, including environmental analyses, depends on the appropriate use of reference materials. One way to check the accuracy of methods is based on the use of Certified Reference Materials (CRMs), whereas other types of (non-certified) Reference Materials (RMs) are used for routine quality control (establishment of control charts) and interlaboratory testing (e.g. proficiency testing). The perception of these materials, in particular with respect to their production and use, differs widely according to the perspective (e.g. RM producers, routine laboratories, researchers). This review discusses some critical aspects of RM use and production for the QC of environmental analyses and describes the new approach followed by the Measurements & Testing Generic Activity (European Commission) to tackle new research and production needs.

  15. SU-D-201-04: Evaluation of Elekta Agility MLC Performance Using Statistical Process Control

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meyers, SM; Balderson, MJ; Letourneau, D

    2016-06-15

    Purpose: To evaluate the performance and stability of the Elekta Agility MLC model using an automated quality control (QC) test in combination with statistical process control tools. Methods: Leaf positions were collected daily for 11 Elekta units over 5–19 months using the automated QC test, which analyzes 23 MV images to determine the location of MLC leaves relative to the radiation isocenter. The leaf positions are measured at 5 nominal positions, and images are acquired at collimator 0° and 180° to capture all MLC leaves in the field-of-view. Leaf positioning accuracy was assessed using individuals and moving range control charts. Control limits were recomputed following MLC recalibration (which occurred 1–2 times for 4 units). Specification levels of ±0.5, ±1 and ±1.5 mm were tested. The mean and range of duration between out-of-control and out-of-specification events were determined. Results: Leaf position varied little over time, as confirmed by very tight individuals control limits (mean ±0.19 mm, range 0.09–0.44). Mean leaf position error was −0.03 mm (range −0.89–0.83). Due to sporadic out-of-control events, the mean in-control duration was 3.3 days (range 1–23). Data stayed within the ±1 mm specification for 205 days on average (range 3–372) and within ±1.5 mm for the entire date range. Measurements stayed within ±0.5 mm for 1 day on average (range 0–17); however, our MLC leaves were not calibrated to this level of accuracy. Conclusion: The Elekta Agility MLC model was found to perform with high stability, as evidenced by the tight control limits. The in-specification durations support the current recommendation of monthly MLC QC tests with a ±1 mm tolerance. Future work is ongoing to determine whether Agility performance can be optimized further by using high-frequency QC test results to drive recalibration frequency. Factors that can affect leaf positioning accuracy, including beam spot motion, leaf gain calibration, drifting leaves, and image artifacts, are under investigation.
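    For reference, the individuals and moving-range (I-MR) limits used in this kind of analysis follow textbook SPC formulas; the short sketch below computes them (the function name and data are illustrative, not from the abstract).

```python
import numpy as np

def imr_limits(x):
    """Standard individuals / moving-range (I-MR) chart limits.

    Uses the textbook SPC constants for n=2 moving ranges:
    individuals limits = xbar +/- 2.66 * MRbar; MR chart UCL = 3.267 * MRbar.
    """
    x = np.asarray(x, dtype=float)
    mrbar = np.abs(np.diff(x)).mean()
    xbar = x.mean()
    return {
        "individuals": (xbar - 2.66 * mrbar, xbar + 2.66 * mrbar),
        "moving_range_ucl": 3.267 * mrbar,
    }

# Daily leaf-position errors (mm) for one leaf
print(imr_limits([0.02, -0.01, 0.03, 0.00, 0.01, -0.02]))
```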

  16. An Automated Treatment Plan Quality Control Tool for Intensity-Modulated Radiation Therapy Using a Voxel-Weighting Factor-Based Re-Optimization Algorithm

    PubMed Central

    Song, Ting; Li, Nan; Zarepisheh, Masoud; Li, Yongbao; Gautier, Quentin; Zhou, Linghong; Mell, Loren; Jiang, Steve; Cerviño, Laura

    2016-01-01

    Intensity-modulated radiation therapy (IMRT) currently plays an important role in radiotherapy, but its treatment plan quality can vary significantly among institutions and planners. Treatment plan quality control (QC) is a necessary component for individual clinics to ensure that patients receive treatments with high therapeutic gain ratios. The voxel-weighting factor-based plan re-optimization mechanism has been shown to explore a larger Pareto surface (solution domain) and therefore increase the possibility of finding an optimal treatment plan. In this study, we incorporated additional modules into an in-house developed voxel-weighting factor-based re-optimization algorithm, which was enhanced into a highly automated and accurate IMRT plan QC tool (TPS-QC tool). After importing an under-assessment plan, the TPS-QC tool was able to generate a QC report within 2 minutes. This QC report contains the plan quality determination as well as information supporting the determination. Finally, IMRT plan quality can be controlled by approving quality-passed plans and replacing quality-failed plans using the TPS-QC tool. The feasibility and accuracy of the proposed TPS-QC tool were evaluated using 25 clinically approved cervical cancer patient IMRT plans and 5 manually created poor-quality IMRT plans. The results showed high consistency between the QC report quality determinations and the actual plan quality. In the 25 clinically approved cases that the TPS-QC tool identified as passed, a greater difference could be observed for dosimetric endpoints for organs at risk (OAR) than for the planning target volume (PTV), implying that better dose sparing could be achieved in OAR than in PTV. Moreover, the dose-volume histogram (DVH) curves of the TPS-QC tool re-optimized plans satisfied the dosimetric criteria more frequently than did the under-assessment plans, and the criteria for unsatisfied dosimetric endpoints in the 5 poor-quality plans could typically be satisfied when the TPS-QC tool generated re-optimized plans, without sacrificing other dosimetric endpoints. In addition to its feasibility and accuracy, the proposed TPS-QC tool is user-friendly and easy to operate, both of which are necessary characteristics for clinical use. PMID:26930204

  17. The April 1994 and October 1994 radon intercomparisons at EML

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fisenne, I.M.; George, A.C.; Perry, P.M.

    1995-10-01

    Quality assurance/quality control (QA/QC) is the backbone of many commercial and research processes and programs. QA/QC research tests the state of a functioning system, be it the production of manufactured goods or the ability to make accurate and precise measurements. The quality of radon measurements in the US has been tested under controlled conditions in semi-annual radon gas intercomparison exercises sponsored by the Environmental Measurements Laboratory (EML) since 1981. The two Calendar Year 1994 radon gas intercomparison exercises were conducted in the EML exposure chamber. Thirty-two groups, including US Federal facilities, USDOE contractors, national and state laboratories, universities and foreign institutions, participated in these exercises. The majority of the participants' results were within ±10% of the EML value at radon concentrations of 570 and 945 Bq m⁻³.

  18. Measurement of pulmonary capillary blood flow in infants by plethysmography.

    PubMed Central

    Stocks, J; Costeloe, K; Winlove, C P; Godfrey, S

    1977-01-01

    An accurate method for measuring effective pulmonary capillary blood flow (Qc eff) in infants has been developed with an adaptation of the plethysmographic technique. Measurements were made on 19 preterm, 14 small-for-dates, and 7 full-term normal infants with a constant-volume whole body plethysmograph in which the infant rebreathed nitrous oxide. There was a highly significant correlation between Qc eff and body weight, and this relationship was unaffected by premature delivery or intrauterine growth retardation. Mean Qc eff in preterm, small-for-dates, and full-term infants was 203, 208 and 197 ml min⁻¹ kg⁻¹, respectively, with no significant differences between the groups. A significant negative correlation existed between Qc eff and haematocrit in the preterm infants. There was no relationship between weight-standardized Qc eff and postnatal age in any of the groups. With this technique, it was possible to readily recognise the presence of rapid recirculation (indicative of shunting) in several of the infants, suggesting that rebreathing methods for the assessment of Qc eff should not be applied indiscriminately during the neonatal period. By taking care to overcome the potential sources of technical error, it was possible to obtain highly reproducible results for Qc eff in infants over a wider age range than has previously been reported. PMID:838861

  19. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT, MIRATECH CORPORATION GECO 3001 AIR/FUEL RATIO CONTROLLER

    EPA Science Inventory

    Details on the verification test design, measurement test procedures, and quality assurance/quality control (QA/QC) procedures can be found in the test plan titled Testing and Quality Assurance Plan, MIRATECH Corporation GECO 3100 Air/Fuel Ratio Controller (SRI 2001). It can be d...

  20. Quality control in urinalysis.

    PubMed

    Takubo, T; Tatsumi, N

    1999-01-01

    Quality control (QC) has been introduced in laboratories, and QC surveys in urinalysis have been performed by the College of American Pathologists, the Japanese Association of Medical Technologists, the Osaka Medical Association and by manufacturers. A QC survey of urinalysis of synthetic urine using a reagent strip and instrument made by the same manufacturer, and using an automated urine cell analyser, provided satisfactory results among laboratories. A QC survey of urinalysis of synthetic urine using reagent strips and instruments made by various manufacturers indicated differences in the determined values among manufacturers, and between manual and automated methods, because the reagent strips and instruments each have different characteristics. A QC photo survey in urinalysis based on microscopic photos of urine sediment constituents indicated differences in the identification of cells among laboratories. These results indicate that it is necessary to standardize the reagent strip method, the manual and automated methods, and the synthetic urine.

  1. FQC Dashboard: integrates FastQC results into a web-based, interactive, and extensible FASTQ quality control tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Joseph; Pirrung, Meg; McCue, Lee Ann

    FQC is software that facilitates quality control of FASTQ files by carrying out a QC protocol using FastQC, parsing results, and aggregating quality metrics into an interactive dashboard designed to richly summarize individual sequencing runs. The dashboard groups samples in dropdowns for navigation among the data sets, utilizes human-readable configuration files to manipulate the pages and tabs, and is extensible with CSV data.

  2. FQC Dashboard: integrates FastQC results into a web-based, interactive, and extensible FASTQ quality control tool

    DOE PAGES

    Brown, Joseph; Pirrung, Meg; McCue, Lee Ann

    2017-06-09

    FQC is software that facilitates quality control of FASTQ files by carrying out a QC protocol using FastQC, parsing results, and aggregating quality metrics into an interactive dashboard designed to richly summarize individual sequencing runs. The dashboard groups samples in dropdowns for navigation among the data sets, utilizes human-readable configuration files to manipulate the pages and tabs, and is extensible with CSV data.

  3. 40 CFR 98.284 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... accounting purposes including direct measurement weighing the petroleum coke fed into your process (by belt... used to ensure the accuracy of monthly petroleum coke consumption measurements. (c) For CO2 process... quality assurance and quality control of the supplier data, you must conduct an annual measurement of the...

  4. Quantum key distribution using card, base station and trusted authority

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nordholt, Jane E.; Hughes, Richard John; Newell, Raymond Thorson

    Techniques and tools for quantum key distribution ("QKD") between a quantum communication ("QC") card, base station and trusted authority are described herein. In example implementations, a QC card contains a miniaturized QC transmitter and couples with a base station. The base station provides a network connection with the trusted authority and can also provide electric power to the QC card. When coupled to the base station, after authentication by the trusted authority, the QC card acquires keys through QKD with a trusted authority. The keys can be used to set up secure communication, for authentication, for access control, or for other purposes. The QC card can be implemented as part of a smart phone or other mobile computing device, or the QC card can be used as a fillgun for distribution of the keys.

  5. Quantum key distribution using card, base station and trusted authority

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nordholt, Jane Elizabeth; Hughes, Richard John; Newell, Raymond Thorson

    Techniques and tools for quantum key distribution ("QKD") between a quantum communication ("QC") card, base station and trusted authority are described herein. In example implementations, a QC card contains a miniaturized QC transmitter and couples with a base station. The base station provides a network connection with the trusted authority and can also provide electric power to the QC card. When coupled to the base station, after authentication by the trusted authority, the QC card acquires keys through QKD with a trusted authority. The keys can be used to set up secure communication, for authentication, for access control, or for other purposes. The QC card can be implemented as part of a smart phone or other mobile computing device, or the QC card can be used as a fillgun for distribution of the keys.

  6. Multileaf collimator performance monitoring and improvement using semiautomated quality control testing and statistical process control

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Létourneau, Daniel, E-mail: daniel.letourneau@rmp.uh.on.ca; McNiven, Andrea; Keller, Harald

    2014-12-15

    Purpose: High-quality radiation therapy using highly conformal dose distributions and image-guided techniques requires optimum machine delivery performance. In this work, a monitoring system for multileaf collimator (MLC) performance, integrating semiautomated MLC quality control (QC) tests and statistical process control tools, was developed. The MLC performance monitoring system was used for almost a year on two commercially available MLC models. Control charts were used to establish MLC performance and to assess the test frequency required to achieve a given level of performance. MLC-related interlocks and servicing events were recorded during the monitoring period and were investigated as indicators of MLC performance variations. Methods: The QC test developed as part of the MLC performance monitoring system uses 2D megavoltage images (acquired using an electronic portal imaging device) of 23 fields to determine the location of the leaves with respect to the radiation isocenter. The precision of the MLC performance monitoring QC test and the MLC itself was assessed by detecting the MLC leaf positions on 127 megavoltage images of a static field. After initial calibration, the MLC performance monitoring QC test was performed 3–4 times/week over a period of 10–11 months to monitor the positional accuracy of individual leaves for two different MLC models. Analysis of test results was performed using individuals control charts per leaf, with control limits computed from the measurements as well as two sets of specifications of ±0.5 and ±1 mm. Out-of-specification and out-of-control leaves were automatically flagged by the monitoring system and reviewed monthly by physicists. MLC-related interlocks reported by the linear accelerator and servicing events were recorded to help identify potential causes of nonrandom MLC leaf positioning variations. Results: The precision of the MLC performance monitoring QC test and the MLC itself was within ±0.22 mm for most MLC leaves, and the majority of the apparent leaf motion was attributed to beam spot displacements between irradiations. The MLC QC test was performed 193 and 162 times over the monitoring period for the studied units, and recalibration had to be repeated up to three times on one of these units. For both units, the rate of MLC interlocks was moderately associated with MLC servicing events. The strongest association with MLC performance was observed between the MLC servicing events and the total number of out-of-control leaves. The average elapsed time for which the number of out-of-specification or out-of-control leaves was within a given performance threshold was computed and used to assess the adequacy of the MLC test frequency. Conclusions: An MLC performance monitoring system has been developed and implemented to acquire high-quality QC data at high frequency, enabled by the relatively short acquisition time for the images and automatic image analysis. The monitoring system was also used to record and track the rate of MLC-related interlocks and servicing events. MLC performance for two commercially available MLC models has been assessed, and the results support a monthly test frequency for the widely accepted ±1 mm specifications. However, higher QC test frequency is required to maintain tighter specifications and in-control behavior.

  7. Applying Sigma Metrics to Reduce Outliers.

    PubMed

    Litten, Joseph

    2017-03-01

    Sigma metrics can be used to predict assay quality, allowing easy comparison of instrument quality and predicting which tests will require minimal quality control (QC) rules to monitor the performance of the method. A Six Sigma QC program can result in fewer controls and fewer QC failures for methods with a sigma metric of 5 or better. The higher the number of methods with a sigma metric of 5 or better, the lower the costs for reagents, supplies, and control material required to monitor the performance of the methods. Copyright © 2016 Elsevier Inc. All rights reserved.
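    The sigma metric itself is a one-line calculation, sigma = (TEa − |bias|) / CV with all terms in percent; the snippet below shows it (the numeric values are hypothetical).

```python
def sigma_metric(tea_pct, bias_pct, cv_pct):
    """Sigma metric: sigma = (TEa - |bias|) / CV, all inputs in percent."""
    return (tea_pct - abs(bias_pct)) / cv_pct

# Example: allowable total error 10%, bias 1.5%, CV 1.6% -> sigma ~ 5.3,
# i.e., a candidate for minimal QC rules under a Six Sigma program.
print(round(sigma_metric(10.0, 1.5, 1.6), 2))
```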

  8. Single-mode, narrow-linewidth external cavity quantum cascade laser through optical feedback from a partial-reflector.

    PubMed

    Cendejas, Richard A; Phillips, Mark C; Myers, Tanya L; Taubman, Matthew S

    2010-12-06

    An external-cavity (EC) quantum cascade (QC) laser using optical feedback from a partial reflector is reported. With this configuration, the otherwise multi-mode emission of a Fabry-Perot QC laser was made single-mode, with optical output powers exceeding 40 mW. A mode-hop-free tuning range of 2.46 cm⁻¹ was achieved by synchronously tuning the EC length and QC laser current. The linewidth of the partial-reflector EC-QC laser was measured for integration times from 100 μs to 4 seconds, and compared to a distributed feedback QC laser. Linewidths as small as 480 kHz were recorded for the EC-QC laser.

  9. Characterisation of imperial college reactor centre legacy waste using gamma-ray spectrometry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shuhaimi, Alif Imran Mohd

    Waste characterisation is a principal component of a waste management strategy. Characterisation includes identification of the chemical, physical and radiochemical parameters of radioactive waste. Failure to determine specific waste properties may result in sentencing waste packages that do not comply with the regulations for long-term storage or disposal. This project involved measurement of the intensity and energy of gamma photons emitted by radioactive waste generated during decommissioning of the Imperial College Reactor Centre (ICRC). The measurements used a high-purity germanium (HPGe) gamma-ray detector and ISOTOPIC-32 V4.1 as the analyser. To ensure that the measurements provide reliable results, two quality control (QC) measurements using different matrices were conducted. The results from the QC measurements were used to determine the accuracy of the ISOTOPIC software.

  10. Effects of Data Quality on the Characterization of Aerosol Properties from Multiple Sensors

    NASA Technical Reports Server (NTRS)

    Petrenko, Maksym; Ichoku, Charles; Leptoukh, Gregory

    2011-01-01

    Cross-comparison of aerosol properties between ground-based and spaceborne measurements is an important validation technique that helps to investigate the uncertainties of aerosol products acquired using spaceborne sensors. However, it has been shown that even minor differences in the cross-characterization procedure may significantly impact the results of such validation. Of particular consideration is the quality assurance/quality control (QA/QC) information - auxiliary data indicating a "confidence" level (e.g., Bad, Fair, Good, Excellent, etc.) conferred by the retrieval algorithms on the produced data. Depending on the treatment of available QA/QC information, a cross-characterization procedure has the potential of filtering out invalid data points, such as uncertain or erroneous retrievals, which tend to reduce the credibility of such comparisons. However, under certain circumstances, even high QA/QC values may not fully guarantee the quality of the data. For example, retrievals in the proximity of a cloud might be particularly perplexing for an aerosol retrieval algorithm, resulting in invalid data that, nonetheless, could be assigned a high QA/QC confidence. In this presentation, we will study the effects of several QA/QC parameters on the cross-characterization of aerosol properties between data acquired by multiple spaceborne sensors. We will utilize the Multi-sensor Aerosol Products Sampling System (MAPSS), which provides a consistent platform for multi-sensor comparison, including collocation with measurements acquired by the ground-based Aerosol Robotic Network (AERONET). The multi-sensor spaceborne data analyzed include those acquired by the Terra-MODIS, Aqua-MODIS, Terra-MISR, Aura-OMI, Parasol-POLDER, and Calipso-CALIOP satellite instruments.

  11. Diffusion imaging quality control via entropy of principal direction distribution.

    PubMed

    Farzinfar, Mahshid; Oguz, Ipek; Smith, Rachel G; Verde, Audrey R; Dietrich, Cheryl; Gupta, Aditya; Escolar, Maria L; Piven, Joseph; Pujol, Sonia; Vachet, Clement; Gouttard, Sylvain; Gerig, Guido; Dager, Stephen; McKinstry, Robert C; Paterson, Sarah; Evans, Alan C; Styner, Martin A

    2013-11-15

    Diffusion MR imaging has received increasing attention in the neuroimaging community, as it yields new insights into the microstructural organization of white matter that are not available with conventional MRI techniques. While the technology has enormous potential, diffusion MRI suffers from a unique and complex set of image quality problems, limiting the sensitivity of studies and reducing the accuracy of findings. Furthermore, the acquisition time for diffusion MRI is longer than for conventional MRI due to the need for multiple acquisitions to obtain directionally encoded Diffusion Weighted Images (DWI). This leads to increased motion artifacts, reduced signal-to-noise ratio (SNR), and increased proneness to a wide variety of artifacts, including eddy-current and motion artifacts, "venetian blind" artifacts, as well as slice-wise and gradient-wise inconsistencies. Such artifacts mandate stringent Quality Control (QC) schemes in the processing of diffusion MRI data. Most existing QC procedures are conducted in the DWI domain and/or on a voxel level, but our own experiments show that these methods often do not fully detect and eliminate certain types of artifacts, which are often only visible when investigating groups of DWIs or a derived diffusion model, such as the most-employed diffusion tensor imaging (DTI). Here, we propose a novel regional QC measure in the DTI domain that employs the entropy of the regional distribution of the principal directions (PD). The PD entropy quantifies the scattering and spread of the principal diffusion directions and is invariant to the patient's position in the scanner. A high entropy value indicates that the PDs are distributed relatively uniformly, while a low entropy value indicates the presence of clusters in the PD distribution. The novel QC measure is intended to complement the existing set of QC procedures by detecting and correcting residual artifacts. Such residual artifacts cause directional bias in the measured PD and are here called dominant direction artifacts. Experiments show that our automatic method can reliably detect and potentially correct such artifacts, especially the ones caused by vibrations of the scanner table during the scan. The results further indicate the usefulness of this method for general quality assessment in DTI studies. Copyright © 2013 Elsevier Inc. All rights reserved.
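    A minimal sketch of the entropy idea follows (Python; this is a simplified binning scheme, not the authors' implementation): fold antipodal principal directions together, histogram them by azimuth and inclination, and take the Shannon entropy of the normalized histogram, so that clustered PDs score low and scattered PDs score high.

```python
import numpy as np

def pd_entropy(directions, n_bins=16):
    """Shannon entropy of a regional principal-direction (PD) distribution.

    directions: (N, 3) array of unit vectors (sign-invariant, as PDs are).
    Simplified sketch: directions are binned by azimuth/inclination and the
    entropy of the normalized histogram is returned.
    """
    d = np.asarray(directions, dtype=float)
    d = d * np.sign(d[:, 2:3] + 1e-12)          # fold antipodal pairs together
    az = np.arctan2(d[:, 1], d[:, 0])            # azimuth in [-pi, pi]
    incl = np.arccos(np.clip(d[:, 2], -1, 1))    # inclination, [0, pi/2]
    hist, _, _ = np.histogram2d(az, incl, bins=n_bins)
    p = hist.ravel() / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log(p)).sum())

# Uniformly scattered PDs give higher entropy than a tight cluster.
rng = np.random.default_rng(0)
u = rng.normal(size=(2000, 3)); u /= np.linalg.norm(u, axis=1, keepdims=True)
c = u * 0.05 + np.array([0, 0, 1.0]); c /= np.linalg.norm(c, axis=1, keepdims=True)
print(pd_entropy(u) > pd_entropy(c))  # True
```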

  12. Diffusion imaging quality control via entropy of principal direction distribution

    PubMed Central

    Oguz, Ipek; Smith, Rachel G.; Verde, Audrey R.; Dietrich, Cheryl; Gupta, Aditya; Escolar, Maria L.; Piven, Joseph; Pujol, Sonia; Vachet, Clement; Gouttard, Sylvain; Gerig, Guido; Dager, Stephen; McKinstry, Robert C.; Paterson, Sarah; Evans, Alan C.; Styner, Martin A.

    2013-01-01

    Diffusion MR imaging has received increasing attention in the neuroimaging community, as it yields new insights into the microstructural organization of white matter that are not available with conventional MRI techniques. While the technology has enormous potential, diffusion MRI suffers from a unique and complex set of image quality problems, limiting the sensitivity of studies and reducing the accuracy of findings. Furthermore, the acquisition time for diffusion MRI is longer than for conventional MRI due to the need for multiple acquisitions to obtain directionally encoded Diffusion Weighted Images (DWI). This leads to increased motion artifacts, reduced signal-to-noise ratio (SNR), and increased proneness to a wide variety of artifacts, including eddy-current and motion artifacts, “venetian blind” artifacts, as well as slice-wise and gradient-wise inconsistencies. Such artifacts mandate stringent Quality Control (QC) schemes in the processing of diffusion MRI data. Most existing QC procedures are conducted in the DWI domain and/or on a voxel level, but our own experiments show that these methods often do not fully detect and eliminate certain types of artifacts, which are often only visible when investigating groups of DWIs or a derived diffusion model, such as the most-employed diffusion tensor imaging (DTI). Here, we propose a novel regional QC measure in the DTI domain that employs the entropy of the regional distribution of the principal directions (PD). The PD entropy quantifies the scattering and spread of the principal diffusion directions and is invariant to the patient's position in the scanner. A high entropy value indicates that the PDs are distributed relatively uniformly, while a low entropy value indicates the presence of clusters in the PD distribution. The novel QC measure is intended to complement the existing set of QC procedures by detecting and correcting residual artifacts. Such residual artifacts cause directional bias in the measured PD and are here called dominant direction artifacts. Experiments show that our automatic method can reliably detect and potentially correct such artifacts, especially the ones caused by vibrations of the scanner table during the scan. The results further indicate the usefulness of this method for general quality assessment in DTI studies. PMID:23684874

  13. Machine-Specific Magnetic Resonance Imaging Quality Control Procedures for Stereotactic Radiosurgery Treatment Planning

    PubMed Central

    Taghizadeh, Somayeh; Yang, Claus Chunli; R. Kanakamedala, Madhava; Morris, Bart; Vijayakumar, Srinivasan

    2017-01-01

    Purpose Magnetic resonance (MR) images are necessary for accurate contouring of intracranial targets, determination of gross target volume and evaluation of organs at risk during stereotactic radiosurgery (SRS) treatment planning procedures. Many centers use magnetic resonance imaging (MRI) simulators or regular diagnostic MRI machines for SRS treatment planning; while both types of machine require two stages of quality control (QC), both machine- and patient-specific, before use for SRS, no accepted guidelines for such QC currently exist. This article describes appropriate machine-specific QC procedures for SRS applications. Methods and materials We describe the adaptation of American College of Radiology (ACR)-recommended QC tests using an ACR MRI phantom for SRS treatment planning. In addition, commercial Quasar MRID3D and Quasar GRID3D phantoms were used to evaluate the effects of static magnetic field (B0) inhomogeneity, gradient nonlinearity, and a Leksell G frame (SRS frame) and its accessories on geometrical distortion in MR images. Results QC procedures found distortions in the X-direction (maximum = 3.5 mm, mean = 0.91 mm, standard deviation = 0.67 mm, >2.5 mm: 2%), the Y-direction (maximum = 2.51 mm, mean = 0.52 mm, standard deviation = 0.39 mm, >2.5 mm: 0%), and the Z-direction (maximum = 13.1 mm, mean = 2.38 mm, standard deviation = 2.45 mm, >2.5 mm: 34%), and <1 mm distortion at a head-sized region of interest. MR images acquired using a Leksell G frame and localization devices showed a mean absolute deviation of 2.3 mm from the isocenter. The results of the modified ACR tests were all within recommended limits, and baseline measurements have been defined for regular weekly QC tests. Conclusions With appropriate QC procedures in place, it is possible to routinely obtain clinically useful MR images suitable for SRS treatment planning purposes. MRI examination for SRS planning can benefit from the improved localization and planning possible with the superior image quality and soft tissue contrast achieved under optimal conditions. PMID:29487771
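    Summary statistics of this form are straightforward to reproduce; the sketch below (hypothetical function and data) reduces a set of per-control-point deviations along one axis to the maximum, mean, standard deviation, and percentage exceeding a 2.5 mm threshold.

```python
import numpy as np

def distortion_stats(dev_mm, threshold_mm=2.5):
    """Summarize geometric-distortion deviations (mm) along one axis:
    max, mean, SD, and percentage beyond a threshold."""
    d = np.abs(np.asarray(dev_mm, dtype=float))
    return {
        "max_mm": float(d.max()),
        "mean_mm": float(d.mean()),
        "sd_mm": float(d.std(ddof=1)),
        "pct_over_threshold": float(100.0 * (d > threshold_mm).mean()),
    }

# Example with hypothetical control-point deviations
print(distortion_stats([0.2, 0.9, 1.4, 2.7, 0.5, 3.1]))
```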

  14. Machine-Specific Magnetic Resonance Imaging Quality Control Procedures for Stereotactic Radiosurgery Treatment Planning.

    PubMed

    Fatemi, Ali; Taghizadeh, Somayeh; Yang, Claus Chunli; R Kanakamedala, Madhava; Morris, Bart; Vijayakumar, Srinivasan

    2017-12-18

    Purpose Magnetic resonance (MR) images are necessary for accurate contouring of intracranial targets, determination of gross target volume and evaluation of organs at risk during stereotactic radiosurgery (SRS) treatment planning procedures. Many centers use magnetic resonance imaging (MRI) simulators or regular diagnostic MRI machines for SRS treatment planning; while both types of machine require two stages of quality control (QC), both machine- and patient-specific, before use for SRS, no accepted guidelines for such QC currently exist. This article describes appropriate machine-specific QC procedures for SRS applications. Methods and materials We describe the adaptation of American College of Radiology (ACR)-recommended QC tests using an ACR MRI phantom for SRS treatment planning. In addition, commercial Quasar MRID3D and Quasar GRID3D phantoms were used to evaluate the effects of static magnetic field (B0) inhomogeneity, gradient nonlinearity, and a Leksell G frame (SRS frame) and its accessories on geometrical distortion in MR images. Results QC procedures found distortions in the X-direction (maximum = 3.5 mm, mean = 0.91 mm, standard deviation = 0.67 mm, >2.5 mm: 2%), the Y-direction (maximum = 2.51 mm, mean = 0.52 mm, standard deviation = 0.39 mm, >2.5 mm: 0%), and the Z-direction (maximum = 13.1 mm, mean = 2.38 mm, standard deviation = 2.45 mm, >2.5 mm: 34%), and <1 mm distortion at a head-sized region of interest. MR images acquired using a Leksell G frame and localization devices showed a mean absolute deviation of 2.3 mm from the isocenter. The results of the modified ACR tests were all within recommended limits, and baseline measurements have been defined for regular weekly QC tests. Conclusions With appropriate QC procedures in place, it is possible to routinely obtain clinically useful MR images suitable for SRS treatment planning purposes. MRI examination for SRS planning can benefit from the improved localization and planning possible with the superior image quality and soft tissue contrast achieved under optimal conditions.

  15. Embankment quality and assessment of moisture control implementation.

    DOT National Transportation Integrated Search

    2016-02-01

    A specification for contractor moisture quality control (QC) in roadway embankment construction has been in use for approximately 10 : years in Iowa on about 190 projects. The use of this QC specification and the development of the soils certificatio...

  16. The effect of genome-wide association scan quality control on imputation outcome for common variants.

    PubMed

    Southam, Lorraine; Panoutsopoulou, Kalliope; Rayner, N William; Chapman, Kay; Durrant, Caroline; Ferreira, Teresa; Arden, Nigel; Carr, Andrew; Deloukas, Panos; Doherty, Michael; Loughlin, John; McCaskie, Andrew; Ollier, William E R; Ralston, Stuart; Spector, Timothy D; Valdes, Ana M; Wallis, Gillian A; Wilkinson, J Mark; Marchini, Jonathan; Zeggini, Eleftheria

    2011-05-01

    Imputation is an extremely valuable tool in conducting and synthesising genome-wide association studies (GWASs). Directly typed SNP quality control (QC) is thought to affect imputation quality. It is, therefore, common practice to use quality-controlled (QCed) data as an input for imputing genotypes. This study aims to determine the effect of commonly applied QC steps on imputation outcomes. We performed several iterations of imputing SNPs across chromosome 22 in a dataset consisting of 3177 samples with Illumina 610k (Illumina, San Diego, CA, USA) GWAS data, applying different QC steps each time. The imputed genotypes were compared with the directly typed genotypes. In addition, we investigated the correlation between alternatively QCed data. We also applied a series of post-imputation QC steps, balancing elimination of poorly imputed SNPs against information loss. We found that the difference between the unQCed data and the fully QCed data in terms of imputation outcome was minimal. Our study shows that imputation of common variants is generally very accurate and robust to GWAS QC, which is not a major factor affecting imputation outcome. A minority of common-frequency SNPs with particular properties cannot be accurately imputed regardless of QC stringency. These findings may not generalise to the imputation of low-frequency and rare variants.
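    The comparison of imputed against directly typed genotypes boils down to a concordance calculation over shared calls; the sketch below illustrates it with genotypes coded 0/1/2 (all names and data are hypothetical, not from the study).

```python
def genotype_concordance(typed, imputed):
    """Concordance between directly typed and imputed genotypes.

    typed/imputed: dicts mapping (snp_id, sample_id) -> genotype coded 0/1/2.
    Only calls present in both are compared; missing calls (None) are skipped.
    """
    shared = [k for k in typed if k in imputed
              and typed[k] is not None and imputed[k] is not None]
    if not shared:
        return float("nan")
    agree = sum(typed[k] == imputed[k] for k in shared)
    return agree / len(shared)

typed = {("rs1", "s1"): 0, ("rs1", "s2"): 1, ("rs2", "s1"): 2}
imputed = {("rs1", "s1"): 0, ("rs1", "s2"): 2, ("rs2", "s1"): 2}
print(genotype_concordance(typed, imputed))  # 2/3 ~ 0.667
```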

  17. Assays for Qualification and Quality Stratification of Clinical Biospecimens Used in Research: A Technical Report from the ISBER Biospecimen Science Working Group.

    PubMed

    Betsou, Fay; Bulla, Alexandre; Cho, Sang Yun; Clements, Judith; Chuaqui, Rodrigo; Coppola, Domenico; De Souza, Yvonne; De Wilde, Annemieke; Grizzle, William; Guadagni, Fiorella; Gunter, Elaine; Heil, Stacey; Hodgkinson, Verity; Kessler, Joseph; Kiehntopf, Michael; Kim, Hee Sung; Koppandi, Iren; Shea, Katheryn; Singh, Rajeev; Sobel, Marc; Somiari, Stella; Spyropoulos, Demetri; Stone, Mars; Tybring, Gunnel; Valyi-Nagy, Klara; Van den Eynden, Gert; Wadhwa, Lalita

    2016-10-01

    This technical report presents quality control (QC) assays that can be performed in order to qualify clinical biospecimens that have been biobanked for use in research. Some QC assays are specific to a disease area. Some QC assays are specific to a particular downstream analytical platform. When such a qualification is not possible, QC assays are presented that can be performed to stratify clinical biospecimens according to their biomolecular quality.

  18. Assays for Qualification and Quality Stratification of Clinical Biospecimens Used in Research: A Technical Report from the ISBER Biospecimen Science Working Group

    PubMed Central

    Bulla, Alexandre; Cho, Sang Yun; Clements, Judith; Chuaqui, Rodrigo; Coppola, Domenico; De Souza, Yvonne; De Wilde, Annemieke; Grizzle, William; Guadagni, Fiorella; Gunter, Elaine; Heil, Stacey; Hodgkinson, Verity; Kessler, Joseph; Kiehntopf, Michael; Kim, Hee Sung; Koppandi, Iren; Shea, Katheryn; Singh, Rajeev; Sobel, Marc; Somiari, Stella; Spyropoulos, Demetri; Stone, Mars; Tybring, Gunnel; Valyi-Nagy, Klara; Van den Eynden, Gert; Wadhwa, Lalita

    2016-01-01

    This technical report presents quality control (QC) assays that can be performed in order to qualify clinical biospecimens that have been biobanked for use in research. Some QC assays are specific to a disease area. Some QC assays are specific to a particular downstream analytical platform. When such a qualification is not possible, QC assays are presented that can be performed to stratify clinical biospecimens according to their biomolecular quality. PMID:27046294

  19. Genome measures used for quality control are dependent on gene function and ancestry.

    PubMed

    Wang, Jing; Raskin, Leon; Samuels, David C; Shyr, Yu; Guo, Yan

    2015-02-01

    The transition/transversion (Ti/Tv) ratio and heterozygous/nonreference-homozygous (het/nonref-hom) ratio are commonly computed in genetic studies as quality control (QC) measurements. Additionally, these two ratios are helpful in our understanding of the patterns of DNA sequence evolution. To thoroughly understand these two genomic measures, we performed a study using 1000 Genomes Project (1000G) released genotype data (N=1092). An additional two datasets (N=581 and N=6) were used to validate our findings from the 1000G dataset. We compared the two ratios across continental ancestry, genome region and gene functionality. We found that the Ti/Tv ratio can be used as a quality indicator for single nucleotide polymorphisms inferred from high-throughput sequencing data. The Ti/Tv ratio varies greatly by genome region and functionality, but not by ancestry. The het/nonref-hom ratio varies greatly by ancestry, but not by genome region and functionality. Furthermore, extreme guanine + cytosine content (either high or low) is negatively associated with the magnitude of the Ti/Tv ratio. Thus, when performing QC assessment using these two measures, care must be taken to apply the correct thresholds based on ancestry and genome region. Failure to take these considerations into account at the QC stage will bias any downstream analysis. Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press. All rights reserved.
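    Both ratios are simple counts; the sketch below (illustrative data) computes the Ti/Tv ratio from (ref, alt) allele pairs, where transitions are A<->G and C<->T, and the het/nonref-hom ratio from genotypes coded 0/1/2.

```python
TRANSITIONS = {("A", "G"), ("G", "A"), ("C", "T"), ("T", "C")}

def titv_ratio(snvs):
    """Transition/transversion ratio from (ref, alt) base pairs."""
    ti = sum(1 for s in snvs if s in TRANSITIONS)
    tv = len(snvs) - ti
    return ti / tv if tv else float("inf")

def het_nonref_hom_ratio(genotypes):
    """Heterozygous / nonreference-homozygous ratio from 0/1/2 genotype codes."""
    het = sum(1 for g in genotypes if g == 1)
    hom = sum(1 for g in genotypes if g == 2)
    return het / hom if hom else float("inf")

print(titv_ratio([("A", "G"), ("C", "T"), ("A", "C")]))   # 2.0
print(het_nonref_hom_ratio([0, 1, 1, 2, 1, 2]))           # 1.5
```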

  20. Quality control of CT systems by automated monitoring of key performance indicators: a two-year study.

    PubMed

    Nowik, Patrik; Bujila, Robert; Poludniowski, Gavin; Fransson, Annette

    2015-07-08

    The purpose of this study was to develop a method for performing routine periodic quality control (QC) of CT systems by automatically analyzing key performance indicators (KPIs) obtainable from images of manufacturers' quality assurance (QA) phantoms. A KPI pertains to a measurable or determinable QC parameter that is influenced by other underlying fundamental QC parameters. The established KPIs are based on relationships between existing QC parameters used in the annual testing program of CT scanners at the Karolinska University Hospital in Stockholm, Sweden. The KPIs include positioning, image noise, uniformity, homogeneity, the CT number of water, and the CT number of air. An application (MonitorCT) was developed to automatically evaluate phantom images in terms of the established KPIs. The methodology has been used for two years in clinical routine: CT technologists perform daily scans of the manufacturer's QA phantom and automatically send the images to MonitorCT for KPI evaluation. In cases where results were out of tolerance, actions could be initiated in less than 10 minutes. In total, 900 QC scans from two CT scanners were collected and analyzed over the two-year period that MonitorCT has been active. Two types of errors were registered in this period: a ring artifact was discovered with the image noise test, and a calibration error was detected multiple times with the CT number test. In both cases, results were outside the tolerances defined for MonitorCT, as well as those defined by the vendor. Automated monitoring of KPIs is a powerful tool that can supplement established QC methodologies. Medical physicists and other professionals concerned with the performance of a CT system will, using such methods, have access to comprehensive data on the current and historical (trend) status of the system, such that swift actions can be taken to ensure the quality of CT examinations, patient safety, and minimal disruption of service.
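    A few of the listed KPIs can be illustrated directly from a water-phantom image; the sketch below (schematic ROI placement and synthetic data, not the MonitorCT implementation) derives the CT number of water, image noise, and uniformity from circular ROIs.

```python
import numpy as np

def water_phantom_kpis(image_hu, center, radius=10):
    """KPIs from a daily water-phantom axial image (2D array of HU values).

    Computes the CT number of water (central ROI mean), image noise
    (central ROI SD), and uniformity (max |peripheral - central| ROI mean).
    ROI placement here is schematic, not a vendor protocol.
    """
    yy, xx = np.indices(image_hu.shape)

    def roi(cy, cx):
        return image_hu[(yy - cy) ** 2 + (xx - cx) ** 2 <= radius ** 2]

    cy, cx = center
    central = roi(cy, cx)
    offsets = [(-40, 0), (40, 0), (0, -40), (0, 40)]  # N/S/W/E peripheral ROIs
    peripheral = [roi(cy + dy, cx + dx).mean() for dy, dx in offsets]
    return {
        "ct_number_water": float(central.mean()),
        "noise_sd": float(central.std(ddof=1)),
        "uniformity": float(max(abs(p - central.mean()) for p in peripheral)),
    }

# Synthetic 128x128 "water" image: mean 0 HU, sigma 5 HU
rng = np.random.default_rng(1)
img = rng.normal(0.0, 5.0, size=(128, 128))
print(water_phantom_kpis(img, center=(64, 64)))
```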

  1. 77 FR 75968 - Submission for OMB Review; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-26

    ... information unless it displays a currently valid OMB control number. Food and Nutrition Service Title: Quality... required to perform Quality Control (QC) review for the Supplemental Nutrition Assistance Program (SNAP). The FNS-380-1, Quality Control Review Schedule is for State use to collect both QC data and case...

  2. A real-time automated quality control of rain gauge data based on multiple sensors

    NASA Astrophysics Data System (ADS)

    Qi, Y.; Zhang, J.

    2013-12-01

    Precipitation is one of the most important meteorological and hydrological variables. Automated rain gauge networks provide direct measurements of precipitation and have been used for numerous applications, such as generating regional and national precipitation maps, calibrating remote sensing data, and validating hydrological and meteorological model predictions. Automated gauge observations are prone to a variety of error sources (instrument malfunction, transmission errors, format changes) and require careful quality control (QC). Many previous gauge QC techniques were based on neighborhood checks within the gauge network itself, and their effectiveness depends on gauge density and precipitation regime. The current study takes advantage of the multi-sensor data sources in the National Mosaic and Multi-Sensor QPE (NMQ/Q2) system and develops an automated gauge QC scheme based on the consistency of radar hourly QPEs and gauge observations. Error characteristics of radar and gauge as a function of the radar sampling geometry, precipitation regimes, and the freezing level height are considered. The new scheme was evaluated by comparing an NMQ national gauge-based precipitation product with independent manual gauge observations. Twelve heavy rainfall events from different seasons and areas of the United States were selected for the evaluation, and the results show that the new NMQ product with QC'ed gauges has a more physically realistic spatial distribution than the old product, and that it agrees much better statistically with the independent gauges.
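    One way to picture a radar-gauge consistency check is the toy filter below; the tolerances and logic are invented for illustration and are not the NMQ/Q2 operational values.

```python
def qc_gauges(gauge_mm, radar_mm, abs_tol=2.0, ratio_tol=3.0):
    """Flag hourly gauge totals inconsistent with collocated radar QPE.

    gauge_mm/radar_mm: dicts of gauge_id -> hourly accumulation (mm).
    A gauge passes if it agrees with radar within an absolute tolerance,
    or within a multiplicative factor for larger amounts. Thresholds are
    illustrative only.
    """
    flags = {}
    for gid, g in gauge_mm.items():
        r = radar_mm.get(gid)
        if r is None:
            flags[gid] = "no_radar"
        elif abs(g - r) <= abs_tol:
            flags[gid] = "pass"
        elif r > 0 and max(g, r) / max(min(g, r), 0.1) <= ratio_tol:
            flags[gid] = "pass"
        else:
            flags[gid] = "suspect"
    return flags

print(qc_gauges({"g1": 10.0, "g2": 0.0, "g3": 25.0},
                {"g1": 12.0, "g2": 0.4, "g3": 4.0}))
# {'g1': 'pass', 'g2': 'pass', 'g3': 'suspect'}
```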

  3. Development of a quality control test procedure for characterizing fracture properties of asphalt mixtures.

    DOT National Transportation Integrated Search

    2011-06-01

    The main objective of this study is to investigate the use of the semi-circular bend (SCB) : test as a quality assurance/quality control (QA/QC) measure for field construction. : Comparison of fracture properties from the SCB test and fatigue beam te...

  4. WE-AB-206-00: Diagnostic QA/QC Hands-On Workshop

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    The involvement of medical physicists in diagnostic ultrasound imaging service is increasing due to QC and accreditation requirements. The goal of this ultrasound hands-on workshop is to demonstrate quality control (QC) testing in diagnostic ultrasound and to provide updates on ACR ultrasound accreditation requirements. The first half of this workshop will include two presentations reviewing diagnostic ultrasound QA/QC and ACR ultrasound accreditation requirements. The second half of the workshop will include live demonstrations of basic QC tests. An array of ultrasound testing phantoms and ultrasound scanners will be available for attendees to learn diagnostic ultrasound QC in a hands-on environment with live demonstrations and on-site instructors. The targeted attendees are medical physicists in diagnostic imaging. Learning Objectives: gain familiarity with common elements of a QA/QC program for diagnostic ultrasound imaging; identify QC tools available for testing diagnostic ultrasound systems and learn how to use these tools; learn ACR ultrasound accreditation requirements. Jennifer Walter is an employee of the American College of Radiology, working on ultrasound accreditation.

  5. Robust modular product family design

    NASA Astrophysics Data System (ADS)

    Jiang, Lan; Allada, Venkat

    2001-10-01

    This paper presents a modified Taguchi methodology to improve the robustness of modular product families against changes in customer requirements. The general research questions posed in this paper are: (1) how can a product family (PF) be designed so that it is robust enough to accommodate future customer requirements, and (2) how far into the future should designers look when designing a robust product family? An example of a simplified vacuum product family is used to illustrate the methodology. In the example, customer requirements are selected as signal factors; future changes in customer requirements are selected as noise factors; an index called the quality characteristic (QC) is defined to evaluate the vacuum product family; and the module instance matrix (M) is selected as the control factor. Initially, a relation between the objective function (QC) and the control factor (M) is established, and then the feasible M space is systematically explored using a simplex method to determine the optimum M and the corresponding QC values. Next, various noise levels at different time points are introduced into the system. For each noise level, the optimal values of M and QC are computed and plotted on a QC-chart. The tunable time period of the control factor (the module matrix, M) is computed using the QC-chart. The tunable time period represents the maximum time for which a given control factor can be used to satisfy current and future customer needs. Finally, a robustness index is used to break the tunable time period into suitable time periods that designers should consider while designing product families.
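    The search described, optimizing a quality characteristic over a control factor with a simplex method and then stressing the optimum against noise, can be caricatured in a few lines. The sketch below is a toy minimax version with a one-dimensional control setting; the QC function, noise model, and all names are invented and merely stand in for the paper's module-matrix formulation.

```python
import numpy as np
from scipy.optimize import minimize

def qc_index(m, noise):
    """Toy quality-characteristic (QC) index for a control setting m
    under a shift 'noise' in customer requirements (illustrative only)."""
    target = 1.0 + noise            # the requirement drifts over time
    return -((m[0] - target) ** 2)  # higher QC = closer to the requirement

def robust_setting(noise_levels):
    # Maximize the worst-case QC across noise levels with a simplex search.
    worst = lambda m: -min(qc_index(m, n) for n in noise_levels)
    res = minimize(worst, x0=[0.0], method="Nelder-Mead")
    return res.x, -res.fun

print(robust_setting([0.0, 0.2, 0.4]))  # settles near the middle requirement
```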

  6. 40 CFR 98.424 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Suppliers of Carbon Dioxide § 98.424 Monitoring and QA/QC... containers shall measure the mass in each CO2 container using weigh bills, scales, or load cells and sum the...

  7. Integrative Blood Pressure Response to Upright Tilt Post Renal Denervation

    PubMed Central

    Howden, Erin J.; East, Cara; Lawley, Justin S.; Stickford, Abigail S.L.; Verhees, Myrthe; Fu, Qi

    2017-01-01

    Abstract BACKGROUND Whether renal denervation (RDN) in patients with resistant hypertension normalizes blood pressure (BP) regulation in response to routine cardiovascular stimuli, such as upright posture, is unknown. We conducted an integrative study of BP regulation in patients with resistant hypertension who had received RDN to characterize autonomic circulatory control. METHODS Twelve patients (60 ± 9 [SD] years, n = 10 males) who participated in the Symplicity HTN-3 trial were studied and compared to two age-matched normotensive (Norm) and hypertensive (unmedicated, HTN) control groups. BP, heart rate (HR), cardiac output (Qc), muscle sympathetic nerve activity (MSNA), and neurohormonal variables were measured supine and during 30° (5 minutes) and 60° (20 minutes) head-up tilt (HUT). Total peripheral resistance (TPR) was calculated from mean arterial pressure and Qc. RESULTS Despite treatment with RDN and 4.8 (range, 3-7) antihypertensive medications, the RDN group had significantly higher supine systolic BP compared to Norm and HTN (149 ± 15 vs. 118 ± 6 and 108 ± 8 mm Hg, P < 0.001). When supine, RDN had higher HR, TPR, MSNA, plasma norepinephrine, and effective arterial elastance compared to Norm. Plasma norepinephrine, Qc, and HR were also higher in RDN vs. HTN. During HUT, BP remained higher in the RDN group, due to increases in Qc, plasma norepinephrine, and aldosterone. CONCLUSION We provide evidence of a possible mechanism by which BP remains elevated post RDN, with the observation of increased Qc and arterial stiffness, as well as plasma norepinephrine and aldosterone levels, at approximately 2 years post treatment. These findings may be the consequence of incomplete ablation of sympathetic renal nerves or may be related to other factors. PMID:28338768
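    The TPR calculation mentioned is the standard hemodynamic quotient, TPR = MAP / Qc; the one-liner below shows it with illustrative values (MAP in mmHg, Qc in L/min, giving mmHg·min/L).

```python
def tpr(map_mmhg, qc_l_min):
    """Total peripheral resistance from mean arterial pressure and
    cardiac output: TPR = MAP / Qc, in mmHg*min/L."""
    return map_mmhg / qc_l_min

# Example: MAP 107 mmHg, Qc 5.5 L/min -> ~19.5 mmHg*min/L
print(round(tpr(107, 5.5), 1))
```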

  8. WE-AB-206-03: Workshop

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lu, Z.

    The involvement of medical physicists in diagnostic ultrasound imaging service is increasing due to QC and accreditation requirements. The goal of this ultrasound hands-on workshop is to demonstrate quality control (QC) testing in diagnostic ultrasound and to provide updates on ACR ultrasound accreditation requirements. The first half of this workshop will include two presentations reviewing diagnostic ultrasound QA/QC and ACR ultrasound accreditation requirements. The second half of the workshop will include live demonstrations of basic QC tests. An array of ultrasound testing phantoms and ultrasound scanners will be available for attendees to learn diagnostic ultrasound QC in a hands-on environment with live demonstrations and on-site instructors. The targeted attendees are medical physicists in diagnostic imaging. Learning Objectives: gain familiarity with common elements of a QA/QC program for diagnostic ultrasound imaging; identify QC tools available for testing diagnostic ultrasound systems and learn how to use these tools; learn ACR ultrasound accreditation requirements. Jennifer Walter is an employee of the American College of Radiology, working on ultrasound accreditation.

  9. WE-AB-206-01: Diagnostic Ultrasound Imaging Quality Assurance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zagzebski, J.

    The involvement of medical physicists in diagnostic ultrasound imaging service is increasing due to QC and accreditation requirements. The goal of this ultrasound hands-on workshop is to demonstrate quality control (QC) testing in diagnostic ultrasound and to provide updates on ACR ultrasound accreditation requirements. The first half of this workshop will include two presentations reviewing diagnostic ultrasound QA/QC and ACR ultrasound accreditation requirements. The second half of the workshop will include live demonstrations of basic QC tests. An array of ultrasound testing phantoms and ultrasound scanners will be available for attendees to learn diagnostic ultrasound QC in a hands-on environment with live demonstrations and on-site instructors. The targeted attendees are medical physicists in diagnostic imaging. Learning Objectives: gain familiarity with common elements of a QA/QC program for diagnostic ultrasound imaging; identify QC tools available for testing diagnostic ultrasound systems and learn how to use these tools; learn ACR ultrasound accreditation requirements. Jennifer Walter is an employee of the American College of Radiology, working on ultrasound accreditation.

  10. Ensuring the reliability of stable isotope ratio data--beyond the principle of identical treatment.

    PubMed

    Carter, J F; Fry, B

    2013-03-01

    The need for inter-laboratory comparability is crucial to facilitate the globalisation of scientific networks and the development of international databases to support scientific and criminal investigations. This article considers what lessons can be learned from a series of inter-laboratory comparison exercises organised by the Forensic Isotope Ratio Mass Spectrometry (FIRMS) network in terms of reference materials (RMs), the management of data quality, and technical limitations. The results showed that within-laboratory precision (repeatability) was generally good but between-laboratory accuracy (reproducibility) called for improvements. This review considers how stable isotope laboratories can establish a system of quality control (QC) and quality assurance (QA), emphasising issues of repeatability and reproducibility. For results to be comparable between laboratories, measurements must be traceable to the international δ-scales and, because isotope ratio measurements are reported relative to standards, a key aspect is the correct selection, calibration, and use of international and in-house RMs. The authors identify four principles which promote good laboratory practice. The principle of identical treatment by which samples and RMs are processed in an identical manner and which incorporates three further principles; the principle of identical correction (by which necessary corrections are identified and evenly applied), the principle of identical scaling (by which data are shifted and stretched to the international δ-scales), and the principle of error detection by which QC and QA results are monitored and acted upon. To achieve both good repeatability and good reproducibility it is essential to obtain RMs with internationally agreed δ-values. These RMs will act as the basis for QC and can be used to calibrate further in-house QC RMs tailored to the activities of specific laboratories. In-house QA standards must also be developed to ensure that QC-based calibrations and corrections lead to accurate results for samples. The δ-values assigned to RMs must be recorded and reported with all data. Reference materials must be used to determine what corrections are necessary for measured data. Each analytical sequence of samples must include both QC and QA materials which are subject to identical treatment during measurement and data processing. Results for these materials must be plotted, monitored, and acted upon. Periodically international RMs should be analysed as an in-house proficiency test to demonstrate results are accurate.
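
    The "shift and stretch" step (the principle of identical scaling) is, at its simplest, a two-point linear mapping between the measured and internationally agreed δ-values of two RMs. A minimal Python sketch with hypothetical numbers, not a reproduction of any laboratory's procedure:

        def normalize_delta(measured, rm1_meas, rm1_true, rm2_meas, rm2_true):
            """Linearly map a measured delta value onto the international scale."""
            slope = (rm2_true - rm1_true) / (rm2_meas - rm1_meas)
            return rm1_true + slope * (measured - rm1_meas)

        # Example: a two-point calibration in the style of VSMOW/SLAP scaling,
        # with made-up raw instrument values.
        print(normalize_delta(measured=-25.3,
                              rm1_meas=0.4, rm1_true=0.0,
                              rm2_meas=-54.1, rm2_true=-55.5))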

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Angers, Crystal Plume; Bottema, Ryan; Buckley, Les

    Purpose: Treatment unit uptime statistics are typically used to monitor radiation equipment performance. The Ottawa Hospital Cancer Centre has introduced the use of Quality Control (QC) test success as a quality indicator for equipment performance and the overall health of the equipment QC program. Methods: Implemented in 2012, QATrack+ is used to record and monitor over 1100 routine machine QC tests each month for 20 treatment and imaging units ( http://qatrackplus.com/ ). Using an SQL (structured query language) script, automated queries of the QATrack+ database are used to generate program metrics such as the number of QC tests executed and the percentage of tests passing, at tolerance, or at action. These metrics are compared against machine uptime statistics already reported within the program. Results: Program metrics for 2015 show good correlation between the pass rate of QC tests and uptime for a given machine. For the nine conventional linacs, the QC test success rate was consistently greater than 97%. The corresponding uptimes for these units are better than 98%. Machines that consistently show higher failure or tolerance rates in the QC tests have lower uptimes. This points to either poor machine performance requiring corrective action or to problems with the QC program. Conclusions: QATrack+ significantly improves the organization of QC data but can also aid in overall equipment management. Complementing machine uptime statistics with QC test metrics provides a more complete picture of overall machine performance and can be used to identify areas of improvement in the machine service and QC programs.
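
    As an illustration of the kind of metric described, the sketch below computes pass/tolerance/action percentages from a list of test outcomes; the record format is hypothetical and does not reproduce the actual QATrack+ database schema or the authors' SQL.

        from collections import Counter

        def qc_metrics(test_results):
            """test_results: iterable of outcome strings 'ok', 'tolerance' or 'action'."""
            counts = Counter(test_results)
            total = sum(counts.values())
            return {s: 100.0 * counts[s] / total for s in ("ok", "tolerance", "action")}

        print(qc_metrics(["ok"] * 97 + ["tolerance"] * 2 + ["action"]))
        # -> {'ok': 97.0, 'tolerance': 2.0, 'action': 1.0}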

  12. Quality control in urodynamics and the role of software support in the QC procedure.

    PubMed

    Hogan, S; Jarvis, P; Gammie, A; Abrams, P

    2011-11-01

    This article aims to identify quality control (QC) best practice, to review published QC audits in order to identify how closely good practice is followed, and to carry out a market survey of the software features that support QC offered by urodynamics machines available in the UK. All UK distributors of urodynamic systems were contacted and asked to provide information on the software features relating to data quality of the products they supply. The results of the market survey show that the features offered by manufacturers differ greatly. Automated features, which can be turned off in most cases, include: cough recognition, detrusor contraction detection, and high pressure alerts. There are currently no systems that assess data quality based on published guidelines. A literature review of current QC guidelines for urodynamics was carried out; QC audits were included in the literature review to see how closely guidelines were being followed. This review highlights the fact that basic QC is not being carried out effectively by urodynamicists. Based on the software features currently available and the results of the literature review there is both the need and capacity for a greater degree of automation in relation to urodynamic data quality and accuracy assessment. Some progress has been made in this area and certain manufacturers have already developed automated cough detection. Copyright © 2011 Wiley Periodicals, Inc.

  13. 40 CFR 98.74 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Relative Molecular Mass of Petroleum Oils from Viscosity Measurements (incorporated by reference, see § 98... Weight) of Hydrocarbons by Thermoelectric Measurement of Vapor Pressure (incorporated by reference, see... measurements according to the monitoring and QA/QC requirements for the Tier 3 methodology in § 98.34(b). (e...

  14. Quality control management and communication between radiologists and technologists.

    PubMed

    Nagy, Paul G; Pierce, Benjamin; Otto, Misty; Safdar, Nabile M

    2008-06-01

    The greatest barrier to quality control (QC) in the digital imaging environment is the lack of communication and documentation between those who interpret images and those who acquire them. Paper-based QC methods are insufficient in a digital image management system. Problem work flow must be incorporated into reengineering efforts when migrating to a digital practice. The authors implemented a Web-based QC feedback tool to document and facilitate the communication of issues identified by radiologists. The goal was to promote a responsive and constructive tool that contributes to a culture of quality. The hypothesis was that by making it easier for radiologists to submit quality issues, the number of QC issues submitted would increase. The authors integrated their Web-based quality tracking system with a clinical picture archiving and communication system so that radiologists could report quality issues without disrupting clinical work flow. Graphical dashboarding techniques aid supervisors in using this database to identify the root causes of different types of issues. Over the initial 12-month rollout period, starting in the general section, the authors recorded 20 times more QC issues submitted by radiologists, accompanied by a rise in technologists' responsiveness to QC issues. For technologists with high numbers of QC issues, the incorporation of data from this tracking system proved useful in performance appraisals and in driving individual improvement. This tool is an example of the types of information technology innovations that can be leveraged to support QC in the digital imaging environment. Initial data suggest that the result is not only an improvement in quality but higher levels of satisfaction for both radiologists and technologists.

  15. THE MAQC PROJECT: ESTABLISHING QC METRICS AND THRESHOLDS FOR MICROARRAY QUALITY CONTROL

    EPA Science Inventory

    Microarrays represent a core technology in pharmacogenomics and toxicogenomics; however, before this technology can successfully and reliably be applied in clinical practice and regulatory decision-making, standards and quality measures need to be developed. The Microarray Qualit...

  16. Results-driven approach to improving quality and productivity

    Treesearch

    John Dramm

    2000-01-01

    Quality control (QC) programs do not often realize their full potential. Elaborate and expensive QC programs can easily get sidetracked by the process of building a program with promises of "Someday, this will all pay off." Training employees in QC methods is no guarantee that quality will improve. Several documented cases show that such activity-centered efforts...

  17. Building a QC Database of Meteorological Data From NASA KSC and the United States Air Force's Eastern Range

    NASA Technical Reports Server (NTRS)

    Brenton, James C.; Barbre, Robert E.; Orcutt, John M.; Decker, Ryan K.

    2018-01-01

    The National Aeronautics and Space Administration's (NASA) Marshall Space Flight Center (MSFC) Natural Environments Branch (EV44) has provided atmospheric databases and analyses in support of space vehicle design and day-of-launch operations for NASA and commercial launch vehicle programs launching from the NASA Kennedy Space Center (KSC), co-located on the United States Air Force's Eastern Range (ER) at the Cape Canaveral Air Force Station. The ER is one of the most heavily instrumented sites in the United States, measuring various atmospheric parameters on a continuous basis. An inherent challenge with the large databases that EV44 receives from the ER is ensuring that erroneous data are removed, and thus excluded from launch vehicle design analyses. EV44 has put forth great effort in developing quality control (QC) procedures for individual meteorological instruments; however, no standard QC procedures for all databases currently exist, resulting in QC databases that have inconsistencies in variables, methodologies, and periods of record. The goal of this activity is to use the previous efforts by EV44 to develop a standardized set of QC procedures from which to build flags within the meteorological databases from KSC and the ER, while maintaining open communication with end users from the launch community to develop ways to improve, adapt, and grow the QC database. Details of the QC checks are described. The flagged data points will be plotted in a graphical user interface (GUI) as part of a manual confirmation that the flagged data do indeed need to be removed from the archive. As the rate of launches increases with additional launch vehicle programs, more emphasis is being placed on continually updating and checking weather databases for data quality before use in launch vehicle design and certification analyses.

  18. Individualized Quality Control Plan (IQCP): Is It Value-Added for Clinical Microbiology?

    PubMed

    Sharp, Susan E; Miller, Melissa B; Hindler, Janet

    2015-12-01

    The Center for Medicaid and Medicare Services (CMS) recently published their Individualized Quality Control Plan (IQCP [https://www.cms.gov/regulations-and-guidance/legislation/CLIA/Individualized_Quality_Control_Plan_IQCP.html]), which will be the only option for quality control (QC) starting in January 2016 if laboratories choose not to perform Clinical Laboratory Improvement Act (CLIA) [U.S. Statutes at Large 81(1967):533] default QC. Laboratories will no longer be able to use "equivalent QC" (EQC) or the Clinical and Laboratory Standards Institute (CLSI) standards alone for quality control of their microbiology systems. The implementation of IQCP in clinical microbiology laboratories will most certainly be an added burden, the benefits of which are currently unknown. Copyright © 2015, American Society for Microbiology. All Rights Reserved.

  19. Measurement of the quantum capacitance from two-dimensional surface state of a topological insulator at room temperature

    NASA Astrophysics Data System (ADS)

    Choi, Hyunwoo; Kim, Tae Geun; Shin, Changhwan

    2017-06-01

    A topological insulator (TI) is a new kind of material that exhibits unique electronic properties owing to its topological surface state (TSS). Previous studies focused on the transport properties of the TSS, since it can be used as the active channel layer in metal-oxide-semiconductor field-effect transistors (MOSFETs). However, a TI with a negative quantum capacitance (QC) effect can be used in the gate stack of MOSFETs, thereby facilitating the creation of ultra-low-power electronics. Therefore, it is important to study the physics behind the QC in TIs in the absence of any external magnetic field, at room temperature. We fabricated a simple capacitor structure using a TI (TI-capacitor: Au-TI-SiO2-Si), which shows clear evidence of QC at room temperature. In the capacitance-voltage (C-V) measurement, the total capacitance of the TI-capacitor increases in the accumulation regime, since QC is the dominant capacitive component in the series capacitor model (i.e., C_T^-1 = C_Q^-1 + C_SiO2^-1). Based on the QC model of two-dimensional electron systems, we quantitatively calculated the QC, and observed that the simulated C-V curve theoretically supports the conclusion that the QC of the TI-capacitor originates from electron-electron interaction in the two-dimensional surface state of the TI.
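
    The series-capacitor relation quoted above is easy to check numerically. The sketch below uses hypothetical capacitance values, not figures from the paper; it only illustrates why a negative C_Q can raise the total above the oxide capacitance alone.

        def series_capacitance(c_q, c_ox):
            """Total capacitance of two capacitors in series: 1/C_T = 1/C_Q + 1/C_ox."""
            return 1.0 / (1.0 / c_q + 1.0 / c_ox)

        # With a negative quantum capacitance, the total exceeds the oxide
        # capacitance alone -- the effect exploited for low-power gate stacks.
        print(series_capacitance(c_q=-5.0, c_ox=2.0))  # 3.33... > 2.0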

  20. WE-AB-206-02: ACR Ultrasound Accreditation: Requirements and Pitfalls

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Walter, J.

    The involvement of medical physicists in diagnostic ultrasound imaging service is increasing due to QC and accreditation requirements. The goal of this ultrasound hands-on workshop is to demonstrate quality control (QC) testing in diagnostic ultrasound and to provide updates in ACR ultrasound accreditation requirements. The first half of this workshop will include two presentations reviewing diagnostic ultrasound QA/QC and ACR ultrasound accreditation requirements. The second half of the workshop will include live demonstrations of basic QC tests. An array of ultrasound testing phantoms and ultrasound scanners will be available for attendees to learn diagnostic ultrasound QC in a hands-on environmentmore » with live demonstrations and on-site instructors. The targeted attendees are medical physicists in diagnostic imaging. Learning Objectives: Gain familiarity with common elements of a QA/QC program for diagnostic ultrasound imaging dentify QC tools available for testing diagnostic ultrasound systems and learn how to use these tools Learn ACR ultrasound accreditation requirements Jennifer Walter is an employee of American College of Radiology on Ultrasound Accreditation.« less

  1. [Does implementation of benchmarking in quality circles improve the quality of care of patients with asthma and reduce drug interaction?].

    PubMed

    Kaufmann-Kolle, Petra; Szecsenyi, Joachim; Broge, Björn; Haefeli, Walter Emil; Schneider, Antonius

    2011-01-01

    The purpose of this cluster-randomised controlled trial was to evaluate the efficacy of quality circles (QCs) working either with general data-based feedback or with an open benchmark within the field of asthma care and drug-drug interactions. Twelve QCs, involving 96 general practitioners from 85 practices, were randomised. Six QCs worked with traditional anonymous feedback and six with an open benchmark. Two QC meetings supported with feedback reports were held covering the topics "drug-drug interactions" and "asthma"; in both cases discussions were guided by a trained moderator. Outcome measures included health-related quality of life and patient satisfaction with treatment, asthma severity, and the number of potentially inappropriate drug combinations, as well as the general practitioners' satisfaction with the performance of the QC. A significant improvement in the treatment of asthma was observed in both trial arms. However, there was only a slight improvement regarding inappropriate drug combinations. There were no relevant differences between the open-benchmark quality circles (B-QCs) and the traditional quality circles (T-QCs). The physicians' satisfaction with the QC performance was significantly higher in the T-QCs. General practitioners seem to take a critical view of open benchmarking in quality circles. Caution should be used when implementing benchmarking in a quality circle, as it did not improve healthcare when compared to the traditional procedure with anonymised comparisons. Copyright © 2011. Published by Elsevier GmbH.

  2. Operational quality control of daily precipitation using spatio-climatological consistency testing

    NASA Astrophysics Data System (ADS)

    Scherrer, S. C.; Croci-Maspoli, M.; van Geijtenbeek, D.; Naguel, C.; Appenzeller, C.

    2010-09-01

    Quality control (QC) of meteorological data is of utmost importance for climate-related decisions. The search for an effective automated QC of precipitation data has proven difficult, and many weather services, including MeteoSwiss, still rely mainly on manual inspection of daily precipitation. However, manpower limitations force many weather services to move towards less labour-intensive and more automated QC, with the challenge of keeping data quality high. In the last decade, several approaches have been presented to objectify daily precipitation QC. Here we present a spatio-climatological approach that will be implemented operationally at MeteoSwiss. It combines the information from the event-based spatial distribution of each day's precipitation field and the historical information on the interpolation error using different precipitation intensity intervals. Expert judgement shows that the system is able to detect potential outliers very well (hardly any missed errors) without creating too many false alarms that need human inspection. 50-80% of all flagged values have been classified as real errors by the data editor. This is much better than the roughly 15-20% using standard spatial regression tests. Very helpful in the QC process is the automatic redistribution of accumulated several-day sums. Manual inspection in operations can be reduced and the QC of precipitation objectified substantially.

  3. Comparison of techniques that use the single scattering model to compute the quality factor Q from coda waves

    USGS Publications Warehouse

    Novelo-Casanova, D. A.; Lee, W.H.K.

    1991-01-01

    Using simulated coda waves, the resolution of the single-scattering model to extract coda Q (Qc) and its power-law frequency dependence was tested. The back-scattering model of Aki and Chouet (1975) and the single isotropic-scattering model of Sato (1977) were examined. The results indicate that: (1) the input Qc models are reasonably well approximated by the two methods; (2) almost equal Qc values are recovered when the techniques sample the same coda windows; (3) low Qc models are well estimated in the frequency domain from the early and late parts of the coda; and (4) models with high Qc values are more accurately extracted from late coda measurements. © 1991 Birkhäuser Verlag.
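
    For readers unfamiliar with the back-scattering model, Qc is typically extracted by fitting the coda envelope A(t|f) ∝ t^-1 exp(-πft/Qc), so ln(A·t) is linear in lapse time t with slope -πf/Qc. A minimal Python sketch on noise-free synthetic data (not the simulations used in the paper):

        import numpy as np

        def estimate_coda_q(t, amp, freq):
            """Fit ln(amp * t) against t; the slope is -pi*freq/Qc."""
            slope, _ = np.polyfit(t, np.log(amp * t), 1)
            return -np.pi * freq / slope

        f, q_true = 6.0, 200.0
        t = np.linspace(20.0, 60.0, 200)                   # lapse time (s)
        amp = (1.0 / t) * np.exp(-np.pi * f * t / q_true)  # synthetic coda envelope
        print(f"Recovered Qc = {estimate_coda_q(t, amp, f):.0f}")  # ~200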

  4. NDA Batch 2002-13

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hollister, R

    QC sample results (daily background check drum and 100-gram SGS check drum) were within acceptance criteria established by WIPP's Quality Assurance Objectives for TRU Waste Characterization. Replicate runs were performed on drum LL85501243TRU. Replicate measurement results are identical at the 95% confidence level as established by WIPP criteria. HWM NCAR No. 02-1000168 was issued on 17-Oct-2002 regarding a partially dislodged Cd sheet filter on the HPGe coaxial detector. This physical geometry occurred on 01-Oct-2002 and was not corrected until 10-Oct-2002, a period that includes the present batch run of drums. Per discussions among the Independent Technical Reviewer, Expert Reviewer, and the Technical QA Supervisor, as well as in consultation with John Fleissner, Technical Point of Contact from Canberra, the analytical results are technically reliable. All QC standard runs during this period were in control. The data packet for SGS Batch 2002-13, generated using passive gamma-ray spectroscopy with the Pu Facility SGS unit, is technically reasonable. All QC samples are in compliance with established control limits. The batch data packet has been reviewed for correctness, completeness, consistency, and compliance with WIPP's Quality Assurance Objectives and determined to be acceptable.

  5. The Quality Control Circle: Is It for Education?

    ERIC Educational Resources Information Center

    Land, Arthur J.

    From its start in Japan after World War II, the Quality Control Circle (Q.C.) approach to management and organizational operation evolved into what it is today: people doing similar work meeting regularly to identify, objectively analyze, and develop solutions to problems. The Q.C. approach meets Maslow's theory of motivation by inviting…

  6. Analytical approaches to quality assurance and quality control in rangeland monitoring data

    USDA-ARS?s Scientific Manuscript database

    Producing quality data to support land management decisions is the goal of every rangeland monitoring program. However, the results of quality assurance (QA) and quality control (QC) efforts to improve data quality are rarely reported. The purpose of QA and QC is to prevent and describe non-sampling...

  7. qcML: An Exchange Format for Quality Control Metrics from Mass Spectrometry Experiments*

    PubMed Central

    Walzer, Mathias; Pernas, Lucia Espona; Nasso, Sara; Bittremieux, Wout; Nahnsen, Sven; Kelchtermans, Pieter; Pichler, Peter; van den Toorn, Henk W. P.; Staes, An; Vandenbussche, Jonathan; Mazanek, Michael; Taus, Thomas; Scheltema, Richard A.; Kelstrup, Christian D.; Gatto, Laurent; van Breukelen, Bas; Aiche, Stephan; Valkenborg, Dirk; Laukens, Kris; Lilley, Kathryn S.; Olsen, Jesper V.; Heck, Albert J. R.; Mechtler, Karl; Aebersold, Ruedi; Gevaert, Kris; Vizcaíno, Juan Antonio; Hermjakob, Henning; Kohlbacher, Oliver; Martens, Lennart

    2014-01-01

    Quality control is increasingly recognized as a crucial aspect of mass spectrometry based proteomics. Several recent papers discuss relevant parameters for quality control and present applications to extract these from the instrumental raw data. What has been missing, however, is a standard data exchange format for reporting these performance metrics. We therefore developed the qcML format, an XML-based standard that follows the design principles of the related mzML, mzIdentML, mzQuantML, and TraML standards from the HUPO-PSI (Proteomics Standards Initiative). In addition to the XML format, we also provide tools for the calculation of a wide range of quality metrics as well as a database format and interconversion tools, so that existing LIMS systems can easily add relational storage of the quality control data to their existing schema. We here describe the qcML specification, along with possible use cases and an illustrative example of the subsequent analysis possibilities. All information about qcML is available at http://code.google.com/p/qcml. PMID:24760958

  8. qcML: an exchange format for quality control metrics from mass spectrometry experiments.

    PubMed

    Walzer, Mathias; Pernas, Lucia Espona; Nasso, Sara; Bittremieux, Wout; Nahnsen, Sven; Kelchtermans, Pieter; Pichler, Peter; van den Toorn, Henk W P; Staes, An; Vandenbussche, Jonathan; Mazanek, Michael; Taus, Thomas; Scheltema, Richard A; Kelstrup, Christian D; Gatto, Laurent; van Breukelen, Bas; Aiche, Stephan; Valkenborg, Dirk; Laukens, Kris; Lilley, Kathryn S; Olsen, Jesper V; Heck, Albert J R; Mechtler, Karl; Aebersold, Ruedi; Gevaert, Kris; Vizcaíno, Juan Antonio; Hermjakob, Henning; Kohlbacher, Oliver; Martens, Lennart

    2014-08-01

    Quality control is increasingly recognized as a crucial aspect of mass spectrometry based proteomics. Several recent papers discuss relevant parameters for quality control and present applications to extract these from the instrumental raw data. What has been missing, however, is a standard data exchange format for reporting these performance metrics. We therefore developed the qcML format, an XML-based standard that follows the design principles of the related mzML, mzIdentML, mzQuantML, and TraML standards from the HUPO-PSI (Proteomics Standards Initiative). In addition to the XML format, we also provide tools for the calculation of a wide range of quality metrics as well as a database format and interconversion tools, so that existing LIMS systems can easily add relational storage of the quality control data to their existing schema. We here describe the qcML specification, along with possible use cases and an illustrative example of the subsequent analysis possibilities. All information about qcML is available at http://code.google.com/p/qcml. © 2014 by The American Society for Biochemistry and Molecular Biology, Inc.

  9. Quality Control of Meteorological Observations

    NASA Technical Reports Server (NTRS)

    Collins, William; Dee, Dick; Rukhovets, Leonid

    1999-01-01

    The problem of meteorological observation quality control (QC) was first formulated by L.S. Gandin at the Main Geophysical Observatory in the 1970s. Later, in 1988, L.S. Gandin began adapting his ideas on complex quality control (CQC) to the operational environment at the National Centers for Environmental Prediction. The CQC was first applied by L.S. Gandin and his colleagues to the detection and correction of errors in rawinsonde heights and temperatures using a complex of hydrostatic residuals. Later, a full complex of residuals, vertical and horizontal optimal interpolations, and baseline checks were added for the checking and correction of a wide range of meteorological variables. Some of Gandin's other ideas were applied and substantially developed at other meteorological centers. A new statistical QC was recently implemented in the Goddard Data Assimilation System. The central component of any quality control is a buddy check, which is a test of individual suspect observations against available nearby non-suspect observations. A novel feature of this test is that the error variances used for the QC decision are re-estimated on-line. As a result, the allowed tolerances for suspect observations can depend on local atmospheric conditions. The system is then better able to accept extreme values observed in deep cyclones, jet streams, and so on. The basic statements of this adaptive buddy check are described. Some results of the on-line QC, including moisture QC, are presented.
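
    A buddy check in its simplest form compares a suspect observation with nearby non-suspect values, with the tolerance scaled by an error variance that, in the adaptive scheme described, would be re-estimated on-line. The sketch below is illustrative only; the threshold and the fixed sigma are assumptions, not the operational settings.

        import statistics

        def buddy_check(suspect, neighbours, sigma, k=3.0):
            """Accept the suspect value if it lies within k*sigma of the neighbour mean."""
            return abs(suspect - statistics.fmean(neighbours)) <= k * sigma

        print(buddy_check(suspect=12.4, neighbours=[10.1, 9.8, 10.5], sigma=0.7))  # False -> flag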

  10. Coda Q and its Frequency Dependence in the Eastern Himalayan and Indo-Burman Plate Boundary Systems

    NASA Astrophysics Data System (ADS)

    Mitra, S.; Kumar, A.

    2015-12-01

    We use broadband waveform data for 305 local earthquakes from the Eastern Himalayan and Indo-Burman plate boundary systems to model the seismic attenuation in NE India. We measure the decay in amplitude of coda waves at discrete frequencies (between 1 and 12 Hz) to evaluate the quality factor (Qc) as a function of frequency. We combine these measurements to evaluate the frequency dependence of Qc of the form Qc(f) = Qo f^η, where Qo is the quality factor at 1 Hz and η is the frequency dependence. Computed Qo values range from 80-360 and η ranges from 0.85-1.45. To study the lateral variation in Qo and η, we regionalise Qc by combining all source-receiver measurements using a back-projection algorithm. For a single back-scatter model, the coda waves sample an elliptical area with the epicenter and receiver at the two foci. We parameterize the region using square grids. The algorithm calculates the overlap in area and distributes Qc in the sampled grids using the average Qc as the boundary value. This is done in an iterative manner, by minimising the misfit between the observed and computed Qc within each grid. This process is repeated for all frequencies, and η is computed for each grid by combining Qc for all frequencies. Our results reveal strong variation in Qo and η across NE India. The highest Qo values are in the Bengal Basin (210-280) and the Indo-Burman subduction zone (300-360). The Shillong Plateau and Mikir Hills have intermediate Qo (~160), and the lowest Qo (~80) is observed in the Naga fold-thrust belt. This variation in Qo demarcates the boundary between the continental crust beneath the Shillong Plateau and Mikir Hills and the transitional to oceanic crust beneath the Bengal Basin and Indo-Burman subduction zone. The thick pile of sedimentary strata in the Naga fold-thrust belt results in the low Qo. The frequency dependence (η) of Qc across NE India is observed to be very high, with regions of high Qo being associated with relatively higher η.
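
    Because Qc(f) = Qo f^η is a power law, Qo and η follow from a straight-line fit in log-log space. A minimal Python sketch on synthetic values (not the study's measurements):

        import numpy as np

        f = np.array([1.0, 2.0, 4.0, 8.0, 12.0])   # centre frequencies (Hz)
        qc = 150.0 * f ** 1.1                      # synthetic Qc measurements
        eta, log_q0 = np.polyfit(np.log(f), np.log(qc), 1)
        print(f"Qo = {np.exp(log_q0):.0f}, eta = {eta:.2f}")  # ~150 and ~1.10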

  11. Quality control of CT systems by automated monitoring of key performance indicators: a two‐year study

    PubMed Central

    Bujila, Robert; Poludniowski, Gavin; Fransson, Annette

    2015-01-01

    The purpose of this study was to develop a method of performing routine periodic quality control (QC) of CT systems by automatically analyzing key performance indicators (KPIs) obtainable from images of manufacturers' quality assurance (QA) phantoms. A KPI pertains to a measurable or determinable QC parameter that is influenced by other underlying fundamental QC parameters. The established KPIs are based on relationships between existing QC parameters used in the annual testing program of CT scanners at the Karolinska University Hospital in Stockholm, Sweden. The KPIs include positioning, image noise, uniformity, homogeneity, the CT number of water, and the CT number of air. An application (MonitorCT) was developed to automatically evaluate phantom images in terms of the established KPIs. The developed methodology has been used for two years in clinical routine, where CT technologists perform daily scans of the manufacturer's QA phantom and automatically send the images to MonitorCT for KPI evaluation. In cases where results were out of tolerance, actions could be initiated in less than 10 min. 900 QC scans from two CT scanners have been collected and analyzed over the two-year period that MonitorCT has been active. Two types of errors were registered in this period: a ring artifact was discovered with the image noise test, and a calibration error was detected multiple times with the CT number test. In both cases, results were outside the tolerances defined for MonitorCT, as well as by the vendor. Automated monitoring of KPIs is a powerful tool that can be used to supplement established QC methodologies. Medical physicists and other professionals concerned with the performance of a CT system will, using such methods, have access to comprehensive data on the current and historical (trend) status of the system, such that swift actions can be taken to ensure the quality of CT examinations, patient safety, and minimal disruption of service. PACS numbers: 87.57.C‐, 87.57.N‐, 87.57.Q‐ PMID:26219012
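
    A minimal sketch of the kind of automated KPI checking described, with hypothetical KPI names and tolerance bands rather than the Karolinska values:

        # Hypothetical KPI names and tolerance bands (HU = Hounsfield units).
        TOLERANCES = {
            "ct_number_water_hu": (-4.0, 4.0),
            "image_noise_hu": (0.0, 6.0),
            "uniformity_hu": (-5.0, 5.0),
        }

        def check_kpis(measurements):
            """Return the KPIs whose values fall outside their tolerance band."""
            return [name for name, value in measurements.items()
                    if not TOLERANCES[name][0] <= value <= TOLERANCES[name][1]]

        print(check_kpis({"ct_number_water_hu": 5.2,
                          "image_noise_hu": 4.1,
                          "uniformity_hu": 1.0}))  # -> ['ct_number_water_hu']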

  12. QA/QC requirements for physical properties sampling and analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Innis, B.E.

    1993-07-21

    This report presents results of an assessment of the available information concerning US Environmental Protection Agency (EPA) quality assurance/quality control (QA/QC) requirements and guidance applicable to sampling, handling, and analyzing physical parameter samples at Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) investigation sites. Geotechnical testing laboratories measure the following physical properties of soil and sediment samples collected during CERCLA remedial investigations (RI) at the Hanford Site: moisture content, grain size by sieve, grain size by hydrometer, specific gravity, bulk density/porosity, saturated hydraulic conductivity, moisture retention, unsaturated hydraulic conductivity, and permeability of rocks by flowing air. Geotechnical testing laboratories also measure the following chemical parameters of soil and sediment samples collected during Hanford Site CERCLA RI: calcium carbonate and saturated column leach testing. Physical parameter data are used for (1) characterization of vadose and saturated zone geology and hydrogeology, (2) selection of monitoring well screen sizes, (3) support of modeling and analysis of the vadose and saturated zones, and (4) engineering design. The objectives of this report are to determine the QA/QC levels accepted in EPA Region 10 for the sampling, handling, and analysis of soil samples for physical parameters during CERCLA RI.

  13. Impact of Case Mix Severity on Quality Improvement in a Patient-centered Medical Home (PCMH) in the Maryland Multi-Payor Program.

    PubMed

    Khanna, Niharika; Shaya, Fadia T; Chirikov, Viktor V; Sharp, David; Steffen, Ben

    2016-01-01

    We present data on quality of care (QC) improvement in 35 of 45 National Quality Forum metrics reported annually by 52 primary care practices recognized as patient-centered medical homes (PCMHs) that participated in the Maryland Multi-Payor Program from 2011 to 2013. We assigned QC metrics to (1) chronic, (2) preventive, and (3) mental health care domains. The study used a panel data design with no control group. Using longitudinal fixed-effects regressions, we modeled QC and case mix severity in a PCMH. Overall, 35 of 45 quality metrics reported by 52 PCMHs demonstrated improvement over 3 years, and case mix severity did not affect the achievement of quality improvement. From 2011 to 2012, QC increased by 0.14 (P < .01) for chronic, 0.15 (P < .01) for preventive, and 0.34 (P < .01) for mental health care domains; from 2012 to 2013 these domains increased by 0.03 (P = .06), 0.04 (P = .05), and 0.07 (P = .12), respectively. In univariate analyses, lower National Commission on Quality Assurance PCMH level was associated with higher QC for the mental health care domain, whereas case mix severity did not correlate with QC. In multivariate analyses, higher QC correlated with larger practices, greater proportion of older patients, and readmission visits. Rural practices had higher proportions of Medicaid patients, lower QC, and higher QC improvement in interaction analyses with time. The gains in QC in the chronic disease domain, the preventive care domain, and, most significantly, the mental health care domain were observed over time regardless of patient case mix severity. QC improvement was generally not modified by practice characteristics, except for rurality. © Copyright 2016 by the American Board of Family Medicine.

  14. Establishing daily quality control (QC) in screen-film mammography using leeds tor (max) phantom at the breast imaging unit of USTH-Benavides Cancer Institute

    NASA Astrophysics Data System (ADS)

    Acaba, K. J. C.; Cinco, L. D.; Melchor, J. N.

    2016-03-01

    Daily QC tests performed on screen-film mammography (SFM) equipment are essential to ensure that both the SFM unit and the film processor are working in a consistent manner. The Breast Imaging Unit of USTH-Benavides Cancer Institute has been conducting QC following the test protocols in the IAEA Human Health Series No. 2 manual. However, the availability of the Leeds breast phantom (CRP E13039) in the facility made the task easier: instead of carrying out separate tests on AEC constancy and light sensitometry, only one exposure of the phantom is needed to accomplish the two tests. It was observed that measurements made on mAs output and optical densities (ODs) using the Leeds TOR (MAX) phantom are comparable with those obtained from the usual conduct of tests, taking into account the attenuation characteristic of the phantom. Image quality parameters such as low-contrast and high-contrast details were also evaluated from the phantom image. The authors recognize the usefulness of the phantom in determining technical factors that will help improve detection of the smallest pathological details on breast images. The phantom is also convenient for daily QC monitoring and economical, since fewer films are expended.

  15. Evaluation of peak picking quality in LC-MS metabolomics data.

    PubMed

    Brodsky, Leonid; Moussaieff, Arieh; Shahaf, Nir; Aharoni, Asaph; Rogachev, Ilana

    2010-11-15

    The output of LC-MS metabolomics experiments consists of mass-peak intensities identified through a peak-picking/alignment procedure. Besides imperfections in biological samples and instrumentation, data accuracy is highly dependent on the applied algorithms and their parameters. Consequently, quality control (QC) is essential for further data analysis. Here, we present a QC approach that is based on discrepancies between replicate samples. First, quantile normalization of per-sample log-signal distributions is applied to each group of biologically homogeneous samples. Next, the overall quality of each replicate group is characterized by the Z-transformed correlation coefficients between samples. This general QC allows a tuning of the procedure's parameters that minimizes the inter-replicate discrepancies in the generated output. Subsequently, an in-depth QC measure detects local neighborhoods on a template of aligned chromatograms that are enriched by divergences between intensity profiles of replicate samples. These neighborhoods are determined through a segmentation algorithm. The retention time (RT)-m/z positions of the neighborhoods with local divergences are indicative of incorrect alignment of chromatographic features, technical problems in the chromatograms, or a true biological discrepancy between replicates for particular metabolites. We expect this method to aid in the accurate analysis of metabolomics data and in the development of new peak-picking/alignment procedures.
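
    The two general QC steps described, quantile normalization within a replicate group followed by Z-transformed inter-replicate correlations, can be sketched as follows. The data are random stand-ins for log peak intensities; this is not the authors' algorithm or dataset.

        import numpy as np

        def quantile_normalize(x):
            """Columns = samples; map each column onto the mean sorted profile."""
            ranks = np.argsort(np.argsort(x, axis=0), axis=0)
            target = np.sort(x, axis=0).mean(axis=1)
            return target[ranks]

        rng = np.random.default_rng(0)
        base = rng.normal(14.0, 2.0, size=500)                       # true log2 intensities
        logs = base[:, None] + rng.normal(0.0, 0.2, size=(500, 3))   # 3 noisy replicates
        norm = quantile_normalize(logs)
        r = np.corrcoef(norm, rowvar=False)[0, 1]
        print(f"Fisher z of replicate correlation: {np.arctanh(r):.2f}")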

  16. A Rotatable Quality Control Phantom for Evaluating the Performance of Flat Panel Detectors in Imaging Moving Objects.

    PubMed

    Haga, Yoshihiro; Chida, Koichi; Inaba, Yohei; Kaga, Yuji; Meguro, Taiichiro; Zuguchi, Masayuki

    2016-02-01

    As the use of diagnostic X-ray equipment with flat panel detectors (FPDs) has increased, so has the importance of proper management of FPD systems. To ensure quality control (QC) of FPD systems, an easy method for evaluating FPD imaging performance for both stationary and moving objects is required. Until now, simple rotatable QC phantoms have not been available for the easy evaluation of the performance (spatial resolution and dynamic range) of FPDs in imaging moving objects. We developed a QC phantom for this purpose. It consists of three thicknesses of copper and a rotatable test pattern of piano wires of various diameters. Initial tests confirmed its stable performance. Our moving phantom is very useful for QC of FPD images of moving objects because it enables straightforward visual evaluation of imaging performance (spatial resolution and dynamic range).

  17. AutoLock: a semiautomated system for radiotherapy treatment plan quality control

    PubMed Central

    Lowe, Matthew; Hardy, Mark J.; Boylan, Christopher J.; Whitehurst, Philip; Rowbottom, Carl G.

    2015-01-01

    A semiautomated system for radiotherapy treatment plan quality control (QC), named AutoLock, is presented. AutoLock is designed to augment treatment plan QC by automatically checking aspects of treatment plans that are well suited to computational evaluation, whilst summarizing more subjective aspects in the form of a checklist. The treatment plan must pass all automated checks and all checklist items must be acknowledged by the planner as correct before the plan is finalized. Thus AutoLock uniquely integrates automated treatment plan QC, an electronic checklist, and plan finalization. In addition to reducing the potential for the propagation of errors, the integration of AutoLock into the plan finalization workflow has improved efficiency at our center. Detailed audit data are presented, demonstrating that the treatment plan QC rejection rate fell by around a third following the clinical introduction of AutoLock. PACS number: 87.55.Qr PMID:26103498

  18. Quality control and conduct of genome-wide association meta-analyses.

    PubMed

    Winkler, Thomas W; Day, Felix R; Croteau-Chonka, Damien C; Wood, Andrew R; Locke, Adam E; Mägi, Reedik; Ferreira, Teresa; Fall, Tove; Graff, Mariaelisa; Justice, Anne E; Luan, Jian'an; Gustafsson, Stefan; Randall, Joshua C; Vedantam, Sailaja; Workalemahu, Tsegaselassie; Kilpeläinen, Tuomas O; Scherag, André; Esko, Tonu; Kutalik, Zoltán; Heid, Iris M; Loos, Ruth J F

    2014-05-01

    Rigorous organization and quality control (QC) are necessary to facilitate successful genome-wide association meta-analyses (GWAMAs) of statistics aggregated across multiple genome-wide association studies. This protocol provides guidelines for (i) organizational aspects of GWAMAs, and for (ii) QC at the study file level, the meta-level across studies and the meta-analysis output level. Real-world examples highlight issues experienced and solutions developed by the GIANT Consortium that has conducted meta-analyses including data from 125 studies comprising more than 330,000 individuals. We provide a general protocol for conducting GWAMAs and carrying out QC to minimize errors and to guarantee maximum use of the data. We also include details for the use of a powerful and flexible software package called EasyQC. Precise timings will be greatly influenced by consortium size. For consortia of comparable size to the GIANT Consortium, this protocol takes a minimum of about 10 months to complete.

  19. AutoLock: a semiautomated system for radiotherapy treatment plan quality control.

    PubMed

    Dewhurst, Joseph M; Lowe, Matthew; Hardy, Mark J; Boylan, Christopher J; Whitehurst, Philip; Rowbottom, Carl G

    2015-05-08

    A semiautomated system for radiotherapy treatment plan quality control (QC), named AutoLock, is presented. AutoLock is designed to augment treatment plan QC by automatically checking aspects of treatment plans that are well suited to computational evaluation, whilst summarizing more subjective aspects in the form of a checklist. The treatment plan must pass all automated checks and all checklist items must be acknowledged by the planner as correct before the plan is finalized. Thus AutoLock uniquely integrates automated treatment plan QC, an electronic checklist, and plan finalization. In addition to reducing the potential for the propagation of errors, the integration of AutoLock into the plan finalization workflow has improved efficiency at our center. Detailed audit data are presented, demonstrating that the treatment plan QC rejection rate fell by around a third following the clinical introduction of AutoLock.

  20. Quality control and conduct of genome-wide association meta-analyses

    PubMed Central

    Winkler, Thomas W; Day, Felix R; Croteau-Chonka, Damien C; Wood, Andrew R; Locke, Adam E; Mägi, Reedik; Ferreira, Teresa; Fall, Tove; Graff, Mariaelisa; Justice, Anne E; Luan, Jian'an; Gustafsson, Stefan; Randall, Joshua C; Vedantam, Sailaja; Workalemahu, Tsegaselassie; Kilpeläinen, Tuomas O; Scherag, André; Esko, Tonu; Kutalik, Zoltán; Heid, Iris M; Loos, Ruth JF

    2014-01-01

    Rigorous organization and quality control (QC) are necessary to facilitate successful genome-wide association meta-analyses (GWAMAs) of statistics aggregated across multiple genome-wide association studies. This protocol provides guidelines for [1] organizational aspects of GWAMAs, and for [2] QC at the study file level, the meta-level across studies, and the meta-analysis output level. Real-world examples highlight issues experienced and solutions developed by the GIANT Consortium that has conducted meta-analyses including data from 125 studies comprising more than 330,000 individuals. We provide a general protocol for conducting GWAMAs and carrying out QC to minimize errors and to guarantee maximum use of the data. We also include details for use of a powerful and flexible software package called EasyQC. For consortia of comparable size to the GIANT consortium, the present protocol takes a minimum of about 10 months to complete. PMID:24762786

  1. Development and Testing of a Nuclear Quality Assurance/Quality Control Technician Curriculum. Final Report.

    ERIC Educational Resources Information Center

    Espy, John; And Others

    A project was conducted to field test selected first- and second-year courses in a postsecondary nuclear quality assurance/quality control (QA/QC) technician curriculum and to develop the teaching/learning modules for seven technical specialty courses remaining in the QA/QC technician curriculum. The field testing phase of the project involved the…

  2. Preliminary Quality Control System Design for the Pell Grant Program.

    ERIC Educational Resources Information Center

    Advanced Technology, Inc., Reston, VA.

    A preliminary design for a quality control (QC) system for the Pell Grant Program is proposed, based on the needs of the Office of Student Financial Assistance (OSFA). The applicability of the general design for other student aid programs administered by OSFA is also considered. The following steps included in a strategic approach to QC system…

  3. Poster — Thur Eve — 02: Measurement of CT radiation profile width using Fuji CR imaging plate raw data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bjarnason, T A; Department of Radiology, University of British Columbia, Vancouver; Yang, C J

    2014-08-15

    Measuring the CT collimation width and assessing the shape of the overall profile is a relatively straightforward quality control (QC) measure that impacts both image quality and patient dose, and is often required at acceptance and routine testing. Most CT facilities have access to computed radiography (CR) systems, so performing CT collimation profile assessments using CR plates requires no additional equipment. Previous studies have shown how to effectively use CR plates to measure the radiation profile width. However, a major limitation of the previous work is that the full dynamic range of CR detector plates is not used, since the CR processing technology reduces the dynamic range of the DICOM output to 2^10, requiring the sensitivity and latitude settings of the CR reader to be adjusted to prevent clipping of the CT profile data. Such adjustments to CR readers unnecessarily complicate the QC procedure. These clipping artefacts hinder the ability to accurately assess CT collimation width, because the full-width at half-maximum values of the penumbras are not properly determined if the maximum dose of the profile is not available. Furthermore, any inconsistencies in the radiation profile shape are lost if the profile plateau is clipped off. In this work we developed an open-source Matlab script for straightforward CT profile width measurements using raw CR data that also allows assessment of the profile shape without clipping, and applied this approach during CT QC.
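
    The original work used a Matlab script on raw CR data; the Python sketch below only illustrates the core full-width-at-half-maximum (FWHM) computation on a synthetic trapezoidal profile. Production code would interpolate the half-maximum crossings and, as the abstract emphasizes, operate on unclipped raw pixel values.

        import numpy as np

        def fwhm(position_mm, profile):
            """Width between the first and last samples at or above half maximum."""
            above = position_mm[profile >= profile.max() / 2.0]
            return float(above[-1] - above[0])

        x = np.linspace(-20.0, 20.0, 401)                   # position across the plate (mm)
        profile = np.clip(1.5 - np.abs(x) / 8.0, 0.0, 1.0)  # plateau with linear penumbras
        print(f"FWHM = {fwhm(x, profile):.1f} mm")          # 16.0 mm for this profile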

  4. Molecular mechanism of ER stress-induced pre-emptive quality control involving association of the translocon, Derlin-1, and HRD1.

    PubMed

    Kadowaki, Hisae; Satrimafitrah, Pasjan; Takami, Yasunari; Nishitoh, Hideki

    2018-05-09

    The maintenance of endoplasmic reticulum (ER) homeostasis is essential for cell function. ER stress-induced pre-emptive quality control (ERpQC) helps alleviate the burden to a stressed ER by limiting further protein loading. We have previously reported the mechanisms of ERpQC, which includes a rerouting step and a degradation step. Under ER stress conditions, Derlin family proteins (Derlins), which are components of ER-associated degradation, reroute specific ER-targeting proteins to the cytosol. Newly synthesized rerouted polypeptides are degraded via the cytosolic chaperone Bag6 and the AAA-ATPase p97 in the ubiquitin-proteasome system. However, the mechanisms by which ER-targeting proteins are rerouted from the ER translocation pathway to the cytosolic degradation pathway and how the E3 ligase ubiquitinates ERpQC substrates remain unclear. Here, we show that ERpQC substrates are captured by the carboxyl-terminus region of Derlin-1 and ubiquitinated by the HRD1 E3 ubiquitin ligase prior to degradation. Moreover, HRD1 forms a large ERpQC-related complex composed of Sec61α and Derlin-1 during ER stress. These findings indicate that the association of the degradation factor HRD1 with the translocon and the rerouting factor Derlin-1 may be necessary for the smooth and effective clearance of ERpQC substrates.

  5. Improvement of early detection of breast cancer through collaborative multi-country efforts: Medical physics component.

    PubMed

    Mora, Patricia; Faulkner, Keith; Mahmoud, Ahmed M; Gershan, Vesna; Kausik, Aruna; Zdesar, Urban; Brandan, María-Ester; Kurt, Serap; Davidović, Jasna; Salama, Dina H; Aribal, Erkin; Odio, Clara; Chaturvedi, Arvind K; Sabih, Zahida; Vujnović, Saša; Paez, Diana; Delis, Harry

    2018-04-01

    The International Atomic Energy Agency (IAEA), through a Coordinated Research Project on "Enhancing Capacity for Early Detection and Diagnosis of Breast Cancer through Imaging", brought together a group of mammography radiologists, medical physicists, and radiographers to investigate current practices and improve procedures for the early detection of breast cancer by strengthening both the clinical and medical physics components. This paper addresses the medical physics component. The countries that participated in the CRP were Bosnia and Herzegovina, Costa Rica, Egypt, India, Kenya, the Frmr. Yug. Rep. of Macedonia, Mexico, Nigeria, Pakistan, Philippines, Slovenia, Turkey, Uganda, United Kingdom, and Zambia. Ten institutions participated, applying IAEA quality control protocols to 9 digital and 3 analogue mammography units. A spreadsheet for data collection was generated and distributed. Evaluation of image quality was done using the TOR MAX and DMAM2 Gold phantoms. QC results for the analogue equipment were satisfactory. QC tests performed on the digital systems showed that improvements needed to be implemented, especially in thickness accuracy, signal difference to noise ratio (SDNR) values for achievable levels, uniformity, and modulation transfer function (MTF). Mean glandular dose (MGD) was below internationally recommended levels for patient radiation protection. Evaluation of image quality by phantoms also indicated the need for improvement. Common activities facilitated improvement in mammography practice: medical physicists were trained in QC programs, infrastructure was improved and strengthened, and networking among medical physicists and radiologists took place and was maintained over time. IAEA QC protocols provided a uniform approach to QC measurements. Copyright © 2018 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.

  6. Develop a Methodology to Evaluate the Effectiveness of QC/QA Specifications (Phase II)

    DOT National Transportation Integrated Search

    1998-08-01

    The Texas Department of Transportation (TxDOT) has been implementing statistically based quality control/quality assurance (QC/QA) specifications for hot mix asphalt concrete pavements since the early 1990s. These specifications have been continuousl...

  7. A Benchmark Study on Error Assessment and Quality Control of CCS Reads Derived from the PacBio RS

    PubMed Central

    Jiao, Xiaoli; Zheng, Xin; Ma, Liang; Kutty, Geetha; Gogineni, Emile; Sun, Qiang; Sherman, Brad T.; Hu, Xiaojun; Jones, Kristine; Raley, Castle; Tran, Bao; Munroe, David J.; Stephens, Robert; Liang, Dun; Imamichi, Tomozumi; Kovacs, Joseph A.; Lempicki, Richard A.; Huang, Da Wei

    2013-01-01

    PacBio RS, a newly emerging third-generation DNA sequencing platform, is based on a real-time, single-molecule, nano-nitch sequencing technology that can generate very long reads (up to 20 kb), in contrast to the shorter reads produced by the first- and second-generation sequencing technologies. As a new platform, it is important to assess the sequencing error rate, as well as the quality control (QC) parameters associated with the PacBio sequence data. In this study, a mixture of 10 previously known, closely related DNA amplicons was sequenced using the PacBio RS sequencing platform. After aligning Circular Consensus Sequence (CCS) reads derived from the above sequencing experiment to the known reference sequences, we found that the median error rate was 2.5% without read QC, and improved to 1.3% with an SVM-based multi-parameter QC method. In addition, a De Novo assembly was used as a downstream application to evaluate the effects of different QC approaches. This benchmark study indicates that even though CCS reads are post error-corrected, it is still necessary to perform appropriate QC on CCS reads in order to produce successful downstream bioinformatics analytical results. PMID:24179701

  8. A Benchmark Study on Error Assessment and Quality Control of CCS Reads Derived from the PacBio RS.

    PubMed

    Jiao, Xiaoli; Zheng, Xin; Ma, Liang; Kutty, Geetha; Gogineni, Emile; Sun, Qiang; Sherman, Brad T; Hu, Xiaojun; Jones, Kristine; Raley, Castle; Tran, Bao; Munroe, David J; Stephens, Robert; Liang, Dun; Imamichi, Tomozumi; Kovacs, Joseph A; Lempicki, Richard A; Huang, Da Wei

    2013-07-31

    PacBio RS, a newly emerging third-generation DNA sequencing platform, is based on a real-time, single-molecule, nano-nitch sequencing technology that can generate very long reads (up to 20 kb), in contrast to the shorter reads produced by the first- and second-generation sequencing technologies. As a new platform, it is important to assess the sequencing error rate, as well as the quality control (QC) parameters associated with the PacBio sequence data. In this study, a mixture of 10 previously known, closely related DNA amplicons was sequenced using the PacBio RS sequencing platform. After aligning Circular Consensus Sequence (CCS) reads derived from the above sequencing experiment to the known reference sequences, we found that the median error rate was 2.5% without read QC, and improved to 1.3% with an SVM-based multi-parameter QC method. In addition, a De Novo assembly was used as a downstream application to evaluate the effects of different QC approaches. This benchmark study indicates that even though CCS reads are post error-corrected, it is still necessary to perform appropriate QC on CCS reads in order to produce successful downstream bioinformatics analytical results.
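
    To make the reported effect concrete, the toy sketch below computes a median per-read error rate before and after a QC filter; a simple quality-score threshold stands in for the SVM-based multi-parameter QC, and all numbers are made up.

        import statistics

        # (error_rate, mean_read_quality) pairs -- all values hypothetical.
        reads = [(0.010, 0.99), (0.013, 0.98), (0.025, 0.90),
                 (0.060, 0.70), (0.015, 0.97), (0.080, 0.60)]

        before = statistics.median(e for e, _ in reads)
        after = statistics.median(e for e, q in reads if q >= 0.95)  # stand-in QC filter
        print(f"median error rate: {before:.3f} -> {after:.3f}")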

  9. Establishing quality control ranges for antimicrobial susceptibility testing of Escherichia coli, Pseudomonas aeruginosa, and Staphylococcus aureus: a cornerstone to develop reference strains for Korean clinical microbiology laboratories.

    PubMed

    Hong, Sung Kuk; Choi, Seung Jun; Shin, Saeam; Lee, Wonmok; Pinto, Naina; Shin, Nari; Lee, Kwangjun; Hong, Seong Geun; Kim, Young Ah; Lee, Hyukmin; Kim, Heejung; Song, Wonkeun; Lee, Sun Hwa; Yong, Dongeun; Lee, Kyungwon; Chong, Yunsop

    2015-11-01

    Quality control (QC) processes are performed in the majority of clinical microbiology laboratories to ensure the performance of microbial identification and antimicrobial susceptibility testing using ATCC strains. Obtaining these ATCC strains involves some inconveniences concerning the purchase cost of the strains and the shipping time required. This study focused on constructing a database of reference strains for QC processes using domestic bacterial strains, concentrating primarily on antimicrobial susceptibility testing. Three strains (Escherichia coli, Pseudomonas aeruginosa, and Staphylococcus aureus) that showed legible results in preliminary testing were selected. The minimal inhibitory concentrations (MICs) and zone diameters (ZDs) of eight antimicrobials for each strain were determined according to CLSI M23. All resulting MIC and ZD ranges included at least 95% of the data. The ZD QC ranges obtained by using the CLSI method were less than 12 mm, and the MIC QC ranges extended no more than five dilutions. This study is a preliminary attempt to construct a bank of Korean QC strains. With further studies, a positive outcome toward cost and time reduction can be anticipated.

  10. FQC Dashboard: integrates FastQC results into a web-based, interactive, and extensible FASTQ quality control tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Joseph; Pirrung, Meg; McCue, Lee Ann

    FQC is software that facilitates large-scale quality control of FASTQ files by carrying out a QC protocol, parsing results, and aggregating quality metrics within and across experiments into an interactive dashboard. The dashboard utilizes human-readable configuration files to manipulate the pages and tabs, and is extensible with CSV data.

  11. ITC Guidelines on Quality Control in Scoring, Test Analysis, and Reporting of Test Scores

    ERIC Educational Resources Information Center

    Allalouf, Avi

    2014-01-01

    The Quality Control (QC) Guidelines are intended to increase the efficiency, precision, and accuracy of the scoring, analysis, and reporting process of testing. The QC Guidelines focus on large-scale testing operations where multiple forms of tests are created for use on set dates. However, they may also be used for a wide variety of other testing…

  12. Methods for motion correction evaluation using 18F-FDG human brain scans on a high-resolution PET scanner.

    PubMed

    Keller, Sune H; Sibomana, Merence; Olesen, Oline V; Svarer, Claus; Holm, Søren; Andersen, Flemming L; Højgaard, Liselotte

    2012-03-01

    Many authors have reported the importance of motion correction (MC) for PET. Patient motion during scanning disturbs kinetic analysis and degrades resolution. In addition, using a misaligned transmission scan for attenuation and scatter correction may produce regional quantification bias in the reconstructed emission images. The purpose of this work was the development of quality control (QC) methods for MC procedures based on external motion tracking (EMT) for human scanning, using an optical motion tracking system. Two scans with minor motion and five with major motion (as reported by the optical motion tracking system) were selected from (18)F-FDG scans acquired on a PET scanner. Motion was measured as the maximum displacement of the markers attached to the subject's head and was considered major if larger than 4 mm and minor if less than 2 mm. After allowing a 40- to 60-min uptake time after tracer injection, we acquired a 6-min transmission scan, followed by a 40-min emission list-mode scan. Each emission list-mode dataset was divided into 8 frames of 5 min. The reconstructed time-framed images were aligned to a selected reference frame using either EMT or the AIR (automated image registration) software. The following 3 QC methods were used to evaluate the EMT and AIR MC: a method using the ratio between 2 regions of interest with gray matter voxels (GM) and white matter voxels (WM), called GM/WM; mutual information; and cross correlation. The results of the 3 QC methods were in agreement with one another and with a visual subjective inspection of the image data. Before MC, the QC measures varied significantly in scans with major motion and displayed limited variation in scans with minor motion. The variation was significantly reduced and the measures improved after MC with AIR, whereas EMT MC performed less well. The 3 presented QC methods produced similar results and are useful for evaluating tracer-independent external-tracking motion-correction methods for human brain scans.
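
    The frame-alignment QC measures named in this abstract are standard image-similarity quantities. The sketch below computes two of them, mutual information and cross correlation, for a pair of image volumes; the array shapes, histogram bin count, and stand-in data are illustrative assumptions.

        # Minimal sketch of two QC measures for comparing a reference frame
        # with a motion-corrected frame: mutual information (from the joint
        # intensity histogram) and voxelwise cross correlation.
        import numpy as np

        def mutual_information(a, b, bins=64):
            hist, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
            pxy = hist / hist.sum()
            px = pxy.sum(axis=1, keepdims=True)   # marginal of image a
            py = pxy.sum(axis=0, keepdims=True)   # marginal of image b
            nz = pxy > 0                          # avoid log(0)
            return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

        def cross_correlation(a, b):
            return float(np.corrcoef(a.ravel(), b.ravel())[0, 1])

        reference = np.random.rand(64, 64, 32)            # stand-in frame
        aligned = reference + 0.05 * np.random.rand(64, 64, 32)
        print(mutual_information(reference, aligned),
              cross_correlation(reference, aligned))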

  13. Performance-based quality assurance/quality control (QA/QC) acceptance procedures for in-place soil testing phase 3.

    DOT National Transportation Integrated Search

    2015-01-01

    One of the objectives of this study was to evaluate soil testing equipment based on its capability of measuring in-place stiffness or modulus values. As design criteria transition from empirical to mechanistic-empirical, soil test methods and equip...

  14. Development of concrete QC/QA specifications for highway construction in Kentucky.

    DOT National Transportation Integrated Search

    2001-08-01

    There is a growing trend toward quality-based specifications in highway construction. A large number of quality control/quality assurance (QC/QA) specifications shift the responsibility of day-to-day testing from the state DOH to the contractor. This...

  15. Portland cement concrete pavement review of QC/QA data 2000 through 2009.

    DOT National Transportation Integrated Search

    2011-04-01

    This report analyzes the Quality Control/Quality Assurance (QC/QA) data for Portland cement concrete pavement (PCCP) awarded in the years 2000 through 2009. Analysis of the overall performance of the projects is accomplished by reviewing the Calc...

  16. FQC Dashboard: integrates FastQC results into a web-based, interactive, and extensible FASTQ quality control tool.

    PubMed

    Brown, Joseph; Pirrung, Meg; McCue, Lee Ann

    2017-06-09

    FQC is software that facilitates quality control of FASTQ files by carrying out a QC protocol using FastQC, parsing results, and aggregating quality metrics into an interactive dashboard designed to richly summarize individual sequencing runs. The dashboard groups samples in dropdowns for navigation among the data sets, utilizes human-readable configuration files to manipulate the pages and tabs, and is extensible with CSV data. FQC is implemented in Python 3 and JavaScript, and is maintained under an MIT license. Documentation and source code are available at https://github.com/pnnl/fqc. Contact: joseph.brown@pnnl.gov. © The Author(s) 2017. Published by Oxford University Press.
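
    The abstract above describes FQC's core workflow: run FastQC, parse its per-sample outputs, and aggregate the metrics. The sketch below shows that aggregation idea generically in Python; it is not FQC's own code, and the directory layout is an assumption (FastQC conventionally writes a tab-separated summary.txt of PASS/WARN/FAIL statuses per module inside each extracted results folder).

        # Illustrative aggregation of FastQC summary.txt files into one CSV.
        import csv
        from pathlib import Path

        rows = []
        for summary in Path("fastqc_results").glob("*/summary.txt"):
            for line in summary.read_text().splitlines():
                if not line.strip():
                    continue
                status, module, filename = line.split("\t")
                rows.append({"sample": filename, "module": module,
                             "status": status})

        with open("qc_summary.csv", "w", newline="") as fh:
            writer = csv.DictWriter(fh,
                                    fieldnames=["sample", "module", "status"])
            writer.writeheader()
            writer.writerows(rows)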

  17. SNP Data Quality Control in a National Beef and Dairy Cattle System and Highly Accurate SNP Based Parentage Verification and Identification

    PubMed Central

    McClure, Matthew C.; McCarthy, John; Flynn, Paul; McClure, Jennifer C.; Dair, Emma; O'Connell, D. K.; Kearney, John F.

    2018-01-01

    A major use of genetic data is parentage verification and identification, as inaccurate pedigrees negatively affect genetic gain. Since 2012 the international standard for single nucleotide polymorphism (SNP)-based verification in Bos taurus cattle has been the ISAG SNP panels. While these ISAG panels provide an increased level of parentage accuracy over microsatellite markers (MS), they can validate the wrong parent at misconcordance rate levels of ≤1%, indicating that more SNP are needed if a more accurate pedigree is required. With rapidly increasing numbers of cattle being genotyped in Ireland, representing 61 B. taurus breeds and a wide range of farm types (beef/dairy, AI/pedigree/commercial, purebred/crossbred, and large to small herd sizes), the Irish Cattle Breeding Federation (ICBF) analyzed different SNP densities and determined that a minimum of ≥500 SNP is needed to consistently predict only one set of parents at a ≤1% misconcordance rate. For parentage validation and prediction, ICBF uses 800 SNP (ICBF800) selected based on SNP clustering quality, ISAG200 inclusion, call rate (CR), and minor allele frequency (MAF) in the Irish cattle population. Large datasets require sample and SNP quality control (QC). Most publications deal only with SNP QC via CR, MAF, parent-progeny conflicts, and Hardy-Weinberg deviation, but not with sample QC. We report here parentage, SNP QC, and genomic sample QC pipelines built to deal with the unique challenges of >1 million genotypes from a national herd, such as SNP genotype errors from mis-tagging of animals, lab errors, farm errors, and multiple other issues that can arise. We divide the pipeline into two parts: a Genotype QC and an Animal QC pipeline. The Genotype QC identifies samples with low call rate, missing or mixed genotype classes (no BB genotype or ABTG alleles present), and low genotype frequencies. The Animal QC handles situations where the genotype might not belong to the listed individual by identifying: >1 non-matching genotypes per animal, SNP duplicates, sex and breed prediction mismatches, parentage and progeny validation results, and other situations. The Animal QC pipeline makes use of the ICBF800 SNP set where appropriate to identify errors in a computationally efficient yet still highly accurate manner. PMID:29599798
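
    To make the two-part pipeline concrete, the sketch below applies a marker-level filter (call rate and MAF) followed by a sample-level call-rate check. It is a minimal illustration of the generic technique; the thresholds and the simulated genotype matrix are assumptions, not ICBF's actual cutoffs or code.

        # Hedged sketch: marker-level then sample-level genotype QC.
        import numpy as np

        # genotypes: samples x SNPs, coded 0/1/2, with -1 for a missing call.
        rng = np.random.default_rng(0)
        genotypes = rng.choice([-1, 0, 1, 2], size=(1000, 5000),
                               p=[0.02, 0.25, 0.48, 0.25])
        called = genotypes >= 0

        # SNP QC: drop markers with low call rate or low MAF.
        snp_call_rate = called.mean(axis=0)
        alt_freq = (np.where(called, genotypes, 0).sum(axis=0)
                    / (2 * called.sum(axis=0)))
        maf = np.minimum(alt_freq, 1 - alt_freq)
        keep_snps = (snp_call_rate >= 0.95) & (maf >= 0.05)

        # Animal QC (one aspect): drop samples with low call rate.
        sample_call_rate = called[:, keep_snps].mean(axis=1)
        keep_samples = sample_call_rate >= 0.90
        print(keep_snps.sum(), "SNPs and", keep_samples.sum(),
              "samples retained")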

  18. CUSUM-Logistic Regression analysis for the rapid detection of errors in clinical laboratory test results.

    PubMed

    Sampson, Maureen L; Gounden, Verena; van Deventer, Hendrik E; Remaley, Alan T

    2016-02-01

    The main drawback of the periodic analysis of quality control (QC) material is that test performance is not monitored in the time periods between QC analyses, potentially leading to the reporting of faulty test results. The objective of this study was to develop a patient-based QC procedure for the more timely detection of test errors. Results from a Chem-14 panel measured on the Beckman LX20 analyzer were used to develop the model. Each test result was predicted from the other 13 members of the panel by multiple regression, which resulted in correlation coefficients between the predicted and measured results of >0.7 for 8 of the 14 tests. A logistic regression model, which utilized the measured test result, the predicted test result, the day of the week, and the time of day, was then developed for predicting test errors. The output of the logistic regression was tallied by a daily CUSUM approach and used to predict test errors, with a fixed specificity of 90%. The mean average run length (ARL) before error detection by CUSUM-Logistic Regression (CSLR) was 20, with a mean sensitivity of 97%, which was considerably shorter than the mean ARL of 53 (sensitivity 87.5%) for a simple prediction model that used only the measured result for error detection. A CUSUM-Logistic Regression analysis of patient laboratory data can be an effective approach for the rapid and sensitive detection of clinical laboratory errors. Published by Elsevier Inc.
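
    A minimal sketch of the CSLR idea follows: a logistic model scores the probability that a result is erroneous, and a one-sided CUSUM accumulates those scores until a threshold is crossed. The coefficients, reference value k, and decision limit h are invented placeholders, not the paper's fitted values.

        # Hedged sketch of CUSUM over logistic-regression error scores.
        import numpy as np

        def error_probability(measured, predicted, hour, weekday,
                              coef, intercept):
            # Logistic model on (measured, predicted, time of day, weekday).
            x = np.array([measured, predicted, hour, weekday])
            return 1.0 / (1.0 + np.exp(-(coef @ x + intercept)))

        def cusum_flags(probs, k=0.1, h=2.0):
            # One-sided CUSUM; flag once the tally s exceeds h.
            s, flags = 0.0, []
            for p in probs:
                s = max(0.0, s + (p - k))
                flags.append(s > h)
            return flags

        coef = np.array([0.8, -0.8, 0.01, 0.02])      # assumed coefficients
        results = [(5.1, 5.0), (5.3, 5.0), (7.9, 5.1), (8.2, 5.0)]
        probs = [error_probability(m, p, 9, 2, coef, -0.5)
                 for m, p in results]
        print(cusum_flags(probs))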

  19. An International Coordinated Effort to Further the Documentation & Development of Quality Assurance, Quality Control, and Best Practices for Oceanographic Observations

    NASA Astrophysics Data System (ADS)

    Bushnell, M.; Waldmann, C.; Hermes, J.; Tamburri, M.

    2017-12-01

    Many oceanographic observation groups create and maintain QA, QC, and best practices (BP) to ensure efficient and accurate data collection and to quantify quality. Several entities (IOOS® QARTOD, AtlantOS, ACT, and the WMO/IOC JCOMM OCG) have joined forces to document existing practices, identify gaps, and support the development of emerging techniques. While each group has a slightly different focus, many underlying QA/QC/BP needs are common. QARTOD focuses upon real-time data QC, and has produced manuals that address QC tests for eleven ocean variables. AtlantOS is a research and innovation project working towards the integration of ocean-observing activities across all disciplines in the Atlantic Basin. ACT brings together research institutions, resource managers, and private companies to foster the development and adoption of effective and reliable sensors for coastal, freshwater, and ocean environments. JCOMM promotes broad international coordination of oceanographic and marine meteorological observations, data management, and services. Leveraging the existing efforts of these organizations is an efficient way to consolidate available information, develop new practices, and evaluate the use of ISO standards to judge the quality of measurements. ISO standards may offer accepted support for a framework for an ocean data quality management system, similar to the meteorological standards defined by WMO (https://www.wmo.int/pages/prog/arep/gaw/qassurance.html). We will first cooperatively develop a plan to create a QA/QC/BP manual. The resulting plan will describe the need for such a manual, its extent, the process used to engage the community in creating it, and the maintenance of the resultant document. It will also investigate standards for metadata. The plan will subsequently be used to develop the QA/QC/BP manual, providing guidance which advances the standards adopted by IOOS, AtlantOS, JCOMM, and others.

  20. Development of a Climatology of Vertically Complete Wind Profiles from Doppler Radar Wind Profiler Systems

    NASA Technical Reports Server (NTRS)

    Barbre, Robert, Jr.

    2015-01-01

    Assessment of space vehicle loads and trajectories during design requires a large sample of wind profiles at the altitudes where winds affect the vehicle. Traditionally, this altitude region extends from near 8-14 km to address maximum dynamic pressure upon ascent into space, but some applications require knowledge of measured wind profiles at lower altitudes. Such applications include crew capsule pad abort and plume damage analyses. Two Doppler Radar Wind Profiler (DRWP) systems exist at the United States Air Force (USAF) Eastern Range and at the National Aeronautics and Space Administration's Kennedy Space Center. The 50-MHz DRWP provides wind profiles every 3-5 minutes from roughly 2.5-18.5 km, and five 915-MHz DRWPs provide wind profiles every 15 minutes from approximately 0.2-3.0 km. Archived wind profiles from all systems underwent rigorous quality control (QC) processes, and concurrent measurements from the QC'ed 50- and 915-MHz DRWP archives were spliced into individual profiles that extend from about 0.2-18.5 km. The archive contains combined profiles from April 2000 to December 2009, and thousands of profiles during each month are available for use by the launch vehicle community. This paper presents the details of the QC and splice methodology, as well as some attributes of the archive.
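
    The splice step described above can be illustrated with a short sketch: keep the 915-MHz winds below a handover altitude and the 50-MHz winds above it, then sort by altitude. The 3-km handover value is an assumption chosen inside the two systems' overlap region, not necessarily the archive's actual rule.

        # Hedged sketch: splice concurrent 915-MHz and 50-MHz wind profiles.
        import numpy as np

        def splice_profiles(alt_915, wind_915, alt_50, wind_50,
                            handover_km=3.0):
            low = alt_915 < handover_km       # take 915-MHz winds below
            high = alt_50 >= handover_km      # take 50-MHz winds above
            alt = np.concatenate([alt_915[low], alt_50[high]])
            wind = np.concatenate([wind_915[low], wind_50[high]])
            order = np.argsort(alt)
            return alt[order], wind[order]

        alt_915 = np.linspace(0.2, 3.0, 15)
        wind_915 = np.random.rand(15) * 10        # stand-in wind speeds
        alt_50 = np.linspace(2.5, 18.5, 55)
        wind_50 = np.random.rand(55) * 30
        alt, wind = splice_profiles(alt_915, wind_915, alt_50, wind_50)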

  1. Comprehensive Testing Guidelines to Increase Efficiency in INDOT Operations: [Technical Summary]

    DOT National Transportation Integrated Search

    2012-01-01

    When the Indiana Department of Transportation designs a pavement project, a decision for QC/QA (Quality Control/ Quality Assurance) or nonQC/QA is made solely based on the quantity of pavement materials to be used in the project. Once the pavement...

  2. Comprehensive Testing Guidelines to Increase Efficiency in INDOT Operations: [Technical Summary]

    DOT National Transportation Integrated Search

    2012-01-01

    When the Indiana Department of Transportation designs a pavement project, a decision for QC/QA (Quality Control/Quality Assurance) or nonQC/QA is made solely based on the quantity of pavement materials to be used in the project. Once the ...

  3. Hot mix asphalt voids acceptance review of QC/QA data 2000 through 2010.

    DOT National Transportation Integrated Search

    2011-10-01

    This report analyzes the quality control/quality assurance (QC/QA) data for hot mix asphalt (HMA) using voids acceptance as the testing criteria awarded in the years 2000 through 2010. Analysis of the overall performance of the projects is accomp...

  4. An Automatic Quality Control Pipeline for High-Throughput Screening Hit Identification.

    PubMed

    Zhai, Yufeng; Chen, Kaisheng; Zhong, Yang; Zhou, Bin; Ainscow, Edward; Wu, Ying-Ta; Zhou, Yingyao

    2016-09-01

    The correction or removal of signal errors in high-throughput screening (HTS) data is critical to the identification of high-quality lead candidates. Although a number of strategies have been previously developed to correct systematic errors and to remove screening artifacts, they are not universally effective and still require a fair amount of human intervention. We introduce a fully automated quality control (QC) pipeline that can correct generic interplate systematic errors and remove intraplate random artifacts. The new pipeline was first applied to ~100 large-scale historical HTS assays; in silico analysis showed that auto-QC led to a noticeably stronger structure-activity relationship. The method was further tested in several independent HTS runs, where QC results were sampled for experimental validation. Significantly increased hit confirmation rates were obtained after the QC steps, confirming that the proposed method was effective in enriching true-positive hits. An implementation of the algorithm is available to the screening community. © 2016 Society for Laboratory Automation and Screening.
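
    One widely used correction for the positional and plate-level effects targeted here is the B-score: Tukey median polish removes row and column effects from each plate, and residuals are scaled by the plate's median absolute deviation. The sketch below shows that generic technique in Python; it is not necessarily the specific correction used by this pipeline.

        # Illustrative B-score normalization of one HTS plate.
        import numpy as np

        def median_polish(plate, n_iter=10):
            # Iteratively remove row and column median effects.
            resid = plate.astype(float).copy()
            for _ in range(n_iter):
                resid -= np.median(resid, axis=1, keepdims=True)
                resid -= np.median(resid, axis=0, keepdims=True)
            return resid

        def b_score(plate):
            resid = median_polish(plate)
            mad = np.median(np.abs(resid - np.median(resid)))
            return resid / (1.4826 * mad)

        plate = np.random.normal(100, 10, size=(16, 24))  # 384-well plate
        plate[:, 0] += 30                                 # mock edge effect
        print(np.abs(b_score(plate)).max())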

  5. Non-monotonicity and divergent time scale in Axelrod model dynamics

    NASA Astrophysics Data System (ADS)

    Vazquez, F.; Redner, S.

    2007-04-01

    We study the evolution of the Axelrod model for cultural diversity, a prototypical non-equilibrium process that exhibits rich dynamics and a dynamic phase transition between diversity and an inactive state. We consider a simple version of the model in which each individual possesses two features that can assume q possibilities. Within a mean-field description in which each individual has just a few interaction partners, we find a phase transition at a critical value q_c between an active, diverse state for q < q_c and a frozen state. For q ≲ q_c, the density of active links is non-monotonic in time and the asymptotic approach to the steady state is controlled by a time scale that diverges as (q - q_c)^(-1/2).
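
    For readers who want to experiment, the toy simulation below implements Axelrod-type dynamics with F = 2 features and q traits on a ring: a randomly chosen site copies one differing feature from a random neighbor with probability equal to their cultural overlap, and the density of active links (neighbor pairs sharing some but not all features) is tracked. The ring topology and parameter values are illustrative choices, not the paper's mean-field setup.

        # Toy Axelrod model: F = 2 features, q traits, ring of N agents.
        import numpy as np

        rng = np.random.default_rng(1)
        N, F, q, steps = 200, 2, 3, 50000
        culture = rng.integers(0, q, size=(N, F))

        def active_link_density():
            # Active link: neighbors agree on some, but not all, features.
            same = (culture == np.roll(culture, -1, axis=0)).sum(axis=1)
            return np.mean((same > 0) & (same < F))

        for _ in range(steps):
            i = rng.integers(N)
            j = (i + rng.choice([-1, 1])) % N      # random ring neighbor
            agree = culture[i] == culture[j]
            overlap = agree.mean()
            if 0 < overlap < 1 and rng.random() < overlap:
                k = rng.choice(np.flatnonzero(~agree))
                culture[i, k] = culture[j, k]      # adopt one feature

        print("final active-link density:", active_link_density())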

  6. SprayQc: a real-time LC-MS/MS quality monitoring system to maximize uptime using off the shelf components.

    PubMed

    Scheltema, Richard A; Mann, Matthias

    2012-06-01

    With the advent of high-throughput mass spectrometry (MS)-based proteomics, the magnitude and complexity of the performed experiments has increased dramatically. Likewise, investments in chromatographic and MS instrumentation are a large proportion of the budget of proteomics laboratories. Guarding measurement quality and maximizing uptime of the LC-MS/MS systems therefore requires constant care despite automated workflows. We describe a real-time surveillance system, called SprayQc, that continuously monitors the status of the peripheral equipment to ensure that operational parameters are within an acceptable range. SprayQc is composed of multiple plug-in software components that use computer vision to analyze electrospray conditions, monitor the chromatographic device for stable backpressure, interact with a column oven to control pressure by temperature, and ensure that the mass spectrometer is still acquiring data. Action is taken when a failure condition has been detected, such as stopping the column oven and the LC flow, as well as automatically notifying the appropriate operator. Additionally, all defined metrics can be recorded synchronized on retention time with the MS acquisition file, allowing for later inspection and providing valuable information for optimization. SprayQc has been extensively tested in our laboratory, supports third-party plug-in development, and is freely available for download from http://sourceforge.org/projects/sprayqc .

  7. Comparison of four methods of establishing control limits for monitoring quality controls in infectious disease serology testing.

    PubMed

    Dimech, Wayne; Karakaltsas, Marina; Vincini, Giuseppe A

    2018-05-25

    A general trend towards conducting infectious disease serology testing in centralized laboratories means that quality control (QC) principles used for clinical chemistry testing are being applied to infectious disease testing. However, no systematic assessment of the methods used to establish QC limits has been applied to infectious disease serology testing. A total of 103 QC data sets, obtained for six different infectious disease serology analytes, were parsed through standard methods for establishing statistical control limits, including guidelines from Public Health England, the USA Clinical and Laboratory Standards Institute (CLSI), the German Richtlinien der Bundesärztekammer (RiliBÄK), and Australian QConnect. The percentage of QC results failing each method was compared. The percentage of data sets having more than 20% of QC results failing Westgard rules when the first 20 results were used to calculate the mean ± 2 standard deviations (SD) ranged from 3 (2.9%) for the R4S rule to 66 (64.1%) for the 10X rule, whereas the percentage ranged from 0 (0%) for R4S to 32 (40.5%) for 10X when the first 100 results were used to calculate the mean ± 2 SD. By contrast, the percentage of data sets with >20% failing the RiliBÄK control limits was 25 (24.3%). Only two data sets (1.9%) had more than 20% of results outside the QConnect Limits. QConnect Limits were therefore more suitable for monitoring infectious disease serology testing than the UK Public Health, CLSI, and RiliBÄK approaches, as the alternatives to QConnect Limits reported an unacceptably high percentage of failures across the 103 data sets.
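
    The arithmetic being compared is simple enough to sketch: control limits are derived from an initial run of QC results, and later results are checked against rules such as 1-2s (outside mean ± 2 SD) and 10-x (ten consecutive results on one side of the mean). The sketch below implements those two rules on invented data.

        # Sketch: mean ± 2 SD limits from the first 20 results, then
        # flag later results with the 1-2s and 10-x Westgard rules.
        import numpy as np

        def control_limits(baseline):
            m = np.mean(baseline)
            sd = np.std(baseline, ddof=1)
            return m, m - 2 * sd, m + 2 * sd

        def westgard_flags(values, mean, lo, hi):
            values = np.asarray(values)
            out_12s = (values < lo) | (values > hi)
            side = np.sign(values - mean)
            # 10-x: ten consecutive results on the same side of the mean.
            out_10x = np.array([i >= 9 and abs(side[i - 9:i + 1].sum()) == 10
                                for i in range(len(values))])
            return out_12s, out_10x

        qc = np.random.normal(1.0, 0.05, size=120)    # invented QC results
        mean, lo, hi = control_limits(qc[:20])
        flags_12s, flags_10x = westgard_flags(qc[20:], mean, lo, hi)
        print(flags_12s.sum(), "1-2s flags;", flags_10x.sum(), "10-x flags")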

  8. Operational Processing of Ground Validation Data for the Tropical Rainfall Measuring Mission

    NASA Technical Reports Server (NTRS)

    Kulie, Mark S.; Robinson, Mike; Marks, David A.; Ferrier, Brad S.; Rosenfeld, Danny; Wolff, David B.

    1999-01-01

    The Tropical Rainfall Measuring Mission (TRMM) satellite was successfully launched in November 1997. A primary goal of TRMM is to sample tropical rainfall using the first active spaceborne precipitation radar. To validate TRMM satellite observations, a comprehensive Ground Validation (GV) Program has been implemented for this mission. A key component of GV is the analysis and quality control of meteorological ground-based radar data from four primary sites: Melbourne, FL; Houston, TX; Darwin, Australia; and Kwajalein Atoll, RMI. As part of the TRMM GV effort, the Joint Center for Earth Systems Technology (JCET) at the University of Maryland, Baltimore County, has been tasked with developing and implementing an operational system to quality control (QC), archive, and provide data for subsequent rainfall product generation from the four primary GV sites. This paper provides an overview of the JCET operational environment. A description of the QC algorithm and its performance, in addition to the data flow procedure between JCET and the TRMM Science and Data Information System (TSDIS), is presented. The impact of quality-controlled data on higher-level rainfall and reflectivity products will also be addressed. Finally, a brief description of JCET's expanded role in producing reference rainfall products will be discussed.

  9. Hot mix asphalt voids acceptance review of QC/QA data 2000 through 2004.

    DOT National Transportation Integrated Search

    2006-07-01

    This report analyzes the Quality Control/Quality Assurance (QC/QA) data for hot mix asphalt using voids acceptance as the testing criteria for the years 2000 through 2004. Analysis of the overall quality of the HMA is accomplished by reviewing th...

  10. Design, fabrication, and optimization of quantum cascade laser cavities and spectroscopy of the intersubband gain

    NASA Astrophysics Data System (ADS)

    Dirisu, Afusat Olayinka

    Quantum Cascade (QC) lasers are intersubband light sources operating in the wavelength range of ~3 to 300 μm and are used in applications such as sensing (environmental, biological, and hazardous chemical), infrared countermeasures, and free-space infrared communications. The mid-infrared range (i.e., λ ~ 3-30 μm) is of particular importance in sensing because of the strong interaction of laser radiation with various chemical species, while in free-space communications the atmospheric windows of 3-5 μm and 8-12 μm are highly desirable for low-loss transmission. Some of the requirements of these applications include (1) high output power for improved sensitivity; (2) high operating temperatures for compact and cost-effective systems; (3) wide tunability; and (4) single-mode operation for high selectivity. In the past, available mid-infrared sources, such as lead-salt and solid-state lasers, were bulky, expensive, or emitted low output power. In recent years, QC lasers have been explored as cost-effective and compact sources because of their potential to satisfy and exceed all the above requirements. Also, the ultrafast carrier lifetimes of intersubband transitions in QC lasers are promising for high-bandwidth free-space infrared communication. This thesis focused on the improvement of QC lasers through the design and optimization of the laser cavity and characterization of the laser gain medium. The optimization of the laser cavity included (1) the design and fabrication of high-reflection Bragg gratings and subwavelength antireflection gratings, by focused ion beam milling, to achieve tunable, single-mode, and high-power QC lasers, and (2) modeling of slab-coupled optical waveguide QC lasers for high-brightness output beams. The characterization of the QC laser gain medium was carried out using the single-pass transmission experiment, a sensitive measurement technique for probing the intersubband transitions and the electron distribution of QC lasers under different temperatures and applied bias conditions, unlike typical infrared measurement techniques that are restricted to non-functional devices. With the single-pass technique, a basic understanding of the physics behind the workings of the QC laser gain can be achieved, which is invaluable in the design of QC lasers with high output power and high operating temperatures.
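
    For context on the grating design goal mentioned above, the first-order Bragg condition is the standard textbook relation linking the wavelength of peak reflectivity to the grating period; it is included here as background, not as a result of the thesis:

        \lambda_B = 2\, n_{\mathrm{eff}}\, \Lambda

    where \lambda_B is the free-space Bragg wavelength, n_eff is the effective refractive index of the guided mode, and \Lambda is the grating period.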

  11. Summary Report for the Evaluation of Current QA Processes Within the FRMAC FAL and EPA MERL.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shanks, Sonoya T.; Redding, Ted; Jaussi, Lynn

    The Federal Radiological Monitoring and Assessment Center (FRMAC) relies on accurate and defensible analytical laboratory data to support its mission. Therefore, FRMAC must ensure that the environmental analytical laboratories providing analytical services maintain an ongoing capability to provide accurate analytical results to DOE. The more Quality Assurance (QA) and Quality Control (QC) measures required of the laboratory, the fewer resources are available for analysis of response samples. Because QA and QC measures are generally understood to comprise a major part of a laboratory's operations, requirements should only be imposed if they are deemed "value-added" for the FRMAC mission. This report provides observations of areas for improvement and potential interoperability opportunities in the areas of Batch Quality Control Requirements, Written Communications, Data Review Processes, and Data Reporting Processes, along with lessons learned as they apply to items in the early phase of a response that will be critical for developing a more efficient, integrated response for future interactions between the FRMAC and EPA assets.

  12. Evaluating Cell Processes, Quality, and Biomarkers in Pluripotent Stem Cells Using Video Bioinformatics

    PubMed Central

    Lin, Sabrina C.; Bays, Brett C.; Omaiye, Esther; Bhanu, Bir; Talbot, Prue

    2016-01-01

    There is a foundational need for quality control tools in stem cell laboratories engaged in basic research, regenerative therapies, and toxicological studies. These tools require automated methods for evaluating cell processes and quality during in vitro passaging, expansion, maintenance, and differentiation. In this paper, an unbiased, automated high-content profiling toolkit, StemCellQC, is presented that non-invasively extracts information on cell quality and cellular processes from time-lapse phase-contrast videos. Twenty four (24) morphological and dynamic features were analyzed in healthy, unhealthy, and dying human embryonic stem cell (hESC) colonies to identify those features that were affected in each group. Multiple features differed in the healthy versus unhealthy/dying groups, and these features were linked to growth, motility, and death. Biomarkers were discovered that predicted cell processes before they were detectable by manual observation. StemCellQC distinguished healthy and unhealthy/dying hESC colonies with 96% accuracy by non-invasively measuring and tracking dynamic and morphological features over 48 hours. Changes in cellular processes can be monitored by StemCellQC and predictions can be made about the quality of pluripotent stem cell colonies. This toolkit reduced the time and resources required to track multiple pluripotent stem cell colonies and eliminated handling errors and false classifications due to human bias. StemCellQC provided both user-specified and classifier-determined analysis in cases where the affected features are not intuitive or anticipated. Video analysis algorithms allowed assessment of biological phenomena using automatic detection analysis, which can aid facilities where maintaining stem cell quality and/or monitoring changes in cellular processes are essential. In the future StemCellQC can be expanded to include other features, cell types, treatments, and differentiating cells. PMID:26848582

  13. Evaluating Cell Processes, Quality, and Biomarkers in Pluripotent Stem Cells Using Video Bioinformatics.

    PubMed

    Zahedi, Atena; On, Vincent; Lin, Sabrina C; Bays, Brett C; Omaiye, Esther; Bhanu, Bir; Talbot, Prue

    2016-01-01

    There is a foundational need for quality control tools in stem cell laboratories engaged in basic research, regenerative therapies, and toxicological studies. These tools require automated methods for evaluating cell processes and quality during in vitro passaging, expansion, maintenance, and differentiation. In this paper, an unbiased, automated high-content profiling toolkit, StemCellQC, is presented that non-invasively extracts information on cell quality and cellular processes from time-lapse phase-contrast videos. Twenty four (24) morphological and dynamic features were analyzed in healthy, unhealthy, and dying human embryonic stem cell (hESC) colonies to identify those features that were affected in each group. Multiple features differed in the healthy versus unhealthy/dying groups, and these features were linked to growth, motility, and death. Biomarkers were discovered that predicted cell processes before they were detectable by manual observation. StemCellQC distinguished healthy and unhealthy/dying hESC colonies with 96% accuracy by non-invasively measuring and tracking dynamic and morphological features over 48 hours. Changes in cellular processes can be monitored by StemCellQC and predictions can be made about the quality of pluripotent stem cell colonies. This toolkit reduced the time and resources required to track multiple pluripotent stem cell colonies and eliminated handling errors and false classifications due to human bias. StemCellQC provided both user-specified and classifier-determined analysis in cases where the affected features are not intuitive or anticipated. Video analysis algorithms allowed assessment of biological phenomena using automatic detection analysis, which can aid facilities where maintaining stem cell quality and/or monitoring changes in cellular processes are essential. In the future StemCellQC can be expanded to include other features, cell types, treatments, and differentiating cells.

  14. A Strategy to Establish a Quality Assurance/Quality Control Plan for the Application of Biosensors for the Detection of E. coli in Water.

    PubMed

    Hesari, Nikou; Kıratlı Yılmazçoban, Nursel; Elzein, Mohamad; Alum, Absar; Abbaszadegan, Morteza

    2017-01-03

    Rapid bacterial detection using biosensors is a novel approach for microbiological testing applications. Validation of such methods is an obstacle in the adoption of new bio-sensing technologies for water testing. Therefore, establishing a quality assurance and quality control (QA/QC) plan is essential to demonstrate the accuracy and reliability of the biosensor method for the detection of E. coli in drinking water samples. In this study, different reagents and assay conditions, including temperatures, holding time, E. coli strains and concentrations, dissolving agents, salinity and pH effects, quality of substrates from various suppliers of 4-methylumbelliferyl glucuronide (MUG), and environmental water samples, were included in the QA/QC plan and used in the assay optimization and documentation. Furthermore, the procedural QA/QC for the monitoring of drinking water samples was established to validate the performance of the biosensor platform for the detection of E. coli against a culture-based standard technique. Implementing the developed QA/QC plan, the same level of precision and accuracy was achieved with both the standard and the biosensor methods. The established procedural QA/QC for the biosensor will provide a reliable tool for near real-time monitoring of E. coli in drinking water samples to both industry and regulatory authorities.

  15. User-friendly solutions for microarray quality control and pre-processing on ArrayAnalysis.org

    PubMed Central

    Eijssen, Lars M. T.; Jaillard, Magali; Adriaens, Michiel E.; Gaj, Stan; de Groot, Philip J.; Müller, Michael; Evelo, Chris T.

    2013-01-01

    Quality control (QC) is crucial for any scientific method producing data. Applying adequate QC introduces new challenges in the genomics field, where large amounts of data are produced with complex technologies. For DNA microarrays, specific algorithms for QC and pre-processing, including normalization, have been developed by the scientific community, especially for expression chips of the Affymetrix platform. Many of these have been implemented in the statistical scripting language R and are available from the Bioconductor repository. However, application is hampered by a lack of integrative tools that can be used by users of any experience level. To fill this gap, we developed a freely available tool for QC and pre-processing of Affymetrix gene expression results, extending, integrating, and harmonizing the functionality of Bioconductor packages. The tool can be easily accessed through a wizard-like web portal at http://www.arrayanalysis.org or downloaded for local use in R. The portal provides extensive documentation, including user guides, interpretation help with real output illustrations, and detailed technical documentation. It assists newcomers to the field in performing state-of-the-art QC and pre-processing while offering data analysts an integral open-source package. Providing the scientific community with this easily accessible tool will help improve data quality and promote the reuse and adoption of standards. PMID:23620278

  16. A Strategy to Establish a Quality Assurance/Quality Control Plan for the Application of Biosensors for the Detection of E. coli in Water

    PubMed Central

    Hesari, Nikou; Kıratlı Yılmazçoban, Nursel; Elzein, Mohamad; Alum, Absar; Abbaszadegan, Morteza

    2017-01-01

    Rapid bacterial detection using biosensors is a novel approach for microbiological testing applications. Validation of such methods is an obstacle in the adoption of new bio-sensing technologies for water testing. Therefore, establishing a quality assurance and quality control (QA/QC) plan is essential to demonstrate the accuracy and reliability of the biosensor method for the detection of E. coli in drinking water samples. In this study, different reagents and assay conditions, including temperatures, holding time, E. coli strains and concentrations, dissolving agents, salinity and pH effects, quality of substrates from various suppliers of 4-methylumbelliferyl glucuronide (MUG), and environmental water samples, were included in the QA/QC plan and used in the assay optimization and documentation. Furthermore, the procedural QA/QC for the monitoring of drinking water samples was established to validate the performance of the biosensor platform for the detection of E. coli against a culture-based standard technique. Implementing the developed QA/QC plan, the same level of precision and accuracy was achieved with both the standard and the biosensor methods. The established procedural QA/QC for the biosensor will provide a reliable tool for near real-time monitoring of E. coli in drinking water samples to both industry and regulatory authorities. PMID:28054956

  17. Molecular Characterization of Tick Salivary Gland Glutaminyl Cyclase

    PubMed Central

    Adamson, Steven W.; Browning, Rebecca E.; Chao, Chien-Chung; Bateman, Robert C.; Ching, Wei-Mei; Karim, Shahid

    2013-01-01

    Glutaminyl cyclase (QC) catalyzes the cyclization of N-terminal glutamine residues into pyroglutamate. This post-translational modification extends the half-life of peptides and, in some cases, is essential for binding to their cognate receptor. Due to its potential role in the post-translational modification of tick neuropeptides, we report the molecular, biochemical, and physiological characterization of salivary gland QC during the prolonged blood-feeding of the black-legged tick (Ixodes scapularis) and the Gulf Coast tick (Amblyomma maculatum). QC sequences from I. scapularis and A. maculatum showed a high degree of amino acid identity to each other and to other arthropods, and residues critical for zinc-binding/catalysis (D159, E202, and H330) or intermediate stabilization (E201, W207, D248, D305, F325, and W329) are conserved. Analysis of QC transcriptional gene expression kinetics shows an upregulation during the blood meal of adult female ticks prior to the fast feeding phases in both I. scapularis and A. maculatum, suggesting a functional link with blood meal uptake. QC enzymatic activity was detected in saliva and extracts of tick salivary glands and midguts. Recombinant QC was shown to be catalytically active. Furthermore, knockdown of the QC transcript by RNA interference resulted in lower enzymatic activity and small, unviable egg masses in both studied tick species, as well as lower engorged tick weights for I. scapularis. These results suggest that the post-translational modification of neurotransmitters and other bioactive peptides by QC is critical to oviposition and potentially other physiological processes. Moreover, these data suggest that tick-specific QC-modified neurotransmitters/hormones or other relevant parts of this system could potentially be used as novel physiological targets for tick control. PMID:23770496

  18. Comparison of quality control software tools for diffusion tensor imaging.

    PubMed

    Liu, Bilan; Zhu, Tong; Zhong, Jianhui

    2015-04-01

    Image quality in diffusion tensor imaging (DTI) is critical for image interpretation, diagnostic accuracy, and efficiency. However, DTI is susceptible to numerous detrimental artifacts that may impair the reliability and validity of the obtained data. Although many quality control (QC) software tools have been developed and are widely used, each with its own tradeoffs, there is still no general agreement on an image QC routine for DTI, and the practical impact of these tradeoffs is not well studied. An objective comparison that identifies the pros and cons of each of the QC tools will help users make the best choice among tools for specific DTI applications. This study aims to quantitatively compare the effectiveness of three popular QC tools: DTI Studio (Johns Hopkins University), DTIPrep (University of North Carolina at Chapel Hill, University of Iowa, and University of Utah), and TORTOISE (National Institutes of Health). Both synthetic and in vivo human brain data were used to quantify the adverse effects of major DTI artifacts on tensor calculation, as well as the effectiveness of the different QC tools in identifying and correcting these artifacts. The technical basis of each tool is discussed, and the ways in which particular techniques affect the output of each of the tools are analyzed. The different functions and I/O formats that the three QC tools provide for building a general DTI processing pipeline and for integration with other popular image processing tools are also discussed. Copyright © 2015 Elsevier Inc. All rights reserved.

  19. TU-F-CAMPUS-I-05: Semi-Automated, Open Source MRI Quality Assurance and Quality Control Program for Multi-Unit Institution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yung, J; Stefan, W; Reeve, D

    2015-06-15

    Purpose: Phantom measurements allow the performance of magnetic resonance (MR) systems to be evaluated. American Association of Physicists in Medicine (AAPM) Report No. 100, Acceptance Testing and Quality Assurance Procedures for MR Imaging Facilities, the American College of Radiology (ACR) MR Accreditation Program phantom testing, and the ACR MRI quality control (QC) program documents help outline specific tests for establishing system performance baselines as well as system stability over time. Analyzing and processing tests from multiple systems can be time-consuming for medical physicists. Besides determining whether tests are within predetermined limits or criteria, monitoring longitudinal trends can also help prevent costly downtime of systems during clinical operation. In this work, a semi-automated QC program was developed to analyze and record measurements in a database that allows easy access to historical data. Methods: Image analysis was performed on 27 different MR systems of 1.5T and 3.0T field strengths from GE and Siemens. Recommended measurements involved the ACR MRI Accreditation Phantom, spherical homogeneous phantoms, and a phantom with a uniform hole pattern. Measurements assessed geometric accuracy and linearity, position accuracy, image uniformity, signal, noise, ghosting, transmit gain, center frequency, and magnetic field drift. The program was designed with open-source tools, employing Linux, Apache, the MySQL database, and the Python programming language for the front and back ends. Results: Processing time for each image is <2 seconds. Figures are produced to show the regions of interest (ROIs) used for analysis. Historical data can be reviewed to compare against previous years and to inspect for trends. Conclusion: An MRI quality assurance and QC program is necessary for maintaining high-quality, ACR-accredited MR programs. A reviewable database of phantom measurements assists medical physicists with processing and monitoring large datasets. Longitudinal data can reveal trends that, although within passing criteria, indicate underlying system issues.
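
    Two of the phantom measures listed above reduce to simple ROI statistics. The sketch below computes percent integral uniformity (PIU) over a central ROI and a basic SNR estimate from signal and background ROIs; the formulas follow common ACR-style definitions, but the ROI placement and stand-in image are simplified assumptions.

        # Hedged sketch: ROI-based uniformity and SNR for a phantom slice.
        import numpy as np

        def percent_integral_uniformity(roi):
            smax, smin = roi.max(), roi.min()
            return 100.0 * (1.0 - (smax - smin) / (smax + smin))

        def snr(signal_roi, background_roi):
            # Simplified: mean signal over background standard deviation.
            return signal_roi.mean() / background_roi.std(ddof=1)

        image = np.random.normal(1000, 15, size=(256, 256))  # stand-in slice
        center = image[96:160, 96:160]    # central signal ROI
        corner = image[:32, :32]          # background ROI
        print(percent_integral_uniformity(center), snr(center, corner))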

  20. Practical Shipbuilding Standards for Surface Preparation and Coatings

    DTIC Science & Technology

    1979-07-01

    strong solvent and apply over last coat of epoxy within 48 hours. *Minimum Dry Film Thickness 12.0 SAFETY AND POLLUTION CONTROL 12.5 Safety solvents shall... Owner Inspection (3) QA/QC Dept. Inspectors. (4) Craft Inspectors (5) Craft Supervision Inspection Only (6) QA/QC Dept. Audit Only (7) Are

  1. Implementation of quality assurance in diagnostic radiology in Bosnia and Herzegovina (Republic of Srpska).

    PubMed

    Bosnjak, J; Ciraj-Bjelac, O; Strbac, B

    2008-01-01

    Application of a quality control (QC) programme is very important when optimisation of image quality and reduction of patient exposure are desired. QC surveys of diagnostic imaging equipment in the Republic of Srpska (an entity of Bosnia and Herzegovina) have been systematically performed since 2001. The presented results relate mostly to the QC test results for X-ray tubes and generators of diagnostic radiology units in 92 radiology departments. In addition, the results include workplace monitoring and the usage of personal protective devices for staff and patients. The results show improvements in the implementation of the QC programme within the period 2001-2005. Also, more attention is now given to appropriate maintenance of imaging equipment, which was one of the main problems in the past. Implementation of a QC programme is a continuous and complex process. To achieve good performance of imaging equipment, additional tests are to be introduced, along with image quality assessment and patient dosimetry. Training is very important in order to achieve these goals.

  2. In-Situ Atmospheric Sounding Data Lifecycle from Data Collection, Analysis and Quality Control to Documentation, Archival and Tracking

    NASA Astrophysics Data System (ADS)

    Young, K.; Voemel, H.; Morris, D.

    2015-12-01

    In-situ measurement systems are used to monitor the atmosphere; instruments are located in the area of interest and are in direct contact with what is being measured. Dropsondes and radiosondes are instruments used to collect high-vertical-resolution profiles of the atmosphere. Dropsondes are deployed from aircraft and, as they descend, collect pressure, temperature, and humidity data at a half-second rate, and GPS wind data at a quarter-second rate. Radiosondes are used to collect high-resolution measurements of the atmosphere from the ground to approximately 30 kilometers. Carried by a large helium-filled balloon, they ascend through the atmosphere measuring pressure, temperature, relative humidity, and GPS winds at a one-second rate. Advancements in atmospheric research, technology, and data assimilation techniques have driven the need for higher-quality, higher-resolution radiosonde and dropsonde data at an increasingly rapid rate. These data most notably represent a valuable resource for initializing numerical prediction models, calibrating and validating satellite retrieval techniques for atmospheric profiles, and for climatological research. The In-Situ Sensing Facility at NCAR has developed an extensive, multi-step process of quality control (QC). Traditionally, QC has been a time-intensive process that involves evaluating data products using a variety of visualization tools and statistical methods. With a greater need for real-time data in the field and a reduced turnaround time for final quality-controlled data, new and improved procedures for streamlining statistical analysis and QC are being implemented. Improvements have also been made on two fronts regarding implementation of a comprehensive data management plan. The first was ensuring ease of data accessibility through an intuitive, centralized data archive system that both keeps a record of data users and assigns digital object identifiers to each unique data set. The second was to define appropriate criteria for documentation and metadata so that data users have all of the relevant information needed to properly use and understand the complexities of these measurements.

  3. Mammography dosimetry using an in-house developed polymethyl methacrylate phantom.

    PubMed

    Sharma, Reena; Sharma, Sunil Dutt; Mayya, Y S; Chourasiya, G

    2012-08-01

    Phantom-based measurements in mammography are well established for quality assurance (QA) and quality control (QC) procedures involving equipment performance and comparisons of X-ray machines. Polymethyl methacrylate (PMMA) is among the most suitable materials for simulation of the breast. For carrying out QA/QC exercises in India, a mammographic PMMA phantom with engraved slots for holding thermoluminescence dosemeters (TLDs) has been developed. The radiation transmission property of the developed phantom was compared with commercially available phantoms to verify its suitability for mammography dosimetry. The breast entrance exposure (BEE), mean glandular dose (MGD), percentage depth dose (PDD), percentage surface dose distribution (PSDD), calibration testing of automatic exposure control (AEC), and density control function of a mammography machine were measured using this phantom. MGD was derived from the measured BEE following two different methodologies and the results were compared. The PDD and PSDD measurements were carried out using LiF:Mg,Cu,P chips. The in-house phantom was found to be comparable with the commercially available phantoms. The differences in the MGD values derived using the two methods were in the range of 17.5-32.6%. Measured depth ranges in the phantom lie between 0.32 and 0.40 cm for 75% depth dose, 0.73 and 0.92 cm for 50% depth dose, and 1.54 and 1.78 cm for 25% depth dose. A higher PSDD value was observed towards the chest wall edge of the phantom, which is due to the orientation of the cathode-anode axis along the chest wall to nipple direction. Results obtained for AEC configuration testing show that the observed mean optical density (O.D.) of the phantom image was 1.59 and the O.D. difference for every successive increase in thickness of the phantom was within ±0.15 O.D. Under density control function testing, at the -2 and -1 density settings, the variation in film image O.D. was within ±0.15 O.D. of the normal density setting '0', and at the +2 and +1 density settings, it was within ±0.30 O.D. This study indicates that the locally made PMMA TLD slot phantom can be used to measure various mammography QC parameters which are essential for better outcomes in mammography.
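
    One standard way to derive MGD from a measured entrance air kerma is to multiply by tabulated conversion factors (Dance-style g, c, and s factors for glandularity, breast composition, and spectrum). The sketch below shows only the arithmetic; the numeric factors are placeholders, not the values or the exact methodologies used in this study.

        # Hedged sketch of the MGD-from-entrance-kerma arithmetic.
        def mean_glandular_dose(entrance_air_kerma_mgy,
                                g=0.2, c=1.0, s=1.0):
            # MGD = K * g * c * s; the factors here are placeholders that
            # in practice are read from published tables for the beam
            # quality, breast thickness, and composition at hand.
            return entrance_air_kerma_mgy * g * c * s

        print(mean_glandular_dose(7.0))   # e.g. 7 mGy entrance air kerma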

  4. Energy transfer networks: Quasicontinuum photoluminescence linked to high densities of defects

    DOE PAGES

    Laurence, Ted A.; Ly, Sonny; Bude, Jeff D.; ...

    2017-11-06

    In a series of studies related to laser-induced damage of optical materials and deposition of plastics, we discovered a broadly emitting photoluminescence with fast lifetimes that we termed quasicontinuum photoluminescence (QC-PL). In this paper, we suggest that a high density of optically active defects leads to QC-PL, where interactions between defects affect the temporal and spectral characteristics of both excitation and emission. We develop a model that predicts the temporal characteristics of QC-PL, based on energy transfer interactions between high densities of defects. Our model does not explain all spectral broadening and redshifts found in QC-PL, since we do not model spectral changes in defects due to proximity to other defects. However, we do provide an example of a well-defined system that exhibits the QC-PL characteristics of a distribution in shortened lifetimes and broadened, redshifted energy levels: an organic chromophore (fluorescein) that has been dried rapidly on a fused silica surface. Recently, we showed that regions of fused silica exposed to up to 1 billion high-fluence laser shots at 351 nm at subdamage fluences exhibit significant transmission losses at the surface. Here, we find that these laser-exposed regions also exhibit QC-PL. Increases in the density of induced defects on these laser-exposed surfaces, as measured by the local transmission loss, lead to decreases in the observed lifetime and redshifts in the spectrum of the QC-PL, consistent with our explanation for QC-PL. In conclusion, we have found QC-PL in an increasing variety of situations and materials, and we believe it is a phenomenon commonly found on surfaces and nanostructured materials.

  5. Energy transfer networks: Quasicontinuum photoluminescence linked to high densities of defects

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Laurence, Ted A.; Ly, Sonny; Bude, Jeff D.

    In a series of studies related to laser-induced damage of optical materials and deposition of plastics, we discovered a broadly emitting photoluminescence with fast lifetimes that we termed quasicontinuum photoluminescence (QC-PL). In this paper, we suggest that a high density of optically active defects leads to QC-PL, where interactions between defects affect the temporal and spectral characteristics of both excitation and emission. We develop a model that predicts the temporal characteristics of QC-PL, based on energy transfer interactions between high densities of defects. Our model does not explain all spectral broadening and redshifts found in QC-PL, since we do not model spectral changes in defects due to proximity to other defects. However, we do provide an example of a well-defined system that exhibits the QC-PL characteristics of a distribution in shortened lifetimes and broadened, redshifted energy levels: an organic chromophore (fluorescein) that has been dried rapidly on a fused silica surface. Recently, we showed that regions of fused silica exposed to up to 1 billion high-fluence laser shots at 351 nm at subdamage fluences exhibit significant transmission losses at the surface. Here, we find that these laser-exposed regions also exhibit QC-PL. Increases in the density of induced defects on these laser-exposed surfaces, as measured by the local transmission loss, lead to decreases in the observed lifetime and redshifts in the spectrum of the QC-PL, consistent with our explanation for QC-PL. In conclusion, we have found QC-PL in an increasing variety of situations and materials, and we believe it is a phenomenon commonly found on surfaces and nanostructured materials.

  6. Energy transfer networks: Quasicontinuum photoluminescence linked to high densities of defects

    NASA Astrophysics Data System (ADS)

    Laurence, Ted A.; Ly, Sonny; Bude, Jeff D.; Baxamusa, Salmaan H.; Lepró, Xavier; Ehrmann, Paul

    2017-11-01

    In a series of studies related to laser-induced damage of optical materials and deposition of plastics, we discovered a broadly emitting photoluminescence with fast lifetimes that we termed quasicontinuum photoluminescence (QC-PL). Here, we suggest that a high density of optically active defects leads to QC-PL, where interactions between defects affect the temporal and spectral characteristics of both excitation and emission. We develop a model that predicts the temporal characteristics of QC-PL, based on energy transfer interactions between high densities of defects. Our model does not explain all spectral broadening and redshifts found in QC-PL, since we do not model spectral changes in defects due to proximity to other defects. However, we do provide an example of a well-defined system that exhibits the QC-PL characteristics of a distribution in shortened lifetimes and broadened, redshifted energy levels: an organic chromophore (fluorescein) that has been dried rapidly on a fused silica surface. Recently, we showed that regions of fused silica exposed to up to 1 billion high-fluence laser shots at 351 nm at subdamage fluences exhibit significant transmission losses at the surface. Here, we find that these laser-exposed regions also exhibit QC-PL. Increases in the density of induced defects on these laser-exposed surfaces, as measured by the local transmission loss, lead to decreases in the observed lifetime and redshifts in the spectrum of the QC-PL, consistent with our explanation for QC-PL. We have found QC-PL in an increasing variety of situations and materials, and we believe it is a phenomenon commonly found on surfaces and nanostructured materials.

  7. Quality Control of Wind Data from 50-MHz Doppler Radar Wind Profiler

    NASA Technical Reports Server (NTRS)

    Vacek, Austin

    2016-01-01

    Upper-level wind profiles obtained from a 50-MHz Doppler Radar Wind Profiler (DRWP) instrument at Kennedy Space Center are incorporated in space launch vehicle design and day-of-launch operations to assess wind effects on the vehicle during ascent. Automated and manual quality control (QC) techniques are implemented to remove spurious data in the upper-level wind profiles caused from atmospheric and non-atmospheric artifacts over the 2010-2012 period of record (POR). By adding the new quality controlled profiles with older profiles from 1997-2009, a robust database will be constructed of upper-level wind characteristics. Statistical analysis will determine the maximum, minimum, and 95th percentile of the wind components from the DRWP profiles over recent POR and compare against the older database. Additionally, this study identifies specific QC flags triggered during the QC process to understand how much data is retained and removed from the profiles.

  8. Quality Control of Wind Data from 50-MHz Doppler Radar Wind Profiler

    NASA Technical Reports Server (NTRS)

    Vacek, Austin

    2015-01-01

    Upper-level wind profiles obtained from a 50-MHz Doppler Radar Wind Profiler (DRWP) instrument at Kennedy Space Center are incorporated in space launch vehicle design and day-of-launch operations to assess wind effects on the vehicle during ascent. Automated and manual quality control (QC) techniques are implemented to remove spurious data in the upper-level wind profiles caused from atmospheric and non-atmospheric artifacts over the 2010-2012 period of record (POR). By adding the new quality controlled profiles with older profiles from 1997-2009, a robust database will be constructed of upper-level wind characteristics. Statistical analysis will determine the maximum, minimum, and 95th percentile of the wind components from the DRWP profiles over recent POR and compare against the older database. Additionally, this study identifies specific QC flags triggered during the QC process to understand how much data is retained and removed from the profiles.

  9. NHEXAS PHASE I ARIZONA STUDY--STANDARD OPERATING PROCEDURE FOR FORM QA/QC CHECKS (UA-C-2.0)

    EPA Science Inventory

    The purpose of this SOP is to outline the process of Field Quality Assurance and Quality Control checks. This procedure was followed to ensure consistent data retrieval during the Arizona NHEXAS project and the "Border" study. Keywords: custody; QA/QC; field checks.

    The Nation...

  10. Quantum Cascade Laser Measurements of Stratospheric Methane (CH4) and Nitrous Oxide (N2O)

    NASA Technical Reports Server (NTRS)

    Webster, C.; Flesch, G.; Scott, D.; Swanson, J.; May, R.; Gmachl, S.; Capasso, F.; Sivco, D.; Baillargeon, J.; Hutchinson, A.; et al.

    1999-01-01

    A tunable Quantum-Cascade (QC) laser has been flown on NASA's ER-2 high-altitude aircraft to produce the first atmospheric gas measurements using this newly invented device, an important milestone in the QC laser's much-anticipated future planetary, industrial, and commercial applications.

  11. 40 CFR 98.144 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Glass Production § 98.144 Monitoring and QA/QC requirements. (a) You must measure annual amounts of carbonate-based raw materials charged to each continuous glass... calibrated scales or weigh hoppers. Total annual mass charged to glass melting furnaces at the facility shall...

  12. 40 CFR 98.414 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Suppliers of Industrial Greenhouse Gases § 98.414 Monitoring... or better. If the mass in paragraph (a) of this section is measured by weighing containers that...

  13. A Multilaboratory, Multicountry Study To Determine Bedaquiline MIC Quality Control Ranges for Phenotypic Drug Susceptibility Testing

    PubMed Central

    Cirillo, Daniela M.; Hoffner, Sven; Ismail, Nazir A.; Kaur, Devinder; Lounis, Nacer; Metchock, Beverly; Pfyffer, Gaby E.; Venter, Amour

    2016-01-01

    The aim of this study was to establish standardized drug susceptibility testing (DST) methodologies and reference MIC quality control (QC) ranges for bedaquiline, a diarylquinoline antimycobacterial, used in the treatment of adults with multidrug-resistant tuberculosis. Two tier-2 QC reproducibility studies of bedaquiline DST were conducted in eight laboratories using Clinical and Laboratory Standards Institute (CLSI) guidelines. Agar dilution and broth microdilution methods were evaluated. Mycobacterium tuberculosis H37Rv was used as the QC reference strain. Bedaquiline MIC frequency, mode, and geometric mean were calculated. When resulting data occurred outside predefined CLSI criteria, the entire laboratory data set was excluded. For the agar dilution MIC, a 4-dilution QC range (0.015 to 0.12 μg/ml) centered around the geometric mean included 95.8% (7H10 agar dilution; 204/213 observations with one data set excluded) or 95.9% (7H11 agar dilution; 232/242) of bedaquiline MICs. For the 7H9 broth microdilution MIC, a 3-dilution QC range (0.015 to 0.06 μg/ml) centered around the mode included 98.1% (207/211, with one data set excluded) of bedaquiline MICs. Microbiological equivalence was demonstrated for bedaquiline MICs determined using 7H10 agar and 7H11 agar but not for bedaquiline MICs determined using 7H9 broth and 7H10 agar or 7H9 broth and 7H11 agar. Bedaquiline DST methodologies and MIC QC ranges against the H37Rv M. tuberculosis reference strain have been established: 0.015 to 0.12 μg/ml for the 7H10 and 7H11 agar dilution MICs and 0.015 to 0.06 μg/ml for the 7H9 broth microdilution MIC. These methodologies and QC ranges will be submitted to CLSI and EUCAST to inform future research and provide guidance for routine clinical bedaquiline DST in laboratories worldwide. PMID:27654337
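
    The range arithmetic described above reduces to a few computable quantities. The hedged sketch below derives the geometric mean, the modal MIC, and the fraction of observations falling inside a QC range spanning a fixed number of two-fold dilutions; the MIC values are invented, so this illustrates the style of the CLSI tier-2 analysis rather than reproducing the study's data.

    ```python
    import numpy as np
    from collections import Counter

    def qc_range_coverage(mics, center, n_dilutions):
        """Fraction of MIC observations inside a QC range of
        `n_dilutions` two-fold dilutions centered on `center` (ug/ml)."""
        mics = np.asarray(mics, float)
        half = (n_dilutions - 1) / 2.0            # dilution steps per side
        lo = center / 2.0 ** np.ceil(half)
        hi = center * 2.0 ** np.floor(half)
        inside = np.sum((mics >= lo) & (mics <= hi))
        return lo, hi, inside / len(mics)

    mics = [0.015, 0.03, 0.03, 0.06, 0.06, 0.06, 0.12, 0.25]  # invented
    geo_mean = 2.0 ** np.mean(np.log2(mics))       # geometric mean
    mode = Counter(mics).most_common(1)[0][0]      # modal MIC
    lo, hi, frac = qc_range_coverage(mics, mode, 3)  # 3-dilution range
    print(geo_mean, mode, (lo, hi), frac)          # e.g. range 0.03-0.12
    ```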

  14. MO-AB-210-00: Diagnostic Ultrasound Imaging Quality Control and High Intensity Focused Ultrasound Therapy Hands-On Workshop

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    The goal of this ultrasound hands-on workshop is to demonstrate advancements in high intensity focused ultrasound (HIFU) and to demonstrate quality control (QC) testing in diagnostic ultrasound. HIFU is a therapeutic modality that uses ultrasound waves as carriers of energy. HIFU is used to focus a beam of ultrasound energy into a small volume at specific target locations within the body. The focused beam causes localized high temperatures and produces well-defined regions of necrosis. This completely non-invasive technology has great potential for tumor ablation and targeted drug delivery. At the workshop, attendees will see configurations, applications, and hands-on demonstrations with on-site instructors at separate stations. The involvement of medical physicists in diagnostic ultrasound imaging service is increasing due to QC and accreditation requirements. At the workshop, an array of ultrasound testing phantoms and ultrasound scanners will be provided for attendees to learn diagnostic ultrasound QC in a hands-on environment with live demonstrations of the techniques. Target audience: Medical physicists and other medical professionals in diagnostic imaging and radiation oncology with interest in high-intensity focused ultrasound and in diagnostic ultrasound QC. Learning Objectives: (1) Learn ultrasound physics and safety for HIFU applications through live demonstrations; (2) Get an overview of the state-of-the-art in HIFU technologies and equipment; (3) Gain familiarity with common elements of a quality control program for diagnostic ultrasound imaging; (4) Identify QC tools available for testing diagnostic ultrasound systems and learn how to use these tools. List of supporting vendors for the HIFU and diagnostic ultrasound QC hands-on workshop: Philips Healthcare; Alpinion Medical Systems; Verasonics, Inc.; Zonare Medical Systems, Inc.; Computerized Imaging Reference Systems (CIRS), Inc.; GAMMEX, Inc.; Cablon Medical BV. Steffen Sammet: NIH/NCI grant 5R25CA132822, NIH/NINDS grant 5R25NS080949 and Philips Healthcare research grant C32.

  15. Land 3D-seismic data: Preprocessing quality control utilizing survey design specifications, noise properties, normal moveout, first breaks, and offset

    USGS Publications Warehouse

    Raef, A.

    2009-01-01

    The recent proliferation of the 3D reflection seismic method into the near-surface area of geophysical applications, especially in response to the emergence of the need to comprehensively characterize and monitor near-surface carbon dioxide sequestration in shallow saline aquifers around the world, justifies the emphasis on a cost-effective and robust quality control and assurance (QC/QA) workflow of 3D seismic data preprocessing that is suitable for near-surface applications. The main purpose of our seismic data preprocessing QC is to enable the use of appropriate header information and of data that are free of noise-dominated traces and flawed vertical stacking in subsequent processing steps. In this article, I provide an account of utilizing survey design specifications, noise properties, first breaks, and normal moveout for rapid and thorough graphical QC/QA diagnostics, which are easy to apply and efficient in the diagnosis of inconsistencies. A correlated vibroseis time-lapse 3D-seismic data set from a CO2-flood monitoring survey is used for demonstrating QC diagnostics. An important by-product of the QC workflow is establishing the number of layers for a refraction statics model in a data-driven graphical manner that capitalizes on the spatial coverage of the 3D seismic data. © China University of Geosciences (Wuhan) and Springer-Verlag GmbH 2009.
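
    One of the simplest screens behind "data that are free of noise-dominated traces" can be sketched as a robust amplitude-outlier test. The statistic and threshold below are assumptions chosen for illustration, not the author's actual graphical diagnostics.

    ```python
    import numpy as np

    def flag_noisy_traces(traces, k=3.0):
        """Flag traces whose RMS amplitude deviates from the survey
        median by more than k median-absolute-deviations.

        traces : (n_traces, n_samples) array
        returns: boolean mask, True = suspect trace
        """
        traces = np.asarray(traces, float)
        rms = np.sqrt(np.mean(traces ** 2, axis=1))
        med = np.median(rms)
        mad = np.median(np.abs(rms - med)) + 1e-12   # avoid divide-by-zero
        return np.abs(rms - med) / mad > k

    rng = np.random.default_rng(0)
    data = rng.normal(0.0, 1.0, size=(100, 500))     # synthetic gather
    data[7] *= 25.0                                  # inject one noisy trace
    print(np.flatnonzero(flag_noisy_traces(data)))   # -> index 7
    ```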

  16. The quality control theory of aging.

    PubMed

    Ladiges, Warren

    2014-01-01

    The quality control (QC) theory of aging is based on the concept that aging is the result of a reduction in QC of cellular systems designed to maintain lifelong homeostasis. Four QC systems associated with aging are 1) inadequate protein processing in a distressed endoplasmic reticulum (ER); 2) histone deacetylase (HDAC) processing of genomic histones and gene silencing; 3) suppressed AMPK nutrient sensing with inefficient energy utilization and excessive fat accumulation; and 4) beta-adrenergic receptor (BAR) signaling and environmental and emotional stress. Reprogramming these systems to maintain efficiency and prevent aging would be a rational strategy for increased lifespan and improved health. The QC theory can be tested with a pharmacological approach using three well-known and safe, FDA-approved drugs: 1) phenyl butyric acid, a chemical chaperone that enhances ER function and is also an HDAC inhibitor, 2) metformin, which activates AMPK and is used to treat type 2 diabetes, and 3) propranolol, a beta blocker which inhibits BAR signaling and is used to treat hypertension and anxiety. A critical aspect of the QC theory, then, is that aging is associated with multiple cellular systems that can be targeted with drug combinations more effectively than with single drugs. But more importantly, these drug combinations will effectively prevent, delay, or reverse chronic diseases of aging that impose such a tremendous health burden on our society.

  17. Use of Enterococcus faecalis and Bacillus atrophaeus as surrogates to establish and maintain laboratory proficiency for concentration of water samples using ultrafiltration.

    PubMed

    Mapp, Latisha; Klonicki, Patricia; Takundwa, Prisca; Hill, Vincent R; Schneeberger, Chandra; Knee, Jackie; Raynor, Malik; Hwang, Nina; Chambers, Yildiz; Miller, Kenneth; Pope, Misty

    2015-11-01

    The U.S. Environmental Protection Agency's (EPA) Water Laboratory Alliance (WLA) currently uses ultrafiltration (UF) for concentration of biosafety level 3 (BSL-3) agents from large volumes (up to 100 L) of drinking water prior to analysis. Most UF procedures require comprehensive training and practice to achieve and maintain proficiency. As a result, there was a critical need to develop quality control (QC) criteria. Because select agents are difficult to work with and pose a significant safety hazard, QC criteria were developed using surrogates, including Enterococcus faecalis and Bacillus atrophaeus. This article presents the results from the QC criteria development study and results from a subsequent demonstration exercise in which E. faecalis was used to evaluate proficiency using UF to concentrate large-volume drinking water samples. Based on preliminary testing, EPA Method 1600 and Standard Method 9218 (for E. faecalis and B. atrophaeus, respectively) were selected for use during the QC criteria development study. The QC criteria established for Method 1600 were used to assess laboratory performance during the demonstration exercise. Based on the results of the QC criteria study, E. faecalis and B. atrophaeus can be used effectively to demonstrate and maintain proficiency using ultrafiltration. Published by Elsevier B.V.

  18. The importance of quality assurance/quality control of diagnostics to increase the confidence in global foot-and-mouth disease control.

    PubMed

    De Clercq, K; Goris, N; Barnett, P V; MacKay, D K

    2008-01-01

    Over the last decade, international trade in animals and animal products has been liberalized, and confidence in this global trade can increase only if appropriate control measures are applied. As foot-and-mouth disease (FMD) diagnostics will play an essential role in this respect, the Food and Agriculture Organization European Commission for the Control of Foot-and-Mouth Disease (EUFMD) co-ordinates, in collaboration with the European Commission, several programmes to increase the quality of FMD diagnostics. A quality assurance (QA) system is deemed essential for laboratories involved in certifying absence of FMDV or antibodies against the virus. Therefore, laboratories are encouraged to validate their diagnostic tests fully and to install a continuous quality control (QC) monitoring system. Knowledge of performance characteristics of diagnostics is essential to interpret results correctly and to calculate sample rates in regional surveillance campaigns. Different aspects of QA/QC of classical and new FMD virological and serological diagnostics are discussed in respect of the EU FMD directive (2003/85/EC). We recommend accepting trade certificates only from laboratories participating in international proficiency testing on a regular basis.

  19. Quality-Assurance/Quality-Control Manual for Collection and Analysis of Water-Quality Data in the Ohio District, US Geological Survey

    USGS Publications Warehouse

    Francy, D.S.; Jones, A.L.; Myers, Donna N.; Rowe, G.L.; Eberle, Michael; Sarver, K.M.

    1998-01-01

    The U.S. Geological Survey (USGS), Water Resources Division (WRD), requires that quality-assurance/quality-control (QA/QC) activities be included in any sampling and analysis program. Operational QA/QC procedures address local needs while incorporating national policies. Therefore, specific technical policies were established for all activities associated with water-quality projects being done by the Ohio District. The policies described in this report provide Ohio District personnel, cooperating agencies, and others with a reference manual on QA/QC procedures that are followed in collecting and analyzing water-quality samples and reporting water-quality information in the Ohio District. The project chief, project support staff, District Water-Quality Specialist, and District Laboratory Coordinator are all involved in planning and implementing QA/QC activities at the district level. The District Chief and other district-level managers provide oversight, and the Regional Water-Quality Specialist, Office of Water Quality (USGS headquarters), and the Branch of Quality Systems within the Office of Water Quality create national QA/QC policies and provide assistance to District personnel. In the literature, the quality of all measurement data is expressed in terms of precision, variability, bias, accuracy, completeness, representativeness, and comparability. In the Ohio District, bias and variability will be used to describe quality-control data generated from samples in the field and laboratory. Each project chief must plan for implementation and financing of QA/QC activities necessary to achieve data-quality objectives. At least 15 percent of the total project effort must be directed toward QA/QC activities. Of this total, 5-10 percent will be used for collection and analysis of quality-control samples. This is an absolute minimum, and more may be required based on project objectives. Proper techniques must be followed in the collection and processing of surface-water, ground-water, biological, precipitation, bed-sediment, bedload, suspended-sediment, and solid-phase samples. These techniques are briefly described in this report and are extensively documented. The reference documents listed in this report will be kept by the District librarian and District Water-Quality Specialist and updated regularly so that they are available to all District staff. Proper handling and documentation before, during, and after field activities are essential to ensure the integrity of the sample and to correct erroneous reporting of data results. Field sites are to be properly identified and entered into the database before field data-collection activities begin. During field activities, field notes are to be completed and sample bottles appropriately labeled and stored. After field activities, all paperwork is to be completed promptly and samples transferred to the laboratory within allowable holding times. All equipment used by District personnel for the collection and processing of water-quality samples is to be properly operated, maintained, and calibrated by project personnel. This includes equipment for onsite measurement of water-quality characteristics (temperature, specific conductance, pH, dissolved oxygen, alkalinity, acidity, and turbidity) and equipment and instruments used for biological sampling. The District Water-Quality Specialist and District Laboratory Coordinator are responsible for preventive maintenance and calibration of equipment in the Ohio District laboratory.
The USGS National Water Quality Laboratory in Arvada, Colo., is the primary source of analytical services for most project work done by the Ohio District. Analyses done at the Ohio District laboratory are usually those that must be completed within a few hours of sample collection. Contract laboratories or other USGS laboratories are sometimes used instead of the NWQL or the Ohio District laboratory. When a contract laboratory is used, the projec
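
    Since the manual frames quality-control data in terms of bias and variability, the sketch below shows the kind of computations typically involved: spike-recovery bias, field-duplicate relative percent difference, and the 15-percent QA/QC effort floor. The functions and numbers are illustrative assumptions, not taken from the manual.

    ```python
    import numpy as np

    def bias_percent(measured, true):
        """Bias of spiked or reference samples, as percent of true value."""
        measured, true = np.asarray(measured, float), np.asarray(true, float)
        return 100.0 * np.mean((measured - true) / true)

    def duplicate_rpd(x1, x2):
        """Relative percent difference for field-duplicate pairs, a
        common variability measure for QC duplicates."""
        x1, x2 = np.asarray(x1, float), np.asarray(x2, float)
        return 100.0 * np.abs(x1 - x2) / ((x1 + x2) / 2.0)

    def qc_effort_ok(qc_hours, total_hours, minimum=0.15):
        """Check the manual's floor: at least 15 percent of total
        project effort must go to QA/QC activities."""
        return qc_hours / total_hours >= minimum

    print(bias_percent([9.6, 10.3], [10.0, 10.0]))       # spike-recovery bias
    print(duplicate_rpd([4.9, 20.0], [5.1, 22.0]))       # duplicate variability
    print(qc_effort_ok(qc_hours=160, total_hours=1000))  # 16% -> True
    ```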

  20. Impacts of Intelligent Automated Quality Control on a Small Animal APD-Based Digital PET Scanner

    NASA Astrophysics Data System (ADS)

    Charest, Jonathan; Beaudoin, Jean-François; Bergeron, Mélanie; Cadorette, Jules; Arpin, Louis; Lecomte, Roger; Brunet, Charles-Antoine; Fontaine, Réjean

    2016-10-01

    Stable system performance is mandatory to warrant the accuracy and reliability of biological results relying on small animal positron emission tomography (PET) imaging studies. This simple requirement sets the ground for imposing routine quality control (QC) procedures to keep PET scanners at a reliable optimal performance level. However, such procedures can become burdensome to implement for scanner operators, especially taking into account the increasing number of data acquisition channels in newer generation PET scanners. In systems using pixel detectors to achieve enhanced spatial resolution and contrast-to-noise ratio (CNR), the QC workload rapidly increases to unmanageable levels due to the number of independent channels involved. An artificial intelligence based QC system, referred to as Scanner Intelligent Diagnosis for Optimal Performance (SIDOP), was proposed to help reduce the QC workload by performing automatic channel fault detection and diagnosis. SIDOP consists of four high-level modules that employ machine learning methods to perform their tasks: Parameter Extraction, Channel Fault Detection, Fault Prioritization, and Fault Diagnosis. Ultimately, SIDOP submits a prioritized faulty channel list to the operator and proposes actions to correct them. To validate that SIDOP can perform QC procedures adequately, it was deployed on a LabPET™ scanner and multiple performance metrics were extracted. After multiple corrections of sub-optimal scanner settings, an 8.5% (with a 95% confidence interval (CI) of [7.6, 9.3]) improvement in the CNR, a 17.0% (CI: [15.3, 18.7]) decrease in the uniformity percentage standard deviation, and a 6.8% gain in global sensitivity were observed. These results confirm that SIDOP can indeed be of assistance in performing QC procedures and restore performance to optimal figures.
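
    A much-simplified, hedged stand-in for SIDOP's detect-then-prioritize flow is sketched below: it flags channels whose count rate is a robust-z outlier among nominally identical channels and ranks them by severity. SIDOP's real modules are learned models, and the channel count and rates here are invented.

    ```python
    import numpy as np

    def detect_channel_faults(count_rates):
        """Robust z-score of each channel's count rate relative to its
        peers; large |z| suggests a dead or noisy channel."""
        rates = np.asarray(count_rates, float)
        med = np.median(rates)
        mad = np.median(np.abs(rates - med)) + 1e-12
        return 0.6745 * (rates - med) / mad

    def prioritize(z, top=5):
        """Rank channels by fault severity, mimicking the idea of a
        prioritized faulty-channel list for the operator."""
        order = np.argsort(-np.abs(z))
        return [(int(ch), float(z[ch])) for ch in order[:top]]

    rng = np.random.default_rng(1)
    rates = rng.normal(1000.0, 30.0, size=4608)   # channel count assumed
    rates[[42, 1337]] = [80.0, 2500.0]            # one dead-ish, one noisy
    print(prioritize(detect_channel_faults(rates), top=3))
    ```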

  1. Office of Student Financial Aid Quality Improvement Program: Design and Implementation Plan.

    ERIC Educational Resources Information Center

    Advanced Technology, Inc., Reston, VA.

    The purpose and direction of the Office of Student Financial Aid (OSFA) quality improvement program are described. The background and context for the Pell Grant quality control (QC) design study and the meaning of QC are reviewed. The general approach to quality improvement consists of the following elements: a strategic approach that enables OSFA…

  2. FASTQ quality control dashboard

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2016-07-25

    FQCDB builds on the existing open-source software FastQC, implementing a modern web interface across parsed FastQC output. In addition, FQCDB is extensible as a web service to include additional plots of type line, boxplot, or heatmap across data formatted according to its guidelines. The interface is also configurable via a more readable JSON format, enabling customization by non-web programmers.

  3. Many roads may lead to Rome: Selected features of quality control within environmental assessment systems in the US, NL, CA, and UK

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Günther, Markus, E-mail: markus.guenther@tu-berlin.de; Geißler, Gesa; Köppel, Johann

    As there is no one-and-only concept on how to precisely define and establish quality control (QC) or quality assurance (QA) in the making of environmental assessments (EA), this paper presents selected features of international approaches that address quality in EA systems in the USA, the Netherlands, Canada, and the United Kingdom. Based on explanative case studies, we highlight the embedding of specific quality control features within the EA systems, the objectives and processes, and relevant transparency challenges. Such features of QC/QA approaches can be considered in cases where substantial quality control and assurance efforts are still missing. Yet further research needs to be conducted on the efficacy of these approaches, which remains beyond the scope of this study. - Highlights: • We present four tools for quality control and assurance from different EA systems. • Approaches vary in institutional setting, objectives, procedures, and transparency. • Highlighted features might provide guidance in cases where QC/QA is still lacking.

  4. Quality Circles: An Innovative Program to Improve Military Hospitals

    DTIC Science & Technology

    1982-08-01

    quality control. However, Dr. Kaoru Ishikawa is credited with starting the first "Quality Control Circles" and registering them with the Japanese Union of...McGregor and Abraham Maslow into a unique style of management. In 1962 Dr. Ishikawa, a professor at Tokyo University, developed the QC concept based on...RECOMMENDATIONS Conclusions The QC concept has come a long way since Dr. Ishikawa gave it birth in 1962. It has left an enviable record of success along its

  5. Follow-Up of External Quality Controls for PCR-Based Diagnosis of Whooping Cough in a Hospital Laboratory Network (Renacoq) and in Other Hospital and Private Laboratories in France.

    PubMed

    Guillot, Sophie; Guiso, Nicole

    2016-08-01

    The French National Reference Centre (NRC) for Whooping Cough carried out an external quality control (QC) analysis in 2010 for the PCR diagnosis of whooping cough. The main objective of the study was to assess the impact of this QC in the participating laboratories through a repeat analysis in 2012. Copyright © 2016, American Society for Microbiology. All Rights Reserved.

  6. The Quality Control Algorithms Used in the Process of Creating the NASA Kennedy Space Center Lightning Protection System Towers Meteorological Database

    NASA Technical Reports Server (NTRS)

    Orcutt, John M.; Brenton, James C.

    2016-01-01

    The methodology and the results of the quality control (QC) process of the meteorological data from the Lightning Protection System (LPS) towers located at Kennedy Space Center (KSC) launch complex 39B (LC-39B) are documented in this paper. Meteorological data are used to design a launch vehicle, determine operational constraints, and apply defined constraints on day-of-launch (DOL). To accomplish these tasks properly, a climatological database of meteorological records is needed that represents the climate the vehicle will encounter. Numerous meteorological measurement towers exist at KSC; however, the engineering tasks need measurements at specific heights, some of which can only be provided by a few towers. Other than the LPS towers, Tower 313 is the only tower that provides observations up to 150 m. This tower is located approximately 3.5 km from LC-39B. In addition, data need to be QC'ed to remove erroneous reports that could pollute the results of an engineering analysis, mislead the development of operational constraints, or provide a false image of the atmosphere at the tower's location.
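
    Two generic checks of the kind such tower QC typically applies, a gross-limit (range) check and a temporal-consistency (step) check, can be sketched as follows. The limits and data are invented; the paper should be consulted for the actual algorithms and thresholds.

    ```python
    import numpy as np

    def range_check(values, lo, hi):
        """Gross-limit check: flag physically implausible readings."""
        v = np.asarray(values, float)
        return (v < lo) | (v > hi)

    def step_check(values, max_step):
        """Temporal-consistency check: flag jumps between consecutive
        reports that exceed max_step."""
        v = np.asarray(values, float)
        bad = np.zeros(v.shape, dtype=bool)
        bad[1:] = np.abs(np.diff(v)) > max_step
        return bad

    temps_c = [24.1, 24.3, 24.2, 38.9, 24.4]        # spike at index 3
    flags = range_check(temps_c, -10.0, 45.0) | step_check(temps_c, 5.0)
    print(np.flatnonzero(flags))                    # indices 3 and 4 flagged
    ```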

  7. Improvement of the customer satisfaction through Quality Assurance Matrix and QC-Story methods: A case study from automotive industry

    NASA Astrophysics Data System (ADS)

    Sicoe, G. M.; Belu, N.; Rachieru, N.; Nicolae, E. V.

    2017-10-01

    Presently, the automotive industry tends to adapt continuously to change and to reflect market trends in new products, which leads to customer satisfaction. Many quality techniques have been adopted in this field for the continuous improvement of product and process quality, and advantages have been gained. The present paper focuses on the possibilities offered by the use of the Quality Assurance Matrix (QAM) and Quality Control Story (QC Story) to provide the largest protection against nonconformities in the production process, through a case study in the automotive industry. There is a direct relationship from the QAM to a QC Story analysis: the failures identified using the QAM are treated with the QC Story methodology. Using these methods will help decrease PPM values and increase quality performance and customer satisfaction.

  8. Examination of China’s performance and thematic evolution in quantum cryptography research using quantitative and computational techniques

    PubMed Central

    2018-01-01

    This study performed two phases of analysis to shed light on the performance and thematic evolution of China’s quantum cryptography (QC) research. First, large-scale research publication metadata derived from QC research published from 2001–2017 was used to examine the research performance of China relative to that of global peers using established quantitative and qualitative measures. Second, this study identified the thematic evolution of China’s QC research using co-word cluster network analysis, a computational science mapping technique. The results from the first phase indicate that over the past 17 years, China’s performance has evolved dramatically, placing it in a leading position. Among the most significant findings is the exponential rate at which all of China’s performance indicators (i.e., Publication Frequency, citation score, H-index) are growing. China’s H-index (a normalized indicator) has surpassed all other countries’ over the last several years. The second phase of analysis shows how China’s main research focus has shifted among several QC themes, including quantum-key-distribution, photon-optical communication, network protocols, and quantum entanglement with an emphasis on applied research. Several themes were observed across time periods (e.g., photons, quantum-key-distribution, secret-messages, quantum-optics, quantum-signatures); some themes disappeared over time (e.g., computer-networks, attack-strategies, bell-state, polarization-state), while others emerged more recently (e.g., quantum-entanglement, decoy-state, unitary-operation). Findings from the first phase of analysis provide empirical evidence that China has emerged as the global driving force in QC. Considering China is the premier driving force in global QC research, findings from the second phase of analysis provide an understanding of China’s QC research themes, which can provide clarity into how QC technologies might take shape. QC and science and technology policy researchers can also use these findings to trace previous research directions and plan future lines of research. PMID:29385151

  9. Examination of China's performance and thematic evolution in quantum cryptography research using quantitative and computational techniques.

    PubMed

    Olijnyk, Nicholas V

    2018-01-01

    This study performed two phases of analysis to shed light on the performance and thematic evolution of China's quantum cryptography (QC) research. First, large-scale research publication metadata derived from QC research published from 2001-2017 was used to examine the research performance of China relative to that of global peers using established quantitative and qualitative measures. Second, this study identified the thematic evolution of China's QC research using co-word cluster network analysis, a computational science mapping technique. The results from the first phase indicate that over the past 17 years, China's performance has evolved dramatically, placing it in a leading position. Among the most significant findings is the exponential rate at which all of China's performance indicators (i.e., Publication Frequency, citation score, H-index) are growing. China's H-index (a normalized indicator) has surpassed all other countries' over the last several years. The second phase of analysis shows how China's main research focus has shifted among several QC themes, including quantum-key-distribution, photon-optical communication, network protocols, and quantum entanglement with an emphasis on applied research. Several themes were observed across time periods (e.g., photons, quantum-key-distribution, secret-messages, quantum-optics, quantum-signatures); some themes disappeared over time (e.g., computer-networks, attack-strategies, bell-state, polarization-state), while others emerged more recently (e.g., quantum-entanglement, decoy-state, unitary-operation). Findings from the first phase of analysis provide empirical evidence that China has emerged as the global driving force in QC. Considering China is the premier driving force in global QC research, findings from the second phase of analysis provide an understanding of China's QC research themes, which can provide clarity into how QC technologies might take shape. QC and science and technology policy researchers can also use these findings to trace previous research directions and plan future lines of research.
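
    Of the indicators tracked in both versions of this study, the H-index has a compact algorithmic definition: the largest h such that at least h papers have at least h citations each. A minimal reference implementation:

    ```python
    def h_index(citations):
        """H-index of a citation list: largest h with at least h papers
        cited at least h times each."""
        cites = sorted(citations, reverse=True)
        h = 0
        for i, c in enumerate(cites, start=1):
            if c >= i:
                h = i
            else:
                break
        return h

    print(h_index([10, 8, 5, 4, 3]))   # -> 4
    print(h_index([25, 8, 5, 3, 3]))   # -> 3
    ```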

  10. Employing quality control and feedback to the EQ-5D-5L valuation protocol to improve the quality of data collection.

    PubMed

    Purba, Fredrick Dermawan; Hunfeld, Joke A M; Iskandarsyah, Aulia; Fitriana, Titi Sahidah; Sadarjoen, Sawitri S; Passchier, Jan; Busschbach, Jan J V

    2017-05-01

    In valuing health states using generic questionnaires such as EQ-5D, there are unrevealed issues with the quality of the data collection. The aims were to describe the problems encountered during valuation and to evaluate a quality control report and subsequent retraining of interviewers in improving this valuation. Data from the first 266 respondents in an EQ-5D-5L valuation study were used. Interviewers were trained and answered questions regarding problems during these initial interviews. Thematic analysis was used, and individual feedback was provided. After completion of 98 interviews, a first quantitative quality control (QC) report was generated, followed by a 1-day retraining program. Subsequently individual feedback was also given on the basis of follow-up QCs. The Wilcoxon signed-rank test was used to assess improvements based on 7 indicators of quality as identified in the first QC and the QC conducted after a further 168 interviews. Interviewers encountered problems in recruiting respondents. Solutions provided were: optimization of the time of interview, the use of broader networks and the use of different scripts to explain the project's goals to respondents. For problems in interviewing process, solutions applied were: developing the technical and personal skills of the interviewers and stimulating the respondents' thought processes. There were also technical problems related to hardware, software and internet connections. There was an improvement in all 7 indicators of quality after the second QC. Training before and during a study, and individual feedback on the basis of a quantitative QC, can increase the validity of values obtained from generic questionnaires.
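
    The improvement test named above can be reproduced directly with SciPy's paired Wilcoxon signed-rank test. The indicator scores below are invented placeholders standing in for the paper's 7 quality indicators.

    ```python
    from scipy.stats import wilcoxon

    # Paired scores for 7 quality indicators: first QC (98 interviews)
    # vs. second QC (after retraining, 168 further interviews).
    before = [0.62, 0.55, 0.71, 0.48, 0.66, 0.59, 0.70]
    after  = [0.78, 0.74, 0.80, 0.69, 0.75, 0.77, 0.82]

    stat, p = wilcoxon(before, after)    # paired, nonparametric
    print(f"W={stat:.1f}, p={p:.3f}")    # small p suggests real improvement
    ```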

  11. High-resolution audiometry: an automated method for hearing threshold acquisition with quality control.

    PubMed

    Bian, Lin

    2012-01-01

    In clinical practice, hearing thresholds are measured at only five to six frequencies at octave intervals. Thus, the audiometric configuration cannot closely reflect the actual status of the auditory structures. In addition, differential diagnosis requires quantitative comparison of behavioral thresholds with physiological measures, such as otoacoustic emissions (OAEs), which are usually measured in higher resolution. The purpose of this research was to develop a method to improve the frequency resolution of the audiogram. A repeated-measures design was used in the study to evaluate the reliability of the threshold measurements. A total of 16 participants with clinically normal hearing or mild hearing loss were recruited from a population of university students. No intervention was involved in the study. A custom-developed system and software were used for threshold acquisition with quality control (QC). With real-ear calibration and monitoring of test signals, the system provided an accurate and individualized measure of hearing thresholds that were determined by an analysis based on signal detection theory (SDT). The reliability of the threshold measure was assessed by correlation and differences between the repeated measures. The audiometric configurations were diverse and unique to each individual ear. The accuracy, within-subject reliability, and between-test repeatability are relatively high. With QC, high-resolution audiograms can be reliably and accurately measured. Hearing thresholds measured as ear canal sound pressures with higher frequency resolution can provide more customized hearing-aid fitting. The test system may be integrated with other physiological measures, such as OAEs, into a comprehensive evaluative tool. American Academy of Audiology.

  12. U.S.-MEXICO BORDER PROGRAM ARIZONA BORDER STUDY--STANDARD OPERATING PROCEDURE FOR FORM QA AND QC CHECKS (UA-C-2.0)

    EPA Science Inventory

    The purpose of this SOP is to outline the process of field quality assurance and quality control checks. This procedure was followed to ensure consistent data retrieval during the Arizona NHEXAS project and the Border study. Keywords: custody; QA/QC; field checks.

    The U.S.-Mex...

  13. Evaluation of noninvasive cardiac output methods during exercise

    NASA Technical Reports Server (NTRS)

    Moore, Alan D.; Barrows, Linda H.; Rashid, Michael; Siconolfi, Steven F.

    1992-01-01

    Noninvasive techniques to estimate cardiac output (Qc) will be used during future space flight. This retrospective literature survey compared the Qc techniques of carbon dioxide rebreathing (CO2-R), CO2 single breath (CO2-S), Doppler (DOP), impedance (IM), and inert gas (IG: acetylene or nitrous oxide) to direct (DIR) assessments measured at rest and during exercise.

  14. [Highly quality-controlled radiation therapy].

    PubMed

    Shirato, Hiroki

    2005-04-01

    Advanced radiation therapy for intracranial disease has focused on set-up accuracy for the past 15 years. However, quality control of the prescribed dose is actually as important as the tumor set-up in radiation therapy. Because of the complexity of three-dimensional radiation treatment planning systems in recent years, highly quality-controlled prescription of the dose has now been reappraised as the mainstream approach to improving the treatment outcome of radiation therapy for intracranial disease. The Japanese Committee for Quality Control of Radiation Therapy has developed fundamental requirements such as a QC committee in each hospital, a medical physicist, dosimetrists (QC members), and an external audit.

  15. Network-Centric Quantum Communications

    NASA Astrophysics Data System (ADS)

    Hughes, Richard

    2014-03-01

    Single-photon quantum communications (QC) offers "future-proof" cryptographic security rooted in the laws of physics. Today's quantum-secured communications cannot be compromised by unanticipated future technological advances. But to date, QC has only existed in point-to-point instantiations that have limited ability to address the cyber security challenges of our increasingly networked world. In my talk I will describe a fundamentally new paradigm of network-centric quantum communications (NQC) that leverages the network to bring scalable, QC-based security to user groups that may have no direct user-to-user QC connectivity. With QC links only between each of N users and a trusted network node, NQC brings quantum security to N² user pairs, and to multi-user groups. I will describe a novel integrated photonics quantum smartcard ("QKarD") and its operation in a multi-node NQC test bed. The QKarDs are used to implement the quantum cryptographic protocols of quantum identification, quantum key distribution and quantum secret splitting. I will explain how these cryptographic primitives are used to provide key management for encryption, authentication, and non-repudiation for user-to-user communications. My talk will conclude with a description of a recent demonstration that QC can meet both the security and quality-of-service (latency) requirements for electric grid control commands and data. These requirements cannot be met simultaneously with present-day cryptography.
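
    The trusted-node idea, where each user shares key material only with the hub yet any pair of users ends up with a common secret, can be sketched with one-time-pad XOR relaying. This is a textbook construction shown for intuition only; the actual QKarD protocols are more involved.

    ```python
    import secrets

    def xor(a: bytes, b: bytes) -> bytes:
        return bytes(x ^ y for x, y in zip(a, b))

    k_alice_hub = secrets.token_bytes(32)   # stands in for Alice-hub QKD key
    k_bob_hub   = secrets.token_bytes(32)   # stands in for Bob-hub QKD key

    pair_key = secrets.token_bytes(32)      # hub draws the Alice-Bob key
    to_alice = xor(pair_key, k_alice_hub)   # one-time-pad under each link key
    to_bob   = xor(pair_key, k_bob_hub)

    # Each user strips their own link key to recover the shared pair key.
    assert xor(to_alice, k_alice_hub) == xor(to_bob, k_bob_hub) == pair_key
    ```

    With N such link keys at the hub, any of the roughly N² ordered user pairs can be served without a direct quantum channel between them, which is the scaling argument in the talk.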

  16. Assessing accuracy and precision for field and laboratory data: a perspective in ecosystem restoration

    USGS Publications Warehouse

    Stapanian, Martin A.; Lewis, Timothy E; Palmer, Craig J.; Middlebrook Amos, Molly

    2016-01-01

    Unlike most laboratory studies, rigorous quality assurance/quality control (QA/QC) procedures may be lacking in ecosystem restoration (“ecorestoration”) projects, despite legislative mandates in the United States. This is due, in part, to ecorestoration specialists making the false assumption that some types of data (e.g. discrete variables such as species identification and abundance classes) are not subject to evaluations of data quality. Moreover, emergent behavior manifested by complex, adapting, and nonlinear organizations responsible for monitoring the success of ecorestoration projects tends to unconsciously minimize disorder, QA/QC being an activity perceived as creating disorder. We discuss similarities and differences in assessing precision and accuracy for field and laboratory data. Although the concepts for assessing precision and accuracy of ecorestoration field data are conceptually the same as for laboratory data, the manner in which these data quality attributes are assessed is different. From a sample analysis perspective, a field crew is comparable to a laboratory instrument that requires regular “recalibration,” with results obtained by experts at the same plot treated as laboratory calibration standards. Unlike laboratory standards and reference materials, the “true” value for many field variables is commonly unknown. In the laboratory, specific QA/QC samples assess error for each aspect of the measurement process, whereas field revisits assess precision and accuracy of the entire data collection process following initial calibration. Rigorous QA/QC data in an ecorestoration project are essential for evaluating the success of a project, and they provide the only objective “legacy” of the dataset for potential legal challenges and future uses.
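
    For discrete field variables, the paper's analogy (a field crew as an instrument "recalibrated" against expert calls) suggests simple agreement statistics. The sketch below uses hypothetical species codes and is an illustration, not the authors' procedure.

    ```python
    def percent_agreement(crew, expert):
        """Accuracy proxy for discrete field variables: agreement of
        crew calls with an expert's calls on the same plot."""
        matches = sum(c == e for c, e in zip(crew, expert))
        return 100.0 * matches / len(expert)

    def revisit_precision(first, second):
        """Precision of the whole collection process: agreement between
        an initial visit and a QC revisit of the same plot."""
        matches = sum(a == b for a, b in zip(first, second))
        return 100.0 * matches / len(first)

    expert = ["ACSA", "QUAL", "QUAL", "FAGR", "ACSA"]   # hypothetical codes
    crew   = ["ACSA", "QUAL", "ACSA", "FAGR", "ACSA"]
    print(percent_agreement(crew, expert))                          # 80.0
    print(revisit_precision(crew, ["ACSA", "QUAL", "ACSA", "FAGR", "FAGR"]))
    ```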

  17. Development and use of quantitative competitive PCR assays for relative quantifying rumen anaerobic fungal populations in both in vitro and in vivo systems.

    PubMed

    Sekhavati, Mohammad H; Mesgaran, Mohsen Danesh; Nassiri, Mohammad R; Mohammadabadi, Tahereh; Rezaii, Farkhondeh; Fani Maleki, Adham

    2009-10-01

    This paper describes the use of a quantitative competitive polymerase chain reaction (QC-PCR) assay, using PCR primers targeting the rRNA locus of rumen fungi and a standard-control DNA, including its design and validation. In order to test the efficiency of this method for quantifying anaerobic rumen fungi, we evaluated it under in vitro conditions by comparison with an assay based on measuring cell wall chitin. The changes in fungal growth were studied when the fungi were grown in vitro on either untreated (US) or sodium hydroxide-treated (TS) wheat straw. Results showed that rumen fungal growth was significantly higher in treated samples compared with untreated during the 12 d incubation (P<0.05), and plotting the chitin assay's results against the competitive PCR's showed a high positive correlation (R² ≥ 0.87). The low mean values of the coefficients of variance in repeatability in the QC-PCR method against the chitin assay demonstrated the greater reliability of this new approach. Finally, the efficiency of this method was investigated under in vivo conditions. Samples of rumen fluid were collected from four fistulated Holstein steers which were fed four different diets (basal diet, high starch, high sucrose and starch plus sucrose) in rotation. The results of QC-PCR showed that addition of these non-structural carbohydrates to the basal diets caused a significant decrease in rumen anaerobic fungal biomass. The QC-PCR method appears to be reliable and can be used for rumen samples.

  18. Disk diffusion quality control guidelines for NVP-PDF 713: a novel peptide deformylase inhibitor.

    PubMed

    Anderegg, Tamara R; Jones, Ronald N

    2004-01-01

    NVP-PDF 713 is a peptide deformylase inhibitor that has emerged as a candidate for treating Gram-positive infections and selected Gram-negative species that commonly cause community-acquired respiratory tract infections. This report summarizes the results of a multi-center (seven participants) disk diffusion quality control (QC) investigation for NVP-PDF 713 using guidelines of the National Committee for Clinical Laboratory Standards and the standardized disk diffusion method. A total of 420 NVP-PDF 713 zone diameter values were generated for each QC organism. The proposed zone diameter ranges contained 97.6-99.8% of the reported participant results and were: Staphylococcus aureus ATCC 25923 (25-35 mm), Streptococcus pneumoniae ATCC 49619 (30-37 mm), and Haemophilus influenzae ATCC 49247 (24-32 mm). These QC criteria for the disk diffusion method should be applied during the NVP-PDF 713 clinical trials to maximize test accuracy.

  19. UK audit of glomerular filtration rate measurement from plasma sampling in 2013.

    PubMed

    Murray, Anthony W; Lawson, Richard S; Cade, Sarah C; Hall, David O; Kenny, Bob; O'Shaughnessy, Emma; Taylor, Jon; Towey, David; White, Duncan; Carson, Kathryn

    2014-11-01

    An audit was carried out of UK glomerular filtration rate (GFR) calculation. The results were compared with an identical 2001 audit. Participants used their routine method to calculate GFR for 20 data sets (four plasma samples) in millilitres per minute and also the GFR normalized for body surface area. Some unsound data sets were included to analyse the applied quality control (QC) methods. Variability between centres was assessed for each data set, compared with the national median and a reference value calculated using the method recommended in the British Nuclear Medicine Society guidelines. The influence of the number of samples on variability was studied. Supplementary data were requested on workload and methodology. The 59 returns showed widespread standardization. The applied early exponential clearance correction was the main contributor to the observed variability. These corrections were applied by 97% of centres (50% in 2001), with 80% using the recommended averaged Brochner-Mortensen correction. Approximately 75% applied the recommended Haycock body surface area formula for adults (78% for children). The effect of the number of samples used was not significant. There was wide variability in the applied QC techniques, especially in terms of the use of the volume of distribution. The widespread adoption of the guidelines has harmonized national GFR calculation compared with the previous audit. Further standardization could reduce variability still more. This audit has highlighted the need to address the national standardization of QC methods. Radionuclide techniques are confirmed as the preferred method for GFR measurement when an unequivocal result is required.
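
    The quantities audited above follow from a few published formulas: single-exponential slope-intercept clearance, the averaged adult Brochner-Mortensen correction, and Haycock body-surface-area normalization. The coefficients below are the commonly tabulated ones and the sample data are invented; consult the BNMS guideline before any clinical use.

    ```python
    import numpy as np

    def slope_intercept_gfr(times_min, counts_per_ml, dose_counts):
        """Uncorrected slope-intercept clearance from late plasma samples:
        fit C(t) = C0*exp(-k t), then CL = dose * k / C0 (ml/min)."""
        t = np.asarray(times_min, float)
        c = np.asarray(counts_per_ml, float)
        slope, ln_c0 = np.polyfit(t, np.log(c), 1)
        k, c0 = -slope, np.exp(ln_c0)
        return dose_counts * k / c0

    def brochner_mortensen(gfr):
        """Averaged adult correction for the missed fast exponential."""
        return 0.990778 * gfr - 0.001218 * gfr ** 2

    def haycock_bsa(weight_kg, height_cm):
        """Haycock body surface area (m^2)."""
        return 0.024265 * weight_kg ** 0.5378 * height_cm ** 0.3964

    # Four samples at 120-240 min (the audit's data sets had four samples).
    gfr = slope_intercept_gfr([120, 160, 200, 240],
                              [2100.0, 1750.0, 1460.0, 1220.0],
                              dose_counts=5.4e7)
    gfr_c = brochner_mortensen(gfr)
    bsa = haycock_bsa(70.0, 175.0)
    print(gfr, gfr_c, gfr_c * 1.73 / bsa)   # ml/min and ml/min/1.73 m^2
    ```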

  20. An effective and optimal quality control approach for green energy manufacturing using design of experiments framework and evolutionary algorithm

    NASA Astrophysics Data System (ADS)

    Saavedra, Juan Alejandro

    Quality Control (QC) and Quality Assurance (QA) strategies vary significantly across industries in the manufacturing sector depending on the product being built. Such strategies range from simple statistical analysis and process controls to the decision-making process of reworking, repairing, or scrapping defective product. This study proposes an optimal QC methodology that includes rework stations in the manufacturing process by identifying the number and location of these workstations. The factors considered to optimize these stations are cost, cycle time, reworkability and rework benefit. The goal is to minimize the cost and cycle time of the process, but increase the reworkability and rework benefit. The specific objectives of this study are: (1) to propose a cost estimation model that includes energy consumption, and (2) to propose an optimal QC methodology to identify the quantity and location of rework workstations. The cost estimation model includes energy consumption as part of the product direct cost. The cost estimation model developed allows the user to calculate product direct cost as the quality sigma level of the process changes. This provides a benefit because a complete cost estimation calculation does not need to be performed every time the process yield changes. This cost estimation model is then used for the QC strategy optimization process. In order to propose a methodology that provides an optimal QC strategy, the possible factors that affect QC were evaluated. A screening Design of Experiments (DOE) was performed on seven initial factors and identified three significant factors; it also showed that one response variable was not required for the optimization process. A full factorial DOE was then performed to verify the significant factors obtained previously. The QC strategy optimization is performed through a Genetic Algorithm (GA), which allows the evaluation of several candidate solutions in order to obtain feasible optimal solutions. The GA evaluates possible solutions based on cost, cycle time, reworkability and rework benefit. Because this is a multi-objective optimization problem, it provides several possible solutions. The solutions are presented as chromosomes that clearly state the number and location of the rework stations. The user analyzes these solutions in order to select one by deciding which of the four factors considered is most important depending on the product being manufactured or the company's objective. The major contribution of this study is to provide the user with a methodology to identify an effective and optimal QC strategy that incorporates the number and location of rework substations in order to minimize direct product cost and cycle time, and maximize reworkability and rework benefit.
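
    A much-reduced, hedged sketch of the GA encoding described above: each chromosome is a bit vector marking which process steps receive a rework station, and the four objectives are collapsed here into one weighted score (the study keeps them as a true multi-objective problem). All numbers are invented.

    ```python
    import random

    N_STEPS = 10
    COST    = [4.0, 3.0, 5.0, 2.0, 6.0, 3.5, 4.5, 2.5, 5.5, 3.0]
    BENEFIT = [2.0, 6.0, 1.5, 5.0, 2.5, 7.0, 3.0, 6.5, 2.0, 4.0]

    def fitness(chrom):
        """Weighted single-objective proxy for cost, cycle time, and
        rework benefit; 1-bits mark chosen rework-station steps."""
        cost = sum(c for c, bit in zip(COST, chrom) if bit)
        cycle = 0.7 * sum(chrom)               # each station adds time
        benefit = sum(b for b, bit in zip(BENEFIT, chrom) if bit)
        return benefit - 0.5 * cost - 0.3 * cycle

    def evolve(pop_size=40, gens=60, p_mut=0.05):
        pop = [[random.randint(0, 1) for _ in range(N_STEPS)]
               for _ in range(pop_size)]
        for _ in range(gens):
            pop.sort(key=fitness, reverse=True)
            elite = pop[: pop_size // 2]       # keep the better half
            children = []
            while len(children) < pop_size - len(elite):
                a, b = random.sample(elite, 2)
                cut = random.randrange(1, N_STEPS)   # one-point crossover
                child = a[:cut] + b[cut:]
                child = [bit ^ (random.random() < p_mut) for bit in child]
                children.append(child)
            pop = elite + children
        return max(pop, key=fitness)

    random.seed(3)
    best = evolve()
    print(best, fitness(best))   # bit vector of rework-station placements
    ```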

  1. Passive and active mid-infrared semiconductor nanostructures: Three-dimensional metamaterials and high wall plug efficiency quantum cascade lasers

    NASA Astrophysics Data System (ADS)

    Hoffman, Anthony J.

    Every instant, light and matter are interacting in ways that shape the world around us. This dissertation examines the interaction of mid-infrared light with stacks of thin semiconductor layers. The work is divided into two parts: mid-infrared metamaterials and high wall plug efficiency (WPE) Quantum Cascade (QC) lasers. The mid-infrared metamaterials represent an entirely new class of material and have great potential for enabling highly desired applications such as sub-diffraction imaging, confinement, and waveguiding. High WPE QC lasers greatly enhance the commercial feasibility of sensing, infrared countermeasures and free-space infrared communications. The first part of this dissertation describes the first three-dimensional, optical metamaterial. The all-semiconductor metamaterial is based on a strongly anisotropic dielectric function and exhibits negative refraction for a large bandwidth in the mid-infrared. The underlying theory of strongly anisotropic metamaterials is discussed, detailed characterization of several metamaterials is presented, and a macroscopic beam experiment is employed to demonstrate negative refraction. A detailed study of waveguides with strongly anisotropic cores is also presented and the low-order mode cutoff for such left-handed waveguides is observed. The second part of this dissertation discusses improvements in QC laser WPE through improved processing, packaging, and design. Devices using conventional QC design strategies processed as buried heterostructures operate with 5% WPE at room temperature in continuous wave mode, a significant improvement over previous generation devices. To further improve WPE, QC lasers based on ultra-strong coupling between the injector and upper-laser levels are designed and characterized. These devices operate with nearly 50% pulsed WPE, a true milestone for QC technology. A new type of QC laser design incorporating heterogeneous injector regions to reduce the voltage defect and thus improve WPE is also presented. Optimized devices exhibit efficiencies in excess of 30% at cryogenic temperatures. Finally, a new measurement technique to characterize lasers in continuous wave operation is described in detail. The technique is used to measure the instantaneous threshold, active core heating, device thermal resistance, and laser current efficiency as well as determine the cause of light power roll-over. This new characterization technique allows for improved understanding of QC lasers and further improvements in device performance.

  2. Structures of Human Golgi-resident Glutaminyl Cyclase and Its Complexes with Inhibitors Reveal a Large Loop Movement upon Inhibitor Binding*

    PubMed Central

    Huang, Kai-Fa; Liaw, Su-Sen; Huang, Wei-Lin; Chia, Cho-Yun; Lo, Yan-Chung; Chen, Yi-Ling; Wang, Andrew H.-J.

    2011-01-01

    Aberrant pyroglutamate formation at the N terminus of certain peptides and proteins, catalyzed by glutaminyl cyclases (QCs), is linked to some pathological conditions, such as Alzheimer disease. Recently, a glutaminyl cyclase (QC) inhibitor, PBD150, was shown to be able to reduce the deposition of pyroglutamate-modified amyloid-β peptides in brain of transgenic mouse models of Alzheimer disease, leading to a significant improvement of learning and memory in those transgenic animals. Here, we report the 1.05–1.40 Å resolution structures, solved by the sulfur single-wavelength anomalous dispersion phasing method, of the Golgi-luminal catalytic domain of the recently identified Golgi-resident QC (gQC) and its complex with PBD150. We also describe the high-resolution structures of secretory QC (sQC)-PBD150 complex and two other gQC-inhibitor complexes. gQC structure has a scaffold similar to that of sQC but with a relatively wider and negatively charged active site, suggesting a distinct substrate specificity from sQC. Upon binding to PBD150, a large loop movement in gQC allows the inhibitor to be tightly held in its active site primarily by hydrophobic interactions. Further comparisons of the inhibitor-bound structures revealed distinct interactions of the inhibitors with gQC and sQC, which are consistent with the results from our inhibitor assays reported here. Because gQC and sQC may play different biological roles in vivo, the different inhibitor binding modes allow the design of specific inhibitors toward gQC and sQC. PMID:21288892

  3. Structures of human Golgi-resident glutaminyl cyclase and its complexes with inhibitors reveal a large loop movement upon inhibitor binding.

    PubMed

    Huang, Kai-Fa; Liaw, Su-Sen; Huang, Wei-Lin; Chia, Cho-Yun; Lo, Yan-Chung; Chen, Yi-Ling; Wang, Andrew H-J

    2011-04-08

    Aberrant pyroglutamate formation at the N terminus of certain peptides and proteins, catalyzed by glutaminyl cyclases (QCs), is linked to some pathological conditions, such as Alzheimer disease. Recently, a glutaminyl cyclase (QC) inhibitor, PBD150, was shown to be able to reduce the deposition of pyroglutamate-modified amyloid-β peptides in brain of transgenic mouse models of Alzheimer disease, leading to a significant improvement of learning and memory in those transgenic animals. Here, we report the 1.05-1.40 Å resolution structures, solved by the sulfur single-wavelength anomalous dispersion phasing method, of the Golgi-luminal catalytic domain of the recently identified Golgi-resident QC (gQC) and its complex with PBD150. We also describe the high-resolution structures of secretory QC (sQC)-PBD150 complex and two other gQC-inhibitor complexes. gQC structure has a scaffold similar to that of sQC but with a relatively wider and negatively charged active site, suggesting a distinct substrate specificity from sQC. Upon binding to PBD150, a large loop movement in gQC allows the inhibitor to be tightly held in its active site primarily by hydrophobic interactions. Further comparisons of the inhibitor-bound structures revealed distinct interactions of the inhibitors with gQC and sQC, which are consistent with the results from our inhibitor assays reported here. Because gQC and sQC may play different biological roles in vivo, the different inhibitor binding modes allow the design of specific inhibitors toward gQC and sQC.

  4. Validation of a single-platform method for hematopoietic CD34+ stem cells enumeration according to accreditation procedure.

    PubMed

    Massin, Frédéric; Huili, Cai; Decot, Véronique; Stoltz, Jean-François; Bensoussan, Danièle; Latger-Cannard, Véronique

    2015-01-01

    Stem cells for autologous and allogeneic transplantation are obtained from several sources including bone marrow, peripheral blood or cord blood. Accurate enumeration of viable CD34+ hematopoietic stem cells (HSC) is routinely used in clinical settings, especially to monitor progenitor cell mobilization and apheresis. The number of viable CD34+ HSC has also been shown to be the most critical factor in haematopoietic engraftment. The International Society for Cellular Therapy actually recommends the use of a single-platform flow cytometry system using 7-AAD as a viability dye. In order to move routine analysis from a BD FACSCalibur instrument to a BD FACSCanto II, according to ISO 15189 standard guidelines, we define laboratory performance data of the BD Stem Cell Enumeration (SCE) kit on a CE-IVD system including a BD FACSCanto II flow cytometer and the BD FACSCanto Clinical Software. InterQC software, a real-time internet laboratory QC management system developed by Vitro and distributed by Becton Dickinson, was also tested to monitor daily QC data, to define the internal laboratory statistics and to compare them to external laboratories. Precision was evaluated with BD Stem Cell Control (high and low) results and the InterQC software. The latter drew Levey-Jennings curves and generated numerical statistical parameters allowing detection of potential changes in the system performances as well as interlaboratory comparisons. Repeatability, linearity and lower limits of detection were obtained with routine samples from different origins. Agreement between the BD FACSCanto II system and the BD FACSCalibur system was tested on fresh peripheral blood, freeze-thawed apheresis, fresh bone marrow and fresh cord blood samples. Instrument measurement and staining repeatability clearly evidenced acceptable variability on the different samples tested. Intra- and inter-laboratory CV in CD34+ cell absolute count are consistent and reproducible. Linearity analysis, established between 2 and 329 cells/μl, showed a linear relation between expected counts and measured counts (R²=0.97). Linear regression and Bland-Altman representations showed an excellent correlation on samples from different sources between the two systems and allowed the transfer of routine analysis from BD FACSCalibur to BD FACSCanto II. The BD SCE kit provides an accurate measure of CD34+ HSC, and can be used in daily routine to optimize the enumeration of hematopoietic CD34+ stem cells by flow cytometry. Moreover, the InterQC system seems to be a very useful tool for laboratory daily quality monitoring and thus for accreditation.
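
    The Levey-Jennings monitoring mentioned above reduces to mean and SD limits accumulated from daily QC runs. The sketch below derives 2SD/3SD limits and flags excursions in a Westgard-style screen; the values are invented and the lab's actual acceptance rules may differ.

    ```python
    import numpy as np

    def levey_jennings_limits(qc_values):
        """Mean and +/-2SD / +/-3SD limits from accumulated QC runs,
        the statistics behind a Levey-Jennings chart."""
        v = np.asarray(qc_values, float)
        m, s = v.mean(), v.std(ddof=1)
        return {"mean": m,
                "2sd": (m - 2 * s, m + 2 * s),
                "3sd": (m - 3 * s, m + 3 * s)}

    def flag_runs(qc_values, limits):
        """Flag runs outside 2SD (warning) or 3SD (rejection)."""
        lo2, hi2 = limits["2sd"]
        lo3, hi3 = limits["3sd"]
        out = []
        for i, v in enumerate(qc_values):
            if v < lo3 or v > hi3:
                out.append((i, v, "reject"))
            elif v < lo2 or v > hi2:
                out.append((i, v, "warn"))
        return out

    history = [48.2, 50.1, 49.5, 51.0, 50.4, 49.0, 50.8]  # CD34+ cells/uL
    limits = levey_jennings_limits(history)
    print(flag_runs(history + [55.9], limits))            # 55.9 -> reject
    ```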

  5. A situational analysis of breast cancer early detection services in Trinidad and Tobago.

    PubMed

    Badal, Kimberly; Rampersad, Fidel; Warner, Wayne A; Toriola, Adetunji T; Mohammed, Hamish; Scheffel, Harold-Alexis; Ali, Rehanna; Moosoodeen, Murrie; Konduru, Siva; Russel, Adaila; Haraksingh, Rajini

    2018-01-01

    A situational analysis of breast cancer (BC) early detection services was carried out to investigate whether Trinidad and Tobago (T&T) has the framework for a successful organized national screening program. An online survey was designed to assess the availability, accessibility, quality control and assurance (QC&A), and monitoring and evaluation (M&E) mechanisms for public and private BC early detection. A focus group with local radiologists (n = 3) was held to identify unaddressed challenges and make recommendations for improvement. Major public hospitals offer free detection services with wait times of 1-6 months for an appointment. Private institutions offer mammograms for a minimum of TTD $240 (USD $37) with same-day service. Both sectors report a lack of trained staff. Using 1.2 mammograms per 10,000 women aged ≥40 years as a benchmark for sufficiency, the public sector's rate of 0.19 mammograms per 10,000 women aged ≥40 years for screening and diagnosis is inadequate. Absent program M&E mechanisms, absent QC&A guidelines for machinery use, delays in receipt of pathology reports, and unreliable drug access are further unaddressed challenges. T&T must first strengthen its human and physical resources, implement M&E and QC&A measures, strengthen cancer care, and address other impediments to BC early detection before investing in nationally organized BC screening.

  6. Production of latex agglutination reagents for pneumococcal serotyping

    PubMed Central

    2013-01-01

    Background: The current 'gold standard' for serotyping pneumococci is the Quellung test. This technique is laborious and requires a certain level of training to perform correctly. Commercial pneumococcal latex agglutination serotyping reagents are available, but these are expensive. In-house production of latex agglutination reagents can be a cost-effective alternative to using commercially available reagents. This paper describes a method for the production and quality control (QC) of latex reagents, including problem-solving recommendations, for pneumococcal serotyping. Results: Here we describe a method for the production of latex agglutination reagents based on the passive adsorption of antibodies to latex particles. Sixty-five latex agglutination reagents were made using the PneuCarriage Project (PCP) method, of which 35 passed QC. The other 30 reagents failed QC due to auto-agglutination (n=2), no reactivity with target serotypes (n=8) or cross-reactivity with non-target serotypes (n=20). Dilution of antisera resulted in a further 27 reagents passing QC, and the remaining three reagents passed QC when prepared without the centrifugation and wash steps. Protein estimates indicated that latex reagents that failed QC when prepared using the PCP method passed when made with antiserum containing ≤500 μg/ml of protein. Sixty-one nasopharyngeal isolates were serotyped with our in-house latex agglutination reagents, and the results showed complete concordance with the Quellung reaction. Conclusions: The method described here to produce latex agglutination reagents allows simple and efficient serotyping of pneumococci and may be applicable to latex agglutination reagents for typing or identification of other microorganisms. We recommend diluting antisera or removing the centrifugation and wash steps for any latex reagents that fail QC. Our latex reagents are cost-effective, technically undemanding to prepare and remain stable for long periods, making them ideal for use in low-income countries. PMID:23379961

  7. Development of a Climatology of Vertically Complete Wind Profiles from Doppler Radar Wind Profiler Systems

    NASA Technical Reports Server (NTRS)

    Barbre, Robert E., Jr.

    2015-01-01

    This paper describes in detail the QC and splicing methodology for KSC's 50- and 915-MHz DRWP measurements, which generates an extensive archive of vertically complete profiles from 0.20-18.45 km. The concurrent period of record (POR) for the archives extends from April 2000 to December 2009. Marshall Space Flight Center Natural Environments (MSFC NE) applies separate but similar QC processes to the 50- and 915-MHz DRWP archives. DRWP literature and data examination provide the basis for developing and applying the automated and manual QC processes on both archives. Depending on the month, the QC'ed 50- and 915-MHz DRWP archives retain 52-65% and 16-30% of the possible data, respectively. The 50- and 915-MHz DRWP QC archives retain 84-91% and 85-95%, respectively, of all the available data, provided that data exist in the non-QC'ed archives. Next, MSFC NE applies an algorithm to splice concurrent measurements from both DRWP sources. Last, MSFC NE generates a composite profile from the (up to) five available spliced profiles to effectively characterize boundary-layer winds and to utilize all possible 915-MHz DRWP measurements at each timestamp. During a given month, roughly 23,000-32,000 complete profiles exist from 0.25-18.45 km in the composite-profile archive, and approximately 5,000-27,000 complete profiles exist in an archive utilizing an individual 915-MHz DRWP. One can extract a variety of profile combinations (pairs, triplets, etc.) from this sample for a given application. The sample of vertically complete DRWP wind measurements not only gives launch vehicle customers greater confidence in loads and trajectory assessments than balloon output does, but also provides flexibility to simulate different day-of-launch (DOL) situations across applicable altitudes. In addition to increasing sample size and providing more flexibility for DOL simulations in the vehicle design phase, the spliced DRWP database provides any upcoming launch vehicle program with the capability to utilize DRWP profiles on DOL to compute vehicle steering commands, provided the program applies the procedures that this report describes to new DRWP data on DOL. Decker et al. (2015) detail how SLS proposes to use DRWP data and splicing techniques on DOL. Although automation could enhance the current DOL 50-MHz DRWP QC process and could streamline any future DOL 915-MHz DRWP QC and splicing process, the DOL community would still require manual intervention to ensure that the vehicle only uses valid profiles. If a program desires high-spatial-resolution profiles, the algorithm could randomly add high-frequency components to the DRWP profiles. The spliced DRWP database provides considerable flexibility in how one performs DOL simulations, and the algorithms that this report provides will assist the aerospace and atmospheric communities interested in utilizing the DRWP. A conceptual sketch of the splicing step follows.
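    Conceptually, splicing combines the 915-MHz profiler's boundary-layer coverage with the 50-MHz profiler's upper-altitude coverage. A deliberately naive sketch: the operational algorithm blends concurrent measurements in the overlap region, and the 2.5 km cutover below is a hypothetical parameter, not the report's value.

```python
import numpy as np

def splice_profiles(alt_lo, wind_lo, alt_hi, wind_hi, cutover_km=2.5):
    """Keep 915-MHz (boundary-layer) samples below the cutover and
    50-MHz samples at/above it, then sort by altitude."""
    alt_lo, wind_lo = np.asarray(alt_lo, float), np.asarray(wind_lo, float)
    alt_hi, wind_hi = np.asarray(alt_hi, float), np.asarray(wind_hi, float)
    keep_lo, keep_hi = alt_lo < cutover_km, alt_hi >= cutover_km
    alt = np.concatenate([alt_lo[keep_lo], alt_hi[keep_hi]])
    wind = np.concatenate([wind_lo[keep_lo], wind_hi[keep_hi]])
    order = np.argsort(alt)
    return alt[order], wind[order]

# Synthetic grids: 915-MHz covers 0.25-4.75 km, 50-MHz covers 2.0-18.0 km
alt, wind = splice_profiles(np.arange(0.25, 5.0, 0.25),
                            np.random.rand(19) * 10,
                            np.arange(2.0, 18.5, 0.5),
                            np.random.rand(33) * 30)
```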

  8. Implementation and application of moving average as continuous analytical quality control instrument demonstrated for 24 routine chemistry assays.

    PubMed

    Rossum, Huub H van; Kemperman, Hans

    2017-07-26

    General application of a moving average (MA) as continuous analytical quality control (QC) for routine chemistry assays has failed due to the lack of a simple method for optimizing MA procedures. A new method was applied to optimize MAs for routine chemistry and was evaluated in daily practice as a continuous analytical QC instrument. MA procedures were optimized using an MA bias detection simulation procedure, with optimization graphically supported by bias detection curves. Next, all optimal MA procedures that contributed to quality assurance were run for 100 consecutive days, and MA alarms generated during working hours were investigated. Optimized MA procedures were applied to 24 chemistry assays. During this evaluation, 303,871 MA values and 76 MA alarms were generated. Of all alarms, 54 (71%) were generated during office hours. Of these, 41 were further investigated; they were caused by ion-selective electrode (ISE) failure (1), calibration failure not detected by QC due to improper QC settings (1), possible bias (a significant difference from the other analyzer) (10), non-human materials analyzed (2), extreme result(s) of a single patient (2), pre-analytical error (1), no cause identified (20), and no conclusion possible (4). MA was implemented in daily practice as a continuous QC instrument for 24 routine chemistry assays. In our setup, a manageable number of MA alarms was generated, and those requiring follow-up proved valuable. Several features in MA management software would further simplify the management of MA alarms and the use of MA procedures.
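    A minimal sketch of the moving-average alarm idea follows. The window size, target value, and alarm limit here are hypothetical placeholders; the paper optimizes these per assay with bias-detection simulations.

```python
from collections import deque

def ma_alarm_stream(results, window=20, target=100.0, limit=3.0):
    """Yield (index, moving_average, alarm) for a stream of patient
    results: alarm is True when the moving average of the last
    `window` results drifts more than `limit` units from `target`."""
    buf = deque(maxlen=window)
    for i, x in enumerate(results):
        buf.append(x)
        ma = sum(buf) / len(buf)
        yield i, ma, len(buf) == window and abs(ma - target) > limit

# Example: a slow positive drift eventually trips the alarm.
stream = [100.0 + 0.2 * i for i in range(60)]
alarms = [i for i, ma, alarm in ma_alarm_stream(stream) if alarm]
print(alarms[:3])  # first alarmed indices
```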

  9. MO-AB-210-03: Workshop [Advancements in high intensity focused ultrasound]

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lu, Z.

    The goal of this ultrasound hands-on workshop is to demonstrate advancements in high intensity focused ultrasound (HIFU) and to demonstrate quality control (QC) testing in diagnostic ultrasound. HIFU is a therapeutic modality that uses ultrasound waves as carriers of energy. HIFU is used to focus a beam of ultrasound energy into a small volume at specific target locations within the body. The focused beam causes localized high temperatures and produces well-defined regions of necrosis. This completely non-invasive technology has great potential for tumor ablation and targeted drug delivery. At the workshop, attendees will see configurations, applications, and hands-on demonstrations with on-site instructors at separate stations. The involvement of medical physicists in diagnostic ultrasound imaging service is increasing due to QC and accreditation requirements. At the workshop, an array of ultrasound testing phantoms and ultrasound scanners will be provided for attendees to learn diagnostic ultrasound QC in a hands-on environment with live demonstrations of the techniques. Target audience: medical physicists and other medical professionals in diagnostic imaging and radiation oncology with an interest in high-intensity focused ultrasound and in diagnostic ultrasound QC. Learning objectives: (1) learn ultrasound physics and safety for HIFU applications through live demonstrations; (2) get an overview of the state of the art in HIFU technologies and equipment; (3) gain familiarity with common elements of a quality control program for diagnostic ultrasound imaging; (4) identify QC tools available for testing diagnostic ultrasound systems and learn how to use them. Supporting vendors for the HIFU and diagnostic ultrasound QC hands-on workshop: Philips Healthcare; Alpinion Medical Systems; Verasonics, Inc.; Zonare Medical Systems, Inc.; Computerized Imaging Reference Systems (CIRS), Inc.; GAMMEX, Inc.; Cablon Medical BV. Funding: Steffen Sammet: NIH/NCI grant 5R25CA132822, NIH/NINDS grant 5R25NS080949 and Philips Healthcare research grant C32.

  10. MO-AB-210-02: Ultrasound Imaging and Therapy-Hands On Workshop

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sammet, S.

    Abstract identical to that of MO-AB-210-03 (see above).

  11. MO-AB-210-01: Ultrasound Imaging and Therapy-Hands On Workshop

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lu, Z.

    Abstract identical to that of MO-AB-210-03 (see above).

  12. Improving GEOS-5 seven day forecast skill by assimilation of quality controlled AIRS temperature profiles

    NASA Astrophysics Data System (ADS)

    Susskind, J.; Rosenberg, R. I.

    2016-12-01

    The GEOS-5 Data Assimilation System (DAS) generates a global analysis every six hours by combining the previous six-hour forecast for that time period with contemporaneous observations. These observations include in-situ observations as well as those taken by satellite-borne instruments, such as AIRS/AMSU on EOS Aqua and CrIS/ATMS on S-NPP. Operational data assimilation methodology assimilates observed channel radiances Ri for IR sounding instruments such as AIRS and CrIS, but only for those channels i in a given scene whose radiances are thought to be unaffected by clouds. A limitation of this approach is that radiances in most tropospheric sounding channels are affected by clouds under partial cloud cover, which occurs most of the time. The AIRS Science Team Version-6 retrieval algorithm generates cloud-cleared radiances (CCRs) for each channel in a given scene, which represent the radiances AIRS would have observed if the scene were cloud free, and then uses them to determine quality-controlled (QC'd) temperature profiles T(p) under all cloud conditions. There are potential advantages to assimilating either AIRS QC'd CCRs or QC'd T(p) instead of Ri, in that the spatial coverage of observations is greater under partial cloud cover. We tested these two alternative data assimilation approaches by running three parallel data assimilation experiments over different time periods using GEOS-5. Experiment 1 assimilated all observations as done operationally, Experiment 2 assimilated QC'd values of AIRS CCRs in place of AIRS radiances, and Experiment 3 assimilated QC'd values of T(p) in place of observed radiances. Assimilation of QC'd AIRS T(p) resulted in significantly improved seven-day forecast skill compared to assimilation of CCRs or of observed radiances, especially in the Southern Hemisphere extra-tropics.

  13. Photochemical Energy Storage and Electrochemically Triggered Energy Release in the Norbornadiene-Quadricyclane System: UV Photochemistry and IR Spectroelectrochemistry in a Combined Experiment.

    PubMed

    Brummel, Olaf; Waidhas, Fabian; Bauer, Udo; Wu, Yanlin; Bochmann, Sebastian; Steinrück, Hans-Peter; Papp, Christian; Bachmann, Julien; Libuda, Jörg

    2017-07-06

    The two valence isomers norbornadiene (NBD) and quadricyclane (QC) enable solar energy storage in a single-molecule system. We present a new photoelectrochemical infrared reflection absorption spectroscopy (PEC-IRRAS) experiment, which allows monitoring of the complete energy storage and release cycle by in situ vibrational spectroscopy. We investigated both processes: the photochemical conversion from NBD to QC using the photosensitizer 4,4'-bis(dimethylamino)benzophenone (Michler's ketone, MK), and the electrochemically triggered cycloreversion from QC to NBD. Photochemical conversion was achieved with characteristic conversion times on the order of 500 ms. All experiments were performed under full potential control in a thin-layer configuration with a Pt(111) working electrode. The vibrational spectra of NBD, QC, and MK were analyzed in the fingerprint region, permitting quantitative analysis of the spectroscopic data. We determined selectivities for both the photochemical conversion and the electrochemical cycloreversion, and identified the critical steps that limit the reversibility of the storage cycle.

  14. Phase 2 Site Investigations Report. Volume 3 of 3: Appendices

    DTIC Science & Technology

    1994-09-01

    Phase II Site Investigations Report, Volume III of III: Appendices. Fort Devens Sudbury Training Annex, Massachusetts, September 1994. Contract No... laboratory quality control (QC) samples collected during field investigations at the Sudbury Training Annex of Fort Devens, Massachusetts. The QC... returned to its original condition. E & E performed this procedure for each monitoring well tested during the 1993 slug testing activities at Fort Devens.

  15. Challenges in Development of Sperm Repositories for Biomedical Fishes: Quality Control in Small-Bodied Species.

    PubMed

    Torres, Leticia; Liu, Yue; Guitreau, Amy; Yang, Huiping; Tiersch, Terrence R

    2017-12-01

    Quality control (QC) is essential for reproducible and efficient functioning of germplasm repositories. However, many biomedical fish models present significant QC challenges due to small body sizes (<5 cm) and minuscule sperm volumes (<5 μL). Using minimal sperm volumes, we evaluated in zebrafish common QC endpoints as surrogates for fertilization success along sequential steps of cryopreservation. First, concentrations of calibration bead suspensions were evaluated with a Makler® counting chamber using different sample volumes and mixing methods. For sperm analysis, samples were initially diluted at a 1:30 ratio with Hanks' balanced salt solution (HBSS). Motility was evaluated using different ratios of sperm and activation medium, and membrane integrity was analyzed with flow cytometry at different concentrations. Concentration and sperm motility could be confidently estimated using volumes as small as 1 μL, whereas membrane integrity required a minimum of 2 μL (at 1 × 10⁶ cells/mL). Thus, <5 μL of sperm suspension (after dilution to 30-150 μL with HBSS) was required to evaluate sperm quality using three endpoints. Sperm quality assessment using a combination of complementary endpoints enhances QC efforts during cryopreservation, increasing reliability and reproducibility and reducing waste of time and resources.

  16. Droplet digital PCR-based EGFR mutation detection with an internal quality control index to determine the quality of DNA.

    PubMed

    Kim, Sung-Su; Choi, Hyun-Jeung; Kim, Jin Ju; Kim, M Sun; Lee, In-Seon; Byun, Bohyun; Jia, Lina; Oh, Myung Ryurl; Moon, Youngho; Park, Sarah; Choi, Joon-Seok; Chae, Seoung Wan; Nam, Byung-Ho; Kim, Jin-Soo; Kim, Jihun; Min, Byung Soh; Lee, Jae Seok; Won, Jae-Kyung; Cho, Soo Youn; Choi, Yoon-La; Shin, Young Kee

    2018-01-11

    In clinical translational research and molecular in vitro diagnostics, a major challenge in the detection of genetic mutations is overcoming artefactual results caused by the low quality of formalin-fixed, paraffin-embedded tissue (FFPET)-derived DNA (FFPET-DNA). Here, we propose the use of an 'internal quality control (iQC) index' as a criterion for judging the minimum quality of DNA for PCR-based analyses. In a pre-clinical study comparing the results from a droplet digital PCR-based EGFR mutation test (ddEGFR test) and a qPCR-based EGFR mutation test (cobas EGFR test), an iQC index ≥ 0.5 (iQC copies ≥ 500, using 3.3 ng of FFPET-DNA [1,000 genome equivalents]) was established, indicating that more than half of the input DNA was amplifiable. Using this criterion, we conducted a retrospective comparative clinical study of the ddEGFR and cobas EGFR tests for the detection of EGFR mutations in non-small cell lung cancer (NSCLC) FFPET-DNA samples. Compared with the cobas EGFR test, the ddEGFR test exhibited superior analytical performance and equivalent or higher clinical performance. Furthermore, the iQC index is a reliable indicator of FFPET-DNA quality and could be used to prevent incorrect diagnoses arising from low-quality samples.
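    For illustration, the iQC index can be read as amplifiable copies divided by input genome equivalents; this is an interpretation of the figures quoted above, not a formula stated in the record.

```python
def iqc_index(iqc_copies, genome_equivalents=1000.0):
    """iQC index: amplifiable copies detected by the internal QC
    assay per input genome equivalent (3.3 ng FFPET-DNA ~ 1,000
    genome equivalents in the study's setup)."""
    return iqc_copies / genome_equivalents

# >= 0.5 means more than half the input DNA was amplifiable.
print(iqc_index(500))   # 0.5, the pass threshold
print(iqc_index(230))   # 0.23, would fail the criterion
```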

  17. Individualized Quality Control Plan (IQCP): Is It Value-Added for Clinical Microbiology?

    PubMed Central

    Miller, Melissa B.; Hindler, Janet

    2015-01-01

    The Centers for Medicare & Medicaid Services (CMS) recently published their Individualized Quality Control Plan (IQCP [https://www.cms.gov/regulations-and-guidance/legislation/CLIA/Individualized_Quality_Control_Plan_IQCP.html]), which will be the only option for quality control (QC) starting in January 2016 if laboratories choose not to perform Clinical Laboratory Improvement Act (CLIA) [U.S. Statutes at Large 81(1967):533] default QC. Laboratories will no longer be able to use “equivalent QC” (EQC) or the Clinical and Laboratory Standards Institute (CLSI) standards alone for quality control of their microbiology systems. The implementation of IQCP in clinical microbiology laboratories will most certainly be an added burden, the benefits of which are currently unknown. PMID:26447112

  18. Direct infusion mass spectrometry metabolomics dataset: a benchmark for data processing and quality control

    PubMed Central

    Kirwan, Jennifer A; Weber, Ralf J M; Broadhurst, David I; Viant, Mark R

    2014-01-01

    Direct-infusion mass spectrometry (DIMS) metabolomics is an important approach for characterising molecular responses of organisms to disease, drugs and the environment. Increasingly large-scale metabolomics studies are being conducted, necessitating improvements in both bioanalytical and computational workflows to maintain data quality. This dataset represents a systematic evaluation of the reproducibility of a multi-batch DIMS metabolomics study of cardiac tissue extracts. It comprises of twenty biological samples (cow vs. sheep) that were analysed repeatedly, in 8 batches across 7 days, together with a concurrent set of quality control (QC) samples. Data are presented from each step of the workflow and are available in MetaboLights. The strength of the dataset is that intra- and inter-batch variation can be corrected using QC spectra and the quality of this correction assessed independently using the repeatedly-measured biological samples. Originally designed to test the efficacy of a batch-correction algorithm, it will enable others to evaluate novel data processing algorithms. Furthermore, this dataset serves as a benchmark for DIMS metabolomics, derived using best-practice workflows and rigorous quality assessment. PMID:25977770
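    As a sketch of how QC samples enable batch correction, the following scales each batch so its QC-sample median matches the global QC median, feature-wise. This is a deliberately simple stand-in, not the published algorithm, and it assumes every batch contains QC samples.

```python
import numpy as np

def qc_batch_correct(X, batch, is_qc):
    """Feature-wise, per-batch scaling so each batch's QC-sample
    median matches the global QC median (samples in rows,
    features/peaks in columns)."""
    X = np.asarray(X, float)
    batch = np.asarray(batch)
    is_qc = np.asarray(is_qc, bool)
    out = X.copy()
    global_med = np.median(X[is_qc], axis=0)
    for b in np.unique(batch):
        rows = batch == b
        scale = global_med / np.median(X[rows & is_qc], axis=0)
        out[rows] *= scale
    return out
```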

  19. Benefits of a Pharmacology Antimalarial Reference Standard and Proficiency Testing Program Provided by the Worldwide Antimalarial Resistance Network (WWARN)

    PubMed Central

    Lourens, Chris; Lindegardh, Niklas; Barnes, Karen I.; Guerin, Philippe J.; Sibley, Carol H.; White, Nicholas J.

    2014-01-01

    Comprehensive assessment of antimalarial drug resistance should include measurements of antimalarial blood or plasma concentrations in clinical trials and in individual assessments of treatment failure, so that true resistance can be differentiated from inadequate drug exposure. Pharmacometric modeling is necessary to assess pharmacokinetic-pharmacodynamic relationships in different populations to optimize dosing. To accomplish both effectively, and to allow comparison of data from different laboratories, it is essential that drug concentration measurements are accurate. Proficiency testing (PT) of laboratory procedures is necessary for verification of assay results. Within the Worldwide Antimalarial Resistance Network (WWARN), the goal of the quality assurance/quality control (QA/QC) program is to facilitate and sustain high-quality antimalarial assays. The QA/QC program consists of an international PT program for pharmacology laboratories and a reference material (RM) program for the provision of antimalarial drug standards, metabolites, and internal standards for laboratory use. The RM program currently distributes accurately weighed quantities of antimalarial drug standards, metabolites, and internal standards to 44 pharmacology, in vitro, and drug-quality testing laboratories. The pharmacology PT program has sent samples to eight laboratories in four rounds of testing. WWARN technical experts have provided advice for correcting identified problems, improving the performance of subsequent analyses and ultimately the quality of the data. Many participants have demonstrated substantial improvements over subsequent rounds of PT. The WWARN QA/QC program has improved the quality and value of antimalarial drug measurement in laboratories globally. It is a model with the potential to be applied to strengthening laboratories more widely and to improving the therapeutics of other infectious diseases. PMID:24777099

  20. Impairments in central cardiovascular function contribute to attenuated reflex vasodilation in aged skin

    PubMed Central

    Stanhewicz, Anna E.; Proctor, David N.; Alexander, Lacy M.; Kenney, W. Larry

    2015-01-01

    During supine passive heating, increases in skin blood flow (SkBF) and cardiac output (Qc) are both blunted in older adults. The aim here was to determine the effect of acutely correcting the peripheral vasodilatory capacity of aged skin on the integrated cardiovascular responses to passive heating. A secondary aim was to examine the SkBF-Qc relation during hyperthermia in the presence (upright posture) and absence (dynamic exercise) of challenges to central venous pressure. We hypothesized that greater increases in SkBF would be accompanied by greater increases in Qc. Eleven healthy older adults (69 ± 3 yr) underwent supine passive heating (0.8°C rise in core temperature; water-perfused suit) after ingesting sapropterin (BH4, a nitric oxide synthase cofactor; 10 mg/kg) or placebo (randomized double-blind crossover design). Twelve young (24 ± 1 yr) subjects served as a comparison group. SkBF (laser-Doppler flowmetry) and Qc (open-circuit acetylene wash-in) were measured during supine heating, heating + upright posture, and heating + dynamic exercise. Throughout supine and upright heating, sapropterin fully restored the SkBF response of older adults to that of young adults, but Qc remained blunted. During heating + upright posture, SkBF failed to decrease in untreated older subjects. There were no age- or treatment-related differences in the SkBF-Qc relation during dynamic exercise. The principal finding of this study was that the blunted Qc response to passive heat stress is directly related to age rather than to the blunted peripheral vasodilatory capacity of aged skin. Furthermore, peripheral impairments to SkBF in the aged may contribute to inapposite responses to challenges to central venous pressure during hyperthermia. PMID:26494450

  1. Development of a portable quality control application using a tablet-type electronic device.

    PubMed

    Ono, Tomohiro; Miyabe, Yuki; Akimoto, Mami; Mukumoto, Nobutaka; Ishihara, Yoshitomo; Nakamura, Mitsuhiro; Mizowaki, Takashi

    2018-03-01

    Our aim was to develop a portable quality control (QC) application comprising a thermometer, a barometer, an angle gauge, and a range finder implemented on a tablet-type consumer electronic device (CED), and to assess the accuracy of its measurements. The QC application was programmed using Java and OpenCV libraries. First, temperature and atmospheric pressure were measured over 30 days using the temperature and pressure sensors of the CED and compared with those measured by a double-tube thermometer and a digital barometer. Second, the angle gauge was developed using the accelerometer of the CED. The roll and pitch angles of the CED were measured from 0 to 90° at intervals of 10° in the clockwise (CW) and counterclockwise (CCW) directions, and the values were compared with those measured by a digital angle gauge. Third, a range finder was developed using the tablet's built-in camera and image-processing capabilities. Surrogate markers were detected by the camera and their positions converted to actual positions using a homographic transformation method. Fiducial markers were placed on a treatment couch and moved 100 mm in 10-mm steps in both the lateral and longitudinal directions, and the values were compared with the digital output of the treatment couch. Differences between CED values and those of the reference devices were expressed as means ± standard deviations (SDs). The mean ± SD differences in temperature and atmospheric pressure were -0.07 ± 0.25°C and 0.05 ± 0.10 hPa, respectively. The mean ± SD difference in angle was -0.17 ± 0.87° (0.15 ± 0.23° excluding the 90° angle). The mean ± SD difference in distance was 0.01 ± 0.07 mm in both the lateral and longitudinal directions. Our portable QC application was accurate and may be used instead of standard measuring devices; it is efficient and simple to use in the field of medical physics. © 2018 American Association of Physicists in Medicine.
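    The range finder described above maps detected marker pixels to couch-plane coordinates with a homography. A minimal OpenCV sketch in Python follows (the app itself used Java + OpenCV; all coordinate values here are hypothetical):

```python
import numpy as np
import cv2

# Pixel positions of four fiducial markers seen by the tablet camera,
# and their known couch-plane coordinates in mm (all values hypothetical).
px = np.array([[102, 88], [540, 92], [548, 410], [98, 405]], np.float32)
mm = np.array([[0, 0], [100, 0], [100, 100], [0, 100]], np.float32)

H, _ = cv2.findHomography(px, mm)

def pixel_to_couch(u, v):
    """Map a detected marker's pixel position to couch-plane mm
    via the homography (projective) transform."""
    x, y, w = H @ np.array([u, v, 1.0])
    return x / w, y / w

print(pixel_to_couch(320.0, 250.0))
```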

  2. Analysis of glycoprotein processing in the endoplasmic reticulum using synthetic oligosaccharides.

    PubMed

    Ito, Yukishige; Takeda, Yoichi

    2012-01-01

    Protein quality control (QC) in the endoplasmic reticulum (ER) comprises many steps, including the folding and transport of nascent proteins as well as the degradation of misfolded proteins. Recent studies have revealed that high-mannose-type glycans play a pivotal role in the QC process. To probe the molecular basis of this process with well-defined homogeneous compounds, we achieved a convergent synthesis of high-mannose-type glycans and their functionalized derivatives. We focused on analyses of UDP-Glc:glycoprotein glucosyltransferase (UGGT) and ER glucosidase II, which play crucial roles in glycoprotein QC but whose specificities remain unclear. In addition, we established an in vitro assay system mimicking the in vivo environment, which is highly crowded owing to the presence of various biomacromolecules.

  3. Effect of different solutions on color stability of acrylic resin-based dentures.

    PubMed

    Goiato, Marcelo Coelho; Nóbrega, Adhara Smith; dos Santos, Daniela Micheline; Andreotti, Agda Marobo; Moreno, Amália

    2014-01-01

    The aim of this study was to evaluate the effect of thermocycling and immersion in mouthwash or beverage solutions on the color stability of four different acrylic resin-based dentures (Onda Cryl, OC; QC20, QC; Classico, CL; and Lucitone, LU). The factors evaluated were type of acrylic resin, immersion time, and solution (mouthwash or beverage). A total of 224 denture samples were fabricated. For each type of resin, eight samples were immersed in mouthwashes (Plax-Colgate, PC; Listerine, LI; and Oral-B, OB), beverages (coffee, CP; cola, C; and wine, W), or artificial saliva (AS; control). The color change (ΔE) was evaluated before (baseline) and after thermocycling (T1), and after immersion in solution for 1 h (T2), 3 h (T3), 24 h (T4), 48 h (T5), and 96 h (T6). The CIE Lab system was used to determine the color changes. The thermocycling test was performed for 5000 cycles. Data were submitted to three-way repeated-measures analysis of variance and Tukey's test (p<0.05). When the samples were immersed in each mouthwash, all assessed factors, alone or in combination, significantly influenced the color change values, except that there was no interaction between mouthwash and acrylic resin. Similarly, when the samples were immersed in each beverage, all studied factors influenced the color change values. In general, regardless of the solution, LU exhibited the greatest ΔE values in the period from T1 to T5, and QC presented the greatest ΔE values at T6. Thus, thermocycling and immersion in the various solutions influenced the color stability of the acrylic resins, with QC showing the greatest color alteration.
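    In its simplest (CIE76) formulation, the CIE Lab color change reduces to a Euclidean distance between Lab triples. A sketch with hypothetical readings (the study may have used a different ΔE variant):

```python
import math

def delta_e(lab1, lab2):
    """CIE76 color difference between two CIELab (L*, a*, b*) triples."""
    return math.dist(lab1, lab2)

baseline  = (72.0, 1.5, 8.0)    # hypothetical denture-resin reading
after_96h = (69.4, 2.1, 11.3)
print(f"dE = {delta_e(baseline, after_96h):.2f}")
```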

  4. Preparation of clinical-grade 89Zr-panitumumab as a positron emission tomography biomarker for evaluating epidermal growth factor receptor-targeted therapy

    PubMed Central

    Wei, Ling; Shi, Jianfeng; Afari, George; Bhattacharyya, Sibaprasad

    2014-01-01

    Panitumumab is a fully human monoclonal antibody approved for the treatment of epidermal growth factor receptor (EGFR)-positive colorectal cancer. Recently, panitumumab has been radiolabeled with 89Zr and evaluated for its potential as an immuno-positron emission tomography (PET) probe for EGFR-positive cancers. Interesting preclinical results published by several groups of researchers prompted us to develop a robust procedure for producing clinical-grade 89Zr-panitumumab as an immuno-PET probe to evaluate EGFR-targeted therapy. In this process, clinical-grade panitumumab is bioconjugated with the desferrioxamine chelate and subsequently radiolabeled with 89Zr, resulting in high radiochemical yield (>70%, n=3) and purity (>98%, n=3). All quality control (QC) tests were performed according to United States Pharmacopeia specifications and showed that 89Zr-panitumumab met all specifications for human injection. Herein, we describe a step-by-step method for the facile synthesis and QC testing of 89Zr-panitumumab for medical use. The entire process of bioconjugation, radiolabeling, and all QC tests takes about 5 h. Because the synthesis is fully manual, two rapid in-process QC tests have been introduced to make the procedure robust and error-free. PMID:24448743

  5. QC-ART: A tool for real-time quality control assessment of mass spectrometry-based proteomics data.

    PubMed

    Stanfill, Bryan A; Nakayasu, Ernesto S; Bramer, Lisa M; Thompson, Allison M; Ansong, Charles K; Clauss, Therese; Gritsenko, Marina A; Monroe, Matthew E; Moore, Ronald J; Orton, Daniel J; Piehowski, Paul D; Schepmoes, Athena A; Smith, Richard D; Webb-Robertson, Bobbie-Jo; Metz, Thomas O

    2018-04-17

    Liquid chromatography-mass spectrometry (LC-MS)-based proteomics studies of large sample cohorts can require months to years to complete. Acquiring consistent, high-quality data in such large-scale studies is challenging because of normal variations in instrument performance over time, as well as artifacts introduced by the samples themselves, such as those due to collection, storage and processing. Existing quality control methods for proteomics data primarily focus on post-hoc analysis to remove low-quality data that would degrade downstream statistics; they are not designed to evaluate the data in near real-time, which would allow for interventions as soon as deviations in data quality are detected. In addition to flagging analyses that demonstrate outlier behavior, evaluating how the data structure changes over time can aid in understanding typical instrument performance or identify issues such as degradation in data quality due to the need for instrument cleaning and/or re-calibration. To address this gap for proteomics, we developed Quality Control Analysis in Real-Time (QC-ART), a tool for evaluating data as they are acquired in order to dynamically flag potential issues with instrument performance or sample quality. QC-ART has accuracy similar to that of standard post-hoc analysis methods, with the additional benefit of real-time analysis. We demonstrate the utility and performance of QC-ART in identifying deviations in data quality due to both instrument and sample issues in near real-time for LC-MS-based plasma proteomics analyses of a sample subset of The Environmental Determinants of Diabetes in the Young cohort. We also present a case in which QC-ART facilitated the identification of oxidative modifications, which are often underappreciated in proteomic experiments. Published under license by The American Society for Biochemistry and Molecular Biology, Inc.
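    In the spirit of real-time QC, though not QC-ART's actual statistic, a streaming outlier flag can be built from a robust z-score against a trailing window:

```python
import numpy as np

def realtime_flags(values, window=30, z_cut=3.5):
    """Flag each incoming value whose robust z-score against the
    trailing `window` observations exceeds `z_cut` (illustrative
    stand-in for QC-ART's statistic)."""
    flags = []
    for i, x in enumerate(values):
        ref = np.asarray(values[max(0, i - window):i], float)
        if ref.size < 5:          # not enough history yet
            flags.append(False)
            continue
        med = np.median(ref)
        mad = np.median(np.abs(ref - med)) or 1e-9  # avoid divide-by-zero
        flags.append(abs(x - med) / (1.4826 * mad) > z_cut)
    return flags

print(realtime_flags([10, 11, 10, 12, 11, 10, 11, 30])[-1])  # True
```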

  6. Complete genomic sequences for hepatitis C virus subtypes 4b, 4c, 4d, 4g, 4k, 4l, 4m, 4n, 4o, 4p, 4q, 4r and 4t.

    PubMed

    Li, Chunhua; Lu, Ling; Wu, Xianghong; Wang, Chuanxi; Bennett, Phil; Lu, Teng; Murphy, Donald

    2009-08-01

    In this study, we characterized the full-length genomic sequences of 13 distinct hepatitis C virus (HCV) genotype 4 isolates/subtypes: QC264/4b, QC381/4c, QC382/4d, QC193/4g, QC383/4k, QC274/4l, QC249/4m, QC97/4n, QC93/4o, QC139/4p, QC262/4q, QC384/4r and QC155/4t. These were amplified, using RT-PCR, from the sera of patients now residing in Canada, 11 of whom were African immigrants. The resulting genomes varied between 9421 and 9475 nt in length, and each contains a single ORF of 9018-9069 nt. The sequences showed nucleotide similarities of 77.3-84.3% in comparison with subtypes 4a (GenBank accession no. Y11604) and 4f (EF589160), and of 70.6-72.8% in comparison with genotype 1 (M62321/1a, M58335/1b, D14853/1c, and 1?/AJ851228) reference sequences. These similarities were often higher than those currently defined by HCV classification criteria for subtype (75.0-80.0%) and genotype (67.0-70.0%) division, respectively. Further analyses of the complete and partial E1 and partial NS5B sequences confirmed these 13 'provisionally assigned subtypes'.

  7. Precision oncology using a limited number of cells: optimization of whole genome amplification products for sequencing applications.

    PubMed

    Sho, Shonan; Court, Colin M; Winograd, Paul; Lee, Sangjun; Hou, Shuang; Graeber, Thomas G; Tseng, Hsian-Rong; Tomlinson, James S

    2017-07-01

    Sequencing analysis of circulating tumor cells (CTCs) enables "liquid biopsy" to guide precision oncology strategies. However, this requires low-template whole genome amplification (WGA), which is prone to errors and biases from uneven amplification. Currently, quality control (QC) methods for WGA products, as well as the number of CTCs needed for reliable downstream sequencing, remain poorly defined. We sought to define strategies for selecting and generating optimal WGA products from low-template input as they relate to potential applications in precision oncology. Single pancreatic cancer cells (HPAF-II) were isolated using laser microdissection. WGA was performed using multiple displacement amplification (MDA), multiple annealing and looping-based amplification cycles (MALBAC) and PicoPLEX. The quality of amplified DNA products was assessed using a multiplex RT-qPCR-based method that evaluates 8 cancer-related genes, and QC scores were assigned. We utilized this scoring system to assess the impact of de novo modifications to the WGA protocol. WGA products were subjected to Sanger sequencing, array comparative genomic hybridization (aCGH) and next-generation sequencing (NGS) to evaluate their performance in the respective downstream analyses, providing validation of the QC score. Single-cell WGA products exhibited significant sample-to-sample variability in amplified DNA quality as assessed by our 8-gene QC assay. Single-cell WGA products that passed the pre-analysis QC had lower amplification bias and improved aCGH/NGS performance metrics compared with single-cell WGA products that failed the QC. Increasing the cellular input improved QC scores overall, but a WGA product that consistently passed the QC step required a starting input of at least 20 cells. Our modified WGA protocol effectively reduced this number, achieving reproducible high-quality WGA products from ≥5 cells as a starting template. A starting input of 5 to 10 cells amplified using the modified WGA achieved aCGH and NGS results that closely matched those of unamplified batch genomic DNA. The modified WGA protocol coupled with the 8-gene QC serves as an effective strategy to enhance the quality of low-template WGA reactions. Furthermore, a threshold of 5-10 cells is likely needed for a reliable WGA reaction and a product with high fidelity to the original starting template.

  8. Reproducibility of the exponential rise technique of CO(2) rebreathing for measuring P(v)CO(2) and C(v)CO(2) to non-invasively estimate cardiac output during incremental, maximal treadmill exercise.

    PubMed

    Cade, W Todd; Nabar, Sharmila R; Keyser, Randall E

    2004-05-01

    The purpose of this study was to determine the reproducibility of the indirect Fick method for the measurement of mixed venous carbon dioxide partial pressure (P(v)CO(2)) and venous carbon dioxide content (C(v)CO(2)) for estimation of cardiac output (Q(c)), using the exponential rise method of carbon dioxide rebreathing, during non-steady-state treadmill exercise. Ten healthy participants (eight female and two male) performed three incremental, maximal treadmill exercise tests to exhaustion within 1 week. Non-invasive Q(c) measurements were evaluated at rest, during each 3-min stage, and at peak exercise, across the three identical treadmill tests, using the exponential rise technique for measuring mixed venous PCO(2) and CCO(2) and estimating the venous-arterial carbon dioxide content difference (C(v-a)CO(2)). Measurements were divided into measured or estimated variables [heart rate (HR), oxygen consumption (VO(2)), volume of expired carbon dioxide (VCO(2)), end-tidal carbon dioxide (P(ET)CO(2)), arterial carbon dioxide partial pressure (P(a)CO(2)), venous carbon dioxide partial pressure (P(v)CO(2)), and C(v-a)CO(2)] and cardiorespiratory variables derived from the measured variables [Q(c), stroke volume (V(s)), and arteriovenous oxygen difference (C(a-v)O(2))]. In general, the derived cardiorespiratory variables demonstrated acceptable (R=0.61) to high (R>0.80) reproducibility, especially at higher intensities and peak exercise. Measured variables, excluding P(a)CO(2) and C(v-a)CO(2), also demonstrated acceptable (R=0.6 to 0.79) to high reliability. The current study demonstrated acceptable to high reproducibility of the exponential rise indirect Fick method in the measurement of mixed venous PCO(2) and CCO(2) for estimation of Q(c) during incremental treadmill exercise testing, especially at high-intensity and peak exercise. The underlying relation is given below.
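    The indirect Fick relation underlying the method, in the notation of the abstract (a standard identity, stated here for clarity):

```latex
% Indirect Fick principle for CO2: cardiac output equals CO2 output
% divided by the venous-arterial CO2 content difference.
\[
  Q_c \;=\; \frac{\dot{V}\mathrm{CO}_2}{C_{v-a}\mathrm{CO}_2}
      \;=\; \frac{\dot{V}\mathrm{CO}_2}{C_v\mathrm{CO}_2 - C_a\mathrm{CO}_2}
\]
```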

  9. Quality Control Methodology Of A Surface Wind Observational Database In North Eastern North America

    NASA Astrophysics Data System (ADS)

    Lucio-Eceiza, Etor E.; Fidel González-Rouco, J.; Navarro, Jorge; Conte, Jorge; Beltrami, Hugo

    2016-04-01

    This work summarizes the design and application of a Quality Control (QC) procedure for an observational surface wind database located in north-eastern North America. The database consists of 526 sites (486 land stations and 40 buoys) with varying resolutions of hourly, 3-hourly and 6-hourly data, compiled from three different source institutions with inconsistent measurement units and changing measuring procedures, instrumentation and heights. The records span from 1953 to 2010. The QC process is composed of different phases focused either on problems related to the providing source institutions or on measurement errors. The first phases deal with problems often related to data recording and management: (1) a compilation stage dealing with the detection of typographical errors, decoding problems, site displacements and the unification of institutional practices; (2) detection of erroneous data-sequence duplications within a station or among different ones; (3) detection of physically unrealistic data measurements. The last phases focus on instrumental errors: (4) problems related to low variability, placing particular emphasis on the detection of unrealistically low wind speed records with the help of regional references; (5) erroneous records related to high variability; (6) standardization of wind speed biases due to changing measurement heights, detection of wind speed biases on weekly to monthly timescales, and homogenization of wind direction records. As a result, around 1.7% of wind speed records and 0.4% of wind direction records have been deleted, for a combined total of 1.9% of records removed. Additionally, around 15.9% of wind speed records and 2.4% of wind direction data have also been corrected. A sketch of the phase (3) range check appears below.
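    Phase (3), the removal of physically unrealistic records, amounts to a vectorized range check. A minimal sketch (the 75 m/s ceiling is a hypothetical threshold, not the study's):

```python
import numpy as np

def flag_unphysical(speed_ms, direction_deg, max_speed=75.0):
    """Return a boolean mask of records with physically unrealistic
    values: negative or extreme wind speeds, or directions outside
    the 0-360 degree range."""
    speed = np.asarray(speed_ms, float)
    direc = np.asarray(direction_deg, float)
    return (speed < 0) | (speed > max_speed) | (direc < 0) | (direc > 360)

print(flag_unphysical([3.2, -1.0, 80.0], [120, 45, 361]))
# [False  True  True] -> negative speed, extreme speed + bad direction
```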

  10. LETTER TO THE EDITOR: The quasi-coherent signature of enhanced Dα H-mode in Alcator C-Mod

    NASA Astrophysics Data System (ADS)

    Snipes, J. A.; La Bombard, B.; Greenwald, M.; Hutchinson, I. H.; Irby, J.; Lin, Y.; Mazurenko, A.; Porkolab, M.

    2001-04-01

    The steady-state H-mode regime found at moderate to high density in Alcator C-Mod, known as enhanced Dα (EDA) H-mode, appears to be maintained by a continuous quasi-coherent (QC) mode in the steep edge-gradient region. Large-amplitude density and magnetic fluctuations with typical frequencies of about 100 kHz are driven by the QC mode. These fluctuations are measured in the steep edge-gradient region by inserting a fast-scanning probe containing two poloidally separated Langmuir probes and a poloidal field pick-up coil. As the probe approaches the plasma edge, clear magnetic fluctuations were measured within about 2 cm of the last-closed flux surface (LCFS). The mode amplitude falls off rapidly with distance from the plasma centre, with an exponential decay constant of kr ≈ 1.5 cm⁻¹ measured 10 cm above the outboard midplane. The root-mean-square amplitude of the poloidal field fluctuation extrapolated to the LCFS was approximately 5 G. The density fluctuations, on the other hand, were visible on the Langmuir probe only when it was within a few millimetres of the LCFS. The potential and density fluctuations were sufficiently in phase to enhance particle transport at the QC mode frequency. These results show that the QC signature of the EDA H-mode is an electromagnetic mode that appears to be responsible for the enhanced particle transport in the plasma edge.

  11. AmeriFlux Data Processing: Integrating automated and manual data management across software technologies and an international network to generate timely data products

    NASA Astrophysics Data System (ADS)

    Christianson, D. S.; Beekwilder, N.; Chan, S.; Cheah, Y. W.; Chu, H.; Dengel, S.; O'Brien, F.; Pastorello, G.; Sandesh, M.; Torn, M. S.; Agarwal, D.

    2017-12-01

    AmeriFlux is a network of scientists who independently collect eddy covariance and related environmental observations at over 250 locations across the Americas. As part of the AmeriFlux Management Project, the AmeriFlux Data Team manages standardization, collection, quality assurance / quality control (QA/QC), and distribution of data submitted by network members. To generate data products that are timely, QA/QC'd, and repeatable, and have traceable provenance, we developed a semi-automated data processing pipeline. The new pipeline consists of semi-automated format and data QA/QC checks. Results are communicated via on-line reports as well as an issue-tracking system. Data processing time has been reduced from 2-3 days to a few hours of manual review time, resulting in faster data availability from the time of data submission. The pipeline is scalable to the network level and has the following key features. (1) On-line results of the format QA/QC checks are available immediately for data provider review. This enables data providers to correct and resubmit data quickly. (2) The format QA/QC assessment includes an automated attempt to fix minor format errors. Data submissions that are formatted in the new AmeriFlux FP-In standard can be queued for the data QA/QC assessment, often with minimal delay. (3) Automated data QA/QC checks identify and communicate potentially erroneous data via online, graphical quick views that highlight observations with unexpected values, incorrect units, time drifts, invalid multivariate correlations, and/or radiation shadows. (4) Progress through the pipeline is integrated with an issue-tracking system that facilitates communications between data providers and the data processing team in an organized and searchable fashion. Through development of these and other features of the pipeline, we present solutions to challenges that include optimizing automated with manual processing, bridging legacy data management infrastructure with various software tools, and working across interdisciplinary and international science cultures. Additionally, we discuss results from community member feedback that helped refine QA/QC communications for efficient data submission and revision.

  12. Unanticipated error in HbA(1c) measurement on the HLC-723 G7 analyzer.

    PubMed

    van den Ouweland, Johannes M W; de Keijzer, Marinus H; van Daal, Henny

    2010-04-01

    Investigation of falsely elevated HbA(1c) measurements on the HLC-723 G7 analyser. Comparison of HbA(1c) in blood samples diluted either in hemolysis reagent or in water. HbA(1c) results became falsely elevated when samples were diluted in hemolysis reagent, but not in water. QC procedures failed to detect this error because calibrator and QC samples were manually diluted in water, according to the manufacturer's instructions, whereas patient samples were automatically diluted using hemolysing reagent. After replacement of the instrument's sample loop and rotor seal, comparable HbA(1c) results were obtained irrespective of dilution with hemolysing reagent or water. This case illustrates the importance of treating calibrator and QC materials in the same way as routine patient samples in order to prevent unnoticed drift in patient HbA(1c) results. Copyright 2010 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.

  13. Results from 15 years of quality surveillance for a National Indigenous Point-of-Care Testing Program for diabetes.

    PubMed

    Shephard, Mark; Shephard, Anne; McAteer, Bridgit; Regnier, Tamika; Barancek, Kristina

    2017-12-01

    Diabetes is a major health problem for Australia's Aboriginal and Torres Strait Islander peoples. Point-of-care testing for haemoglobin A1c (HbA1c) has been the cornerstone of a long-standing program (QAAMS) to manage glycaemic control in Indigenous people with diabetes and, recently, to diagnose diabetes. The QAAMS quality management framework includes monthly testing of quality control (QC) and external quality assurance (EQA) samples. Key performance indicators of quality include imprecision (coefficient of variation [CV%]) and the percentage of acceptable results. This paper reports on the past 15 years of quality testing in QAAMS and examines the performance of HbA1c POC testing at the 6.5% cut-off recommended for diagnosis. The total number of HbA1c EQA results submitted from 2002 to 2016 was 29,093. The median imprecision for EQA testing by QAAMS device operators averaged 2.81% (SD 0.50; range 2.2 to 3.9%) from 2002 to 2016 and 2.44% (SD 0.22; range 2.2 to 2.9%) from 2009 to 2016. No significant difference was observed between the median imprecision achieved in QAAMS and by Australasian laboratories from 2002 to 2016 (p=0.05; two-tailed paired t-test) or from 2009 to 2016 (p=0.17; two-tailed paired t-test). For QC testing from 2009 to 2016, imprecision averaged 2.5% and 3.0% for the two levels of QC tested. Acceptable results averaged 90% for EQA testing from 2002 to 2016 and 96% for QC testing from 2009 to 2016. The DCA Vantage was able to measure a patient sample and an EQA sample with an HbA1c value close to 6.5% both accurately and precisely. HbA1c POC testing in QAAMS has remained analytically sound, has matched the quality achieved by Australasian laboratories and has met profession-derived analytical goals for 15 years. Copyright © 2017 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
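    The imprecision indicator used throughout QAAMS is the coefficient of variation. A one-function sketch with hypothetical replicate values:

```python
import statistics

def cv_percent(values):
    """Imprecision as coefficient of variation (CV%) of replicate
    QC or EQA results: 100 * SD / mean."""
    return 100 * statistics.stdev(values) / statistics.mean(values)

monthly_hba1c_qc = [6.4, 6.5, 6.6, 6.5, 6.3, 6.5]  # hypothetical %HbA1c
print(f"CV = {cv_percent(monthly_hba1c_qc):.2f}%")
```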

  14. Protecting the proteome: Eukaryotic cotranslational quality control pathways

    PubMed Central

    2014-01-01

    The correct decoding of messenger RNAs (mRNAs) into proteins is an essential cellular task. The translational process is monitored by several quality control (QC) mechanisms that recognize defective translation complexes in which ribosomes are stalled on substrate mRNAs. Stalled translation complexes occur when defects in the mRNA template, the translation machinery, or the nascent polypeptide arrest the ribosome during translation elongation or termination. These QC events promote the disassembly of the stalled translation complex and the recycling and/or degradation of the individual mRNA, ribosomal, and/or nascent polypeptide components, thereby clearing the cell of improper translation products and defective components of the translation machinery. PMID:24535822

  15. General Quality Control (QC) Guidelines for SAM Methods

    EPA Pesticide Factsheets

    Learn more about quality control guidelines and recommendations for the analysis of samples using the methods listed in EPA's Selected Analytical Methods for Environmental Remediation and Recovery (SAM).

  16. Spectrally high performing quantum cascade lasers

    NASA Astrophysics Data System (ADS)

    Toor, Fatima

    Quantum cascade (QC) lasers are versatile semiconductor light sources that can be engineered to emit light of almost any wavelength in the mid- to far-infrared (IR) and terahertz region, from 3 to 300 μm [1-5]. Furthermore, QC laser technology in the mid-IR range has great potential for applications in environmental, medical and industrial trace-gas sensing [6-10], since several chemical vapors have strong rovibrational frequencies in this range and are uniquely identifiable by their absorption spectra through optical probing of absorption and transmission. Therefore, having a wide range of mid-IR wavelengths in a single QC laser source would greatly increase the specificity of QC laser-based spectroscopic systems and also make them more compact and field-deployable. This thesis presents work on several different approaches to multi-wavelength QC laser sources that take advantage of band-structure engineering and the unipolar nature of QC lasers. Also, since lasers with narrow linewidth are needed for chemical sensing, work is presented on a single-mode distributed feedback (DFB) QC laser. First, a compact four-wavelength QC laser source is presented, based on a 2-by-2 module design with two waveguides each containing QC laser stacks for two different emission wavelengths: one with 7.0 μm/11.2 μm and the other with 8.7 μm/12.0 μm. This is the first design of a four-wavelength QC laser source with widely different emission wavelengths that uses minimal optics and electronics. Second, since several unknown factors still affect QC laser performance, results are presented from the first study of the effects of waveguide sidewall roughness on QC laser performance, using the two-wavelength waveguides. The results are consistent with Rayleigh scattering in the waveguides, with roughness affecting shorter wavelengths more than longer wavelengths. Third, a versatile time-multiplexed multi-wavelength QC laser system is presented that emits at λ = 10.8 μm for positive and λ = 8.6 μm for negative polarity current, with microsecond time delay. Such a system is the first demonstration of a time- and wavelength-multiplexed system that uses a single QC laser. Fourth, work is presented on the design and fabrication of a single-mode distributed feedback (DFB) QC laser emitting at λ ≈ 7.7 μm for use in a QC laser-based photoacoustic sensor. The DFB QC laser had a temperature tuning coefficient of 0.45 nm/K over a temperature range of 80 K to 320 K and a side-mode suppression ratio of greater than 30 dB. Finally, a study of the lateral mode patterns of wide-ridge QC lasers is presented, including the observation of degenerate and non-degenerate lateral modes in wide-ridge QC lasers emitting at λ ≈ 5.0 μm. This study was conducted with the end goal of using wide-ridge QC lasers in a novel technique to spatiospectrally combine multiple transverse modes to obtain an ultra-high-power single-spot QC laser beam.

  17. Spatial variation of crustal coda Q in California

    USGS Publications Warehouse

    Philips, W.S.; Lee, W.H.K.; Newberry, J.T.

    1988-01-01

Coda wave data from California microearthquakes were studied in order to delineate regional fluctuations of apparent crustal attenuation in the band 1.5 to 24 Hz. Apparent attenuation was estimated using a single-backscattering model of coda waves. The coda wave data were restricted to ≤30 s following the origin time; this ensures that crustal effects dominate the results, as the backscattered shear waves thought to form the coda would not have had time to penetrate much deeper. Results indicate a strong variation in apparent crustal attenuation at high frequencies between the Franciscan and Salinian regions of central California and the Long Valley area of the Sierra Nevada. Although the coda Q measurements coincide at 1.5 Hz (Qc = 100), at 24 Hz there is a factor of four difference between the measurements made in the Franciscan (Qc = 525) and Long Valley (Qc = 2100), with the Salinian midway between (Qc = 900). These are extremely large variations compared with measures of seismic velocities of comparable resolution, demonstrating the exceptional sensitivity of the high-frequency coda Q measurement to regional geology. In addition, the frequency trend of the results is opposite to that seen in a compilation of coda Q measurements made worldwide by other authors, which tend to converge at high and diverge at low frequencies; however, the worldwide results generally were obtained without limiting the coda lengths and probably reflect upper mantle rather than crustal properties. Our results match those expected due to scattering in random media represented by Von Karman autocorrelation functions of orders 1/2 to 1/3. The Von Karman medium of order 1/3, corresponding to the Franciscan coda Q measurement, contains greater amounts of high-wavenumber fluctuations. This indicates relatively large medium fluctuations with wavelengths on the order of 100 m in the highly deformed crust associated with the Franciscan; however, the influence of scattering on the coda Q measurement is currently a matter of controversy. © 1988 Birkhäuser Verlag.

  18. Overview of the Nordic Seas CARINA data and salinity measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Olsen, Are; Key, Robert; Jeansson, Emil

    2009-01-01

Water column data of carbon and carbon-relevant hydrographic and hydrochemical parameters from 188 previously non-publicly available cruises in the Arctic, Atlantic, and Southern Ocean have been retrieved and merged into a new database: CARINA (CARbon IN the Atlantic). The data have been subject to rigorous quality control (QC) in order to ensure the highest possible quality and consistency. The data for most of the parameters included were examined in order to quantify systematic biases in the reported values, i.e. secondary quality control. Significant biases have been corrected for in the data products, i.e. the three merged files with measured, calculated, and interpolated values for each of the three CARINA regions: the Arctic Mediterranean Seas (AMS), the Atlantic (ATL), and the Southern Ocean (SO). With the adjustments, the CARINA database is consistent both internally and with GLODAP (Key et al., 2004) and is suitable for accurate assessments of, for example, oceanic carbon inventories and uptake rates, and for model validation. The Arctic Mediterranean Seas include the Arctic Ocean and the Nordic Seas, and the quality control was carried out separately in these two areas. This contribution provides an overview of the CARINA data from the Nordic Seas and summarizes the findings of the QC of the salinity data. One cruise had salinity data of questionable quality, and these have been removed from the data product. An evaluation of the consistency of the quality-controlled salinity data suggests that they are consistent to at least 0.005.

  19. Quantum correlation properties in Matrix Product States of finite-number spin rings

    NASA Astrophysics Data System (ADS)

    Zhu, Jing-Min; He, Qi-Kai

    2018-02-01

The organization and structure of quantum correlation (QC) in quantum spin chains are very rich and complex. Hence the depiction and measurement of the QC of finite-number spin rings deserve intensive investigation using Matrix Product States (MPSs), in addition to the infinite-number case. Here the dependencies of the geometric quantum discord (GQD) of two spin blocks on the total spin number, the spacing spin number, and the environment parameter are presented in detail. We also compare the GQD with the total correlation (TC) and the classical correlation (CC) and illustrate its characteristics. Our findings may provide the potential for designing optimal QC experimental detection proposals and pave the way for the design of optimal quantum information processing schemes.

  20. Pulmonary diffusing capacity, capillary blood volume, and cardiac output during sustained microgravity

    NASA Technical Reports Server (NTRS)

    Prisk, G. K.; Guy, Harold J. B.; Elliott, Ann R.; Deutschman, Robert A., III; West, John B.

    1993-01-01

    We measured pulmonary diffusing capacity (DL), diffusing capacity per unit lung volume, pulmonary capillary blood volume (Vc), membrane diffusing capacity (Dm), pulmonary capillary blood flow or cardiac output (Qc), and cardiac stroke volume (SV) in four subjects exposed to nine days of microgravity. DL in microgravity was elevated compared with preflight standing values and was higher than preflight supine because of the elevation of both Vc and Dm. The elevation in Vc was comparable to that measured supine in 1 G, but the increase in Dm was in sharp contrast to the supine value. We postulate that, in 0 G, pulmonary capillary blood is evenly distributed throughout the lung, providing for uniform capillary filling, leading to an increase in the surface area available for diffusion. By contrast, in the supine 1-G state, the capillaries are less evenly filled, and although a similar increase in blood volume is observed, the corresponding increase in surface area does not occur. DL and its subdivisions showed no adaptive changes from the first measurement 24 h after the start of 0 G to eight days later. Similarly, there were no trends in the postflight data, suggesting that the principal mechanism of these changes was gravitational. The increase in Dm suggests that subclinical pulmonary edema did not result from exposure to 0 G. Qc was modestly increased inflight and decreased postflight compared with preflight standing. Compared with preflight standing, SV was increased 46 percent inflight and decreased 14 percent in the 1st week postflight. There were temporal changes in Qc and SV during 0 G, with the highest values recorded at the first measurement, 24 h into the flight. The lowest values of Qc and SV occurred on the day of return.

  1. Aircraft Measurements for Understanding Air-Sea Coupling and Improving Coupled Model Predictions Over the Indian Ocean

    DTIC Science & Technology

    2012-09-30

December 2011 • Daily weather forecasts and briefing for aircraft operations in Diego Garcia; reports posted on EOL field catalog in real time (http...); summary of aircraft missions posted on EOL website (http://catalog.eol.ucar.edu/cgi-bin/dynamo/report/index) 3. Post-field campaign (including ongoing data analysis into FY13): • Dropsonde data analysis; worked with EOL on data quality control (QC); participated in the DYNAMO Sounding Workshop

  2. Glutaminyl Cyclase Knock-out Mice Exhibit Slight Hypothyroidism but No Hypogonadism

    PubMed Central

    Schilling, Stephan; Kohlmann, Stephanie; Bäuscher, Christoph; Sedlmeier, Reinhard; Koch, Birgit; Eichentopf, Rico; Becker, Andreas; Cynis, Holger; Hoffmann, Torsten; Berg, Sabine; Freyse, Ernst-Joachim; von Hörsten, Stephan; Rossner, Steffen; Graubner, Sigrid; Demuth, Hans-Ulrich

    2011-01-01

Glutaminyl cyclases (QCs) catalyze the formation of pyroglutamate (pGlu) residues at the N terminus of peptides and proteins. Hypothalamic pGlu hormones, such as thyrotropin-releasing hormone and gonadotropin-releasing hormone, are essential for regulation of metabolism and fertility in the hypothalamic-pituitary-thyroid and -gonadal axes, respectively. Here, we analyzed the consequences of constitutive genetic QC ablation on endocrine functions and on the behavior of adult mice. Adult homozygous QC knock-out mice are fertile and behave indistinguishably from wild-type mice in tests of motor function, cognition, general activity, and ingestion behavior. The QC knock-out results in a dramatic drop of enzyme activity in the brain, especially in the hypothalamus, and in plasma. Other peripheral organs, such as liver and spleen, still contain QC activity, which is most likely attributable to its homolog isoQC. The serum gonadotropin-releasing hormone, TSH, and testosterone concentrations were not changed by QC depletion. The serum thyroxine was decreased by 24% in homozygous QC knock-out animals, suggesting a mild hypothyroidism. QC knock-out mice were indistinguishable from wild type with regard to blood glucose and glucose tolerance, thus differing significantly from reports of thyrotropin-releasing hormone knock-out mice. The results suggest a significant formation of the hypothalamic pGlu hormones by alternative mechanisms, such as spontaneous cyclization or conversion by isoQC. The different effects of QC depletion on the hypothalamic-pituitary-thyroid and -gonadal axes might indicate slightly different modes of substrate conversion by the two enzymes. The absence of significant abnormalities in QC knock-out mice suggests the presence of a therapeutic window for suppression of QC activity in current drug development. PMID:21330373

  3. Analytical performance evaluation of a high-volume hematology laboratory utilizing sigma metrics as standard of excellence.

    PubMed

    Shaikh, M S; Moiz, B

    2016-04-01

Around two-thirds of important clinical decisions about the management of patients are based on laboratory test results. Clinical laboratories are required to adopt quality control (QC) measures to ensure provision of accurate and precise results. Six sigma is a statistical tool that provides an opportunity to assess performance at the highest level of excellence. The purpose of this study was to assess the performance of our hematological parameters on the sigma scale in order to identify gaps, and hence areas of improvement, in patient care. The twelve analytes included in the study were hemoglobin (Hb), hematocrit (Hct), red blood cell count (RBC), mean corpuscular volume (MCV), red cell distribution width (RDW), total leukocyte count (TLC) with percentages of neutrophils (Neutr%) and lymphocytes (Lymph%), platelet count (Plt), mean platelet volume (MPV), prothrombin time (PT), and fibrinogen (Fbg). Internal quality control data and external quality assurance survey results were utilized for the calculation of sigma metrics for each analyte. An acceptable sigma value of ≥3 was obtained for the majority of the analytes included in the analysis. MCV, Plt, and Fbg achieved values of <3 for the level 1 (low abnormal) control. PT performed poorly on both level 1 and level 2 controls, with sigma values of <3. Even where conventional QC tools appear acceptable, application of sigma metrics can identify analytical deficits, and hence prospects for improvement, in clinical laboratories. © 2016 John Wiley & Sons Ltd.
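
    As a concrete illustration of the sigma-metric calculation used in studies like this one, a minimal Python sketch follows, assuming the usual definition sigma = (TEa - |bias|) / CV with all terms in percent. The analyte names and the TEa, bias, and CV values are invented for illustration; they are not the study's data.

        # Illustrative sigma-metric calculation for laboratory analytes.
        # sigma = (TEa - |bias|) / CV, all in percent; TEa is the allowable
        # total error, bias typically comes from EQA surveys, CV from
        # internal QC. Example values below are hypothetical.

        def sigma_metric(tea_pct: float, bias_pct: float, cv_pct: float) -> float:
            """Return the sigma metric for one analyte at one QC level."""
            return (tea_pct - abs(bias_pct)) / cv_pct

        analytes = {
            # name: (TEa %, bias %, CV %) -- invented numbers
            "Hb": (7.0, 1.2, 1.1),
            "PT": (15.0, 6.0, 4.0),
        }

        for name, (tea, bias, cv) in analytes.items():
            s = sigma_metric(tea, bias, cv)
            flag = "acceptable" if s >= 3 else "needs improvement"
            print(f"{name}: sigma = {s:.1f} ({flag})")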

  4. Quality assurance and quality control of geochemical data—A primer for the research scientist

    USGS Publications Warehouse

    Geboy, Nicholas J.; Engle, Mark A.

    2011-01-01

    Geochemistry is a constantly expanding science. More and more, scientists are employing geochemical tools to help answer questions about the Earth and earth system processes. Scientists may assume that the responsibility of examining and assessing the quality of the geochemical data they generate is not theirs but rather that of the analytical laboratories to which their samples have been submitted. This assumption may be partially based on knowledge about internal and external quality assurance and quality control (QA/QC) programs in which analytical laboratories typically participate. Or there may be a perceived lack of time or resources to adequately examine data quality. Regardless of the reason, the lack of QA/QC protocols can lead to the generation and publication of erroneous data. Because the interpretations drawn from the data are primary products to U.S. Geological Survey (USGS) stakeholders, the consequences of publishing erroneous results can be significant. The principal investigator of a scientific study ultimately is responsible for the quality and interpretation of the project's findings, and thus must also play a role in the understanding, implementation, and presentation of QA/QC information about the data. Although occasionally ignored, QA/QC protocols apply not only to procedures in the laboratory but also in the initial planning of a research study and throughout the life of the project. Many of the tenets of developing a sound QA/QC program or protocols also parallel the core concepts of developing a good study: What is the main objective of the study? Will the methods selected provide data of enough resolution to answer the hypothesis? How should samples be collected? Are there known or unknown artifacts or contamination sources in the sampling and analysis methods? Assessing data quality requires communication between the scientists responsible for designing the study and those collecting samples, analyzing samples, treating data, and interpreting results. This primer has been developed to provide basic information and guidance about developing QA/QC protocols for geochemical studies. It is not intended to be a comprehensive guide but rather an introduction to key concepts tied to a list of relevant references for further reading. The guidelines are presented in stepwise order beginning with presampling considerations and continuing through final data interpretation. The goal of this primer is to outline basic QA/QC practices that scientists can use before, during, and after chemical analysis to ensure the validity of the data they collect with the goal of providing defendable results and conclusions.

  5. Satellite-Based Quantum Communications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hughes, Richard J; Nordholt, Jane E; McCabe, Kevin P

    2010-09-20

Single-photon quantum communications (QC) offers the attractive feature of 'future-proof' forward security rooted in the laws of quantum physics. Ground-based quantum key distribution (QKD) experiments in optical fiber have attained transmission ranges in excess of 200 km, but for larger distances we proposed a methodology for satellite-based QC. Over the past decade we have devised solutions to the technical challenges of satellite-to-ground QC, and we now have a clear concept for how space-based QC could be performed and potentially utilized within a trusted QKD network architecture. Functioning as a trusted QKD node, a QC satellite ('QC-sat') could deliver secret keys to the key stores of ground-based trusted QKD network nodes, to each of which multiple users are connected by optical fiber or free-space QC. A QC-sat could thereby extend quantum-secured connectivity to geographically disjoint domains, separated by continental or inter-continental distances. In this paper we describe our system concept that makes QC feasible with low-earth-orbit (LEO) QC-sats (200 km to 2,000 km altitude orbits), and the results of link modeling of expected performance. Using the architecture that we have developed, LEO satellite-to-ground QKD will be feasible with secret bit yields of several hundred 256-bit AES keys per contact. With multiple ground sites separated by ≈100 km, mitigation of cloudiness over any single ground site would be possible, potentially allowing multiple contact opportunities each day. The essential next step is an experimental QC-sat. A number of LEO platforms would be suitable, ranging from a dedicated, three-axis-stabilized small satellite, to a secondary experiment on an imaging satellite, to the ISS. With one or more QC-sats, low-latency quantum-secured communications could then be provided to ground-based users on a global scale. Air-to-ground QC would also be possible.

  6. Joint design of QC-LDPC codes for coded cooperation system with joint iterative decoding

    NASA Astrophysics Data System (ADS)

    Zhang, Shunwai; Yang, Fengfan; Tang, Lei; Ejaz, Saqib; Luo, Lin; Maharaj, B. T.

    2016-03-01

In this paper, we investigate the joint design of quasi-cyclic low-density parity-check (QC-LDPC) codes for a coded cooperation system with joint iterative decoding at the destination. First, QC-LDPC codes based on the base matrix and exponent matrix are introduced, and then we describe two types of girth-4 cycles in the QC-LDPC codes employed by the source and relay. In the equivalent parity-check matrix corresponding to the jointly designed QC-LDPC codes employed by the source and relay, all girth-4 cycles, of both type I and type II, are cancelled. Theoretical analysis and numerical simulations show that the jointly designed QC-LDPC coded cooperation effectively combines cooperation gain and channel-coding gain, and outperforms coded non-cooperation under the same conditions. Furthermore, the bit error rate performance of the coded cooperation employing jointly designed QC-LDPC codes is better than those of random LDPC codes and separately designed QC-LDPC codes over AWGN channels.
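
    To make the base-matrix/exponent-matrix construction concrete, the sketch below expands an exponent matrix into a binary QC-LDPC parity-check matrix, assuming the usual convention that a non-negative entry e denotes a z-by-z identity block cyclically shifted by e columns and -1 denotes an all-zero block. The toy exponent matrix is invented, not one of the paper's jointly designed codes.

        import numpy as np

        def expand_qc_ldpc(exponent_matrix, z):
            """Expand an exponent matrix into a binary QC-LDPC parity-check
            matrix: entry e >= 0 becomes a z-by-z identity cyclically
            right-shifted by e; entry -1 becomes the z-by-z zero block."""
            E = np.asarray(exponent_matrix)
            rows, cols = E.shape
            H = np.zeros((rows * z, cols * z), dtype=np.uint8)
            I = np.eye(z, dtype=np.uint8)
            for r in range(rows):
                for c in range(cols):
                    e = E[r, c]
                    if e >= 0:
                        H[r*z:(r+1)*z, c*z:(c+1)*z] = np.roll(I, e, axis=1)
            return H

        # Toy 2x4 exponent matrix with circulant size z = 4 (arbitrary values).
        H = expand_qc_ldpc([[0, 1, -1, 2],
                            [3, -1, 0, 1]], z=4)
        print(H.shape)  # (8, 16)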

  7. quantGenius: implementation of a decision support system for qPCR-based gene quantification.

    PubMed

    Baebler, Špela; Svalina, Miha; Petek, Marko; Stare, Katja; Rotter, Ana; Pompe-Novak, Maruša; Gruden, Kristina

    2017-05-25

    Quantitative molecular biology remains a challenge for researchers due to inconsistent approaches for control of errors in the final results. Due to several factors that can influence the final result, quantitative analysis and interpretation of qPCR data are still not trivial. Together with the development of high-throughput qPCR platforms, there is a need for a tool allowing for robust, reliable and fast nucleic acid quantification. We have developed "quantGenius" ( http://quantgenius.nib.si ), an open-access web application for a reliable qPCR-based quantification of nucleic acids. The quantGenius workflow interactively guides the user through data import, quality control (QC) and calculation steps. The input is machine- and chemistry-independent. Quantification is performed using the standard curve approach, with normalization to one or several reference genes. The special feature of the application is the implementation of user-guided QC-based decision support system, based on qPCR standards, that takes into account pipetting errors, assay amplification efficiencies, limits of detection and quantification of the assays as well as the control of PCR inhibition in individual samples. The intermediate calculations and final results are exportable in a data matrix suitable for further statistical analysis or visualization. We additionally compare the most important features of quantGenius with similar advanced software tools and illustrate the importance of proper QC system in the analysis of qPCR data in two use cases. To our knowledge, quantGenius is the only qPCR data analysis tool that integrates QC-based decision support and will help scientists to obtain reliable results which are the basis for biologically meaningful data interpretation.
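
    A rough sketch of the standard-curve quantification that quantGenius automates: fit Cq against log10 quantity for a dilution series, derive the amplification efficiency, back-calculate unknowns, and normalize to a reference gene. The dilution series and Cq values below are hypothetical, and a single shared curve is used only for brevity; in practice each assay has its own curve plus the QC checks described above.

        import numpy as np

        def fit_standard_curve(log10_qty, cq):
            """Fit Cq = slope*log10(quantity) + intercept; also return the
            amplification efficiency E = 10**(-1/slope) - 1 (ideally ~1)."""
            slope, intercept = np.polyfit(log10_qty, cq, 1)
            efficiency = 10 ** (-1.0 / slope) - 1.0
            return slope, intercept, efficiency

        def quantify(cq, slope, intercept):
            """Back-calculate quantity from a sample Cq via the curve."""
            return 10 ** ((cq - intercept) / slope)

        # Hypothetical dilution series: 10^6 .. 10^2 copies.
        log_qty = np.array([6, 5, 4, 3, 2], dtype=float)
        cq_std = np.array([15.1, 18.4, 21.8, 25.0, 28.4])
        slope, intercept, eff = fit_standard_curve(log_qty, cq_std)
        print(f"efficiency ~ {eff:.2f}")

        goi = quantify(24.2, slope, intercept)  # gene of interest
        ref = quantify(20.5, slope, intercept)  # reference gene (own curve
                                                # in practice; shared here
                                                # only for brevity)
        print(f"normalized quantity: {goi / ref:.3g}")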

  8. QUALITY CONTROLS FOR PCR

    EPA Science Inventory

    The purpose of this presentation is to present an overview of the quality control (QC) sections of a draft EPA document entitled, "Quality Assurance/Quality Control Guidance for Laboratories Performing PCR Analyses on Environmental Samples." This document has been prepared by th...

  9. Adjustment of Pesticide Concentrations for Temporal Changes in Analytical Recovery, 1992-2006

    USGS Publications Warehouse

    Martin, Jeffrey D.; Stone, Wesley W.; Wydoski, Duane S.; Sandstrom, Mark W.

    2009-01-01

    Recovery is the proportion of a target analyte that is quantified by an analytical method and is a primary indicator of the analytical bias of a measurement. Recovery is measured by analysis of quality-control (QC) water samples that have known amounts of target analytes added ('spiked' QC samples). For pesticides, recovery is the measured amount of pesticide in the spiked QC sample expressed as percentage of the amount spiked, ideally 100 percent. Temporal changes in recovery have the potential to adversely affect time-trend analysis of pesticide concentrations by introducing trends in environmental concentrations that are caused by trends in performance of the analytical method rather than by trends in pesticide use or other environmental conditions. This report examines temporal changes in the recovery of 44 pesticides and 8 pesticide degradates (hereafter referred to as 'pesticides') that were selected for a national analysis of time trends in pesticide concentrations in streams. Water samples were analyzed for these pesticides from 1992 to 2006 by gas chromatography/mass spectrometry. Recovery was measured by analysis of pesticide-spiked QC water samples. Temporal changes in pesticide recovery were investigated by calculating robust, locally weighted scatterplot smooths (lowess smooths) for the time series of pesticide recoveries in 5,132 laboratory reagent spikes; 1,234 stream-water matrix spikes; and 863 groundwater matrix spikes. A 10-percent smoothing window was selected to show broad, 6- to 12-month time scale changes in recovery for most of the 52 pesticides. Temporal patterns in recovery were similar (in phase) for laboratory reagent spikes and for matrix spikes for most pesticides. In-phase temporal changes among spike types support the hypothesis that temporal change in method performance is the primary cause of temporal change in recovery. Although temporal patterns of recovery were in phase for most pesticides, recovery in matrix spikes was greater than recovery in reagent spikes for nearly every pesticide. Models of recovery based on matrix spikes are deemed more appropriate for adjusting concentrations of pesticides measured in groundwater and stream-water samples than models based on laboratory reagent spikes because (1) matrix spikes are expected to more closely match the matrix of environmental water samples than are reagent spikes and (2) method performance is often matrix dependent, as was shown by higher recovery in matrix spikes for most of the pesticides. Models of recovery, based on lowess smooths of matrix spikes, were developed separately for groundwater and stream-water samples. The models of recovery can be used to adjust concentrations of pesticides measured in groundwater or stream-water samples to 100 percent recovery to compensate for temporal changes in the performance (bias) of the analytical method.
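
    A minimal sketch of the recovery-adjustment idea, assuming a lowess smooth of matrix-spike recoveries with a 10-percent window and division of each measured concentration by the modeled recovery at the sample's collection time. The spike data below are simulated, not the USGS dataset.

        import numpy as np
        from statsmodels.nonparametric.smoothers_lowess import lowess

        rng = np.random.default_rng(0)

        # Simulated matrix-spike recoveries (%) vs time (days since start).
        t_spike = np.sort(rng.uniform(0, 5000, 300))
        recovery = 95 - 0.004 * t_spike + rng.normal(0, 8, t_spike.size)

        # 10% smoothing window; returns sorted (t, smoothed recovery) pairs.
        smooth = lowess(recovery, t_spike, frac=0.10, return_sorted=True)

        def adjust(conc, t_sample):
            """Adjust a measured concentration to 100% recovery by dividing
            by the interpolated modeled recovery at the sampling time."""
            r = np.interp(t_sample, smooth[:, 0], smooth[:, 1])
            return conc / (r / 100.0)

        print(adjust(conc=0.12, t_sample=2500.0))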

  10. 7 CFR 275.10 - Scope and purpose.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... to enhanced funding. (b) The objectives of quality control reviews are to provide: (1) A systematic... FOOD STAMP AND FOOD DISTRIBUTION PROGRAM PERFORMANCE REPORTING SYSTEM Quality Control (QC) Reviews... responsible for conducting quality control reviews. For food stamp quality control reviews, a sample of...

  11. Environment-induced quantum coherence spreading of a qubit

    NASA Astrophysics Data System (ADS)

    Pozzobom, Mauro B.; Maziero, Jonas

    2017-02-01

We make a thorough study of the spreading of quantum coherence (QC), as quantified by the l1-norm QC, when a qubit (a two-level quantum system) is subjected to noisy quantum channels commonly appearing in quantum information science. We notice that QC is generally not conserved and that even incoherent initial states can lead to transitory system-environment QC. We show that for the amplitude damping channel the evolved total QC can be written as the sum of local and non-local parts, with the latter equal to entanglement. On the other hand, for the phase damping channel (PDC) entanglement does not account for all non-local QC, with the gap between them depending on time and also on the qubit's initial state. Besides these issues, the possibility and conditions for time invariance of QC are considered in the case of bit, phase, and bit-phase flip channels. Here we reveal the qualitative dynamical inequivalence between these channels and the PDC and show that the creation of system-environment entanglement does not necessarily imply the destruction of the qubit's QC. We also investigate the resources needed for non-local QC creation, showing that while the PDC requires initial coherence of the qubit, for some other channels non-zero population of the excited state (i.e., energy) is sufficient. Related to that, considering the depolarizing channel we notice the qubit's ability to act as a catalyst for the creation of joint QC and entanglement, without the need for nonzero initial QC or excited-state population.
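
    For reference, the l1-norm coherence referred to above is the sum of the magnitudes of the off-diagonal elements of the density matrix in a chosen basis. The sketch below computes it for a qubit and applies a simple phase-damping parametrization that multiplies off-diagonals by (1 - p); that parametrization is a common textbook form assumed here for illustration.

        import numpy as np

        def l1_coherence(rho):
            """l1-norm of coherence: sum of |off-diagonal| elements of the
            density matrix in the chosen reference basis."""
            rho = np.asarray(rho, dtype=complex)
            return np.abs(rho).sum() - np.abs(np.diag(rho)).sum()

        # Example: qubit |+><+|, maximally coherent in the computational basis.
        plus = np.array([1, 1]) / np.sqrt(2)
        rho_plus = np.outer(plus, plus.conj())
        print(l1_coherence(rho_plus))  # -> 1.0

        # Phase damping with parameter p scales off-diagonals by (1 - p),
        # so the coherence decays as 1 - p (assumed simple parametrization).
        p = 0.3
        rho_pd = rho_plus * np.array([[1, 1 - p], [1 - p, 1]])
        print(l1_coherence(rho_pd))    # -> 0.7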

  12. CARINA data synthesis project: pH data scale unification and cruise adjustments

    NASA Astrophysics Data System (ADS)

    Velo, A.; Pérez, F. F.; Lin, X.; Key, R. M.; Tanhua, T.; de La Paz, M.; van Heuven, S.; Jutterström, S.; Ríos, A. F.

    2009-10-01

Data on carbon and carbon-relevant hydrographic and hydrochemical parameters from previously non-publicly available cruise data sets in the Arctic Mediterranean Seas (AMS), Atlantic, and Southern Ocean have been retrieved and merged into a new database: CARINA (CARbon IN the Atlantic). These data have gone through rigorous quality control (QC) procedures to assure the highest possible quality and consistency. The data for most of the measured parameters in the CARINA database were objectively examined in order to quantify systematic differences in the reported values, i.e. secondary quality control. Systematic biases found in the data have been corrected in the data products, i.e. three merged data files with measured, calculated, and interpolated data for each of the three CARINA regions: AMS, Atlantic, and Southern Ocean. Out of a total of 188 cruise entries in the CARINA database, 59 reported measured pH values. Here we present details of the secondary QC of pH for the CARINA database. Procedures of quality control, including crossover analysis between cruises and inversion analysis of all crossover data, are briefly described. Adjustments were applied to the pH values for 21 of the cruises in the CARINA dataset. With these adjustments the CARINA database is consistent both internally and with GLODAP data, an oceanographic data set based on the World Hydrographic Program in the 1990s. Based on our analysis we estimate the internal accuracy of the CARINA pH data to be 0.005 pH units. The CARINA data are now suitable for accurate assessments of, for example, oceanic carbon inventories and uptake rates, and for model validation.

  13. Evaluation of effective energy for QA and QC: measurement of half-value layer using radiochromic film density.

    PubMed

    Gotanda, T; Katsuda, T; Gotanda, R; Tabuchi, A; Yamamoto, K; Kuwano, T; Yatake, H; Takeda, Y

    2009-03-01

The effective energy of diagnostic X-rays is important for quality assurance (QA) and quality control (QC). However, the half-value layer (HVL), which is necessary to evaluate the effective energy, is not ubiquitously monitored because ionization-chamber dosimetry is time-consuming and complicated. To verify the applicability of GAFCHROMIC XR type R (GAF-R) film for HVL measurement as an alternative to monitoring with an ionization chamber, a single-strip method for measuring the HVL has been evaluated. Calibration curves of absorbed dose versus film density were generated using this single-strip method with GAF-R film, and the coefficient of determination (r²) of the straight-line approximation was evaluated. The HVLs (effective energies) estimated using the GAF-R film and an ionization chamber were compared. The coefficient of determination (r²) of the straight-line approximation obtained with the GAF-R film was more than 0.99. The effective energies (HVLs) evaluated using the GAF-R film and the ionization chamber were 43.25 keV (5.10 mm) and 39.86 keV (4.45 mm), respectively. The difference in the effective energies determined by the two methods was thus 8.5%. These results suggest that GAF-R might be used to evaluate the effective energy from the film-density growth without the need for ionization-chamber measurements.
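
    For orientation, the HVL itself is commonly estimated by fitting transmission measurements. The sketch below assumes a single-exponential attenuation model, dose = d0*exp(-mu*x); real diagnostic beams harden as filtration is added, so practical work interpolates between attenuation points instead. The numbers are invented.

        import numpy as np

        # Hypothetical transmission data: added Al filtration (mm) vs dose
        # derived from film density (arbitrary units).
        x_mm = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
        dose = np.array([1.00, 0.87, 0.76, 0.66, 0.58, 0.50, 0.44])

        # Log-linear least squares: ln(dose) = ln(d0) - mu*x.
        slope, ln_d0 = np.polyfit(x_mm, np.log(dose), 1)
        mu = -slope
        hvl = np.log(2) / mu
        print(f"mu ~ {mu:.3f} /mm, HVL ~ {hvl:.2f} mm Al")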

  14. Announcement—guidance document for acquiring reliable data in ecological restoration projects

    USGS Publications Warehouse

    Stapanian, Martin A.; Rodriguez, Karen; Lewis, Timothy E.; Blume, Louis; Palmer, Craig J.; Walters, Lynn; Schofield, Judith; Amos, Molly M.; Bucher, Adam

    2016-01-01

    The Laurentian Great Lakes are undergoing intensive ecological restoration in Canada and the United States. In the United States, an interagency committee was formed to facilitate implementation of quality practices for federally funded restoration projects in the Great Lakes basin. The Committee's responsibilities include developing a guidance document that will provide a common approach to the application of quality assurance and quality control (QA/QC) practices for restoration projects. The document will serve as a “how-to” guide for ensuring data quality during each aspect of ecological restoration projects. In addition, the document will provide suggestions on linking QA/QC data with the routine project data and hints on creating detailed supporting documentation. Finally, the document will advocate integrating all components of the project, including QA/QC applications, into an overarching decision-support framework. The guidance document is expected to be released by the U.S. EPA Great Lakes National Program Office in 2017.

  15. Reliability of plasma polar metabolite concentrations in a large-scale cohort study using capillary electrophoresis-mass spectrometry.

    PubMed

    Harada, Sei; Hirayama, Akiyoshi; Chan, Queenie; Kurihara, Ayako; Fukai, Kota; Iida, Miho; Kato, Suzuka; Sugiyama, Daisuke; Kuwabara, Kazuyo; Takeuchi, Ayano; Akiyama, Miki; Okamura, Tomonori; Ebbels, Timothy M D; Elliott, Paul; Tomita, Masaru; Sato, Asako; Suzuki, Chizuru; Sugimoto, Masahiro; Soga, Tomoyoshi; Takebayashi, Toru

    2018-01-01

Cohort studies with metabolomics data are becoming more widespread; however, large-scale studies involving tens of thousands of participants are still limited, especially in Asian populations. We therefore started the Tsuruoka Metabolomics Cohort Study, enrolling 11,002 community-dwelling adults in Japan and using capillary electrophoresis-mass spectrometry (CE-MS) and liquid chromatography-mass spectrometry. The CE-MS method is highly amenable to absolute quantification of polar metabolites; however, its reliability for large-scale measurement is unclear. The aim of this study is to examine the reproducibility and validity of large-scale CE-MS measurements. In addition, the study presents absolute concentrations of polar metabolites in human plasma, which can be used in the future as reference ranges in a Japanese population. Metabolomic profiling of 8,413 fasting plasma samples was completed using CE-MS, and 94 polar metabolites were structurally identified and quantified. Quality control (QC) samples were injected every ten samples and assessed throughout the analysis. Inter- and intra-batch coefficients of variation of QC and participant samples, and technical intraclass correlation coefficients, were estimated. Passing-Bablok regression of plasma concentrations by CE-MS on serum concentrations by standard clinical chemistry assays was conducted for creatinine and uric acid. In QC samples, the coefficient of variation was less than 20% for 64 metabolites and less than 30% for 80 of the 94 metabolites. The inter-batch coefficient of variation was less than 20% for 81 metabolites. The estimated technical intraclass correlation coefficient was above 0.75 for 67 metabolites. The slope of the Passing-Bablok regression was estimated as 0.97 (95% confidence interval: 0.95, 0.98) for creatinine and 0.95 (0.92, 0.96) for uric acid. Compared with published data from other large cohort measurement platforms, reproducibility of metabolites common to the platforms was similar to or better than in the other studies. These results show that our CE-MS platform is suitable for conducting large-scale epidemiological studies.
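
    A small sketch of the batch-wise reproducibility statistics reported above, computed from hypothetical QC-sample concentrations of one metabolite. The intra-batch pooling convention used here (root-mean-square of per-batch CVs) is one common choice, assumed for illustration; other studies simply average the batch CVs.

        import numpy as np

        def cv_percent(x):
            """Coefficient of variation in percent."""
            x = np.asarray(x, dtype=float)
            return 100.0 * x.std(ddof=1) / x.mean()

        # Hypothetical QC-sample concentrations grouped by analysis batch.
        batches = {
            "batch1": [10.2, 9.8, 10.5, 10.1],
            "batch2": [11.0, 10.7, 11.3],
            "batch3": [9.9, 10.4, 10.0, 10.2],
        }

        # Intra-batch CV pooled across batches (RMS of per-batch CVs).
        intra = np.sqrt(np.mean([cv_percent(v) ** 2 for v in batches.values()]))

        # Inter-batch CV: CV of the batch means.
        inter = cv_percent([np.mean(v) for v in batches.values()])

        print(f"intra-batch CV ~ {intra:.1f}%, inter-batch CV ~ {inter:.1f}%")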

  16. Remedial Investigation/Feasibility Study, Operable Unit 5, Elmendorf AFB, Anchorage, Alaska. Volume 3. Appendices K - T

    DTIC Science & Technology

    1994-03-04

[Scanned-table residue: the original appendix lists water QC method-blank results (EPA Methods 8010/8020) for volatile organic compounds such as trichloroethylene (TCE) and dichlorobutane, with qualifier flags; the OCR output is too garbled to reconstruct the tabular data.]

  17. A model for tracking concentration of chemical compounds within a tank of an automatic film processor.

    PubMed

    Sobol, Wlad T

    2002-01-01

    A simple kinetic model that describes the time evolution of the chemical concentration of an arbitrary compound within the tank of an automatic film processor is presented. It provides insights into the kinetics of chemistry concentration inside the processor's tank; the results facilitate the tasks of processor tuning and quality control (QC). The model has successfully been used in several troubleshooting sessions of low-volume mammography processors for which maintaining consistent QC tracking was difficult due to fluctuations of bromide levels in the developer tank.
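
    A minimal sketch of the kind of well-stirred tank balance such a model rests on, under the assumption of constant tank volume with replenisher inflow balanced by overflow: V*dC/dt = q_rep*(C_rep - C). The paper's full model may include additional terms (e.g., seasoning by processed film); the rates and concentrations below are invented.

        import numpy as np

        V = 10.0       # tank volume, L
        q_rep = 0.05   # replenishment rate, L/min
        C_rep = 1.2    # compound concentration in replenisher, arb. units
        C = 1.0        # initial tank concentration

        dt, t_end = 1.0, 24 * 60.0  # 1-min steps over 24 h
        for _ in np.arange(0.0, t_end, dt):
            # Inflow brings q_rep*C_rep; overflow removes q_rep*C.
            C += dt * q_rep * (C_rep - C) / V

        # The analytic steady state is C_rep, approached with time
        # constant V/q_rep (200 min here).
        print(f"C after 24 h ~ {C:.3f} (steady state {C_rep})")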

  18. Proximate Composition Analysis.

    PubMed

    2016-01-01

The proximate composition of foods includes moisture, ash, lipid, protein, and carbohydrate contents. These food components may be of interest in the food industry for product development, quality control (QC), or regulatory purposes. The analyses used may be rapid methods for QC or more accurate but time-consuming official methods. Sample collection and preparation must be considered carefully to ensure analysis of a homogeneous and representative sample and to obtain accurate results. Estimation methods for moisture content, ash value, crude lipid, total carbohydrates, starch, total free amino acids, and total protein are presented together in a lucid manner.

  19. 40 CFR 98.144 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... melting furnace from monthly measurements using plant instruments used for accounting purposes, such as... raw material; such measurements shall be based on sampling and chemical analysis conducted by a...

  20. Coda Wave Attenuation Characteristics for North Anatolian Fault Zone, Turkey

    NASA Astrophysics Data System (ADS)

    Sertcelik, Fadime; Guleroglu, Mehmet

    2017-10-01

The North Anatolian Fault Zone, on which large earthquakes have occurred in the past, migrates regularly from east to west and is one of the most active faults in the world. The purpose of this study is to estimate the coda wave quality factor (Qc) for each of five sub-regions, determined according to the fault ruptures of these large earthquakes, and along the fault as a whole. 978 records have been analyzed at frequencies of 1.5, 3, 6, 9, 12, and 18 Hz by the single backscattering method. Along the fault, the variations in Qc with lapse time are Qc = (136±25)f^(0.96±0.027), Qc = (208±22)f^(0.85±0.02), and Qc = (307±28)f^(0.72±0.025) at 20, 30, and 40 s lapse times, respectively. The estimated average frequency-dependent quality factors over all lapse times are Qc(f) = (189±26)f^(0.86±0.02) for the Karliova-Tokat region; Qc(f) = (216±19)f^(0.76±0.018) for the Tokat-Çorum region; Qc(f) = (232±18)f^(0.76±0.019) for the Çorum-Adapazari region; Qc(f) = (280±28)f^(0.79±0.021) for the Adapazari-Yalova region; and Qc(f) = (252±26)f^(0.81±0.022) for the Yalova-Gulf of Saros region. The coda wave quality factor over all lapse times and frequencies is Qc(f) = (206±15)f^(0.85±0.012) in the study area. The largest change of Qc with lapse time is found in the Yalova-Saros region, which may indicate that the degree of heterogeneity decreases more rapidly towards the deep crust there than in the other sub-regions. Moreover, the highest Qc is calculated between Adapazari and Yalova, interpreted as a result of the seismic energy released by the 1999 Kocaeli earthquake. No causal relationship could be established between the regional variation of Qc with frequency and lapse time and the migration of the large earthquakes. These results suggest that the attenuation mechanism is affected both by regional heterogeneity and by whether the fault structure consists of a single strand or multiple strands.
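
    The single backscattering method reduces to a linear fit: in the Aki-Chouet form, A(t) = S*t^-1*exp(-pi*f*t/Qc), so ln(A*t) decays linearly with lapse time t at slope -pi*f/Qc. A minimal sketch on a synthetic coda envelope:

        import numpy as np

        def coda_qc(t, amp, f):
            """Estimate Qc from the single-backscattering model
            A(t) = S * t**-1 * exp(-pi*f*t/Qc), i.e.
            ln(A*t) = ln(S) - (pi*f/Qc)*t."""
            slope, _ = np.polyfit(t, np.log(amp * t), 1)
            return -np.pi * f / slope

        # Synthetic coda envelope at f = 6 Hz with a known Qc of 400.
        f, true_qc = 6.0, 400.0
        t = np.linspace(20.0, 40.0, 200)   # lapse-time window, s
        amp = (1.0 / t) * np.exp(-np.pi * f * t / true_qc)
        print(f"recovered Qc ~ {coda_qc(t, amp, f):.0f}")  # ~400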

  1. Forensic-metrological considerations on assessment of compliance (or non-compliance) in forensic blood alcohol content determinations: A case study with software application.

    PubMed

    Zamengo, Luca; Frison, Giampietro; Tedeschi, Gianpaola; Frasson, Samuela

    2016-08-01

Blood alcohol concentration is the most frequent analytical determination carried out in forensic toxicology laboratories worldwide. It is usually required in order to assess whether an offence has been committed, by comparing blood alcohol levels with specified legal limits, which can vary widely among countries. Due to the possibly serious legal consequences associated with non-compliant alcohol levels, measurement uncertainty should be carefully evaluated, along with other metrological aspects that can influence the final result. The whole procedure can be time-consuming and error-generating in routine practice, increasing the risk of unreliable assessments. A software application named Ethanol WorkBook (EtWB) was developed at the authors' laboratory using Visual Basic for Applications and MS Excel®, with the aim of helping forensic analysts involved in blood alcohol determinations. The program can (i) calculate measurement uncertainties and decision limits with different methodologies; (ii) assess compliance with specification limits using a guard-band approach; (iii) manage quality control (QC) data and create control charts for QC samples; (iv) create control maps from real-case data archives; (v) provide laboratory reports with graphical outputs for elaborated data; and (vi) create comprehensive searchable case archives. A typical drink-driving case is presented and discussed to illustrate the importance of a metrological approach for reliable compliance assessment and to demonstrate the software's application in routine practice. The tool is made freely available to the scientific community on request. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
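
    A sketch of the guard-band logic in item (ii), assuming a one-sided guard band in which non-compliance is declared only when the result exceeds the legal limit by more than the expanded uncertainty U = k*u. The limit, coverage factor, and uncertainty values are invented, not those of EtWB or any particular jurisdiction.

        def assess(bac, u_combined, limit, k=1.64):
            """One-sided guard-band compliance check: non-compliance is
            declared only when the result minus the expanded uncertainty
            still exceeds the legal limit."""
            decision_limit = limit + k * u_combined  # limit plus guard band
            if bac > decision_limit:
                return "non-compliant (beyond reasonable doubt)"
            if bac > limit:
                return "above limit but within guard band: inconclusive"
            return "compliant"

        # Hypothetical case: 0.55 g/L measured, u = 0.02 g/L, limit 0.50 g/L.
        print(assess(bac=0.55, u_combined=0.02, limit=0.50))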

  2. 7 CFR 275.10 - Scope and purpose.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... FOOD STAMP AND FOOD DISTRIBUTION PROGRAM PERFORMANCE REPORTING SYSTEM Quality Control (QC) Reviews... responsible for conducting quality control reviews. For food stamp quality control reviews, a sample of... terminated (called negative cases). Reviews shall be conducted on active cases to determine if households are...

  3. The Individualized Quality Control Plan - Coming Soon to Clinical Microbiology Laboratories Everywhere!

    PubMed

    Anderson, Nancy

    2015-11-15

As of January 1, 2016, microbiology laboratories can choose to adopt a new quality control option, the Individualized Quality Control Plan (IQCP), under the Clinical Laboratory Improvement Amendments of 1988 (CLIA). This voluntary approach increases flexibility for meeting regulatory requirements and provides laboratories the opportunity to customize QC for their testing, in their unique environments, and by their testing personnel. IQCP is an all-inclusive approach to quality based on risk management to address potential errors in the total testing process. It includes three main steps: (1) performing a risk assessment, (2) developing a QC plan, and (3) monitoring the plan through quality assessment. Resources are available from the Centers for Medicare & Medicaid Services, the Centers for Disease Control and Prevention, the American Society for Microbiology, the Clinical and Laboratory Standards Institute, and accrediting organizations, such as the College of American Pathologists and the Joint Commission, to assist microbiology laboratories implementing IQCP.

  4. Dynamic 99mTc-MAG3 renography: images for quality control obtained by combining pharmacokinetic modelling, an anthropomorphic computer phantom and Monte Carlo simulated scintillation camera imaging

    NASA Astrophysics Data System (ADS)

    Brolin, Gustav; Sjögreen Gleisner, Katarina; Ljungberg, Michael

    2013-05-01

    In dynamic renal scintigraphy, the main interest is the radiopharmaceutical redistribution as a function of time. Quality control (QC) of renal procedures often relies on phantom experiments to compare image-based results with the measurement setup. A phantom with a realistic anatomy and time-varying activity distribution is therefore desirable. This work describes a pharmacokinetic (PK) compartment model for 99mTc-MAG3, used for defining a dynamic whole-body activity distribution within a digital phantom (XCAT) for accurate Monte Carlo (MC)-based images for QC. Each phantom structure is assigned a time-activity curve provided by the PK model, employing parameter values consistent with MAG3 pharmacokinetics. This approach ensures that the total amount of tracer in the phantom is preserved between time points, and it allows for modifications of the pharmacokinetics in a controlled fashion. By adjusting parameter values in the PK model, different clinically realistic scenarios can be mimicked, regarding, e.g., the relative renal uptake and renal transit time. Using the MC code SIMIND, a complete set of renography images including effects of photon attenuation, scattering, limited spatial resolution and noise, are simulated. The obtained image data can be used to evaluate quantitative techniques and computer software in clinical renography.
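
    A toy linear compartment model of the kind the PK approach builds on: tracer leaves plasma for two kidneys and an extrarenal route, then drains from the kidneys into the bladder, yielding one time-activity curve per structure. The rate constants below are invented for illustration only; the paper's 99mTc-MAG3 model is more detailed.

        # Invented rate constants (per second): plasma->left kidney,
        # plasma->right kidney, plasma->extrarenal clearance, kidney->bladder.
        k_lk, k_rk, k_ex, k_out = 0.020, 0.015, 0.005, 0.010

        dt, n = 1.0, 1800  # 1-s Euler steps over a 30-min renogram
        plasma, lk, rk, bladder = 1.0, 0.0, 0.0, 0.0
        curves = []
        for _ in range(n):
            d_plasma = -(k_lk + k_rk + k_ex) * plasma
            d_lk = k_lk * plasma - k_out * lk
            d_rk = k_rk * plasma - k_out * rk
            d_bladder = k_out * (lk + rk)
            plasma += dt * d_plasma
            lk += dt * d_lk
            rk += dt * d_rk
            bladder += dt * d_bladder
            curves.append((plasma, lk, rk, bladder))

        # Tracer removed via k_ex leaves the tracked compartments entirely;
        # the remaining activity is conserved among them.
        print(max(c[1] for c in curves))  # peak of left-kidney curve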

  5. A novel construction scheme of QC-LDPC codes based on the RU algorithm for optical transmission systems

    NASA Astrophysics Data System (ADS)

    Yuan, Jian-guo; Liang, Meng-qi; Wang, Yong; Lin, Jin-zhao; Pang, Yu

    2016-03-01

A novel lower-complexity construction scheme of quasi-cyclic low-density parity-check (QC-LDPC) codes for optical transmission systems is proposed based on the structure of the parity-check matrix for the Richardson-Urbanke (RU) algorithm. Furthermore, a novel irregular QC-LDPC(4288, 4020) code with a high code rate of 0.937 is constructed by this scheme. The simulation analyses show that the net coding gain (NCG) of the novel irregular QC-LDPC(4288, 4020) code is 2.08 dB, 1.25 dB, and 0.29 dB more than those of the classic RS(255, 239) code, the LDPC(32640, 30592) code, and the irregular QC-LDPC(3843, 3603) code, respectively, at a bit error rate (BER) of 10^-6. The irregular QC-LDPC(4288, 4020) code also has lower encoding/decoding complexity than the LDPC(32640, 30592) code and the irregular QC-LDPC(3843, 3603) code. The proposed QC-LDPC(4288, 4020) code is thus well suited to the increasing requirements of high-speed optical transmission systems.

  6. Rapid prediction of chemical metabolism by human UDP-glucuronosyltransferase isoforms using quantum chemical descriptors derived with the electronegativity equalization method.

    PubMed

    Sorich, Michael J; McKinnon, Ross A; Miners, John O; Winkler, David A; Smith, Paul A

    2004-10-07

    This study aimed to evaluate in silico models based on quantum chemical (QC) descriptors derived using the electronegativity equalization method (EEM) and to assess the use of QC properties to predict chemical metabolism by human UDP-glucuronosyltransferase (UGT) isoforms. Various EEM-derived QC molecular descriptors were calculated for known UGT substrates and nonsubstrates. Classification models were developed using support vector machine and partial least squares discriminant analysis. In general, the most predictive models were generated with the support vector machine. Combining QC and 2D descriptors (from previous work) using a consensus approach resulted in a statistically significant improvement in predictivity (to 84%) over both the QC and 2D models and the other methods of combining the descriptors. EEM-derived QC descriptors were shown to be both highly predictive and computationally efficient. It is likely that EEM-derived QC properties will be generally useful for predicting ADMET and physicochemical properties during drug discovery.

  7. Bioanalytical method development and validation for the determination of glycine in human cerebrospinal fluid by ion-pair reversed-phase liquid chromatography-tandem mass spectrometry.

    PubMed

    Jiang, Jian; James, Christopher A; Wong, Philip

    2016-09-05

An LC-MS/MS method has been developed and validated for the determination of glycine in human cerebrospinal fluid (CSF). The validated method used artificial cerebrospinal fluid as a surrogate matrix for calibration standards. The calibration curve range for the assay was 100-10,000 ng/mL, and ¹³C₂,¹⁵N-glycine was used as an internal standard (IS). Pre-validation experiments were performed to demonstrate parallelism with surrogate matrix and standard addition methods. The mean endogenous glycine concentration in a pooled human CSF sample, determined on three days using artificial CSF as a surrogate matrix and by the method of standard addition, was found to be 748 ± 30.6 and 768 ± 18.1 ng/mL, respectively. A percentage difference of -2.6% indicated that artificial CSF could be used as a surrogate calibration matrix for the determination of glycine in human CSF. Quality control (QC) samples, except the lower limit of quantitation (LLOQ) QC and low QC samples, were prepared by spiking glycine into aliquots of a pooled human CSF sample. The low QC sample was prepared from a separate pooled human CSF sample containing low endogenous glycine concentrations, while the LLOQ QC sample was prepared in artificial CSF. Standard addition was used extensively to evaluate matrix effects during validation. The validated method was used to determine the endogenous glycine concentrations in human CSF samples. Incurred sample reanalysis demonstrated the reproducibility of the method. Copyright © 2016 Elsevier B.V. All rights reserved.
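
    A minimal sketch of the standard-addition estimate mentioned above: spike known amounts of analyte into the matrix, fit the instrument response against the added amount, and take the magnitude of the x-intercept as the endogenous concentration. The assumption of a linear response and the numbers below are for illustration only, not the paper's data.

        import numpy as np

        added = np.array([0.0, 250.0, 500.0, 1000.0])   # ng/mL glycine added
        response = np.array([0.74, 0.99, 1.23, 1.73])   # peak-area ratio to IS

        slope, intercept = np.polyfit(added, response, 1)
        endogenous = intercept / slope                  # |x-intercept|
        print(f"endogenous ~ {endogenous:.0f} ng/mL")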

  8. Development and validation of effective real-time and periodic interinstrument comparison method for automatic hematology analyzers.

    PubMed

    Park, Sang Hyuk; Park, Chan-Jeoung; Kim, Mi-Jeong; Choi, Mi-Ok; Han, Min-Young; Cho, Young-Uk; Jang, Seongsoo

    2014-12-01

We developed an interinstrument comparison method for automatic hematology analyzers based on a 99th percentile coefficient of variation (CV) cutoff of daily means, and validated it in both patient samples and quality control (QC) materials. A total of 120 patient samples were obtained over 6 months. Data from the first 3 months were used to determine the 99th percentile CV cutoff values, and data obtained in the last 3 months were used to calculate acceptable ranges and rejection rates. Identical analyses were also performed using QC materials. Two-instrument comparisons were also performed, and the most appropriate allowable total error (ATE) values were determined. The rejection rates based on the 99th percentile cutoff values were within 10.00% and 9.30% for the patient samples and QC materials, respectively. The acceptable ranges of QC materials based on the currently used method were wider than those calculated from the 99th percentile CV cutoff values for most items. In two-instrument comparisons, 34.8% of all comparisons failed, and 87.0% of the failed comparisons became acceptable when 4 SD was applied as the ATE value instead of 3 SD. The 99th percentile CV cutoff value-derived daily acceptable ranges can be used as a real-time interinstrument comparison method in both patient samples and QC materials. Applying 4 SD as the ATE value can significantly reduce unnecessary follow-up recalibration in the leukocyte differential counts, reticulocytes, and mean corpuscular volume. Copyright © by the American Society for Clinical Pathology.
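
    A sketch of the cutoff derivation, assuming daily means from two instruments, a between-instrument CV computed per day, and the 99th percentile of those CVs taken as the rejection cutoff for subsequent comparisons. The data are simulated, not the study's.

        import numpy as np

        rng = np.random.default_rng(1)

        # Simulated 3 months of daily means of one analyte on two analyzers.
        daily_a = rng.normal(100.0, 1.5, 90)
        daily_b = daily_a + rng.normal(0.0, 1.0, 90)

        # Between-instrument CV of each day's pair of daily means.
        pairs = np.stack([daily_a, daily_b], axis=1)
        daily_cv = 100.0 * pairs.std(axis=1, ddof=1) / pairs.mean(axis=1)

        # 99th percentile CV as the cutoff for later comparisons.
        cutoff = np.percentile(daily_cv, 99)
        print(f"cutoff CV = {cutoff:.2f}%")

        # A new day's comparison is rejected when its CV exceeds the cutoff.
        new_cv = 100.0 * np.std([101.2, 98.1], ddof=1) / np.mean([101.2, 98.1])
        print("reject" if new_cv > cutoff else "accept")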

  9. Predicting the Reasons of Customer Complaints: A First Step Toward Anticipating Quality Issues of In Vitro Diagnostics Assays with Machine Learning.

    PubMed

    Aris-Brosou, Stephane; Kim, James; Li, Li; Liu, Hui

    2018-05-15

Vendors in the health care industry produce diagnostic systems that, through a secured connection, allow them to monitor performance almost in real time. However, challenges exist in analyzing and interpreting large volumes of noisy quality control (QC) data. As a result, some QC shifts may not be detected early enough by the vendor, but instead lead a customer to complain. We hypothesized that a more proactive response could be designed by utilizing the collected QC data more efficiently. Our aim is therefore to help prevent customer complaints by predicting them from the QC data collected by in vitro diagnostic systems. QC data from five selected in vitro diagnostic assays were combined with the corresponding database of customer complaints over a period of 90 days. A subset of these data over the last 45 days was also analyzed to assess how the length of the training period affects predictions. We defined a set of features used to train two classifiers, one based on decision trees and the other based on adaptive boosting, and assessed model performance by cross-validation. The cross-validations showed classification error rates close to zero for some assays with adaptive boosting when predicting the potential cause of customer complaints. Performance was improved by shortening the training period when the volume of complaints increased. Denoising filters that reduced the number of categories to predict further improved performance, as their application simplified the prediction problem. This novel approach to predicting customer complaints based on QC data may allow the diagnostic industry, the expected end user of our approach, to proactively identify potential product quality issues and fix them before receiving customer complaints. This represents a new step in the direction of using big data toward product quality improvement. ©Stephane Aris-Brosou, James Kim, Li Li, Hui Liu. Originally published in JMIR Medical Informatics (http://medinform.jmir.org), 15.05.2018.
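
    A minimal sketch of the adaptive-boosting classification step, with a stand-in feature matrix of per-assay QC summaries (e.g., mean shift, drift slope, rule-violation rate) and a synthetic complaint label; the features and data are invented, not the study's.

        import numpy as np
        from sklearn.ensemble import AdaBoostClassifier
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(2)

        # Synthetic features and a complaint label loosely tied to them.
        X = rng.normal(size=(500, 3))
        y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 0.5, 500) > 1.0).astype(int)

        clf = AdaBoostClassifier(n_estimators=100, random_state=0)
        scores = cross_val_score(clf, X, y, cv=5)
        print(f"CV accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")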

  10. Predicting the Reasons of Customer Complaints: A First Step Toward Anticipating Quality Issues of In Vitro Diagnostics Assays with Machine Learning

    PubMed Central

    Kim, James; Li, Li; Liu, Hui

    2018-01-01

Background Vendors in the health care industry produce diagnostic systems that, through a secured connection, allow them to monitor performance almost in real time. However, challenges exist in analyzing and interpreting large volumes of noisy quality control (QC) data. As a result, some QC shifts may not be detected early enough by the vendor, but instead lead a customer to complain. Objective We hypothesized that a more proactive response could be designed by utilizing the collected QC data more efficiently. Our aim is therefore to help prevent customer complaints by predicting them from the QC data collected by in vitro diagnostic systems. Methods QC data from five selected in vitro diagnostic assays were combined with the corresponding database of customer complaints over a period of 90 days. A subset of these data over the last 45 days was also analyzed to assess how the length of the training period affects predictions. We defined a set of features used to train two classifiers, one based on decision trees and the other based on adaptive boosting, and assessed model performance by cross-validation. Results The cross-validations showed classification error rates close to zero for some assays with adaptive boosting when predicting the potential cause of customer complaints. Performance was improved by shortening the training period when the volume of complaints increased. Denoising filters that reduced the number of categories to predict further improved performance, as their application simplified the prediction problem. Conclusions This novel approach to predicting customer complaints based on QC data may allow the diagnostic industry, the expected end user of our approach, to proactively identify potential product quality issues and fix them before receiving customer complaints. This represents a new step in the direction of using big data toward product quality improvement. PMID:29764796

  11. Identifying and attributing common data quality problems: temperature and precipitation observations in Bolivia and Peru

    NASA Astrophysics Data System (ADS)

    Hunziker, Stefan; Gubler, Stefanie; Calle, Juan; Moreno, Isabel; Andrade, Marcos; Velarde, Fernando; Ticona, Laura; Carrasco, Gualberto; Castellón, Yaruska; Oria Rojas, Clara; Brönnimann, Stefan; Croci-Maspoli, Mischa; Konzelmann, Thomas; Rohrer, Mario

    2016-04-01

Assessing climatological trends and extreme events requires high-quality data. However, for many regions of the world, observational data of the desired quality are not available. In order to eliminate errors in the data, quality control (QC) should be applied before data analysis. If the data still contain undetected errors and quality problems after QC, the consequence may be misleading and erroneous results. A region that is seriously affected by observational data quality problems is the Central Andes. At the same time, climatological information on ongoing climate change and climate risks is of utmost importance in this area due to its vulnerability to meteorological extreme events and climatic changes. Besides data quality issues, the lack of metadata and the low station network density complicate quality control and assessment, and hence appropriate application of the data. Errors and data problems may occur at any point of the data generation chain, e.g. due to unsuitable station configuration or siting, poor station maintenance, erroneous instrument reading, or inaccurate data digitalization and post-processing. Different measurement conditions in the predominantly conventional station networks in Bolivia and Peru, compared with the mostly automated networks in, e.g., Europe or North America, may cause different types of errors. Hence, applying QC methods used on state-of-the-art networks to Bolivian and Peruvian climate observations may not be suitable or sufficient. A comprehensive set of Bolivian and Peruvian maximum and minimum temperature and precipitation in-situ measurements was analyzed to detect and describe common data quality problems. Furthermore, station visits and reviews of the original documents were carried out. Some of the errors could be attributed to a specific source. Such information is of great importance for data users, since it allows them to decide for which applications the data can still be used. In ideal cases, it may even allow the error to be corrected. Strategies for dealing with data from the Central Andes are suggested; the approach may also be applicable to networks in other countries where the conditions of climate observation are comparable.

  12. 40 CFR 136.7 - Quality assurance and quality control.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... quality control elements, where applicable, into the laboratory's documented standard operating procedure... quality control elements must be clearly documented in the written standard operating procedure for each... Methods contains QA/QC procedures in the Part 1000 section of the Standard Methods Compendium. The...

  13. 40 CFR 136.7 - Quality assurance and quality control.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... quality control elements, where applicable, into the laboratory's documented standard operating procedure... quality control elements must be clearly documented in the written standard operating procedure for each... Methods contains QA/QC procedures in the Part 1000 section of the Standard Methods Compendium. The...

  14. 40 CFR 136.7 - Quality assurance and quality control.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... quality control elements, where applicable, into the laboratory's documented standard operating procedure... quality control elements must be clearly documented in the written standard operating procedure for each... Methods contains QA/QC procedures in the Part 1000 section of the Standard Methods Compendium. The...

  15. 77 FR 73611 - Submission for OMB Review; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-11

    ...: Negative Quality Control Review Schedule. OMB Control Number: 0584-0034. Summary of Collection: The legislative basis for the operation of the quality control system is provided by section 16 of the Food and Nutrition Act of 2008. State agencies are required to perform Quality Control (QC) reviews for the...

  16. Analysis of copy number variation in Alzheimer's disease in a cohort of clinically characterized and neuropathologically verified individuals.

    PubMed

    Swaminathan, Shanker; Huentelman, Matthew J; Corneveaux, Jason J; Myers, Amanda J; Faber, Kelley M; Foroud, Tatiana; Mayeux, Richard; Shen, Li; Kim, Sungeun; Turk, Mari; Hardy, John; Reiman, Eric M; Saykin, Andrew J

    2012-01-01

    Copy number variations (CNVs) are genomic regions that have added (duplications) or deleted (deletions) genetic material. They may overlap genes, affecting their function, and have been shown to be associated with disease. We previously investigated the role of CNVs in late-onset Alzheimer's disease (AD) and mild cognitive impairment using Alzheimer's Disease Neuroimaging Initiative (ADNI) and National Institute of Aging-Late Onset AD/National Cell Repository for AD (NIA-LOAD/NCRAD) Family Study participants, and identified a number of genes overlapped by CNV calls. To confirm the findings and identify other potential candidate regions, we analyzed array data from a unique cohort of 1617 Caucasian participants (1022 AD cases and 595 controls) who were clinically characterized and whose diagnosis was neuropathologically verified. All DNA samples were extracted from brain tissue. CNV calls were generated and subjected to quality control (QC). The 728 cases and 438 controls who passed all QC measures were included in case-control association analyses, using both candidate-gene and genome-wide approaches. Rates of deletions and duplications did not differ significantly between cases and controls. Case-control association identified a number of previously reported regions (CHRFAM7A, RELN and DOPEY2) as well as a new gene (HLA-DRA). Meta-analysis of CHRFAM7A indicated a significant association of the gene with AD and/or MCI risk (P = 0.006, odds ratio = 3.986 (95% confidence interval 1.490-10.667)). A novel APP gene duplication was observed in one case sample. Further investigation of the identified genes in independent and larger samples is warranted.

  17. Testing and analysis of LWT and SCB properties of asphalt concrete mixtures.

    DOT National Transportation Integrated Search

    2016-04-01

    Currently, Louisiana's Quality Control and Quality Assurance (QC/QA) practice for asphalt mixtures in pavement construction is mainly based on controlling properties of plant-produced mixtures that include gradation and asphalt content, voids f...

  18. Embankment quality and assessment of moisture control implementation : tech transfer summary.

    DOT National Transportation Integrated Search

    2016-02-01

    The motivation for this project was based on work by Iowa State University (ISU) researchers at a few recent grading projects that demonstrated embankments were being constructed outside moisture control limits, even though the contractor QC ...

  19. Quality control and quality assurance of hot mix asphalt construction in Delaware.

    DOT National Transportation Integrated Search

    2006-07-01

    Since the mid-1960s, the Federal Highway Administration has encouraged Departments of Transportation and Contractors toward the use of quality control and quality assurance (QA/QC) specifications, which are statistically based. For example,...

  20. To QC or not to QC: the key to a consistent laboratory?

    PubMed

    Lane, Michelle; Mitchell, Megan; Cashman, Kara S; Feil, Deanne; Wakefield, Sarah; Zander-Fox, Deirdre L

    2008-01-01

    A limiting factor in every embryology laboratory is its capacity to grow 'normal' embryos. In human in vitro fertilisation (IVF), there is considerable awareness that the environment of the laboratory itself can alter the quality of the embryos produced, and the industry as a whole has moved towards the implementation of auditable quality management systems. Furthermore, in some countries, such as Australia, an established quality management system is mandatory for clinical IVF practice, but such systems are less frequently found in other embryology laboratories. Although the same challenges of supporting consistent and repeatable embryo development are paramount to success in all embryology laboratories, it could be argued that they are even more important in a research setting, where the measured outcomes are often at an intracellular or molecular level. In the present review, we have outlined the role and importance of quality control and quality assurance systems in any embryo laboratory and have highlighted examples of how simple monitoring can provide consistency and avoid the induction of artefacts, irrespective of the laboratory's purpose, function or species involved.

  1. Development of QC Procedures for Ocean Data Obtained by National Research Projects of Korea

    NASA Astrophysics Data System (ADS)

    Kim, S. D.; Park, H. M.

    2017-12-01

    To establish a data management system for ocean data obtained by national research projects of the Ministry of Oceans and Fisheries of Korea, KIOST conducted standardization and development of QC procedures. After reviewing and analyzing the existing international and domestic ocean-data standards and QC procedures, draft versions of the standards and QC procedures were prepared. The proposed standards and QC procedures were reviewed and revised several times by experts in the field of oceanography and by academic societies. A technical report was issued covering the standards for 25 data items and 12 QC procedures for physical, chemical, biological and geological data items. The QC procedure for temperature and salinity data was set up by reference to the manuals published by GTSPP, ARGO and IOOS QARTOD. It consists of 16 QC tests applicable to vertical-profile and time-series data obtained in real-time and delayed modes. Three regional range tests to inspect annual, seasonal and monthly variations are included in the procedure. Three programs were developed to calculate and provide upper and lower limits of temperature and salinity at depths from 0 to 1550 m. TS data from the World Ocean Database, ARGO, GTSPP and in-house data of KIOST were analysed statistically to calculate regional limits for the Northwest Pacific area. Based on this statistical analysis, the programs calculate regional ranges using the mean and standard deviation on three grid systems (3° grid, 1° grid and 0.5° grid) and provide recommendations. The QC procedures for the 12 data items were set up during the 1st phase of the national program for data management (2012-2015) and are being applied to national research projects in the 2nd phase (2016-2019). The QC procedures will be revised, based on a review of the results of their application, when the 2nd phase of the data management program is completed.
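
    A regional range test of the kind described, with limits derived from the per-cell mean and standard deviation on a grid, might look like the sketch below; the function name, k-factor and cell size are assumptions for illustration, not the KIOST implementation.

    ```python
    import numpy as np

    def regional_limits(lat, lon, temp, cell_size=1.0, k=3.0):
        """Per-cell mean +/- k*sigma limits for a regional range test (sketch).

        lat, lon, temp are 1-D arrays of historical observations; cells with
        only a few observations yield unreliable limits and would need a
        minimum-count rule in practice. Returns {(i, j): (lower, upper)}.
        """
        cells = {}
        for i, j, t in zip(np.floor(lat / cell_size).astype(int),
                           np.floor(lon / cell_size).astype(int),
                           temp):
            cells.setdefault((i, j), []).append(t)
        limits = {}
        for cell, vals in cells.items():
            v = np.asarray(vals)
            mu, sd = v.mean(), v.std(ddof=1)
            limits[cell] = (mu - k * sd, mu + k * sd)   # flag values outside this band
        return limits
    ```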

  2. High-power terahertz lasers with excellent beam quality for local oscillator sources

    NASA Astrophysics Data System (ADS)

    Williams, Benjamin

    Many molecular species that compose the interstellar medium have strong spectral features in the 2-5 THz range, and heterodyne spectroscopy is required to obtain ~km/s velocity resolution to resolve their complicated lineshapes and disentangle them from the background. Understanding the kinetics and energetics within the gas clouds of the interstellar medium is critical to understanding star formation processes and validating theories of galactic evolution. Herschel Observatory's heterodyne HIFI instrument provided several years of high-spectral-resolution measurements of the interstellar medium, although only up to 1.9 THz. The next frontier for heterodyne spectroscopy is the 2-6 THz region. However, development of heterodyne receivers above 2 THz has been severely hindered by a lack of convenient coherent sources of sufficient power to serve as local oscillators (LOs). The recently developed quantum-cascade (QC) lasers are emerging as candidates for LOs in the 1.5-5 THz range. The current generation of single-mode THz QC-lasers can provide a few milliwatts of power in a directive beam, and will be sufficient to pump single pixels and small-format heterodyne arrays (~10 elements). This proposal looks beyond the state-of-the-art, to the development of large format heterodyne arrays which contain on the order of 100-1000 elements. LO powers on the order of 10-100 mW delivered in a high-quality Gaussian beam will be needed to pump the mixer array - not only because of the microwatt mixer power requirement, but to account for large anticipated losses in LO coupling and distribution. Large format heterodyne array instruments are attractive for a dramatic speedup of mapping of the interstellar medium, particularly on airborne platforms such as the Stratospheric Observatory for Infrared Astronomy (SOFIA), and on long duration balloon platforms such as the Stratospheric Terahertz Observatory (STO), where observation time is limited. The research goal of this proposal is to demonstrate a new concept for terahertz quantum-cascade (QC) lasers designed to deliver scalable continuous-wave output power in the range of 10 to 100 mW or more in a near-diffraction limited output beam: a chip-scale THz quantum-cascade vertical-external-cavity-surface-emitting-laser (QC-VECSEL). We focus here on the development of a chip-scale version of size < 1 cm3 that oscillates in a single mode and can readily fit on a cold stage. The enabling technology for this proposed laser is an active metasurface reflector, which is comprised of a sparse array of antenna-coupled THz QC-laser sub-cavities. The metasurface reflector is part of the laser cavity such that multiple THz QC-laser sub-cavities are locked to a high-quality-factor cavity mode, which allows for scalable power combining with a favorable geometry for thermal dissipation and continuous-wave operation. We propose an integrated design, modeling, and experimental approach to design, fabricate, and characterize amplifying reflective QC metasurfaces and QC-VECSEL lasers. Demonstration laser devices will be developed at 2.7 THz and 4.7 THz, near the important frequencies for HD at 2.675 THz (for measurements of the hydrogen deuterium ratio and probing past star formation), and OI at 4.745 THz (a major coolant for photo-dissociation regions in giant molecular clouds). 
    High-resolution frequency measurements will be performed on a demonstration device at 2.7 THz using downconversion with a Schottky-diode sub-harmonic mixer to characterize the spectral purity, linewidth, and fine frequency tuning of this new type of QC-laser. The proposed laser is supporting technology for next-generation terahertz detectors.

  3. Variation of coda wave attenuation in the Alborz region and central Iran

    NASA Astrophysics Data System (ADS)

    Rahimi, H.; Motaghi, K.; Mukhopadhyay, S.; Hamzehloo, H.

    2010-06-01

    More than 340 earthquakes recorded by the Institute of Geophysics, University of Tehran (IGUT) short-period stations from 1996 to 2004 were analysed to estimate the S-coda attenuation in the Alborz region, the northern part of the Alpine-Himalayan orogen in western Asia, and in central Iran, which is the foreland of this orogen. The coda quality factor, Qc, was estimated using the single backscattering model in frequency bands of 1-25 Hz. In this research, lateral and depth variations of Qc in the Alborz region and central Iran are studied. No significant lateral variation of Qc is observed in the Alborz region. The average frequency relation for this region is Qc = (79 ± 2) f^(1.07 ± 0.08). Two anomalous high-attenuation areas in central Iran are recognized around the stations LAS and RAZ. The average frequency relation for central Iran, excluding the values of these two stations, is Qc = (94 ± 2) f^(0.97 ± 0.12). To investigate the attenuation variation with depth, the Qc value was calculated for 14 lapse times (25, 30, 35, ..., 90 s) for two data sets with epicentral distance ranges R < 100 km (data set 1) and 100 < R < 200 km (data set 2) in each area. It is observed that Qc increases with depth. However, the rate of increase of Qc with depth is not uniform in our study area. Beneath central Iran the rate of increase of Qc is greater at depths less than 100 km than at larger depths, indicating the existence of a high-attenuation anomalous structure under the lithosphere of central Iran. In addition, below ~180 km the Qc value does not vary much with depth under either study area, indicating the presence of a transparent mantle under both.
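
    For reference, the single backscattering model behind these estimates linearizes the coda-amplitude decay so that Qc follows from a slope, and the quoted relations are instances of the standard power law. A sketch of the standard (Aki-Chouet) form:

    ```latex
    % Single backscattering model: coda amplitude A(f,t) at lapse time t.
    % Q_c is obtained from the slope of the linearized form; its frequency
    % dependence is then fitted with a power law, as in the relations above.
    \begin{align}
      A(f,t) &= S(f)\, t^{-1} \exp\!\left(-\frac{\pi f t}{Q_c}\right) \\
      \ln\!\left[A(f,t)\, t\right] &= \ln S(f) - \frac{\pi f}{Q_c}\, t \\
      Q_c(f) &= Q_0 f^{\,n}, \qquad \text{e.g. } Q_c = (79 \pm 2)\, f^{\,1.07 \pm 0.08}
    \end{align}
    ```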

  4. Hybrid spin and valley quantum computing with singlet-triplet qubits.

    PubMed

    Rohling, Niklas; Russ, Maximilian; Burkard, Guido

    2014-10-24

    The valley degree of freedom in the electronic band structure of silicon, graphene, and other materials is often considered to be an obstacle for quantum computing (QC) based on electron spins in quantum dots. Here we show that control over the valley state opens new possibilities for quantum information processing. Combining qubits encoded in the singlet-triplet subspace of spin and valley states allows for universal QC using a universal two-qubit gate directly provided by the exchange interaction. We show how spin and valley qubits can be separated in order to allow for single-qubit rotations.

  5. NDA BATCH 2002-02

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lawrence Livermore National Laboratory

    2009-12-09

    QC sample results (daily background checks, 20-gram and 100-gram SGS drum checks) were within acceptable criteria established by WIPP's Quality Assurance Objectives for TRU Waste Characterization. Replicate runs were performed on 5 drums with IDs LL85101099TRU, LL85801147TRU, LL85801109TRU, LL85300999TRU and LL85500979TRU. All replicate measurement results are identical at the 95% confidence level as established by WIPP criteria. Note that the batch covered 5 weeks of SGS measurements from 23-Jan-2002 through 22-Feb-2002. The data packet for SGS Batch 2002-02, generated using gamma spectroscopy with the Pu Facility SGS unit, is technically reasonable. All QC samples are in compliance with established control limits. The batch data packet has been reviewed for correctness, completeness, consistency and compliance with WIPP's Quality Assurance Objectives and determined to be acceptable. An Expert Review was performed on the data packet between 28-Feb-02 and 09-Jul-02 to check for potential U-235, Np-237 and Am-241 interferences and to address drum cases where specific scan segments showed gamma-ray transmissions for the 136-keV gamma to be below 0.1%. Two drums in the batch showed Pu-238 at a relative mass ratio of more than 2% of all the Pu isotopes.

  6. 77 FR 3228 - Submission for OMB Review; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-01-23

    ..., Office of Management and Budget (OMB), [email protected] or fax (202) 395-5806 and to... it displays a currently valid OMB control number. Food and Nutrition Service Title: Quality Control... perform Quality Control (QC) review for the Supplemental Nutrition Assistance Program (SNAP). The FNS-380...

  7. Intercomparison of field measurements of nitrous acid (HONO) during the SHARP campaign

    NASA Astrophysics Data System (ADS)

    Pinto, J. P.; Dibb, J.; Lee, B. H.; Rappenglück, B.; Wood, E. C.; Levy, M.; Zhang, R.-Y.; Lefer, B.; Ren, X.-R.; Stutz, J.; Tsai, C.; Ackermann, L.; Golovko, J.; Herndon, S. C.; Oakes, M.; Meng, Q.-Y.; Munger, J. W.; Zahniser, M.; Zheng, J.

    2014-05-01

    Because of the importance of HONO as a radical reservoir, consistent and accurate measurements of its concentration are needed. As part of SHARP (Study of Houston Atmospheric Radical Precursors), time series of HONO were obtained by six different measurement techniques on the roof of the Moody Tower at the University of Houston. Techniques used were long path differential optical absorption spectroscopy (DOAS), stripping coil-visible absorption photometry (SC-AP), long path absorption photometry (LOPAP®), mist chamber/ion chromatography (MC-IC), quantum cascade-tunable infrared laser differential absorption spectroscopy (QC-TILDAS), and ion drift-chemical ionization mass spectrometry (ID-CIMS). Various combinations of techniques were in operation from 15 April through 31 May 2009. All instruments recorded a similar diurnal pattern of HONO concentrations with higher median and mean values during the night than during the day. Highest values were observed in the final 2 weeks of the campaign. Inlets for the MC-IC, SC-AP, and QC-TILDAS were collocated and agreed most closely with each other based on several measures. Largest differences between pairs of measurements were evident during the day for concentrations < 100 parts per trillion (ppt). Above 200 ppt, concentrations from the SC-AP, MC-IC, and QC-TILDAS converged to within about 20%, with slightly larger discrepancies when DOAS was considered. During the first 2 weeks, HONO measured by ID-CIMS agreed with these techniques, but ID-CIMS reported higher values during the afternoon and evening of the final 4 weeks, possibly from interference from unknown sources. A number of factors, including building related sources, likely affected measured concentrations.

  8. 23 CFR 650.313 - Inspection procedures.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ...) Quality control and quality assurance. Assure systematic quality control (QC) and quality assurance (QA... periodic field review of inspection teams, periodic bridge inspection refresher training for program managers and team leaders, and independent review of inspection reports and computations. (h) Follow-up on...

  9. Identification of potential glutaminyl cyclase inhibitors from lead-like libraries by in silico and in vitro fragment-based screening.

    PubMed

    Szaszkó, Mária; Hajdú, István; Flachner, Beáta; Dobi, Krisztina; Magyar, Csaba; Simon, István; Lőrincz, Zsolt; Kapui, Zoltán; Pázmány, Tamás; Cseh, Sándor; Dormán, György

    2017-02-01

    A glutaminyl cyclase (QC) fragment library was in silico selected by disconnection of the structure of known QC inhibitors and by lead-like 2D virtual screening of the same set. The resulting fragment library (204 compounds) was acquired from commercial suppliers and pre-screened by differential scanning fluorimetry followed by functional in vitro assays. In this way, 10 fragment hits were identified ([Formula: see text]5 % hit rate, best inhibitory activity: 16 [Formula: see text]). The in vitro hits were then docked to the active site of QC, and the best scoring compounds were analyzed for binding interactions. Two fragments bound to different regions in a complementary manner, and thus, linking those fragments offered a rational strategy to generate novel QC inhibitors. Based on the structure of the virtual linked fragment, a 77-membered QC target focused library was selected from vendor databases and docked to the active site of QC. A PubChem search confirmed that the best scoring analogues are novel, potential QC inhibitors.

  10. Analysis of CrIS/ATMS using AIRS Version-7 Retrieval and QC Methodology

    NASA Astrophysics Data System (ADS)

    Susskind, J.; Kouvaris, L. C.; Blaisdell, J. M.; Iredell, L. F.

    2017-12-01

    The objective of the proposed research is to develop, implement, test, and refine a CrIS/ATMS retrieval algorithm that will produce monthly mean data products compatible with those of the soon-to-be-operational AIRS V7 retrieval algorithm. This is a necessary condition for CrIS/ATMS on NPP and future missions to serve as adequate follow-ons to AIRS for the monitoring of climate variability and trends. Of particular importance toward this end is achieving agreement of monthly mean fields of CrIS and AIRS geophysical parameters on a 1 deg by 1 deg spatial scale and, more significantly, agreement of their interannual differences. Indications are that the best way to achieve this is to use scientific retrieval and Quality Control (QC) methodology for CrIS/ATMS analogous to that which will be used in AIRS V7. We refer to the current scientific candidate for AIRS V7 as AIRS Sounder Research Team (SRT) V6.42, which currently runs at JPL on the AIRS Team Leader Scientific Facility (TLSCF). We ported the CrIS SRT V6.42 Level 2 (L2) retrieval code and QC methodology to run at the Sounder SIPS at JPL. The months of January and July 2015 were both processed at JPL, using AIRS at the TLSCF and CrIS at the SIPS. This paper shows excellent agreement of AIRS and CrIS single-day and monthly mean products on a 1 deg lat by 1 deg lon spatial grid with each other and with other satellites' measurements of the same products.
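
    The 1 deg by 1 deg monthly comparison amounts to binning QC-accepted Level-2 retrievals into grid-cell means and differencing the two products; a minimal sketch of that step (function name and array layout are assumptions) follows.

    ```python
    import numpy as np

    def grid_monthly_mean(lat, lon, value):
        """Bin Level-2 retrievals to a 1 deg x 1 deg monthly-mean grid (sketch).

        lat, lon in degrees; value holds the retrieved parameter after QC
        screening. Returns a 180x360 array of cell means (NaN where empty).
        """
        summed = np.zeros((180, 360))
        count = np.zeros((180, 360))
        i = np.clip((lat + 90.0).astype(int), 0, 179)
        j = np.clip((lon + 180.0).astype(int), 0, 359)
        np.add.at(summed, (i, j), value)   # accumulate per cell
        np.add.at(count, (i, j), 1)
        with np.errstate(divide="ignore", invalid="ignore"):
            return np.where(count > 0, summed / count, np.nan)

    # Inter-product check: difference of AIRS and CrIS monthly-mean grids, e.g.
    # diff = grid_monthly_mean(lat_a, lon_a, t_a) - grid_monthly_mean(lat_c, lon_c, t_c)
    ```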

  11. SRT Evaluation of AIRS Version-6.02 and Version-6.02 AIRS Only (6.02 AO) Products

    NASA Technical Reports Server (NTRS)

    Susskind, Joel; Iredell, Lena; Molnar, Gyula; Blaisdell, John

    2012-01-01

    Version-6 contains a number of significant improvements over Version-5. This report compares Version-6 products resulting from the advances listed below to those from Version-5. 1. Improved methodology to determine skin temperature (T(sub s)) and spectral emissivity (Epsilon(sub v)). 2. Use of a Neural-net start-up state. 3. Improvements which decrease the spurious negative Version-5 trend in tropospheric temperatures. 4. Improved QC methodology: Version-6 uses separate QC thresholds optimized for Data Assimilation (QC=0) and Climate applications (QC=0,1), respectively. 5. Channel-by-channel clear-column radiance R-hat(sub tau) QC flags. 6. Improved cloud parameter retrieval algorithm. 7. Improved OLR RTA. Our evaluation compared V6.02 and V6.02 AIRS Only (V6.02 AO) Quality Controlled products with those of Version-5.0. In particular, we evaluated surface skin temperature T(sub s), surface spectral emissivity Epsilon(sub v), temperature profile T(p), water vapor profile q(p), OLR, OLR(sub CLR), effective cloud fraction alpha-Epsilon, and cloud-cleared radiances R-hat(sub tau). We conducted two types of evaluations. The first compared results on 7 focus days to collocated ECMWF truth. The seven focus days are: September 6, 2002; January 25, 2003; September 29, 2004; August 5, 2005; February 24, 2007; August 10, 2007; and May 30, 2010. In these evaluations, we show results for T(sub s), Epsilon(sub v), T(p), and q(p) in terms of yields, and RMS differences and biases with regard to ECMWF. We also show yield trends as well as bias trends of these quantities relative to ECMWF truth. We also show yields and accuracy of channel-by-channel QC'd values of R-hat(sub tau) for V6.02 and V6.02 AO; Version-5 did not contain channel-by-channel QC'd values of R-hat(sub tau). In the second type of evaluation, we compared V6.03 monthly mean Level-3 products to those of Version-5.0 for four different months (January, April, July, and October) in 3 different years (2003, 2007, and 2011). In particular, we compared V6.03 and V5.0 trends of T(p), q(p), alpha-Epsilon, OLR, and OLR(sub CLR) computed from the results for these 12 time periods.
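
    The separate QC thresholds in item 4 amount to a per-application screen of the retrievals. A minimal sketch of such a screen (function name and flag encoding are assumptions, following the QC=0 / QC=0,1 convention in the abstract):

    ```python
    import numpy as np

    def select_by_qc(values, qc_flags, application="climate"):
        """Screen retrievals by per-case QC flag (sketch).

        QC=0 only for data assimilation; QC=0 or 1 for climate applications.
        Rejected cases are replaced by NaN so yields can be computed directly.
        """
        accepted = {"assimilation": (0,), "climate": (0, 1)}[application]
        mask = np.isin(qc_flags, accepted)
        return np.where(mask, values, np.nan)
    ```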

  12. Detection of biogenic CO production above vascular cell cultures using a near-room-temperature QC-DFB laser

    NASA Technical Reports Server (NTRS)

    Kosterev, A. A.; Tittel, F. K.; Durante, W.; Allen, M.; Kohler, R.; Gmachl, C.; Capasso, F.; Sivco, D. L.; Cho, A. Y.

    2002-01-01

    We report the first application of pulsed, near-room-temperature quantum cascade laser technology to the continuous detection of biogenic CO production rates above viable cultures of vascular smooth muscle cells. A computer-controlled sequence of measurements over a 9-h period was obtained, resulting in a minimum detectable CO production of 20 ppb in a 1-m optical path above a standard cell-culture flask. Data-processing procedures for real-time monitoring of both biogenic and ambient atmospheric CO concentrations are described.

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xu, Luyao; Curwen, Christopher; Chen, Daguan

    A longstanding challenge for terahertz quantum-cascade (QC) lasers is achieving both high power and a high-quality beam pattern, due in part to their use of sub-wavelength metallic waveguides. Recently, the vertical-external-cavity surface-emitting laser (VECSEL) concept was demonstrated for the first time in the terahertz range and for a QC-laser. This is enabled by the development of an amplifying metasurface reflector capable of coupling incident free-space THz radiation to the QC-laser material such that it is amplified and re-radiated. The THz metasurface QC-VECSEL initiates a new approach for making QC-lasers with high power and excellent beam pattern. Furthermore, the ability to engineer the electromagnetic phase, amplitude, and polarization response of the metasurface enables lasers with new functionality. Our article provides an overview of the fundamental theory, design considerations, and recent results for high-performance THz QC-VECSELs.

  14. Terahertz metasurface quantum-cascade VECSELs: theory and performance

    DOE PAGES

    Xu, Luyao; Curwen, Christopher; Chen, Daguan; ...

    2017-04-12

    A longstanding challenge for terahertz quantum-cascade (QC) lasers is achieving both high power and a high-quality beam pattern, due in part to their use of sub-wavelength metallic waveguides. Recently, the vertical-external-cavity surface-emitting laser (VECSEL) concept was demonstrated for the first time in the terahertz range and for a QC-laser. This is enabled by the development of an amplifying metasurface reflector capable of coupling incident free-space THz radiation to the QC-laser material such that it is amplified and re-radiated. The THz metasurface QC-VECSEL initiates a new approach for making QC-lasers with high power and excellent beam pattern. Furthermore, the ability to engineer the electromagnetic phase, amplitude, and polarization response of the metasurface enables lasers with new functionality. Our article provides an overview of the fundamental theory, design considerations, and recent results for high-performance THz QC-VECSELs.

  15. PHABULOSA Controls the Quiescent Center-Independent Root Meristem Activities in Arabidopsis thaliana

    PubMed Central

    Sebastian, Jose; Ryu, Kook Hui; Zhou, Jing; Tarkowská, Danuše; Tarkowski, Petr; Cho, Young-Hee; Yoo, Sang-Dong; Kim, Eun-Sol; Lee, Ji-Young

    2015-01-01

    Plant growth depends on stem cell niches in meristems. In the root apical meristem, the quiescent center (QC) cells form a niche together with the surrounding stem cells. Stem cells produce daughter cells that are displaced into a transit-amplifying (TA) domain of the root meristem. TA cells divide several times to provide cells for growth. SHORTROOT (SHR) and SCARECROW (SCR) are key regulators of the stem cell niche. Cytokinin controls TA cell activities in a dose-dependent manner. Although the regulatory programs in each compartment of the root meristem have been identified, it is still unclear how they coordinate one another. Here, we investigate how PHABULOSA (PHB), under the posttranscriptional control of SHR and SCR, regulates TA cell activities. The root meristem and growth defects in shr or scr mutants were significantly recovered in the shr phb or scr phb double mutant, respectively. This rescue in root growth occurs in the absence of a QC. Conversely, when the modified PHB, which is highly resistant to microRNA, was expressed throughout the stele of the wild-type root meristem, root growth became very similar to that observed in the shr mutant; however, the identity of the QC was unaffected. Interestingly, a moderate increase in PHB resulted in a root meristem phenotype similar to that observed following the application of high levels of cytokinin. Our protoplast assay and transgenic approach using ARR10 suggest that the depletion of TA cells by high PHB in the stele occurs via the repression of B-ARR activities. This regulatory mechanism seems to help maintain cytokinin homeostasis in the meristem. Taken together, our study suggests that PHB can dynamically regulate TA cell activities in a QC-independent manner, and that the SHR-PHB pathway enables a robust root growth system by coordinating the stem cell niche and TA domain. PMID:25730098

  16. Characteristic Evaluation on Cooling Performance of Thermoelectric Modules.

    PubMed

    Seo, Sae Rom; Han, Seungwoo

    2015-10-01

    The aim of this work is to develop a performance evaluation system for thermoelectric cooling modules. We describe the design of such a system, composed of a vacuum chamber with a heat sink along with a metal block to measure the absorbed heat Qc. The system has a simpler structure than existing water-cooled or air-cooled systems. The temperature difference between the cold and hot sides of the thermoelectric module ΔT can be accurately measured without any effects due to convection, and the temperature equilibrium time is minimized compared to a water-cooled system. The evaluation system described here can be used to measure characteristic curves of Qc as a function of ΔT, as well as the current-voltage relations. High-performance thermoelectric systems can therefore be developed using optimal modules evaluated with this system.
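
    As context for the Qc-ΔT characteristic curves mentioned above, the standard single-stage Peltier heat-balance relation (a textbook model, not taken from this paper) shows why Qc falls linearly with ΔT at fixed drive current:

    ```latex
    % Textbook single-stage Peltier-module model: heat absorbed at the cold side
    % Q_c as a function of drive current I and temperature difference \Delta T.
    \begin{equation}
      Q_c = S\, I\, T_c \;-\; \tfrac{1}{2} I^{2} R \;-\; K\, \Delta T
    \end{equation}
    % S: module Seebeck coefficient, R: electrical resistance, K: thermal
    % conductance, T_c: cold-side temperature. Measured Q_c--\Delta T curves
    % can be compared against this linear-in-\Delta T behavior.
    ```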

  17. Simple and ultra-fast recognition and quantitation of compounded monoclonal antibodies: Application to flow injection analysis combined to UV spectroscopy and matching method.

    PubMed

    Jaccoulet, E; Schweitzer-Chaput, A; Toussaint, B; Prognon, P; Caudron, E

    2018-09-01

    Compounding of monoclonal antibodies (mAbs) is constantly increasing in hospitals. Quality control (QC) of the compounded mAbs, based on quantification and identification, is required to prevent potential errors, and a fast method is needed to manage outpatient chemotherapy administration. A simple and ultra-fast (less than 30 s) method using flow injection analysis combined with a least-squares matching method from the analyzer software was developed and evaluated for the routine hospital QC of three compounded mAbs: bevacizumab, infliximab and rituximab. The method was evaluated through qualitative and quantitative parameters. Preliminary analysis of the UV absorption and second-derivative spectra of the mAbs allowed us to adapt the analytical conditions to the therapeutic range of each mAb. In terms of quantitative QC, linearity, accuracy and precision were assessed as specified in the ICH guidelines. Very satisfactory recovery was achieved, and the RSDs (%) of the intermediate precision were less than 1.1%. Qualitative analytical parameters were also evaluated in terms of specificity, sensitivity and global precision through a confusion matrix. Results were concentration- and mAb-dependent, and excellent (100%) specificity and sensitivity were reached within a specific concentration range. Finally, routine application to "real life" samples (n = 209) from different batches of the three mAbs complied with the specifications of the quality control, i.e. excellent identification (100%) and concentrations within ±15% of target across the calibration range. The successful combination of second-derivative spectroscopy and a least-squares matching method demonstrates the value of FIA for ultra-fast QC of mAbs after compounding. Copyright © 2018 Elsevier B.V. All rights reserved.
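
    A matching step of the kind described, i.e. identifying a sample by least-squares comparison of second-derivative UV spectra against references, could be sketched as below; the function names, Savitzky-Golay parameters and scoring are assumptions, not the analyzer's actual algorithm.

    ```python
    import numpy as np
    from scipy.signal import savgol_filter

    def match_spectrum(sample, references):
        """Identify a compounded mAb by second-derivative spectral matching (sketch).

        sample: 1-D absorbance spectrum; references: dict name -> spectrum on the
        same wavelength grid. Spectra are reduced to second derivatives and scored
        by the least-squares residual after optimal scaling; lowest residual wins.
        """
        def d2(s):
            return savgol_filter(s, window_length=11, polyorder=3, deriv=2)

        s2 = d2(np.asarray(sample, dtype=float))
        scores = {}
        for name, ref in references.items():
            r2 = d2(np.asarray(ref, dtype=float))
            scale = np.dot(r2, s2) / np.dot(r2, r2)   # best least-squares scale factor
            scores[name] = np.sum((s2 - scale * r2) ** 2)
        return min(scores, key=scores.get)
    ```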

  18. CARINA TCO2 data in the Atlantic Ocean

    NASA Astrophysics Data System (ADS)

    Pierrot, D.; Brown, P.; van Heuven, S.; Tanhua, T.; Schuster, U.; Wanninkhof, R.; Key, R. M.

    2010-01-01

    Water column data of carbon and carbon-relevant hydrographic and hydrochemical parameters from 188 cruises in the Arctic, Atlantic and Southern Ocean have been retrieved and merged in a new data base: the CARINA (CARbon IN the Atlantic) Project. These data have gone through rigorous quality control (QC) procedures to assure the highest possible quality and consistency. Secondary quality control, which involved objective study of data in order to quantify systematic differences in the reported values, was performed for the pertinent parameters in the CARINA data base. Systematic biases in the data have been corrected in the data products. The products are three merged data files with measured, adjusted and interpolated data of all cruises for each of the three CARINA regions (Arctic, Atlantic and Southern Ocean). Ninety-eight cruises were conducted in the "Atlantic" defined as the region south of the Greenland-Iceland-Scotland Ridge and north of about 30° S. Here we report the details of the secondary QC which was done on the total dissolved inorganic carbon (TCO2) data and the adjustments that were applied to yield the final data product in the Atlantic. Procedures of quality control - including crossover analysis between stations and inversion analysis of all crossover data - are briefly described. Adjustments were applied to TCO2 measurements for 17 of the cruises in the Atlantic Ocean region. With these adjustments, the CARINA data base is consistent both internally as well as with GLODAP data, an oceanographic data set based on the WOCE Hydrographic Program in the 1990s, and is now suitable for accurate assessments of, for example, regional oceanic carbon inventories, uptake rates and model validation.
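
    The crossover analysis mentioned here can be pictured as comparing the deep, low-variability parts of profiles from two cruises at a crossover location; the mean deep-water difference then estimates a systematic bias between cruises. A minimal sketch (variable names and the 1500 m cutoff are assumptions):

    ```python
    import numpy as np

    def crossover_offset(depth_a, tco2_a, depth_b, tco2_b, z_min=1500.0):
        """Deep-water crossover comparison between two cruises (sketch).

        depth arrays are assumed sorted in increasing order. Profiles are
        interpolated onto shared depths below z_min, where natural variability
        is low; the mean difference estimates a systematic offset.
        """
        z = np.linspace(z_min, min(depth_a.max(), depth_b.max()), 50)
        a = np.interp(z, depth_a, tco2_a)
        b = np.interp(z, depth_b, tco2_b)
        diff = a - b
        return diff.mean(), diff.std(ddof=1)
    ```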

  19. CARINA alkalinity data in the Atlantic Ocean

    NASA Astrophysics Data System (ADS)

    Velo, A.; Perez, F. F.; Brown, P.; Tanhua, T.; Schuster, U.; Key, R. M.

    2009-08-01

    Data on carbon and carbon-relevant hydrographic and hydrochemical parameters from previously non-publicly available cruise data sets in the Arctic, Atlantic and Southern Ocean have been retrieved and merged to a new database: CARINA (CARbon IN the Atlantic). These data have gone through rigorous quality control (QC) procedures to assure the highest possible quality and consistency. The data for most of the measured parameters in the CARINA data base were objectively examined in order to quantify systematic differences in the reported values, i.e. secondary quality control. Systematic biases found in the data have been corrected in the data products, i.e. three merged data files with measured, calculated and interpolated data for each of the three CARINA regions; Arctic, Atlantic and Southern Ocean. Out of a total of 188 cruise entries in the CARINA database, 98 were conducted in the Atlantic Ocean and of these, 75 cruises report alkalinity values. Here we present details of the secondary QC on alkalinity for the Atlantic Ocean part of CARINA. Procedures of quality control, including crossover analysis between cruises and inversion analysis of all crossover data are briefly described. Adjustments were applied to the alkalinity values for 16 of the cruises in the Atlantic Ocean region. With these adjustments the CARINA database is consistent both internally as well as with GLODAP data, an oceanographic data set based on the World Hydrographic Program in the 1990s. Based on our analysis we estimate the internal accuracy of the CARINA-ATL alkalinity data to be 3.3 μmol kg-1. The CARINA data are now suitable for accurate assessments of, for example, oceanic carbon inventories and uptake rates and for model validation.

  20. CARINA: nutrient data in the Atlantic Ocean

    NASA Astrophysics Data System (ADS)

    Tanhua, T.; Brown, P. J.; Key, R. M.

    2009-11-01

    Data on carbon and carbon-relevant hydrographic and hydrochemical parameters from previously non-publicly available cruise data sets in the Arctic, Atlantic and Southern Ocean have been retrieved and merged to a new database: CARINA (CARbon IN the Atlantic). These data have gone through rigorous quality control (QC) procedures to assure the highest possible quality and consistency. The data for most of the measured parameters in the CARINA data base were objectively examined in order to quantify systematic differences in the reported values, i.e. secondary quality control. Systematic biases found in the data have been corrected in the data products, i.e. three merged data files with measured, calculated and interpolated data for each of the three CARINA regions; Arctic Mediterranean Seas, Atlantic and Southern Ocean. Out of a total of 188 cruise entries in the CARINA database, 98 were conducted in the Atlantic Ocean and of these 84 cruises report nitrate values, 79 silicate, and 78 phosphate. Here we present details of the secondary QC for nutrients for the Atlantic Ocean part of CARINA. Procedures of quality control, including crossover analysis between cruises and inversion analysis of all crossover data are briefly described. Adjustments were applied to the nutrient values for 43 of the cruises in the Atlantic Ocean region. With these adjustments the CARINA database is consistent both internally as well as with GLODAP data, an oceanographic data set based on the World Hydrographic Program in the 1990s (Key et al., 2004). Based on our analysis we estimate the internal accuracy of the CARINA-ATL nutrient data to be: nitrate 1.5%; phosphate 2.6%; silicate 3.1%. The CARINA data are now suitable for accurate assessments of, for example, oceanic carbon inventories and uptake rates and for model validation.

  1. CARINA: nutrient data in the Atlantic Ocean

    NASA Astrophysics Data System (ADS)

    Tanhua, T.; Brown, P. J.; Key, R. M.

    2009-07-01

    Data on carbon and carbon-relevant hydrographic and hydrochemical parameters from previously non-publicly available cruise data sets in the Arctic, Atlantic and Southern Ocean have been retrieved and merged to a new database: CARINA (CARbon IN the Atlantic). These data have gone through rigorous quality control (QC) procedures to assure the highest possible quality and consistency. The data for most of the measured parameters in the CARINA data base were objectively examined in order to quantify systematic differences in the reported values, i.e. secondary quality control. Systematic biases found in the data have been corrected in the data products, i.e. three merged data files with measured, calculated and interpolated data for each of the three CARINA regions; Arctic, Atlantic and Southern Ocean. Out of a total of 188 cruise entries in the CARINA database, 98 were conducted in the Atlantic Ocean and of these 84 cruises report nitrate values, 79 silicate, and 78 phosphate. Here we present details of the secondary QC for nutrients for the Atlantic Ocean part of CARINA. Procedures of quality control, including crossover analysis between cruises and inversion analysis of all crossover data are briefly described. Adjustments were applied to the nutrient values for 43 of the cruises in the Atlantic Ocean region. With these adjustments the CARINA database is consistent both internally as well as with GLODAP data, an oceanographic data set based on the World Hydrographic Program in the 1990s (Key et al., 2004). Based on our analysis we estimate the internal accuracy of the CARINA-ATL nutrient data to be: nitrate 1.5%; phosphate 2.6%; silicate 3.1%. The CARINA data are now suitable for accurate assessments of, for example, oceanic carbon inventories and uptake rates and for model validation.

  2. CARINA alkalinity data in the Atlantic Ocean

    NASA Astrophysics Data System (ADS)

    Velo, A.; Perez, F. F.; Brown, P.; Tanhua, T.; Schuster, U.; Key, R. M.

    2009-11-01

    Data on carbon and carbon-relevant hydrographic and hydrochemical parameters from previously non-publicly available cruise data sets in the Arctic, Atlantic and Southern Ocean have been retrieved and merged to a new database: CARINA (CARbon IN the Atlantic). These data have gone through rigorous quality control (QC) procedures to assure the highest possible quality and consistency. The data for most of the measured parameters in the CARINA data base were objectively examined in order to quantify systematic differences in the reported values, i.e. secondary quality control. Systematic biases found in the data have been corrected in the data products, i.e. three merged data files with measured, calculated and interpolated data for each of the three CARINA regions; Arctic, Atlantic and Southern Ocean. Out of a total of 188 cruise entries in the CARINA database, 98 were conducted in the Atlantic Ocean and of these, 75 cruises report alkalinity values. Here we present details of the secondary QC on alkalinity for the Atlantic Ocean part of CARINA. Procedures of quality control, including crossover analysis between cruises and inversion analysis of all crossover data are briefly described. Adjustments were applied to the alkalinity values for 16 of the cruises in the Atlantic Ocean region. With these adjustments the CARINA database is consistent both internally as well as with GLODAP data, an oceanographic data set based on the World Hydrographic Program in the 1990s. Based on our analysis we estimate the internal accuracy of the CARINA-ATL alkalinity data to be 3.3 μmol kg-1. The CARINA data are now suitable for accurate assessments of, for example, oceanic carbon inventories and uptake rates and for model validation.

  3. QUALITY ASSURANCE AND QUALITY CONTROL FOR WASTE CONTAINMENT FACILITIES. Project Summary

    EPA Science Inventory

    It is generally agreed that both quality assurance (QA) and quality control (QC) are essential to the proper installation and eventual performance of environmentally safe and secure waste containment systems. Even further, there are both manufacturing and construction aspects to...

  4. Evaluation of non-destructive technologies for construction quality control of HMA and PCC pavements in Louisiana.

    DOT National Transportation Integrated Search

    2013-11-01

    Current roadway quality control and quality acceptance (QC/QA) procedures for the Louisiana Department of Transportation and Development (LADOTD) include coring for thickness, density, and air voids in hot mix asphalt (HMA) pavements and thickness ...

  5. HANDBOOK: QUALITY ASSURANCE/QUALITY CONTROL (QA/QC) PROCEDURES FOR HAZARDOUS WASTE INCINERATION

    EPA Science Inventory

    Resource Conservation and Recovery Act regulations for hazardous waste incineration require trial burns by permit applicants. A Quality Assurance Project Plan (QAPjP) must accompany a trial burn plan with appropriate quality assurance/quality control procedures. Guidance on the prepa...

  6. Does quality control matter? Surface urban heat island intensity variations estimated by satellite-derived land surface temperature products

    NASA Astrophysics Data System (ADS)

    Lai, Jiameng; Zhan, Wenfeng; Huang, Fan; Quan, Jinling; Hu, Leiqiu; Gao, Lun; Ju, Weimin

    2018-05-01

    The temporally regular and spatially comprehensive monitoring of surface urban heat islands (SUHIs) was extremely difficult until the advent of satellite-based land surface temperature (LST) products. However, these LST products have relatively large errors compared to in situ measurements. This has resulted in comparatively inaccurate estimations of SUHI indicators and, consequently, may have distorted interpretations of SUHIs. Although reports have shown that LST quality is important for SUHI interpretation, systematic investigations of the response of SUHI indicators to LST quality across cities with dissimilar bioclimates are rare. To address this issue, we chose eighty-six major cities across mainland China and analyzed the SUHI intensity (SUHII) derived from Moderate Resolution Imaging Spectroradiometer (MODIS) LST data. The LST-based SUHII differences due to inclusion or exclusion of MODIS quality control (QC) flags (i.e., ΔSUHII) were evaluated. Our major findings include, but are not limited to, the following four aspects: (1) SUHIIs can be significantly impacted by MODIS QC flags, and the associated QC-induced ΔSUHIIs generally accounted for 24.3% (29.9%) of the total SUHII value during the day (night); (2) the ΔSUHIIs differed between seasons, with considerable differences between the transitional (spring and autumn) and extreme (summer and winter) seasons; (3) significant discrepancies also appeared among cities located in northern and southern regions, with northern cities often possessing higher annual mean ΔSUHIIs; the internal variations of ΔSUHIIs within individual cities also showed high heterogeneity, with ΔSUHII variations that generally exceeded 5.0 K (3.0 K) in northern (southern) cities; (4) ΔSUHIIs were negatively related to SUHIIs and to cloud cover percentages (mostly in the transitional seasons); no significant relationship was found in the extreme seasons. Our findings highlight the need to be extremely cautious when using LST product-based SUHIIs to interpret SUHIs.
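
    The ΔSUHII quantity studied here is simply the SUHII computed with QC screening minus the SUHII computed without it. A minimal sketch, assuming SUHII is defined as the urban-minus-rural mean LST (function name, masks and flag convention are illustrative):

    ```python
    import numpy as np

    def suhii(lst, is_urban, qc=None, good=(0,)):
        """Surface urban heat island intensity: urban minus rural mean LST (sketch).

        lst: 2-D LST array (K); is_urban: boolean urban-footprint mask;
        qc: optional 2-D array of MODIS-style QC flags; good: accepted flags.
        """
        valid = np.isfinite(lst)
        if qc is not None:
            valid &= np.isin(qc, good)          # keep high-quality pixels only
        urban = np.nanmean(np.where(valid & is_urban, lst, np.nan))
        rural = np.nanmean(np.where(valid & ~is_urban, lst, np.nan))
        return urban - rural

    # QC-induced difference, as in the abstract:
    # d_suhii = suhii(lst, mask, qc) - suhii(lst, mask)
    ```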

  7. A quality control system for digital elevation data

    NASA Astrophysics Data System (ADS)

    Knudsen, Thomas; Kokkendorf, Simon; Flatman, Andrew; Nielsen, Thorbjørn; Rosenkranz, Brigitte; Keller, Kristian

    2015-04-01

    In connection with the introduction of a new version of the Danish national-coverage Digital Elevation Model (DK-DEM), the Danish Geodata Agency has developed a comprehensive quality control (QC) and metadata production (MP) system for LiDAR point cloud data. The architecture of the system reflects its origin in a national mapping organization, where raw data deliveries are typically outsourced to external suppliers. It also reflects a design decision of aiming, whenever conceivable, at full spatial coverage tests rather than scattered sample checks. Hence, the QC procedure is split into two phases: a reception phase and an acceptance phase. The primary aim of the reception phase is a quick assessment of things that typically go wrong and are relatively simple to check: data coverage, data density, and strip adjustment. If a data delivery passes the reception phase, the QC continues with the acceptance phase, which checks five different aspects of the point cloud data: vertical accuracy, vertical precision, horizontal accuracy, horizontal precision, and point classification correctness. The vertical descriptors are comparatively simple to measure: the vertical accuracy is checked by direct comparison with previously surveyed patches, and the vertical precision is derived from the observed variance on well-defined flat surface patches. These patches are automatically derived from the road centerlines registered in FOT, the official Danish map data base. The horizontal descriptors are less straightforward to measure, since potential reference material for direct comparison is typically expected to be less accurate than the LiDAR data. The solution selected is to compare photogrammetrically derived roof centerlines from FOT with LiDAR-derived roof centerlines. These are constructed by taking the 3D Hough transform of a point cloud patch defined by the photogrammetrical roof polygon; the LiDAR-derived roof centerline is then the intersection line of the two primary planes of the transformed data. Since the photogrammetrical and the LiDAR-derived roof centerline sets are independently derived, a low RMS difference indicates that both data sets are of very high accuracy. The horizontal precision is derived from a similar comparison between LiDAR-derived roof centerlines in the overlap zone of neighbouring flight strips. Contrary to the vertical and horizontal descriptors, the point classification correctness is neither geometric nor well defined. In this case we resort to a human in the loop and present data in a form that is as useful as possible to this human. Hence, the QC system produces maps of suspicious patterns such as vegetation below buildings, points classified as buildings where no building is registered in the map data base, building polygons from the map data base without any building points, and buildings on roads. All elements of the QC process are carried out in smaller tiles (typically 1 km × 1 km) and are hence trivially parallelizable. Results from the parallel executing processes are collected in a geospatial data base system (PostGIS), and the progress can be analyzed and visualized in a desktop GIS while the processes run. Implementation-wise, the system is based on open source components, primarily from the OSGeo stack (GDAL, PostGIS, QGIS, NumPy, SciPy, etc.). The system-specific code is also being open sourced.
    This open source distribution philosophy supports the parallel execution paradigm, since all available hardware can be utilized without any licensing problems. As yet, the system has only been used for QC of the first part of a new Danish elevation model. The experience has, however, been very positive. Especially notable is the utility of full spatial coverage tests (rather than scattered sample checks): error detection and error reports are exactly as spatial as the point cloud data they concern, which makes it very easy for both data receiver and data provider to discuss and reason about the nature and causes of irregularities.
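
    The vertical-precision check on flat patches reduces to a per-patch statistic that can be mapped tile by tile. A minimal sketch of one such statistic, the RMS residual after removing a best-fit plane (function name is an assumption):

    ```python
    import numpy as np

    def patch_vertical_precision(x, y, z):
        """Vertical precision on a nominally flat patch (sketch).

        Fits a plane z = a*x + b*y + c by least squares and returns the RMS of
        the residuals -- the kind of per-patch statistic that supports the
        full-spatial-coverage QC described above.
        """
        A = np.column_stack([x, y, np.ones_like(x)])
        coeffs, *_ = np.linalg.lstsq(A, z, rcond=None)
        residuals = z - A @ coeffs
        return np.sqrt(np.mean(residuals ** 2))
    ```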

  8. 40 CFR 98.314 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... accounting purposes including direct measurement weighing the petroleum coke fed into your process (by belt... line dioxide using plant instruments used for accounting purposes including direct measurement weighing... used to ensure the accuracy of monthly calcined petroleum coke consumption measurements. (c) You must...

  9. Potential of dynamically harmonized Fourier transform ion cyclotron resonance cell for high-throughput metabolomics fingerprinting: control of data quality.

    PubMed

    Habchi, Baninia; Alves, Sandra; Jouan-Rimbaud Bouveresse, Delphine; Appenzeller, Brice; Paris, Alain; Rutledge, Douglas N; Rathahao-Paris, Estelle

    2018-01-01

    Due to the presence of pollutants in the environment and food, the assessment of human exposure is required. This necessitates high-throughput approaches enabling large-scale analysis and, as a consequence, the use of high-performance analytical instruments to obtain highly informative metabolomic profiles. In this study, direct introduction mass spectrometry (DIMS) was performed using a Fourier transform ion cyclotron resonance (FT-ICR) instrument equipped with a dynamically harmonized cell. Data quality was evaluated based on mass resolving power (RP), mass measurement accuracy, and ion intensity drifts from the repeated injections of a quality control sample (QC) along the analytical process. The large DIMS data size entails the use of bioinformatic tools for the automatic selection of common ions found in all QC injections, for robustness assessment, and for correction of any technical drifts. RP values greater than 10^6 and mass measurement accuracies better than 1 ppm were obtained using broadband mode, resulting in the detection of isotopic fine structure. Hence, a very accurate relative isotopic mass defect (RΔm) value was calculated. This significantly reduces the number of elemental composition (EC) candidates and greatly improves compound annotation. A very satisfactory estimate of the repeatability of both peak intensity and mass measurement was demonstrated. Although a non-negligible ion intensity drift was observed for negative-ion-mode data, a normalization procedure was easily applied to correct this phenomenon. This study illustrates the performance and robustness of the dynamically harmonized FT-ICR cell for large-scale high-throughput metabolomic analyses in routine conditions. Graphical abstract: Analytical performance of an FT-ICR instrument equipped with a dynamically harmonized cell.
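
    The RΔm idea rests on comparing the measured spacing between a monoisotopic peak and its 13C isotopologue with the exact 13C-12C mass difference. One common formulation is sketched below; this is an assumption for illustration, not necessarily the paper's exact definition.

    ```python
    # Exact 13C - 12C mass difference in Da (a physical constant).
    M_13C_12C = 1.0033548

    def relative_isotopic_mass_defect(m_mono, m_iso):
        """Relative isotopic mass defect in ppm for an A / A+1 peak pair (sketch).

        m_mono: measured monoisotopic mass; m_iso: measured 13C-isotopologue mass.
        Deviations on the ppm scale help discard elemental-composition candidates
        whose predicted isotopic spacing does not match the measurement.
        """
        return (m_iso - m_mono - M_13C_12C) / m_mono * 1e6
    ```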

  10. Intra- and inter-observer reliability of quantitative analysis of the infra-patellar fat pad and comparison between fat- and non-fat-suppressed imaging--Data from the osteoarthritis initiative.

    PubMed

    Steidle-Kloc, E; Wirth, W; Ruhdorfer, A; Dannhauer, T; Eckstein, F

    2016-03-01

    The infra-patellar fat pad (IPFP), as intra-articular adipose tissue, represents a potential source of pro-inflammatory cytokines, and its size has been suggested to be associated with osteoarthritis (OA) of the knee. This study examines the inter- and intra-observer reliability of fat-suppressed (fs) and non-fat-suppressed (nfs) MR imaging for determination of IPFP morphological measurements as novel biomarkers. The IPFP of nine right knees of healthy Osteoarthritis Initiative participants was segmented by five readers, using fs and nfs baseline sagittal MRIs. The intra-observer reliability was determined from baseline and 1-year follow-up images. All segmentations were quality controlled (QC) by an expert reader. Reliability was expressed as the root mean square coefficient of variation (RMS CV%). After QC, the inter-observer reliability for fs (nfs) imaging was 2.0% (1.1%) for IPFP volume, 2.1%/2.5% (1.6%/1.8%) for anterior/posterior surface areas, 1.8% (1.8%) for depth, and 2.1% (2.4%) for maximum sagittal area. The intra-observer reliability was 3.1% (5.0%) for volume, 2.3%/2.8% (2.5%/2.9%) for anterior/posterior surfaces, 1.9% (3.5%) for depth, and 3.3% (4.5%) for maximum sagittal area. IPFP volume from nfs images was systematically greater (+7.3%) than from fs images, but highly correlated (r=0.98). The results suggest that quantitative measurements of IPFP morphology can be performed with satisfactory reliability when expert QC is implemented. The IPFP is more clearly depicted in nfs images, and there is a small systematic offset versus analysis from fs images. However, the high linear relationship between fs and nfs imaging suggests that fs images can be used to analyze IPFP morphology when nfs images are not available. Copyright © 2015 Elsevier GmbH. All rights reserved.
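
    The RMS CV% used here aggregates per-subject coefficients of variation across repeated readings (readers or time points). A minimal sketch of the usual precision-error formula; the array layout is an assumption:

    ```python
    import numpy as np

    def rms_cv_percent(measurements):
        """Root-mean-square coefficient of variation, in percent (sketch).

        measurements: 2-D array, rows = subjects (here: knees), columns =
        repeated readings. The CV is computed per subject and aggregated
        as the RMS across subjects.
        """
        m = np.asarray(measurements, dtype=float)
        cv = m.std(axis=1, ddof=1) / m.mean(axis=1)
        return 100.0 * np.sqrt(np.mean(cv ** 2))
    ```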

  11. Intra- and inter-observer reliability of quantitative analysis of the infra-patellar fat pad and comparison between fat- and non-fat-suppressed imaging—Data from the osteoarthritis initiative

    PubMed Central

    Steidle-Kloc, E.; Wirth, W.; Ruhdorfer, A.; Dannhauer, T.; Eckstein, F.

    2015-01-01

    The infra-patellar fat pad (IPFP), as intra-articular adipose tissue, represents a potential source of pro-inflammatory cytokines, and its size has been suggested to be associated with osteoarthritis (OA) of the knee. This study examines the inter- and intra-observer reliability of fat-suppressed (fs) and non-fat-suppressed (nfs) MR imaging for determination of IPFP morphological measurements as novel biomarkers. The IPFP of nine right knees of healthy Osteoarthritis Initiative participants was segmented by five readers, using fs and nfs baseline sagittal MRIs. The intra-observer reliability was determined from baseline and 1-year follow-up images. All segmentations were quality controlled (QC) by an expert reader. Reliability was expressed as the root mean square coefficient of variation (RMS CV%). After QC, the inter-observer reliability for fs (nfs) imaging was 2.0% (1.1%) for IPFP volume, 2.1%/2.5% (1.6%/1.8%) for anterior/posterior surface areas, 1.8% (1.8%) for depth, and 2.1% (2.4%) for maximum sagittal area. The intra-observer reliability was 3.1% (5.0%) for volume, 2.3%/2.8% (2.5%/2.9%) for anterior/posterior surfaces, 1.9% (3.5%) for depth, and 3.3% (4.5%) for maximum sagittal area. IPFP volume from nfs images was systematically greater (+7.3%) than from fs images, but highly correlated (r = 0.98). The results suggest that quantitative measurements of IPFP morphology can be performed with satisfactory reliability when expert QC is implemented. The IPFP is more clearly depicted in nfs images, and there is a small systematic offset versus analysis from fs images. However, the high linear relationship between fs and nfs imaging suggests that fs images can be used to analyze IPFP morphology when nfs images are not available. PMID:26569532

  12. Visualization and Quality Control Web Tools for CERES Products

    NASA Astrophysics Data System (ADS)

    Mitrescu, C.; Doelling, D. R.

    2017-12-01

    The NASA CERES project continues to provide the scientific community with a wide variety of satellite-derived data products such as observed TOA broadband shortwave and longwave fluxes, computed TOA and surface fluxes, as well as cloud, aerosol, and other atmospheric parameters. They encompass a wide range of temporal and spatial resolutions, suited to specific applications. CERES data is used mostly by climate modeling communities but also by a wide variety of educational institutions. To better serve our users, a web-based Ordering and Visualization Tool (OVT) was developed using open source software such as Eclipse, Java, JavaScript, OpenLayers, Flot, Google Maps, Python, and others. Due to increased demand by our own scientists, we also implemented a series of specialized functions to be used in the process of CERES Data Quality Control (QC), such as 1- and 2-D histograms, anomalies and differences, temporal and spatial averaging, and side-by-side parameter comparison, which made the process of QC far easier and faster and, more importantly, far more portable. With the integration of ground-site observed surface fluxes, we further enable the CERES project to QC the CERES computed surface fluxes. An overview of the CERES OVT basic functions, as well as future steps in expanding its capabilities, will be presented at the meeting.
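
    As an illustration of the QC functions listed above (differences, anomalies, and 2-D histograms), here is a small Python sketch over hypothetical gridded fluxes; the array shapes, values, and variable names are invented for the example and are not CERES data.

        import numpy as np

        # Hypothetical gridded TOA shortwave fluxes (W m^-2) from two data
        # versions; a QC difference map plus a 2-D joint histogram of the
        # kind used for side-by-side parameter comparison.
        rng = np.random.default_rng(0)
        flux_a = 100 + 10 * rng.standard_normal((180, 360))
        flux_b = flux_a + rng.normal(0.5, 1.0, size=flux_a.shape)  # small shift

        diff = flux_b - flux_a                       # difference map
        anomaly = flux_a - flux_a.mean()             # anomaly about the field mean
        hist2d, xedges, yedges = np.histogram2d(
            flux_a.ravel(), flux_b.ravel(), bins=64)  # 2-D comparison histogram

        print(f"mean difference = {diff.mean():+.2f} W m^-2, "
              f"RMS difference = {np.sqrt((diff**2).mean()):.2f} W m^-2")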

  13. PACS 2000: quality control using the task allocation chart

    NASA Astrophysics Data System (ADS)

    Norton, Gary S.; Romlein, John R.; Lyche, David K.; Richardson, Ronald R., Jr.

    2000-05-01

    Medical imaging's technological evolution in the next century will continue to include Picture Archive and Communication Systems (PACS) and teleradiology. It is difficult to predict radiology's future in the new millennium, with both computed radiography and direct digital capture competing as the primary image acquisition methods for routine radiography. Changes in Computed Axial Tomography (CT) and Magnetic Resonance Imaging (MRI) continue to amaze the healthcare community. No matter how the acquisition, display, and archive functions change, Quality Control (QC) of the radiographic imaging chain will remain an important step in the imaging process. The Task Allocation Chart (TAC) is a tool that can be used in a medical facility's QC process to indicate the testing responsibilities of the image stakeholders and the medical informatics department. The TAC shows a grid of equipment to be serviced, tasks to be performed, and the organization assigned to perform each task. Additionally, skills, tasks, time, and references for each task can be provided. QC of the PACS must be stressed as a primary element of a PACS implementation. The TAC can be used to clarify responsibilities during warranty and paid maintenance periods. Establishing a TAC as part of a PACS implementation has a positive effect on patient care and clinical acceptance.
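
    To make the grid concrete, a TAC can be encoded as simple records keyed by equipment and task, carrying the assigned organization plus the skill, time, and reference fields mentioned above. All entries below are illustrative placeholders, not drawn from the paper.

        # One record per (equipment, task) cell of the chart.
        tac = [
            {"equipment": "CR reader", "task": "daily uniformity check",
             "assigned_to": "radiology QC tech", "skill": "RT(R)",
             "time_min": 15, "reference": "vendor QC manual"},
            {"equipment": "PACS archive", "task": "weekly integrity audit",
             "assigned_to": "medical informatics", "skill": "PACS admin",
             "time_min": 30, "reference": "site SOP"},
        ]

        def tasks_for(org):
            """All chart rows assigned to one organization."""
            return [r for r in tac if r["assigned_to"] == org]

        for row in tasks_for("medical informatics"):
            print(row["equipment"], "-", row["task"])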

  14. Evaluation of capillary zone electrophoresis for the quality control of complex biologic samples: Application to snake venoms.

    PubMed

    Kpaibe, André P S; Ben-Ameur, Randa; Coussot, Gaëlle; Ladner, Yoann; Montels, Jérôme; Ake, Michèle; Perrin, Catherine

    2017-08-01

    Snake venoms constitute a very promising resource for the development of new medicines. They are mainly composed of very complex peptide and protein mixtures, whose composition may vary significantly from batch to batch. This variability is a challenge for routine quality control (QC) in the pharmaceutical industry. In this paper, we report the use of capillary zone electrophoresis for the development of an analytical fingerprint methodology to assess the quality of snake venoms. The analytical fingerprint concept is widely used for the QC of herbal drugs but has rarely been applied to venom QC so far. CZE was chosen for its intrinsic efficiency in the separation of protein and peptide mixtures. The analytical fingerprint methodology was first developed and evaluated for a particular snake venom, Lachesis muta. Optimal analysis conditions required the use of PDADMAC capillary coating to avoid protein and peptide adsorption. The same analytical conditions were then applied to other snake venom species. Different electrophoretic profiles were obtained for each venom. Excellent repeatability and intermediate precision were observed for each batch. Analysis of different batches of the same species revealed inherent qualitative and quantitative composition variations of the venoms between individuals. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  15. Origin of the concept of the quiescent centre of plant roots.

    PubMed

    Barlow, Peter W

    2016-09-01

    Concepts in biology feed into general theories of growth, development and evolution of organisms and how they interact with the living and non-living components of their environment. A well-founded concept clarifies unsolved problems and serves as a focus for further research. One such example of a constructive concept in the plant sciences is that of the quiescent centre (QC). In anatomical terms, the QC is an inert group of cells maintained within the apex of plant roots. However, the evidence that established the presence of a QC accumulated only gradually, making use of strands of different types of observations, notably from geometrical-analytical anatomy, radioisotope labelling and autoradiography. In their turn, these strands contributed to other concepts: those of the mitotic cell cycle and of tissue-related cell kinetics. Another important concept to which the QC contributed was that of tissue homeostasis. The general principle of this last-mentioned concept is expressed by the QC in relation to the recovery of root growth following a disturbance to cell proliferation; the resulting activation of the QC provides new cells which not only repair the root meristem but also re-establish a new QC.

  16. pH dependent antioxidant activity of lettuce (L. sativa) and synergism with added phenolic antioxidants.

    PubMed

    Altunkaya, Arzu; Gökmen, Vural; Skibsted, Leif H

    2016-01-01

    The influence of pH on the antioxidant activities of combinations of lettuce extract (LE) with quercetin (QC), green tea extract (GTE) or grape seed extract (GSE) was investigated both for reduction of Fremy's salt in aqueous solution, using direct electron spin resonance (ESR) spectroscopy, and in an L-α-phosphatidylcholine liposome peroxidation assay, measured by the formation of conjugated dienes. All examined phenolic antioxidants showed an increasing radical-scavenging effect with increasing pH in both methods. QC, GTE and GSE acted synergistically in combination with LE against oxidation of peroxidating liposomes, with QC showing the largest effect. The pH-dependent increase of the antioxidant activity of the phenols is due to an increase of their electron-donating ability upon deprotonation and to their stabilization in alkaline solutions, leading to polymerization reactions. Such polymerization reactions of polyphenolic antioxidants can form new oxidizable -OH moieties in their polymeric products, resulting in higher radical-scavenging activity. Copyright © 2015 Elsevier Ltd. All rights reserved.

  17. Evaluation of non-destructive technologies for construction quality control of HMA and PCC pavements in Louisiana : [tech summary].

    DOT National Transportation Integrated Search

    2013-11-01

    Current roadway quality control and quality acceptance (QC/QA) procedures for the Louisiana Department of Transportation and Development : (LADOTD) include coring for thickness, density, and air voids in hot mix asphalt (HMA) pavements and thickness ...

  18. Assessment of in-situ test technology for construction control of base courses and embankments.

    DOT National Transportation Integrated Search

    2004-05-01

    With the coming move from an empirical to mechanistic-empirical pavement design, it is essential to improve the quality control/quality assurance (QC/QA) procedures of compacted materials from a density-based criterion to a stiffness/strength-based c...

  19. 40 CFR 98.314 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... requirements. (a) You must measure your consumption of calcined petroleum coke using plant instruments used for accounting purposes including direct measurement weighing the petroleum coke fed into your process (by belt... used to ensure the accuracy of monthly calcined petroleum coke consumption measurements. (c) You must...

  20. 40 CFR 98.314 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... requirements. (a) You must measure your consumption of calcined petroleum coke using plant instruments used for accounting purposes including direct measurement weighing the petroleum coke fed into your process (by belt... used to ensure the accuracy of monthly calcined petroleum coke consumption measurements. (c) You must...

  1. 40 CFR 98.314 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... requirements. (a) You must measure your consumption of calcined petroleum coke using plant instruments used for accounting purposes including direct measurement weighing the petroleum coke fed into your process (by belt... used to ensure the accuracy of monthly calcined petroleum coke consumption measurements. (c) You must...

  2. 40 CFR 98.314 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... requirements. (a) You must measure your consumption of calcined petroleum coke using plant instruments used for accounting purposes including direct measurement weighing the petroleum coke fed into your process (by belt... used to ensure the accuracy of monthly calcined petroleum coke consumption measurements. (c) You must...

  3. 40 CFR 98.194 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... with the same plant instruments used for accounting purposes, including but not limited to, calibrated weigh feeders, rail or truck scales, and barge measurements. The direct measurements of each lime... for these products, when measurements represent lime sold. (b) You must determine the annual quantity...

  4. 40 CFR 98.214 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... (for Equation U-2 of this subpart) must be determined annually from monthly measurements using the same plant instruments used for accounting purposes including purchase records or direct measurement, such as... accounting purposes including purchase records or direct measurement, such as weigh hoppers or belt weigh...

  5. Operational CryoSat Product Quality Assessment

    NASA Astrophysics Data System (ADS)

    Mannan, Rubinder; Webb, Erica; Hall, Amanda; Bouzinac, Catherine

    2013-12-01

    The performance and quality of the CryoSat data products are routinely assessed by the Instrument Data quality Evaluation and Analysis Service (IDEAS). This information is then conveyed to the scientific and user community in order to allow them to utilise CryoSat data with confidence. This paper presents details of the Quality Control (QC) activities performed for CryoSat products under the IDEAS contract. Details of the different QC procedures and tools deployed by IDEAS to assess the quality of operational data are presented. The latest updates to the Instrument Processing Facility (IPF) for the Fast Delivery Marine (FDM) products and the future update to Baseline-C are discussed.

  6. Ambient quality of ground water in the vicinity of Naval Submarine Base Bangor, Kitsap County, Washington, 1995

    USGS Publications Warehouse

    Greene, Karen E.

    1997-01-01

    A study of the ambient ground-water quality in the vicinity of Naval Submarine Base (SUBASE) Bangor was conducted to provide the U.S. Navy with background levels of selected constituents. The Navy needs this information to plan and manage cleanup activities on the base. During March and April 1995, 136 water-supply wells were sampled for common ions, trace elements, and organic compounds; not all wells were sampled for all constituents. Man-made organic compounds were detected in only two of fifty wells, and the sources of these organic compounds were attributed to activities in the immediate vicinities of these off-base wells. Drinking water standards for trichloroethylene, iron, and manganese were exceeded in one of these wells, which was probably contaminated by an old local (off-base) dump. Ground water from wells open to the following hydrogeologic units (in order from shallow to deep) was investigated: the Vashon till confining unit (Qvt, three wells); the Vashon aquifer (Qva, 54 wells); the Upper confining unit (QC1, 16 wells); the Permeable interbeds within QC1 (QC1pi, 34 wells); and the Sea-level aquifer (QA1, 29 wells). The 50th and 90th percentile ambient background levels of 35 inorganic constituents were determined for each hydrogeologic unit. At least ten measurements were required for a constituent in each hydrogeologic unit for determination of ambient background levels, and data for three wells determined to be affected by localized activities were excluded from these analyses. The only drinking water standards exceeded by ambient background levels were secondary maximum contaminant levels for iron (300 micrograms per liter), in QC1 and QC1pi, and manganese (50 micrograms per liter), in all of the units. The 90th percentile values for arsenic in QC1pi, QA1, and for the entire study area are above 5 micrograms per liter, the Model Toxics Control Act Method A value for protecting drinking water, but well below the maximum contaminant level of 50 micrograms per liter for arsenic. The manganese standard was exceeded in 38 wells and the standard for iron was exceeded in 12 wells. Most of these wells were in QC1 or QC1pi and had dissolved oxygen concentrations of less than 1 milligram per liter and dissolved organic carbon concentrations greater than 1 milligram per liter. The dissolved oxygen concentration is generally lower in the deeper units, while pH increases; the recommended pH range of 6.5-8.5 standard units was exceeded in 9 wells. The common-ion chemistry was similar for all of the units.
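
    The percentile rule described above (background levels computed only where at least ten measurements exist per unit) maps directly onto a short Python sketch; the units, concentrations, and the 300 ug/L iron comparison below are illustrative, not the study's data.

        import numpy as np

        def background_levels(values_by_unit, min_n=10):
            """50th/90th percentile background levels per hydrogeologic
            unit, computed only where at least `min_n` measurements exist."""
            out = {}
            for unit, vals in values_by_unit.items():
                vals = np.asarray(vals, dtype=float)
                if vals.size >= min_n:
                    out[unit] = (np.percentile(vals, 50), np.percentile(vals, 90))
            return out

        # Hypothetical iron concentrations (micrograms per liter) by unit
        iron = {"Qva": [40, 55, 80, 120, 60, 35, 90, 150, 70, 45, 200, 65],
                "QC1": [250, 310, 400, 180, 520, 290, 360, 230, 410, 300, 275],
                "Qvt": [30, 45, 50]}  # too few wells; excluded by the rule

        for unit, (p50, p90) in background_levels(iron).items():
            flag = " exceeds 300 ug/L SMCL" if p90 > 300 else ""
            print(f"{unit}: 50th={p50:.0f}, 90th={p90:.0f}{flag}")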

  7. Development of an LC-MS/MS method for the determination of endogenous cortisol in hair using (13)C3-labeled cortisol as surrogate analyte.

    PubMed

    Binz, Tina M; Braun, Ueli; Baumgartner, Markus R; Kraemer, Thomas

    2016-10-15

    Hair cortisol levels are increasingly applied as a measure for stress in humans and mammals. Cortisol is an endogenous compound and is always present within the hair matrix. Therefore, "cortisol-free hair matrix" is a critical point for any analytical method to accurately quantify especially low cortisol levels. The aim of this project was to modify current methods used for hair cortisol analysis to more accurately determine low endogenous cortisol concentrations in hair. For that purpose, (13)C3-labeled cortisol, which is not naturally present in hair (above 13C natural abundance levels), was used for calibration and comparative validation applying cortisol versus (13)C3-labeled cortisol. Cortisol was extracted from 20mg hair (standard sample amount) applying an optimized single step extraction protocol. An LC-MS/MS method was developed for the quantitative analysis of cortisol using either cortisol or (13)C3-cortisol as calibrators and D7-cortisone as internal standard (IS). The two methods (cortisol/(13)C3-labeled cortisol) were validated in a concentration range up to 500pg/mg and showed good linearity for both analytes (cortisol: R(2)=0.9995; (13)C3-cortisol R(2)=0.9992). Slight differences were observed for limit of detection (LOD) (0.2pg/mg/0.1pg/mg) and limit of quantification (LOQ) (1pg/mg/0.5pg/mg). Precision was good with a maximum deviation of 8.8% and 10% for cortisol and (13)C3-cortisol respectively. Accuracy and matrix effects were good for both analytes except for the quality control (QC) low cortisol. QC low (2.5pg/mg) showed matrix effects (126.5%, RSD 35.5%) and accuracy showed a deviation of 26% when using cortisol to spike. These effects are likely to be caused by the unknown amount of endogenous cortisol in the different hair samples used to determine validation parameters like matrix effect, LOQ and accuracy. No matrix effects were observed for the high QC (400pg/mg) samples. Recovery was good with 92.7%/87.3% (RSD 9.9%/6.2%) for QC low and 102.3%/82.1% (RSD 5.8%/11.4%) for QC high. After successful validation the applicability of the method could be proven. The study shows that the method is especially useful for determining low endogenous cortisol concentrations as they occur in cow hair for example. Copyright © 2016 Elsevier B.V. All rights reserved.
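
    The surrogate-analyte calibration described above amounts to a regression on spiked standards followed by back-calculation of QC samples. A minimal sketch, with hypothetical peak-area ratios and an unweighted fit (the published method may use weighting):

        import numpy as np

        # (13)C3-cortisol spiked into hair matrix at known levels;
        # responses are analyte/IS peak-area ratios (hypothetical).
        conc  = np.array([1, 5, 25, 100, 250, 500])        # pg/mg
        ratio = np.array([0.021, 0.103, 0.512, 2.04, 5.11, 10.2])

        slope, intercept = np.polyfit(conc, ratio, 1)
        pred = slope * conc + intercept
        r2 = 1 - np.sum((ratio - pred) ** 2) / np.sum((ratio - ratio.mean()) ** 2)

        qc_ratio = 8.15                            # measured QC-high response
        qc_conc = (qc_ratio - intercept) / slope   # back-calculated pg/mg
        accuracy = 100 * qc_conc / 400             # vs 400 pg/mg nominal
        print(f"R^2 = {r2:.4f}, QC high = {qc_conc:.1f} pg/mg "
              f"({accuracy:.1f}% of nominal)")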

  8. 20 CFR 602.20 - Organization.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ...' Benefits EMPLOYMENT AND TRAINING ADMINISTRATION, DEPARTMENT OF LABOR QUALITY CONTROL IN THE FEDERAL-STATE... QC unit. The organizational location of this unit shall be positioned to maximize its objectivity, to... organizational conflict of interest. ...

  9. 78 FR 48766 - Petition for Waiver of Compliance

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-09

    ...'s Network Management Center in Montreal, QC, Canada. CP operates approximately six to eight trains a day over this segment. The trackage is operated under a Centralized Traffic Control system and...

  10. Characterizing a Quantum Cascade Tunable Infrared Laser Differential Absorption Spectrometer (QC-TILDAS) for Measurements of Atmospheric Ammonia

    NASA Astrophysics Data System (ADS)

    Ellis, R.; Murphy, J. G.; van Haarlem, R.; Pattey, E.; O'Brien, J.

    2009-05-01

    A compact, fast response Quantum Cascade Tunable Infrared Laser Differential Absorption Spectrometer (QC-TILDAS) for measurements of ammonia has been evaluated under both laboratory and field conditions. Absorption of radiation from a pulsed, thermoelectrically cooled QC laser occurs at reduced pressure in a 76 m path length, 0.5 L volume multiple pass absorption cell. Detection is achieved using a thermoelectrically cooled HgCdTe infrared detector. A novel sampling technique was used, consisting of a short, heated, quartz inlet with a hydrophobic coating to minimize the adsorption of ammonia to surfaces. The inlet contains a critical orifice that reduces the pressure, a virtual impactor for separation of particles and additional ports for delivering ammonia free background air and calibration gas standards. This instrument has been found to have a detection limit of 0.3 ppb with a time resolution of 1 s. The sampling technique has been compared to the results of a conventional lead salt Tunable Diode Laser (TDL) absorption spectrometer during a laboratory intercomparison. Various lengths and types of sample inlet tubing material, heated and unheated, under dry and ambient humidity conditions with ammonia concentrations ranging from 10-1000 ppb were investigated. Preliminary analysis suggests the time response improves with the use of short, PFA tubing sampling lines. No significant improvement was observed when using a heated sampling line, and humidity was seen to play an important role in the bi-exponential decay of ammonia. A field intercomparison of the QC-TILDAS with a modified Thermo 42C chemiluminescence based analyzer was also performed at Environment Canada's Centre for Atmospheric Research Experiments (CARE) in the rural town of Egbert, ON between May and July 2008. Background tests and calibrations using two different permeation tube sources and an ammonia gas cylinder were regularly carried out throughout the study. Results indicate a very good correlation (r^2 > 0.9) between the two instruments at the beginning of the study, when regular background subtraction was applied to the QC-TILDAS.
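
    The bi-exponential inlet time response mentioned above is typically characterized by fitting a two-time-constant decay to a step change in signal. A sketch under that assumption, with entirely synthetic data and time constants:

        import numpy as np
        from scipy.optimize import curve_fit

        def biexp(t, a1, tau1, a2, tau2):
            """Two-component exponential decay."""
            return a1 * np.exp(-t / tau1) + a2 * np.exp(-t / tau2)

        t = np.linspace(0, 60, 121)                 # seconds after step-down
        rng = np.random.default_rng(1)
        obs = biexp(t, 8.0, 1.5, 2.0, 20.0) + rng.normal(0, 0.05, t.size)

        popt, _ = curve_fit(biexp, t, obs, p0=(5, 1, 1, 10))
        print(f"fast tau = {popt[1]:.2f} s, slow tau = {popt[3]:.1f} s")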

  11. 77 FR 11484 - Agency Information Collection Activities: Proposed Collection; Comment Request-Negative QC Review...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-27

    ... Quality Control process for the Supplemental Nutrition Assistance Program and the FNS-248 will be removed... other forms of information technology. Comments may be sent to: Francis B. Heil, Chief, Quality Control... directed to Francis B. Heil, (703) 305-2442. SUPPLEMENTARY INFORMATION: Title: Negative Quality Control...

  12. Discovery of Potent Human Glutaminyl Cyclase Inhibitors as Anti-Alzheimer's Agents Based on Rational Design.

    PubMed

    Hoang, Van-Hai; Tran, Phuong-Thao; Cui, Minghua; Ngo, Van T H; Ann, Jihyae; Park, Jongmi; Lee, Jiyoun; Choi, Kwanghyun; Cho, Hanyang; Kim, Hee; Ha, Hee-Jin; Hong, Hyun-Seok; Choi, Sun; Kim, Young-Ho; Lee, Jeewoo

    2017-03-23

    Glutaminyl cyclase (QC) has been implicated in the formation of toxic amyloid plaques by generating the N-terminal pyroglutamate of β-amyloid peptides (pGlu-Aβ) and thus may participate in the pathogenesis of Alzheimer's disease (AD). We designed a library of QC inhibitors based on the proposed binding mode of the preferred substrate, Aβ3E-42. An in vitro structure-activity relationship study identified several excellent QC inhibitors demonstrating 5- to 40-fold increases in potency compared to a known QC inhibitor. When tested in mouse models of AD, compound 212 significantly reduced the brain concentrations of pyroform Aβ and total Aβ and restored cognitive functions. This potent Aβ-lowering effect was achieved by incorporating an additional binding region into our previously established pharmacophoric model, resulting in strong interactions with the carboxylate group of Glu327 in the QC binding site. Our study offers useful insights in designing novel QC inhibitors as a potential treatment option for AD.

  13. The sensitivity of patient specific IMRT QC to systematic MLC leaf bank offset errors.

    PubMed

    Rangel, Alejandra; Palte, Gesa; Dunscombe, Peter

    2010-07-01

    Patient specific IMRT QC is performed routinely in many clinics as a safeguard against errors and inaccuracies which may be introduced during the complex planning, data transfer, and delivery phases of this type of treatment. The purpose of this work is to evaluate the feasibility of detecting systematic errors in MLC leaf bank position with patient specific checks. Nine head and neck (H&N) and 14 prostate IMRT beams were delivered using MLC files containing systematic offsets (+/- 1 mm in two banks, +/- 0.5 mm in two banks, and 1 mm in one bank of leaves). The beams were measured using both MAPCHECK (Sun Nuclear Corp., Melbourne, FL) and the aS1000 electronic portal imaging device (Varian Medical Systems, Palo Alto, CA). Comparisons with calculated fields, without offsets, were made using commonly adopted criteria including absolute dose (AD) difference, relative dose difference, distance to agreement (DTA), and the gamma index. The criteria most sensitive to systematic leaf bank offsets were the 3% AD, 3 mm DTA for MAPCHECK and the gamma index with 2% AD and 2 mm DTA for the EPID. The criterion based on the relative dose measurements was the least sensitive to MLC offsets. More highly modulated fields, i.e., H&N, showed greater changes in the percentage of passing points due to systematic MLC inaccuracy than prostate fields. None of the techniques or criteria tested was sufficiently sensitive, for the population of IMRT fields studied, to detect a systematic MLC offset at a clinically significant level on an individual field. Patient specific QC cannot, therefore, substitute for routine QC of the MLC itself.
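
    For readers unfamiliar with the gamma index used above, here is a simplified 1-D, global-normalization sketch in Python; the profiles and the 1 mm shift (standing in for an MLC bank offset) are synthetic.

        import numpy as np

        def gamma_1d(ref_dose, meas_dose, positions, dose_tol=0.03, dta_mm=3.0):
            """Simplified 1-D global gamma index. Both dose arrays share
            `positions` (mm); `dose_tol` is a fraction of the reference
            maximum. Returns per-point gamma and the passing rate."""
            norm = dose_tol * ref_dose.max()
            gammas = np.empty_like(meas_dose)
            for i, (x_m, d_m) in enumerate(zip(positions, meas_dose)):
                dd = (d_m - ref_dose) / norm                  # dose differences
                dx = (x_m - positions) / dta_mm               # spatial offsets
                gammas[i] = np.sqrt(dd ** 2 + dx ** 2).min()  # best match anywhere
            return gammas, 100.0 * np.mean(gammas <= 1.0)

        x = np.arange(0, 100, 1.0)
        ref = np.exp(-((x - 50) / 15) ** 2)
        meas = np.exp(-((x - 51) / 15) ** 2)   # 1 mm shift, like a bank offset
        _, passing = gamma_1d(ref, meas, x)
        print(f"gamma passing rate (3%/3mm): {passing:.1f}%")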

  14. Design, implementation, and quality control in the Pathways American-Indian multicenter trial

    PubMed Central

    Stone, Elaine J.; Norman, James E.; Davis, Sally M.; Stewart, Dawn; Clay, Theresa E.; Caballero, Ben; Lohman, Timothy G.; Murray, David M.

    2016-01-01

    Background: Pathways was the first multicenter American-Indian school-based study to test the effectiveness of an obesity prevention program promoting healthy eating and physical activity. Methods: Pathways employed a nested cohort design in which 41 schools were randomized to intervention or control conditions and students within these schools were followed as a cohort (1,704 third graders at baseline). The study’s primary endpoint was percent body fat. Secondary endpoints were levels of fat in school lunches; time spent in physical activity; and knowledge, attitudes, and behaviors regarding diet and exercise. Quality control (QC) included design of data management systems which provided standardization and quality assurance of data collection and processing. Data QC procedures at study centers included manuals of operation, training and certification, and monitoring of performance. Process evaluation was conducted to monitor dose and fidelity of the interventions. Registration and tracking systems were used for students and schools. Results: No difference in mean percent body fat at fifth grade was found between the intervention and control schools. Percent of calories from fat and saturated fat in school lunches was significantly reduced in the intervention schools, as was total energy intake from 24-hour recalls. Significant increases in self-reported physical activity levels and knowledge of healthy behaviors were found for the intervention school students. Conclusions: The Pathways study results provide evidence demonstrating the role schools can play in public health promotion. Its study design and QC systems and procedures provide useful models for other similar school-based multi- or single-site studies. PMID:14636805

  15. Identification of Glutaminyl Cyclase Genes Involved in Pyroglutamate Modification of Fungal Lignocellulolytic Enzymes

    DOE PAGES

    Wu, Vincent W.; Dana, Craig M.; Iavarone, Anthony T.; ...

    2017-01-17

    The breakdown of plant biomass to simple sugars is essential for the production of second-generation biofuels and high-value bioproducts. Currently, enzymes produced from filamentous fungi are used for deconstructing plant cell wall polysaccharides into fermentable sugars for biorefinery applications. A post-translational N-terminal pyroglutamate modification observed in some of these enzymes occurs when N-terminal glutamine or glutamate is cyclized to form a five-membered ring. This modification has been shown to confer resistance to thermal denaturation for CBH-1 and EG-1 cellulases. In mammalian cells, the formation of pyroglutamate is catalyzed by glutaminyl cyclases. Using the model filamentous fungus Neurospora crassa, we identified two genes (qc-1 and qc-2) that encode proteins homologous to mammalian glutaminyl cyclases. We show that qc-1 and qc-2 are essential for catalyzing the formation of an N-terminal pyroglutamate on CBH-1 and GH5-1. CBH-1 and GH5-1 produced in a Δqc-1 Δqc-2 mutant, and thus lacking the N-terminal pyroglutamate modification, showed greater sensitivity to thermal denaturation, and for GH5-1, susceptibility to proteolytic cleavage. QC-1 and QC-2 are endoplasmic reticulum (ER)-localized proteins. The pyroglutamate modification is predicted to occur in a number of additional fungal proteins that have diverse functions. The identification of glutaminyl cyclases in fungi may have implications for production of lignocellulolytic enzymes, heterologous expression, and biotechnological applications revolving around protein stability.

  16. Identification of Glutaminyl Cyclase Genes Involved in Pyroglutamate Modification of Fungal Lignocellulolytic Enzymes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, Vincent W.; Dana, Craig M.; Iavarone, Anthony T.

    The breakdown of plant biomass to simple sugars is essential for the production of second-generation biofuels and high-value bioproducts. Currently, enzymes produced from filamentous fungi are used for deconstructing plant cell wall polysaccharides into fermentable sugars for biorefinery applications. A post-translational N-terminal pyroglutamate modification observed in some of these enzymes occurs when N-terminal glutamine or glutamate is cyclized to form a five-membered ring. This modification has been shown to confer resistance to thermal denaturation for CBH-1 and EG-1 cellulases. In mammalian cells, the formation of pyroglutamate is catalyzed by glutaminyl cyclases. Using the model filamentous fungus Neurospora crassa, we identified two genes (qc-1 and qc-2) that encode proteins homologous to mammalian glutaminyl cyclases. We show that qc-1 and qc-2 are essential for catalyzing the formation of an N-terminal pyroglutamate on CBH-1 and GH5-1. CBH-1 and GH5-1 produced in a Δqc-1 Δqc-2 mutant, and thus lacking the N-terminal pyroglutamate modification, showed greater sensitivity to thermal denaturation, and for GH5-1, susceptibility to proteolytic cleavage. QC-1 and QC-2 are endoplasmic reticulum (ER)-localized proteins. The pyroglutamate modification is predicted to occur in a number of additional fungal proteins that have diverse functions. The identification of glutaminyl cyclases in fungi may have implications for production of lignocellulolytic enzymes, heterologous expression, and biotechnological applications revolving around protein stability.

  17. High performance quantum cascade lasers: Loss, beam stability, and gain engineering

    NASA Astrophysics Data System (ADS)

    Bouzi, Pierre Michel

    Quantum Cascade (QC) lasers are semiconductor devices emitting in the mid-infrared (3-30 micron) and terahertz (30-300 micron) regions of the electromagnetic spectrum. Since their first demonstration by Jerome Faist et al. in 1994, they have evolved very quickly into high performance devices and given rise to many applications such as trace-gas sensing, medical diagnosis, free-space communication, and light detection and ranging (LIDAR). In this thesis, we investigate a further increase of the performance of QC devices and, through meticulous device modeling and characterization, gain a deeper understanding of several of their unique characteristics, especially their carrier transport and lifetime, their characteristic temperature, their waveguide loss and modal gain, their leakage current, and their transverse mode profile. First, in our quest to achieve higher performance, we investigate the effect of growth asymmetries on device transport characteristics. This investigation stems from recent studies on the role of interface roughness in intersubband scattering and device performance. Through a symmetric active core design, we find that interface roughness and ionized impurity scattering induced by dopant migration play a significant role in carrier transport through the device. Building on this understanding of how interface roughness affects intersubband scattering, we engineer the gain in QC devices by placing monolayer barriers at specific locations within the device band structure. These strategically placed additional thin barrier layers introduce roughness scattering into the device active region, thereby selectively decreasing the lower laser state lifetime and increasing the population inversion necessary for laser action. Preliminary measurement results from modified devices reveal a 50% decrease in the emission broadening compared to the control structures, which should lead to a two-fold increase in gain. A special class of so-called "strong coupling" QC lasers recently emerged with high optical power and high efficiency at cryogenic temperatures. However, their performance decays rather rapidly with temperature in both pulsed and continuous wave modes. Through detailed measurements and analysis, we investigate several possible causes of this shortcoming and propose design modifications for temperature performance improvement. While the strong coupling devices are efficient and powerful, their performance often suffers from unintentional and potentially harmful beam steering at high power. Here, we identify the root of this pointing instability as non-linear interactions between multiple transverse modes. To resolve this issue, we employ focused ion beam (FIB) milling to etch small lateral constrictions on top of the devices and fill them with metal. This greatly reduces the intensity of higher order transverse modes as they propagate through the cavity. A good grasp of the microscopic details involved in QC device operation will result in better lasers with high beam quality. This, in turn, will enable new applications, such as the detection of SO2 isotopologues near 7.4 micron, which is of particular importance for the study of ultraviolet photolysis and the sulfur cycle on Venus.

  18. A Comparison of the Performance of Efficient Data Analysis Versus Fine Particle Dose as Metrics for the Quality Control of Aerodynamic Particle Size Distributions of Orally Inhaled Pharmaceuticals.

    PubMed

    Tougas, Terrence P; Goodey, Adrian P; Hardwell, Gareth; Mitchell, Jolyon; Lyapustina, Svetlana

    2017-02-01

    The performance of two quality control (QC) tests for aerodynamic particle size distributions (APSD) of orally inhaled drug products (OIPs) is compared. One of the tests is based on the fine particle dose (FPD) metric currently expected by the European regulators. The other test, called efficient data analysis (EDA), uses the ratio of large particle mass to small particle mass (LPM/SPM), along with impactor sized mass (ISM), to detect changes in APSD for QC purposes. The comparison is based on analysis of APSD data from four products (two different pressurized metered dose inhalers (MDIs) and two dry powder inhalers (DPIs)). It is demonstrated that in each case, EDA is able to detect shifts and abnormalities that FPD misses. The lack of sensitivity on the part of FPD is due to its "aggregate" nature, since FPD is a univariate measure of all particles less than about 5 μm aerodynamic diameter, and shifts or changes within the range encompassed by this metric may go undetected. EDA is thus shown to be superior to FPD for routine control of OIP quality. This finding augments previously reported superiority of EDA compared with impactor stage groupings (favored by US regulators) for incorrect rejections (type I errors) when incorrect acceptances (type II errors) were adjusted to the same probability for both approaches. EDA is therefore proposed as a method of choice for routine quality control of OIPs in both European and US regulatory environments.
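
    The EDA metrics named above reduce to simple sums over impactor stage masses split at a boundary stage. A sketch under that reading, with hypothetical stage names and masses, a simplified ISM taken as the sum of all sized stages, and an arbitrary boundary choice; none of these values come from the paper.

        # Cascade-impactor stage masses (micrograms), coarse to fine.
        stages = {"S0": 12.0, "S1": 18.5, "S2": 22.0, "S3": 15.5,
                  "S4": 9.0,  "S5": 4.5,  "S6": 1.8,  "MOC": 0.7}

        def eda_metrics(masses, split="S3"):
            """LPM/SPM ratio and ISM, splitting large from small
            particle mass at the `split` stage."""
            names = list(masses)
            k = names.index(split)
            lpm = sum(masses[n] for n in names[:k])   # large particle mass
            spm = sum(masses[n] for n in names[k:])   # small particle mass
            return lpm / spm, lpm + spm               # ratio, impactor sized mass

        ratio, ism = eda_metrics(stages)
        print(f"LPM/SPM = {ratio:.2f}, ISM = {ism:.1f} ug")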

  19. 40 CFR 98.44 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Monitoring and QA/QC requirements. 98.44 Section 98.44 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Electricity Generation § 98.44 Monitoring and QA/QC...

  20. 40 CFR 98.44 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 21 2011-07-01 2011-07-01 false Monitoring and QA/QC requirements. 98.44 Section 98.44 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Electricity Generation § 98.44 Monitoring and QA/QC...

  1. 40 CFR 98.44 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 22 2013-07-01 2013-07-01 false Monitoring and QA/QC requirements. 98.44 Section 98.44 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Electricity Generation § 98.44 Monitoring and QA/QC...

  2. 40 CFR 98.44 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 21 2014-07-01 2014-07-01 false Monitoring and QA/QC requirements. 98.44 Section 98.44 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Electricity Generation § 98.44 Monitoring and QA/QC...

  3. 40 CFR 98.44 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 22 2012-07-01 2012-07-01 false Monitoring and QA/QC requirements. 98.44 Section 98.44 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Electricity Generation § 98.44 Monitoring and QA/QC...

  4. 40 CFR 98.64 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Monitoring and QA/QC requirements. 98.64 Section 98.64 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Aluminum Production § 98.64 Monitoring and QA/QC requirements...

  5. 40 CFR 98.64 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 21 2011-07-01 2011-07-01 false Monitoring and QA/QC requirements. 98.64 Section 98.64 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Aluminum Production § 98.64 Monitoring and QA/QC requirements...

  6. 40 CFR 98.84 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 21 2011-07-01 2011-07-01 false Monitoring and QA/QC requirements. 98.84 Section 98.84 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Cement Production § 98.84 Monitoring and QA/QC requirements...

  7. References on EPA Quality Assurance Project Plans

    EPA Pesticide Factsheets

    Provides requirements for the conduct of quality management practices, including quality assurance (QA) and quality control (QC) activities, for all environmental data collection and environmental technology programs performed by or for this Agency.

  8. Revision 2 of the Enbridge Quality Assurance Project Plan

    EPA Pesticide Factsheets

    This Quality Assurance Project Plan (QAPP) presents Revision 2 of the organization, objectives, planned activities, and specific quality assurance/quality control (QA/QC) procedures associated with the Enbridge Marshall Pipeline Release Project.

  9. Evaluation of non-destructive technologies for construction quality control of HMA and PCC pavements in Louisiana : [research project capsule].

    DOT National Transportation Integrated Search

    2009-07-01

    Current roadway quality control and quality acceptance (QC/QA) procedures : for Louisiana include coring for thickness, density, and air void checks in hot : mix asphalt (HMA) pavements and thickness and compressive strength for : Portland cement con...

  10. A Quadrupole Dalton-based multi-attribute method for product characterization, process development, and quality control of therapeutic proteins.

    PubMed

    Xu, Weichen; Jimenez, Rod Brian; Mowery, Rachel; Luo, Haibin; Cao, Mingyan; Agarwal, Nitin; Ramos, Irina; Wang, Xiangyang; Wang, Jihong

    2017-10-01

    During the manufacturing and storage process, therapeutic proteins are subject to various post-translational modifications (PTMs), such as isomerization, deamidation, oxidation, disulfide bond modifications and glycosylation. Certain PTMs may affect bioactivity, stability or the pharmacokinetic and pharmacodynamic profile and are therefore classified as potential critical quality attributes (pCQAs). Identifying, monitoring and controlling these PTMs are usually key elements of the Quality by Design (QbD) approach. Traditionally, multiple analytical methods are utilized for these purposes, which is time-consuming and costly. In recent years, multi-attribute monitoring methods have been developed in the biopharmaceutical industry. However, these methods combine high-end mass spectrometry with complicated data analysis software, which can pose difficulties when implemented in a quality control (QC) environment. Here we report a multi-attribute method (MAM) using a Quadrupole Dalton (QDa) mass detector to selectively monitor and quantitate PTMs in a therapeutic monoclonal antibody. The result output from the QDa-based MAM is straightforward and automatic. Evaluation results indicate this method provides results comparable to the traditional assays. To ensure future application in the QC environment, this method was qualified according to the International Conference on Harmonization (ICH) guideline and applied in the characterization of drug substance and stability samples. The QDa-based MAM is shown to be an extremely useful tool for product and process characterization studies that facilitates understanding of process impact on multiple quality attributes, while being QC friendly and cost-effective.
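
    The attribute-level readout in a multi-attribute method is commonly a relative abundance computed from extracted peak areas of modified and unmodified peptide forms. A minimal sketch of that arithmetic; the peptide sequences and areas below are hypothetical, not from this paper.

        # Extracted peak areas per peptide (arbitrary units).
        peak_areas = {
            "DTLMISR":    {"unmodified": 9.2e6, "oxidized":   3.1e5},
            "NQVSLTCLVK": {"unmodified": 7.8e6, "deamidated": 6.4e5},
        }

        for peptide, areas in peak_areas.items():
            mod_key = next(k for k in areas if k != "unmodified")
            pct = 100 * areas[mod_key] / (areas[mod_key] + areas["unmodified"])
            print(f"{peptide}: {pct:.2f}% {mod_key}")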

  11. 40 CFR 98.224 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... at 40 CFR part 63, appendix A, Measurement of Vapor Phase Organic and Inorganic Emissions by... according to paragraphs (c)(1) or (c)(2) of this section. (1) Direct measurement of production and concentration (such as using flow meters, weigh scales, for production and concentration measurements). (2...

  12. SU-E-T-103: Development and Implementation of Web Based Quality Control Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Studinski, R; Taylor, R; Angers, C

    Purpose: Historically, many radiation medicine programs have maintained their Quality Control (QC) test results in paper records or Microsoft Excel worksheets. Both of these approaches present significant logistical challenges and do not lend themselves to data review and approval. It has been our group's aim to develop and implement web-based software designed not just to record and store QC data in a centralized database, but to provide scheduling and data review tools to help manage a radiation therapy clinic's equipment quality control program. Methods: The software was written in the Python programming language using the Django web framework. In order to promote collaboration and validation from other centres, the code was made open source and is freely available to the public via an online source code repository. The code was written to provide a common user interface for data entry, formalize the review and approval process, and offer automated data trending and process control analysis of test results. Results: As of February 2014, our installation of QATrack+ has 180 tests defined in its database and has collected ∼22 000 test results, all of which have been reviewed and approved by a physicist via QATrack+'s review tools. These results include records for quality control of Elekta accelerators, CT simulators, our brachytherapy programme, TomoTherapy and Cyberknife units. Currently at least 5 other centres are known to be running QATrack+ clinically, forming the start of an international user community. Conclusion: QATrack+ has proven to be an effective tool for collecting radiation therapy QC data, allowing for rapid review and trending of data for a wide variety of treatment units. As free and open source software, all source code, documentation and a bug tracker are available to the public at https://bitbucket.org/tohccmedphys/qatrackplus/.
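
    The process-control analysis mentioned above typically reduces to checking each new QC result against a reference baseline with tolerance and action levels. A minimal Python sketch of that pattern; the thresholds, baseline values, and function are illustrative, not QATrack+'s actual implementation.

        import numpy as np

        def check_result(history, new_value, tol=0.02, act=0.03):
            """Flag a new QC result against a baseline built from
            previously approved results, using fractional tolerance
            and action levels."""
            ref = np.mean(history)
            dev = abs(new_value - ref) / ref
            if dev >= act:
                return "ACTION"
            return "TOLERANCE" if dev >= tol else "OK"

        output_history = [1.002, 0.998, 1.001, 0.999, 1.000]  # normalized output
        for result in (1.004, 1.024, 1.035):
            print(result, "->", check_result(output_history, result))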

  13. Degradation Signals for Ubiquitin-Proteasome Dependent Cytosolic Protein Quality Control (CytoQC) in Yeast

    PubMed Central

    Maurer, Matthew J.; Spear, Eric D.; Yu, Allen T.; Lee, Evan J.; Shahzad, Saba; Michaelis, Susan

    2016-01-01

    Cellular protein quality control (PQC) systems selectively target misfolded or otherwise aberrant proteins for degradation by the ubiquitin-proteasome system (UPS). How cells discern abnormal from normal proteins remains incompletely understood, but involves in part the recognition between ubiquitin E3 ligases and degradation signals (degrons) that are exposed in misfolded proteins. PQC is compartmentalized in the cell, and a great deal has been learned in recent years about ER-associated degradation (ERAD) and nuclear quality control. In contrast, a comprehensive view of cytosolic quality control (CytoQC) has yet to emerge, and will benefit from the development of a well-defined set of model substrates. In this study, we generated an isogenic “degron library” in Saccharomyces cerevisiae consisting of short sequences appended to the C-terminus of a reporter protein, Ura3. About half of these degron-containing proteins are substrates of the integral membrane E3 ligase Doa10, which also plays a pivotal role in ERAD and some nuclear protein degradation. Notably, some of our degron fusion proteins exhibit dependence on the E3 ligase Ltn1/Rkr1 for degradation, apparently by a mechanism distinct from its known role in ribosomal quality control of translationally paused proteins. Ubr1 and San1, E3 ligases involved in the recognition of some misfolded CytoQC substrates, are largely dispensable for the degradation of our degron-containing proteins. Interestingly, the Hsp70/Hsp40 chaperone/cochaperones Ssa1,2 and Ydj1, are required for the degradation of all constructs tested. Taken together, the comprehensive degron library presented here provides an important resource of isogenic substrates for testing candidate PQC components and identifying new ones. PMID:27172186

  14. Influence of chemical and mechanical polishing on water sorption and solubility of denture base acrylic resins.

    PubMed

    Rahal, Juliana Saab; Mesquita, Marcelo Ferraz; Henriques, Guilherme Elias Pessanha; Nóbilo, Mauro Antonio Arruda

    2004-01-01

    The influence of polishing methods on water sorption and solubility of denture base acrylic resins was studied. Eighty samples were divided into groups: Classico (CL) and QC 20 (QC), hot-water-bath cured; Acron MC (AC) and Onda Cryl (ON), microwave cured; and submitted to mechanical polishing (MP), with pumice slurry, chalk powder, soft brush and felt cone in a bench vise, or chemical polishing (CP), in heated monomer fluid in a chemical polisher. The first desiccation process was followed by storage in distilled water at 37 +/- 1 degrees C for 1 h, 1 day, and 1, 2, 3 and 4 weeks. At the end of each period, water sorption was measured. After the fourth week, a second desiccation process was done to calculate solubility. Data were submitted to analysis of variance, followed by Tukey test (p

  15. Quantum correlations in a family of bipartite separable qubit states

    NASA Astrophysics Data System (ADS)

    Xie, Chuanmei; Liu, Yimin; Chen, Jianlan; Zhang, Zhanjun

    2017-03-01

    Quantum correlations (QCs) in some separable states have been proposed as a key resource for certain quantum communication tasks and quantum computational models without entanglement. In this paper, a family of nine-parameter separable states, obtained from arbitrary mixtures of two sets of bi-qubit product pure states, is considered. QCs in these separable states are studied analytically or numerically using four QC quantifiers, i.e., measurement-induced disturbance (Luo in Phys Rev A 77:022301, 2008), ameliorated MID (Girolami et al. in J Phys A Math Theor 44:352002, 2011), quantum dissonance (DN) (Modi et al. in Phys Rev Lett 104:080501, 2010), and new quantum dissonance (Rulli in Phys Rev A 84:042109, 2011), respectively. First, an inherent symmetry in the separable states concerned is revealed: any of the nine-parameter separable states considered in this paper can be transformed to a three-parameter kernel state via a certain local unitary operation. Then, four different QC expressions are concretely derived with the four QC quantifiers. Furthermore, some comparative studies of the QCs are presented, discussed and analyzed, and some of their distinct features are exposed. We find that, under all four QC quantifiers, the more mixed the original two pure product states, the larger the QCs of the resulting separable states. Our results reveal some intrinsic features of QCs in separable systems in quantum information.
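
    For reference, the first of these quantifiers can be stated compactly. To the best of our reading of the cited Luo reference, measurement-induced disturbance is the loss of quantum mutual information under local projective measurements in the eigenbases of the reduced states, with S the von Neumann entropy:

        \mathrm{MID}(\rho) \;=\; \mathcal{I}(\rho) - \mathcal{I}\bigl(\Pi(\rho)\bigr),
        \qquad
        \mathcal{I}(\rho) \;=\; S(\rho^{A}) + S(\rho^{B}) - S(\rho),

        \Pi(\rho) \;=\; \sum_{j,k} \bigl(\Pi_{j}^{A} \otimes \Pi_{k}^{B}\bigr)\,
        \rho\, \bigl(\Pi_{j}^{A} \otimes \Pi_{k}^{B}\bigr),

    where the projectors \Pi_{j}^{A}, \Pi_{k}^{B} are onto the eigenvectors of the marginals \rho^{A}, \rho^{B}.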

  16. 40 CFR 98.424 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 22 2012-07-01 2012-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Suppliers of Carbon Dioxide § 98.424 Monitoring and QA/QC... determine quantity in accordance with this paragraph. (i) Reporters that supply CO2 in containers using...

  17. 40 CFR 98.424 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 22 2013-07-01 2013-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Suppliers of Carbon Dioxide § 98.424 Monitoring and QA/QC... determine quantity in accordance with this paragraph. (i) Reporters that supply CO2 in containers using...

  18. 40 CFR 98.334 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 22 2012-07-01 2012-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Zinc Production § 98.334 Monitoring and QA/QC requirements. If..., belt weigh feeders, weighed purchased quantities in shipments or containers, combination of bulk...

  19. 40 CFR 98.94 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 21 2011-07-01 2011-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Electronics Manufacturing § 98.94 Monitoring and QA/QC requirements. (a) For calendar year 2011 monitoring, you may follow the provisions in paragraphs (a)(1) through...

  20. 40 CFR 98.424 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 21 2011-07-01 2011-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Suppliers of Carbon Dioxide § 98.424 Monitoring and QA/QC... determine quantity in accordance with this paragraph. (i) Reporters that supply CO2 in containers using...

  1. 40 CFR 98.334 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 21 2011-07-01 2011-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Zinc Production § 98.334 Monitoring and QA/QC requirements. If..., belt weigh feeders, weighed purchased quantities in shipments or containers, combination of bulk...

  2. 40 CFR 98.334 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 21 2014-07-01 2014-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Zinc Production § 98.334 Monitoring and QA/QC requirements. If..., belt weigh feeders, weighed purchased quantities in shipments or containers, combination of bulk...

  3. 40 CFR 98.94 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 22 2013-07-01 2013-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Electronics Manufacturing § 98.94 Monitoring and QA/QC requirements. (a) For calendar year 2011 monitoring, you may follow the provisions in paragraphs (a)(1) through...

  4. 40 CFR 98.424 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 21 2014-07-01 2014-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Suppliers of Carbon Dioxide § 98.424 Monitoring and QA/QC... determine quantity in accordance with this paragraph. (i) Reporters that supply CO2 in containers using...

  5. 40 CFR 98.94 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 22 2012-07-01 2012-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Electronics Manufacturing § 98.94 Monitoring and QA/QC requirements. (a) For calendar year 2011 monitoring, you may follow the provisions in paragraphs (a)(1) through...

  6. 40 CFR 98.334 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 22 2013-07-01 2013-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Zinc Production § 98.334 Monitoring and QA/QC requirements. If..., belt weigh feeders, weighed purchased quantities in shipments or containers, combination of bulk...

  7. 40 CFR 98.94 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 21 2014-07-01 2014-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Electronics Manufacturing § 98.94 Monitoring and QA/QC...-specific heel factors for each container type for each gas used, according to the procedures in paragraphs...

  8. Quantum cascade transmitters for ultrasensitive chemical agent and explosives detection

    NASA Astrophysics Data System (ADS)

    Schultz, John F.; Taubman, Matthew S.; Harper, Warren W.; Williams, Richard M.; Myers, Tanya L.; Cannon, Bret D.; Sheen, David M.; Anheier, Norman C., Jr.; Allen, Paul J.; Sundaram, S. K.; Johnson, Bradley R.; Aker, Pamela M.; Wu, Ming C.; Lau, Erwin K.

    2003-07-01

    The small size, high power, promise of access to any wavelength between 3.5 and 16 microns, substantial tuning range about a chosen center wavelength, and general robustness of quantum cascade (QC) lasers provide opportunities for new approaches to ultra-sensitive chemical detection and other applications in the mid-wave infrared. PNNL is developing novel remote and sampling chemical sensing systems based on QC lasers, using QC lasers loaned by Lucent Technologies. In recent months laboratory cavity-enhanced sensing experiments have achieved absorption sensitivities of 8.5 x 10^-11 cm^-1 Hz^-1/2, and the PNNL team has begun monostatic and bi-static frequency modulated, differential absorption lidar (FM DIAL) experiments at ranges of up to 2.5 kilometers. In related work, PNNL and UCLA are developing miniature QC laser transmitters with the multiplexed tunable wavelengths, frequency and amplitude stability, modulation characteristics, and power levels needed for chemical sensing and other applications. Current miniaturization concepts envision coupling QC oscillators, QC amplifiers, frequency references, and detectors with miniature waveguides and waveguide-based modulators, isolators, and other devices formed from chalcogenide or other types of glass. Significant progress has been made on QC laser stabilization and amplification, and on development and characterization of high-purity chalcogenide glasses, waveguide writing techniques, and waveguide metrology.

  9. SU-F-T-567: Sensitivity and Reproducibility of the Portal Imaging Panel for Routine FFF QC and Patient Plan Dose Measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Willett, A; Gilmore, M; Rowbottom, C

    2016-06-15

    Purpose: The purpose of this work was to see if the EPID is a viable alternative to other QA devices for routine FFF QA and plan dose measurements. Methods: Sensitivity measurements were made to assess response to small changes in field size and beam steering. QA plans were created where field size was varied from baseline values (5–5.5cm, 20–20.5cm). Beam steering was adjusted by altering values in service mode (Symmetry 0–3%). Plans were measured using the Varian portal imager (aS1200 DMI panel), QA3 (Sun Nuclear), and Starcheck Maxi (PTW). FFF beam parameters as stated in Fogliata et al were calculated. Constancy measurements were taken using all 3 QC devices to measure a MLC defined 20×20cm field. Two clinical SABR patient plans were measured on a Varian Edge linac, using the Portal Dosimetry module in ARIA, and results compared with analysis made using Delta4 (ScandiDos). Results: The EPID and the Starcheck performed better at detecting clinically relevant changes in field size, with the QA3 performing better when detecting similar changes in beam symmetry. Consistency measurements with the EPID and Starcheck were equivalent, with comparable standard deviations. Clinical plan measurements on the EPID compared well with Delta4 results at 3%/1mm. Conclusion: Our results show that for FFF QA measurements such as field size and symmetry, using the EPID is a viable alternative to other QA devices. The EPID could potentially be used for QC measurements with a focus on geometric accuracy, such as MLC positional QA, due to its high resolution compared to other QA devices (EPID 0.34mm, Starcheck 3mm, QA3 5mm). Good agreement between Delta4 and portal dosimetry also indicated the EPID may be a suitable alternative for measurement of clinical plans.

  10. 40 CFR 98.454 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... using measurements and/or engineering assessments or calculations based on chemical engineering principles or physical or chemical laws or properties. Such assessments or calculations may be based on, as...

  11. 40 CFR 98.454 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... using measurements and/or engineering assessments or calculations based on chemical engineering principles or physical or chemical laws or properties. Such assessments or calculations may be based on, as...

  12. 40 CFR 98.454 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... using measurements and/or engineering assessments or calculations based on chemical engineering principles or physical or chemical laws or properties. Such assessments or calculations may be based on, as...

  13. 40 CFR 98.454 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... using measurements and/or engineering assessments or calculations based on chemical engineering principles or physical or chemical laws or properties. Such assessments or calculations may be based on, as...

  14. A surrogate analyte method to determine D-serine in mouse brain using liquid chromatography-tandem mass spectrometry.

    PubMed

    Kinoshita, Kohnosuke; Jingu, Shigeji; Yamaguchi, Jun-ichi

    2013-01-15

    A bioanalytical method for determining endogenous D-serine levels in the mouse brain using a surrogate analyte and liquid chromatography-tandem mass spectrometry (LC-MS/MS) was developed. [2,3,3-²H]D-serine and [¹⁵N]D-serine were used as a surrogate analyte and an internal standard, respectively. The surrogate analyte was spiked into brain homogenate to yield calibration standards and quality control (QC) samples. Both endogenous and surrogate analytes were extracted using protein precipitation followed by solid phase extraction. Enantiomeric separation was achieved on a chiral crown ether column with an analysis time of only 6 min without any derivatization. The column eluent was introduced into an electrospray interface of a triple-quadrupole mass spectrometer. The calibration range was 1.00 to 300 nmol/g, and the method showed acceptable accuracy and precision at all QC concentration levels from a validation point of view. In addition, the brain D-serine levels of normal mice determined using this method were the same as those obtained by a standard addition method, which is time-consuming but is often used for the accurate measurement of endogenous substances. Thus, this surrogate analyte method should be applicable to the measurement of D-serine levels as a potential biomarker for monitoring certain effects of drug candidates on the central nervous system. Copyright © 2012 Elsevier Inc. All rights reserved.
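
    A minimal sketch of the back-calculation such a surrogate-analyte assay implies, assuming a weighted linear fit of surrogate/IS peak-area ratios against spiked concentration; all numeric responses below are hypothetical, not values from the paper:

    ```python
    import numpy as np

    # Hypothetical calibration: surrogate [2H]D-serine spiked into brain
    # homogenate at known levels (nmol/g), with [15N]D-serine as internal
    # standard. Response = peak_area(analyte) / peak_area(IS).
    cal_conc  = np.array([1.0, 3.0, 10.0, 30.0, 100.0, 300.0])   # nmol/g
    cal_ratio = np.array([0.011, 0.032, 0.105, 0.310, 1.04, 3.09])

    # 1/x-weighted least-squares line, as is common for bioanalytical assays
    slope, intercept = np.polyfit(cal_conc, cal_ratio, 1,
                                  w=np.sqrt(1.0 / cal_conc))

    def back_calc(ratio):
        """Convert a measured analyte/IS peak-area ratio to nmol/g."""
        return (ratio - intercept) / slope

    # Endogenous D-serine in an unknown sample, read off the surrogate curve
    print(back_calc(0.52))  # ~50 nmol/g for these made-up responses
    ```

    Because the surrogate differs from the endogenous analyte only by the isotope label, the same curve can be read against endogenous response ratios.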

  15. A metadata reporting framework for standardization and synthesis of ecohydrological field observations

    NASA Astrophysics Data System (ADS)

    Christianson, D. S.; Varadharajan, C.; Detto, M.; Faybishenko, B.; Gimenez, B.; Jardine, K.; Negron Juarez, R. I.; Pastorello, G.; Powell, T.; Warren, J.; Wolfe, B.; McDowell, N. G.; Kueppers, L. M.; Chambers, J.; Agarwal, D.

    2016-12-01

    The U.S. Department of Energy's (DOE) Next Generation Ecosystem Experiment (NGEE) Tropics project aims to develop a process-rich tropical forest ecosystem model that is parameterized and benchmarked by field observations. Thus, data synthesis, quality assurance and quality control (QA/QC), and data product generation of a diverse and complex set of ecohydrological observations, including sapflux, leaf surface temperature, soil water content, and leaf gas exchange from sites across the Tropics, are required to support model simulations. We have developed a metadata reporting framework, implemented in conjunction with the NGEE Tropics Data Archive tool, to enable cross-site and cross-method comparison, data interpretability, and QA/QC. We employed a modified User-Centered Design approach, which involved short development cycles based on user-identified needs, and iterative testing with data providers and users. The metadata reporting framework currently has been implemented for sensor-based observations and leverages several existing metadata protocols. The framework consists of templates that define a multi-scale measurement position hierarchy, descriptions of measurement settings, and details about data collection and data file organization. The framework also enables data providers to define data-access permission settings, provenance, and referencing to enable appropriate data usage, citation, and attribution. In addition to describing the metadata reporting framework, we discuss tradeoffs and impressions from both data providers and users during the development process, focusing on the scalability, usability, and efficiency of the framework.
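
    To make the template idea concrete, here is a hypothetical metadata record for a single sensor; the field names are invented for illustration and are not the actual NGEE Tropics schema:

    ```python
    # Hypothetical metadata record for one sap-flux sensor, illustrating the
    # multi-scale position hierarchy, measurement settings, data-file
    # description, access permissions, and provenance the framework describes.
    sensor_metadata = {
        "measurement": "sapflux",
        "position": {                       # multi-scale position hierarchy
            "site": "example-site",
            "plot": "plot-01",
            "tree": "tree-042",
            "sensor_height_m": 1.3,
        },
        "settings": {"sampling_interval_s": 60, "units": "g m-2 s-1"},
        "data_files": {"format": "csv", "timezone": "UTC"},
        "access": {"permission": "project-only", "embargo_until": "2017-06-01"},
        "provenance": {"provider": "example_provider", "instrument": "TDP-30"},
    }
    ```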

  16. Effect of quadriceps and hamstrings muscle cooling on standing balance in healthy young men.

    PubMed

    Alghadir, A H; Anwer, S; Zafar, H; Al-Eisa, E S

    2017-09-01

    The present study compared the effect of quadriceps and hamstring muscle cooling on standing balance in healthy young men. Thirty healthy young men (18-30 years) participated in the study. The participants were randomly assigned to three groups (n=10 each): quadriceps cooling (QC), hamstring cooling (HC), or control group (no cooling). Participants in the QC and HC groups received 20 minutes of cooling using a cold pack (gel pack), placed on the anterior thigh (from the apex of the patella to the mid-thigh) and the posterior thigh (from the base of the popliteal fossa to the mid-thigh), respectively. Balance score including unilateral stance was measured at baseline and immediately after the application of the cold pack. No significant difference in the balance score was noted in any group after the application of the cold pack (p > 0.05). Similarly, no significant differences in post-test balance score were noted among the three groups (p > 0.05). Cooling of the quadriceps and hamstring muscles has no immediate effect on standing balance in healthy young men. However, longitudinal studies are warranted to investigate the long-term effects of cooling these muscles on standing balance.

  17. Quality Assurance and Quality Control Practices for Rehabilitation of Sewer and Water Mains

    EPA Science Inventory

    As part of the US Environmental Protection Agency (EPA)’s Aging Water Infrastructure Research Program, several areas of research are being pursued, including a review of quality assurance and quality control (QA/QC) practices and acceptance testing during the installation of reha...

  18. Development of models to estimate the subgrade and subbase layers' resilient modulus from in situ devices test results for construction control.

    DOT National Transportation Integrated Search

    2008-04-01

    The objective of this study was to develop resilient modulus prediction models for possible application in the quality control/quality assurance (QC/QA) procedures during and after the construction of pavement layers. Field and laboratory testing pro...

  19. Quality Assurance and Quality Control Practices For Rehabilitation of Sewer and Water Mains

    EPA Science Inventory

    As part of the US Environmental Protection Agency (EPA)’s Aging Water Infrastructure Research Program, several areas of research are being pursued including a review of quality assurance and quality control (QA/QC) practices and acceptance testing during the installation of rehab...

  20. Quality control ranges for testing broth microdilution susceptibility of Flavobacterium columnare and F. psychrophilum to nine antimicrobials

    USDA-ARS?s Scientific Manuscript database

    A multi-laboratory broth microdilution method trial was performed to standardize the specialized test conditions required for the fish pathogens Flavobacterium columnare and F. psychrophilum. Nine laboratories tested the quality control (QC) strains Escherichia coli ATCC 25922 and Aeromonas salmonicid...

  1. 7 CFR 283.2 - Scope and applicability.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... agencies of Food and Nutrition Service quality control (QC) claims for Fiscal Year (“FY”) 1986 and... Agriculture Regulations of the Department of Agriculture (Continued) FOOD AND NUTRITION SERVICE, DEPARTMENT OF AGRICULTURE FOOD STAMP AND FOOD DISTRIBUTION PROGRAM APPEALS OF QUALITY CONTROL (“QC”) CLAIMS General § 283.2...

  2. From quantum cascade to super cascade laser a new laser design paradigm for broad spectral emission & a re-examination of current spreading

    NASA Astrophysics Data System (ADS)

    Le, Loan T.

    Over the span of more than 20 years of development, the Quantum Cascade (QC) laser has positioned itself as the most viable mid-infrared (mid-IR) light source. Today's QC lasers emit watts of continuous wave power at room temperature. Despite significant progress, the mid-IR region remains vastly under-utilized. State-of-the-art QC lasers are found in high power defense applications and detection of trace gases with narrow absorption lines. A large number of applications, however, do not require so much power, but rather, a broadly tunable laser source to detect molecules with broad absorption features. As such, a QC laser that is broadly tunable over the entire biochemical fingerprinting region remains the missing link to markets such as non-invasive biomedical diagnostics, food safety, and stand-off detection in turbid media. In this thesis, we detail how we utilized the inherent flexibility of the QC design space to conceive a new type of laser with the potential to bridge that missing link of the QC laser to large commercial markets. Our design concept, the Super Cascade (SC) laser, works contrary to conventional laser design principle by supporting multiple independent optical transitions, each contributing to broadening the gain spectrum. We have demonstrated a room temperature laser gain medium with electroluminescence spanning 3.3-12.5 μm and laser emission from 6.2-12.5 μm, the record spectral width for any solid state laser gain medium. This gain bandwidth covers the entire biochemical fingerprinting region. The achievement of such a spectrally broad gain medium presents engineering challenges of how to optimally utilize the bandwidth. As of this work, a monolithically integrated array of Distributed Feedback QC (DFB-QC) lasers is one of the most promising ways to fully utilize the SC gain bandwidth. Therefore, in this thesis, we explore ways of improving the yield and ease of fabrication of DFB-QC lasers, including a re-examination of the role of current spreading in QC geometry.

  3. WIM data analyst's manual

    DOT National Transportation Integrated Search

    2010-06-01

    This manual provides information and recommended procedures to be utilized by an agency's Weigh-in-Motion (WIM) Office Data Analyst to perform validation and quality control (QC) checks of WIM traffic data. This manual focuses on data generated by ...

  4. Analysis of QA procedures at the Oregon Department of Transportation.

    DOT National Transportation Integrated Search

    2010-06-01

    This research explored the Oregon Department of Transportation (ODOT) practice of Independent Assurance (IA), for validation of the contractor's test methods, and Verification, for validation of the contractor's Quality Control (QC) data. The...

  5. Final Project Report - ARM CLASIC CIRPAS Twin Otter Aerosol

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    John A. Ogren

    2010-04-05

    The NOAA/ESRL/GMD aerosol group made three types of contributions related to airborne measurements of aerosol light scattering and absorption for the Cloud and Land Surface Interaction Campaign (CLASIC) in June 2007 on the Twin Otter research airplane operated by the Center for Interdisciplinary Remotely-Piloted Aircraft Studies (CIRPAS). GMD scientists served as the instrument mentor for the integrating nephelometer and particle soot absorption photometer (PSAP) on the Twin Otter during CLASIC, and were responsible for (1) instrument checks/comparisons; (2) instrument troubleshooting/repair; and (3) data quality control (QC) and submittal to the archive.

  6. 40 CFR 98.474 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 22 2013-07-01 2013-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Injection of Carbon Dioxide § 98.474 Monitoring and QA/QC.... (2) You must determine the quarterly mass or volume of contents in all containers if you receive CO2...

  7. 40 CFR 98.474 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 21 2014-07-01 2014-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Injection of Carbon Dioxide § 98.474 Monitoring and QA/QC.... (2) You must determine the quarterly mass or volume of contents in all containers if you receive CO2...

  8. 40 CFR 98.474 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 22 2012-07-01 2012-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Injection of Carbon Dioxide § 98.474 Monitoring and QA/QC.... (2) You must determine the quarterly mass or volume of contents in all containers if you receive CO2...

  9. Study of quantum correlation swapping with relative entropy methods

    NASA Astrophysics Data System (ADS)

    Xie, Chuanmei; Liu, Yimin; Chen, Jianlan; Zhang, Zhanjun

    2016-02-01

    To generate long-distance shared quantum correlations (QCs) for information processing in future quantum networks, we recently proposed the concept of the QC repeater and its kernel technique, named QC swapping. We also extensively studied the QC swapping between two simple QC resources (i.e., a pair of Werner states) with four different methods to quantify QCs (Xie et al. in Quantum Inf Process 14:653-679, 2015). In this paper, we continue to treat the same issue by employing three other methods associated with relative entropies, i.e., the MPSVW method (Modi et al. in Phys Rev Lett 104:080501, 2010), the Zhang method (arXiv:1011.4333 [quant-ph]) and the RS method (Rulli and Sarandy in Phys Rev A 84:042109, 2011). We first derive analytic expressions of all QCs which occur during the swapping process and then reveal their monotonicity and threshold properties. Importantly, we find that a long-distance shared QC can indeed be generated from two short-distance ones via QC swapping. In addition, we briefly compare our present results with our previous ones.
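
    A generic relative-entropy QC measure, as background for the three methods named above (standard definitions, not expressions from the cited papers):

    ```latex
    S(\rho \,\|\, \sigma) = \operatorname{Tr}(\rho \log \rho) - \operatorname{Tr}(\rho \log \sigma),
    \qquad
    Q(\rho) = \min_{\chi \in \mathcal{C}} S(\rho \,\|\, \chi)
    ```

    Here \mathcal{C} is a chosen set of classically correlated (zero-QC) states; the MPSVW, Zhang and RS methods differ essentially in how that set is specified.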

  10. Quantum Testbeds Stakeholder Workshop (QTSW) Report meeting purpose and agenda.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hebner, Gregory A.

    Quantum computing (QC) is a promising early-stage technology with the potential to provide scientific computing capabilities far beyond what is possible with even an Exascale computer in specific problems of relevance to the Office of Science. These include (but are not limited to) materials modeling, molecular dynamics, and quantum chromodynamics. However, commercial QC systems are not yet available and the technical maturity of current QC hardware, software, algorithms, and systems integration is woefully incomplete. Thus, there is a significant opportunity for DOE to define the technology building blocks and solve the system integration issues to enable a revolutionary tool. Once realized, QC will have a world-changing impact on economic competitiveness, the scientific enterprise, and citizen well-being. Prior to this workshop, DOE / Office of Advanced Scientific Computing Research (ASCR) hosted a workshop in 2015 to explore QC scientific applications. The goal of that workshop was to assess the viability of QC technologies to meet the computational requirements in support of DOE’s science and energy mission and to identify the potential impact of these technologies.

  11. Lapse time and frequency-dependent coda wave attenuation for Delhi and its surrounding regions

    NASA Astrophysics Data System (ADS)

    Das, Rabin; Mukhopadhyay, Sagarika; Singh, Ravi Kant; Baidya, Pushap R.

    2018-07-01

    Attenuation of seismic wave energy of Delhi and its surrounding regions has been estimated using coda of local earthquakes. Estimated quality factor (Qc) values are strongly dependent on frequency and lapse time. Frequency dependence of Qc has been estimated from the relationship Qc(f) = Q0 f^n for different lapse time window lengths. Q0 and n values vary from 73 to 453 and 0.97 to 0.63 for lapse time window lengths of 15 s to 90 s respectively. The average estimated frequency-dependent relation for the entire region is Qc(f) = (135 ± 8) f^(0.96 ± 0.02) for a window length of 30 s, where the average Qc value varies from 200 at 1.5 Hz to 1962 at 16 Hz. These values show that the region is seismically active and highly heterogeneous. The entire study region is divided into two sub-regions according to the geology of the area to investigate if there is a spatial variation in attenuation characteristics in this region. It is observed that at smaller lapse time both regions have similar Qc values. However, at larger lapse times the rate of increase of Qc with frequency is larger for Region 2 compared to Region 1. This is understandable, as it is closer to the tectonically more active Himalayan ranges and seismically more active compared to Region 1. The difference in variation of Qc with frequencies for the two regions is such that at larger lapse time and higher frequencies Region 2 shows higher Qc compared to Region 1. For lower frequencies the opposite situation is true. This indicates that there is a systematic variation in attenuation characteristics from the south (Region 1) to the north (Region 2) in the deeper part of the study area. This variation can be explained in terms of an increase in heat flow and a decrease in the age of the rocks from south to north.
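
    A quick check of the quoted numbers against the fitted relation (a sketch; the ±8 and ±0.02 uncertainties are ignored here):

    ```python
    # Evaluate the reported frequency dependence Qc(f) = Q0 * f**n for the
    # 30 s lapse-time window (Q0 = 135, n = 0.96) to verify the quoted range.
    Q0, n = 135.0, 0.96

    def Qc(f_hz):
        return Q0 * f_hz**n

    print(round(Qc(1.5)))   # ~199, matching the quoted ~200 at 1.5 Hz
    print(round(Qc(16.0)))  # ~1933, close to the quoted 1962 at 16 Hz
    ```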

  12. Summation rules for a fully nonlocal energy-based quasicontinuum method

    NASA Astrophysics Data System (ADS)

    Amelang, J. S.; Venturini, G. N.; Kochmann, D. M.

    2015-09-01

    The quasicontinuum (QC) method coarse-grains crystalline atomic ensembles in order to bridge the scales from individual atoms to the micro- and mesoscales. A crucial cornerstone of all QC techniques, summation or quadrature rules efficiently approximate the thermodynamic quantities of interest. Here, we investigate summation rules for a fully nonlocal, energy-based QC method to approximate the total Hamiltonian of a crystalline atomic ensemble by a weighted sum over a small subset of all atoms in the crystal lattice. Our formulation does not conceptually differentiate between atomistic and coarse-grained regions and thus allows for seamless bridging without domain-coupling interfaces. We review traditional summation rules and discuss their strengths and weaknesses with a focus on energy approximation errors and spurious force artifacts. Moreover, we introduce summation rules which produce no residual or spurious force artifacts in centrosymmetric crystals in the large-element limit under arbitrary affine deformations in two dimensions (and marginal force artifacts in three dimensions), while allowing us to seamlessly bridge to full atomistics. Through a comprehensive suite of examples with spatially non-uniform QC discretizations in two and three dimensions, we compare the accuracy of the new scheme to various previous ones. Our results confirm that the new summation rules exhibit significantly smaller force artifacts and energy approximation errors. Our numerical benchmark examples include the calculation of elastic constants from completely random QC meshes and the inhomogeneous deformation of aggressively coarse-grained crystals containing nano-voids. In the elastic regime, we directly compare QC results to those of full atomistics to assess global and local errors in complex QC simulations. Going beyond elasticity, we illustrate the performance of the energy-based QC method with the new second-order summation rule with the help of nanoindentation examples with automatic mesh adaptation. Overall, our findings provide guidelines for the selection of summation rules for the fully nonlocal energy-based QC method.
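
    For reference, the generic shape of an energy-based summation rule (notation introduced here, not the paper's):

    ```latex
    E_{\mathrm{tot}} = \sum_{i=1}^{N} E_i \;\approx\; \sum_{a \in \mathcal{S}} w_a E_a,
    \qquad
    \sum_{a \in \mathcal{S}} w_a = N
    ```

    The small subset \mathcal{S} of sampling atoms and the choice of weights w_a are what distinguish one summation rule from another; the constraint on the weights recovers the exact energy for a homogeneously deformed crystal.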

  13. Countably QC-Approximating Posets

    PubMed Central

    Mao, Xuxin; Xu, Luoshan

    2014-01-01

    As a generalization of countably C-approximating posets, the concept of countably QC-approximating posets is introduced. With the countably QC-approximating property, some characterizations of generalized completely distributive lattices and generalized countably approximating posets are given. The main results are as follows: (1) a complete lattice is generalized completely distributive if and only if it is countably QC-approximating and weakly generalized countably approximating; (2) a poset L having countably directed joins is generalized countably approximating if and only if the lattice σ_c(L)^op of all σ-Scott-closed subsets of L is weakly generalized countably approximating. PMID:25165730

  14. The role of metadata and strategies to detect and control temporal data bias in environmental monitoring of soil contamination.

    PubMed

    Desaules, André

    2012-11-01

    It is crucial for environmental monitoring to fully control temporal bias, which is the distortion of real data evolution by varying bias through time. Temporal bias cannot be fully controlled by statistics alone but requires appropriate and sufficient metadata, which should be under rigorous and continuous quality assurance and control (QA/QC) to reliably document the degree of consistency of the monitoring system. All presented strategies to detect and control temporal data bias (QA/QC, harmonisation/homogenisation/standardisation, the mass balance approach, the use of tracers and analogues, and the control of changing boundary conditions) rely on metadata. The Will Rogers phenomenon, due to subsequent reclassification, is introduced here as a particular source of temporal data bias in environmental monitoring. Sources and effects of temporal data bias are illustrated by examples from the Swiss soil monitoring network. The attempt to make a comprehensive compilation and assessment of required metadata for soil contamination monitoring reveals that most metadata are still far from being reliable. This leads to the conclusion that progress in environmental monitoring means further development of the concept of environmental metadata for the sake of temporal data bias control as a prerequisite for reliable interpretations and decisions.

  15. 40 CFR 98.184 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... calendar year. The monthly mass may be determined using plant instruments used for accounting purposes, including either direct measurement of the quantity of the material placed in the unit or by calculations...

  16. 40 CFR 98.114 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... calendar year. The monthly mass may be determined using plant instruments used for accounting purposes, including either direct measurement of the quantity of the material placed in the unit or by calculations...

  17. 40 CFR 98.474 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... American National Standards Institute (ANSI), the American Gas Association (AGA), the American Society of... Standards and Technology (NIST) traceable. (c) General. (1) If you measure the concentration of any CO2...

  18. 40 CFR 98.294 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... designed to measure the total alkalinity in soda ash not in trona. The modified method referred to above... requirements. Section 98.293 provides three different procedures for emission calculations. The appropriate...

  19. 40 CFR 98.294 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... designed to measure the total alkalinity in soda ash not in trona. The modified method referred to above... requirements. Section 98.293 provides three different procedures for emission calculations. The appropriate...

  20. 40 CFR 98.294 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... designed to measure the total alkalinity in soda ash not in trona. The modified method referred to above... requirements. Section 98.293 provides three different procedures for emission calculations. The appropriate...

  1. 40 CFR 98.294 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... designed to measure the total alkalinity in soda ash not in trona. The modified method referred to above... requirements. Section 98.293 provides three different procedures for emission calculations. The appropriate...

  2. Valid internal standard technique for arson detection based on gas chromatography-mass spectrometry.

    PubMed

    Salgueiro, Pedro A S; Borges, Carlos M F; Bettencourt da Silva, Ricardo J N

    2012-09-28

    The most popular procedures for the detection of residues of accelerants in fire debris are the ones published by the American Society for Testing and Materials (ASTM E1412-07 and E1618-10). The most critical stages of these tests are the conservation of fire debris from the sampling to the laboratory, the extraction of residues of accelerants from the debris to the activated charcoal strips (ACS) and from those to the final solvent, as well as the analysis of the sample extract by gas chromatography-mass spectrometry (GC-MS) and the interpretation of the instrumental signal. This work proposes a strategy for checking the quality of the sample conservation, the transfer of accelerant residues to the final solvent, and the GC-MS analysis, using internal standard additions. Internal standards are used ranging from a highly volatile compound for checking debris conservation to a low-volatility compound for checking GC-MS repeatability. The developed quality control (QC) parameters are not affected by GC-MS sensitivity variation and, specifically, the GC-MS performance control is not affected by ACS adsorption saturation that may mask test performance deviations. The proposed QC procedure proved to be adequate to check GC-MS repeatability, ACS extraction and sample conservation since: (1) standard additions are affected by negligible uncertainty and (2) the observed dispersion of QC parameters is fit for its intended use. Copyright © 2012 Elsevier B.V. All rights reserved.
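
    The reason a ratio-based QC parameter is insensitive to GC-MS sensitivity variation can be shown in a few lines (hypothetical peak areas): any run-to-run gain factor k cancels in the analyte-to-internal-standard ratio.

    ```python
    # If the instrument response scales by an unknown factor k on a given
    # run, both the analyte and internal-standard (IS) peak areas scale by
    # k, so their ratio is unchanged. Values are hypothetical.
    def response_ratio(analyte_area, is_area):
        return analyte_area / is_area

    k = 0.8  # e.g. 20% sensitivity loss on this run
    print(response_ratio(1.2e6, 4.0e5))          # nominal run -> 3.0
    print(response_ratio(k * 1.2e6, k * 4.0e5))  # degraded run -> still 3.0
    ```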

  3. A method to establish seismic noise baselines for automated station assessment

    USGS Publications Warehouse

    McNamara, D.E.; Hutt, C.R.; Gee, L.S.; Benz, H.M.; Buland, R.P.

    2009-01-01

    We present a method for quantifying station noise baselines and characterizing the spectral shape of out-of-nominal noise sources. Our intent is to automate this method in order to ensure that only the highest-quality data are used in rapid earthquake products at NEIC. In addition, the station noise baselines provide a valuable tool to support the quality control of GSN and ANSS backbone data and metadata. The procedures addressed here are currently in development at the NEIC, and work is underway to understand how quickly changes from nominal can be observed and used within the NEIC processing framework. The spectral methods and software used to compute station baselines and described herein (PQLX) can be useful to both permanent and portable seismic station operators. Applications include: general seismic station and data quality control (QC), evaluation of instrument responses, assessment of near real-time communication system performance, characterization of site cultural noise conditions, and evaluation of sensor vault design, as well as assessment of gross network capabilities (McNamara et al. 2005). Future PQLX development plans include incorporating station baselines for automated QC methods and automating station status report generation and notification based on user-defined QC parameters. The PQLX software is available through the USGS (http://earthquake.usgs.gov/research/software/pqlx.php) and IRIS (http://www.iris.edu/software/pqlx/).
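
    A minimal sketch of percentile-based noise baselines in the spirit of the method (random stand-in data, not the PQLX implementation):

    ```python
    import numpy as np

    # From many power spectral density (PSD) estimates, take low/high
    # percentiles per frequency bin to bound "nominal" station noise.
    rng = np.random.default_rng(0)
    n_segments, n_freqs = 500, 128
    psd_db = -140 + 10 * rng.standard_normal((n_segments, n_freqs))

    baseline_lo = np.percentile(psd_db, 5, axis=0)    # 5th percentile per bin
    baseline_hi = np.percentile(psd_db, 95, axis=0)   # 95th percentile per bin

    # Flag the bins of one PSD estimate that fall outside the envelope
    new_psd = psd_db[0]
    out_of_nominal = (new_psd < baseline_lo) | (new_psd > baseline_hi)
    print(out_of_nominal.sum(), "of", n_freqs, "frequency bins out of nominal")
    ```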

  4. Characterising and correcting batch variation in an automated direct infusion mass spectrometry (DIMS) metabolomics workflow.

    PubMed

    Kirwan, J A; Broadhurst, D I; Davidson, R L; Viant, M R

    2013-06-01

    Direct infusion mass spectrometry (DIMS)-based untargeted metabolomics measures many hundreds of metabolites in a single experiment. While every effort is made to reduce within-experiment analytical variation in untargeted metabolomics, unavoidable sources of measurement error are introduced. This is particularly true for large-scale multi-batch experiments, necessitating the development of robust workflows that minimise batch-to-batch variation. Here, we conducted a purpose-designed, eight-batch DIMS metabolomics study using nanoelectrospray (nESI) Fourier transform ion cyclotron resonance mass spectrometric analyses of mammalian heart extracts. First, we characterised the intrinsic analytical variation of this approach to determine whether our existing workflows are fit for purpose when applied to a multi-batch investigation. Batch-to-batch variation was readily observed across the 7-day experiment, both in terms of its absolute measurement using quality control (QC) and biological replicate samples, as well as its adverse impact on our ability to discover significant metabolic information within the data. Subsequently, we developed and implemented a computational workflow that includes total-ion-current filtering, QC-robust spline batch correction and spectral cleaning, and provide conclusive evidence that this workflow reduces analytical variation and increases the proportion of significant peaks. We report an overall analytical precision of 15.9%, measured as the median relative standard deviation (RSD) for the technical replicates of the biological samples, across eight batches and 7 days of measurements. When compared against the FDA guidelines for biomarker studies, which specify an RSD of <20% as an acceptable level of precision, we conclude that our new workflows are fit for purpose for large-scale, high-throughput nESI DIMS metabolomics studies.
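
    A condensed sketch of two steps of such a workflow, QC-based spline batch correction and the median-RSD precision metric, on synthetic data (not the authors' code):

    ```python
    import numpy as np
    from scipy.interpolate import UnivariateSpline

    # (1) Fit a smooth trend over injection order through the QC samples
    #     and divide it out of every peak; (2) report precision as the
    #     median relative standard deviation (RSD) over peaks, computed on
    #     the remaining technical replicates.
    rng = np.random.default_rng(1)
    order = np.arange(100.0)                   # injection order
    drift = 1.0 + 0.003 * order                # slow instrument drift
    x = rng.lognormal(3.0, 0.1, (100, 50)) * drift[:, None]   # 50 peaks
    is_qc = np.arange(100) % 10 == 0           # every 10th injection is a QC

    for j in range(x.shape[1]):                # correct each peak separately
        trend = UnivariateSpline(order[is_qc], x[is_qc, j], k=3)(order)
        x[:, j] *= np.median(x[is_qc, j]) / trend

    rsd = 100 * x[~is_qc].std(axis=0, ddof=1) / x[~is_qc].mean(axis=0)
    print(f"median RSD after correction: {np.median(rsd):.1f}%")
    ```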

  5. Material quality assurance risk assessment.

    DOT National Transportation Integrated Search

    2013-01-01

    Over the past two decades the role of SHA has shifted from quality control (QC) of materials and placement techniques to quality assurance (QA) and acceptance. The role of the Office of Materials Technology (OMT) has been shifting towards assuran...

  6. Data Validation & Laboratory Quality Assurance for Region 9

    EPA Pesticide Factsheets

    In all hazardous site investigations it is essential to know the quality of the data used for decision-making purposes. Validation of data requires that appropriate quality assurance and quality control (QA/QC) procedures be followed.

  7. Long-term pavement performance indicators for failed materials.

    DOT National Transportation Integrated Search

    2016-04-01

    State Transportation Agencies (STAs) use quality control/quality assurance (QC/QA) specifications to guide the testing and inspection of road pavement construction. Although failed materials of pavement rarely occur in practice, it is critical to h...

  8. Material quality assurance risk assessment : [summary].

    DOT National Transportation Integrated Search

    2013-01-01

    With the shift from quality control (QC) of materials and placement techniques to quality assurance (QA) and acceptance over the years, the role of the Office of Materials Technology (OMT) has been shifting towards assurance of material quality...

  9. Quality control in the year 2000.

    PubMed

    Schade, B

    1992-01-01

    'Just-in-time' production is a prerequisite for a company to meet the challenges of competition. Manufacturing cycles have been so successfully optimized that release time now has become a significant factor. A vision for a major quality-control (QC) contribution to profitability in this decade seems to be the just-in-time release. Benefits will go beyond cost savings for lower inventory. The earlier detection of problems will reduce rejections and scrap. In addition, problem analysis and problem-solving will be easier. To achieve just-in-time release, advanced automated systems like robots will become the workhorses in QC for high volume pharmaceutical production. The requirements for these systems are extremely high in terms of quality, reliability and ruggedness. Crucial for the success might be advances in use of microelectronics for error checks, system recording, trouble shooting, etc. as well as creative new approaches (for example the use of redundant assay systems).

  10. Quality control in the year 2000

    PubMed Central

    Schade, Bernd

    1992-01-01

    ‘Just-in-time’ production is a prerequisite for a company to meet the challenges of competition. Manufacturing cycles have been so successfully optimized that release time now has become a significant factor. A vision for a major quality-control (QC) contribution to profitability in this decade seems to be the just-in-time release. Benefits will go beyond cost savings for lower inventory. The earlier detection of problems will reduce rejections and scrap. In addition, problem analysis and problem-solving will be easier. To achieve just-in-time release, advanced automated systems like robots will become the workhorses in QC for high volume pharmaceutical production. The requirements for these systems are extremely high in terms of quality, reliability and ruggedness. Crucial for the success might be advances in use of microelectronics for error checks, system recording, trouble shooting, etc. as well as creative new approaches (for example the use of redundant assay systems). PMID:18924930

  11. 40 CFR 98.334 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... used for accounting purposes including direct measurement weighing or through the use of purchase records same plant instruments or procedures that are used for accounting purposes (such as weigh hoppers... density and volume measurements, etc.). Record the total mass for the materials consumed each calendar...

  12. 40 CFR 98.84 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... by direct weight measurement using the same plant instruments used for accounting purposes, such as... be determined quarterly by direct weight measurement using the same plant instruments used for accounting purposes, such as weigh hoppers, truck weigh scales, or belt weigh feeders. (f) The quantity of...

  13. Large-Scale Topographic Features on Venus: A Comparison by Geological Mapping in Four Quadrangles

    NASA Astrophysics Data System (ADS)

    Ivanov, M. A.; Head, J. W.

    2002-05-01

    We have conducted geological mapping in four quadrangles under the NASA program of geological mapping of Venus. Two quadrangles portray large equidimensional lowlands (Lavinia, V55, and Atalanta, V4, Planitiae) and two more areas are characterized by a large corona (Quetzalpetlatl corona, QC, V66) and Lakshmi Planum (LP, V7). Geological mapping of these large-scale features allows for their broad comparison by both sets of typical structures and sequences of events. The Planitiae share a number of similar characteristics. (1) Lavinia and Atalanta are broad quasi-circular lowlands 1-2 km deep. (2) The central portions of the basins lack both coronae and large volcanoes. (3) Belts of tectonic deformation characterize the central portions of the basins. (4) There is evidence in both lowlands that they subsided predominantly before the emplacement of regional plains. (5) Recent volcanism is shifted toward the periphery of the basins and occurred after, or at the late stages of, the formation of the lowlands. The above characteristics of the lowlands are better reconciled with the scenario in which their formation is due to a broad-scale mantle downwelling that started relatively early in the visible geologic history of Venus. The QC and LP are elevated structures roughly comparable in size. The formation of QC is commonly attributed to large-scale positive mantle diapirism, while the formation of LP remains controversial and both mantle upwelling and downwelling models exist. QC and LP have similar characteristics such as a broadly circular shape in plan view, association with regional highlands, associated relatively young volcanism, and a topographic moat bordering both QC and LP from the north. Despite the above similarities, the striking differences between QC and LP are obvious too. LP is crowned by the highest mountain ranges on Venus while QC is bordered from the north by a common belt of ridges. LP itself makes up a regional highland within the upland of Ishtar Terra while QC produces a much less significant topographic anomaly on the background of the highland of Lada Terra. Highly deformed, tessera-like terrain apparently makes up the basement of LP, and QC formed in a tessera-free area. Volcanic activity is concentrated in the central portion of LP while QC is a regionally important center of young volcanism. These differences, which probably cannot be accounted for by a simple difference in the size of LP and QC, suggest non-similar modes of formation of the two regional structures and do not favor the upwelling models of the formation of LP.

  14. Quercetin ameliorates imiquimod-induced psoriasis-like skin inflammation in mice via the NF-κB pathway.

    PubMed

    Chen, Haiming; Lu, Chuanjian; Liu, Huazhen; Wang, Maojie; Zhao, Hui; Yan, Yuhong; Han, Ling

    2017-07-01

    Quercetin (QC) is a dietary flavonoid abundant in many natural plants. A series of studies has shown that it exhibits several biological properties, including anti-inflammatory, anti-oxidant, cardio-protective, vasodilatory, liver-protective and anti-cancer activities. However, the possible therapeutic effect of QC on psoriasis has not been reported so far. The present study was undertaken to evaluate the potential beneficial effect of QC in psoriasis using a generated imiquimod (IMQ)-induced psoriasis-like mouse model, and to further elucidate its underlying mechanisms of action. Effects of QC on PASI scores, back temperature, histopathological changes, oxidative/anti-oxidative indexes, pro-inflammatory cytokines and the NF-κB pathway in IMQ-induced mice were investigated. Our results showed that QC could significantly reduce the PASI scores, decrease the temperature of the psoriasis-like lesions, and ameliorate the deteriorating histopathology in IMQ-induced mice. Moreover, QC effectively attenuated levels of TNF-α, IL-6 and IL-17 in serum, increased activities of GSH, CAT and SOD, and decreased the accumulation of MDA in skin tissue induced by IMQ in mice. The mechanism may be associated with the down-regulation of NF-κB, IKKα, NIK and RelB expression and up-regulation of TRAF3, which were critically involved in the non-canonical NF-κB pathway. In conclusion, our present study demonstrated that QC had appreciable anti-psoriasis effects in IMQ-induced mice, and the underlying mechanism may involve the improvement of antioxidant and anti-inflammatory status and inhibition of the activation of NF-κB signaling. Hence, QC, a naturally occurring flavone with potent anti-psoriatic effects, has the potential for further development as a candidate for psoriasis treatment. Copyright © 2017 Elsevier B.V. All rights reserved.

  15. SU-E-T-421: Failure Mode and Effects Analysis (FMEA) of Xoft Electronic Brachytherapy for the Treatment of Superficial Skin Cancers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hoisak, J; Manger, R; Dragojevic, I

    Purpose: To perform a failure mode and effects analysis (FMEA) of the process for treating superficial skin cancers with the Xoft Axxent electronic brachytherapy (eBx) system, given the recent introduction of expanded quality control (QC) initiatives at our institution. Methods: A process map was developed listing all steps in superficial treatments with Xoft eBx, from the initial patient consult to the completion of the treatment course. The process map guided the FMEA to identify the failure modes for each step in the treatment workflow and assign Risk Priority Numbers (RPN), calculated as the product of the failure mode’s probability of occurrence (O), severity (S) and lack of detectability (D). FMEA was done with and without the inclusion of recent QC initiatives such as increased staffing, physics oversight, standardized source calibration, treatment planning and documentation. The failure modes with the highest RPNs were identified and contrasted before and after introduction of the QC initiatives. Results: Based on the FMEA, the failure modes with the highest RPN were related to source calibration, treatment planning, and patient setup/treatment delivery (Fig. 1). The introduction of additional physics oversight, standardized planning and safety initiatives such as checklists and time-outs reduced the RPNs of these failure modes. High-risk failure modes that could be mitigated with improved hardware and software interlocks were identified. Conclusion: The FMEA analysis identified the steps in the treatment process presenting the highest risk. The introduction of enhanced QC initiatives mitigated the risk of some of these failure modes by decreasing their probability of occurrence and increasing their detectability. This analysis demonstrates the importance of well-designed QC policies, procedures and oversight in a Xoft eBx programme for treatment of superficial skin cancers. Unresolved high-risk failure modes highlight the need for non-procedural quality initiatives such as improved planning software and more robust hardware interlock systems.
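
    The RPN arithmetic underlying the analysis, with illustrative failure modes and scores (not those from the abstract):

    ```python
    # RPN = Occurrence x Severity x lack of Detectability, each scored on a
    # 1-10 scale here; higher RPN = higher-priority failure mode.
    failure_modes = {
        "source calibration error":  (4, 9, 6),
        "treatment planning error":  (3, 8, 5),
        "patient setup error":       (5, 7, 4),
        "wrong applicator recorded": (2, 6, 7),
    }

    rpn = {name: o * s * d for name, (o, s, d) in failure_modes.items()}
    for name, score in sorted(rpn.items(), key=lambda kv: -kv[1]):
        print(f"{score:4d}  {name}")
    ```

    A mitigation that lowers occurrence or improves detectability (e.g. a checklist or an interlock) directly shrinks the product, which is how the abstract's before/after comparison is quantified.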

  16. The Navy’s Quality Journey: Operational Implementation of TQL

    DTIC Science & Technology

    1993-04-01

    training. Dr. Kaoru Ishikawa, "Guide to Quality Control": "QC begins with education and ends with education. To implement TQC, we need to carry out...York: McGraw-Hill, 1986. 20. Ishikawa, Kaoru. What is Total Quality Control? Englewood Cliffs, NJ: Prentice-Hall, Inc., 1985. 21. Ishikawa, Kaoru

  17. Stability of Tetrahydrocannabinol and Cannabidiol in Prepared Quality Control Medible Brownies.

    PubMed

    Wolf, Carl E; Poklis, Justin L; Poklis, Alphonse

    2017-03-01

    The legalization of marijuana in the USA for both medicinal and recreational use has increased in the past few years. Currently, 24 states have legalized marijuana for medicinal use. The US Drug Enforcement Administration has classified marijuana as a Schedule I substance. The US Food and Drug Administration does not regulate formulations or packages of marijuana that are currently marketed in states that have legalized marijuana. Marijuana edibles or "medibles" are typically packages of candies and baked goods consumed for medicinal as well as recreational marijuana use. They contain the major psychoactive drug in marijuana, delta-9-tetrahydrocannabinol (THC), and/or cannabidiol (CBD), which has reputed medicinal properties. Presented is a method for the preparation and application of THC- and CBD-containing brownies used as quality control (QC) material for the analysis of marijuana or cannabinoid baked medibles. The performance parameters of the assay, including possible matrix effects and cannabinoid stability in the brownie QC over time, are presented. It was determined that the process used to prepare and bake the brownie control material did not degrade the THC or CBD. The brownie matrix was found not to interfere with the analysis of THC or CBD. Ten commercially available brownie matrices were evaluated for potential interferences; none of them were found to interfere with the analysis of THC or CBD. The laboratory-baked medible QC material was found to be stable at room temperature for at least 3 months. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  18. Effects of N-glycan precursor length diversity on quality control of protein folding and on protein glycosylation

    PubMed Central

    Samuelson, John; Robbins, Phillips W.

    2014-01-01

    Asparagine-linked glycans (N-glycans) of medically important protists have much to tell us about the evolution of N-glycosylation and of N-glycan-dependent quality control (N-glycan QC) of protein folding in the endoplasmic reticulum. While host N-glycans are built upon a dolichol-pyrophosphate-linked precursor with 14 sugars (Glc3Man9GlcNAc2), protist N-glycan precursors vary from Glc3Man9GlcNAc2 (Acanthamoeba) to Man9GlcNAc2 (Trypanosoma) to Glc3Man5GlcNAc2 (Toxoplasma) to Man5GlcNAc2 (Entamoeba, Trichomonas, and Eimeria) to GlcNAc2 (Plasmodium and Giardia) to zero (Theileria). As related organisms have differing N-glycan lengths (e.g. Toxoplasma, Eimeria, Plasmodium, and Theileria), the present N-glycan variation is based upon secondary loss of Alg genes, which encode enzymes that add sugars to the N-glycan precursor. An N-glycan precursor with Man5GlcNAc2 is necessary but not sufficient for N-glycan QC, which is predicted by the presence of the UDP-glucose:glucosyltransferase (UGGT) plus calreticulin and/or calnexin. As many parasites lack glucose in their N-glycan precursor, UGGT product may be identified by inhibition of glucosidase II. The presence of an armless calnexin in Toxoplasma suggests secondary loss of N-glycan QC from coccidia. Positive selection for N-glycan sites occurs in secreted proteins of organisms with NG-QC and is based upon an increased likelihood of threonine but not serine in the second position versus asparagine. In contrast, there appears to be selection against N-glycan length in Plasmodium and N-glycan site density in Toxoplasma. Finally, there is suggestive evidence for N-glycan-dependent ERAD in Trichomonas, which glycosylates and degrades the exogenous reporter mutant carboxypeptidase Y (CPY*). PMID:25475176

  19. Development of a quantitative-competitive PCR for quantification of human cytomegalovirus load and comparison with antigenaemia, viraemia and pp67 RNA detection by nucleic acid sequence-based amplification.

    PubMed

    Bergallo, M; Costa, C; Tarallo, S; Daniele, R; Merlino, C; Segoloni, G P; Negro Ponzi, A; Cavallo, R

    2006-06-01

    The human cytomegalovirus (HCMV) is an important pathogen in immunocompromised patients, such as transplant recipients. The use of sensitive and rapid diagnostic assays can have a great impact on antiviral prophylaxis, therapy monitoring and diagnosing active disease. Quantification of HCMV DNA may additionally have prognostic value and guide routine management. The aim of this study was to develop a reliable internally-controlled quantitative-competitive PCR (QC-PCR) for the detection and quantification of HCMV DNA viral load in peripheral blood and compare it with other methods: the HCMV pp65 antigenaemia assay in the leukocyte fraction, the HCMV viraemia, both routinely employed in our laboratory, and nucleic acid sequence-based amplification (NASBA) for detection of HCMV pp67-mRNA. Quantitative-competitive PCR is a procedure for nucleic acid quantification based on co-amplification of competitive templates, the target DNA and a competitor functioning as internal standard. In particular, a standard curve is generated by amplifying 10^2 to 10^5 copies of the target pCMV-435 plasmid with 10^4 copies of the competitor pCMV-C plasmid. Clinical samples derived from 40 kidney transplant patients were tested by spiking 10^4 copies of pCMV-C into the PCR mix as internal control, and comparing results with the standard curve. Of the 40 patients studied, 39 (97.5%) were positive for HCMV DNA by QC-PCR. While the correlation between the number of pp65-positive cells and the number of HCMV DNA genome copies/mL, and that between the former and pp67-mRNA positivity, were statistically significant, there was no significant correlation between HCMV DNA viral load assayed by QC-PCR and HCMV viraemia. The QC-PCR assay could detect from 10^2 to over 10^7 copies of HCMV DNA with a range of linearity between 10^2 and 10^5 genomes.
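
    The competitive-quantification arithmetic can be sketched as follows, assuming product signals proportional to initial template copies; all signal values are hypothetical:

    ```python
    import numpy as np

    # With a fixed spike of competitor (1e4 copies), the target/competitor
    # product ratio reflects the initial copy ratio. A standard curve built
    # from 1e2-1e5 target copies vs. the fixed competitor linearizes this
    # in log-log space.
    std_copies = np.array([1e2, 1e3, 1e4, 1e5])
    std_ratio  = np.array([0.012, 0.11, 1.05, 9.8])  # target/competitor signal

    slope, intercept = np.polyfit(np.log10(std_copies), np.log10(std_ratio), 1)

    def hcmv_copies(sample_ratio):
        """Interpolate HCMV DNA copies from the log-log standard curve."""
        return 10 ** ((np.log10(sample_ratio) - intercept) / slope)

    print(f"{hcmv_copies(0.5):.0f} copies")  # ~5e3 for these made-up signals
    ```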

  20. Sci-Fri AM: Quality, Safety, and Professional Issues 01: CPQR Technical Quality Control Suite Development including Quality Control Workload Results

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Malkoske, Kyle; Nielsen, Michelle; Brown, Erika

    A close partnership between the Canadian Partnership for Quality Radiotherapy (CPQR) and the Canadian Organization of Medical Physicists' (COMP) Quality Assurance and Radiation Safety Advisory Committee (QARSAC) has resulted in the development of a suite of Technical Quality Control (TQC) Guidelines for radiation treatment equipment that outline specific performance objectives and criteria that equipment should meet in order to assure an acceptable level of radiation treatment quality. The framework includes consolidation of existing guidelines and/or literature by expert reviewers, structured stages of public review, external field-testing and ratification by COMP. The adopted framework for the development and maintenance of the TQCs ensures the guidelines incorporate input from the medical physics community during development, measures the workload required to perform the QC tests outlined in each TQC, and remain relevant (i.e. "living documents") through subsequent planned reviews and updates. This presentation will show the Multi-Leaf Linear Accelerator document as an example of how feedback and cross-national collaboration achieve a robust guidance document. During field-testing, each technology was tested at multiple centres in a variety of clinic environments. As part of the defined feedback, workload data were captured. This led to average times associated with testing as defined in each TQC document. As a result, for a medium-sized centre comprising 6 linear accelerators and a comprehensive brachytherapy program, we evaluate the physics workload at 1.5 full-time-equivalent physicists per year to complete all QC tests listed in this suite.

  1. Quantum cost optimized design of 4-bit reversible universal shift register using reduced number of logic gate

    NASA Astrophysics Data System (ADS)

    Maity, H.; Biswas, A.; Bhattacharjee, A. K.; Pal, A.

    In this paper, we have proposed the design of a quantum cost (QC) optimized 4-bit reversible universal shift register (RUSR) using a reduced number of reversible logic gates. The proposed design is very useful in quantum computing due to its low QC, small number of reversible logic gates and low delay. The QC, number of gates and garbage outputs (GOs) of the proposed design are 64, 8 and 16, respectively. The improvement over previous work is also presented: the QC is improved by 5.88% to 70.9% and the gate count by 60% to 83.33% compared with the latest reported results.
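
    The quoted improvement figures are consistent with the usual percentage-improvement formula; the baseline values below are inferred for illustration only, not taken from the paper:

    ```python
    # improvement = (previous - proposed) / previous * 100. A previous
    # design with QC = 68 gives (68 - 64) / 68 = 5.88% for the proposed
    # QC of 64; a previous gate count of 20 gives 60% for the proposed 8.
    def improvement(previous, proposed):
        return (previous - proposed) / previous * 100

    print(f"{improvement(68, 64):.2f}%")   # 5.88% QC improvement
    print(f"{improvement(20, 8):.2f}%")    # 60.00% gate-count improvement
    ```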

  2. The Quasicontinuum Method: Overview, applications and current directions

    NASA Astrophysics Data System (ADS)

    Miller, Ronald E.; Tadmor, E. B.

    2002-10-01

    The Quasicontinuum (QC) Method, originally conceived and developed by Tadmor, Ortiz and Phillips [1] in 1996, has since seen a great deal of development and application by a number of researchers. The idea of the method is a relatively simple one. With the goal of modeling an atomistic system without explicitly treating every atom in the problem, the QC provides a framework whereby degrees of freedom are judiciously eliminated and force/energy calculations are expedited. This is combined with adaptive model refinement to ensure that full atomistic detail is retained in regions of the problem where it is required while continuum assumptions reduce the computational demand elsewhere. This article provides a review of the method, from its original motivations and formulation to recent improvements and developments. A summary of the important mechanics of materials results that have been obtained using the QC approach is presented. Finally, several related modeling techniques from the literature are briefly discussed. As an accompaniment to this paper, a website designed to serve as a clearinghouse for information on the QC method has been established at www.qcmethod.com. The site includes information on QC research, links to researchers, downloadable QC code and documentation.

  3. 40 CFR 98.354 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... digester, or lagoon) from which biogas is recovered, you must make the measurements or determinations specified in paragraphs (f)(1) through (f)(3) of this section. (1) You must continuously measure the biogas flow rate as specified in paragraph (h) of this section and determine the cumulative volume of biogas...

  4. 40 CFR 98.354 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... digester, or lagoon) from which biogas is recovered, you must make the measurements or determinations specified in paragraphs (f)(1) through (f)(3) of this section. (1) You must continuously measure the biogas flow rate as specified in paragraph (h) of this section and determine the cumulative volume of biogas...

  5. 40 CFR 98.354 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... digester, or lagoon) from which biogas is recovered, you must make the measurements or determinations specified in paragraphs (f)(1) through (f)(3) of this section. (1) You must continuously measure the biogas flow rate as specified in paragraph (h) of this section and determine the cumulative volume of biogas...

  6. Managing the Quality of Environmental Data in EPA Region 9

    EPA Pesticide Factsheets

    EPA Pacific Southwest, Region 9's Quality Assurance (QA) section's primary mission is to effectively oversee and carry out the Quality System and Quality Management Plan, and project-level quality assurance and quality control (QA/QC) activities.

  7. Implementation of GPS controlled highway construction equipment phase II.

    DOT National Transportation Integrated Search

    2008-01-01

    "During 2006, WisDOT and the Construction Materials and Support Center at UW-Madison worked together to develop : a specification and QC/QA procedures for GPS machine guidance on highway construction grading operations. These : specifications and pro...

  8. Implementation of GPS controlled highway construction equipment, phase III.

    DOT National Transportation Integrated Search

    2009-02-01

    Beginning in 2006, WisDOT and the Construction Material and Support Center (CMSC) at UW-Madison worked together to develop the specifications and the QA/QC procedures for GPS machine guidance on highway grading projects. These specifications and ...

  9. QC/QA : evaluation of effectiveness in Kentucky.

    DOT National Transportation Integrated Search

    2008-06-30

    Quality control and quality assurance in the highway industry is going through a cultural shift. There is a growing trend toward using contractor data for acceptance and payment purposes. This has led to serious concerns about conflicts of interes...

  10. A field-deployable compound-specific isotope analyzer based on quantum cascade laser and hollow waveguide

    NASA Astrophysics Data System (ADS)

    Wu, Sheng; Deev, Andrei

    2013-01-01

    A field deployable Compound Specific Isotope Analyzer (CSIA) coupled with capillary chromatography, based on Quantum Cascade (QC) lasers and a Hollow Waveguide (HWG), with precision and chemical resolution matching mature mass spectrometry has been achieved in our laboratory. The system can realize 0.3 per mil accuracy for 12C/13C for a Gas Chromatography (GC) peak lasting as short as 5 seconds with a carbon molar concentration in the GC peak of less than 0.5%. Spectroscopic advantages of the HWG when working with QC lasers, i.e., single-mode transmission, noiseless measurement and small sample volume, are compared with traditional free-space and multipass spectroscopy methods.

  11. Eddy covariance carbonyl sulfide flux measurements with a quantum cascade laser absorption spectrometer

    NASA Astrophysics Data System (ADS)

    Gerdel, Katharina; Spielmann, Felix M.; Hammerle, Albin; Wohlfahrt, Georg

    2016-04-01

    Carbonyl sulfide (COS) is the most abundant sulfur-containing trace gas in the troposphere, present at concentrations of around 500 ppt. Recent interest in COS by the ecosystem-physiological community has been sparked by the fact that COS co-diffuses into plant leaves in much the same way as carbon dioxide (CO2) does, but in contrast to CO2, COS is not known to be emitted by plants. Thus uptake of COS by vegetation has the potential to be used as a tracer for canopy gross photosynthesis, which cannot be measured directly but represents a key term in the global carbon cycle. For a few years now, quantum cascade laser absorption spectrometers (QCLAS) have been commercially available with the precision, sensitivity and time response suitable for eddy covariance (EC) flux measurements. While a handful of published reports on EC COS flux measurements exist in the recent literature, no rigorous investigation of the applicability of QCLAS for EC COS flux measurements has been carried out so far, nor have the EC processing and QA/QC steps developed for carbon dioxide and water vapor flux measurements within FLUXNET been assessed for COS. The aim of this study is to close this knowledge gap, to discuss critical steps in the post-processing chain of COS EC flux measurements and to devise best-practice guidelines for COS EC flux data processing. To this end we collected EC COS (and CO2, H2O and CO) flux measurements above a temperate mountain grassland in Austria over the vegetation period 2015 with a commercially available QCLAS. We discuss various aspects of EC data post-processing, in particular issues with the time-lag estimation between sonic anemometer and QCLAS signals and QCLAS time series detrending, as well as QA/QC, in particular flux detection limits, random flux uncertainty, the interaction of various processing steps with common EC QA/QC filters (e.g. detrending and stationarity tests), u*-filtering, etc.
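
    One processing step highlighted above, time-lag estimation between the sonic anemometer and the QCLAS signals, is commonly done by locating the extremum of the cross-covariance between vertical wind speed and scalar concentration. The sketch below illustrates that generic approach only; the function and variable names are hypothetical and this is not the authors' actual pipeline.

    ```python
    import numpy as np

    def estimate_time_lag(w, c, fs, max_lag_s=5.0):
        """Estimate the lag (s) of scalar series c behind wind series w
        by maximizing the absolute cross-covariance within +/- max_lag_s."""
        w = w - w.mean()
        c = c - c.mean()
        max_lag = int(max_lag_s * fs)
        lags = np.arange(-max_lag, max_lag + 1)
        cov = np.array([np.mean(w[max(0, -k):len(w) - max(0, k)] *
                                c[max(0, k):len(c) - max(0, -k)])
                        for k in lags])
        return lags[np.argmax(np.abs(cov))] / fs  # lag in seconds

    # Example with synthetic 10 Hz data: c lags w by 0.7 s
    fs = 10.0
    rng = np.random.default_rng(0)
    w = rng.standard_normal(6000)
    c = np.roll(w, 7) + 0.5 * rng.standard_normal(6000)
    print(estimate_time_lag(w, c, fs))  # ~0.7
    ```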

  12. Construction method of QC-LDPC codes based on multiplicative group of finite field in optical communication

    NASA Astrophysics Data System (ADS)

    Huang, Sheng; Ao, Xiang; Li, Yuan-yuan; Zhang, Rui

    2016-09-01

    In order to meet the needs of high-speed optical communication systems, a construction method of quasi-cyclic low-density parity-check (QC-LDPC) codes based on the multiplicative group of a finite field is proposed. The Tanner graph of the parity-check matrix of the code constructed by this method has no cycle of length 4, which ensures that the obtained code has a good distance property. Simulation results show that at a bit error rate (BER) of 10^-6, in the same simulation environment, the net coding gain (NCG) of the proposed QC-LDPC(3 780, 3 540) code with a code rate of 93.7% is improved by 2.18 dB and 1.6 dB, respectively, compared with those of the RS(255, 239) code in ITU-T G.975 and the LDPC(32 640, 30 592) code in ITU-T G.975.1. In addition, the NCG of the proposed QC-LDPC(3 780, 3 540) code is 0.2 dB and 0.4 dB higher, respectively, than those of the SG-QC-LDPC(3 780, 3 540) code based on two different subgroups of a finite field and the AS-QC-LDPC(3 780, 3 540) code based on two arbitrary sets of a finite field. Thus, the proposed QC-LDPC(3 780, 3 540) code can be well applied in optical communication systems.
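
    As background for how such codes are assembled: a QC-LDPC parity-check matrix is typically built by replacing each entry of a small exponent (base) matrix with a circulant permutation matrix or a zero block. The sketch below shows only that generic expansion step, with an arbitrary made-up exponent matrix; the paper's actual multiplicative-group construction is not reproduced here.

    ```python
    import numpy as np

    def expand_qc_ldpc(exponents, p):
        """Expand an exponent matrix into a binary QC-LDPC parity-check
        matrix H. Entry e >= 0 becomes the p x p identity cyclically
        shifted by e columns; entry -1 becomes the p x p zero block."""
        I = np.eye(p, dtype=np.uint8)
        rows = []
        for row in exponents:
            blocks = [np.roll(I, e, axis=1) if e >= 0
                      else np.zeros((p, p), np.uint8)
                      for e in row]
            rows.append(np.hstack(blocks))
        return np.vstack(rows)

    # Hypothetical 2 x 4 exponent matrix with circulant size p = 5
    E = [[0, 1, 2, -1],
         [3, -1, 4, 0]]
    H = expand_qc_ldpc(E, 5)
    print(H.shape)  # (10, 20)
    ```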

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ma, Biao; Yamaguchi, Keiichi; Fukuoka, Mayuko

    To accelerate the logical drug design procedure, we created the program “NAGARA,” a plugin for PyMOL, and applied it to the discovery of small compounds called medical chaperones (MCs) that stabilize the cellular form of a prion protein (PrP^C). In NAGARA, we constructed a single platform to unify the docking simulation (DS), free energy calculation by molecular dynamics (MD) simulation, and interfragment interaction energy (IFIE) calculation by quantum chemistry (QC) calculation. NAGARA also enables large-scale parallel computing via a convenient graphical user interface. Here, we demonstrated its performance and its broad applicability from drug discovery to lead optimization with full compatibility with various experimental methods including Western blotting (WB) analysis, surface plasmon resonance (SPR), and nuclear magnetic resonance (NMR) measurements. Combining DS and WB, we discovered anti-prion activities for two compounds and tegobuvir (TGV), a non-nucleoside non-structural protein NS5B polymerase inhibitor showing activity against hepatitis C virus genotype 1. Binding profiles predicted by MD and QC are consistent with those obtained by SPR and NMR. Free energy analyses showed that these compounds stabilize the PrP^C conformation by decreasing the conformational fluctuation of the PrP^C. Because TGV has been already approved as a medicine, its extension to prion diseases is straightforward. Finally, we evaluated the affinities of the fragmented regions of TGV using QC and found a clue for its further optimization. By repeating WB, MD, and QC recursively, we were able to obtain the optimum lead structure. - Highlights: • NAGARA integrates docking simulation, molecular dynamics, and quantum chemistry. • We found many compounds, e.g., tegobuvir (TGV), that exhibit anti-prion activities. • We obtained insights into the action mechanism of TGV as a medical chaperone. • Using QC, we obtained useful information for optimization of the lead compound, TGV. • NAGARA is a convenient platform for drug discovery and lead optimization.

  14. Development of the QA/QC Procedures for a Neutron Interrogation System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Obhodas, Jasmina; Sudac, Davorin; Valkovic, Vladivoj

    In order to perform QA/QC procedures for a system dedicated to the neutron interrogation of objects for the presence of threat materials, one needs to perform measurements of reference materials (RM) having the same (or similar) atomic ratios as real materials. It is well known that explosives, drugs, and various other benign materials contain chemical elements such as hydrogen, oxygen, carbon and nitrogen in distinctly different quantities. For example, a high carbon-to-oxygen ratio (C/O) is characteristic of drugs. Explosives can be differentiated by measurement of both C/O and nitrogen-to-oxygen (N/O) ratios. The C/N ratio of the chemical warfare agents, coupled with the measurement of elements such as fluorine and phosphorus, clearly differentiates them from the conventional explosives. Correlations between theoretical values and experimental results obtained in laboratory conditions for C/O and N/C ratios of simulants of hexogen (RDX), TNT, DLM2, TATP, cocaine, heroin, yperite, tetranitromethane, peroxide methylethyl-ketone, nitromethane and ethyleneglycol dinitrate are presented.
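
    To illustrate the kind of decision logic the abstract describes (this is not the authors' actual algorithm), here is a minimal sketch that flags a measured composition by comparing its C/O and N/O ratios against illustrative, made-up thresholds.

    ```python
    def classify_material(c_to_o, n_to_o):
        """Rough screening by elemental ratios, following the qualitative
        rules in the abstract. Thresholds are illustrative placeholders,
        not calibrated values."""
        if c_to_o > 10.0:  # very high C/O is characteristic of drugs
            return "possible drug"
        if n_to_o > 0.3 and c_to_o < 2.0:  # N- and O-rich organics
            return "possible explosive"
        return "likely benign"

    print(classify_material(c_to_o=15.2, n_to_o=0.05))  # possible drug
    print(classify_material(c_to_o=1.1,  n_to_o=0.6))   # possible explosive
    ```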

  15. Main factors causing intergranular and quasi-cleavage fractures at hydrogen-induced cracking in tempered martensitic steels

    NASA Astrophysics Data System (ADS)

    Kurokawa, Ami; Doshida, Tomoki; Hagihara, Yukito; Suzuki, Hiroshi; Takai, Kenichi

    2018-05-01

    Though intergranular (IG) and quasi-cleavage (QC) fractures have been widely recognized as typical fracture modes of hydrogen-induced cracking in high-strength steels, the main factor causing them has not yet been clarified. In the present study, the dependence of the main factor causing hydrogen-induced cracking on hydrogen content was examined through the fracture mode transition from QC to IG at the crack initiation site in tempered martensitic steels. Two kinds of tempered martensitic steels were prepared to change the cohesive force via different precipitation states of Fe3C on the prior γ grain boundaries. The high-Si (H-Si) steel has a small amount of Fe3C on the prior austenite grain boundaries, whereas the low-Si (L-Si) steel has a large amount of Fe3C sheets on the grain boundaries. The fracture modes and initiation sites were observed using FE-SEM (Field Emission Scanning Electron Microscopy). The crack initiation sites of the H-Si steel showed QC fracture at the notch tip under various hydrogen contents, while the crack initiation of the L-Si steel changed from QC fracture at the notch tip to QC and IG fractures originating approximately 10 µm ahead of the notch tip with increasing hydrogen content. For the L-Si steel, two possibilities are considered: either the QC or the IG fracture occurred first, or the QC and IG fractures occurred simultaneously. Furthermore, the principal stress and equivalent plastic strain distributions near the notch tip were calculated with FEM (Finite Element Method) analysis. The plastic strain was maximal at the notch tip and the principal stress was maximal at approximately 10 µm from the notch tip. The positions of QC and IG fracture initiation observed using FE-SEM correspond to the positions of maximum strain and stress obtained with FEM, respectively. These findings indicate that the main factors causing hydrogen-induced cracking differ between QC and IG fractures.

  16. Statistical analysis of the Nb3Sn strand production for the ITER toroidal field coils

    NASA Astrophysics Data System (ADS)

    Vostner, A.; Jewell, M.; Pong, I.; Sullivan, N.; Devred, A.; Bessette, D.; Bevillard, G.; Mitchell, N.; Romano, G.; Zhou, C.

    2017-04-01

    The ITER toroidal field (TF) strand procurement initiated the largest Nb3Sn superconducting strand production hitherto. The industrial-scale production started in Japan in 2008 and finished in summer 2015. Six ITER partners (so-called Domestic Agencies, or DAs) are in charge of the procurement and involved eight different strand suppliers all over the world, of which four use the bronze route (BR) process and four the internal-tin (IT) process. In total more than 500 tons have been produced, including excess material covering losses during the conductor manufacturing process, in particular the cabling. The procurement is based on a functional specification where the main strand requirements, like critical current, hysteresis losses, Cu ratio and residual resistance ratio, are specified, but not the strand production process or layout. This paper presents the analysis of the data acquired during the quality control (QC) process that was carried out to ensure the same conductor performance requirements are met by the different strand suppliers regardless of strand design. The strand QC is based on 100% billet testing and on applying statistical process control (SPC) limits. Throughout the production, samples adjacent to the strand pieces tested by the suppliers are cross-checked (‘verified’) by their respective DAs’ reference labs. The level of verification was lowered from 100% at the beginning of the procurement progressively to approximately 25% during the final phase of production. Based on the complete dataset of the TF strand production, an analysis of the SPC limits of the critical strand parameters is made and the related process capability indices are calculated. In view of the large-scale production and costs, key manufacturing parameters such as billet yield, number of breakages and piece-length distribution are also discussed. The results are compared among all the strand suppliers, focusing on the difference between BR and IT processes. Following the completion of the largest Nb3Sn strand production, our experience gained from monitoring the execution of the QC activities and from auditing the results of the measurements is summarised for future superconducting strand material procurement activities.
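
    As a generic illustration of the statistical process control (SPC) machinery the abstract refers to, control limits and process capability indices, here is a minimal sketch computing mean ± 3σ limits and the Cpk index for a series of measured strand parameters. The data and specification limits are made up; this is not the ITER QC code.

    ```python
    import numpy as np

    def spc_limits(x):
        """Classical Shewhart limits: mean +/- 3 standard deviations."""
        mu, sigma = np.mean(x), np.std(x, ddof=1)
        return mu - 3 * sigma, mu + 3 * sigma

    def cpk(x, lsl, usl):
        """Process capability index against lower/upper spec limits."""
        mu, sigma = np.mean(x), np.std(x, ddof=1)
        return min(usl - mu, mu - lsl) / (3 * sigma)

    # Hypothetical critical-current measurements (A) and spec limits
    rng = np.random.default_rng(1)
    ic = rng.normal(loc=250.0, scale=4.0, size=500)
    print(spc_limits(ic))
    print(cpk(ic, lsl=238.0, usl=262.0))  # ~1 for this synthetic data
    ```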

  17. Field correlation of PQI gauge with nuclear density gauge: phase 1.

    DOT National Transportation Integrated Search

    2006-12-01

    Traditionally, the Oklahoma Department of Transportation (ODOT) uses a nuclear density gauge as a quality control (QC) and quality assurance (QA) tool for in-place density. The nuclear-based devices, however, tend to have problems associated with lic...

  18. On Quality Control Procedures Being Adopted for TRMM LBA and KWAJEX Soundings Data Sets

    NASA Technical Reports Server (NTRS)

    Roy, B.; Halverson, Jeffrey B.; Starr, David OC. (Technical Monitor)

    2001-01-01

    During NASA's Tropical Rainfall Measuring Mission (TRMM) field campaigns, the Large Scale Biosphere Atmosphere (LBA) experiment held in Amazonia (Brazil) in January-February 1999 and the Kwajalein Experiment (KWAJEX) held in the Republic of the Marshall Islands in August-September 1999, extensive radiosonde observations (raobs) were collected using VIZ and Vaisala sondes, which have different response characteristics. In all, 320 raobs for LBA and 972 fixed raobs for KWAJEX have been obtained and are being processed. Most atmospheric sensible heat source (Q1) and apparent moisture sink (Q2) budget studies are based on sounding data, and the accuracy of the raobs is important, especially in regions of deep moist convection. A data quality control (QC) project has been initiated at GSFC by the principal investigator (JBH), and this paper addresses some of the quantitative findings for the level I and II QC procedures. Based on this quantitative assessment of the sensor (or system) biases associated with each type of sonde, the initial data repair work will be started. Evidence of moisture biases between the two different sondes (VIZ and Vaisala) was shown earlier by Halverson et al. (2000). Vaisala humidity sensors are found to have a low-level dry bias in the boundary layer, whereas above 600 mb the VIZ sensor tends to register a drier atmosphere. All raob data were subjected to a limit check based on an algorithm already well tested on the raob data obtained during the Tropical Ocean Global Atmosphere experiment (TOGA-COARE).
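
    A limit check of the kind mentioned at the end of the abstract is conceptually simple: each reported value is flagged when it falls outside climatologically plausible bounds for its variable. A minimal sketch with made-up bounds (not the TOGA-COARE algorithm itself):

    ```python
    # Illustrative plausibility bounds per variable (placeholder values)
    LIMITS = {
        "temperature_C": (-90.0, 50.0),
        "relative_humidity_pct": (0.0, 100.0),
        "wind_speed_ms": (0.0, 120.0),
    }

    def limit_check(record):
        """Return the names of variables whose values fall outside bounds."""
        return [name for name, value in record.items()
                if name in LIMITS
                and not (LIMITS[name][0] <= value <= LIMITS[name][1])]

    sounding_level = {"temperature_C": 12.3, "relative_humidity_pct": 104.0,
                      "wind_speed_ms": 7.5}
    print(limit_check(sounding_level))  # ['relative_humidity_pct']
    ```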

  19. Integrating anthropogenic hazard data to facilitate research related to the exploitation of geo-resources

    NASA Astrophysics Data System (ADS)

    Kwiatek, Grzegorz; Blanke, Aglaja; Olszewska, Dorota; Orlecka-Sikora, Beata; Lasocki, Stanisław; Kozlovskaya, Elena; Nevalainen, Jouni; Schmittbuhl, Jean; Grasso, Jean-Robert; Schaming, Marc; Bigarre, Pascal; Kinscher, Jannes-Lennart; Saccorotti, Gilberto; Garcia, Alexander; Cassidy, Nigel; Toon, Sam; Mutke, Grzegorz; Sterzel, Mariusz; Szepieniec, Tomasz

    2017-04-01

    The Thematic Core Service "Anthropogenic Hazards" (TCS AH) integrates data and provides various data services in the form of a complete e-research infrastructure for advanced analysis and geophysical modelling of anthropogenic hazards due to geo-resources exploitation. TCS AH is based on the prototype built in the framework of the IS-EPOS project POIG.02.03.00-14-090/13-00 (https://tcs.ah-epos.eu/) and is currently being further developed within the EPOS Implementation Phase (H2020-INFRADEV-1-2015-1, INFRADEV-3-2015). The TCS AH aims to have a measurable impact on innovative research and development by providing a comprehensive, wide-scale and high-quality research infrastructure available to the scientific community, industrial partners and the public. One of the main deliverables of TCS AH is access to numerous induced-seismicity datasets called "episodes". An episode is defined as a comprehensive set of data describing the geophysical process induced or triggered by technological activity, which under certain circumstances can become hazardous for people, infrastructure and the environment. The episode is a time-correlated, standardized collection of geophysical, technological and other relevant geodata forming complete documentation of a seismogenic process. In addition to the 6 episodes already implemented during the previous phase of integration and the 3 episodes integrated within the SHEER project, at least 18 new episodes related to conventional hydrocarbon extraction, reservoir treatment, underground mining and geothermal energy production are currently being integrated into the TCS AH. The heterogeneous multi-disciplinary data from different episodes are subjected to an extensive quality control (QC) procedure composed of five steps and involving the collaborative work of the data providers, the quality control team and the IT team, supervised by the quality control manager with the aid of the Redmine platform. The first three steps of QC are performed at the local data center: (1) transfer of episode data to the local data center, (2) data standardization and validation of formats, and (3) metadata preparation according to the TCS AH metadata scheme. The final two steps are performed at the level of the TCS AH website: (4) contextual analysis of data quality, followed by the appearance of the episode in the TCS AH maintenance area, and finally (5) episode publication on the TCS AH website.

  20. MO-F-CAMPUS-T-04: Implementation of a Standardized Monthly Quality Check for Linac Output Management in a Large Multi-Site Clinic

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xu, H; Yi, B; Prado, K

    2015-06-15

    Purpose: This work is to investigate the feasibility of a standardized monthly quality check (QC) of LINAC output determination in a multi-site, multi-LINAC institution. The QC was developed to determine individual LINAC output using the same optimized measurement setup and a constant calibration factor for all machines across the institution. Methods: The QA data over 4 years of 7 Varian machines across four sites were analyzed. The monthly output constancy checks were performed using a fixed source-to-chamber distance (SCD), with no couch position adjustment throughout the measurement cycle, for all the photon energies (6 and 18 MV) and electron energies (6, 9, 12, 16 and 20 MeV). The constant monthly output calibration factor (Nconst) was determined by averaging the machines’ output data, acquired with the same monthly ion chamber. If a different monthly ion chamber was used, Nconst was re-normalized to account for its different absorbed-dose-to-water calibration coefficient, N_D,w (Co-60). Here, the possible changes of Nconst over 4 years have been tracked, and the precision of output results based on this standardized monthly QA program relative to the TG-51 calibration for each machine was calculated. Any outlier of the group was investigated. Results: The possible changes of Nconst varied between 0-0.9% over 4 years. The normalization of absorbed-dose-to-water calibration factors corrects for up to 3.3% variation between different monthly QA chambers. The LINAC output precision based on this standardized monthly QC relative to the TG-51 output calibration is within 1% for the 6 MV photon energy and 2% for 18 MV and all the electron energies. A human error in one TG-51 report was found through close scrutiny of outlier data. Conclusion: This standardized QC allows for a reasonably simplified, precise and robust monthly LINAC output constancy check, with the increased sensitivity needed to detect possible human errors and machine problems.
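
    The renormalization step described here, correcting the constant monthly calibration factor when a different QA chamber (with a different absorbed-dose-to-water calibration coefficient) is used, amounts to a simple scaling. A minimal sketch under that reading; all names and numbers are hypothetical, not the clinic's procedure:

    ```python
    def renormalize_nconst(n_const, ndw_old, ndw_new):
        """Scale the constant monthly output calibration factor when the
        monthly QA chamber changes, so readings stay comparable across
        chambers with different N_D,w (Co-60) coefficients (assumed
        direction of the correction)."""
        return n_const * ndw_old / ndw_new

    def output_deviation_pct(reading, n_const, expected_dose):
        """Percent deviation of the measured output from the expected dose."""
        return 100.0 * (reading * n_const - expected_dose) / expected_dose

    # Hypothetical numbers: a chamber swap, then a monthly constancy check
    n_const = renormalize_nconst(n_const=5.02e7, ndw_old=5.38e7, ndw_new=5.41e7)
    print(output_deviation_pct(reading=2.005e-8, n_const=n_const,
                               expected_dose=1.0))  # small % deviation
    ```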

  1. 222-S Laboratory Quality Assurance Plan. Revision 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meznarich, H.K.

    1995-07-31

    This Quality Assurance Plan provides quality assurance (QA) guidance, regulatory QA requirements (e.g., 10 CFR 830.120), and quality control (QC) specifications for analytical service. This document follows the U.S. Department of Energy (DOE)-issued Hanford Analytical Services Quality Assurance Plan (HASQAP). In addition, this document meets the objectives of the Quality Assurance Program provided in WHC-CM-4-2, Section 2.1. Quality assurance elements required in the Guidelines and Specifications for Preparing Quality Assurance Program Plans (QAMS-004) and Interim Guidelines and Specifications for Preparing Quality Assurance Project Plans (QAMS-005) from the US Environmental Protection Agency (EPA) are covered throughout this document. A quality assurance index is provided in Appendix A. This document also provides and/or identifies the procedural information that governs laboratory operations. The personnel of the 222-S Laboratory and the Standards Laboratory, including managers, analysts, QA/QC staff, auditors, and support staff, shall use this document as guidance and instructions for their operational and quality assurance activities. Other organizations that conduct activities described in this document for the 222-S Laboratory shall follow this QA/QC document.

  2. An Evaluation of the Effect of a Quality Circle Intervention on Attitudinal Variables in Three DoD Organizations.

    DTIC Science & Technology

    1984-09-01

    T-tests were run between the full-term QC and control groups on the pretest and on the posttest. The two groups differed at the pretest in terms of self-rated job performance and job involvement; at the posttest, one significant result emerged. (Surviving table-of-contents fragments name static group designs, pretest/posttest designs, and nonequivalent control group designs.)

  3. A single fracture toughness parameter for fibrous composite laminates

    NASA Technical Reports Server (NTRS)

    Poe, C. C., Jr.

    1981-01-01

    A general fracture toughness parameter Qc was previously derived and verified to be a material constant, independent of layup, for centrally cracked boron/aluminum composite specimens. The specimens were made with various proportions of 0° and ±45° plies. A limited amount of data indicated that the ratio Qc/ε_tuf, where ε_tuf is the ultimate tensile strain of the fibers, might be a constant for all composite laminates, regardless of material and layup. In that case, a single value of Qc/ε_tuf could be used to predict the fracture toughness of all fibrous composite laminates from only the elastic constants and ε_tuf. Values of Qc/ε_tuf were calculated for centrally cracked specimens made from graphite/polyimide, graphite/epoxy, E-glass/epoxy, boron/epoxy, and S-glass graphite/epoxy materials with numerous layups. Within ordinary scatter, the data indicate that Qc/ε_tuf is a constant for all laminates that did not split extensively at the crack tips or have other deviant failure modes.

  4. The isoenzyme of glutaminyl cyclase is an important regulator of monocyte infiltration under inflammatory conditions

    PubMed Central

    Cynis, Holger; Hoffmann, Torsten; Friedrich, Daniel; Kehlen, Astrid; Gans, Kathrin; Kleinschmidt, Martin; Rahfeld, Jens-Ulrich; Wolf, Raik; Wermann, Michael; Stephan, Anett; Haegele, Monique; Sedlmeier, Reinhard; Graubner, Sigrid; Jagla, Wolfgang; Müller, Anke; Eichentopf, Rico; Heiser, Ulrich; Seifert, Franziska; Quax, Paul H A; de Vries, Margreet R; Hesse, Isabel; Trautwein, Daniela; Wollert, Ulrich; Berg, Sabine; Freyse, Ernst-Joachim; Schilling, Stephan; Demuth, Hans-Ulrich

    2011-01-01

    Acute and chronic inflammatory disorders are characterized by detrimental cytokine and chemokine expression. Frequently, the chemotactic activity of cytokines depends on a modified N-terminus of the polypeptide. Among those, the N-terminus of monocyte chemoattractant protein 1 (CCL2 and MCP-1) is modified to a pyroglutamate (pE-) residue protecting against degradation in vivo. Here, we show that the N-terminal pE-formation depends on glutaminyl cyclase activity. The pE-residue increases stability against N-terminal degradation by aminopeptidases and improves receptor activation and signal transduction in vitro. Genetic ablation of the glutaminyl cyclase iso-enzymes QC (QPCT) or isoQC (QPCTL) revealed a major role of isoQC for pE1-CCL2 formation and monocyte infiltration. Consistently, administration of QC-inhibitors in inflammatory models, such as thioglycollate-induced peritonitis reduced monocyte infiltration. The pharmacologic efficacy of QC/isoQC-inhibition was assessed in accelerated atherosclerosis in ApoE3*Leiden mice, showing attenuated atherosclerotic pathology following chronic oral treatment. Current strategies targeting CCL2 are mainly based on antibodies or spiegelmers. The application of small, orally available inhibitors of glutaminyl cyclases represents an alternative therapeutic strategy to treat CCL2-driven disorders such as atherosclerosis/restenosis and fibrosis. PMID:21774078

  5. A novel construction method of QC-LDPC codes based on CRT for optical communications

    NASA Astrophysics Data System (ADS)

    Yuan, Jian-guo; Liang, Meng-qi; Wang, Yong; Lin, Jin-zhao; Pang, Yu

    2016-05-01

    A novel construction method of quasi-cyclic low-density parity-check (QC-LDPC) codes is proposed based on the Chinese remainder theorem (CRT). The method can not only increase the code length without reducing the girth, but also greatly enhance the code rate, so it is easy to construct a high-rate code. The simulation results show that at a bit error rate (BER) of 10^-7, the net coding gain (NCG) of the regular QC-LDPC(4 851, 4 546) code is 2.06 dB, 1.36 dB, 0.53 dB and 0.31 dB higher, respectively, than those of the classic RS(255, 239) code in ITU-T G.975, the LDPC(32 640, 30 592) code in ITU-T G.975.1, the QC-LDPC(3 664, 3 436) code constructed by the improved combining construction method based on the CRT, and the irregular QC-LDPC(3 843, 3 603) code constructed by the construction method based on the Galois field (GF(q)) multiplicative group. Furthermore, all five codes have the same code rate of 0.937. Therefore, the regular QC-LDPC(4 851, 4 546) code constructed by the proposed construction method has excellent error-correction performance, and can be more suitable for optical transmission systems.

  6. Stable Isotopes, Quantum Computing and Consciousness

    NASA Astrophysics Data System (ADS)

    Berezin, Alexander A.

    2000-10-01

    Recent proposals of quantum computing/computers (QC) based on nuclear spins suggest that consciousness (CON) activity may be related (assisted) to a subset of C13 atoms incorporated randomly, or quasirandomly, in neural structures. Consider two DNA chains. Even if they are completely identical chemically (same sequence of codons), the patterns of 12C and 13C isotopes in them are different (a possible origin of personal individuality). Perhaps it is the subsystem of nuclear spins of the 13C "sublattice" which forms a dynamical system capable of QC and on which CON is "spanned". Some issues related to this hypothesis are: (1) existence of CON-driven positional correlations among C13 atoms, (2) motion (hopping) of C13 via enhanced neutron tunneling, cf. the quantum "anti-Zeno effect", (3) possible optimization of the concentration of QC-active C13 atoms above their standard isotopic abundance, (4) characteristic time-scales for operation of a C13-based QC (perhaps a broad range of scales), (5) reflection of the QC dynamics of C13 on CON, (6) the possibility that the C13-based QC operates "above" the level of "regular" CON (perhaps Jungian sub/super-CON), (7) isotopicity as a connector to a universal Library of Patterns ("Platonic World"), (8) self-stabilization of coherence in the C13 (sub)system. Some of these questions are, in principle, experimentally addressable through shifting of isotopic abundances.

  7. 40 CFR 98.54 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... in paragraphs (b)(1) through (b)(3) of this section. (1) EPA Method 320, Measurement of Vapor Phase...) Direct measurement (such as using flow meters or weigh scales). (2) Existing plant procedures used for accounting purposes. (d) You must conduct all required performance tests according to the methods in § 98.54...

  8. Unique and Conserved Features of the Barley Root Meristem

    PubMed Central

    Kirschner, Gwendolyn K.; Stahl, Yvonne; Von Korff, Maria; Simon, Rüdiger

    2017-01-01

    Plant root growth is enabled by root meristems that harbor the stem cell niches as a source of progenitors for the different root tissues. Understanding the root development of diverse plant species is important to be able to control root growth in order to achieve better performance of crop plants. In this study, we analyzed the root meristem of the fourth most abundant crop plant, barley (Hordeum vulgare). Cell division studies revealed that the barley stem cell niche comprises a Quiescent Center (QC) of around 30 cells with low mitotic activity. The surrounding stem cells contribute to root growth through the production of new cells that are displaced from the meristem, elongate and differentiate into specialized root tissues. The distal stem cells produce the root cap and lateral root cap cells, while cells lateral to the QC generate the epidermis, as is typical for monocots. Endodermis and inner cortex are derived from one common initial lateral to the QC, while the outer cortex cell layers are derived from a distinct stem cell. In rice and Arabidopsis, meristem homeostasis is achieved through feedback signaling from differentiated cells involving peptides of the CLE family. Application of a synthetic peptide orthologous to barley CLE40 promotes meristem cell differentiation, similar to rice and Arabidopsis. However, in contrast to Arabidopsis, the columella stem cells do not respond to the CLE40 peptide, indicating that distinct mechanisms control columella cell fate in monocot and dicot plants. PMID:28785269

  9. Visualization and Quality Control Web Tools for CERES Products

    NASA Astrophysics Data System (ADS)

    Mitrescu, C.; Doelling, D. R.; Rutan, D. A.

    2016-12-01

    The CERES project continues to provide the scientific community a wide variety of satellite-derived data products such as observed TOA broadband shortwave and longwave fluxes, computed TOA and surface fluxes, as well as cloud, aerosol, and other atmospheric parameters. They encompass a wide range of temporal and spatial resolutions, suited to specific applications. Now in its 16th year, CERES products are mostly used by climate modeling communities that focus on global mean energetics, meridional heat transport, and climate trend studies. In order to serve all our users, we developed a web-based Ordering and Visualization Tool (OVT). Using Open Source Software such as Eclipse, Java, JavaScript, OpenLayer, Flot, Google Maps, Python, and others, the OVT Team developed a series of specialized functions to be used in the process of CERES Data Quality Control (QC). We mention 1- and 2-D histograms, anomaly computation, deseasonalization, temporal and spatial averaging, side-by-side parameter comparison, and other functions that made the process of QC far easier and faster, but more importantly far more portable. We are now in the process of integrating ground-site observed surface fluxes to further facilitate the CERES project's QC of the CERES computed surface fluxes. These features will give users the opportunity to perform their own comparisons of the CERES computed surface fluxes and observed ground-site fluxes. An overview of the CERES OVT basic functions using Open Source Software, as well as future steps in expanding its capabilities, will be presented at the meeting.
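
    Among the QC functions listed, deseasonalization and anomaly computation are the most mechanical: subtract each calendar month's climatological mean from the time series. A minimal sketch of that generic operation (not the OVT implementation):

    ```python
    import numpy as np

    def deseasonalize(values, months):
        """Remove the mean annual cycle: subtract from each sample the
        climatological mean of its calendar month (1-12)."""
        values = np.asarray(values, dtype=float)
        months = np.asarray(months)
        anomalies = np.empty_like(values)
        for m in range(1, 13):
            sel = months == m
            anomalies[sel] = values[sel] - values[sel].mean()
        return anomalies

    # 4 years of synthetic monthly fluxes with an annual cycle plus a trend
    months = np.tile(np.arange(1, 13), 4)
    flux = 240 + 10 * np.sin(2 * np.pi * (months - 1) / 12) \
               + 0.1 * np.arange(48)
    print(deseasonalize(flux, months).round(2))
    ```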

  10. Quality control and quality assurance in genotypic data for genome-wide association studies

    PubMed Central

    Laurie, Cathy C.; Doheny, Kimberly F.; Mirel, Daniel B.; Pugh, Elizabeth W.; Bierut, Laura J.; Bhangale, Tushar; Boehm, Frederick; Caporaso, Neil E.; Cornelis, Marilyn C.; Edenberg, Howard J.; Gabriel, Stacy B.; Harris, Emily L.; Hu, Frank B.; Jacobs, Kevin; Kraft, Peter; Landi, Maria Teresa; Lumley, Thomas; Manolio, Teri A.; McHugh, Caitlin; Painter, Ian; Paschall, Justin; Rice, John P.; Rice, Kenneth M.; Zheng, Xiuwen; Weir, Bruce S.

    2011-01-01

    Genome-wide scans of nucleotide variation in human subjects are providing an increasing number of replicated associations with complex disease traits. Most of the variants detected have small effects and, collectively, they account for a small fraction of the total genetic variance. Very large sample sizes are required to identify and validate findings. In this situation, even small sources of systematic or random error can cause spurious results or obscure real effects. The need for careful attention to data quality has been appreciated for some time in this field, and a number of strategies for quality control and quality assurance (QC/QA) have been developed. Here we extend these methods and describe a system of QC/QA for genotypic data in genome-wide association studies. This system includes some new approaches that (1) combine analysis of allelic probe intensities and called genotypes to distinguish gender misidentification from sex chromosome aberrations, (2) detect autosomal chromosome aberrations that may affect genotype calling accuracy, (3) infer DNA sample quality from relatedness and allelic intensities, (4) use duplicate concordance to infer SNP quality, (5) detect genotyping artifacts from dependence of Hardy-Weinberg equilibrium (HWE) test p-values on allelic frequency, and (6) demonstrate sensitivity of principal components analysis (PCA) to SNP selection. The methods are illustrated with examples from the ‘Gene Environment Association Studies’ (GENEVA) program. The results suggest several recommendations for QC/QA in the design and execution of genome-wide association studies. PMID:20718045
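
    One of the listed checks, detecting genotyping artifacts from the dependence of Hardy-Weinberg equilibrium (HWE) test p-values on allele frequency, rests on a simple per-SNP chi-square test of observed genotype counts against HWE expectations. A minimal sketch of that standard test (illustrative only, not the GENEVA code):

    ```python
    from scipy.stats import chi2

    def hwe_chisq_p(n_aa, n_ab, n_bb):
        """Chi-square (1 df) test of Hardy-Weinberg equilibrium from
        genotype counts. Returns the p-value."""
        n = n_aa + n_ab + n_bb
        p = (2 * n_aa + n_ab) / (2 * n)  # frequency of allele A
        expected = [n * p * p, 2 * n * p * (1 - p), n * (1 - p) * (1 - p)]
        stat = sum((o - e) ** 2 / e
                   for o, e in zip([n_aa, n_ab, n_bb], expected))
        return chi2.sf(stat, df=1)

    # A SNP with a clear heterozygote deficit relative to HWE
    print(hwe_chisq_p(n_aa=510, n_ab=380, n_bb=110))  # ~0.003
    ```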

  11. CARINA data synthesis project: pH data scale unification and cruise adjustments

    NASA Astrophysics Data System (ADS)

    Velo, A.; Pérez, F. F.; Lin, X.; Key, R. M.; Tanhua, T.; de La Paz, M.; Olsen, A.; van Heuven, S.; Jutterström, S.; Ríos, A. F.

    2010-05-01

    Data on carbon and carbon-relevant hydrographic and hydrochemical parameters from 188 previously non-publicly available cruise data sets in the Arctic Mediterranean Seas (AMS), Atlantic Ocean and Southern Ocean have been retrieved and merged into a new database: CARINA (CARbon IN the Atlantic Ocean). These data have gone through rigorous quality control (QC) procedures to assure the highest possible quality and consistency. The data for most of the measured parameters in the CARINA database were objectively examined in order to quantify systematic differences in the reported values. Systematic biases found in the data have been corrected in the data products: three merged data files with measured, calculated and interpolated data for each of the three CARINA regions (AMS, Atlantic Ocean and Southern Ocean). Out of a total of 188 cruise entries in the CARINA database, 59 reported measured pH values. All reported pH data have been unified to the Sea-Water Scale (SWS) at 25 °C. Here we present details of the secondary QC of pH in the CARINA database and the scale unification to SWS at 25 °C. The pH scale has been converted for 36 cruises. Procedures of quality control, including crossover analysis between cruises and inversion analysis, are described. Adjustments were applied to the pH values for 21 of the cruises in the CARINA dataset. With these adjustments the CARINA database is consistent both internally and with the GLODAP data, an oceanographic data set based on the World Hydrographic Program in the 1990s. Based on our analysis we estimate the internal consistency of the CARINA pH data to be 0.005 pH units. The CARINA data are now suitable for accurate assessments of, for example, oceanic carbon inventories and uptake rates, for ocean acidification assessment and for model validation.

  12. Assessment of mammographic film processor performance in a hospital and mobile screening unit.

    PubMed

    Murray, J G; Dowsett, D J; Laird, O; Ennis, J T

    1992-12-01

    In contrast to the majority of mammographic breast screening programmes, film processing at this centre occurs on site in both hospital and mobile trailer units. Initial (1989) quality control (QC) sensitometric tests revealed a large variation in film processor performance in the mobile unit. The clinical significance of these variations was assessed and acceptance limits for processor performance determined. Abnormal mammograms were used as reference material and copied using high-definition 35 mm film over a range of exposure settings. The copies were then matched with QC film density variation from the mobile unit. All films were subsequently ranked for spatial and contrast resolution. Optimal values for processing time of 2 min (equivalent to film transit time 3 min and developer time 46 s) and temperature of 36 degrees C were obtained. The widespread anomaly of reporting film transit time as processing time is highlighted. Use of mammogram copies as a means of measuring the influence of film processor variation is advocated. Careful monitoring of the mobile unit film processor performance has produced stable quality comparable with the hospital-based unit. The advantages of on-site film processing are outlined. The addition of a sensitometric step wedge to all mammography film stock as a means of assessing image quality is recommended.

  13. The Ocean Observatories Initiative Data Management and QA/QC: Lessons Learned and the Path Ahead

    NASA Astrophysics Data System (ADS)

    Vardaro, M.; Belabbassi, L.; Garzio, L. M.; Knuth, F.; Smith, M. J.; Kerfoot, J.; Crowley, M. F.

    2016-02-01

    The Ocean Observatories Initiative (OOI) is a multi-decadal, NSF-funded program that will provide long-term, near real-time cabled and telemetered measurements of climate variability, ocean circulation, ecosystem dynamics, air-sea exchange, seafloor processes, and plate-scale geodynamics. The OOI platforms consist of seafloor sensors, fixed moorings, and mobile assets containing over 700 operational instruments in the Atlantic and Pacific oceans. Rutgers University operates the Cyberinfrastructure (CI) component of the OOI, which acquires, processes and distributes data to scientists, researchers, educators and the public. It will also provide observatory mission command and control, data assessment and distribution, and long-term data management. The Rutgers Data Management Team consists of a data manager and four data evaluators, who are tasked with ensuring data completeness and quality, as well as interaction with OOI users to facilitate data delivery and utility. Here we will discuss the procedures developed to guide the data team workflow, the automated QC algorithms and human-in-the-loop (HITL) annotations that are used to flag suspect data (whether due to instrument failures, biofouling, or unanticipated events), system alerts and alarms, long-term data storage and CF (Climate and Forecast) standard compliance, and the lessons learned during construction and the first several months of OOI operations.
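
    Automated QC algorithms of the sort mentioned typically start with a gross-range test: flag any sample falling outside fixed sensor or climatological bounds. A minimal sketch of such a flagging function, loosely following the common pass/suspect/fail flag convention; the thresholds and names are made up, not OOI's:

    ```python
    def gross_range_flag(value, fail_lo, fail_hi, suspect_lo, suspect_hi):
        """Return a QC flag: 1 = pass, 3 = suspect, 4 = fail."""
        if not (fail_lo <= value <= fail_hi):
            return 4
        if not (suspect_lo <= value <= suspect_hi):
            return 3
        return 1

    # Sea-water temperature (deg C): sensor range vs. local climatology
    flags = [gross_range_flag(t, -5.0, 45.0, 4.0, 22.0)
             for t in (11.2, 2.9, 57.0)]
    print(flags)  # [1, 3, 4]
    ```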

  14. CUSTOMER/SUPPLIER ACCOUNTABILITY AND PROGRAM IMPLEMENTATION

    EPA Science Inventory

    Quality assurance (QA) and quality control (QC) are the basic components of a QA program, which is a fundamental quality management tool. The quality of outputs and services strongly depends on the caliber of the communications between the "customer" and the "supplier." Clear under...

  15. Investigation of the Asphalt Pavement Analyzer (APA) testing program in Nebraska.

    DOT National Transportation Integrated Search

    2008-03-01

    The asphalt pavement analyzer (APA) has been widely used to evaluate hot-mix asphalt (HMA) rutting potential in mix design and quality control-quality assurance (QC-QA) applications, because the APA testing and its data analyses are relatively si...

  16. THE MAQC (MICROARRAY QUALITY CONTROL) PROJECT: CALIBRATED RNA SAMPLES, REFERENCE DATASETS, AND QC METRICS AND THRESHOLDS

    EPA Science Inventory

    FDA's Critical Path Initiative identifies pharmacogenomics and toxicogenomics as key opportunities in advancing medical product development and personalized medicine, and the Guidance for Industry: Pharmacogenomic Data Submissions has been released. Microarrays represent a co...

  17. Implementation of GPS Machine Controlled Grading - Phase III (2008) and Technical Training

    DOT National Transportation Integrated Search

    2009-02-01

    Beginning in 2006, WisDOT and the Construction Material and Support Center (CMSC) at UW-Madison worked together to develop the specifications and the QA/QC procedures for GPS machine guidance on highway grading projects. These specifications and proc...

  18. LipidQC: Method Validation Tool for Visual Comparison to SRM 1950 Using NIST Interlaboratory Comparison Exercise Lipid Consensus Mean Estimate Values.

    PubMed

    Ulmer, Candice Z; Ragland, Jared M; Koelmel, Jeremy P; Heckert, Alan; Jones, Christina M; Garrett, Timothy J; Yost, Richard A; Bowden, John A

    2017-12-19

    As advances in analytical separation techniques, mass spectrometry instrumentation, and data processing platforms continue to spur growth in the lipidomics field, more structurally unique lipid species are detected and annotated. The lipidomics community is in need of benchmark reference values to assess the validity of various lipidomics workflows in providing accurate quantitative measurements across the diverse lipidome. LipidQC addresses the harmonization challenge in lipid quantitation by providing a semiautomated process, independent of analytical platform, for visual comparison of experimental results of National Institute of Standards and Technology Standard Reference Material (SRM) 1950, "Metabolites in Frozen Human Plasma", against benchmark consensus mean concentrations derived from the NIST Lipidomics Interlaboratory Comparison Exercise.

  19. Assessing the influence of component processing and donor characteristics on quality of red cell concentrates using quality control data.

    PubMed

    Jordan, A; Chen, D; Yi, Q-L; Kanias, T; Gladwin, M T; Acker, J P

    2016-07-01

    Quality control (QC) data collected by blood services are used to monitor production and to ensure compliance with regulatory standards. We demonstrate how analysis of quality control data can be used to highlight the sources of variability within red cell concentrates (RCCs). We merged Canadian Blood Services QC data with manufacturing and donor records for 28 227 RCC between June 2011 and October 2014. Units were categorized based on processing method, bag manufacturer, donor age and donor sex, then assessed based on product characteristics: haemolysis and haemoglobin levels, unit volume, leucocyte count and haematocrit. Buffy-coat method (top/bottom)-processed units exhibited lower haemolysis than units processed using the whole-blood filtration method (top/top). Units from female donors exhibited lower haemolysis than male donations. Processing method influenced unit volume and the ratio of additive solution to residual plasma. Stored red blood cell characteristics are influenced by prestorage processing and donor factors. Understanding the relationship between processing, donors and RCC quality will help blood services to ensure the safety of transfused products. © 2016 International Society of Blood Transfusion.

  20. Automated locomotor activity monitoring as a quality control assay for mass-reared tephritid flies.

    PubMed

    Dominiak, Bernard C; Fanson, Benjamin G; Collins, Samuel R; Taylor, Phillip W

    2014-02-01

    The Sterile Insect Technique (SIT) requires vast numbers of consistently high-quality insects to be produced over long periods. Quality control (QC) procedures are critical to effective SIT, both providing quality assurance and warning of operational deficiencies. We here present a potential new QC assay for mass rearing of Queensland fruit flies (Bactrocera tryoni Froggatt) for SIT: locomotor activity monitoring. We investigated whether automated locomotor activity monitors (LAMs) that simply detect how often a fly passes an infrared sensor in a glass tube might provide similar insights but with much greater economy. Activity levels were generally lower for females than for males, and declined over five days in the monitor for both sexes. Female activity levels were not affected by irradiation, but males irradiated at 60 or 70 Gy had reduced activity levels compared with unirradiated controls. We also found some evidence that mild heat shock of pupae results in adults with reduced activity. LAM offers a convenient, effective and economical assay to probe such changes. © 2013 Society of Chemical Industry.

  1. Lens Coupled Quantum Cascade Laser

    NASA Technical Reports Server (NTRS)

    Lee, Alan Wei Min (Inventor); Hu, Qing (Inventor)

    2013-01-01

    Terahertz quantum cascade (QC) devices are disclosed that can operate, e.g., in a range of about 1 THz to about 10 THz. In some embodiments, QC lasers are disclosed in which an optical element (e.g., a lens) is coupled to an output facet of the laser's active region to enhance coupling of the lasing radiation from the active region to an external environment. In other embodiments, terahertz amplifier and tunable terahertz QC lasers are disclosed.

  2. Integrability of the coupled cubic-quintic complex Ginzburg-Landau equations and multiple-soliton solutions via mathematical methods

    NASA Astrophysics Data System (ADS)

    Selima, Ehab S.; Seadawy, Aly R.; Yao, Xiaohua; Essa, F. A.

    2018-02-01

    This paper is devoted to the study of the (1+1)-dimensional coupled cubic-quintic complex Ginzburg-Landau equations (cc-qcGLEs) with complex coefficients. This equation can be used to describe the nonlinear evolution of slowly varying envelopes of periodic spatial-temporal patterns in a convective binary fluid. The dispersion relation and properties of cc-qcGLEs are constructed. Painlevé analysis is used to check the integrability of cc-qcGLEs and to establish the Bäcklund transformation form. New traveling wave solutions and a general form of multiple-soliton solutions of cc-qcGLEs are obtained via the Bäcklund transformation and the simplest equation method with Bernoulli, Riccati and Burgers’ equations as simplest equations.

  3. A comparison of battery testing protocols: Those used by the U.S. advanced battery consortium and those used in China

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robertson, David C.; Christophersen, Jon P.; Bennett, Taylor

    Two testing protocols, QC/T 743 and those used by the U.S. Advanced Battery Consortium (USABC), were compared using cells based on LiFePO4/graphite chemistry. Differences in the protocols directly affected the data and the performance-decline mechanisms deduced from the data. A change in capacity fade mechanism from linear-with-time to t^1/2 was observed when the power density measurement was included in the QC/T 743 testing. The rate of resistance increase was linear with time using both protocols. Overall, the testing protocols produced very similar data when the testing conditions and metrics used to define performance were similar. The choice of depth of discharge and pulse width had a direct effect on estimated cell life. At greater percent depth of discharge (%DOD) and pulse width, the estimated life was shorter than at lower %DOD and shorter pulse width. This indicates that cells which were at the end of life based on the USABC protocol were not at end of life based on the QC/T 743 protocol by a large margin.
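
    The distinction drawn between linear-with-time and t^1/2 capacity fade can be checked by fitting both functional forms to the same fade data and comparing residuals. A minimal sketch with synthetic data (illustrative, not the study's analysis):

    ```python
    import numpy as np

    # Synthetic capacity-fade data whose true behavior follows sqrt(time)
    t = np.linspace(1, 400, 40)  # days under test
    rng = np.random.default_rng(2)
    cap = 100 - 0.9 * np.sqrt(t) + rng.normal(0, 0.2, t.size)  # % capacity

    def fit_and_rss(basis):
        """Least-squares fit cap ~ a + b * basis(t); return the residual
        sum of squares, so the two fade models can be compared."""
        A = np.column_stack([np.ones_like(t), basis(t)])
        coef, *_ = np.linalg.lstsq(A, cap, rcond=None)
        resid = cap - A @ coef
        return float(resid @ resid)

    print("linear RSS:", fit_and_rss(lambda x: x))
    print("sqrt   RSS:", fit_and_rss(lambda x: np.sqrt(x)))  # smaller here
    ```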

  4. Gas Phase Photoacoustic Sensor at 8.41 µm Using Quartz Tuning Forks and Amplitude Modulated Quantum Cascade Lasers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wojcik, Michael D.; Phillips, Mark C.; Cannon, Bret D.

    2006-10-01

    We demonstrate the performance of a novel long-wave infrared photoacoustic laser absorbance spectrometer for gas-phase species using an amplitude-modulated (AM) quantum cascade (QC) laser and a quartz tuning fork microphone. Photoacoustic signal was generated by focusing the output of a Fabry-Perot QC laser operating at 8.41 µm between the legs of a quartz tuning fork, which served as a transducer for the transient acoustic pressure wave. The QC laser was modulated at the resonant frequency of the tuning fork (32.8 kHz) and delivered a modest 5.3 mW at the tuning fork. This spectrometer was calibrated using the infrared absorber Freon-134a by performing a simultaneous absorption measurement using a 35 cm absorption cell. The NEAS of this instrument was determined to be 2 × 10^-8 W cm^-1 Hz^-1/2. A corresponding theoretical analysis of the instrument sensitivity is presented and is capable of quantitatively reproducing the experimental NEAS, indicating that the fundamental sensitivity of this technique is limited by the noise floor of the tuning fork itself.
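
    For orientation, the noise-equivalent absorption sensitivity (NEAS) quoted above is conventionally formed by normalizing the minimum detectable absorption coefficient by the optical power and the detection bandwidth; a standard definition from general photoacoustic-spectroscopy practice (stated here as background, not taken from this paper) is:

    ```latex
    % NEAS in W cm^{-1} Hz^{-1/2}:
    % \alpha_{\min}: minimum detectable absorption coefficient (cm^{-1}),
    % P: optical power at the sample (W), \Delta f: detection bandwidth (Hz).
    \mathrm{NEAS} = \frac{\alpha_{\min}\, P}{\sqrt{\Delta f}}
    ```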

  5. The Ultra-filtration of Macromolecules with Different Conformations and Configurations through Nanopores

    NASA Astrophysics Data System (ADS)

    Ge, Hui

    This Ph.D. thesis presents our study of the ultrafiltration of polymers with different configurations and conformations. Theoretically, the passage of polymer chains through a nanopore under an elongational flow field has been studied for years, but experimental studies are rare for two reasons: (1) there has been no precise method to investigate how an individual polymer chain passes through a nanopore; and (2) it is difficult, if not impossible, to obtain a set of polymer samples with a narrow molar mass distribution and uniform structures, except for linear chains. The central question in this study is to find the critical (minimum) flow rate (q_c) for each kind of chain, at which the chains can pass through a given nanopore. A comparison of the measured and calculated q_c leads to a better understanding of how different chains are deformed, stretched and pulled through a nanopore. We have developed a novel method combining static and dynamic laser light scattering (LLS) to precisely measure the relative retention concentration ((C_0 - C)/C_0). Chapter 1 briefly introduces the theoretical background and applications, and lists some recent research progress in this area, covering polymers with various configurations and conformations passing through nanopores, including linear chains, star polymers, branched polymers and polymer micelles. Among them, the de Gennes and Brochard-Wyart predictions for linear and star chains passing through nanopores are emphasized: the predicted q_c of a linear chain is q_c ≃ k_B T/(3πη), where k_B, T and η are the Boltzmann constant, the absolute temperature and the viscosity of the solvent, respectively, independent of both the chain length and the pore size; for star chains there is predicted to exist an optimal number of arms first entering the pore. Chapter 2 details the basic theory of static and dynamic laser light scattering (LLS), including its instrumentation and our ultrafiltration setup. Chapter 3 briefly introduces the sample preparation, including the history and mechanism of anionic living polymerization, as well as how we used a novel home-made set-up to prepare linear polystyrene with different chain lengths and star polystyrene with various arm numbers and lengths. Chapter 4 summarizes our measured critical flow rates (q_c) of linear polymer chains with different lengths for nanopores with different sizes. Since the flow rate is directly related to the hydrodynamic force, we have developed a sensitive method (down to tens of fN) to directly assess how much hydrodynamic force (F_h) is required to overcome the weak entropic elasticity and stretch individual coiled chains in solution. Our method is completely different from existing optical tweezers or AFM approaches, because those measure the relatively stronger enthalpic elasticity. Our results confirm that q_c is indeed independent of the chain length, but decreases as the pore size increases. The value of q_c is ~10-200 times smaller than k_B T/(3πη). Such a discrepancy has been attributed to the rough assumption made by de Gennes and his coworkers; namely, each chain-segment "blob" confined inside the pore is not a hard sphere, so the effective length along the flow direction is much longer than the pore diameter.
    Finally, by varying the solution temperature we varied the chain conformation; our result shows that q_c has a minimum which is near, but not exactly located at, the theta temperature, possibly leading to a better way to determine the true ideal state of a polymer solution, at which all virial coefficients, not only the second, vanish. Chapter 5 uses polymer solutions made of different mixtures of linear and star chains; we have demonstrated that flushing these solution mixtures through a nanopore at a properly chosen flow rate can effectively and cleanly separate linear and star chains, no matter whether the linear chains are larger or smaller than the star chains. Chapter 6 further investigates how star-like polystyrene passes through a given nanopore under the flow field. Star polystyrene chains with different arm lengths (L_A) and numbers (f) passing through a nanopore (20 nm) under an elongational flow field were investigated in terms of the flow-rate-dependent relative retention ((C_0 - C)/C_0), where C_0 and C are the polymer concentrations before and after the ultrafiltration. Our results reveal that for a given arm length (L_A), the critical flow rate (q_c,star), below which star chains are blocked, dramatically increases with the total arm number (f); but for a given f, it is nearly independent of L_A, contradicting the previous prediction made by de Gennes and Brochard-Wyart. We have revised their theory in the region f_in < f_out and also accounted for the effective length of each blob, where f_in and f_out are the numbers of arms inside and outside the pore, respectively. In the revision, we show that q_c,star is indeed independent of L_A but related to f and f_in in two different ways, depending on whether f_in ≤ f/2 or ≥ f/2. A comparison of our experimental and calculated results reveals that most star chains pass through the nanopores with f_in ~ f/2. A further study of the temperature-dependent (C_0 - C)/C_0 of polystyrene in cyclohexane reveals that there exists a minimum of q_c,star at ~38 °C, close to its theta temperature (~34.5 °C).
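
    The benchmark prediction the thesis tests can be stated compactly. For reference, here is the de Gennes/Brochard-Wyart result quoted in the abstract, set in LaTeX:

    ```latex
    % Critical flow rate for a linear chain to pass through a nanopore;
    % predicted independent of chain length and pore size.
    % k_B: Boltzmann constant, T: absolute temperature, \eta: solvent viscosity.
    q_c \simeq \frac{k_B T}{3\pi\eta}
    ```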

  6. 40 CFR 98.164 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... CEMS is not used to measure GHG emissions. (2) Fossil fuel consumption, when, pursuant to § 98.33(e), the owner or operator of a unit that uses CEMS to quantify CO2 emissions and that combusts both fossil...

  7. 40 CFR 98.164 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... CEMS is not used to measure GHG emissions. (2) Fossil fuel consumption, when, pursuant to § 98.33(e), the owner or operator of a unit that uses CEMS to quantify CO2 emissions and that combusts both fossil...

  8. Ameliorative effect of nanoencapsulated flavonoid against chlorpyrifos-induced hepatic oxidative damage and immunotoxicity in Wistar rats.

    PubMed

    Suke, Sanvidhan G; Sherekar, Prasad; Kahale, Vivek; Patil, Shaktipal; Mundhada, Dharmendra; Nanoti, Vivek M

    2018-04-18

    The theme of the present work is to evaluate the protective effect of nanoencapsulated quercetin (NEQ) against chlorpyrifos (CPF)-induced hepatic damage and immune alterations in animals. Nanoparticle (NP) drug encapsulation was prepared. Forty male Wistar rats were divided into eight groups. Two groups served as control and CPF (13.5 mg/kg) treatment for 28 days. Three further groups were treated with free quercetin (QC), NP and NEQ, respectively, at 3 mg/kg for 15 days, whereas the remaining three groups received treatment with CPF plus QC, NP or NEQ, respectively, for 15 days. The results show significantly altered oxidative stress in the liver tissue, liver enzyme parameters in blood, and immune responses in CPF-treated rats compared to controls. Administration of NEQ attenuated the biochemical and immunological alterations. The liver histopathological analysis confirmed pathological improvement. Hence, use of NEQ appears to be beneficial to a great extent in attenuating and restoring hepatic oxidative damage and immune alteration sustained by pesticide exposure. © 2018 Wiley Periodicals, Inc.

  9. DETERMINATION OF NATIONAL DIAGNOSTIC REFERENCE LEVELS IN COMPUTED TOMOGRAPHY EXAMINATIONS OF IRAN BY A NEW QUALITY CONTROL-BASED DOSE SURVEY METHOD.

    PubMed

    Sohrabi, Mehdi; Parsi, Masoumeh; Mianji, Fereidoun

    2018-05-01

    National diagnostic reference levels (NDRLs) of Iran were determined for the four most common CT examinations: head, sinus, chest and abdomen/pelvis. A new 'quality control (QC)-based dose survey method', as developed by us, was applied to 157 CT scanners in Iran (2014-15) with different slice classes, models and geographic spread across the country. The NDRLs for head, sinus, chest and abdomen/pelvis examinations are 58, 29, 12 and 14 mGy for CTDI_vol and 750, 300, 300 and 650 mGy·cm for DLP, respectively. The QC-based dose survey method was further shown to be a simple, accurate and practical method for time- and cost-effective NDRL determination. One effective approach for optimization of CT examination protocols at the national level is the provision of adequate standardized training of radiologists, technicians and medical physicists on patient radiation protection principles and implementation of the DRL concept in clinical practice.

  10. Real Time Quality Control Methods for Cued EMI Data Collection

    DTIC Science & Technology

    2016-03-14

    This project evaluated the effectiveness of in-field quality control (QC) procedures during cued electromagnetic induction (EMI) data collection.

  11. Mixed biogenic and hydrothermal quartz in Permian lacustrine shale of Santanghu Basin, NW China: implications for penecontemporaneous transformation of silica minerals

    NASA Astrophysics Data System (ADS)

    Jiao, Xin; Liu, Yiqun; Yang, Wan; Zhou, Dingwu; Wang, Shuangshuang; Jin, Mengqi; Sun, Bin; Fan, Tingting

    2018-01-01

    The cycling of the various polymorphs of authigenic silica minerals is a complex and long-term process. A special type of composite quartz (Qc) grain in tuffaceous shale of the Permian Lucaogou Formation in the sediment-starved, volcanically and hydrothermally active intracontinental lacustrine Santanghu rift basin (NW China) is studied in detail to demonstrate such processes. Samples from one well in the central basin were subjected to petrographic, elemental chemical, and fluid-inclusion analyses. About 200 Qc-bearing laminae, 0.1-2 mm (mainly ~1 mm) thick, are intercalated within tuffaceous shale laminae. The Qc grains occur as framework grains dispersed in an igneous feldspar-dominated matrix, suggesting episodic accumulation. The Qc grains are bedding-parallel, uniform in size (hundreds of µm), elongate, and radial in crystal pattern, suggesting a biogenic origin. Qc grains are composed of a core of anhedral microcrystalline quartz and an outer part of subhedral mega-quartz grains whose edges consist of small euhedral quartz crystals, indicating multiple episodes of recrystallization and overgrowth. The abundances of Al and Ti in the quartz crystals and the temperatures estimated from fluid inclusions in Qc grains indicate that these processes are related to hydrothermal fluids. Finally, the Qc grains are interpreted as original silica precipitated in microorganism (algae?) cysts, which was reworked by bottom currents and altered by hydrothermal fluids, recrystallizing and overgrowing during penecontemporaneous shallow burial. It is postulated that episodic volcanic and hydrothermal activity changed lake-water chemistry, temperature, and nutrient supply, resulting in variations in microorganism productivity and silica cycling. The transformation of authigenic silica from amorphous to well-crystallized forms occurred within a short time span during shallow burial.

  12. Formulation of Subgrid Variability and Boundary-Layer Cloud Cover in Large-Scale Models

    DTIC Science & Technology

    1999-02-28

    Only fragments of this report were extracted. The recoverable content indicates that the study distinguishes surface classes (burned and unburned landscapes, saline and non-saline soils, irrigated and non-irrigated crops) in a coupled land-surface formulation, and defines qc,sat as the canopy saturation specific humidity, a function of the canopy temperature Tc, used together with the evaporation fraction to determine the canopy specific humidity qc.

  13. A highly selective and sensitive turn-on probe for aluminum(III) based on quinoline Schiff's base and its cell imaging

    NASA Astrophysics Data System (ADS)

    Zhou, Fenfen; Wang, Hongqing; Liu, Pengying; Hu, Qinghua; Wang, Yuyuan; Liu, Can; Hu, Jiangke

    2018-02-01

    A reversible Schiff's base fluorescence probe for Al3+, (3,5-dichloro-2-hydroxybenzylidene) quinoline-2-carbohydrazide (QC), based on a quinoline derivative, has been designed, synthesized and evaluated. QC exhibited high sensitivity and selectivity toward Al3+ in EtOH-H2O (v/v = 1:9, pH = 6) by forming a 1:1 complex with Al3+, and its detection limit for Al3+ was as low as 0.012 μM. Furthermore, the binding of the QC-Al3+ complex was broken by F-, so the system could also be used to monitor F- in the future. The fluorescence enhancement of QC can be attributed to the inhibition of PET and ESIPT and the emergence of a CHEF process induced by Al3+. More importantly, QC was not only successfully used for the determination of trace Al3+ in tap water and human blood serum, but was also valid for fluorescence imaging of Al3+ in HeLa cells.
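    Detection limits such as the 0.012 μM quoted here are conventionally obtained as 3σ/slope from a calibration curve; a minimal sketch under that assumption (all numbers are hypothetical):

        import numpy as np

        # Hypothetical calibration: fluorescence intensity vs. [Al3+] in uM.
        conc = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
        intensity = np.array([12.1, 148.0, 285.5, 421.9, 560.3])
        slope, _ = np.polyfit(conc, intensity, 1)  # linear fit: slope, intercept

        # sigma = standard deviation of repeated blank measurements (hypothetical).
        blanks = np.array([11.8, 12.3, 12.0, 12.4, 11.9, 12.2])
        lod = 3 * blanks.std(ddof=1) / slope
        print(f"LOD ~ {lod:.3f} uM")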

  14. A novel QC-LDPC code based on the finite field multiplicative group for optical communications

    NASA Astrophysics Data System (ADS)

    Yuan, Jian-guo; Xu, Liang; Tong, Qing-zhen

    2013-09-01

    A novel construction method for quasi-cyclic low-density parity-check (QC-LDPC) codes is proposed based on the finite field multiplicative group; it offers simpler construction, more flexible adjustment of code length and code rate, and lower encoding/decoding complexity. Moreover, a regular QC-LDPC(5334,4962) code is constructed. Simulation results show that the constructed QC-LDPC(5334,4962) code achieves better error-correction performance over the additive white Gaussian noise (AWGN) channel with iterative sum-product algorithm (SPA) decoding. At a bit error rate (BER) of 10^-6, the net coding gain (NCG) of the constructed QC-LDPC(5334,4962) code is 1.8 dB, 0.9 dB and 0.2 dB higher than that of the classic RS(255,239) code in ITU-T G.975, the LDPC(32640,30592) code in ITU-T G.975.1 and the SCG-LDPC(3969,3720) code constructed by the random method, respectively. It is therefore better suited to optical communication systems.
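    The paper's specific multiplicative-group construction is not reproduced in the abstract; the sketch below only illustrates the generic QC-LDPC mechanism it builds on, namely expanding an exponent matrix into circulant permutation matrices (the exponent matrix and circulant size here are arbitrary examples):

        import numpy as np

        def circulant_permutation(size: int, shift: int) -> np.ndarray:
            """size x size identity matrix cyclically shifted by `shift` columns."""
            return np.roll(np.eye(size, dtype=int), shift % size, axis=1)

        def expand_qc_ldpc(exponents: np.ndarray, size: int) -> np.ndarray:
            """Expand an exponent matrix into a binary QC-LDPC parity-check matrix.
            Entry e >= 0 becomes a circulant permutation matrix with shift e;
            entry -1 becomes an all-zero block (a common convention)."""
            rows = []
            for row in exponents:
                blocks = [circulant_permutation(size, e) if e >= 0
                          else np.zeros((size, size), dtype=int) for e in row]
                rows.append(np.hstack(blocks))
            return np.vstack(rows)

        # Illustrative 2x3 exponent matrix expanded with 5x5 circulants -> 10x15 H.
        E = np.array([[0, 1, 2],
                      [2, 4, -1]])
        H = expand_qc_ldpc(E, 5)
        print(H.shape)  # (10, 15)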

  15. Overexpression of the Qc-SNARE gene OsSYP71 enhances tolerance to oxidative stress and resistance to rice blast in rice (Oryza sativa L.).

    PubMed

    Bao, Yong-Mei; Sun, Shu-Jing; Li, Meng; Li, Li; Cao, Wen-Lei; Luo, Jia; Tang, Hai-Juan; Huang, Ji; Wang, Zhou-Fei; Wang, Jian-Fei; Zhang, Hong-Sheng

    2012-08-10

    OsSYP71 is an oxidative stress- and rice blast-responsive gene that encodes a Qc-SNARE protein in rice. Qc-SNARE proteins belong to the superfamily of SNAREs (soluble N-ethylmaleimide-sensitive factor attachment protein receptors), which function as important components of the vesicle-trafficking machinery in eukaryotic cells. In this paper, 12 Qc-SNARE genes were isolated from rice, and the expression patterns of 9 of them were examined in various tissues and in seedlings challenged with oxidative stresses or inoculated with rice blast. The expression of OsSYP71 was clearly up-regulated under these stresses. Rice plants overexpressing OsSYP71 showed more tolerance to oxidative stress and greater resistance to rice blast than wild-type plants. These results indicate that Qc-SNAREs play an important role in the rice response to environmental stresses, and that OsSYP71 is useful for engineering crop plants with enhanced tolerance to oxidative stress and resistance to rice blast. Copyright © 2012 Elsevier B.V. All rights reserved.

  16. Field assessment of dried Plasmodium falciparum samples for malaria rapid diagnostic test quality control and proficiency testing in Ethiopia.

    PubMed

    Tamiru, Afework; Boulanger, Lucy; Chang, Michelle A; Malone, Joseph L; Aidoo, Michael

    2015-01-21

    Rapid diagnostic tests (RDTs) are now widely used for laboratory confirmation of suspected malaria cases, in line with the World Health Organization recommendation for universal testing before treatment. However, many malaria programmes lack quality control (QC) processes to assess RDT use under field conditions. Prior research showed the feasibility of using the dried tube specimen (DTS) method to preserve Plasmodium falciparum parasites for use as QC samples for RDTs. This study focused on the use of DTS for RDT QC and proficiency testing under field conditions. DTS were prepared using cultured P. falciparum at densities of 500 and 1,000 parasites/μL; 50 μL aliquots of these, along with parasite-negative human blood controls (0 parasites/μL), were air-dried in specimen tubes and their reactivity verified after rehydration. The DTS were used in a field study in the Oromia Region of Ethiopia. Replicate DTS samples containing 0, 500 and 1,000 parasites/μL were stored at 4°C at a reference laboratory and at ambient temperature at two nearby health facilities. At weeks 0, 4, 8, 12, 16, 20, and 24, the DTS were rehydrated and tested on RDTs stored under manufacturer-recommended temperatures at the reference laboratory and on RDTs stored under site-specific conditions at the two health facilities. Reactivity of DTS stored at 4°C at the reference laboratory, tested on RDTs stored at the reference laboratory, was considered the gold standard for assessing DTS stability. A proficiency-testing panel consisting of one negative and three positive samples, monitored with a checklist, was administered at weeks 12 and 24. At all seven time points, DTS stored at both the reference laboratory and the health facilities were reactive on RDTs stored under the recommended temperature and under field conditions, and the DTS without malaria parasites were negative. At the reference laboratory and one health facility, a 500 parasites/μL DTS from the proficiency panel was falsely reported as negative at week 24 owing to errors in interpreting faint test lines. The DTS method can be used under field conditions to supplement other RDT QC methods and to monitor health worker proficiency in Ethiopia, and possibly in other malaria-endemic countries.

  17. Explicit Lower and Upper Bounds on the Entangled Value of Multiplayer XOR Games

    NASA Astrophysics Data System (ADS)

    Briët, Jop; Vidick, Thomas

    2013-07-01

    The study of quantum-mechanical violations of Bell inequalities is motivated by the investigation, and the eventual demonstration, of the nonlocal properties of entanglement. In recent years, Bell inequalities have found a fruitful re-formulation using the language of multiplayer games originating in computer science. This paper studies the nonlocal properties of entanglement in the context of the simplest such games, called XOR games. When there are two players, it is well known that the maximum bias (the advantage over random play) of players using entanglement can be at most a constant times greater than that of classical players. Recently, Pérez-García et al. (Commun. Math. Phys. 279:455, 2008) showed that no such bound holds when there are three or more players: the use of entanglement can provide an unbounded advantage, one that scales with the number of questions in the game. Their proof relies on non-trivial results from operator space theory and gives a non-explicit existence proof, leading to a game with a very large number of questions and only loose control over the local dimension of the players' shared entanglement. We give a new, simple and explicit (though still probabilistic) construction of a family of three-player XOR games which achieve a large quantum-classical gap (QC-gap). This QC-gap is exponentially larger than the one given by Pérez-García et al. in terms of the size of the game, achieving a QC-gap of order √N with N² questions per player. In terms of the dimension of the entangled state required, we achieve the same (optimal) QC-gap of √N for a state of local dimension N per player. Moreover, the optimal entangled strategy is very simple, involving observables defined by tensor products of the Pauli matrices. Additionally, we give the first upper bound on the maximal QC-gap in terms of the number of questions per player, showing that our construction is only quadratically off in that respect. Our results rely on probabilistic estimates of the norms of random matrices and higher-order tensors, which may be of independent interest.
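    As a reference for the terminology, the standard bias definitions behind the QC-gap can be written out; these are textbook definitions, not formulas taken from the paper. For a three-player XOR game G with question distribution π and answer signs M_ijk, the classical and entangled biases are

        \beta(G) = \max_{a,b,c \,:\, Q \to \{\pm 1\}}
          \Big| \sum_{i,j,k} \pi(i,j,k)\, M_{ijk}\, a_i b_j c_k \Big|,
        \qquad
        \beta^*(G) = \sup_{|\psi\rangle,\, A_i, B_j, C_k}
          \Big| \sum_{i,j,k} \pi(i,j,k)\, M_{ijk}\,
          \langle\psi|\, A_i \otimes B_j \otimes C_k \,|\psi\rangle \Big|,

    where the A_i, B_j, C_k are ±1-valued observables. The QC-gap is the ratio β*(G)/β(G), which the construction above makes as large as order √N.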

  18. Evaluation of digital radiography practice using exposure index tracking

    PubMed Central

    Zhou, Yifang; Allahverdian, Janet; Nute, Jessica L.; Lee, Christina

    2016-01-01

    Some digital radiography (DR) detectors and software allow remote download of exam statistics, including image reject status, body part, projection, and exposure index (EI). The ability to collect data automatically from multiple DR units is conducive to a quality control (QC) program that monitors institutional radiographic exposures. We have implemented such a QC program with the goal of identifying outliers in machine radiation output and opportunities to improve radiation dose levels. We studied the QC records of four digital detectors in greater detail on a monthly basis for one year. Although individual patient entrance skin exposure varied, the radiation dose levels to the detectors were made consistent via phototimer recalibration. The exposure data stored on each digital detector were periodically downloaded in spreadsheet format for analysis. The EI median and standard deviation were calculated for each protocol (by body part), and EI histograms were created for torso protocols. When histograms of EI values for different units were compared, we observed differences of up to 400 in average EI (representing a 60% difference in radiation levels to the detector) between units nominally calibrated to the same EI. We identified distinct components of the EI distributions which, in some cases, had mean EI values 300 apart. Peaks were observed at the currently calibrated EI, at a previously calibrated EI, and at an EI representing computed radiography (CR) techniques. Our findings in this ongoing project have allowed us to make useful interventions, from emphasizing the use of phototimers instead of institutional memory of manual techniques to improving our phototimer calibration. We believe that this QC program can be implemented at other sites and can reveal problems with radiation levels in the aggregate that are difficult to identify on a case-by-case basis. PMID:27929507
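    A minimal pandas sketch of the per-protocol EI summary described above; the file name and column names are hypothetical, since vendor exports differ:

        import pandas as pd

        # Hypothetical detector export with one row per exposure.
        df = pd.read_csv("detector_exposure_log.csv")  # columns: body_part, projection, EI

        # Median and standard deviation of EI per protocol, as in the QC program.
        summary = df.groupby(["body_part", "projection"])["EI"].agg(["median", "std", "count"])

        # Flag protocols whose median EI sits far from the calibrated target;
        # the 300-unit threshold mirrors the separation observed in the study.
        TARGET_EI = 400  # hypothetical calibration target
        print(summary[(summary["median"] - TARGET_EI).abs() > 300])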

  19. Effect of microwave disinfection on the surface roughness of three denture base resins after tooth brushing.

    PubMed

    Izumida, Fernanda Emiko; Ribeiro, Roberta Chuqui; Giampaolo, Eunice Teresinha; Machado, Ana Lucia; Pavarina, Ana Cláudia; Vergani, Carlos Eduardo

    2011-12-01

    This study investigated the effect of microwave disinfection on the surface roughness of three heat-polymerised acrylic resins after tooth brushing. Microwave disinfection has been recommended to reduce cross-contamination; however, this procedure may also influence the physical and mechanical properties of acrylic resins. Specimens (40 × 20 × 2 mm) of the resins Lucitone 550 (L), QC 20 (QC) and Acron MC (A) were prepared and divided into four groups (n = 10): control groups 1 (C1) and 2 (C2), stored in water for 48 h or 7 days; and test groups 1 (MW2) and 2 (MW7), stored in water for 48 h and then microwave-disinfected (650 W for 6 min) daily for 2 or 7 days, respectively. After these treatments, the specimens were placed in a tooth brushing machine running at 60 reciprocal strokes per minute and brushed for 20,000 strokes, which represents approximately 2 years of denture cleansing. The surface roughness (Ra) was evaluated before and after tooth brushing. Data were analysed by two-way ANOVA and Tukey Honestly Significant Difference (HSD) post hoc tests (α = 0.05). The data revealed significant changes between test groups for the A and L resins. Comparison among resins revealed that for MW7, the roughness of A was significantly lower than that of L, and after the seven microwave cycles the roughness values of QC were significantly lower than those of L. The roughness of QC after brushing was not significantly affected by microwave disinfection, whereas for A and L, seven microwave cycles resulted in increased roughness. © 2011 The Gerodontology Society and John Wiley & Sons A/S.
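    A sketch of the two-way ANOVA the authors describe, using statsmodels; the Ra values below are invented for illustration (the real study had n = 10 per group):

        import pandas as pd
        import statsmodels.api as sm
        from statsmodels.formula.api import ols

        # Hypothetical surface-roughness data: resin x treatment, 2 replicates/cell.
        df = pd.DataFrame({
            "resin":     ["A", "A", "L", "L", "QC", "QC"] * 2,
            "treatment": ["C2", "MW7"] * 6,
            "Ra": [0.18, 0.25, 0.22, 0.41, 0.15, 0.16,
                   0.20, 0.27, 0.24, 0.38, 0.14, 0.17],
        })

        # Two-way ANOVA with interaction (alpha = 0.05), as in the study.
        model = ols("Ra ~ C(resin) * C(treatment)", data=df).fit()
        print(sm.stats.anova_lm(model, typ=2))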

  20. FPGA implementation of high-performance QC-LDPC decoder for optical communications

    NASA Astrophysics Data System (ADS)

    Zou, Ding; Djordjevic, Ivan B.

    2015-01-01

    Forward error correction is one of the key technologies enabling next-generation high-speed fiber-optic communications. Quasi-cyclic (QC) low-density parity-check (LDPC) codes have been considered one of the most promising candidates due to their large coding gain and low implementation complexity. In this paper, we present our designed QC-LDPC code with girth 10 and 25% overhead based on pairwise balanced design. By FPGA-based emulation, we demonstrate that the 5-bit soft-decision LDPC decoder can achieve an 11.8 dB net coding gain with no error floor at a BER of 10^-15, without using any outer code or post-processing method. We believe that the proposed single QC-LDPC code is a promising solution for 400 Gb/s optical communication systems and beyond.
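    The full decoder is beyond an abstract-sized sketch, but the "5-bit soft-decision" input stage can be illustrated: channel log-likelihood ratios (LLRs) are scaled, rounded and clipped to a signed 5-bit range before message passing. The step size below is my assumption, not a value from the paper:

        import numpy as np

        def quantize_llr_5bit(llr: np.ndarray, step: float = 0.5) -> np.ndarray:
            """Uniformly quantize LLRs to signed 5-bit integers in [-15, 15]."""
            return np.clip(np.round(llr / step), -15, 15).astype(np.int8)

        # Example: channel LLRs prepared for a fixed-point QC-LDPC decoder.
        llrs = np.array([-9.3, -0.2, 0.7, 3.1, 12.0])
        print(quantize_llr_5bit(llrs))  # [-15   0   1   6  15]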
