Selecting Statistical Procedures for Quality Control Planning Based on Risk Management.
Yago, Martín; Alcover, Silvia
2016-07-01
According to the traditional approach to statistical QC planning, the performance of QC procedures is assessed in terms of their probability of rejecting an analytical run that contains critical size errors (PEDC). Recently, the maximum expected increase in the number of unacceptable patient results reported during the presence of an undetected out-of-control error condition [Max E(NUF)] has been proposed as an alternative QC performance measure because it aligns more closely with the current introduction of risk management concepts for QC planning in the clinical laboratory. We used a statistical model to investigate the relationship between PEDC and Max E(NUF) for simple QC procedures widely used in clinical laboratories and to construct charts relating Max E(NUF) to the capability of the analytical process, allowing for QC planning based on the risk of harm to a patient due to the reporting of erroneous results. A QC procedure shows nearly the same Max E(NUF) value when used for controlling analytical processes with the same capability, and there is a close relationship between PEDC and Max E(NUF) for simple QC procedures; therefore, the value of PEDC can be estimated from the value of Max E(NUF) and vice versa. QC procedures selected for their high PEDC value are also characterized by a low value for Max E(NUF). The PEDC value can be used for estimating the probability of patient harm, allowing for the selection of appropriate QC procedures in QC planning based on risk management. © 2016 American Association for Clinical Chemistry.
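To make the PEDC idea concrete, here is a minimal sketch (a textbook-style illustration, not the authors' statistical model) of the probability that a simple 1-ks control rule detects a critical systematic shift; the TEa, bias, and CV values are invented for the example:

```python
from scipy.stats import norm

def p_rejection(delta_se, k=3.0, n=2):
    """P(run rejected) for a 1_ks rule -- any of n controls outside
    +/- k SD -- given a systematic shift of delta_se (in SD units)."""
    p_single = 1.0 - (norm.cdf(k - delta_se) - norm.cdf(-k - delta_se))
    return 1.0 - (1.0 - p_single) ** n

def critical_shift(tea_pct, bias_pct, cv_pct):
    """Critical systematic error (SD units): the shift that leaves 5%
    of patient results outside the allowable total error TEa."""
    return (tea_pct - abs(bias_pct)) / cv_pct - 1.65

dse_crit = critical_shift(tea_pct=10.0, bias_pct=1.0, cv_pct=2.0)
print(f"critical shift = {dse_crit:.2f} SD")
print(f"P(EDC) for 1-3s, n=2: {p_rejection(dse_crit, k=3.0, n=2):.3f}")
```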
Cian, Francesco; Villiers, Elisabeth; Archer, Joy; Pitorri, Francesca; Freeman, Kathleen
2014-06-01
Quality control (QC) validation is an essential tool in total quality management of a veterinary clinical pathology laboratory. Cost analysis can be a valuable technique to help identify an appropriate QC procedure for the laboratory, although this has never been reported in veterinary medicine. The aim of this study was to determine the applicability of the Six Sigma Quality Cost Worksheets in the evaluation of possible candidate QC rules identified by QC validation. Three months of internal QC records were analyzed. EZ Rules 3 software was used to evaluate candidate QC procedures, and the costs associated with the application of different QC rules were calculated using the Six Sigma Quality Cost Worksheets. The costs associated with the current and the candidate QC rules were compared, and the amount of cost savings was calculated. There was a significant saving when the candidate 1-2.5s, n = 3 rule was applied instead of the currently utilized 1-2s, n = 3 rule. The savings were 75% per year (£8232.5) based on re-evaluating all of the patient samples in addition to the controls, and 72% per year (£822.4) based on re-analyzing only the control materials. The savings were also shown to change according to the number of samples analyzed and the number of daily QC procedures performed. These calculations demonstrate the importance of selecting an appropriate QC procedure, and the usefulness of the Six Sigma Quality Cost Worksheet in determining the most cost-effective rule(s) when several candidate rules are identified by QC validation. © 2014 American Society for Veterinary Clinical Pathology and European Society for Veterinary Clinical Pathology.
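The underlying cost arithmetic can be sketched as follows; the event counts, false-rejection rates, and unit costs below are invented placeholders, not the worksheet values used in the paper:

```python
def annual_false_rejection_cost(p_false_reject, qc_events_per_year, cost_per_rejection):
    """Expected yearly cost of runs rejected by QC false alarms."""
    return p_false_reject * qc_events_per_year * cost_per_rejection

qc_events = 3 * 365              # three QC events per day (assumed)
cost_per_rejection = 10.0        # cost of re-analyzing controls, GBP (assumed)

# Approximate false-rejection probabilities for n = 3 controls:
# 1-2s: 1 - 0.9545**3 ~ 0.13;  1-2.5s: 1 - 0.9876**3 ~ 0.04
for rule, pfr in [("1-2s, n=3", 0.13), ("1-2.5s, n=3", 0.04)]:
    cost = annual_false_rejection_cost(pfr, qc_events, cost_per_rejection)
    print(f"{rule}: ~GBP {cost:.0f} per year")
```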
A Total Quality-Control Plan with Right-Sized Statistical Quality-Control.
Westgard, James O
2017-03-01
A new Clinical Laboratory Improvement Amendments option for risk-based quality-control (QC) plans became effective in January 2016. Called an Individualized QC Plan, this option requires the laboratory to perform a risk assessment, develop a QC plan, and implement a QC program to monitor the ongoing performance of the QC plan. Difficulties in performing a risk assessment may limit the validity of an Individualized QC Plan. A better alternative is to develop a Total QC Plan including a right-sized statistical QC procedure to detect medically important errors. Westgard Sigma Rules provides a simple way to select the right control rules and the right number of control measurements. Copyright © 2016 Elsevier Inc. All rights reserved.
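The sigma-metric that "right-sizing" rests on is straightforward to compute; the rule suggestions below paraphrase commonly published guidance and are assumptions, not a reproduction of the Westgard Sigma Rules tool:

```python
def sigma_metric(tea_pct, bias_pct, cv_pct):
    """Sigma = (TEa - |bias|) / CV, all expressed in percent of target."""
    return (tea_pct - abs(bias_pct)) / cv_pct

def suggest_sqc(sigma):
    if sigma >= 6: return "1-3s, n=2 (single rule, minimal QC)"
    if sigma >= 5: return "1-3s/2-2s/R-4s, n=2"
    if sigma >= 4: return "1-3s/2-2s/R-4s/4-1s, n=4"
    return "multirule with n>=6 and increased QC frequency"

s = sigma_metric(tea_pct=10.0, bias_pct=1.5, cv_pct=1.7)
print(f"sigma = {s:.1f} -> {suggest_sqc(s)}")
```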
Jayakody, Chatura; Hull-Ryde, Emily A
2016-01-01
Well-defined quality control (QC) processes are used to determine whether a certain procedure or action conforms to a widely accepted standard and/or set of guidelines, and are important components of any laboratory quality assurance program (Popa-Burke et al., J Biomol Screen 14: 1017-1030, 2009). In this chapter, we describe QC procedures useful for monitoring the accuracy and precision of laboratory instrumentation, most notably automated liquid dispensers. Two techniques, gravimetric QC and photometric QC, are highlighted in this chapter. When used together, these simple techniques provide a robust process for evaluating liquid handler accuracy and precision, and critically underpin high-quality research programs.
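As an illustration of the gravimetric technique described, a short sketch of inferring dispensed volume from mass and density, then reporting accuracy and precision; the target volume, density value, and masses are assumptions:

```python
import statistics

DENSITY_WATER = 0.9982  # g/mL at ~20 degC (assumed test liquid)

def gravimetric_qc(masses_g, target_ul):
    """Return mean volume (uL), accuracy (%) vs target, and CV (%)."""
    volumes_ul = [m / DENSITY_WATER * 1000.0 for m in masses_g]
    mean_v = statistics.mean(volumes_ul)
    cv_pct = statistics.stdev(volumes_ul) / mean_v * 100.0
    accuracy_pct = (mean_v - target_ul) / target_ul * 100.0
    return mean_v, accuracy_pct, cv_pct

masses = [0.0497, 0.0501, 0.0499, 0.0502, 0.0498]  # five 50-uL dispenses
mean_v, acc, cv = gravimetric_qc(masses, target_ul=50.0)
print(f"mean = {mean_v:.2f} uL, accuracy = {acc:+.2f}%, CV = {cv:.2f}%")
```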
Quality Control for Scoring Tests Administered in Continuous Mode: An NCME Instructional Module
ERIC Educational Resources Information Center
Allalouf, Avi; Gutentag, Tony; Baumer, Michal
2017-01-01
Quality control (QC) in testing is paramount. QC procedures for tests can be divided into two types. The first type, one that has been well researched, is QC for tests administered to large population groups on few administration dates using a small set of test forms (e.g., large-scale assessment). The second type is QC for tests, usually…
Quality control and assurance for validation of DOS/I measurements
NASA Astrophysics Data System (ADS)
Cerussi, Albert; Durkin, Amanda; Kwong, Richard; Quang, Timothy; Hill, Brian; Tromberg, Bruce J.; MacKinnon, Nick; Mantulin, William W.
2010-02-01
Ongoing multi-center clinical trials are crucial for Biophotonics to gain acceptance in medical imaging. In these trials, quality control (QC) and assurance (QA) are key to success and provide "data insurance". Quality control and assurance deal with standardization, validation, and compliance of procedures, materials and instrumentation. Specifically, QC/QA involves systematic assessment of testing materials, instrumentation performance, standard operating procedures, data logging, analysis, and reporting. QC and QA are important for FDA accreditation and acceptance by the clinical community. Our Biophotonics research in the Network for Translational Research in Optical Imaging (NTROI) program for breast cancer characterization focuses on QA/QC issues primarily related to the broadband Diffuse Optical Spectroscopy and Imaging (DOS/I) instrumentation, because this is an emerging technology with limited standardized QC/QA in place. In the multi-center trial environment, we implement QA/QC procedures: 1. Standardize and validate calibration standards and procedures. (DOS/I technology requires both frequency domain and spectral calibration procedures using tissue simulating phantoms and reflectance standards, respectively.) 2. Standardize and validate data acquisition, processing and visualization (optimize instrument software-EZDOS; centralize data processing) 3. Monitor, catalog and maintain instrument performance (document performance; modularize maintenance; integrate new technology) 4. Standardize and coordinate trial data entry (from individual sites) into centralized database 5. Monitor, audit and communicate all research procedures (database, teleconferences, training sessions) between participants ensuring "calibration". This manuscript describes our ongoing efforts, successes and challenges implementing these strategies.
Yago, Martín
2017-05-01
QC planning based on risk management concepts can reduce the probability of harming patients due to an undetected out-of-control error condition. It does this by selecting appropriate QC procedures to decrease the number of erroneous results reported. The selection can be easily made by using published nomograms for simple QC rules when the out-of-control condition results in increased systematic error. However, increases in random error also occur frequently and are difficult to detect, which can result in erroneously reported patient results. A statistical model was used to construct charts for the 1ks and X̄/χ² rules. The charts relate the increase in the number of unacceptable patient results reported due to an increase in random error to the capability of the measurement procedure. They thus allow for QC planning based on the risk of patient harm due to the reporting of erroneous results. 1ks rules are simple, all-around rules. Their ability to deal with increases in within-run imprecision is minimally affected by the possible presence of significant, stable, between-run imprecision. X̄/χ² rules perform better when the number of controls analyzed during each QC event is increased to improve QC performance. Using nomograms simplifies the selection of statistical QC procedures to limit the number of erroneous patient results reported due to an increase in analytical random error. The selection largely depends on the presence or absence of stable between-run imprecision. © 2017 American Association for Clinical Chemistry.
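A rough sketch of the two rule families' power against a fold-increase in random error (simplified relative to the paper's model; the k, n, and α choices are arbitrary):

```python
from scipy.stats import norm, chi2

def p_reject_1ks(lam, k=3.0, n=2):
    """Any of n controls outside +/- k SD when true SD = lam x stable SD."""
    p_single = 2.0 * (1.0 - norm.cdf(k / lam))
    return 1.0 - (1.0 - p_single) ** n

def p_reject_chi2(lam, n=4, alpha=0.01):
    """Chi-square rule on the within-run variance of n controls."""
    crit = chi2.ppf(1.0 - alpha, df=n - 1)
    return 1.0 - chi2.cdf(crit / lam**2, df=n - 1)

for lam in (1.0, 1.5, 2.0, 3.0):
    print(f"lam={lam}: 1-3s n=2 -> {p_reject_1ks(lam):.3f}, "
          f"chi2 n=4 -> {p_reject_chi2(lam):.3f}")
```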
A Framework for a Quality Control System for Vendor/Processor Contracts.
ERIC Educational Resources Information Center
Advanced Technology, Inc., Reston, VA.
A framework for monitoring quality control (QC) of processor contracts administered by the Department of Education's Office of Student Financial Assistance (OSFA) is presented and applied to the Pell Grant program. Guidelines for establishing QC measures and standards are included, and the uses of a sampling procedure in the QC system are…
NASA Technical Reports Server (NTRS)
Robinson, Michael; Steiner, Matthias; Wolff, David B.; Ferrier, Brad S.; Kessinger, Cathy; Einaudi, Franco (Technical Monitor)
2000-01-01
The primary function of the TRMM Ground Validation (GV) Program is to create GV rainfall products that provide basic validation of satellite-derived precipitation measurements for select primary sites. A fundamental and extremely important step in creating high-quality GV products is radar data quality control. Quality control (QC) processing of TRMM GV radar data is based on some automated procedures, but the current QC algorithm is not fully operational and requires significant human interaction to assure satisfactory results. Moreover, the TRMM GV QC algorithm, even with continuous manual tuning, still cannot completely remove all types of spurious echoes. In an attempt to improve the current operational radar data QC procedures of the TRMM GV effort, an intercomparison of several QC algorithms has been conducted. This presentation will demonstrate how various radar data QC algorithms affect accumulated radar rainfall products. In all, six different QC algorithms will be applied to two months of WSR-88D radar data from Melbourne, Florida. Daily, five-day, and monthly accumulated radar rainfall maps will be produced for each quality-controlled data set. The QC algorithms will be evaluated and compared based on their ability to remove spurious echoes without removing significant precipitation. Strengths and weaknesses of each algorithm will be assessed based on their ability to mitigate both erroneous additions and reductions in rainfall accumulation from spurious echo contamination and true precipitation removal, respectively. Contamination from individual spurious echo categories will be quantified to further diagnose the abilities of each radar QC algorithm. Finally, a cost-benefit analysis will be conducted to determine if a more automated QC algorithm is a viable alternative to the current, labor-intensive QC algorithm employed by TRMM GV.
Rosenbaum, Matthew W; Flood, James G; Melanson, Stacy E F; Baumann, Nikola A; Marzinke, Mark A; Rai, Alex J; Hayden, Joshua; Wu, Alan H B; Ladror, Megan; Lifshitz, Mark S; Scott, Mitchell G; Peck-Palmer, Octavia M; Bowen, Raffick; Babic, Nikolina; Sobhani, Kimia; Giacherio, Donald; Bocsi, Gregary T; Herman, Daniel S; Wang, Ping; Toffaletti, John; Handel, Elizabeth; Kelly, Kathleen A; Albeiroti, Sami; Wang, Sihe; Zimmer, Melissa; Driver, Brandon; Yi, Xin; Wilburn, Clayton; Lewandrowski, Kent B
2018-05-29
In the United States, minimum standards for quality control (QC) are specified in federal law under the Clinical Laboratory Improvement Amendment and its revisions. Beyond meeting this required standard, laboratories have flexibility to determine their overall QC program. We surveyed chemistry and immunochemistry QC procedures at 21 clinical laboratories within leading academic medical centers to assess if standardized QC practices exist for chemistry and immunochemistry testing. We observed significant variation and unexpected similarities in practice across laboratories, including QC frequency, cutoffs, number of levels analyzed, and other features. This variation in practice indicates an opportunity exists to establish an evidence-based approach to QC that can be generalized across institutions.
Planning Risk-Based SQC Schedules for Bracketed Operation of Continuous Production Analyzers.
Westgard, James O; Bayat, Hassan; Westgard, Sten A
2018-02-01
To minimize patient risk, "bracketed" statistical quality control (SQC) is recommended in the new CLSI guidelines for SQC (C24-Ed4). Bracketed SQC requires that a QC event both precedes and follows (brackets) a group of patient samples. In optimizing a QC schedule, the frequency of QC or run size becomes an important planning consideration to maintain quality and also facilitate responsive reporting of results from continuous operation of high production analytic systems. Different plans for optimizing a bracketed SQC schedule were investigated on the basis of Parvin's model for patient risk and CLSI C24-Ed4's recommendations for establishing QC schedules. A Sigma-metric run size nomogram was used to evaluate different QC schedules for processes of different sigma performance. For high Sigma performance, an effective SQC approach is to employ a multistage QC procedure utilizing a "startup" design at the beginning of production and a "monitor" design periodically throughout production. Example QC schedules are illustrated for applications with measurement procedures having 6-σ, 5-σ, and 4-σ performance. Continuous production analyzers that demonstrate high σ performance can be effectively controlled with multistage SQC designs that employ a startup QC event followed by periodic monitoring or bracketing QC events. Such designs can be optimized to minimize the risk of harm to patients. © 2017 American Association for Clinical Chemistry.
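A heavily simplified, hedged sketch of the run-size reasoning: with bracketed SQC, an error arising at a random point within a run is detected, on average, after about (1/Ped - 0.5) runs' worth of patient samples; multiplying by the increase in the probability of producing an unacceptable result gives an expected count of unreliable final results. This paraphrases Parvin-style risk logic, not the paper's exact model:

```python
from scipy.stats import norm

def delta_p_unacceptable(dse, tea_sd):
    """Increase in P(result outside TEa) caused by a shift of dse SD."""
    p0 = 2.0 * (1.0 - norm.cdf(tea_sd))
    p1 = (1.0 - norm.cdf(tea_sd - dse)) + norm.cdf(-tea_sd - dse)
    return p1 - p0

def expected_unreliable(run_size, ped, dse, tea_sd):
    runs_to_detect = 1.0 / ped - 0.5   # error starts mid-run on average
    return run_size * runs_to_detect * delta_p_unacceptable(dse, tea_sd)

# Example: TEa at 5 SD (5-sigma process), shift of 3.35 SD, Ped = 0.91
print(f"{expected_unreliable(run_size=100, ped=0.91, dse=3.35, tea_sd=5.0):.2f}")
```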
NHEXAS PHASE I ARIZONA STUDY--STANDARD OPERATING PROCEDURE FOR FORM QA/QC CHECKS (UA-C-2.0)
The purpose of this SOP is to outline the process of Field Quality Assurance and Quality Control checks. This procedure was followed to ensure consistent data retrieval during the Arizona NHEXAS project and the "Border" study. Keywords: custody; QA/QC; field checks.
The Nation...
The purpose of this SOP is to outline the process of field quality assurance and quality control checks. This procedure was followed to ensure consistent data retrieval during the Arizona NHEXAS project and the Border study. Keywords: custody; QA/QC; field checks.
The U.S.-Mex...
Rossum, Huub H van; Kemperman, Hans
2017-07-26
General application of a moving average (MA) as continuous analytical quality control (QC) for routine chemistry assays has failed due to the lack of a simple method that allows optimization of MAs. A new method was applied to optimize the MA for routine chemistry and was evaluated in daily practice as a continuous analytical QC instrument. MA procedures were optimized using an MA bias detection simulation procedure. Optimization was graphically supported by bias detection curves. Next, all optimal MA procedures that contributed to the quality assurance were run for 100 consecutive days, and MA alarms generated during working hours were investigated. Optimized MA procedures were applied for 24 chemistry assays. During this evaluation, 303,871 MA values and 76 MA alarms were generated. Of all alarms, 54 (71%) were generated during office hours. Of these, 41 were further investigated; the causes were ion-selective electrode (ISE) failure (1), calibration failure not detected by QC due to improper QC settings (1), possible bias (a significant difference from the other analyzer) (10), non-human materials analyzed (2), extreme result(s) of a single patient (2), pre-analytical error (1), no cause identified (20), and no conclusion possible (4). MA was implemented in daily practice as a continuous QC instrument for 24 routine chemistry assays. In our setup, when an MA alarm required follow-up, a manageable number of MA alarms was generated, and these alarms proved valuable. For the management of MA alarms, several features in the MA management software would simplify the use of MA procedures.
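An illustrative simulation of the idea behind MA optimization: inject a bias into synthetic patient results and measure how many results are reported before the moving average crosses its control limit. The assay values, window, and limit below are assumptions, not the paper's settings:

```python
import random

def ma_detection_delay(mean, sd, window, limit_sd, bias, max_n=200000, seed=1):
    """Number of patient results reported before the moving average of the
    last `window` results crosses mean +/- limit_sd * sd."""
    rng = random.Random(seed)
    buf = []
    for count in range(1, max_n + 1):
        buf.append(rng.gauss(mean + bias, sd))
        if len(buf) > window:
            buf.pop(0)
        if len(buf) == window and abs(sum(buf) / window - mean) > limit_sd * sd:
            return count
    return None   # bias not detected within max_n results

# Sodium-like assay: mean 140 mmol/L, SD 3; inject a +3 mmol/L bias
print(ma_detection_delay(mean=140, sd=3, window=20, limit_sd=1.0, bias=3))
```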
NASA Technical Reports Server (NTRS)
Brenton, James C.; Barbre, Robert E., Jr.; Decker, Ryan K.; Orcutt, John M.
2018-01-01
The National Aeronautics and Space Administration's (NASA) Marshall Space Flight Center (MSFC) Natural Environments Branch (EV44) has provided atmospheric databases and analysis in support of space vehicle design and day-of-launch operations for NASA and commercial launch vehicle programs launching from the NASA Kennedy Space Center (KSC), co-located on the United States Air Force's Eastern Range (ER) at the Cape Canaveral Air Force Station. The ER complex is one of the most heavily instrumented sites in the United States, with over 31 towers measuring various atmospheric parameters on a continuous basis. An inherent challenge with large datasets is ensuring that erroneous data are removed from databases, and thus excluded from launch vehicle design analyses. EV44 has put forth great effort in developing quality control (QC) procedures for individual meteorological instruments; however, no standard QC procedures for all databases currently exist, resulting in QC databases that have inconsistencies in variables, methodologies, and periods of record. The goal of this activity is to use the previous efforts by EV44 to develop a standardized set of QC procedures from which to build meteorological databases from KSC and the ER, while maintaining open communication with end users from the launch community to develop ways to improve, adapt and grow the QC database. Details of the QC procedures will be described. As the rate of launches increases with additional launch vehicle programs, it is becoming more important that weather databases are continually updated and checked for data quality before use in launch vehicle design and certification analyses.
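Two checks of the kind such procedures typically combine can be sketched as follows (a hedged illustration; the thresholds are placeholders, not EV44's actual values):

```python
def range_check(values, lower, upper):
    """Flag values outside a fixed climatological range."""
    return [not (lower <= v <= upper) for v in values]

def step_check(values, max_step):
    """Flag implausible jumps between consecutive samples."""
    flags = [False]
    for prev, cur in zip(values, values[1:]):
        flags.append(abs(cur - prev) > max_step)
    return flags

temps_c = [24.1, 24.3, 24.2, 31.9, 24.4]          # 1-min tower temperatures
flags = [a or b for a, b in zip(range_check(temps_c, -5.0, 45.0),
                                step_check(temps_c, max_step=3.0))]
print(flags)   # [False, False, False, True, True]: the spike and its recovery
```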
Impacts of Intelligent Automated Quality Control on a Small Animal APD-Based Digital PET Scanner
NASA Astrophysics Data System (ADS)
Charest, Jonathan; Beaudoin, Jean-François; Bergeron, Mélanie; Cadorette, Jules; Arpin, Louis; Lecomte, Roger; Brunet, Charles-Antoine; Fontaine, Réjean
2016-10-01
Stable system performance is mandatory to guarantee the accuracy and reliability of biological results relying on small animal positron emission tomography (PET) imaging studies. This simple requirement sets the ground for imposing routine quality control (QC) procedures to keep PET scanners at a reliable optimal performance level. However, such procedures can become burdensome for scanner operators to implement, especially given the increasing number of data acquisition channels in newer-generation PET scanners. In systems using pixel detectors to achieve enhanced spatial resolution and contrast-to-noise ratio (CNR), the QC workload rapidly increases to unmanageable levels due to the number of independent channels involved. An artificial intelligence based QC system, referred to as Scanner Intelligent Diagnosis for Optimal Performance (SIDOP), was proposed to help reduce the QC workload by performing automatic channel fault detection and diagnosis. SIDOP consists of four high-level modules that employ machine learning methods to perform their tasks: Parameter Extraction, Channel Fault Detection, Fault Prioritization, and Fault Diagnosis. Ultimately, SIDOP submits a prioritized faulty channel list to the operator and proposes actions to correct the faults. To validate that SIDOP can perform QC procedures adequately, it was deployed on a LabPET™ scanner and multiple performance metrics were extracted. After multiple corrections of sub-optimal scanner settings, an 8.5% improvement in the CNR (with a 95% confidence interval (CI) of [7.6, 9.3]), a 17.0% (CI: [15.3, 18.7]) decrease in the uniformity percentage standard deviation, and a 6.8% gain in global sensitivity were observed. These results confirm that SIDOP can indeed assist in performing QC procedures and restore performance to optimal figures.
Fatemi, Ali; Taghizadeh, Somayeh; Yang, Claus Chunli; R Kanakamedala, Madhava; Morris, Bart; Vijayakumar, Srinivasan
2017-12-18
Purpose Magnetic resonance (MR) images are necessary for accurate contouring of intracranial targets, determination of gross target volume and evaluation of organs at risk during stereotactic radiosurgery (SRS) treatment planning procedures. Many centers use magnetic resonance imaging (MRI) simulators or regular diagnostic MRI machines for SRS treatment planning; while both types of machine require two stages of quality control (QC), both machine- and patient-specific, before use for SRS, no accepted guidelines for such QC currently exist. This article describes appropriate machine-specific QC procedures for SRS applications. Methods and materials We describe the adaptation of American College of Radiology (ACR)-recommended QC tests using an ACR MRI phantom for SRS treatment planning. In addition, commercial Quasar MRID3D and Quasar GRID3D phantoms were used to evaluate the effects of static magnetic field (B0) inhomogeneity, gradient nonlinearity, and a Leksell G frame (SRS frame) and its accessories on geometrical distortion in MR images. Results QC procedures found in-plane distortions in the X-direction (Maximum = 3.5 mm, Mean = 0.91 mm, Standard deviation = 0.67 mm, >2.5 mm (%) = 2) and the Y-direction (Maximum = 2.51 mm, Mean = 0.52 mm, Standard deviation = 0.39 mm, >2.5 mm (%) = 0), distortions in the Z-direction (Maximum = 13.1 mm, Mean = 2.38 mm, Standard deviation = 2.45 mm, >2.5 mm (%) = 34), and <1 mm distortion at a head-sized region of interest. MR images acquired using a Leksell G frame and localization devices showed a mean absolute deviation of 2.3 mm from isocenter. The results of modified ACR tests were all within recommended limits, and baseline measurements have been defined for regular weekly QC tests. Conclusions With appropriate QC procedures in place, it is possible to routinely obtain clinically useful MR images suitable for SRS treatment planning purposes. MRI examination for SRS planning can benefit from the improved localization and planning possible with the superior image quality and soft tissue contrast achieved under optimal conditions.
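The distortion statistics reported above (maximum, mean, SD, and percentage of points displaced beyond 2.5 mm) are simple to compute from a set of measured control-point displacements; this sketch uses synthetic numbers, not the study's data:

```python
import numpy as np

def distortion_stats(displacements_mm, threshold_mm=2.5):
    """Maximum, mean, SD, and % of points displaced beyond a threshold."""
    d = np.abs(np.asarray(displacements_mm, dtype=float))
    return {"max_mm": round(d.max(), 2),
            "mean_mm": round(d.mean(), 2),
            "sd_mm": round(d.std(ddof=1), 2),
            "pct_gt_threshold": round(100.0 * (d > threshold_mm).mean(), 1)}

rng = np.random.default_rng(0)
z_displacements = rng.gamma(shape=1.2, scale=2.0, size=500)   # synthetic
print(distortion_stats(z_displacements))
```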
40 CFR 136.7 - Quality assurance and quality control.
Code of Federal Regulations, 2014 CFR
2014-07-01
... quality control elements, where applicable, into the laboratory's documented standard operating procedure... quality control elements must be clearly documented in the written standard operating procedure for each... Methods contains QA/QC procedures in the Part 1000 section of the Standard Methods Compendium. The...
40 CFR 136.7 - Quality assurance and quality control.
Code of Federal Regulations, 2013 CFR
2013-07-01
... quality control elements, where applicable, into the laboratory's documented standard operating procedure... quality control elements must be clearly documented in the written standard operating procedure for each... Methods contains QA/QC procedures in the Part 1000 section of the Standard Methods Compendium. The...
40 CFR 136.7 - Quality assurance and quality control.
Code of Federal Regulations, 2012 CFR
2012-07-01
... quality control elements, where applicable, into the laboratory's documented standard operating procedure... quality control elements must be clearly documented in the written standard operating procedure for each... Methods contains QA/QC procedures in the Part 1000 section of the Standard Methods Compendium. The...
Development of QC Procedures for Ocean Data Obtained by National Research Projects of Korea
NASA Astrophysics Data System (ADS)
Kim, S. D.; Park, H. M.
2017-12-01
To establish a data management system for ocean data obtained by national research projects of the Ministry of Oceans and Fisheries of Korea, KIOST conducted standardization and development of QC procedures. After reviewing and analyzing the existing international and domestic ocean-data standards and QC procedures, draft standards and QC procedures were prepared. The proposed standards and QC procedures were reviewed and revised several times by experts in the field of oceanography and academic societies. A technical report was prepared on the standards for 25 data items and 12 QC procedures covering physical, chemical, biological, and geological data. The QC procedure for temperature and salinity data was set up by reference to the manuals published by GTSPP, ARGO, and IOOS QARTOD. It consists of 16 QC tests applicable to vertical profile data and time series data obtained in real-time mode and delayed mode. Three regional range tests to inspect annual, seasonal, and monthly variations were included in the procedure. Three programs were developed to calculate and provide the upper and lower limits of temperature and salinity at depths from 0 to 1550 m. TS data from the World Ocean Database, ARGO, GTSPP, and in-house data of KIOST were analysed statistically to calculate regional limits for the Northwest Pacific area. Based on this statistical analysis, the programs calculate regional ranges using the mean and standard deviation on three grid systems (3° grid, 1° grid, and 0.5° grid) and provide recommendations. The QC procedures for the 12 data items were set up during the 1st phase of the national program for data management (2012-2015) and are being applied to national research projects in practice during the 2nd phase (2016-2019). The QC procedures will be revised by reviewing the results of QC application when the 2nd phase of the data management program is completed.
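A hedged sketch of the regional range test described: per-grid-cell climatological limits as mean ± k standard deviations of historical data. The grid resolution, k, and sample data are assumptions for illustration:

```python
import numpy as np

def regional_limits(lats, lons, values, cell_deg=1.0, k=3.0, min_n=30):
    """Per-grid-cell (lower, upper) limits as mean +/- k SD of history."""
    cells = {}
    for lat, lon, v in zip(lats, lons, values):
        cells.setdefault((int(lat // cell_deg), int(lon // cell_deg)), []).append(v)
    return {key: (np.mean(vals) - k * np.std(vals), np.mean(vals) + k * np.std(vals))
            for key, vals in cells.items() if len(vals) >= min_n}

def regional_range_flag(lat, lon, value, limits, cell_deg=1.0):
    key = (int(lat // cell_deg), int(lon // cell_deg))
    if key not in limits:
        return "no_limits"
    lo, hi = limits[key]
    return "pass" if lo <= value <= hi else "fail"

hist = np.random.default_rng(1).normal(15.0, 2.0, 40)         # synthetic temps
limits = regional_limits([35.2] * 40, [128.1] * 40, hist)
print(regional_range_flag(35.3, 128.4, 29.0, limits))          # "fail"
```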
Quality Assurance and Control Considerations in Environmental Measurements and Monitoring
NASA Astrophysics Data System (ADS)
Sedlet, Jacob
1982-06-01
Quality assurance and quality control have become accepted as essential parts of all environmental surveillance, measurement, and monitoring programs, both nuclear and non-nuclear. The same principles and details apply to each; it is primarily the final measurement technique that differs. As the desire and need to measure smaller amounts of pollutants with greater accuracy have increased, it has been recognized that quality assurance and control programs are cost-effective in achieving the expected results. Quality assurance (QA) consists of all the actions necessary to provide confidence in the results. Quality control (QC) is a part of QA and consists of those actions and activities that permit control of the individual steps in the environmental program. The distinction between the two terms is not always clearly defined, but a sharp division is not necessary. The essential principle of QA and QC is a commitment to high-quality results. The essential components of a QA and QC program are a complete, written procedures manual for all parts of the environmental program, the use of standard or validated procedures, participation in applicable interlaboratory comparison or QA programs, replicate analysis and measurement, training of personnel, and a means of auditing or checking that the QA and QC programs are properly conducted. These components are discussed below in some detail.
Quality control for federal clean water act and safe drinking water act regulatory compliance.
Askew, Ed
2013-01-01
QC sample results are required in order to have confidence in the results from analytical tests. Some of the AOAC water methods include specific QC procedures, frequencies, and acceptance criteria. These are considered to be the minimum controls needed to perform the method successfully. Some regulatory programs, such as those in 40 CFR Part 136.7, require additional QC or have alternative acceptance limits. Essential QC measures include method calibration, reagent standardization, assessment of each analyst's capabilities, analysis of blind check samples, determination of the method's sensitivity (method detection level or quantification limit), and daily evaluation of bias, precision, and the presence of laboratory contamination or other analytical interference. The details of these procedures, their performance frequency, and expected ranges of results are set out in this manuscript. The specific regulatory requirements of 40 CFR Part 136.7 for the Clean Water Act, the laboratory certification requirements of 40 CFR Part 141 for the Safe Drinking Water Act, and the ISO 17025 accreditation requirements under The NELAC Institute are listed.
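The method detection limit mentioned here follows the standard single-laboratory computation (the 40 CFR Part 136 Appendix B style formula); the replicate values in this sketch are invented:

```python
import statistics
from scipy.stats import t

def mdl(replicates, confidence=0.99):
    """MDL = Student-t(n-1, 0.99) x SD of n replicate low-level spikes."""
    return t.ppf(confidence, df=len(replicates) - 1) * statistics.stdev(replicates)

spiked_blanks_ug_l = [0.52, 0.47, 0.55, 0.49, 0.51, 0.46, 0.53]   # n = 7
print(f"MDL = {mdl(spiked_blanks_ug_l):.3f} ug/L")
```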
Hesari, Nikou; Kıratlı Yılmazçoban, Nursel; Elzein, Mohamad; Alum, Absar; Abbaszadegan, Morteza
2017-01-03
Rapid bacterial detection using biosensors is a novel approach for microbiological testing applications. Validation of such methods is an obstacle in the adoption of new bio-sensing technologies for water testing. Therefore, establishing a quality assurance and quality control (QA/QC) plan is essential to demonstrate accuracy and reliability of the biosensor method for the detection of E. coli in drinking water samples. In this study, different reagents and assay conditions including temperatures, holding time, E. coli strains and concentrations, dissolving agents, salinity and pH effects, quality of substrates of various suppliers of 4-methylumbelliferyl glucuronide (MUG), and environmental water samples were included in the QA/QC plan and used in the assay optimization and documentation. Furthermore, the procedural QA/QC for the monitoring of drinking water samples was established to validate the performance of the biosensor platform for the detection of E. coli using a culture-based standard technique. Implementing the developed QA/QC plan, the same level of precision and accuracy was achieved using both the standard and the biosensor methods. The established procedural QA/QC for the biosensor will provide a reliable tool for a near real-time monitoring of E. coli in drinking water samples to both industry and regulatory authorities.
HANDBOOK: QUALITY ASSURANCE/QUALITY CONTROL (QA/QC) PROCEDURES FOR HAZARDOUS WASTE INCINERATION
Resource Conservation and Recovery Act regulations for hazardous waste incineration require trial burns by permit applicants. A Quality Assurance Project Plan (QAPjP) must accompany a trial burn plan with appropriate quality assurance/quality control procedures. Guidance on the prepa...
Root, Patsy; Hunt, Margo; Fjeld, Karla; Kundrat, Laurie
2014-01-01
Quality assurance (QA) and quality control (QC) data are required in order to have confidence in the results from analytical tests and the equipment used to produce those results. Some AOAC water methods include specific QA/QC procedures, frequencies, and acceptance criteria, but these are considered to be the minimum controls needed to perform a microbiological method successfully. Some regulatory programs, such as those at Code of Federal Regulations (CFR), Title 40, Part 136.7 for chemistry methods, require additional QA/QC measures beyond those listed in the method, which can also apply to microbiological methods. Essential QA/QC measures include sterility checks, reagent specificity and sensitivity checks, assessment of each analyst's capabilities, analysis of blind check samples, and evaluation of the presence of laboratory contamination and instrument calibration and checks. The details of these procedures, their performance frequency, and expected results are set out in this report as they apply to microbiological methods. The specific regulatory requirements of CFR Title 40 Part 136.7 for the Clean Water Act, the laboratory certification requirements of CFR Title 40 Part 141 for the Safe Drinking Water Act, and the International Organization for Standardization 17025 accreditation requirements under The NELAC Institute are also discussed.
Details on the verification test design, measurement test procedures, and Quality assurance/Quality Control (QA/QC) procedures can be found in the test plan titled Testing and Quality Assurance Plan, MIRATECH Corporation GECO 3100 Air/Fuel Ratio Controller (SRI 2001). It can be d...
Quality control in urodynamics and the role of software support in the QC procedure.
Hogan, S; Jarvis, P; Gammie, A; Abrams, P
2011-11-01
This article aims to identify quality control (QC) best practice, to review published QC audits in order to identify how closely good practice is followed, and to carry out a market survey of the software features that support QC offered by urodynamics machines available in the UK. All UK distributors of urodynamic systems were contacted and asked to provide information on the software features relating to data quality of the products they supply. The results of the market survey show that the features offered by manufacturers differ greatly. Automated features, which can be turned off in most cases, include: cough recognition, detrusor contraction detection, and high pressure alerts. There are currently no systems that assess data quality based on published guidelines. A literature review of current QC guidelines for urodynamics was carried out; QC audits were included in the literature review to see how closely guidelines were being followed. This review highlights the fact that basic QC is not being carried out effectively by urodynamicists. Based on the software features currently available and the results of the literature review there is both the need and capacity for a greater degree of automation in relation to urodynamic data quality and accuracy assessment. Some progress has been made in this area and certain manufacturers have already developed automated cough detection. Copyright © 2011 Wiley Periodicals, Inc.
Phase 2 Site Investigations Report. Volume 3 of 3: Appendices
1994-09-01
Phase II Site Investigations Report, Volume III of III: Appendices. Fort Devens Sudbury Training Annex, Massachusetts, September 1994. Contract No... laboratory quality control (QC) samples collected during field investigations at the Sudbury Training Annex of Fort Devens, Massachusetts. The QC... returned to its original condition. E & E performed this procedure for each monitoring well tested during the 1993 slug testing activities at Fort Devens.
Wei, Ling; Shi, Jianfeng; Afari, George; Bhattacharyya, Sibaprasad
2014-01-01
Panitumumab is a fully human monoclonal antibody approved for the treatment of epidermal growth factor receptor (EGFR) positive colorectal cancer. Recently, panitumumab has been radiolabeled with 89Zr and evaluated for its potential to be used as an immuno-positron emission tomography (PET) probe for EGFR-positive cancers. Interesting preclinical results published by several groups of researchers have prompted us to develop a robust procedure for producing clinical-grade 89Zr-panitumumab as an immuno-PET probe to evaluate EGFR-targeted therapy. In this process, clinical-grade panitumumab is bio-conjugated with desferrioxamine chelate and subsequently radiolabeled with 89Zr, resulting in high radiochemical yield (>70%, n=3) and purity (>98%, n=3). All quality control (QC) tests were performed according to United States Pharmacopeia specifications. QC tests showed that 89Zr-panitumumab met all specifications for human injection. Herein, we describe a step-by-step method for the facile synthesis and QC testing of 89Zr-panitumumab for medical use. The entire process of bioconjugation, radiolabeling, and all QC tests takes about 5 h. Because the synthesis is fully manual, two rapid, in-process QC tests have been introduced to make the procedure robust and error-free. PMID:24448743
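One routine QC number in such a procedure, decay-corrected radiochemical yield, is plain arithmetic with the 89Zr half-life (~78.4 h); the activities and timing in this sketch are invented:

```python
import math

T_HALF_89ZR_H = 78.4   # 89Zr half-life in hours

def decay_corrected_yield(start_mbq, product_mbq, elapsed_h):
    """Radiochemical yield (%), correcting the product activity back to
    the start of synthesis."""
    corrected = product_mbq * math.exp(math.log(2) * elapsed_h / T_HALF_89ZR_H)
    return 100.0 * corrected / start_mbq

# 100 MBq of 89Zr at the start; 68 MBq of purified conjugate 5 h later
print(f"yield = {decay_corrected_yield(100.0, 68.0, 5.0):.1f}%")   # ~71%
```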
NASA Technical Reports Server (NTRS)
Brenton, James C.; Barbre, Robert E.; Orcutt, John M.; Decker, Ryan K.
2018-01-01
The National Aeronautics and Space Administration's (NASA) Marshall Space Flight Center (MSFC) Natural Environments Branch (EV44) has provided atmospheric databases and analysis in support of space vehicle design and day-of-launch operations for NASA and commercial launch vehicle programs launching from the NASA Kennedy Space Center (KSC), co-located on the United States Air Force's Eastern Range (ER) at the Cape Canaveral Air Force Station. The ER is one of the most heavily instrumented sites in the United States, measuring various atmospheric parameters on a continuous basis. An inherent challenge with the large databases that EV44 receives from the ER is ensuring that erroneous data are removed from the databases, and thus excluded from launch vehicle design analyses. EV44 has put forth great effort in developing quality control (QC) procedures for individual meteorological instruments; however, no standard QC procedures for all databases currently exist, resulting in QC databases that have inconsistencies in variables, methodologies, and periods of record. The goal of this activity is to use the previous efforts by EV44 to develop a standardized set of QC procedures from which to build flags within the meteorological databases from KSC and the ER, while maintaining open communication with end users from the launch community to develop ways to improve, adapt and grow the QC database. Details of the QC checks are described. The flagged data points will be plotted in a graphical user interface (GUI) as part of a manual confirmation that the flagged data do indeed need to be removed from the archive. As the rate of launches increases with additional launch vehicle programs, more emphasis is being placed on continually updating and checking weather databases for data quality before use in launch vehicle design and certification analyses.
Evaluation of peak picking quality in LC-MS metabolomics data.
Brodsky, Leonid; Moussaieff, Arieh; Shahaf, Nir; Aharoni, Asaph; Rogachev, Ilana
2010-11-15
The output of LC-MS metabolomics experiments consists of mass-peak intensities identified through a peak-picking/alignment procedure. Besides imperfections in biological samples and instrumentation, data accuracy is highly dependent on the applied algorithms and their parameters. Consequently, quality control (QC) is essential for further data analysis. Here, we present a QC approach that is based on discrepancies between replicate samples. First, quantile normalization of per-sample log-signal distributions is applied to each group of biologically homogeneous samples. Next, the overall quality of each replicate group is characterized by the Z-transformed correlation coefficients between samples. This general QC allows a tuning of the procedure's parameters that minimizes the inter-replicate discrepancies in the generated output. Subsequently, an in-depth QC measure detects local neighborhoods on a template of aligned chromatograms that are enriched by divergences between intensity profiles of replicate samples. These neighborhoods are determined through a segmentation algorithm. The retention time (RT)-m/z positions of the neighborhoods with local divergences are indicative of incorrect alignment of chromatographic features, technical problems in the chromatograms, or a true biological discrepancy between replicates for particular metabolites. We expect this method to aid in the accurate analysis of metabolomics data and in the development of new peak-picking/alignment procedures.
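A hedged sketch of the two QC ingredients described, quantile normalization of per-sample log-signal distributions within a replicate group and Fisher z-transformed inter-replicate correlations as an overall quality score (simplified relative to the published procedure; data are synthetic):

```python
import numpy as np

def quantile_normalize(log_intensities):
    """Quantile-normalize columns (replicate samples) of a peaks x samples matrix."""
    x = np.asarray(log_intensities, dtype=float)
    ranks = np.argsort(np.argsort(x, axis=0), axis=0)
    mean_dist = np.sort(x, axis=0).mean(axis=1)
    return mean_dist[ranks]

def replicate_quality(log_intensities):
    """Mean Fisher z of pairwise inter-replicate correlations."""
    x = quantile_normalize(log_intensities)
    r = np.corrcoef(x, rowvar=False)
    iu = np.triu_indices_from(r, k=1)
    return np.arctanh(r[iu]).mean()

rng = np.random.default_rng(2)
base = rng.normal(10, 2, size=(500, 1))
reps = base + rng.normal(0, 0.3, size=(500, 4))   # four noisy replicates
print(f"mean Fisher z = {replicate_quality(reps):.2f}")
```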
Cho, Min-Chul; Kim, So Young; Jeong, Tae-Dong; Lee, Woochang; Chun, Sail; Min, Won-Ki
2014-11-01
Verification of a new reagent lot's suitability is necessary to ensure that results for patients' samples are consistent before and after reagent lot changes. A typical procedure is to measure the results of some patients' samples along with quality control (QC) materials. In this study, the results of patients' samples and QC materials across reagent lot changes were analysed, and an approach to QC target range adjustment upon reagent lot changes is proposed. Patients' sample and QC material results from 360 reagent lot change events involving 61 analytes and eight instrument platforms were analysed. The between-lot differences for the patients' samples (ΔP) and the QC materials (ΔQC) were tested by Mann-Whitney U tests. The size of the between-lot differences in the QC data was calculated as multiples of the standard deviation (SD). The ΔP and ΔQC values differed significantly in only 7.8% of the reagent lot change events. This frequency was not affected by the assay principle or the QC material source. One SD is proposed as the cutoff for maintaining the pre-existing target range after a reagent lot change. While non-commutable QC material results were infrequent in the present study, our data confirm that QC materials have limited usefulness when assessing new reagent lots. A 1 SD standard for establishing a new QC target range after a reagent lot change event is also proposed. © The Author(s) 2014.
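The two checks described can be sketched briefly: a Mann-Whitney U test on patient-sample results across lots, and the between-lot QC shift expressed in multiples of the assay SD against the proposed 1 SD cutoff. All numbers below are invented:

```python
import numpy as np
from scipy.stats import mannwhitneyu

old_lot_patients = np.array([4.1, 5.3, 3.8, 6.0, 4.7, 5.1, 4.4, 5.6])
new_lot_patients = np.array([4.2, 5.4, 3.9, 6.1, 4.8, 5.2, 4.5, 5.7])
u, p = mannwhitneyu(old_lot_patients, new_lot_patients)
print(f"patient-sample shift: p = {p:.2f}")

qc_mean_old, qc_mean_new, qc_sd = 5.00, 5.06, 0.10
shift_sd = abs(qc_mean_new - qc_mean_old) / qc_sd
# Proposed rule: keep the existing target range if the shift is < 1 SD
print(f"QC shift = {shift_sd:.1f} SD ->",
      "keep target range" if shift_sd < 1.0 else "re-establish target")
```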
Chow, Judith C; Watson, John G; Robles, Jerome; Wang, Xiaoliang; Chen, L-W Antony; Trimble, Dana L; Kohl, Steven D; Tropp, Richard J; Fung, Kochy K
2011-12-01
Accurate, precise, and valid organic and elemental carbon (OC and EC, respectively) measurements require more effort than the routine analysis of ambient aerosol and source samples. This paper documents the quality assurance (QA) and quality control (QC) procedures that should be implemented to ensure consistency of OC and EC measurements. Prior to field sampling, the appropriate filter substrate must be selected and tested for sampling effectiveness. Unexposed filters are pre-fired to remove contaminants and acceptance tested. After sampling, filters must be stored in the laboratory in clean, labeled containers under refrigeration (<4 °C) to minimize loss of semi-volatile OC. QA activities include participation in laboratory accreditation programs, external system audits, and interlaboratory comparisons. For thermal/optical carbon analyses, periodic QC tests include calibration of the flame ionization detector with different types of carbon standards, thermogram inspection, replicate analyses, quantification of trace oxygen concentrations (<100 ppmv) in the helium atmosphere, and calibration of the sample temperature sensor. These established QA/QC procedures are applicable to aerosol sampling and analysis for carbon and other chemical components.
Liu, Zhao; Zheng, Chaorong; Wu, Yue
2017-09-01
Wind profilers have been widely adopted to observe wind field information in the atmosphere for different purposes. However, the accuracy of these observations is limited by various noises and disturbances and hence needs further improvement. In this paper, data measured under strong wind conditions using a 1290-MHz boundary layer profiler (BLP) are quality controlled via a composite quality control (QC) procedure proposed by the authors. Then, through comparison with data measured by radiosonde flights (balloon observations), the critical thresholds in the composite QC procedure, including the consensus average threshold T1 and the vertical shear threshold T3, are systematically discussed, and the performance of the BLP operated under precipitation is evaluated. It is found that, to ensure high accuracy and a high data collectable rate, the optimal range of subsets is determined to be 4 m/s. Although the number of data points rejected by the combined algorithm of vertical shear examination and the small median test is quite limited, the algorithm proves quite useful for recognizing outliers with a large discrepancy. The optimal wind shear threshold T3 is recommended as 5 m/s per 100 m. During patchy precipitation, the quality of data measured by the four oblique beams (using the DBS measuring technique) can still be ensured. After the BLP data are quality controlled by the composite QC procedure, the output shows good agreement with the balloon observations.
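A hedged sketch of the vertical-shear screen with the recommended threshold T3 = 5 m/s per 100 m; the heights and speeds are synthetic, and the paper's full composite procedure includes further tests:

```python
def shear_flags(heights_m, speeds_ms, t3=5.0, layer_m=100.0):
    """Flag a level when the shear to the level below exceeds T3."""
    flags = [False]
    for i in range(1, len(speeds_ms)):
        dz = heights_m[i] - heights_m[i - 1]
        shear_per_layer = abs(speeds_ms[i] - speeds_ms[i - 1]) / dz * layer_m
        flags.append(shear_per_layer > t3)
    return flags

heights = [100, 200, 300, 400, 500]
speeds = [8.0, 9.5, 17.2, 10.5, 11.0]
# The outlier at 17.2 m/s triggers flags on both adjacent differences
print(shear_flags(heights, speeds))   # [False, False, True, True, False]
```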
23 CFR 650.313 - Inspection procedures.
Code of Federal Regulations, 2010 CFR
2010-04-01
...) Quality control and quality assurance. Assure systematic quality control (QC) and quality assurance (QA... periodic field review of inspection teams, periodic bridge inspection refresher training for program managers and team leaders, and independent review of inspection reports and computations. (h) Follow-up on...
DOT National Transportation Integrated Search
2013-11-01
Current roadway quality control and quality acceptance (QC/QA) procedures for the Louisiana Department of Transportation and Development (LADOTD) include coring for thickness, density, and air voids in hot mix asphalt (HMA) pavements and thickness ...
Revision 2 of the Enbridge Quality Assurance Project Plan
This Quality Assurance Project Plan (QAPP) presents Revision 2 of the organization, objectives, planned activities, and specific quality assurance/quality control (QA/QC) procedures associated with the Enbridge Marshall Pipeline Release Project.
Lean Six Sigma in Health Care: Improving Utilization and Reducing Waste.
Almorsy, Lamia; Khalifa, Mohamed
2016-01-01
Healthcare costs have been increasing worldwide, mainly due to overutilization of resources. The savings potentially achievable from systematic, comprehensive, and cooperative reduction in waste are far higher than from more direct and blunter cuts in care and coverage. At King Faisal Specialist Hospital and Research Center, inappropriate use and overutilization of the glucose test strips used for whole blood glucose determination with glucometers was observed. The hospital implemented a project to improve their utilization. Using the Six Sigma DMAIC approach (Define, Measure, Analyze, Improve and Control), an efficient practice was put in place, including updating the related internal policies and procedures and properly implementing an effective users' training and competency check-off program. This resulted in decreasing unnecessary Quality Control (QC) runs from 13% to 4%, decreasing failed QC runs from 14% to 7%, and lowering the QC-to-patient-testing ratio from 24/76 to 19/81.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Günther, Markus, E-mail: markus.guenther@tu-berlin.de; Geißler, Gesa; Köppel, Johann
As there is no one-and-only concept on how to precisely define and establish quality control (QC) or quality assurance (QA) in the making of environmental assessments (EA), this paper presents selected features of international approaches that address quality in EA systems in the USA, the Netherlands, Canada, and the United Kingdom. Based on explanative case studies, we highlight the embedding of specific quality control features within the EA systems, the objectives and processes, and relevant transparency challenges. Such features of QC/QA approaches can be considered in cases where substantial quality control and assurance efforts are still missing. Yet further research needs to be conducted on the efficacy of these approaches, which remains beyond the scope of this study. - Highlights: • We present four tools for quality control and assurance from different EA systems. • Approaches vary in institutional setting, objectives, procedures, and transparency. • Highlighted features might provide guidance in cases where QC/QA is still lacking.
Assessment of in-situ test technology for construction control of base courses and embankments.
DOT National Transportation Integrated Search
2004-05-01
With the coming move from an empirical to mechanistic-empirical pavement design, it is essential to improve the quality control/quality assurance (QC/QA) procedures of compacted materials from a density-based criterion to a stiffness/strength-based c...
DOT National Transportation Integrated Search
2010-06-01
This manual provides information and recommended procedures to be utilized by an agency's Weigh-in-Motion (WIM) Office Data Analyst to perform validation and quality control (QC) checks of WIM traffic data. This manual focuses on data generated by ...
Analysis of QA procedures at the Oregon Department of Transportation.
DOT National Transportation Integrated Search
2010-06-01
This research explored the Oregon Department of Transportation (ODOT) practice of Independent Assurance (IA), for validation of the contractor's test methods, and Verification, for validation of the contractor's Quality Control (QC) data. The...
The Quality System Implementation Plan (QSIP) describes the quality assurance and quality control procedures developed for the CTEPP study. It provides the QA/QC procedures used in recruitment of subjects, sample field collection, sample extraction and analysis, data storage, and...
DOT National Transportation Integrated Search
2009-07-01
Current roadway quality control and quality acceptance (QC/QA) procedures for Louisiana include coring for thickness, density, and air void checks in hot mix asphalt (HMA) pavements and thickness and compressive strength for Portland cement con...
Mapp, Latisha; Klonicki, Patricia; Takundwa, Prisca; Hill, Vincent R; Schneeberger, Chandra; Knee, Jackie; Raynor, Malik; Hwang, Nina; Chambers, Yildiz; Miller, Kenneth; Pope, Misty
2015-11-01
The U.S. Environmental Protection Agency's (EPA) Water Laboratory Alliance (WLA) currently uses ultrafiltration (UF) for concentration of biosafety level 3 (BSL-3) agents from large volumes (up to 100 L) of drinking water prior to analysis. Most UF procedures require comprehensive training and practice to achieve and maintain proficiency. As a result, there was a critical need to develop quality control (QC) criteria. Because select agents are difficult to work with and pose a significant safety hazard, QC criteria were developed using surrogates, including Enterococcus faecalis and Bacillus atrophaeus. This article presents the results from the QC criteria development study and results from a subsequent demonstration exercise in which E. faecalis was used to evaluate proficiency using UF to concentrate large volume drinking water samples. Based on preliminary testing EPA Method 1600 and Standard Methods 9218, for E. faecalis and B. atrophaeus respectively, were selected for use during the QC criteria development study. The QC criteria established for Method 1600 were used to assess laboratory performance during the demonstration exercise. Based on the results of the QC criteria study E. faecalis and B. atrophaeus can be used effectively to demonstrate and maintain proficiency using ultrafiltration. Published by Elsevier B.V.
Data Validation & Laboratory Quality Assurance for Region 9
In all hazardous site investigations it is essential to know the quality of the data used for decision-making purposes. Validation of data requires that appropriate quality assurance and quality control (QA/QC) procedures be followed.
40 CFR 98.94 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 40 Protection of Environment 21 2014-07-01 2014-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Electronics Manufacturing § 98.94 Monitoring and QA/QC...-specific heel factors for each container type for each gas used, according to the procedures in paragraphs...
DOT National Transportation Integrated Search
2008-04-01
The objective of this study was to develop resilient modulus prediction models for possible application in the quality control/quality assurance (QC/QA) procedures during and after the construction of pavement layers. Field and laboratory testing pro...
DOT National Transportation Integrated Search
2011-06-01
The main objective of this study is to investigate the use of the semi-circular bend (SCB) : test as a quality assurance/quality control (QA/QC) measure for field construction. : Comparison of fracture properties from the SCB test and fatigue beam te...
Implementation of GPS controlled highway construction equipment phase II.
DOT National Transportation Integrated Search
2008-01-01
"During 2006, WisDOT and the Construction Materials and Support Center at UW-Madison worked together to develop : a specification and QC/QA procedures for GPS machine guidance on highway construction grading operations. These : specifications and pro...
Implementation of GPS controlled highway construction equipment, phase III.
DOT National Transportation Integrated Search
2009-02-01
Beginning in 2006, WisDOT and the Construction Material and Support Center (CMSC) at UW-Madison worked : together to develop the specifications and the QA/QC procedures for GPS machine guidance on highway grading : projects. These specifications and ...
Effects of Data Quality on the Characterization of Aerosol Properties from Multiple Sensors
NASA Technical Reports Server (NTRS)
Petrenko, Maksym; Ichoku, Charles; Leptoukh, Gregory
2011-01-01
Cross-comparison of aerosol properties between ground-based and spaceborne measurements is an important validation technique that helps to investigate the uncertainties of aerosol products acquired using spaceborne sensors. However, it has been shown that even minor differences in the cross-characterization procedure may significantly impact the results of such validation. Of particular consideration is the quality assurance/quality control (QA/QC) information - auxiliary data indicating a "confidence" level (e.g., Bad, Fair, Good, Excellent, etc.) conferred by the retrieval algorithms on the produced data. Depending on the treatment of available QA/QC information, a cross-characterization procedure has the potential of filtering out invalid data points, such as uncertain or erroneous retrievals, which tend to reduce the credibility of such comparisons. However, under certain circumstances, even high QA/QC values may not fully guarantee the quality of the data. For example, retrievals in proximity of a cloud might be particularly perplexing for an aerosol retrieval algorithm, resulting in invalid data that, nonetheless, could be assigned a high QA/QC confidence. In this presentation, we will study the effects of several QA/QC parameters on cross-characterization of aerosol properties between the data acquired by multiple spaceborne sensors. We will utilize the Multi-sensor Aerosol Products Sampling System (MAPSS), which provides a consistent platform for multi-sensor comparison, including collocation with measurements acquired by the ground-based Aerosol Robotic Network (AERONET). The multi-sensor spaceborne data analyzed include those acquired by the Terra-MODIS, Aqua-MODIS, Terra-MISR, Aura-OMI, Parasol-POLDER, and Calipso-CALIOP satellite instruments.
An introduction to statistical process control in research proteomics.
Bramwell, David
2013-12-16
Statistical process control is a well-established and respected method which provides a general-purpose and consistent framework for monitoring and improving the quality of a process. It is routinely used in many industries where the quality of final products is critical and is often required in clinical diagnostic laboratories [1,2]. To date, the methodology has been little utilised in research proteomics. It has been shown to be capable of delivering quantitative QC procedures for qualitative clinical assays [3], making it an ideal methodology to apply to this area of biological research. This article introduces statistical process control as an objective strategy for quality control and shows how it could be used to benefit proteomics researchers and enhance the quality of the results they generate. We demonstrate that rules which provide basic quality control are easy to derive and implement and could have a major impact on data quality for many studies. Statistical process control is a powerful tool for investigating and improving proteomics research work-flows. The process of characterising measurement systems and defining control rules forces the exploration of key questions that can lead to significant improvements in performance. This work asserts that QC is essential to proteomics discovery experiments. Every experimenter must know the current capabilities of their measurement system and have an objective means for tracking and ensuring that performance. Proteomic analysis work-flows are complicated and multi-variate. QC is critical for clinical chemistry measurements and huge strides have been made in ensuring the quality and validity of results in clinical biochemistry labs. This work introduces some of these QC concepts and works to bridge their use from single-analyte QC to applications in multi-analyte systems. This article is part of a Special Issue entitled: Standardization and Quality Control in Proteomics. Copyright © 2013 The Author. Published by Elsevier B.V. All rights reserved.
Quality Control (QC) System Development for the Pell Grant Program: A Conceptual Framework.
ERIC Educational Resources Information Center
Advanced Technology, Inc., Reston, VA.
The objectives of the Pell Grant quality control (QC) system and the general definition of QC are considered. Attention is also directed to: the objectives of the Stage II Pell Grant QC system design and testing project, the approach used to develop the QC system, and the interface of the QC system and the Pell Grant delivery system. The…
Spatial Data Quality Control Procedure applied to the Okavango Basin Information System
NASA Astrophysics Data System (ADS)
Butchart-Kuhlmann, Daniel
2014-05-01
Spatial data is a powerful form of information, capable of providing insights of great interest and tremendous use to a variety of users. However, much like other data representing the 'real world', precision and accuracy must be high for the results of data analysis to be deemed reliable and thus applicable to real-world projects and undertakings. The spatial data quality control (QC) procedure presented here was developed as the topic of a Master's thesis, within the sphere of, and using data from, the Okavango Basin Information System (OBIS), itself a part of The Future Okavango (TFO) project. The aim of the QC procedure was to form the basis of a method through which to determine the quality of spatial data relevant for application to hydrological, solute, and erosion transport modelling using the Jena Adaptable Modelling System (JAMS). As such, the quality of all data present in OBIS classified under the topics of elevation, geoscientific information, or inland waters was evaluated. Now that the initial data quality has been evaluated, efforts are underway to correct the errors found, thus improving the quality of the dataset.
CHALLENGES IN SETTING UP QUALITY CONTROL IN DIAGNOSTIC RADIOLOGY FACILITIES IN NIGERIA.
Inyang, S O; Egbe, N O; Ekpo, E
2015-01-01
The Nigerian Nuclear Regulatory Authority (NNRA) was established to regulate and control the use of radioactive and radiation-emitting sources in Nigeria. Quality control (QC) of diagnostic radiology equipment forms part of the fundamental requirements for the authorization of diagnostic radiology facilities in the country. Some quality control tests (output, exposure linearity and reproducibility) were measured on the x-ray machines in the facilities that took part in the study. A questionnaire was developed to evaluate the frequencies at which QC tests were conducted in the facilities and the challenges in setting up QC. Results show great variation in the values of the QC parameters measured. Inadequate cooperation by facilities' management, lack of QC equipment and insufficient staff form the major challenges in setting up QC in the facilities under study. The responses on the frequencies at which QC tests should be conducted did not correspond to the recommended standards, indicating that personnel were not familiar with QC implementation and may require further training on QC.
Implementation of GPS Machine Controlled Grading - Phase III (2008) and Technical Training
DOT National Transportation Integrated Search
2009-02-01
Beginning in 2006, WisDOT and the Construction Material and Support Center (CMSC) at UW-Madison worked together to develop the specifications and the QA/QC procedures for GPS machine guidance on highway grading projects. These specifications and proc...
40 CFR 98.252 - GHGs to report.
Code of Federal Regulations, 2011 CFR
2011-07-01
... follow the calculation methodologies from § 98.253(f) and the monitoring and QA/QC methods, missing data..., monitoring and QA/QC methods, missing data procedures, reporting requirements, and recordkeeping requirements...
40 CFR 98.252 - GHGs to report.
Code of Federal Regulations, 2013 CFR
2013-07-01
... follow the calculation methodologies from § 98.253(f) and the monitoring and QA/QC methods, missing data..., monitoring and QA/QC methods, missing data procedures, reporting requirements, and recordkeeping requirements...
40 CFR 98.252 - GHGs to report.
Code of Federal Regulations, 2012 CFR
2012-07-01
... follow the calculation methodologies from § 98.253(f) and the monitoring and QA/QC methods, missing data..., monitoring and QA/QC methods, missing data procedures, reporting requirements, and recordkeeping requirements...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gillespie, B.M.; Stromatt, R.W.; Ross, G.A.
This data package contains the results obtained by Pacific Northwest Laboratory (PNL) staff in the characterization of samples for the 101-SY Hydrogen Safety Project. The samples were submitted for analysis by Westinghouse Hanford Company (WHC) under the Technical Project Plan (TPP) 17667 and the Quality Assurance Plan MCS-027. They came from a core taken during Window "C" after the May 1991 gas release event. The analytical procedures required for analysis were defined in the Test Instructions (TI) prepared by the PNL 101-SY Analytical Chemistry Laboratory (ACL) Project Management Office in accordance with the TPP and the QA Plan. The requested analysis for these samples was volatile organic analysis. The quality control (QC) requirements for each sample are defined in the Test Instructions for each sample. The QC requirements outlined in the procedures and requested in the WHC statement of work were followed.
Design, implementation, and quality control in the Pathways American-Indian multicenter trial
Stone, Elaine J.; Norman, James E.; Davis, Sally M.; Stewart, Dawn; Clay, Theresa E.; Caballero, Ben; Lohman, Timothy G.; Murray, David M.
2016-01-01
Background: Pathways was the first multicenter American-Indian school-based study to test the effectiveness of an obesity prevention program promoting healthy eating and physical activity. Methods: Pathways employed a nested cohort design in which 41 schools were randomized to intervention or control conditions and students within these schools were followed as a cohort (1,704 third graders at baseline). The study's primary endpoint was percent body fat. Secondary endpoints were levels of fat in school lunches; time spent in physical activity; and knowledge, attitudes, and behaviors regarding diet and exercise. Quality control (QC) included the design of data management systems which provided standardization and quality assurance of data collection and processing. Data QC procedures at study centers included manuals of operation, training and certification, and monitoring of performance. Process evaluation was conducted to monitor dose and fidelity of the interventions. Registration and tracking systems were used for students and schools. Results: No difference in mean percent body fat at fifth grade was found between the intervention and control schools. Percent of calories from fat and saturated fat in school lunches was significantly reduced in the intervention schools, as was total energy intake from 24-hour recalls. Significant increases in self-reported physical activity levels and knowledge of healthy behaviors were found for the intervention school students. Conclusions: The Pathways study results provide evidence demonstrating the role schools can play in public health promotion. Its study design and QC systems and procedures provide useful models for other similar school-based multi- or single-site studies. PMID:14636805
Valid internal standard technique for arson detection based on gas chromatography-mass spectrometry.
Salgueiro, Pedro A S; Borges, Carlos M F; Bettencourt da Silva, Ricardo J N
2012-09-28
The most popular procedures for the detection of residues of accelerants in fire debris are the ones published by the American Society for Testing and Materials (ASTM E1412-07 and E1618-10). The most critical stages of these tests are the conservation of fire debris from the sampling to the laboratory, the extraction of residues of accelerants from the debris to the activated charcoal strips (ACS) and from those to the final solvent, as well as the analysis of the sample extract by gas chromatography-mass spectrometry (GC-MS) and the interpretation of the instrumental signal. This work proposes a strategy for checking the quality of the sample conservation, the transfer of accelerant residues to the final solvent, and the GC-MS analysis, using internal standard additions. Internal standards are used ranging from a highly volatile compound, for checking debris conservation, to a low-volatility compound, for checking GC-MS repeatability. The developed quality control (QC) parameters are not affected by GC-MS sensitivity variation and, specifically, the GC-MS performance control is not affected by ACS adsorption saturation that may mask test performance deviations. The proposed QC procedure proved to be adequate to check GC-MS repeatability, ACS extraction and sample conservation since: (1) standard additions are affected by negligible uncertainty and (2) the observed dispersion of QC parameters is fit for its intended use. Copyright © 2012 Elsevier B.V. All rights reserved.
7 CFR 283.15 - Procedure for hearing.
Code of Federal Regulations, 2010 CFR
2010-01-01
... evidence, the QC claim against the State agency for a QC error rate in excess of the tolerance level. The... admissible in evidence subject to such objections as to relevancy, materiality or competency of the testimony...
DOT National Transportation Integrated Search
2015-01-01
Acceptance of earthwork construction by the Florida Department of Transportation (FDOT) : requires in-place testing conducted with a nuclear density gauge (NDG) to determine : dry density, which must obtain a required percent compaction based upon a ...
Operational CryoSat Product Quality Assessment
NASA Astrophysics Data System (ADS)
Mannan, Rubinder; Webb, Erica; Hall, Amanda; Bouzinac, Catherine
2013-12-01
The performance and quality of the CryoSat data products are routinely assessed by the Instrument Data quality Evaluation and Analysis Service (IDEAS). This information is then conveyed to the scientific and user community in order to allow them to utilise CryoSat data with confidence. This paper presents details of the Quality Control (QC) activities performed for CryoSat products under the IDEAS contract. Details of the different QC procedures and tools deployed by IDEAS to assess the quality of operational data are presented. The latest updates to the Instrument Processing Facility (IPF) for the Fast Delivery Marine (FDM) products and the future update to Baseline-C are discussed.
ChronQC: a quality control monitoring system for clinical next generation sequencing.
Tawari, Nilesh R; Seow, Justine Jia Wen; Perumal, Dharuman; Ow, Jack L; Ang, Shimin; Devasia, Arun George; Ng, Pauline C
2018-05-15
ChronQC is a quality control (QC) tracking system for clinical implementation of next-generation sequencing (NGS). ChronQC generates time series plots for various QC metrics to allow comparison of current runs to historical runs. ChronQC has multiple features for tracking QC data including Westgard rules for clinical validity, laboratory-defined thresholds and historical observations within a specified time period. Users can record their notes and corrective actions directly onto the plots for long-term recordkeeping. ChronQC facilitates regular monitoring of clinical NGS to enable adherence to high quality clinical standards. ChronQC is freely available on GitHub (https://github.com/nilesh-tawari/ChronQC), Docker (https://hub.docker.com/r/nileshtawari/chronqc/) and the Python Package Index. ChronQC is implemented in Python and runs on all common operating systems (Windows, Linux and Mac OS X). tawari.nilesh@gmail.com or pauline.c.ng@gmail.com. Supplementary data are available at Bioinformatics online.
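As a concrete illustration of the kind of run-versus-history tracking described above, the sketch below flags a current run whose QC metric drifts outside historical limits. It is a minimal, hypothetical Python example and does not reproduce ChronQC's actual API; the function name, metric, and threshold are assumptions.

```python
# Illustrative sketch only -- not ChronQC's API. Flags a current run whose QC
# metric falls outside the mean +/- k*SD band of historical runs.
from statistics import mean, stdev

def check_metric(history, current, k=3.0):
    """Return (historical mean, SD, True if `current` is out of control)."""
    mu, sd = mean(history), stdev(history)
    return mu, sd, abs(current - mu) > k * sd

# Hypothetical per-run mean coverage depths from previously validated runs.
historical_coverage = [102.4, 98.7, 101.1, 99.5, 100.8, 97.9, 103.2]
mu, sd, flagged = check_metric(historical_coverage, current=88.0)
print(f"mean={mu:.1f}, sd={sd:.1f}, out_of_control={flagged}")
```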
Thieme, Frank; Marillonnet, Sylvestre
2014-01-01
Identification of unknown sequences that flank known sequences of interest requires PCR amplification of DNA fragments that contain the junction between the known and unknown flanking sequences. Since amplified products often contain a mixture of specific and nonspecific products, the quick and clean (QC) cloning procedure was developed to clone specific products only. QC cloning is a ligation-independent cloning procedure that relies on the exonuclease activity of T4 DNA polymerase to generate single-stranded extensions at the ends of the vector and insert. A specific feature of QC cloning is the use of vectors that contain a sequence called catching sequence that allows cloning specific products only. QC cloning is performed by a one-pot incubation of insert and vector in the presence of T4 DNA polymerase at room temperature for 10 min followed by direct transformation of the incubation mix in chemo-competent Escherichia coli cells.
Real Time Quality Control Methods for Cued EMI Data Collection
2016-03-14
This project evaluated the effectiveness of in-field quality control (QC) procedures during cued electromagnetic induction (EMI) data collection.
This Multi-Site QAPP presents the organization, data quality objectives (DQOs), a set of anticipated activities, sample analysis, data handling, and specific Quality Assurance/Quality Control (QA/QC) procedures associated with studies done in EPA Region 5.
DOT National Transportation Integrated Search
2015-01-01
One of the objectives of this study was to evaluate soil testing equipment based on its capability of measuring in-place stiffness or modulus values. : As design criteria transition from empirical to mechanistic-empirical, soil test methods and equip...
Diffusion imaging quality control via entropy of principal direction distribution.
Farzinfar, Mahshid; Oguz, Ipek; Smith, Rachel G; Verde, Audrey R; Dietrich, Cheryl; Gupta, Aditya; Escolar, Maria L; Piven, Joseph; Pujol, Sonia; Vachet, Clement; Gouttard, Sylvain; Gerig, Guido; Dager, Stephen; McKinstry, Robert C; Paterson, Sarah; Evans, Alan C; Styner, Martin A
2013-11-15
Diffusion MR imaging has received increasing attention in the neuroimaging community, as it yields new insights into the microstructural organization of white matter that are not available with conventional MRI techniques. While the technology has enormous potential, diffusion MRI suffers from a unique and complex set of image quality problems, limiting the sensitivity of studies and reducing the accuracy of findings. Furthermore, the acquisition time for diffusion MRI is longer than conventional MRI due to the need for multiple acquisitions to obtain directionally encoded Diffusion Weighted Images (DWI). This leads to increased motion artifacts, reduced signal-to-noise ratio (SNR), and increased proneness to a wide variety of artifacts, including eddy-current and motion artifacts, "venetian blind" artifacts, as well as slice-wise and gradient-wise inconsistencies. Such artifacts mandate stringent Quality Control (QC) schemes in the processing of diffusion MRI data. Most existing QC procedures are conducted in the DWI domain and/or on a voxel level, but our own experiments show that these methods often do not fully detect and eliminate certain types of artifacts, often only visible when investigating groups of DWIs or a derived diffusion model, such as the most-employed diffusion tensor imaging (DTI). Here, we propose a novel regional QC measure in the DTI domain that employs the entropy of the regional distribution of the principal directions (PD). The PD entropy quantifies the scattering and spread of the principal diffusion directions and is invariant to the patient's position in the scanner. A high entropy value indicates that the PDs are distributed relatively uniformly, while a low entropy value indicates the presence of clusters in the PD distribution. The novel QC measure is intended to complement the existing set of QC procedures by detecting and correcting residual artifacts. Such residual artifacts cause directional bias in the measured PD and are here called dominant direction artifacts. Experiments show that our automatic method can reliably detect and potentially correct such artifacts, especially the ones caused by vibrations of the scanner table during the scan. The results further indicate the usefulness of this method for general quality assessment in DTI studies. Copyright © 2013 Elsevier Inc. All rights reserved.
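To make the entropy measure concrete, here is a simplified sketch that estimates the Shannon entropy of a set of principal directions by binning them on the sphere. The coarse latitude/longitude binning and the parameters are assumptions made for brevity; the published method uses a regional formulation in the DTI domain.

```python
# Simplified sketch of the PD-entropy idea: bin unit vectors (principal
# diffusion directions) on the sphere and compute the histogram's Shannon
# entropy. Low entropy -> clustered PDs (possible dominant-direction artifact).
import numpy as np

def pd_entropy(directions, n_theta=8, n_phi=16):
    """directions: (N, 3) array of (roughly) unit vectors."""
    d = directions / np.linalg.norm(directions, axis=1, keepdims=True)
    theta = np.arccos(np.clip(d[:, 2], -1.0, 1.0))          # polar angle [0, pi]
    phi = np.mod(np.arctan2(d[:, 1], d[:, 0]), 2 * np.pi)   # azimuth [0, 2*pi)
    hist, _, _ = np.histogram2d(theta, phi, bins=[n_theta, n_phi],
                                range=[[0, np.pi], [0, 2 * np.pi]])
    p = hist.ravel() / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log(p)).sum())

rng = np.random.default_rng(0)
uniform_dirs = rng.normal(size=(5000, 3))   # roughly uniform on the sphere
print(pd_entropy(uniform_dirs))             # high entropy expected
```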
Levey-Jennings Analysis Uncovers Unsuspected Causes of Immunohistochemistry Stain Variability.
Vani, Kodela; Sompuram, Seshi R; Naber, Stephen P; Goldsmith, Jeffrey D; Fulton, Regan; Bogen, Steven A
Almost all clinical laboratory tests use objective, quantitative measures of quality control (QC), incorporating Levey-Jennings analysis and Westgard rules. Clinical immunohistochemistry (IHC) testing, in contrast, relies on subjective, qualitative QC review. The consequences of using Levey-Jennings analysis for QC assessment in clinical IHC testing are not known. To investigate this question, we conducted a 1- to 2-month pilot test wherein the QC for either human epidermal growth factor receptor 2 (HER-2) or progesterone receptor (PR) in 3 clinical IHC laboratories was quantified and analyzed with Levey-Jennings graphs. Moreover, conventional tissue controls were supplemented with a new QC comprised of HER-2 or PR peptide antigens coupled onto 8 μm glass beads. At institution 1, this more stringent analysis identified a decrease in the HER-2 tissue control that had escaped notice by subjective evaluation. The decrement was due to heterogeneity in the tissue control itself. At institution 2, we identified a 1-day sudden drop in the PR tissue control, also undetected by subjective evaluation, due to counterstain variability. At institution 3, a QC shift was identified, but only with 1 of 2 controls mounted on each slide. The QC shift was due to use of the instrument's selective reagent drop zones dispense feature. None of these events affected patient diagnoses. These case examples illustrate that subjective QC evaluation of tissue controls can detect gross assay failure but not subtle changes. The fact that QC issues arose from each site, and in only a pilot study, suggests that immunohistochemical stain variability may be an underappreciated problem.
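The following minimal sketch shows a Levey-Jennings-style evaluation of the kind described above: each quantified control measurement is converted to a z-score against a fixed target mean and SD and flagged against common Westgard thresholds. The target values and measurements below are invented for illustration.

```python
# Minimal Levey-Jennings sketch for a quantified stain control. The target
# mean/SD and daily values are hypothetical; thresholds follow common usage.
target_mean, target_sd = 100.0, 5.0   # hypothetical stain-intensity target
daily_values = [101.2, 97.8, 104.9, 111.3, 99.0, 84.1]

for day, value in enumerate(daily_values, start=1):
    z = (value - target_mean) / target_sd
    if abs(z) > 3:
        status = "reject (1-3s)"
    elif abs(z) > 2:
        status = "warning (1-2s)"
    else:
        status = "in control"
    print(f"day {day}: value={value:.1f}, z={z:+.2f}, {status}")
```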
2015-05-01
Cost-Effective, Ultra-Sensitive Groundwater Monitoring for Site Remediation and Management: Standard Operating Procedures (guidance document)
Halden, R.U.; Roll, I.B.
An accelerated solvent extraction (ASE) device was evaluated as a semi-automated means for extracting arsenicals from quality control (QC) samples and DORM-2 [standard reference material (SRM)]. Unlike conventional extraction procedures, the ASE requires that the sample be dispe...
ERIC Educational Resources Information Center
Waagen, Christopher L.
William Ouchi's Theory Z, a theory that focuses on the identification of both management and labor with the company's goals, emphasizes communication structures and styles. Ringi is a Japanese procedure for decision making in which all levels of management participate. In Ringi, a manager's task is to communicate. In quality control (Q-C) circles,…
WE-A-210-00: Educational: Diagnostic Ultrasound QA
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
This presentation will focus on the present role of ultrasound medical physics in clinical practices. The first part of the presentation will provide an overview of ultrasound QC methodologies and testing procedures. A brief review of ultrasound phantoms utilized in these testing procedures will be presented. The second part of the presentation will summarize ultrasound imaging technical standards and professional guidelines by the American College of Radiology (ACR), American Institute of Ultrasound in Medicine (AIUM), American Association of Physicists in Medicine (AAPM) and International Electrotechnical Commission (IEC). The current accreditation requirements by ACR and AIUM for ultrasound practices will be described and the practical aspects of implementing QC programs to be compliant with these requirements will be discussed. Learning Objectives: Achieve familiarity with common ultrasound QC test methods and ultrasound phantoms. Understand the coverage of the existing testing standards and professional guidelines on diagnostic ultrasound imaging. Learn what a medical physicist needs to know about ultrasound program accreditation and be able to implement ultrasound QC programs accordingly.
jqcML: an open-source java API for mass spectrometry quality control data in the qcML format.
Bittremieux, Wout; Kelchtermans, Pieter; Valkenborg, Dirk; Martens, Lennart; Laukens, Kris
2014-07-03
The awareness that systematic quality control is an essential factor to enable the growth of proteomics into a mature analytical discipline has increased over the past few years. To this aim, a controlled vocabulary and document structure have recently been proposed by Walzer et al. to store and disseminate quality-control metrics for mass-spectrometry-based proteomics experiments, called qcML. To facilitate the adoption of this standardized quality control routine, we introduce jqcML, a Java application programming interface (API) for the qcML data format. First, jqcML provides a complete object model to represent qcML data. Second, jqcML provides the ability to read, write, and work in a uniform manner with qcML data from different sources, including the XML-based qcML file format and the relational database qcDB. Interaction with the XML-based file format is obtained through the Java Architecture for XML Binding (JAXB), while generic database functionality is obtained by the Java Persistence API (JPA). jqcML is released as open-source software under the permissive Apache 2.0 license and can be downloaded from https://bitbucket.org/proteinspector/jqcml.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hoisak, J; Manger, R; Dragojevic, I
Purpose: To perform a failure mode and effects analysis (FMEA) of the process for treating superficial skin cancers with the Xoft Axxent electronic brachytherapy (eBx) system, given the recent introduction of expanded quality control (QC) initiatives at our institution. Methods: A process map was developed listing all steps in superficial treatments with Xoft eBx, from the initial patient consult to the completion of the treatment course. The process map guided the FMEA to identify the failure modes for each step in the treatment workflow and assign Risk Priority Numbers (RPN), calculated as the product of the failure mode's probability of occurrence (O), severity (S) and lack of detectability (D). FMEA was done with and without the inclusion of recent QC initiatives such as increased staffing, physics oversight, standardized source calibration, treatment planning and documentation. The failure modes with the highest RPNs were identified and contrasted before and after introduction of the QC initiatives. Results: Based on the FMEA, the failure modes with the highest RPN were related to source calibration, treatment planning, and patient setup/treatment delivery (Fig. 1). The introduction of additional physics oversight, standardized planning and safety initiatives such as checklists and time-outs reduced the RPNs of these failure modes. High-risk failure modes that could be mitigated with improved hardware and software interlocks were identified. Conclusion: The FMEA analysis identified the steps in the treatment process presenting the highest risk. The introduction of enhanced QC initiatives mitigated the risk of some of these failure modes by decreasing their probability of occurrence and increasing their detectability. This analysis demonstrates the importance of well-designed QC policies, procedures and oversight in a Xoft eBx programme for treatment of superficial skin cancers. Unresolved high-risk failure modes highlight the need for non-procedural quality initiatives such as improved planning software and more robust hardware interlock systems.
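A small sketch of the RPN bookkeeping behind such an FMEA: each failure mode is scored for occurrence (O), severity (S), and lack of detectability (D), and ranked by RPN = O × S × D. The failure modes and scores below are illustrative, not the study's data.

```python
# Illustrative FMEA ranking: RPN = O * S * D, each score typically on a
# 1-10 scale. The entries below are hypothetical examples.
failure_modes = [
    ("source calibration error",   4, 9, 6),
    ("treatment planning mistake", 3, 8, 4),
    ("patient setup deviation",    5, 7, 3),
]

ranked = sorted(failure_modes, key=lambda fm: fm[1] * fm[2] * fm[3], reverse=True)
for name, o, s, d in ranked:
    print(f"{name}: RPN = {o} * {s} * {d} = {o * s * d}")
```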
Parvin, C A
1993-03-01
The error detection characteristics of quality-control (QC) rules that use control observations within a single analytical run are investigated. Unlike the evaluation of QC rules that span multiple analytical runs, most of the fundamental results regarding the performance of QC rules applied within a single analytical run can be obtained from statistical theory, without the need for simulation studies. The case of two control observations per run is investigated for ease of graphical display, but the conclusions can be extended to more than two control observations per run. Results are summarized in a graphical format that offers many interesting insights into the relations among the various QC rules. The graphs provide heuristic support to the theoretical conclusions that no QC rule is best under all error conditions, but the multirule that combines the mean rule and a within-run standard deviation rule offers an attractive compromise.
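The kind of closed-form calculation this paper relies on can be sketched as follows: for a within-run mean rule with n control observations, the probability of rejecting a run containing a systematic shift follows directly from the normal distribution. The parameterization below (control limits at k standard errors of the control mean) is an assumption made for illustration, not the paper's exact rule set.

```python
# Sketch: probability that a within-run mean rule detects a systematic shift
# of `delta` SDs. With n control observations, the control mean is
# N(delta, 1/n); the rule rejects when the mean exceeds +/- k standard errors.
from math import erf, sqrt

def phi(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def p_reject_mean_rule(delta, n=2, k=3.0):
    se = 1.0 / sqrt(n)                     # SE of the control mean
    upper = (k * se - delta) / se
    lower = (-k * se - delta) / se
    return 1.0 - (phi(upper) - phi(lower))

for delta in (0.0, 1.0, 2.0, 3.0, 4.0):
    print(f"shift = {delta:.1f} SD -> P(reject) = {p_reject_mean_rule(delta):.3f}")
```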
40 CFR 98.364 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2014 CFR
2014-07-01
... or operator shall document the procedures used to ensure the accuracy of gas flow rate, gas... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Manure Management § 98.364 Monitoring and QA/QC requirements... fraction of total manure managed in each system component. (c) The CH4 concentration of gas from digesters...
40 CFR 98.364 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2013 CFR
2013-07-01
... or operator shall document the procedures used to ensure the accuracy of gas flow rate, gas... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Manure Management § 98.364 Monitoring and QA/QC requirements... fraction of total manure managed in each system component. (c) The CH4 concentration of gas from digesters...
40 CFR 98.364 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2012 CFR
2012-07-01
... or operator shall document the procedures used to ensure the accuracy of gas flow rate, gas... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Manure Management § 98.364 Monitoring and QA/QC requirements... fraction of total manure managed in each system component. (c) The CH4 concentration of gas from digesters...
Kaufmann-Kolle, Petra; Szecsenyi, Joachim; Broge, Björn; Haefeli, Walter Emil; Schneider, Antonius
2011-01-01
The purpose of this cluster-randomised controlled trial was to evaluate the efficacy of quality circles (QCs) working either with general data-based feedback or with an open benchmark within the field of asthma care and drug-drug interactions. Twelve QCs, involving 96 general practitioners from 85 practices, were randomised. Six QCs worked with traditional anonymous feedback and six with an open benchmark. Two QC meetings supported with feedback reports were held covering the topics "drug-drug interactions" and "asthma"; in both cases discussions were guided by a trained moderator. Outcome measures included health-related quality of life and patient satisfaction with treatment, asthma severity and number of potentially inappropriate drug combinations, as well as the general practitioners' satisfaction with the performance of the QC. A significant improvement in the treatment of asthma was observed in both trial arms. However, there was only a slight improvement regarding inappropriate drug combinations. There were no relevant differences between the groups with open benchmark (B-QC) and traditional quality circles (T-QC). The physicians' satisfaction with the QC performance was significantly higher in the T-QCs. General practitioners seem to take a critical view of open benchmarking in quality circles. Caution should be used when implementing benchmarking in a quality circle, as it did not improve healthcare when compared to the traditional procedure with anonymised comparisons. Copyright © 2011. Published by Elsevier GmbH.
Krishnan, S; Webb, S; Henderson, A R; Cheung, C M; Nazir, D J; Richardson, H
1999-03-01
The Laboratory Proficiency Testing Program (LPTP) assesses the analytical performance of all licensed laboratories in Ontario. The LPTP Enzymes, Cardiac Markers, and Lipids Committee conducted a "Patterns of Practice" survey to assess the in-house quality control (QC) practices of laboratories in Ontario using cholesterol as the QC paradigm. The survey was questionnaire-based, seeking information on statistical calculations, software rules, review process and data retention, and so on. Copies of the in-house cholesterol QC graphs were requested. A total of 120 of 210 laboratories were randomly chosen to receive the questionnaires during 1995 and 1996; 115 laboratories responded, although some did not answer all questions. The majority calculate means and standard deviations (SD) every month, using anywhere from 4 to >100 data points. 65% use a fixed mean and SD, while 17% use means calculated from the previous month. A few use a floating or cumulative mean. Some laboratories that do not use fixed means use a fixed SD. About 90% use some form of statistical quality control rules. The most common rules used to detect random error are 1-3s/R-4s, while 2-2s/4-1s/10-x are used for systematic errors. About 20% did not assay any QC at levels >5.5 mmol/L. Quality control data are reviewed daily (technologists), weekly and monthly (supervisors/directors). Most laboratories retain their QC records for up to 3 years on paper and magnetic media. On some QC graphs the mean and SD, QC product lot number, or reference to action logs are not apparent. Quality control practices in Ontario are, therefore, disappointing. Improvement is required in the use of clinically appropriate concentrations of QC material and in documentation on QC graphs.
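For reference, the multirule combinations the survey asked about can be expressed compactly over a chronological list of z-scores (control results in SD units from the fixed mean). The sketch below uses common textbook definitions of the rules and is illustrative rather than a validated implementation; in particular, R-4s is approximated here as a range check across consecutive results.

```python
# Compact Westgard multirule check over chronological z-scores (illustrative).
def westgard_violations(z):
    v = []
    if any(abs(x) > 3 for x in z):
        v.append("1-3s")                          # random error
    if any(z[i] > 2 and z[i + 1] > 2 or z[i] < -2 and z[i + 1] < -2
           for i in range(len(z) - 1)):
        v.append("2-2s")                          # systematic error
    if any(abs(z[i] - z[i + 1]) > 4 for i in range(len(z) - 1)):
        v.append("R-4s")                          # random error
    if any(all(x > 1 for x in z[i:i + 4]) or all(x < -1 for x in z[i:i + 4])
           for i in range(len(z) - 3)):
        v.append("4-1s")                          # systematic error
    if any(all(x > 0 for x in z[i:i + 10]) or all(x < 0 for x in z[i:i + 10])
           for i in range(len(z) - 9)):
        v.append("10-x")                          # systematic error
    return v

print(westgard_violations([0.4, 2.3, 2.6, -0.1, 1.2]))  # expect ['2-2s']
```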
QA/QC in the laboratory. Session F
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hood, F.C.
1992-05-01
Quality assurance and quality control (QA/QC) of analytical chemistry laboratory activities are essential to the validity and usefulness of resultant data. However, in themselves, conventional QA/QC measures will not always ensure that fraudulent data are not generated. Conventional QA/QC measures are based on the assumption that work will be done in good faith; to assure against fraudulent practices, QA/QC measures must be tailored to specific analysis protocols in anticipation of intentional misapplication of those protocols. Application of specific QA/QC measures to ensure against fraudulent practices results in an increased administrative burden being placed on the analytical process; accordingly, in keeping with graded QA philosophy, data quality objectives must be used to identify specific points of concern for special control to minimize the administrative impact.
Gantner, Pierre; Mélard, Adeline; Damond, Florence; Delaugerre, Constance; Dina, Julia; Gueudin, Marie; Maillard, Anne; Sauné, Karine; Rodallec, Audrey; Tuaillon, Edouard; Plantier, Jean-Christophe; Rouzioux, Christine; Avettand-Fenoel, Véronique
2017-11-01
Viral reservoirs represent an important barrier to HIV cure. Accurate markers of HIV reservoirs are needed to develop multicenter studies. The aim of this multicenter quality control (QC) was to evaluate the inter-laboratory reproducibility of total HIV-1-DNA quantification. Ten laboratories of the ANRS-AC11 working group participated by quantifying HIV-DNA with a real-time qPCR assay (Biocentric) in four samples (QCMD). Good reproducibility was found between laboratories (standard deviation ≤ 0.2 log10 copies/10^6 PBMC) for the three positive QC samples that were correctly classified by each laboratory (QC1
Li, Junming; He, Zhiyao; Yu, Shui; Li, Shuangzhi; Ma, Qing; Yu, Yiyi; Zhang, Jialin; Li, Rui; Zheng, Yu; He, Gu; Song, Xiangrong
2012-10-01
In this study, quercetin (QC), which has cancer chemoprevention effects and anticancer potential, was loaded into polymeric micelles of methoxy poly(ethylene glycol)-cholesterol conjugate (mPEG-Chol) in order to increase its water solubility. mPEG-Chol, with a low critical micelle concentration (CMC) value (4.0 × 10^-7 M to 13 × 10^-7 M), was first synthesized via two steps of chemical modification of cholesterol by esterification, and QC was then incorporated into mPEG-Chol micelles by a self-assembly method. After the process parameters were optimized, QC-loaded micelles had high drug loading (3.66%) and entrapment efficiency (93.51%) and a nano-sized diameter (116 nm). DSC analysis demonstrated that QC had been incorporated non-covalently into the micelles and existed as an amorphous state or a solid solution in the polymeric matrix. A freeze-dried formulation with the addition of 1% (w/v) mannitol as cryoprotectant was successfully developed for the long-term storage of QC-loaded micelles. Compared to free QC, QC-loaded micelles released QC more slowly. Moreover, the release of QC from micelles was slightly faster in PBS at pH 5 than in PBS at pH 7.4, which implied that QC-loaded micelles might be pH-sensitive and thereby selectively deliver QC to tumor tissue, limiting unwanted side effects. Therefore, mPEG-Chol is a promising micellar vector for the controlled and targeted drug delivery of QC to tumors, and QC-loaded micelles are worth further investigation as a potential formulation for cancer chemoprevention and treatment.
A comprehensive quality control workflow for paired tumor-normal NGS experiments.
Schroeder, Christopher M; Hilke, Franz J; Löffler, Markus W; Bitzer, Michael; Lenz, Florian; Sturm, Marc
2017-06-01
Quality control (QC) is an important part of all NGS data analysis stages. Many available tools calculate QC metrics from different analysis steps of single-sample experiments (raw reads, mapped reads and variant lists). Multi-sample experiments, such as sequencing of tumor-normal pairs, require additional QC metrics to ensure the validity of results. These multi-sample QC metrics still lack standardization. We therefore suggest a new workflow for QC of DNA sequencing of tumor-normal pairs. With this workflow, well-known single-sample QC metrics and additional metrics specific to tumor-normal pairs can be calculated. The segmentation into different tools offers high flexibility and allows reuse for other purposes. All tools produce qcML, a generic XML format for QC of -omics experiments. qcML uses quality metrics defined in an ontology, which was adapted for NGS. All QC tools are implemented in C++ and run both under Linux and Windows. Plotting requires python 2.7 and matplotlib. The software is available under the 'GNU General Public License version 2' as part of the ngs-bits project: https://github.com/imgag/ngs-bits. christopher.schroeder@med.uni-tuebingen.de. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
Automated quality control in a file-based broadcasting workflow
NASA Astrophysics Data System (ADS)
Zhang, Lina
2014-04-01
Benefiting from the development of information and internet technologies, television broadcasting is transforming from inefficient tape-based production and distribution to integrated file-based workflows. However, no matter how many changes have taken place, successful broadcasting still depends on the ability to deliver a consistent, high-quality signal to the audience. After the transition from tape to file, traditional methods of manual quality control (QC) become inadequate, subjective, and inefficient. Based on China Central Television's fully file-based workflow at its new site, this paper introduces an automated quality control test system for accurate detection of hidden defects in media content. It discusses the system framework and workflow control when automated QC is added. It puts forward QC criteria and presents QC software that follows these criteria. It also reports experiments on QC speed using parallel processing and distributed computing. The performance of the test system shows that the adoption of automated QC can make production effective and efficient, and help the station achieve a competitive advantage in the media market.
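The speed-up strategy mentioned in the abstract can be sketched with Python's standard concurrency tools. In the sketch below, `inspect_file` is a hypothetical stand-in for a real per-file content check (black frames, silence, and similar hidden defects); the file names are invented.

```python
# Sketch of parallel media QC: run a (hypothetical) per-file check across
# many files concurrently with a process pool.
from concurrent.futures import ProcessPoolExecutor

def inspect_file(path):
    # Placeholder: a real implementation would decode the file and scan
    # for hidden defects; here we only report the path as checked.
    return path, "ok"

def run_qc(paths, workers=4):
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return dict(pool.map(inspect_file, paths))

if __name__ == "__main__":
    results = run_qc([f"clip_{i}.mxf" for i in range(8)])
    print(results)
```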
Dietel, Manfred; Bubendorf, Lukas; Dingemans, Anne-Marie C; Dooms, Christophe; Elmberger, Göran; García, Rosa Calero; Kerr, Keith M; Lim, Eric; López-Ríos, Fernando; Thunnissen, Erik; Van Schil, Paul E; von Laffert, Maximilian
2016-01-01
Background: There is currently no Europe-wide consensus on the appropriate preanalytical measures and workflow to optimise procedures for tissue-based molecular testing of non-small-cell lung cancer (NSCLC). To address this, a group of lung cancer experts (see list of authors) convened to discuss and propose standard operating procedures (SOPs) for NSCLC. Methods: Based on earlier meetings and scientific expertise on lung cancer, a multidisciplinary group meeting was aligned. The aim was to include all relevant aspects concerning NSCLC diagnosis. After careful consideration, the following topics were selected and each was reviewed by the experts: surgical resection and sampling; biopsy procedures for analysis; preanalytical and other variables affecting quality of tissue; tissue conservation; testing procedures for epidermal growth factor receptor, anaplastic lymphoma kinase and ROS proto-oncogene 1, receptor tyrosine kinase (ROS1) in lung tissue and cytological specimens; as well as standardised reporting and quality control (QC). Finally, an optimal workflow was described. Results: Suggested optimal procedures and workflows are discussed in detail. The broad consensus was that the complex workflow presented can only be executed effectively by an interdisciplinary approach using a well-trained team. Conclusions: To optimise diagnosis and treatment of patients with NSCLC, it is essential to establish SOPs that are adaptable to the local situation. In addition, a continuous QC system and a local multidisciplinary tumour-type-oriented board are essential. PMID:26530085
Folks, Russell D; Garcia, Ernest V; Taylor, Andrew T
2007-03-01
Quantitative nuclear renography has numerous potential sources of error. We previously reported the initial development of a computer software module for comprehensively addressing the issue of quality control (QC) in the analysis of radionuclide renal images. The objective of this study was to prospectively test the QC software. The QC software works in conjunction with standard quantitative renal image analysis using a renal quantification program. The software saves a text file that summarizes QC findings as possible errors in user-entered values, calculated values that may be unreliable because of the patient's clinical condition, and problems relating to acquisition or processing. To test the QC software, a technologist not involved in software development processed 83 consecutive nontransplant clinical studies. The QC findings of the software were then tabulated. QC events were defined as technical (study descriptors that were out of range or were entered and then changed, unusually sized or positioned regions of interest, or missing frames in the dynamic image set) or clinical (calculated functional values judged to be erroneous or unreliable). Technical QC events were identified in 36 (43%) of 83 studies. Clinical QC events were identified in 37 (45%) of 83 studies. Specific QC events included starting the camera after the bolus had reached the kidney, dose infiltration, oversubtraction of background activity, and missing frames in the dynamic image set. QC software has been developed to automatically verify user input, monitor calculation of renal functional parameters, summarize QC findings, and flag potentially unreliable values for the nuclear medicine physician. Incorporation of automated QC features into commercial or local renal software can reduce errors and improve technologist performance and should improve the efficiency and accuracy of image interpretation.
Eight years of quality control in Bulgaria: impact on mammography practice.
Avramova-Cholakova, S; Lilkov, G; Kaneva, M; Terziev, K; Nakov, I; Mutkurov, N; Kovacheva, D; Ivanova, M; Vasilev, D
2015-07-01
The requirements for quality control (QC) in diagnostic radiology were introduced into Bulgarian legislation in 2005. Hospital medical physicists and several private medical physics groups provide QC services to radiology departments. The aim of this study was to analyse data from QC tests in mammography and to investigate the impact of the introduction of QC on mammography practice in the country. The study was coordinated by the National Centre of Radiobiology and Radiation Protection. All medical physics services were requested to fill in standardised forms with information about the most important parameters routinely measured during QC. All QC service providers responded. Results demonstrated significant improvement of practice since the introduction of QC, with a reduction of established deviations from 65 % during the first year to 7 % in the last year. Systems that did not meet the acceptability criteria were suspended from use. The performance of automatic exposure control and digital detectors is not regularly tested because of the absence of requirements in the legislation. The need for updated guidance and training of medical physicists to reflect the change in technology was demonstrated. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Quality Control in Primary Schools: Progress from 2001-2006
ERIC Educational Resources Information Center
Hofman, Roelande H.; de Boom, Jan; Hofman, W. H. Adriaan
2010-01-01
This article presents findings of research into the quality control (QC) of schools from 2001-2006. In 2001 several targets for QC were set and the progress of 939 primary schools is presented. Furthermore, using cluster analysis, schools are classified into four QC-types that differ in their focus on school (self) evaluation and school…
Mora, Patricia; Faulkner, Keith; Mahmoud, Ahmed M; Gershan, Vesna; Kausik, Aruna; Zdesar, Urban; Brandan, María-Ester; Kurt, Serap; Davidović, Jasna; Salama, Dina H; Aribal, Erkin; Odio, Clara; Chaturvedi, Arvind K; Sabih, Zahida; Vujnović, Saša; Paez, Diana; Delis, Harry
2018-04-01
The International Atomic Energy Agency (IAEA), through a Coordinated Research Project on "Enhancing Capacity for Early Detection and Diagnosis of Breast Cancer through Imaging", brought together a group of mammography radiologists, medical physicists and radiographers to investigate current practices and improve procedures for the early detection of breast cancer by strengthening both the clinical and medical physics components. This paper addresses the medical physics component. The countries that participated in the CRP were Bosnia and Herzegovina, Costa Rica, Egypt, India, Kenya, the Frmr. Yug. Rep. of Macedonia, Mexico, Nigeria, Pakistan, Philippines, Slovenia, Turkey, Uganda, United Kingdom and Zambia. Ten institutions participated, applying IAEA quality control protocols to 9 digital and 3 analogue mammography systems. A spreadsheet for data collection was generated and distributed. Evaluation of image quality was done using TOR MAX and DMAM2 Gold phantoms. QC results for analogue equipment were satisfactory. QC tests performed on digital systems showed that improvements needed to be implemented, especially in thickness accuracy, signal difference to noise ratio (SDNR) values for achievable levels, uniformity and modulation transfer function (MTF). Mean glandular dose (MGD) was below internationally recommended levels for patient radiation protection. Evaluation of image quality by phantoms also indicated the need for improvement. Common activities facilitated improvement in mammography practice: medical physicists were trained in QC programs, infrastructure was improved and strengthened, and networking among medical physicists and radiologists was established and maintained over time. IAEA QC protocols provided a uniform approach to QC measurements. Copyright © 2018 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
Jones, A Kyle; Heintz, Philip; Geiser, William; Goldman, Lee; Jerjian, Khachig; Martin, Melissa; Peck, Donald; Pfeiffer, Douglas; Ranger, Nicole; Yorkston, John
2015-11-01
Quality control (QC) in medical imaging is an ongoing process and not just a series of infrequent evaluations of medical imaging equipment. The QC process involves designing and implementing a QC program, collecting and analyzing data, investigating results that are outside the acceptance levels for the QC program, and taking corrective action to bring these results back to an acceptable level. The QC process involves key personnel in the imaging department, including the radiologist, radiologic technologist, and the qualified medical physicist (QMP). The QMP performs detailed equipment evaluations and helps with oversight of the QC program, while the radiologic technologist is responsible for the day-to-day operation of the QC program. The continued need for ongoing QC in digital radiography has been highlighted in the scientific literature. The charge of this task group was to recommend consistency tests designed to be performed by a medical physicist or a radiologic technologist under the direction of a medical physicist to identify problems with an imaging system that need further evaluation by a medical physicist, including a fault tree to define actions that need to be taken when certain fault conditions are identified. The focus of this final report is the ongoing QC process, including rejected image analysis, exposure analysis, and artifact identification. These QC tasks are vital for the optimal operation of a department performing digital radiography.
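Rejected-image analysis of the kind this report recommends reduces to simple tallies: an overall reject rate plus a breakdown by cause. A minimal sketch, with hypothetical reject categories and counts:

```python
# Minimal rejected-image analysis: overall reject rate and share of each
# reject reason. Categories and counts are hypothetical.
from collections import Counter

total_exposures = 1250
rejects = Counter({"positioning": 28, "exposure error": 11,
                   "patient motion": 9, "artifact": 4})

n_rejects = sum(rejects.values())
print(f"overall reject rate: {n_rejects / total_exposures:.1%}")
for reason, n in rejects.most_common():
    print(f"  {reason}: {n} ({n / n_rejects:.0%} of rejects)")
```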
Song, Ting; Li, Nan; Zarepisheh, Masoud; Li, Yongbao; Gautier, Quentin; Zhou, Linghong; Mell, Loren; Jiang, Steve; Cerviño, Laura
2016-01-01
Intensity-modulated radiation therapy (IMRT) currently plays an important role in radiotherapy, but its treatment plan quality can vary significantly among institutions and planners. Treatment plan quality control (QC) is a necessary component for individual clinics to ensure that patients receive treatments with high therapeutic gain ratios. The voxel-weighting factor-based plan re-optimization mechanism has been shown to explore a larger Pareto surface (solution domain) and therefore to increase the possibility of finding an optimal treatment plan. In this study, we incorporated additional modules into an in-house developed voxel weighting factor-based re-optimization algorithm, enhancing it into a highly automated and accurate IMRT plan QC tool (TPS-QC tool). After importing an under-assessment plan, the TPS-QC tool can generate a QC report within 2 minutes. This QC report contains the plan quality determination as well as information supporting that determination. Finally, IMRT plan quality can be controlled by approving quality-passed plans and replacing quality-failed plans using the TPS-QC tool. The feasibility and accuracy of the proposed TPS-QC tool were evaluated using 25 clinically approved cervical cancer patient IMRT plans and 5 manually created poor-quality IMRT plans. The results showed high consistency between the QC report quality determinations and the actual plan quality. In the 25 clinically approved cases that the TPS-QC tool identified as passed, a greater difference could be observed for dosimetric endpoints for organs at risk (OAR) than for the planning target volume (PTV), implying that better dose sparing could be achieved in the OAR than in the PTV. Moreover, the dose-volume histogram (DVH) curves of the re-optimized plans produced by the TPS-QC tool satisfied the dosimetric criteria more frequently than did the under-assessment plans, and the criteria for unsatisfied dosimetric endpoints in the 5 poor-quality plans could typically be satisfied when the TPS-QC tool generated re-optimized plans, without sacrificing other dosimetric endpoints. In addition to its feasibility and accuracy, the proposed TPS-QC tool is user-friendly and easy to operate, both necessary characteristics for clinical use.
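The pass/fail determination such a tool makes ultimately rests on comparing dosimetric endpoints against criteria. The sketch below illustrates only that comparison step, with illustrative endpoints and limits rather than the authors' actual criteria, and assumes structure voxel doses are already available as arrays.

```python
import numpy as np

def d_at_volume(doses_gy, volume_fraction):
    """Dose covering at least `volume_fraction` of the structure (e.g. D95)."""
    return np.percentile(doses_gy, 100.0 * (1.0 - volume_fraction))

def v_at_dose(doses_gy, dose_gy):
    """Fraction of the structure receiving at least `dose_gy` (e.g. V40)."""
    return float(np.mean(doses_gy >= dose_gy))

def check_plan(ptv_doses, oar_doses, rx=45.0):
    # Illustrative pass/fail criteria only; a real QC tool would read a
    # protocol-specific criteria table rather than hard-coding limits.
    checks = {
        "PTV D95 >= Rx":     d_at_volume(ptv_doses, 0.95) >= rx,
        "OAR mean <= 30 Gy": float(np.mean(oar_doses)) <= 30.0,
        "OAR V40 <= 50%":    v_at_dose(oar_doses, 40.0) <= 0.5,
    }
    return checks, all(checks.values())
```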
AfterQC: automatic filtering, trimming, error removing and quality control for fastq data.
Chen, Shifu; Huang, Tanxiao; Zhou, Yanqing; Han, Yue; Xu, Mingyan; Gu, Jia
2017-03-14
Some applications, especially clinical applications requiring highly accurate sequencing data, must contend with unavoidable sequencing errors. Several tools have been proposed to profile sequencing quality, but few can quantify or correct sequencing errors. This unmet requirement motivated us to develop AfterQC, a tool that profiles sequencing errors and corrects most of them, in addition to providing highly automated quality control and data filtering features. Unlike most tools, AfterQC analyses the overlap of paired sequences in pair-end sequencing data. Based on this overlap analysis, AfterQC can detect and cut adapters and, furthermore, provides a novel function to correct erroneous bases in the overlapping regions. Another new feature is the detection and visualisation of sequencing bubbles, which are commonly found on flow cell lanes and may cause sequencing errors. Besides the usual per-cycle quality and base-content plotting, AfterQC also provides features such as polyX filtering (a polyX being a long run of the same base X), automatic trimming, and K-mer-based strand bias profiling. For each single FastQ file or pair of FastQ files, AfterQC filters out bad reads, detects and eliminates the sequencer's bubble effects, trims reads at the front and tail, detects sequencing errors and corrects some of them, and finally outputs clean data and generates HTML reports with interactive figures. AfterQC can run in batch mode with multiprocessing support; it can run on a single FastQ file, a single pair of FastQ files (for pair-end sequencing), or a folder, in which case all included FastQ files are processed automatically. Based on overlap analysis, AfterQC can estimate the sequencing error rate and profile the distribution of error transformations. The results of our error-profiling tests show that the error distribution is highly platform dependent. Much more than just another quality control (QC) tool, AfterQC performs quality control, data filtering, error profiling and base correction automatically. Experimental results show that AfterQC helps eliminate sequencing errors in pair-end sequencing data to provide much cleaner outputs, and consequently helps reduce false-positive variants, especially low-frequency somatic mutations. While providing rich configurable options, AfterQC can detect and set all options automatically and requires no arguments in most cases.
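The overlap analysis at the core of this approach can be illustrated in miniature: reverse-complement read 2, locate a low-mismatch overlap against read 1, and resolve disagreements in favour of the higher-quality base call. The following sketch shows the idea only; it is not AfterQC's implementation, and the minimum length and mismatch-rate threshold are assumptions.

```python
# Sketch of paired-end overlap analysis and base correction (not AfterQC's code).
COMP = str.maketrans("ACGTN", "TGCAN")

def revcomp(seq):
    return seq.translate(COMP)[::-1]

def find_overlap(r1, r2rc, min_len=20, max_mismatch_rate=0.1):
    """Offset at which r2rc overlaps r1, or None. Longest overlap tried first;
    the first offset with an acceptably low mismatch rate is accepted."""
    for off in range(0, len(r1) - min_len + 1):
        region = min(len(r1) - off, len(r2rc))
        mism = sum(a != b for a, b in zip(r1[off:off + region], r2rc[:region]))
        if mism / region <= max_mismatch_rate:
            return off
    return None

def correct_pair(r1, q1, r2, q2):
    """Resolve overlap mismatches toward the higher-quality base call."""
    r2rc, q2rc = revcomp(r2), q2[::-1]
    off = find_overlap(r1, r2rc)
    if off is None:
        return r1, r2                      # no confident overlap: leave reads as-is
    b1, b2 = list(r1), list(r2rc)
    for i in range(off, min(len(r1), off + len(b2))):
        j = i - off
        if b1[i] != b2[j]:
            if q1[i] >= q2rc[j]:           # Phred+33 characters compare correctly
                b2[j] = b1[i]              # trust read 1's base
            else:
                b1[i] = b2[j]              # trust read 2's base
    return "".join(b1), revcomp("".join(b2))
```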
[Development of quality assurance/quality control web system in radiotherapy].
Okamoto, Hiroyuki; Mochizuki, Toshihiko; Yokoyama, Kazutoshi; Wakita, Akihisa; Nakamura, Satoshi; Ueki, Heihachi; Shiozawa, Keiko; Sasaki, Koji; Fuse, Masashi; Abe, Yoshihisa; Itami, Jun
2013-12-01
Our purpose is to develop a QA/QC (quality assurance/quality control) web system using HTML (HyperText Markup Language) and the server-side scripting language PHP (Hypertext Preprocessor), which can be useful as a tool to share information about QA/QC in radiotherapy. The system proposed in this study can be easily built in one's own institute, because HTML is easy to handle. A QA/QC web system should serve two functions: (i) to allow the results of QA/QC for a radiotherapy machine, together with the manuals and reports necessary for routinely performing radiotherapy, to be reviewed through the system, so that transparency can be maintained by disclosing the results; and (ii) to present the institute's QA/QC protocol using pictures and movies for simplicity's sake, which can also serve as an educational tool for junior radiation technologists and medical physicists. By using this system, not only administrators but also all staff involved in radiotherapy can obtain information about the condition and accuracy of treatment machines.
40 CFR 98.252 - GHGs to report.
Code of Federal Regulations, 2010 CFR
2010-07-01
... § 98.253(f) and the monitoring and QA/QC methods, missing data procedures, reporting requirements, and... methods, missing data procedures, reporting requirements, and recordkeeping requirements of subpart P of...
40 CFR 98.252 - GHGs to report.
Code of Federal Regulations, 2014 CFR
2014-07-01
... calculation methodologies from § 98.253(f) and the monitoring and QA/QC methods, missing data procedures... methods, missing data procedures, reporting requirements, and recordkeeping requirements of subpart P of...
NASA Astrophysics Data System (ADS)
Kapanen, Mika; Tenhunen, Mikko; Hämäläinen, Tuomo; Sipilä, Petri; Parkkinen, Ritva; Järvinen, Hannu
2006-07-01
Quality control (QC) data of radiotherapy linear accelerators, collected by Helsinki University Central Hospital between the years 2000 and 2004, were analysed. The goal was to provide information for the evaluation and elaboration of QC of accelerator outputs and to propose a method for QC data analysis. Short- and long-term drifts in outputs were quantified by fitting empirical mathematical models to the QC measurements. Normally, long-term drifts were well (<=1%) modelled by either a straight line or a single-exponential function. A drift of 2% occurred in 18 ± 12 months. The shortest drift times of only 2-3 months were observed for some new accelerators just after the commissioning but they stabilized during the first 2-3 years. The short-term reproducibility and the long-term stability of local constancy checks, carried out with a sealed plane parallel ion chamber, were also estimated by fitting empirical models to the QC measurements. The reproducibility was 0.2-0.5% depending on the positioning practice of a device. Long-term instabilities of about 0.3%/month were observed for some checking devices. The reproducibility of local absorbed dose measurements was estimated to be about 0.5%. The proposed empirical model fitting of QC data facilitates the recognition of erroneous QC measurements and abnormal output behaviour, caused by malfunctions, offering a tool to improve dose control.
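Model fitting of this kind is straightforward to reproduce: fit both a straight line and a single-exponential to an output-constancy series and keep whichever leaves the smaller residual. A sketch on synthetic data follows, with illustrative model forms rather than the paper's exact parameterization.

```python
import numpy as np
from scipy.optimize import curve_fit

def linear(t, a, b):
    return a + b * t

def single_exp(t, a, b, tau):
    # Saturating drift toward a + b as t grows (illustrative form).
    return a + b * (1.0 - np.exp(-t / tau))

# t in months, y = output relative to baseline in % (synthetic example data)
t = np.arange(0.0, 24.0, 1.0)
rng = np.random.default_rng(1)
y = single_exp(t, 100.0, 2.5, 8.0) + rng.normal(0.0, 0.2, t.size)

for name, f, p0 in [("linear", linear, (100.0, 0.1)),
                    ("exponential", single_exp, (100.0, 2.0, 6.0))]:
    p, _ = curve_fit(f, t, y, p0=p0)
    rss = float(np.sum((y - f(t, *p)) ** 2))
    print(name, np.round(p, 3), f"RSS={rss:.3f}")

# The time to a 2% drift follows from the fitted parameters, e.g. for the
# exponential model solve a + b*(1 - exp(-t/tau)) = a + 2 for t.
```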
You, Jun; Zhou, Jinping; Li, Qian; Zhang, Lina
2012-03-20
As a weak base, β-glycerophosphate (β-GP) was used to spontaneously initiate gelation of quaternized cellulose (QC) solutions at body temperature. The QC/β-GP solutions are flowable below or at room temperature but gel rapidly under physiological conditions. In order to clarify the sol-gel transition process of the QC/β-GP systems, the complex was investigated by dynamic viscoelastic measurements. The shear storage modulus (G') and loss modulus (G″) as a function of (1) concentration of β-GP (c(β-GP)), (2) concentration of QC (c(QC)), (3) degree of substitution (DS; i.e., the average number of substituted hydroxyl groups in the anhydroglucose unit) of QC, (4) viscosity-average molecular weight (M(η)) of QC, and (5) solvent medium were studied by the oscillatory rheology. The sol-gel transition temperature of QC/β-GP solutions decreased with an increase of c(QC) and c(β-GP), the M(η) of QC, and a decrease of the DS of QC and pH of the solvent. The sol-gel transition temperature and time could be easily controlled by adjusting the concentrations of QC and β-GP, M(η) and DS of QC, and the solvent medium. Gels formed after heating were irreversible; i.e., after cooling to lower temperature they could not be dissolved to become liquid again. The aggregation and entanglement of QC chains, electrostatic interaction, and hydrogen bonding between QC and β-GP were the main factors responsible for the irreversible sol-gel transition behavior of QC/β-GP systems.
Oral Solid Dosage Form Disintegration Testing - The Forgotten Test.
Al-Gousous, Jozef; Langguth, Peter
2015-09-01
Since its inception in the 1930s, disintegration testing has become an important quality control (QC) test in the pharmaceutical industry, and disintegration test procedures for various dosage forms have been described by the different pharmacopoeias, although harmonization among them is still not complete. However, because complete disintegration does not necessarily imply complete dissolution, much more research has focused on dissolution than on disintegration testing. Nevertheless, owing to its simplicity, disintegration testing is an attractive replacement for dissolution testing in some cases, as recognized by the International Conference on Harmonization guidelines. Therefore, with proper research to overcome the associated challenges, the full potential of disintegration testing could be tapped, saving considerable effort allocated to QC testing and quality assurance. © 2014 Wiley Periodicals, Inc. and the American Pharmacists Association.
Vassileva, J; Dimov, A; Slavchev, A; Karadjov, A
2005-01-01
Results from a Bulgarian patient dose survey in diagnostic radiology are presented. Reference levels for entrance surface dose (ESD) were 0.9 mGy for chest radiography (PA), 30 mGy for lumbar spine (Lat), 10 mGy for pelvis, 5 mGy for skull (AP), 3 mGy for skull (Lat) and 13 mGy for mammography. Quality control (QC) programmes were proposed for various areas of diagnostic radiology. Film processing QC warranted special attention. Proposed QC programmes included parameters to be tested, level of expertise needed and two action levels: remedial and suspension. Programmes were tested under clinical conditions to assess initial results and draw conclusions for further QC system development. On the basis of international experience, measurement protocols were developed for all parameters tested. QC equipment was provided as part of the PHARE project. A future problem for QC programme implementation may be the small number of medical physics experts in diagnostic radiology.
Quality assurance and quality control of geochemical data—A primer for the research scientist
Geboy, Nicholas J.; Engle, Mark A.
2011-01-01
Geochemistry is a constantly expanding science. More and more, scientists are employing geochemical tools to help answer questions about the Earth and earth system processes. Scientists may assume that the responsibility of examining and assessing the quality of the geochemical data they generate is not theirs but rather that of the analytical laboratories to which their samples have been submitted. This assumption may be partially based on knowledge about internal and external quality assurance and quality control (QA/QC) programs in which analytical laboratories typically participate. Or there may be a perceived lack of time or resources to adequately examine data quality. Regardless of the reason, the lack of QA/QC protocols can lead to the generation and publication of erroneous data. Because the interpretations drawn from the data are primary products to U.S. Geological Survey (USGS) stakeholders, the consequences of publishing erroneous results can be significant. The principal investigator of a scientific study ultimately is responsible for the quality and interpretation of the project's findings, and thus must also play a role in the understanding, implementation, and presentation of QA/QC information about the data. Although occasionally ignored, QA/QC protocols apply not only to procedures in the laboratory but also in the initial planning of a research study and throughout the life of the project. Many of the tenets of developing a sound QA/QC program or protocols also parallel the core concepts of developing a good study: What is the main objective of the study? Will the methods selected provide data of enough resolution to answer the hypothesis? How should samples be collected? Are there known or unknown artifacts or contamination sources in the sampling and analysis methods? Assessing data quality requires communication between the scientists responsible for designing the study and those collecting samples, analyzing samples, treating data, and interpreting results. This primer has been developed to provide basic information and guidance about developing QA/QC protocols for geochemical studies. It is not intended to be a comprehensive guide but rather an introduction to key concepts tied to a list of relevant references for further reading. The guidelines are presented in stepwise order beginning with presampling considerations and continuing through final data interpretation. The goal of this primer is to outline basic QA/QC practices that scientists can use before, during, and after chemical analysis to ensure the validity of the data they collect with the goal of providing defendable results and conclusions.
Automated locomotor activity monitoring as a quality control assay for mass-reared tephritid flies.
Dominiak, Bernard C; Fanson, Benjamin G; Collins, Samuel R; Taylor, Phillip W
2014-02-01
The Sterile Insect Technique (SIT) requires vast numbers of consistently high-quality insects to be produced over long periods. Quality control (QC) procedures are critical to effective SIT, both providing quality assurance and warning of operational deficiencies. We here present a potential new QC assay for the mass rearing of Queensland fruit flies (Bactrocera tryoni Froggatt) for SIT: locomotor activity monitoring. We investigated whether automated locomotor activity monitors (LAMs), which simply detect how often a fly passes an infrared sensor in a glass tube, might provide similar insights to existing QC assays but with much greater economy. Activity levels were generally lower for females than for males, and declined over five days in the monitor for both sexes. Female activity levels were not affected by irradiation, but males irradiated at 60 or 70 Gy had reduced activity levels compared with unirradiated controls. We also found some evidence that mild heat shock of pupae results in adults with reduced activity. LAM offers a convenient, effective and economical assay to probe such changes. © 2013 Society of Chemical Industry.
A method to establish seismic noise baselines for automated station assessment
McNamara, D.E.; Hutt, C.R.; Gee, L.S.; Benz, H.M.; Buland, R.P.
2009-01-01
We present a method for quantifying station noise baselines and characterizing the spectral shape of out-of-nominal noise sources. Our intent is to automate this method in order to ensure that only the highest-quality data are used in rapid earthquake products at the NEIC. In addition, the station noise baselines provide a valuable tool to support the quality control of GSN and ANSS backbone data and metadata. The procedures addressed here are currently in development at the NEIC, and work is underway to understand how quickly changes from nominal can be observed and used within the NEIC processing framework. The spectral methods and software used to compute station baselines and described herein (PQLX) can be useful to both permanent and portable seismic station operators. Applications include: general seismic station and data quality control (QC), evaluation of instrument responses, assessment of near real-time communication system performance, characterization of site cultural noise conditions, and evaluation of sensor vault design, as well as assessment of gross network capabilities (McNamara et al. 2005). Future PQLX development plans include incorporating station baselines for automated QC methods and automating station status report generation and notification based on user-defined QC parameters. The PQLX software is available through the USGS (http://earthquake.usgs.gov/research/software/pqlx.php) and IRIS (http://www.iris.edu/software/pqlx/).
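The baseline idea, characterizing a station by percentiles of many windowed power spectra, can be sketched with standard tools. PQLX implements the full McNamara-Buland probability density function method; the simplified version below omits instrument-response correction and the usual octave-band smoothing.

```python
import numpy as np
from scipy.signal import welch

def noise_baseline(trace, fs, win_s=3600, percentiles=(10, 50, 90)):
    """Percentile PSD curves across hourly windows of a seismic trace.
    Simplified: no instrument-response removal, no octave-band smoothing."""
    n = int(win_s * fs)
    psds = []
    for start in range(0, len(trace) - n + 1, n):
        f, pxx = welch(trace[start:start + n], fs=fs, nperseg=4096)
        psds.append(10.0 * np.log10(pxx + 1e-30))   # power in dB
    psds = np.array(psds)
    return f, {p: np.percentile(psds, p, axis=0) for p in percentiles}

# A new window whose PSD falls well outside the 10th-90th percentile band
# at some frequencies would be flagged as out-of-nominal for review.
```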
Quality control in urinalysis.
Takubo, T; Tatsumi, N
1999-01-01
Quality control (QC) has been introduced in laboratories, and QC surveys in urinalysis have been performed by the College of American Pathologists, the Japanese Association of Medical Technologists, the Osaka Medical Association, and manufacturers. A QC survey in urinalysis of synthetic urine using a reagent strip and instrument made by the same manufacturer, and using an automated urine cell analyser, provided satisfactory agreement among laboratories. A QC survey in urinalysis of synthetic urine using reagent strips and instruments from various manufacturers showed differences in the determined values among manufacturers, and between manual and automated methods, because the reagent strips and instruments have different characteristics. A QC photo survey in urinalysis based on microscopic photos of urine sediment constituents showed differences in the identification of cells among laboratories. These results indicate the need to standardize the reagent strip method, manual and automated methods, and synthetic urine.
222-S Laboratory Quality Assurance Plan. Revision 1
DOE Office of Scientific and Technical Information (OSTI.GOV)
Meznarich, H.K.
1995-07-31
This Quality Assurance Plan provides quality assurance (QA) guidance, regulatory QA requirements (e.g., 10 CFR 830.120), and quality control (QC) specifications for analytical services. This document follows the U.S. Department of Energy (DOE) issued Hanford Analytical Services Quality Assurance Plan (HASQAP). In addition, this document meets the objectives of the Quality Assurance Program provided in the WHC-CM-4-2, Section 2.1. Quality assurance elements required in the Guidelines and Specifications for Preparing Quality Assurance Program Plans (QAMS-004) and Interim Guidelines and Specifications for Preparing Quality Assurance Project Plans (QAMS-005) from the US Environmental Protection Agency (EPA) are covered throughout this document. A quality assurance index is provided in Appendix A. This document also provides and/or identifies the procedural information that governs laboratory operations. The personnel of the 222-S Laboratory and the Standards Laboratory, including managers, analysts, QA/QC staff, auditors, and support staff, shall use this document as guidance and instructions for their operational and quality assurance activities. Other organizations that conduct activities described in this document for the 222-S Laboratory shall follow this QA/QC document.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, Joseph; Pirrung, Meg; McCue, Lee Ann
2017-06-09
FQC is software that facilitates quality control of FASTQ files by carrying out a QC protocol using FastQC, parsing results, and aggregating quality metrics into an interactive dashboard designed to richly summarize individual sequencing runs. The dashboard groups samples in dropdowns for navigation among the data sets, utilizes human-readable configuration files to manipulate the pages and tabs, and is extensible with CSV data.
Quantum key distribution using card, base station and trusted authority
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nordholt, Jane E.; Hughes, Richard John; Newell, Raymond Thorson
Techniques and tools for quantum key distribution ("QKD") between a quantum communication ("QC") card, base station and trusted authority are described herein. In example implementations, a QC card contains a miniaturized QC transmitter and couples with a base station. The base station provides a network connection with the trusted authority and can also provide electric power to the QC card. When coupled to the base station, after authentication by the trusted authority, the QC card acquires keys through QKD with the trusted authority. The keys can be used to set up secure communication, for authentication, for access control, or for other purposes. The QC card can be implemented as part of a smart phone or other mobile computing device, or the QC card can be used as a fillgun for distribution of the keys.
Applying Sigma Metrics to Reduce Outliers.
Litten, Joseph
2017-03-01
Sigma metrics can be used to predict assay quality, allowing easy comparison of instrument quality and predicting which tests will require minimal quality control (QC) rules to monitor the performance of the method. A Six Sigma QC program can result in fewer controls and fewer QC failures for methods with a sigma metric of 5 or better. The higher the number of methods with a sigma metric of 5 or better, the lower the costs for reagents, supplies, and control material required to monitor the performance of the methods. Copyright © 2016 Elsevier Inc. All rights reserved.
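The sigma metric itself is a simple function of the allowable total error (TEa), the method bias, and its imprecision (CV), all expressed in percent. A small sketch with an illustrative example:

```python
def sigma_metric(tea_pct, bias_pct, cv_pct):
    """Sigma metric = (TEa - |bias|) / CV, with all inputs in percent."""
    return (tea_pct - abs(bias_pct)) / cv_pct

# Example (illustrative values): TEa 10%, bias 2%, CV 1.5% -> sigma ~ 5.3,
# i.e. a method that can be monitored with simple QC rules and few controls.
print(round(sigma_metric(10.0, 2.0, 1.5), 2))
```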
Francy, D.S.; Jones, A.L.; Myers, Donna N.; Rowe, G.L.; Eberle, Michael; Sarver, K.M.
1998-01-01
The U.S. Geological Survey (USGS), Water Resources Division (WRD), requires that quality-assurance/quality-control (QA/QC) activities be included in any sampling and analysis program. Operational QA/QC procedures address local needs while incorporating national policies. Therefore, specific technical policies were established for all activities associated with water-quality projects being done by the Ohio District. The policies described in this report provide Ohio District personnel, cooperating agencies, and others with a reference manual on QA/QC procedures that are followed in collecting and analyzing water-quality samples and reporting water-quality information in the Ohio District. The project chief, project support staff, District Water-Quality Specialist, and District Laboratory Coordinator are all involved in planning and implementing QA/QC activities at the district level. The District Chief and other district-level managers provide oversight, and the Regional Water-Quality Specialist, Office of Water Quality (USGS headquarters), and the Branch of Quality Systems within the Office of Water Quality create national QA/QC policies and provide assistance to District personnel. In the literature, the quality of all measurement data is expressed in terms of precision, variability, bias, accuracy, completeness, representativeness, and comparability. In the Ohio District, bias and variability will be used to describe quality-control data generated from samples in the field and laboratory. Each project chief must plan for implementation and financing of the QA/QC activities necessary to achieve data-quality objectives. At least 15 percent of the total project effort must be directed toward QA/QC activities. Of this total, 5-10 percent will be used for collection and analysis of quality-control samples. This is an absolute minimum, and more may be required based on project objectives. Proper techniques must be followed in the collection and processing of surface-water, ground-water, biological, precipitation, bed-sediment, bedload, suspended-sediment, and solid-phase samples. These techniques are briefly described in this report and are extensively documented. The reference documents listed in this report will be kept by the District librarian and District Water-Quality Specialist and updated regularly so that they are available to all District staff. Proper handling and documentation before, during, and after field activities are essential to ensure the integrity of the sample and to prevent erroneous reporting of data results. Field sites are to be properly identified and entered into the data base before field data-collection activities begin. During field activities, field notes are to be completed and sample bottles appropriately labeled and stored. After field activities, all paperwork is to be completed promptly and samples transferred to the laboratory within allowable holding times. All equipment used by District personnel for the collection and processing of water-quality samples is to be properly operated, maintained, and calibrated by project personnel. This includes equipment for onsite measurement of water-quality characteristics (temperature, specific conductance, pH, dissolved oxygen, alkalinity, acidity, and turbidity) and equipment and instruments used for biological sampling. The District Water-Quality Specialist and District Laboratory Coordinator are responsible for preventive maintenance and calibration of equipment in the Ohio District laboratory.
The USGS National Water Quality Laboratory in Arvada, Colo., is the primary source of analytical services for most project work done by the Ohio District. Analyses done at the Ohio District laboratory are usually those that must be completed within a few hours of sample collection. Contract laboratories or other USGS laboratories are sometimes used instead of the NWQL or the Ohio District laboratory. When a contract laboratory is used, the projec
Embankment quality and assessment of moisture control implementation.
DOT National Transportation Integrated Search
2016-02-01
A specification for contractor moisture quality control (QC) in roadway embankment construction has been in use for approximately 10 years in Iowa on about 190 projects. The use of this QC specification and the development of the soils certificatio...
Southam, Lorraine; Panoutsopoulou, Kalliope; Rayner, N William; Chapman, Kay; Durrant, Caroline; Ferreira, Teresa; Arden, Nigel; Carr, Andrew; Deloukas, Panos; Doherty, Michael; Loughlin, John; McCaskie, Andrew; Ollier, William E R; Ralston, Stuart; Spector, Timothy D; Valdes, Ana M; Wallis, Gillian A; Wilkinson, J Mark; Marchini, Jonathan; Zeggini, Eleftheria
2011-05-01
Imputation is an extremely valuable tool in conducting and synthesising genome-wide association studies (GWASs). Directly typed SNP quality control (QC) is thought to affect imputation quality. It is, therefore, common practice to use quality-controlled (QCed) data as input for imputing genotypes. This study aims to determine the effect of commonly applied QC steps on imputation outcomes. We performed several iterations of imputing SNPs across chromosome 22 in a dataset consisting of 3177 samples with Illumina 610k (Illumina, San Diego, CA, USA) GWAS data, applying different QC steps each time. The imputed genotypes were compared with the directly typed genotypes. In addition, we investigated the correlation between alternatively QCed data. We also applied a series of post-imputation QC steps balancing elimination of poorly imputed SNPs and information loss. We found that the difference between the unQCed data and the fully QCed data on imputation outcome was minimal. Our study shows that imputation of common variants is generally very accurate and robust to GWAS QC, which is not a major factor affecting imputation outcome. A minority of common-frequency SNPs with particular properties cannot be accurately imputed regardless of QC stringency. These findings may not generalise to the imputation of low frequency and rare variants.
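The comparison made here between imputed and directly typed genotypes is, at its simplest, a concordance of best-guess calls plus a dosage correlation. A minimal sketch, assuming hard calls coded 0/1/2 and imputed dosages on [0, 2]:

```python
import numpy as np

def imputation_accuracy(typed, dosage):
    """typed: 0/1/2 hard calls; dosage: imputed expected allele counts."""
    best_guess = np.rint(dosage).astype(int)            # nearest genotype
    concordance = float(np.mean(best_guess == typed))
    r2 = float(np.corrcoef(typed, dosage)[0, 1] ** 2)   # dosage r^2
    return concordance, r2

# Toy example with made-up values:
typed = np.array([0, 1, 2, 1, 0, 2, 1, 1])
dosage = np.array([0.1, 1.2, 1.9, 0.8, 0.2, 1.7, 1.1, 0.9])
print(imputation_accuracy(typed, dosage))
```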
Betsou, Fay; Bulla, Alexandre; Cho, Sang Yun; Clements, Judith; Chuaqui, Rodrigo; Coppola, Domenico; De Souza, Yvonne; De Wilde, Annemieke; Grizzle, William; Guadagni, Fiorella; Gunter, Elaine; Heil, Stacey; Hodgkinson, Verity; Kessler, Joseph; Kiehntopf, Michael; Kim, Hee Sung; Koppandi, Iren; Shea, Katheryn; Singh, Rajeev; Sobel, Marc; Somiari, Stella; Spyropoulos, Demetri; Stone, Mars; Tybring, Gunnel; Valyi-Nagy, Klara; Van den Eynden, Gert; Wadhwa, Lalita
2016-10-01
This technical report presents quality control (QC) assays that can be performed in order to qualify clinical biospecimens that have been biobanked for use in research. Some QC assays are specific to a disease area. Some QC assays are specific to a particular downstream analytical platform. When such a qualification is not possible, QC assays are presented that can be performed to stratify clinical biospecimens according to their biomolecular quality.
Bulla, Alexandre; Cho, Sang Yun; Clements, Judith; Chuaqui, Rodrigo; Coppola, Domenico; De Souza, Yvonne; De Wilde, Annemieke; Grizzle, William; Guadagni, Fiorella; Gunter, Elaine; Heil, Stacey; Hodgkinson, Verity; Kessler, Joseph; Kiehntopf, Michael; Kim, Hee Sung; Koppandi, Iren; Shea, Katheryn; Singh, Rajeev; Sobel, Marc; Somiari, Stella; Spyropoulos, Demetri; Stone, Mars; Tybring, Gunnel; Valyi-Nagy, Klara; Van den Eynden, Gert; Wadhwa, Lalita
2016-01-01
This technical report presents quality control (QC) assays that can be performed in order to qualify clinical biospecimens that have been biobanked for use in research. Some QC assays are specific to a disease area. Some QC assays are specific to a particular downstream analytical platform. When such a qualification is not possible, QC assays are presented that can be performed to stratify clinical biospecimens according to their biomolecular quality. PMID:27046294
Jackson, David; Bramwell, David
2013-12-16
Proteomics technologies can be effective for the discovery and assay of protein forms altered with disease. However, few examples of successful biomarker discovery yet exist. Critical to addressing this is the widespread implementation of appropriate QC (quality control) methodology. Such QC should combine the rigour of clinical laboratory assays with a suitable treatment of the complexity of the proteome by targeting separate assignable causes of variation. We demonstrate an approach, metric and example workflow with which users can develop such targeted QC rules systematically and objectively, using a publicly available plasma DIGE data set. Hierarchical clustering analysis of standard channels is first used to discover correlated groups of features corresponding to specific assignable sources of technical variation. These effects are then quantified using a statistical distance metric and tracked on control charts. This allows measurement of process drift and the detection of runs that are outliers for any given effect. A known technical issue on originally rejected gels was detected, validating this approach, and relevant novel effects were also detected and classified effectively. Our approach was effective for 2-DE QC. Although we demonstrated it in a retrospective DIGE experiment, the principles would apply to ongoing QC and to other proteomic technologies. This work asserts that properly conducted QC is essential to proteomics discovery experiments. Its significance is that it provides one possible novel framework for applying such methods, with particular consideration of how to handle the complexity of the proteome. It not only focuses on 2-DE-based methodology but also demonstrates general principles. A combination of results and discussion based upon a publicly available data set is used to illustrate the approach and allows a structured discussion of factors that experimenters may wish to bear in mind in other situations. The demonstration is on retrospective data only for reasons of scope, but the principles applied are also important for ongoing QC, and this work serves as a step towards a later demonstration of that application. This article is part of a Special Issue entitled: Standardization and Quality Control in Proteomics. © 2013.
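One way to realize the workflow described, clustering correlated standard-channel features into assignable effects and then charting a per-run distance for each effect, is sketched below on synthetic data; the mean absolute z-score used here is a simple stand-in for the authors' statistical distance metric.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# X: runs x features matrix of standard-channel measurements (synthetic here).
rng = np.random.default_rng(0)
X = rng.normal(0.0, 1.0, (30, 12))
X[:, :4] += rng.normal(0.0, 1.0, (30, 1))   # four features sharing one effect

# 1. Group correlated features (columns) into candidate assignable causes.
Z = linkage(X.T, method="average", metric="correlation")
labels = fcluster(Z, t=0.5, criterion="distance")

# 2. For each cluster, chart a per-run distance and flag runs beyond 3 SD.
mu, sd = X.mean(axis=0), X.std(axis=0, ddof=1)
z = (X - mu) / sd
for c in np.unique(labels):
    stat = np.abs(z[:, labels == c]).mean(axis=1)   # simple distance metric
    limit = stat.mean() + 3.0 * stat.std(ddof=1)
    flagged = np.where(stat > limit)[0]
    print("cluster", c, "out-of-control runs:", flagged)
```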
Jordan, Gregor; Onami, Ichio; Heinrich, Julia; Staack, Roland F
2017-11-01
Assessment of active drug exposure of biologics may be crucial for drug development. Typically, ligand-binding assay methods are used to provide free/active drug concentrations. To what extent hybrid LC-MS/MS procedures enable correct 'active' drug quantification is currently under consideration. Experimental & results: The relevance of appropriate extraction conditions was evaluated with a hybrid target-capture immuno-affinity LC-MS/MS method using total and free/active quality controls (QCs). The rapid extraction (10 min) provided correct results, whereas overnight incubation resulted in significant overestimation of the free/active drug (monoclonal antibody) concentration. Conventional total QCs were inappropriate for determining optimal method conditions, in contrast to free/active QCs. The 'free/active analyte QC concept' enables development of appropriate extraction conditions for correct active drug quantification by hybrid LC-MS/MS.
INAA Application for Trace Element Determination in Biological Reference Material
NASA Astrophysics Data System (ADS)
Atmodjo, D. P. D.; Kurniawati, S.; Lestiani, D. D.; Adventini, N.
2017-06-01
Trace element determination in biological samples is often used in health and toxicology studies. Because trace elements can be both essential and toxic, their determination requires an accurate method, which implies that a good quality control (QC) procedure should be in place. In this study, QC for trace element determination in biological samples was applied by analyzing the Standard Reference Material (SRM) NIST 8414 Bovine Muscle using Instrumental Neutron Activation Analysis (INAA). Three selected trace elements, Fe, Zn, and Se, were determined. Accuracy is reported as percent recovery and precision as the percent coefficient of variation (%CV). The %recovery of Fe, Zn, and Se ranged between 99.4-107%, 92.7-103%, and 91.9-112%, respectively, whereas the %CV values were 2.92, 3.70, and 5.37%, respectively. These results show that the INAA method is precise and accurate for trace element determination in biological matrices.
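The accuracy and precision figures quoted reduce to two textbook formulas: recovery = measured mean / certified value × 100%, and CV = standard deviation / mean × 100%. A small sketch with made-up replicate values:

```python
import statistics

def recovery_pct(measured_mean, certified):
    return 100.0 * measured_mean / certified

def cv_pct(replicates):
    return 100.0 * statistics.stdev(replicates) / statistics.mean(replicates)

# Illustrative replicates against a hypothetical certified value of 71.0 mg/kg:
fe = [70.0, 72.0, 71.0]
print(recovery_pct(statistics.mean(fe), 71.0))   # ~100.0 %
print(cv_pct(fe))                                # ~1.4 %
```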
76 FR 67315 - Supplemental Nutrition Assistance Program: Quality Control Error Tolerance Threshold
Federal Register 2010, 2011, 2012, 2013, 2014
2011-11-01
...This direct final rule is amending the Quality Control (QC) review error threshold in our regulations from $25.00 to $50.00. The purpose for raising the QC error threshold is to make permanent the temporary threshold change that was required by the American Recovery and Reinvestment Act of 2009. This change does not have an impact on the public. The QC system measures the accuracy of the eligibility system for the Supplemental Nutrition Assistance Program (SNAP).
77 FR 75968 - Submission for OMB Review; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-26
... information unless it displays a currently valid OMB control number. Food and Nutrition Service Title: Quality... required to perform Quality Control (QC) review for the Supplemental Nutrition Assistance Program (SNAP). The FNS-380-1, Quality Control Review Schedule is for State use to collect both QC data and case...
WE-AB-206-00: Diagnostic QA/QC Hands-On Workshop
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
The involvement of medical physicists in diagnostic ultrasound imaging service is increasing due to QC and accreditation requirements. The goal of this ultrasound hands-on workshop is to demonstrate quality control (QC) testing in diagnostic ultrasound and to provide updates in ACR ultrasound accreditation requirements. The first half of this workshop will include two presentations reviewing diagnostic ultrasound QA/QC and ACR ultrasound accreditation requirements. The second half of the workshop will include live demonstrations of basic QC tests. An array of ultrasound testing phantoms and ultrasound scanners will be available for attendees to learn diagnostic ultrasound QC in a hands-on environment with live demonstrations and on-site instructors. The targeted attendees are medical physicists in diagnostic imaging. Learning Objectives: Gain familiarity with common elements of a QA/QC program for diagnostic ultrasound imaging; identify QC tools available for testing diagnostic ultrasound systems and learn how to use these tools; learn ACR ultrasound accreditation requirements. Jennifer Walter is an employee of American College of Radiology on Ultrasound Accreditation.
Robust modular product family design
NASA Astrophysics Data System (ADS)
Jiang, Lan; Allada, Venkat
2001-10-01
This paper presents a modified Taguchi methodology to improve the robustness of modular product families against changes in customer requirements. The general research questions posed in this paper are: (1) How can a product family (PF) be designed so that it is robust enough to accommodate future customer requirements? (2) How far into the future should designers look when designing a robust product family? An example of a simplified vacuum product family is used to illustrate the methodology. In the example, customer requirements are selected as signal factors; future changes in customer requirements are selected as noise factors; an index called the quality characteristic (QC) is defined to evaluate the vacuum product family; and the module instance matrix (M) is selected as the control factor. Initially, a relation between the objective function (QC) and the control factor (M) is established, and the feasible M space is then systematically explored using a simplex method to determine the optimum M and the corresponding QC values. Next, various noise levels at different time points are introduced into the system. For each noise level, the optimal values of M and QC are computed and plotted on a QC-chart. The tunable time period of the control factor (the module matrix, M) is computed using the QC-chart; it represents the maximum time for which a given control factor can be used to satisfy current and future customer needs. Finally, a robustness index is used to break the tunable time period into suitable time periods that designers should consider while designing product families.
Introducing Quality Control in the Chemistry Teaching Laboratory Using Control Charts
ERIC Educational Resources Information Center
Schazmann, Benjamin; Regan, Fiona; Ross, Mary; Diamond, Dermot; Paull, Brett
2009-01-01
Quality control (QC) measures are less prevalent in teaching laboratories than commercial settings possibly owing to a lack of commercial incentives or teaching resources. This article focuses on the use of QC assessment in the analytical techniques of high performance liquid chromatography (HPLC) and ultraviolet-visible spectroscopy (UV-vis) at…
The Development of Quality Control Genotyping Approaches: A Case Study Using Elite Maize Lines.
Chen, Jiafa; Zavala, Cristian; Ortega, Noemi; Petroli, Cesar; Franco, Jorge; Burgueño, Juan; Costich, Denise E; Hearne, Sarah J
2016-01-01
Quality control (QC) of germplasm identity and purity is a critical component of breeding and conservation activities. SNP genotyping technologies and increased availability of markers provide the opportunity to employ genotyping as a low-cost and robust component of this QC. In the public sector, available low-cost SNP QC genotyping methods have been developed from very limited panels of 1,000 to 1,500 markers, without broad selection of the most informative SNPs. Selection of optimal SNPs, definition of appropriate germplasm sampling, and platform selection all affect logistical and resource-use considerations for breeding and conservation applications when mainstreaming QC. In order to address these issues, we evaluated the selection and use of SNPs for QC applications from large DArTSeq data sets generated from CIMMYT maize inbred lines (CMLs). Two QC genotyping strategies were developed: the first is a "rapid QC", employing a small number of SNPs to identify potential mislabeling of seed packages or plots; the second is a "broad QC", employing a larger number of SNPs, used to identify each germplasm entry and to measure heterogeneity. The optimal marker selection strategies combined the selection of markers with high minor allele frequency, sampling of clustered SNPs in proportion to marker cluster distance, and selection of markers that maintain a uniform genomic distribution. The rapid and broad QC SNP panels selected using this approach were further validated using blind test assessments of related re-generation samples. The influence of sampling within each line was evaluated. Sampling 192 individuals would give close to a 100% probability of detecting a 5% contamination in the entry, and approximately a 98% probability of detecting a 2% contamination of the line. These results provide a framework for the establishment of QC genotyping. A comparison of financial and time costs for these approaches across different platforms is discussed, providing a framework for institutions involved in maize conservation and breeding to assess the resource-use effectiveness of QC genotyping. Application of these research findings, in combination with existing QC approaches, will ensure the regeneration, distribution, and use in breeding of true-to-type inbred germplasm. These findings also provide an effective approach to optimizing SNP selection for QC genotyping in other species.
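The quoted sampling probabilities follow from a simple binomial argument: if a fraction c of an entry is contaminant, the chance that no contaminant appears among n sampled individuals is (1 − c)^n, so the detection probability is 1 − (1 − c)^n. A two-line check reproduces the paper's figures:

```python
def p_detect(contamination, n):
    # Probability that at least one contaminant appears among n samples.
    return 1.0 - (1.0 - contamination) ** n

print(p_detect(0.05, 192))   # ~0.99995 -> "close to 100%" for 5% contamination
print(p_detect(0.02, 192))   # ~0.979   -> ~98% for 2% contamination
```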
WE-AB-206-01: Diagnostic Ultrasound Imaging Quality Assurance
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zagzebski, J.
The involvement of medical physicists in diagnostic ultrasound imaging service is increasing due to QC and accreditation requirements. The goal of this ultrasound hands-on workshop is to demonstrate quality control (QC) testing in diagnostic ultrasound and to provide updates in ACR ultrasound accreditation requirements. The first half of this workshop will include two presentations reviewing diagnostic ultrasound QA/QC and ACR ultrasound accreditation requirements. The second half of the workshop will include live demonstrations of basic QC tests. An array of ultrasound testing phantoms and ultrasound scanners will be available for attendees to learn diagnostic ultrasound QC in a hands-on environment with live demonstrations and on-site instructors. The targeted attendees are medical physicists in diagnostic imaging. Learning Objectives: Gain familiarity with common elements of a QA/QC program for diagnostic ultrasound imaging; identify QC tools available for testing diagnostic ultrasound systems and learn how to use these tools; learn ACR ultrasound accreditation requirements. Jennifer Walter is an employee of American College of Radiology on Ultrasound Accreditation.
Stapanian, Martin A.; Lewis, Timothy E; Palmer, Craig J.; Middlebrook Amos, Molly
2016-01-01
Unlike most laboratory studies, rigorous quality assurance/quality control (QA/QC) procedures may be lacking in ecosystem restoration (“ecorestoration”) projects, despite legislative mandates in the United States. This is due, in part, to ecorestoration specialists making the false assumption that some types of data (e.g. discrete variables such as species identification and abundance classes) are not subject to evaluations of data quality. Moreover, emergent behavior manifested by complex, adapting, and nonlinear organizations responsible for monitoring the success of ecorestoration projects tend to unconsciously minimize disorder, QA/QC being an activity perceived as creating disorder. We discuss similarities and differences in assessing precision and accuracy for field and laboratory data. Although the concepts for assessing precision and accuracy of ecorestoration field data are conceptually the same as laboratory data, the manner in which these data quality attributes are assessed is different. From a sample analysis perspective, a field crew is comparable to a laboratory instrument that requires regular “recalibration,” with results obtained by experts at the same plot treated as laboratory calibration standards. Unlike laboratory standards and reference materials, the “true” value for many field variables is commonly unknown. In the laboratory, specific QA/QC samples assess error for each aspect of the measurement process, whereas field revisits assess precision and accuracy of the entire data collection process following initial calibration. Rigorous QA/QC data in an ecorestoration project are essential for evaluating the success of a project, and they provide the only objective “legacy” of the dataset for potential legal challenges and future uses.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Angers, Crystal Plume; Bottema, Ryan; Buckley, Les
Purpose: Treatment unit uptime statistics are typically used to monitor radiation equipment performance. The Ottawa Hospital Cancer Centre has introduced the use of Quality Control (QC) test success as a quality indicator for equipment performance and the overall health of the equipment QC program. Methods: Implemented in 2012, QATrack+ is used to record and monitor over 1100 routine machine QC tests each month for 20 treatment and imaging units (http://qatrackplus.com/). Using an SQL (structured query language) script, automated queries of the QATrack+ database are used to generate program metrics such as the number of QC tests executed and the percentage of tests passing, at tolerance, or at action. These metrics are compared against machine uptime statistics already reported within the program. Results: Program metrics for 2015 show good correlation between the pass rate of QC tests and uptime for a given machine. For the nine conventional linacs, the QC test success rate was consistently greater than 97%. The corresponding uptimes for these units are better than 98%. Machines that consistently show higher failure or tolerance rates in the QC tests have lower uptimes. This points either to poor machine performance requiring corrective action or to problems with the QC program. Conclusions: QATrack+ significantly improves the organization of QC data but can also aid in overall equipment management. Complementing machine uptime statistics with QC test metrics provides a more complete picture of overall machine performance and can be used to identify areas of improvement in the machine service and QC programs.
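Metrics of this kind can also be aggregated outside the database, for example from an exported table of test instances. A hedged sketch follows; the column names are assumptions for illustration, not QATrack+'s actual schema.

```python
import pandas as pd

# Hypothetical export: one row per QC test instance (column names assumed).
df = pd.DataFrame({
    "unit":   ["linac1", "linac1", "linac1", "linac2", "linac2"],
    "result": ["pass", "tolerance", "pass", "pass", "action"],
})

# Percentage of tests at each status per unit, plus the test count.
pct = pd.crosstab(df["unit"], df["result"], normalize="index").mul(100.0).round(1)
pct["n_tests"] = df["unit"].value_counts()
print(pct)
```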
NASA Astrophysics Data System (ADS)
Kawka, O. E.; Nelson, J. S.; Manalang, D.; Kelley, D. S.
2016-02-01
The Cabled Array component of the NSF-funded Ocean Observatories Initiative (OOI) provides access to real-time physical, chemical, geological, and biological data from water column and seafloor platforms/instruments at sites spanning the southern half of the Juan de Fuca Plate. The Quality Assurance (QA) program for OOI data is designed to ensure that data products meet OOI science requirements. This overall data QA plan establishes the guidelines for assuring OOI data quality and summarizes Quality Control (QC) protocols and procedures, based on best practices, which can be utilized to ensure the highest quality data across the OOI program. This presentation will highlight, specifically, the QA/QC approach being utilized for the OOI Cabled Array infrastructure and data and will include a summary of both shipboard and shore-based protocols currently in use. Aspects addressed will be pre-deployment instrument testing and calibration checks, post-deployment and pre-recovery field verification of data, and post-recovery "as-found" testing of instruments. Examples of QA/QC data will be presented and specific cases of cabled data will be discussed in the context of quality assessments and adjustment/correction of OOI datasets overall for inherent sensor drift and/or instrument fouling.
Operational Processing of Ground Validation Data for the Tropical Rainfall Measuring Mission
NASA Technical Reports Server (NTRS)
Kulie, Mark S.; Robinson, Mike; Marks, David A.; Ferrier, Brad S.; Rosenfeld, Danny; Wolff, David B.
1999-01-01
The Tropical Rainfall Measuring Mission (TRMM) satellite was successfully launched in November 1997. A primary goal of TRMM is to sample tropical rainfall using the first active spaceborne precipitation radar. To validate TRMM satellite observations, a comprehensive Ground Validation (GV) Program has been implemented for this mission. A key component of GV is the analysis and quality control of meteorological ground-based radar data from four primary sites: Melbourne, FL; Houston, TX; Darwin, Australia; and Kwajalein Atoll, RMI. As part of the TRMM GV effort, the Joint Center for Earth Systems Technology (JCET) at the University of Maryland, Baltimore County, has been tasked with developing and implementing an operational system to quality control (QC), archive, and provide data for subsequent rainfall product generation from the four primary GV sites. This paper provides an overview of the JCET operational environment. A description of the QC algorithm and its performance, in addition to the data flow procedure between JCET and the TRMM Science and Data Information System (TSDIS), is presented. The impact of quality-controlled data on higher-level rainfall and reflectivity products will also be addressed. Finally, a brief description of JCET's expanded role in producing reference rainfall products will be discussed.
Quality control management and communication between radiologists and technologists.
Nagy, Paul G; Pierce, Benjamin; Otto, Misty; Safdar, Nabile M
2008-06-01
The greatest barrier to quality control (QC) in the digital imaging environment is the lack of communication and documentation between those who interpret images and those who acquire them. Paper-based QC methods are insufficient in a digital image management system. Problem work flow must be incorporated into reengineering efforts when migrating to a digital practice. The authors implemented a Web-based QC feedback tool to document and facilitate the communication of issues identified by radiologists. The goal was to promote a responsive and constructive tool that contributes to a culture of quality. The hypothesis was that by making it easier for radiologists to submit quality issues, the number of QC issues submitted would increase. The authors integrated their Web-based quality tracking system with a clinical picture archiving and communication system so that radiologists could report quality issues without disrupting clinical work flow. Graphical dashboarding techniques aid supervisors in using this database to identify the root causes of different types of issues. Over the initial 12-month rollout period, starting in the general section, the authors recorded 20 times more QC issues submitted by radiologists, accompanied by a rise in technologists' responsiveness to QC issues. For technologists with high numbers of QC issues, the incorporation of data from this tracking system proved useful in performance appraisals and in driving individual improvement. This tool is an example of the types of information technology innovations that can be leveraged to support QC in the digital imaging environment. Initial data suggest that the result is not only an improvement in quality but higher levels of satisfaction for both radiologists and technologists.
Internal quality control: planning and implementation strategies.
Westgard, James O
2003-11-01
The first essential in setting up internal quality control (IQC) of a test procedure in the clinical laboratory is to select the proper IQC procedure to implement, i.e. choosing the statistical criteria or control rules, and the number of control measurements, according to the quality required for the test and the observed performance of the method. Then the right IQC procedure must be properly implemented. This review focuses on strategies for planning and implementing IQC procedures in order to improve the quality of the IQC. A quantitative planning process is described that can be implemented with graphical tools such as power function or critical-error graphs and charts of operating specifications. Finally, a total QC strategy is formulated to minimize cost and maximize quality. A general strategy for IQC implementation is recommended that employs a three-stage design in which the first stage provides high error detection, the second stage low false rejection and the third stage prescribes the length of the analytical run, making use of an algorithm involving the average of normal patients' data.
The purpose of this SOP is to describe the procedures undertaken to treat censored data which are below detection limits. This SOP uses data that have been properly coded and certified with appropriate QA/QC procedures by the University of Arizona NHEXAS and Battelle Laboratorie...
The purpose of this SOP is to describe the procedures undertaken to calculate sampling weights. The sampling weights are needed to obtain weighted statistics of the NHEXAS data. This SOP uses data that have been properly coded and certified with appropriate QA/QC procedures by t...
The purpose of this SOP is to describe the procedures undertaken to calculate the dermal exposure to chlorpyrifos and diazinon. This SOP uses data that have been properly coded and certified with appropriate QA/QC procedures by the University of Arizona NHEXAS and Battelle Labora...
The purpose of this SOP is to describe the procedures undertaken to calculate the time activity pattern of the NHEXAS samples. This SOP uses data that have been properly coded and certified with appropriate QA/QC procedures by the University of Arizona NHEXAS and Battelle Laborat...
Results-driven approach to improving quality and productivity
John Dramm
2000-01-01
Quality control (QC) programs do not often realize their full potential. Elaborate and expensive QC programs can easily get sidetracked by the process of building a program with promises of "Someday, this will all pay off." Training employees in QC methods is no guarantee that quality will improve. Several documented cases show that such activity-centered efforts...
Bergallo, M; Costa, C; Tarallo, S; Daniele, R; Merlino, C; Segoloni, G P; Negro Ponzi, A; Cavallo, R
2006-06-01
The human cytomegalovirus (HCMV) is an important pathogen in immunocompromised patients, such as transplant recipients. The use of sensitive and rapid diagnostic assays can have a great impact on antiviral prophylaxis, therapy monitoring, and the diagnosis of active disease. Quantification of HCMV DNA may additionally have prognostic value and guide routine management. The aim of this study was to develop a reliable internally-controlled quantitative-competitive PCR (QC-PCR) for the detection and quantification of the HCMV DNA viral load in peripheral blood and to compare it with other methods: the HCMV pp65 antigenaemia assay in the leukocyte fraction, the HCMV viraemia, both routinely employed in our laboratory, and the nucleic acid sequence-based amplification (NASBA) for detection of HCMV pp67-mRNA. Quantitative-competitive PCR is a procedure for nucleic acid quantification based on co-amplification of competitive templates, the target DNA and a competitor functioning as an internal standard. In particular, a standard curve is generated by amplifying 10^2 to 10^5 copies of the target pCMV-435 plasmid with 10^4 copies of the competitor pCMV-C plasmid. Clinical samples derived from 40 kidney transplant patients were tested by spiking 10^4 copies of pCMV-C into the PCR mix as an internal control and comparing the results with the standard curve. Of the 40 patients studied, 39 (97.5%) were positive for HCMV DNA by QC-PCR. While the correlations between the number of pp65-positive cells and the number of HCMV DNA genome copies/mL, and between the former and pp67-mRNA positivity, were statistically significant, there was no significant correlation between the HCMV DNA viral load assayed by QC-PCR and the HCMV viraemia. The QC-PCR assay could detect from 10^2 to over 10^7 copies of HCMV DNA, with a range of linearity between 10^2 and 10^5 genomes.
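The quantification step described above admits a compact worked example. Under the standard QC-PCR assumption that the log of the target-to-competitor signal ratio is linear in the log of the input copy number, a curve fitted to the 10^2-10^5 plasmid dilutions can be inverted for clinical samples; the intensity ratios below are invented for illustration.

    # Hedged sketch of QC-PCR quantification: fit a standard curve of
    # log10(target/competitor signal ratio) against log10(input copies),
    # then invert it for clinical samples. Intensity ratios are invented.
    import numpy as np

    std_copies = np.array([1e2, 1e3, 1e4, 1e5])      # pCMV-435 input copies
    std_ratio = np.array([0.011, 0.098, 1.05, 9.7])  # vs 1e4 copies pCMV-C

    slope, intercept = np.polyfit(np.log10(std_copies), np.log10(std_ratio), 1)

    def copies_from_ratio(ratio):
        """Invert the standard curve for a sample's intensity ratio."""
        return 10 ** ((np.log10(ratio) - intercept) / slope)

    print(round(copies_from_ratio(0.5)))   # ~5000 genome copies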
Pichler, Peter; Mazanek, Michael; Dusberger, Frederico; Weilnböck, Lisa; Huber, Christian G; Stingl, Christoph; Luider, Theo M; Straube, Werner L; Köcher, Thomas; Mechtler, Karl
2012-11-02
While the performance of liquid chromatography (LC) and mass spectrometry (MS) instrumentation continues to increase, applications such as analyses of complete or near-complete proteomes and quantitative studies require constant and optimal system performance. For this reason, research laboratories and core facilities alike are recommended to implement quality control (QC) measures as part of their routine workflows. Many laboratories perform sporadic quality control checks. However, successive and systematic longitudinal monitoring of system performance would be facilitated by dedicated automatic or semiautomatic software solutions that aid an effortless analysis and display of QC metrics over time. We present the software package SIMPATIQCO (SIMPle AuTomatIc Quality COntrol) designed for evaluation of data from LTQ Orbitrap, Q-Exactive, LTQ FT, and LTQ instruments. A centralized SIMPATIQCO server can process QC data from multiple instruments. The software calculates QC metrics supervising every step of data acquisition from LC and electrospray to MS. For each QC metric the software learns the range indicating adequate system performance from the uploaded data using robust statistics. Results are stored in a database and can be displayed in a comfortable manner from any computer in the laboratory via a web browser. QC data can be monitored for individual LC runs as well as plotted over time. SIMPATIQCO thus assists the longitudinal monitoring of important QC metrics such as peptide elution times, peak widths, intensities, total ion current (TIC) as well as sensitivity, and overall LC-MS system performance; in this way the software also helps identify potential problems. The SIMPATIQCO software package is available free of charge.
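The learned acceptance ranges described above can be sketched with robust statistics, which keep the range insensitive to occasional outlier runs; the median +/- k * scaled-MAD rule and the factor k = 3 below are assumptions for illustration, not SIMPATIQCO's documented algorithm.

    # Minimal sketch of learning an acceptance range for a QC metric from
    # historical runs using robust statistics (median +/- k * scaled MAD).
    # The factor k = 3 is an assumption, not SIMPATIQCO's documented choice.
    import numpy as np

    def robust_range(values, k=3.0):
        med = np.median(values)
        mad = 1.4826 * np.median(np.abs(values - med))  # ~sigma for normal data
        return med - k * mad, med + k * mad

    peak_widths = np.array([14.2, 13.8, 14.5, 14.1, 29.0, 14.0, 13.9])  # seconds
    lo, hi = robust_range(peak_widths)
    flagged = [(i, w) for i, w in enumerate(peak_widths) if not lo <= w <= hi]
    print(flagged)   # the 29.0 s run is flagged as outside adequate performance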
Individualized Quality Control Plan (IQCP): Is It Value-Added for Clinical Microbiology?
Sharp, Susan E; Miller, Melissa B; Hindler, Janet
2015-12-01
The Centers for Medicare and Medicaid Services (CMS) recently published their Individualized Quality Control Plan (IQCP [https://www.cms.gov/regulations-and-guidance/legislation/CLIA/Individualized_Quality_Control_Plan_IQCP.html]), which will be the only option for quality control (QC) starting in January 2016 if laboratories choose not to perform Clinical Laboratory Improvement Act (CLIA) [U.S. Statutes at Large 81(1967):533] default QC. Laboratories will no longer be able to use "equivalent QC" (EQC) or the Clinical and Laboratory Standards Institute (CLSI) standards alone for quality control of their microbiology systems. The implementation of IQCP in clinical microbiology laboratories will most certainly be an added burden, the benefits of which are currently unknown. Copyright © 2015, American Society for Microbiology. All Rights Reserved.
Nitrosi, Andrea; Bertolini, Marco; Borasi, Giovanni; Botti, Andrea; Barani, Adriana; Rivetti, Stefano; Pierotti, Luisa
2009-12-01
Ideally, medical x-ray imaging systems should be designed to deliver maximum image quality at an acceptable radiation risk to the patient. Quality assurance procedures are employed to ensure that these standards are maintained. A quality control protocol for direct digital radiography (DDR) systems is described and discussed. Software to automatically process and analyze the required images was developed. In this paper, the initial results obtained on equipment from different DDR manufacturers are reported. The protocol was developed to highlight even small discrepancies in standard operating performance.
WE-AB-206-02: ACR Ultrasound Accreditation: Requirements and Pitfalls
DOE Office of Scientific and Technical Information (OSTI.GOV)
Walter, J.
The involvement of medical physicists in diagnostic ultrasound imaging service is increasing due to QC and accreditation requirements. The goal of this ultrasound hands-on workshop is to demonstrate quality control (QC) testing in diagnostic ultrasound and to provide updates on ACR ultrasound accreditation requirements. The first half of this workshop will include two presentations reviewing diagnostic ultrasound QA/QC and ACR ultrasound accreditation requirements. The second half of the workshop will include live demonstrations of basic QC tests. An array of ultrasound testing phantoms and ultrasound scanners will be available for attendees to learn diagnostic ultrasound QC in a hands-on environment with live demonstrations and on-site instructors. The targeted attendees are medical physicists in diagnostic imaging. Learning Objectives: Gain familiarity with common elements of a QA/QC program for diagnostic ultrasound imaging; identify QC tools available for testing diagnostic ultrasound systems and learn how to use these tools; learn ACR ultrasound accreditation requirements. Jennifer Walter is an employee of the American College of Radiology on Ultrasound Accreditation.
Operational quality control of daily precipitation using spatio-climatological consistency testing
NASA Astrophysics Data System (ADS)
Scherrer, S. C.; Croci-Maspoli, M.; van Geijtenbeek, D.; Naguel, C.; Appenzeller, C.
2010-09-01
Quality control (QC) of meteorological data is of utmost importance for climate-related decisions. The search for an effective automated QC of precipitation data has proven difficult, and many weather services, including MeteoSwiss, still rely mainly on manual inspection of daily precipitation. However, manpower limitations force many weather services to move towards less labour-intensive and more automated QC, with the challenge of keeping data quality high. In the last decade, several approaches have been presented to objectify daily precipitation QC. Here we present a spatio-climatological approach that will be implemented operationally at MeteoSwiss. It combines the information from the event-based spatial distribution of each day's precipitation field with the historical information on the interpolation error for different precipitation intensity intervals. Expert judgement shows that the system is able to detect potential outliers very well (hardly any missed errors) without creating too many false alarms that need human inspection. 50-80% of all flagged values have been classified as real errors by the data editor, much better than the roughly 15-20% achieved using standard spatial regression tests. Very helpful in the QC process is the automatic redistribution of accumulated several-day sums. Manual inspection in operations can be reduced and the QC of precipitation substantially objectified.
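A minimal sketch of such a spatio-climatological consistency test follows: each station value is compared with an inverse-distance estimate from its neighbours and flagged when the residual exceeds a historical interpolation-error tolerance for the relevant intensity class. The class boundaries and tolerances below are invented, not the MeteoSwiss values.

    # Sketch of a spatio-climatological consistency check: compare each
    # station's daily total with an inverse-distance estimate from its
    # neighbours and flag it when the residual exceeds the historical
    # interpolation error for that intensity class. Tolerances invented.
    import numpy as np

    HIST_TOL_MM = {0: 2.0, 1: 5.0, 2: 12.0}   # per intensity class (assumed)

    def intensity_class(p_mm):
        return 0 if p_mm < 1.0 else 1 if p_mm < 20.0 else 2

    def idw_estimate(dists_km, neighbour_vals, power=2.0):
        w = 1.0 / np.asarray(dists_km) ** power
        return float(np.sum(w * np.asarray(neighbour_vals)) / np.sum(w))

    def flag_outlier(obs_mm, dists_km, neighbour_vals):
        est = idw_estimate(dists_km, neighbour_vals)
        return abs(obs_mm - est) > HIST_TOL_MM[intensity_class(est)]

    print(flag_outlier(85.0, [5.0, 8.0, 11.0], [3.0, 4.5, 2.0]))  # True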
The Quality Control Circle: Is It for Education?
ERIC Educational Resources Information Center
Land, Arthur J.
From its start in Japan after World War II, the Quality Control Circle (Q.C.) approach to management and organizational operation evolved into what it is today: people doing similar work meeting regularly to identify, objectively analyze, and develop solutions to problems. The Q.C. approach meets Maslow's theory of motivation by inviting…
Analytical approaches to quality assurance and quality control in rangeland monitoring data
USDA-ARS?s Scientific Manuscript database
Producing quality data to support land management decisions is the goal of every rangeland monitoring program. However, the results of quality assurance (QA) and quality control (QC) efforts to improve data quality are rarely reported. The purpose of QA and QC is to prevent and describe non-sampling...
qcML: an exchange format for quality control metrics from mass spectrometry experiments.
Walzer, Mathias; Pernas, Lucia Espona; Nasso, Sara; Bittremieux, Wout; Nahnsen, Sven; Kelchtermans, Pieter; Pichler, Peter; van den Toorn, Henk W P; Staes, An; Vandenbussche, Jonathan; Mazanek, Michael; Taus, Thomas; Scheltema, Richard A; Kelstrup, Christian D; Gatto, Laurent; van Breukelen, Bas; Aiche, Stephan; Valkenborg, Dirk; Laukens, Kris; Lilley, Kathryn S; Olsen, Jesper V; Heck, Albert J R; Mechtler, Karl; Aebersold, Ruedi; Gevaert, Kris; Vizcaíno, Juan Antonio; Hermjakob, Henning; Kohlbacher, Oliver; Martens, Lennart
2014-08-01
Quality control is increasingly recognized as a crucial aspect of mass spectrometry based proteomics. Several recent papers discuss relevant parameters for quality control and present applications to extract these from the instrumental raw data. What has been missing, however, is a standard data exchange format for reporting these performance metrics. We therefore developed the qcML format, an XML-based standard that follows the design principles of the related mzML, mzIdentML, mzQuantML, and TraML standards from the HUPO-PSI (Proteomics Standards Initiative). In addition to the XML format, we also provide tools for the calculation of a wide range of quality metrics as well as a database format and interconversion tools, so that existing LIMS systems can easily add relational storage of the quality control data to their existing schema. We here describe the qcML specification, along with possible use cases and an illustrative example of the subsequent analysis possibilities. All information about qcML is available at http://code.google.com/p/qcml. © 2014 by The American Society for Biochemistry and Molecular Biology, Inc.
Quality Control of Meteorological Observations
NASA Technical Reports Server (NTRS)
Collins, William; Dee, Dick; Rukhovets, Leonid
1999-01-01
The problem of meteorological observation quality control (QC) was first formulated by L.S. Gandin at the Main Geophysical Observatory in the 1970s. Later, in 1988, Gandin began adapting his ideas on complex quality control (CQC) to the operational environment at the National Centers for Environmental Prediction. The CQC was first applied by Gandin and his colleagues to the detection and correction of errors in rawinsonde heights and temperatures using a complex of hydrostatic residuals. Later, a full complex of residuals, vertical and horizontal optimal interpolations, and baseline checks were added for the checking and correction of a wide range of meteorological variables. Some other of Gandin's ideas were applied and substantially developed at other meteorological centers. A new statistical QC was recently implemented in the Goddard Data Assimilation System. The central component of any quality control is a buddy check, which is a test of individual suspect observations against available nearby non-suspect observations. A novel feature of this test is that the error variances used for QC decisions are re-estimated on-line. As a result, the allowed tolerances for suspect observations can depend on local atmospheric conditions. The system is then better able to accept extreme values observed in deep cyclones, jet streams, and so on. The basic statements of this adaptive buddy check are described. Some results of the on-line QC, including moisture QC, are presented.
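A minimal sketch of an adaptive buddy check is given below; the on-line variance estimate and the tolerance factor are simplified assumptions, not the formulation used in the Goddard system.

    # Sketch of an adaptive buddy check: a suspect observation is tested
    # against nearby non-suspect values, with the tolerance derived from an
    # error variance re-estimated on-line from the local scatter, so that
    # tolerances loosen in active weather. The factor k is an assumption.
    import numpy as np

    def buddy_check(suspect, buddies, k=4.0):
        buddies = np.asarray(buddies)
        estimate = buddies.mean()
        # On-line variance estimate from local scatter, plus a small floor
        # standing in for instrument error; not the operational formulation.
        local_var = buddies.var(ddof=1) + 0.25
        return abs(suspect - estimate) <= k * np.sqrt(local_var)

    print(buddy_check(12.0, [11.2, 11.8, 12.5]))   # True: accepted
    print(buddy_check(25.0, [11.2, 11.8, 12.5]))   # False: rejected as outlier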
Proteomics Quality Control: Quality Control Software for MaxQuant Results.
Bielow, Chris; Mastrobuoni, Guido; Kempa, Stefan
2016-03-04
Mass spectrometry-based proteomics coupled to liquid chromatography has matured into an automated, high-throughput technology, producing data on the scale of multiple gigabytes per instrument per day. Consequently, an automated quality control (QC) and quality analysis (QA) capable of detecting measurement bias, verifying consistency, and avoiding propagation of error is paramount for instrument operators and scientists in charge of downstream analysis. We have developed an R-based QC pipeline called Proteomics Quality Control (PTXQC) for bottom-up LC-MS data generated by the MaxQuant software pipeline. PTXQC creates a QC report containing a comprehensive and powerful set of QC metrics, augmented with automated scoring functions. The automated scores are collated to create an overview heatmap at the beginning of the report, giving valuable guidance also to nonspecialists. Our software supports a wide range of experimental designs, including stable isotope labeling by amino acids in cell culture (SILAC), tandem mass tags (TMT), and label-free data. Furthermore, we introduce new metrics to score MaxQuant's Match-between-runs (MBR) functionality, by which peptide identifications can be transferred across Raw files based on accurate retention time and m/z. Last but not least, PTXQC is easy to install and use and represents the first QC software capable of processing MaxQuant result tables. PTXQC is freely available at https://github.com/cbielow/PTXQC .
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-06
... Establishing Test Procedures for the Analysis of Pollutants Under the Clean Water Act; Analysis and Sampling... for use as an alternative oil and grease method. Some comments were specific to the sampling...-side comparison using the specific procedures (e.g. sampling frequency, number of samples, QA/QC, and...
The purpose of this SOP is to describe the procedures undertaken to calculate sampling weights. The sampling weights are needed to obtain weighted statistics of the study data. This SOP uses data that have been properly coded and certified with appropriate QA/QC procedures by th...
The purpose of this SOP is to describe the procedures undertaken to calculate the dermal exposure using a probabilistic approach. This SOP uses data that have been properly coded and certified with appropriate QA/QC procedures by the University of Arizona NHEXAS and Battelle Labo...
The purpose of this SOP is to describe the procedures undertaken to calculate the inhalation exposures to chlorpyrifos and diazinon using the probabilistic approach. This SOP uses data that have been properly coded and certified with appropriate QA/QC procedures by the University...
The purpose of this SOP is to describe the procedures undertaken to convert servings to kilograms for each food item used in the Diet Diary questionnaire. This SOP uses data that have been properly coded and certified with appropriate QA/QC procedures by the University of Arizon...
The purpose of this SOP is to describe the procedures undertaken for estimating inhalation exposures to chlorpyrifos and Diazinon. This SOP uses data that have been properly coded and certified with appropriate QA/QC procedures by the University of Arizona NHEXAS and Battelle La...
Khanna, Niharika; Shaya, Fadia T; Chirikov, Viktor V; Sharp, David; Steffen, Ben
2016-01-01
We present data on quality of care (QC) improvement in 35 of 45 National Quality Forum metrics reported annually by 52 primary care practices recognized as patient-centered medical homes (PCMHs) that participated in the Maryland Multi-Payor Program from 2011 to 2013. We assigned QC metrics to (1) chronic, (2) preventive, and (3) mental health care domains. The study used a panel data design with no control group. Using longitudinal fixed-effects regressions, we modeled QC and case mix severity in a PCMH. Overall, 35 of 45 quality metrics reported by 52 PCMHs demonstrated improvement over 3 years, and case mix severity did not affect the achievement of quality improvement. From 2011 to 2012, QC increased by 0.14 (P < .01) for chronic, 0.15 (P < .01) for preventive, and 0.34 (P < .01) for mental health care domains; from 2012 to 2013 these domains increased by 0.03 (P = .06), 0.04 (P = .05), and 0.07 (P = .12), respectively. In univariate analyses, lower National Commission on Quality Assurance PCMH level was associated with higher QC for the mental health care domain, whereas case mix severity did not correlate with QC. In multivariate analyses, higher QC correlated with larger practices, greater proportion of older patients, and readmission visits. Rural practices had higher proportions of Medicaid patients, lower QC, and higher QC improvement in interaction analyses with time. The gains in QC in the chronic disease domain, the preventive care domain, and, most significantly, the mental health care domain were observed over time regardless of patient case mix severity. QC improvement was generally not modified by practice characteristics, except for rurality. © Copyright 2016 by the American Board of Family Medicine.
Izumida, Fernanda Emiko; Ribeiro, Roberta Chuqui; Giampaolo, Eunice Teresinha; Machado, Ana Lucia; Pavarina, Ana Cláudia; Vergani, Carlos Eduardo
2011-12-01
This study investigated the effect of microwave disinfection on the roughness of three heat-polymerised acrylic resins after tooth brushing. Microwave disinfection has been recommended to reduce cross-contamination. However, this procedure may also influence the physical and mechanical properties of acrylic resins. Specimens (40 × 20 × 2 mm) of the resins Lucitone 550 (L), QC 20 (QC) and Acron MC (A) were prepared and divided into four groups (n = 10): Control groups 1 (C1) and 2 (C2) - stored in water for 48 h or 7 days; Test groups 1 (MW2) and 2 (MW7) - stored in water for 48 h and disinfected (650 W for 6 min) daily for 2 or 7 days, respectively. After treatments, the specimens were placed in a tooth brushing machine at a rate of 60 reciprocal strokes per minute. The specimens were brushed with 20 000 strokes, which represents approximately 2 years of denture cleansing. The surface roughness (Ra) was evaluated before and after the tooth brushing. Data were analysed by two-way ANOVA and Tukey Honestly Significant Difference (HSD) post hoc tests (α = 0.05). The data revealed significant changes between test groups for the A and L resins. Comparison among resins revealed that for MW7, the roughness of A was significantly lower than that of L. After the seven microwave cycles, the roughness values of QC were significantly lower than those of L. The roughness of QC after brushing was not significantly affected by microwave disinfection. For A and L, seven microwave cycles resulted in increased roughness. © 2011 The Gerodontology Society and John Wiley & Sons A/S.
Delis, H; Christaki, K; Healy, B; Loreti, G; Poli, G L; Toroi, P; Meghzifene, A
2017-09-01
Quality control (QC), according to ISO definitions, represents the most basic level of quality. It is considered to be a snapshot of the performance or the characteristics of a product or service, in order to verify that it complies with the requirements. Although it is usually believed that "the role of medical physicists in Diagnostic Radiology is QC", this not only limits the contribution of medical physicists but is also no longer adequate to meet the needs of Diagnostic Radiology in terms of quality. In order to assure quality practices, more organized activities and efforts are required in the modern era of diagnostic radiology. The complete system of QC is just one element of a comprehensive quality assurance (QA) program that aims at ensuring that the requirements of quality of a product or service will consistently be fulfilled. A comprehensive quality system starts even before the procurement of any equipment, as needs analysis and the development of specifications are important components of the QA framework. Further expanding this framework of QA, a comprehensive Quality Management System can provide additional benefits to a Diagnostic Radiology service. Harmonized policies and procedures and elements such as a mission statement or job descriptions can provide clarity and consistency in the services provided, enhancing the outcome and representing a solid platform for quality improvement. The International Atomic Energy Agency (IAEA) promotes this comprehensive quality approach in diagnostic imaging and especially supports the field of comprehensive clinical audits as a tool for quality improvement. Copyright © 2017 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
Haga, Yoshihiro; Chida, Koichi; Inaba, Yohei; Kaga, Yuji; Meguro, Taiichiro; Zuguchi, Masayuki
2016-02-01
As the use of diagnostic X-ray equipment with flat panel detectors (FPDs) has increased, so has the importance of proper management of FPD systems. To ensure quality control (QC) of FPD systems, an easy method for evaluating FPD imaging performance for both stationary and moving objects is required. Until now, simple rotatable QC phantoms have not been available for easy evaluation of the performance (spatial resolution and dynamic range) of FPDs in imaging moving objects. We developed a QC phantom for this purpose. It consists of three thicknesses of copper and a rotatable test pattern of piano wires of various diameters. Initial tests confirmed its stable performance. Our moving phantom is very useful for QC of FPD images of moving objects because it enables easy visual evaluation of imaging performance (spatial resolution and dynamic range).
Quality control and conduct of genome-wide association meta-analyses.
Winkler, Thomas W; Day, Felix R; Croteau-Chonka, Damien C; Wood, Andrew R; Locke, Adam E; Mägi, Reedik; Ferreira, Teresa; Fall, Tove; Graff, Mariaelisa; Justice, Anne E; Luan, Jian'an; Gustafsson, Stefan; Randall, Joshua C; Vedantam, Sailaja; Workalemahu, Tsegaselassie; Kilpeläinen, Tuomas O; Scherag, André; Esko, Tonu; Kutalik, Zoltán; Heid, Iris M; Loos, Ruth J F
2014-05-01
Rigorous organization and quality control (QC) are necessary to facilitate successful genome-wide association meta-analyses (GWAMAs) of statistics aggregated across multiple genome-wide association studies. This protocol provides guidelines for (i) organizational aspects of GWAMAs, and for (ii) QC at the study file level, the meta-level across studies and the meta-analysis output level. Real-world examples highlight issues experienced and solutions developed by the GIANT Consortium that has conducted meta-analyses including data from 125 studies comprising more than 330,000 individuals. We provide a general protocol for conducting GWAMAs and carrying out QC to minimize errors and to guarantee maximum use of the data. We also include details for the use of a powerful and flexible software package called EasyQC. Precise timings will be greatly influenced by consortium size. For consortia of comparable size to the GIANT Consortium, this protocol takes a minimum of about 10 months to complete.
AutoLock: a semiautomated system for radiotherapy treatment plan quality control.
Dewhurst, Joseph M; Lowe, Matthew; Hardy, Mark J; Boylan, Christopher J; Whitehurst, Philip; Rowbottom, Carl G
2015-05-08
A semiautomated system for radiotherapy treatment plan quality control (QC), named AutoLock, is presented. AutoLock is designed to augment treatment plan QC by automatically checking aspects of treatment plans that are well suited to computational evaluation, whilst summarizing more subjective aspects in the form of a checklist. The treatment plan must pass all automated checks and all checklist items must be acknowledged by the planner as correct before the plan is finalized. Thus AutoLock uniquely integrates automated treatment plan QC, an electronic checklist, and plan finalization. In addition to reducing the potential for the propagation of errors, the integration of AutoLock into the plan finalization workflow has improved efficiency at our center. Detailed audit data are presented, demonstrating that the treatment plan QC rejection rate fell by around a third following the clinical introduction of AutoLock.
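The gating logic described above can be illustrated with a short sketch (not the AutoLock implementation): finalization is permitted only when every automated check passes and every checklist item has been acknowledged by the planner.

    # Illustrative sketch (not the AutoLock implementation) of the gating
    # logic: a plan is finalized only when every automated check passes and
    # every checklist item has been acknowledged by the planner.
    def can_finalize(plan, automated_checks, checklist_items):
        failures = [c.__name__ for c in automated_checks if not c(plan)]
        pending = [i["item"] for i in checklist_items if not i["acknowledged"]]
        return not failures and not pending, failures, pending

    def prescription_is_consistent(plan):      # one example automated check
        return plan["dose_per_fraction"] * plan["fractions"] == plan["total_dose"]

    plan = {"dose_per_fraction": 2.0, "fractions": 30, "total_dose": 60.0}
    checklist = [{"item": "Contours reviewed by planner", "acknowledged": True}]
    print(can_finalize(plan, [prescription_is_consistent], checklist))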
ERIC Educational Resources Information Center
Espy, John; And Others
A project was conducted to field test selected first- and second-year courses in a postsecondary nuclear quality assurance/quality control (QA/QC) technician curriculum and to develop the teaching/learning modules for seven technical specialty courses remaining in the QA/QC technician curriculum. The field testing phase of the project involved the…
Preliminary Quality Control System Design for the Pell Grant Program.
ERIC Educational Resources Information Center
Advanced Technology, Inc., Reston, VA.
A preliminary design for a quality control (QC) system for the Pell Grant Program is proposed, based on the needs of the Office of Student Financial Assistance (OSFA). The applicability of the general design for other student aid programs administered by OSFA is also considered. The following steps included in a strategic approach to QC system…
On Quality Control Procedures Being Adopted for TRMM LBA and KWAJEX Soundings Data Sets
NASA Technical Reports Server (NTRS)
Roy, B.; Halverson, Jeffrey B.; Starr, David O'C. (Technical Monitor)
2001-01-01
During NASA's Tropical Rainfall Measuring Mission (TRMM) field campaigns, the Large Scale Biosphere Atmosphere (LBA) experiment held in Amazonia (Brazil) in January-February 1999 and the Kwajalein Experiment (KWAJEX) held in the Republic of the Marshall Islands in August-September 1999, extensive radiosonde observations (raobs) were collected using VIZ and Vaisala sondes, which have different response characteristics. In all, 320 raobs for LBA and 972 fixed raobs for KWAJEX have been obtained and are being processed. Most atmospheric sensible heat source (Q1) and apparent moisture sink (Q2) budget studies are based on sounding data, and the accuracy of the raobs is especially important in regions of deep moist convection. A data quality control (QC) project has been initiated at GSFC by the principal investigator (JBH), and this paper addresses some of the quantitative findings of the level I and II QC procedures. Based on this quantitative assessment of the sensor (or system) biases associated with each type of sonde, the initial data repair work will be started. Evidence of moisture biases between the two different sondes (VIZ and Vaisala) has been shown earlier by Halverson et al. (2000). Vaisala humidity sensors are found to have a low-level dry bias in the boundary layer, whereas above 600 mb the VIZ sensor tends to register a drier atmosphere. All raob data were subjected to a limit check based on an algorithm already well tested on the raob data obtained during the Tropical Ocean Global Atmosphere Coupled Ocean-Atmosphere Response Experiment (TOGA COARE).
Gas electron multiplier (GEM) foil test, repair and effective gain calculation
NASA Astrophysics Data System (ADS)
Tahir, Muhammad; Zubair, Muhammad; Khan, Tufail A.; Khan, Ashfaq; Malook, Asad
2018-06-01
This research focuses on gas electron multiplier (GEM) foil testing, foil repair, and the calculation of the effective gain of the GEM detector. The work defines procedures for testing GEM foils for short circuits, for detecting short circuits in the foil, and examines different ways to remove them. GEM foil testing procedures are set out both in open air and in nitrogen gas. The leakage current of the foil is measured while different voltages are applied with a specified step size. Quality Control (QC) tests are defined for the different components of GEM detectors before assembly. Finally, the effective gain of the GEM detector is calculated using 109Cd and 55Fe radioactive sources.
Data-quality measures for stakeholder-implemented watershed-monitoring programs
Greve, Adrienne I.
2002-01-01
Community-based watershed groups, many of which collect environmental data, have steadily increased in number over the last decade. The data generated by these programs are often underutilized due to uncertainty in the quality of data produced. The incorporation of data-quality measures into stakeholder monitoring programs lends statistical validity to data. Data-quality measures are divided into three steps: quality assurance, quality control, and quality assessment. The quality-assurance step attempts to control sources of error that cannot be directly quantified. This step is part of the design phase of a monitoring program and includes clearly defined, quantifiable objectives, sampling sites that meet the objectives, standardized protocols for sample collection, and standardized laboratory methods. Quality control (QC) is the collection of samples to assess the magnitude of error in a data set due to sampling, processing, transport, and analysis. In order to design a QC sampling program, a series of issues needs to be considered: (1) potential sources of error, (2) the type of QC samples, (3) inference space, (4) the number of QC samples, and (5) the distribution of the QC samples. Quality assessment is the process of evaluating quality-assurance measures and analyzing the QC data in order to interpret the environmental data. Quality assessment has two parts: one that is conducted on an ongoing basis as the monitoring program is running, and one that is conducted during the analysis of environmental data. The discussion of the data-quality measures is followed by an example of their application to a monitoring program in the Big Thompson River watershed of northern Colorado.
40 CFR 98.174 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2010 CFR
2010-07-01
... the same plant instruments or procedures that are used for accounting purposes (such as weigh hoppers... density and volume measurements, etc.), record the totals for each process input and output for each... applicable) during the test using the same plant instruments or procedures that are used for accounting...
40 CFR 98.294 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2010 CFR
2010-07-01
... scales or methods used for accounting purposes. (3) Document the procedures used to ensure the accuracy of the monthly measurements of trona consumed. (b) If you calculate CO2 process emissions based on... your facility, or methods used for accounting purposes. (3) Document the procedures used to ensure the...
Lelental, Natalia; Brandner, Sebastian; Kofanova, Olga; Blennow, Kaj; Zetterberg, Henrik; Andreasson, Ulf; Engelborghs, Sebastiaan; Mroczko, Barbara; Gabryelewicz, Tomasz; Teunissen, Charlotte; Mollenhauer, Brit; Parnetti, Lucilla; Chiasserini, Davide; Molinuevo, Jose Luis; Perret-Liaudet, Armand; Verbeek, Marcel M; Andreasen, Niels; Brosseron, Frederic; Bahl, Justyna M C; Herukka, Sanna-Kaisa; Hausner, Lucrezia; Frölich, Lutz; Labonte, Anne; Poirier, Judes; Miller, Anne-Marie; Zilka, Norbert; Kovacech, Branislav; Urbani, Andrea; Suardi, Silvia; Oliveira, Catarina; Baldeiras, Ines; Dubois, Bruno; Rot, Uros; Lehmann, Sylvain; Skinningsrud, Anders; Betsou, Fay; Wiltfang, Jens; Gkatzima, Olymbia; Winblad, Bengt; Buchfelder, Michael; Kornhuber, Johannes; Lewczuk, Piotr
2016-03-01
Assay-vendor independent quality control (QC) samples for neurochemical dementia diagnostics (NDD) biomarkers are so far commercially unavailable. This requires that NDD laboratories prepare their own QC samples, for example by pooling leftover cerebrospinal fluid (CSF) samples. The aim of this study was to prepare and test alternative matrices for QC samples that could facilitate intra- and inter-laboratory QC of the NDD biomarkers. Three matrices were validated in this study: (A) human pooled CSF, (B) Aβ peptides spiked into human prediluted plasma, and (C) Aβ peptides spiked into a solution of bovine serum albumin in phosphate-buffered saline. All matrices were also tested after supplementation with an antibacterial agent (sodium azide). We analyzed short- and long-term stability of the biomarkers with ELISA and chemiluminescence (Fujirebio Europe, MSD, IBL International) and performed an inter-laboratory variability study. NDD biomarkers turned out to be stable in almost all samples stored at the tested conditions for up to 14 days, as well as in samples stored deep-frozen (at -80°C) for up to one year. Sodium azide did not influence biomarker stability. Inter-center variability of the samples sent at room temperature (pooled CSF, freeze-dried CSF, and four artificial matrices) was comparable to the results obtained on deep-frozen samples in other large-scale projects. Our results suggest that it is possible to replace self-made, CSF-based QC samples with large-scale volumes of QC materials prepared with artificial peptides and matrices. This would greatly facilitate intra- and inter-laboratory QC schedules for NDD measurements.
Ford, Eric C; Terezakis, Stephanie; Souranis, Annette; Harris, Kendra; Gay, Hiram; Mutic, Sasa
2012-11-01
The purpose of this study was to quantify the error-detection effectiveness of commonly used quality control (QC) measures. We analyzed incidents from 2007-2010 logged into a voluntary in-house electronic incident learning system at 2 academic radiation oncology clinics. None of the incidents resulted in patient harm. Each incident was graded for potential severity using the French Nuclear Safety Authority scoring scale; high potential severity incidents (score >3) were considered, along with a subset of 30 randomly chosen low severity incidents. Each report was evaluated to identify which of 15 common QC checks could have detected it. The effectiveness was calculated, defined as the percentage of incidents that each QC measure could detect, both for individual QC checks and for combinations of checks. In total, 4407 incidents were reported, 292 of which had high potential severity. High- and low-severity incidents were detectable by 4.0 ± 2.3 (mean ± SD) and 2.6 ± 1.4 QC checks, respectively (P<.001). All individual checks were less than 50% sensitive, with the exception of pretreatment plan review by a physicist (63%). An effectiveness of 97% was achieved with 7 checks used in combination and was not further improved with more checks. The combination of checks with the highest effectiveness includes physics plan review, physician plan review, Electronic Portal Imaging Device-based in vivo portal dosimetry, radiation therapist timeout, weekly physics chart check, the use of checklists, port films, and source-to-skin distance checks. Some commonly used QC checks, such as pretreatment intensity modulated radiation therapy QA, do not substantially add to the ability to detect errors in these data. The effectiveness of QC measures in radiation oncology depends sensitively on which checks are used and in which combinations. A small percentage of errors cannot be detected by any of the standard formal QC checks currently in broad use, suggesting that further improvements are needed. These data require confirmation with a broader incident-reporting database. Copyright © 2012 Elsevier Inc. All rights reserved.
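The effectiveness calculation lends itself to a brief sketch: annotate each incident with the set of QC checks able to detect it, then compute the share of incidents covered by a check or a combination of checks. The incident data below are invented for illustration.

    # Sketch of the effectiveness calculation: each incident is annotated
    # with the set of QC checks able to detect it; effectiveness of a check
    # (or combination) is the percentage of incidents covered. Data invented.
    def effectiveness(incidents, checks):
        detected = sum(1 for caught_by in incidents if caught_by & checks)
        return 100.0 * detected / len(incidents)

    incidents = [
        {"physics_plan_review"},
        {"physics_plan_review", "physician_plan_review"},
        {"in_vivo_dosimetry"},
        set(),                     # undetectable by any formal QC check
    ]
    print(effectiveness(incidents, {"physics_plan_review"}))          # 50.0
    print(effectiveness(incidents, {"physics_plan_review",
                                    "in_vivo_dosimetry"}))            # 75.0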
Kadowaki, Hisae; Satrimafitrah, Pasjan; Takami, Yasunari; Nishitoh, Hideki
2018-05-09
The maintenance of endoplasmic reticulum (ER) homeostasis is essential for cell function. ER stress-induced pre-emptive quality control (ERpQC) helps alleviate the burden to a stressed ER by limiting further protein loading. We have previously reported the mechanisms of ERpQC, which includes a rerouting step and a degradation step. Under ER stress conditions, Derlin family proteins (Derlins), which are components of ER-associated degradation, reroute specific ER-targeting proteins to the cytosol. Newly synthesized rerouted polypeptides are degraded via the cytosolic chaperone Bag6 and the AAA-ATPase p97 in the ubiquitin-proteasome system. However, the mechanisms by which ER-targeting proteins are rerouted from the ER translocation pathway to the cytosolic degradation pathway and how the E3 ligase ubiquitinates ERpQC substrates remain unclear. Here, we show that ERpQC substrates are captured by the carboxyl-terminus region of Derlin-1 and ubiquitinated by the HRD1 E3 ubiquitin ligase prior to degradation. Moreover, HRD1 forms a large ERpQC-related complex composed of Sec61α and Derlin-1 during ER stress. These findings indicate that the association of the degradation factor HRD1 with the translocon and the rerouting factor Derlin-1 may be necessary for the smooth and effective clearance of ERpQC substrates.
Revisiting the Procedures for the Vector Data Quality Assurance in Practice
NASA Astrophysics Data System (ADS)
Erdoğan, M.; Torun, A.; Boyacı, D.
2012-07-01
Immense use of topographical data in spatial data visualization, business GIS (Geographic Information Systems) solutions and applications, and mobile and location-based services has forced topo-data providers to create standard, up-to-date, and complete data sets in a sustainable frame. Data quality has been studied and researched for more than two decades, with countless references on its semantics, its conceptual and logical representations, and many applications to spatial databases and GIS. However, there is a gap between research and practice in the sense of spatial data quality, which increases the costs and decreases the efficiency of data production. Spatial data quality is well known to academia and industry, but usually in different contexts. The research on spatial data quality has stated several issues of practical use, such as descriptive information, metadata, fulfillment of spatial relationships among data, integrity measures, and geometric constraints. The industry and data producers realize them in three stages: pre-, co-, and post-data capturing. The pre-data-capturing stage covers semantic modelling, data definition, cataloguing, modelling, data dictionary, and schema creation processes. The co-data-capturing stage covers general rules of spatial relationships, data- and model-specific rules such as topologic and model-building relationships, geometric thresholds, data extraction guidelines, and object-object, object-belonging-class, object-non-belonging-class, and class-class relationships to be taken into account during data capturing. The post-data-capturing stage covers specified QC (quality check) benchmarks and checking compliance with general and specific rules. The vector data quality criteria are different from the views of producers and users, but these criteria are generally driven by the needs, expectations, and feedback of the users. This paper presents a practical method which closes the gap between theory and practice. Putting spatial data quality concepts into development and application requires the existence of a conceptual, logical, and, most importantly, physical data model, together with rules and knowledge of their realization in the form of geo-spatial data. The applicable metrics and thresholds are determined on this concrete base. This study discusses the application of geo-spatial data quality issues and QA (quality assurance) and QC procedures in topographic data production. Firstly, we introduce the MGCP (Multinational Geospatial Co-production Program) data profile of the NATO (North Atlantic Treaty Organization) DFDD (DGIWG Feature Data Dictionary), the requirements of the data owner, the view of data producers for both data capturing and QC, and finally QA to fulfil user needs. Then, our practical new approach, which divides quality into three phases, is introduced. Finally, the implementation of our approach to accomplish metrics, measures, and thresholds of quality definitions is discussed. In this paper, especially geometry and semantics quality and the quality control procedures that can be performed by the producers are discussed. Some applicable best practices that we experienced in quality control techniques, and regulations that define the objectives and data production procedures, are given in the final remarks.
These quality control procedures should include visual checks of the source data, captured vector data, and printouts; automatic checks that can be performed by software; and semi-automatic checks performed in interaction with quality control personnel. Finally, these quality control procedures should ensure the geometric, semantic, attribution, and metadata quality of the vector data.
The purpose of this SOP is to describe the procedures undertaken for calculating ingestion exposure using the indirect method of exposure estimation. This SOP uses This SOP uses data that have been properly coded and certified with appropriate QA/QC procedures by the University ...
Develop a Methodology to Evaluate the Effectiveness of QC/QA Specifications (Phase II)
DOT National Transportation Integrated Search
1998-08-01
The Texas Department of Transportation (TxDOT) has been implementing statistically based quality control/quality assurance (QC/QA) specifications for hot mix asphalt concrete pavements since the early 1990s. These specifications have been continuousl...
A Benchmark Study on Error Assessment and Quality Control of CCS Reads Derived from the PacBio RS.
Jiao, Xiaoli; Zheng, Xin; Ma, Liang; Kutty, Geetha; Gogineni, Emile; Sun, Qiang; Sherman, Brad T; Hu, Xiaojun; Jones, Kristine; Raley, Castle; Tran, Bao; Munroe, David J; Stephens, Robert; Liang, Dun; Imamichi, Tomozumi; Kovacs, Joseph A; Lempicki, Richard A; Huang, Da Wei
2013-07-31
PacBio RS, a newly emerging third-generation DNA sequencing platform, is based on a real-time, single-molecule, nano-nitch sequencing technology that can generate very long reads (up to 20 kb), in contrast to the shorter reads produced by the first- and second-generation sequencing technologies. As a new platform, it is important to assess the sequencing error rate, as well as the quality control (QC) parameters associated with the PacBio sequence data. In this study, a mixture of 10 previously characterized, closely related DNA amplicons was sequenced using the PacBio RS sequencing platform. After aligning Circular Consensus Sequence (CCS) reads derived from the above sequencing experiment to the known reference sequences, we found that the median error rate was 2.5% without read QC, and improved to 1.3% with an SVM-based multi-parameter QC method. In addition, a De Novo assembly was used as a downstream application to evaluate the effects of different QC approaches. This benchmark study indicates that even though CCS reads are post error-corrected, it is still necessary to perform appropriate QC on CCS reads in order to produce successful downstream bioinformatics analytical results.
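The SVM-based read QC can be sketched as a standard supervised classifier over read-level features; the features, training labels, and data below are invented for illustration and are not the study's actual model.

    # Hedged sketch of an SVM-based multi-parameter read QC in the spirit of
    # the study above: train a classifier on read-level features to keep reads
    # likely to have low alignment error. Features, labels, and data invented.
    import numpy as np
    from sklearn.svm import SVC

    # Columns: number of full passes, mean base quality, read length (kb).
    X_train = np.array([[3, 0.99, 1.2], [1, 0.90, 1.1], [5, 0.995, 1.3],
                        [2, 0.93, 1.0], [6, 0.998, 1.2], [1, 0.88, 0.9]])
    y_train = np.array([1, 0, 1, 0, 1, 0])    # 1 = low-error read, 0 = reject

    clf = SVC(kernel="linear").fit(X_train, y_train)

    new_reads = np.array([[4, 0.99, 1.1], [1, 0.89, 1.0]])
    keep = clf.predict(new_reads) == 1
    print(keep)   # [ True False ]: the second read is dropped before assembly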
Hong, Sung Kuk; Choi, Seung Jun; Shin, Saeam; Lee, Wonmok; Pinto, Naina; Shin, Nari; Lee, Kwangjun; Hong, Seong Geun; Kim, Young Ah; Lee, Hyukmin; Kim, Heejung; Song, Wonkeun; Lee, Sun Hwa; Yong, Dongeun; Lee, Kyungwon; Chong, Yunsop
2015-11-01
Quality control (QC) processes are being performed in the majority of clinical microbiology laboratories to ensure the performance of microbial identification and antimicrobial susceptibility testing by using ATCC strains. To obtain these ATCC strains, some inconveniences are encountered concerning the purchase cost of the strains and the shipping time required. This study was focused on constructing a database of reference strains for QC processes using domestic bacterial strains, concentrating primarily on antimicrobial susceptibility testing. Three strains (Escherichia coli, Pseudomonas aeruginosa, and Staphylococcus aureus) that showed legible results in preliminary testing were selected. The minimal inhibitory concentrations (MICs) and zone diameters (ZDs) of eight antimicrobials for each strain were determined according to the CLSI M23. All resulting MIC and ZD ranges included at least 95% of the data. The ZD QC ranges obtained by using the CLSI method were less than 12 mm, and the MIC QC ranges extended no more than five dilutions. This study is a preliminary attempt to construct a bank of Korean QC strains. With further studies, a positive outcome toward cost and time reduction can be anticipated.
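A simplified sketch of deriving QC ranges that cover at least 95% of replicate results is given below; it illustrates the coverage idea only, not the CLSI M23 procedure itself. Zone diameters are handled on the millimetre scale and MICs on the log2 dilution scale; the replicate data are synthetic.

    # Simplified sketch (not the CLSI M23 procedure itself) of QC ranges
    # covering at least 95% of replicate results: zone diameters (ZD) on
    # the millimetre scale, MICs on the log2 dilution scale.
    import numpy as np

    def zd_range(zone_diams, coverage=0.95):
        lo, hi = np.quantile(zone_diams, [(1 - coverage) / 2, (1 + coverage) / 2])
        return int(np.floor(lo)), int(np.ceil(hi))

    def mic_range(mics, coverage=0.95):
        logs = np.log2(mics)
        lo, hi = np.quantile(logs, [(1 - coverage) / 2, (1 + coverage) / 2])
        return 2.0 ** np.floor(lo), 2.0 ** np.ceil(hi)

    rng = np.random.default_rng(1)
    zones = rng.normal(25.0, 1.2, 120)         # illustrative replicate ZDs, mm
    mics = np.exp2(rng.normal(0.0, 0.5, 120))  # illustrative MICs near 1 mg/L
    print(zd_range(zones), mic_range(mics))    # ZD width < 12 mm, few dilutions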
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, Joseph; Pirrung, Meg; McCue, Lee Ann
FQC is software that facilitates large-scale quality control of FASTQ files by carrying out a QC protocol, parsing results, and aggregating quality metrics within and across experiments into an interactive dashboard. The dashboard utilizes human-readable configuration files to manipulate the pages and tabs, and is extensible with CSV data.
ITC Guidelines on Quality Control in Scoring, Test Analysis, and Reporting of Test Scores
ERIC Educational Resources Information Center
Allalouf, Avi
2014-01-01
The Quality Control (QC) Guidelines are intended to increase the efficiency, precision, and accuracy of the scoring, analysis, and reporting process of testing. The QC Guidelines focus on large-scale testing operations where multiple forms of tests are created for use on set dates. However, they may also be used for a wide variety of other testing…
CT and MRI slice separation evaluation by LabView developed software.
Acri, Giuseppe; Testagrossa, Barbara; Sestito, Angela; Bonanno, Lilla; Vermiglio, Giuseppe
2018-02-01
The efficient use of Computed Tomography (CT) and Magnetic Resonance Imaging (MRI) equipment necessitates establishing adequate quality-control (QC) procedures. In particular, verifying the accuracy of slice separation during multislice acquisition requires scan exploration of phantoms containing test objects. To simplify such procedures, a novel phantom and a computerised LabView-based procedure have been devised, enabling determination of the midpoint of the full width at half maximum (FWHM) in real time, while the distance between the profile midpoints of two progressive images is evaluated and measured. The results were compared with those obtained by processing the same phantom images with commercial software. To validate the proposed methodology, the Fisher test was conducted on the resulting data sets. In all cases, there was no statistically significant variation between the commercial procedure and the LabView one, which can be used on any CT or MRI diagnostic device. Copyright © 2017. Published by Elsevier GmbH.
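The core computation above, locating the FWHM midpoint of an intensity profile, is easy to reproduce outside LabView. The following Python sketch is an assumption-laden stand-in for the published tool: it interpolates the half-maximum crossings of a 1-D profile; slice separation would then be the difference between the midpoints measured on two successive images.

```python
import numpy as np

def fwhm_midpoint(profile: np.ndarray) -> tuple[float, float]:
    """Return (fwhm, midpoint) of a 1-D intensity profile, in pixel units.

    Half-maximum crossings are located by linear interpolation between
    neighbouring samples, as a plain-Python stand-in for the LabView routine.
    """
    baseline = profile.min()
    half = baseline + (profile.max() - baseline) / 2.0
    idx = np.flatnonzero(profile >= half)
    i0, i1 = idx[0], idx[-1]
    # Interpolate the left and right half-maximum crossing positions.
    left = i0 - 1 + (half - profile[i0 - 1]) / (profile[i0] - profile[i0 - 1])
    right = i1 + (half - profile[i1]) / (profile[i1 + 1] - profile[i1])
    return right - left, (right + left) / 2.0

# Example: a Gaussian profile with sigma = 4 px -> FWHM ≈ 2.355 * 4 ≈ 9.4 px.
x = np.arange(100)
prof = np.exp(-0.5 * ((x - 50) / 4.0) ** 2)
w, mid = fwhm_midpoint(prof)
print(f"FWHM = {w:.2f} px, midpoint = {mid:.2f} px")
```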
Létourneau, Daniel; Wang, An; Amin, Md Nurul; Pearce, Jim; McNiven, Andrea; Keller, Harald; Norrlinger, Bernhard; Jaffray, David A
2014-12-01
High-quality radiation therapy using highly conformal dose distributions and image-guided techniques requires optimum machine delivery performance. In this work, a monitoring system for multileaf collimator (MLC) performance, integrating semiautomated MLC quality control (QC) tests and statistical process control tools, was developed. The MLC performance monitoring system was used for almost a year on two commercially available MLC models. Control charts were used to establish MLC performance and to assess the test frequency required to achieve a given level of performance. MLC-related interlocks and servicing events were recorded during the monitoring period and were investigated as indicators of MLC performance variations. The QC test developed as part of the MLC performance monitoring system uses 2D megavoltage images (acquired using an electronic portal imaging device) of 23 fields to determine the location of the leaves with respect to the radiation isocenter. The precision of the MLC performance monitoring QC test and of the MLC itself was assessed by detecting the MLC leaf positions on 127 megavoltage images of a static field. After initial calibration, the MLC performance monitoring QC test was performed 3-4 times/week over a period of 10-11 months to monitor the positional accuracy of individual leaves for two different MLC models. Analysis of test results was performed using individuals control charts per leaf, with control limits computed from the measurements as well as two sets of specifications of ±0.5 and ±1 mm. Out-of-specification and out-of-control leaves were automatically flagged by the monitoring system and reviewed monthly by physicists. MLC-related interlocks reported by the linear accelerator and servicing events were recorded to help identify potential causes of nonrandom MLC leaf positioning variations. The precision of the MLC performance monitoring QC test and of the MLC itself was within ±0.22 mm for most MLC leaves, and the majority of the apparent leaf motion was attributed to beam spot displacements between irradiations. The MLC QC test was performed 193 and 162 times over the monitoring period for the studied units, and recalibration had to be repeated up to three times on one of these units. For both units, the rate of MLC interlocks was moderately associated with MLC servicing events. The strongest association with MLC performance was observed between the MLC servicing events and the total number of out-of-control leaves. The average elapsed time for which the number of out-of-specification or out-of-control leaves was within a given performance threshold was computed and used to assess the adequacy of the MLC test frequency. An MLC performance monitoring system has been developed and implemented to acquire high-quality QC data at high frequency, enabled by the relatively short acquisition time for the images and automatic image analysis. The monitoring system was also used to record and track the rate of MLC-related interlocks and servicing events. MLC performance for two commercially available MLC models has been assessed, and the results support a monthly test frequency for the widely accepted ±1 mm specification. A higher QC test frequency is, however, required to maintain tighter specifications and in-control behavior.
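A minimal sketch of the individuals control chart used above, assuming the conventional moving-range estimate of process variation (limits at mean ± 2.66 × mean moving range); the ±1 mm specification comes from the abstract, while the data and all other numbers are synthetic.

```python
import numpy as np

def individuals_chart_limits(x: np.ndarray) -> tuple[float, float, float]:
    """Center line and control limits for an individuals (X) chart.

    Uses the conventional estimate sigma ≈ mean moving range / 1.128,
    i.e. limits at mean ± 2.66 * MRbar.
    """
    mr_bar = np.mean(np.abs(np.diff(x)))
    center = float(x.mean())
    return center, center - 2.66 * mr_bar, center + 2.66 * mr_bar

# Toy daily leaf-position errors (mm) for one MLC leaf.
rng = np.random.default_rng(1)
errors = rng.normal(0.05, 0.08, 60)

cl, lcl, ucl = individuals_chart_limits(errors)
out_of_control = (errors < lcl) | (errors > ucl)
out_of_spec = np.abs(errors) > 1.0          # the ±1 mm specification
print(f"CL = {cl:.3f} mm, control limits = ({lcl:.3f}, {ucl:.3f}) mm")
print(f"{out_of_control.sum()} out-of-control, {out_of_spec.sum()} out-of-spec points")
```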
Development of concrete QC/QA specifications for highway construction in Kentucky.
DOT National Transportation Integrated Search
2001-08-01
There is a growing trend toward quality-based specifications in highway construction. A large number of quality control/quality assurance (QC/QA) specifications shift the responsibility of day-to-day testing from the state DOH to the contractor. This...
Portland cement concrete pavement review of QC/QA data 2000 through 2009.
DOT National Transportation Integrated Search
2011-04-01
This report analyzes the Quality Control/Quality Assurance (QC/QA) data for Portland cement concrete pavement : (PCCP) awarded in the years 2000 through 2009. Analysis of the overall performance of the projects is accomplished by : reviewing the Calc...
NASA Technical Reports Server (NTRS)
Kosterev, A. A.; Tittel, F. K.; Durante, W.; Allen, M.; Kohler, R.; Gmachl, C.; Capasso, F.; Sivco, D. L.; Cho, A. Y.
2002-01-01
We report the first application of pulsed, near-room-temperature quantum cascade laser technology to the continuous detection of biogenic CO production rates above viable cultures of vascular smooth muscle cells. A computer-controlled sequence of measurements over a 9-h period was obtained, resulting in a minimum detectable CO production of 20 ppb in a 1-m optical path above a standard cell-culture flask. Data-processing procedures for real-time monitoring of both biogenic and ambient atmospheric CO concentrations are described.
Brown, Joseph; Pirrung, Meg; McCue, Lee Ann
2017-06-09
FQC is software that facilitates quality control of FASTQ files by carrying out a QC protocol using FastQC, parsing results, and aggregating quality metrics into an interactive dashboard designed to richly summarize individual sequencing runs. The dashboard groups samples in dropdowns for navigation among the data sets, utilizes human-readable configuration files to manipulate the pages and tabs, and is extensible with CSV data. FQC is implemented in Python 3 and Javascript, and is maintained under an MIT license. Documentation and source code are available at: https://github.com/pnnl/fqc . joseph.brown@pnnl.gov. © The Author(s) 2017. Published by Oxford University Press.
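For readers unfamiliar with the aggregation step such dashboards depend on, here is a minimal sketch (not FQC's actual code) that parses FastQC's per-sample summary.txt files and pivots the PASS/WARN/FAIL flags into a single CSV table; the file paths in the usage line are hypothetical.

```python
# Sketch of FastQC summary aggregation; not the FQC implementation.
import csv
from pathlib import Path

def load_summary(path: Path) -> dict[str, str]:
    """Parse one FastQC summary.txt.

    Lines look like: PASS<TAB>Basic Statistics<TAB>sample.fastq
    """
    flags = {}
    for line in path.read_text().splitlines():
        status, module, _fname = line.split("\t")
        flags[module] = status
    return flags

def aggregate(summaries: dict[str, Path], out_csv: str) -> None:
    """Pivot per-sample module flags into one CSV for a dashboard."""
    rows = {sample: load_summary(p) for sample, p in summaries.items()}
    modules = sorted({m for flags in rows.values() for m in flags})
    with open(out_csv, "w", newline="") as fh:
        writer = csv.writer(fh)
        writer.writerow(["sample"] + modules)
        for sample, flags in rows.items():
            writer.writerow([sample] + [flags.get(m, "NA") for m in modules])

# Hypothetical usage:
# aggregate({"run1_S1": Path("run1/S1_fastqc/summary.txt")}, "qc_dashboard.csv")
```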
McClure, Matthew C.; McCarthy, John; Flynn, Paul; McClure, Jennifer C.; Dair, Emma; O'Connell, D. K.; Kearney, John F.
2018-01-01
A major use of genetic data is parentage verification and identification, as inaccurate pedigrees negatively affect genetic gain. Since 2012 the international standard for single nucleotide polymorphism (SNP) verification in Bos taurus cattle has been the ISAG SNP panels. While these ISAG panels provide an increased level of parentage accuracy over microsatellite markers (MS), they can validate the wrong parent at ≤1% misconcordance rate levels, indicating that more SNP are needed if a more accurate pedigree is required. With rapidly increasing numbers of cattle being genotyped in Ireland, representing 61 B. taurus breeds and a wide range of farm types (beef/dairy, AI/pedigree/commercial, purebred/crossbred, and small to large herd sizes), the Irish Cattle Breeding Federation (ICBF) analyzed different SNP densities and determined that a minimum of 500 SNP is needed to consistently predict only one set of parents at a ≤1% misconcordance rate. For parentage validation and prediction, ICBF uses 800 SNP (ICBF800) selected on the basis of SNP clustering quality, ISAG200 inclusion, call rate (CR), and minor allele frequency (MAF) in the Irish cattle population. Large datasets require both sample and SNP quality control (QC). Most publications deal only with SNP QC via CR, MAF, parent-progeny conflicts, and Hardy-Weinberg deviation, not with sample QC. We report here parentage, SNP QC, and genomic sample QC pipelines designed for the unique challenges of >1 million genotypes from a national herd, such as SNP genotype errors from mis-tagging of animals, lab errors, farm errors, and multiple other issues that can arise. We divide the pipeline into two parts: a Genotype QC and an Animal QC pipeline. The Genotype QC pipeline identifies samples with low call rate, missing or mixed genotype classes (no BB genotype or ABTG alleles present), and low genotype frequencies. The Animal QC pipeline handles situations where the genotype might not belong to the listed individual by identifying >1 non-matching genotypes per animal, SNP duplicates, sex and breed prediction mismatches, parentage and progeny validation results, and other situations. The Animal QC pipeline makes use of the ICBF800 SNP set where appropriate to identify errors in a computationally efficient yet highly accurate manner. PMID:29599798
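A toy version of the basic genotype QC checks named above (per-SNP call rate and minor allele frequency, plus a per-animal call rate), assuming the common 0/1/2 allele-dosage coding with -1 for missing calls; the thresholds are illustrative, not ICBF's.

```python
import numpy as np

def genotype_qc(G: np.ndarray, min_call_rate: float = 0.95,
                min_maf: float = 0.02):
    """Basic QC on a genotype matrix G (animals x SNPs).

    Genotypes are coded as 0/1/2 copies of the B allele, with -1 for a
    missing call: a common convention, assumed here for illustration.
    """
    missing = G < 0
    snp_call_rate = 1.0 - missing.mean(axis=0)
    # B-allele frequency per SNP, ignoring missing calls.
    counts = np.where(missing, 0, G).sum(axis=0)
    called = (~missing).sum(axis=0)
    p = counts / (2.0 * called)
    maf = np.minimum(p, 1.0 - p)
    keep_snp = (snp_call_rate >= min_call_rate) & (maf >= min_maf)
    # Sample QC: flag animals whose own call rate is low.
    animal_call_rate = 1.0 - missing.mean(axis=1)
    keep_animal = animal_call_rate >= min_call_rate
    return keep_snp, keep_animal

rng = np.random.default_rng(2)
G = rng.choice([0, 1, 2, -1], size=(500, 1000), p=[0.3, 0.4, 0.28, 0.02])
keep_snp, keep_animal = genotype_qc(G)
print(f"SNPs kept: {keep_snp.sum()}/{G.shape[1]}; "
      f"animals kept: {keep_animal.sum()}/{G.shape[0]}")
```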
Data services providing by the Ukrainian NODC (MHI NASU)
NASA Astrophysics Data System (ADS)
Eremeev, V.; Godin, E.; Khaliulin, A.; Ingerov, A.; Zhuk, E.
2009-04-01
At the modern stage of World Ocean study, information support of investigations based on advanced computer technologies becomes of particular importance. These abstracts present several data services developed in the Ukrainian NODC on the basis of the Marine Environmental and Information Technologies Department of MHI NASU.

The Data Quality Control Service. Using experience of international collaboration in the field of data collection and quality checking, we have developed quality control (QC) software providing both preliminary (automatic) and expert (manual) data quality check procedures. The current version of the QC software works for the Mediterranean and Black seas and includes climatic arrays for hydrological and a few hydrochemical parameters based on such products as MEDAR/MEDATLAS II, Physical Oceanography of the Black Sea, and the Climatic Atlas of Oxygen and Hydrogen Sulfide in the Black Sea. The data quality check procedure includes metadata control and hydrological and hydrochemical data control. Metadata control provides checking of duplicate cruises and profiles, date and chronology, ship velocity, station location, sea depth and observation depth. The data QC procedure includes climatic (or range, for parameters with a small number of observations) data QC, density inversion checks for hydrological data, and searching for spikes. The use of climatic fields and profiles prepared by regional oceanography experts leads to more reliable results of the data quality check procedure.

The Data Access Services. The Ukrainian NODC provides two products for data access: on-line software and a data access module for the MHI NASU local net. This software allows selecting data by rectangular area, date, month and cruise. The result of a query is metadata, presented in a table together with a visual presentation of stations on the map. Both metadata and data can be viewed by selecting a station in the metadata table or on the map. There is also an option to export data in ODV format. The product is available at http://www.ocean.nodc.org.ua/DataAccess.php . The local net version provides access to the oceanological database of the MHI NASU. The current version allows selecting data by spatial and temporal limits, depth, parameter values and quality flags, and works for the Mediterranean and Black seas. It provides visualization of metadata and data, statistics of data selection, and data export into several data formats.

The Operational Data Management Services. The collaborators of the MHI Experimental Branch developed a system for obtaining information on water pressure and temperature, as well as on atmospheric pressure. Sea level observations are also conducted. The obtained data are transferred online. An interface for operational data access was developed. It allows selecting parameters (sea level, water temperature, atmospheric pressure, wind and water pressure) and a time interval to view parameter graphics. The product is available at http://www.ocean.nodc.org.ua/Katsively.php .

The Climatic Products. The current version of the Climatic Atlas includes maps of parameters such as temperature, salinity, density, heat storage, dynamic heights, the upper boundary of hydrogen sulfide and the lower boundary of oxygen for the Black Sea basin. Maps of temperature, salinity and density were calculated on 19 standard depths, averaged monthly for depths of 0-300 m and annually for greater depths. The climatic maps of the upper boundary of hydrogen sulfide and the lower boundary of oxygen were averaged by decade, from the 1920s to the 1990s, and by season. Two versions of the climatic atlas viewer, on-line and desktop, were developed for presentation of the climatic maps. They provide similar functions for selecting and viewing maps by parameter, month and depth, and for saving maps in various formats. The on-line version of the atlas is available at http://www.ocean.nodc.org.ua/Main_Atlas.php .
Keller, Sune H; Sibomana, Merence; Olesen, Oline V; Svarer, Claus; Holm, Søren; Andersen, Flemming L; Højgaard, Liselotte
2012-03-01
Many authors have reported the importance of motion correction (MC) for PET. Patient motion during scanning disturbs kinetic analysis and degrades resolution. In addition, using misaligned transmission for attenuation and scatter correction may produce regional quantification bias in the reconstructed emission images. The purpose of this work was the development of quality control (QC) methods for MC procedures based on external motion tracking (EMT) for human scanning, using an optical motion tracking system. Two scans with minor motion and five with major motion (as reported by the optical motion tracking system) were selected from (18)F-FDG scans acquired on a PET scanner. The motion was measured as the maximum displacement of the markers attached to the subject's head and was considered major if larger than 4 mm and minor if less than 2 mm. After allowing a 40- to 60-min uptake time after tracer injection, we acquired a 6-min transmission scan, followed by a 40-min emission list-mode scan. Each emission list-mode dataset was divided into 8 frames of 5 min. The reconstructed time-framed images were aligned to a selected reference frame using either EMT or the AIR (automated image registration) software. The following 3 QC methods were used to evaluate the EMT and AIR MC: a method using the ratio between two regions of interest containing gray matter voxels (GM) and white matter voxels (WM), called GM/WM; mutual information; and cross correlation. The results of the 3 QC methods were in agreement with one another and with a visual subjective inspection of the image data. Before MC, the QC measures varied significantly in scans with major motion and displayed limited variation in scans with minor motion. The variation was significantly reduced and the measures improved after MC with AIR, whereas EMT MC performed less well. The 3 presented QC methods produced similar results and are useful for evaluating tracer-independent external-tracking motion-correction methods for human brain scans.
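Two of the three QC measures used above are simple to state in code. The sketch below computes normalized cross-correlation and histogram-based mutual information between a reference frame and a "moved" frame (the GM/WM measure is just a ratio of ROI means); the toy volumes and bin count are assumptions, not the study's data or implementation.

```python
import numpy as np

def cross_correlation(a: np.ndarray, b: np.ndarray) -> float:
    """Normalized cross-correlation between two image volumes."""
    a, b = a.ravel().astype(float), b.ravel().astype(float)
    a -= a.mean()
    b -= b.mean()
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def mutual_information(a: np.ndarray, b: np.ndarray, bins: int = 32) -> float:
    """Histogram-based mutual information (in nats) between two volumes."""
    joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    pxy = joint / joint.sum()
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / np.outer(px, py)[nz])))

# Toy frames: a reference and a shifted, noisy copy standing in for motion.
rng = np.random.default_rng(3)
ref = rng.random((32, 32, 16))
moved = np.roll(ref, shift=2, axis=0) + rng.normal(0, 0.01, ref.shape)
print(f"CC = {cross_correlation(ref, moved):+.3f}")
print(f"MI = {mutual_information(ref, moved):.3f}")
```

Both measures increase as the frames are brought back into alignment, which is what makes them usable as before/after QC indices for a motion-correction step.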
20 CFR 602.21 - Standard methods and procedures.
Code of Federal Regulations, 2010 CFR
2010-04-01
..., (2) Use a questionnaire, prescribed by the Department, which is designed to obtain such data as the Department deems necessary for the operation of the QC program; require completion of the questionnaire by...
40 CFR 98.264 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2010 CFR
2010-07-01
...-process phosphoric acid process line. You can use existing plant procedures that are used for accounting... the process line. Conduct the representative bulk sampling using the applicable standard method in the...
40 CFR 98.294 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2011 CFR
2011-07-01
... designed to measure the total alkalinity in soda ash not in trona. The modified method referred to above... requirements. Section 98.293 provides three different procedures for emission calculations. The appropriate...
40 CFR 98.294 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2013 CFR
2013-07-01
... designed to measure the total alkalinity in soda ash not in trona. The modified method referred to above... requirements. Section 98.293 provides three different procedures for emission calculations. The appropriate...
40 CFR 98.294 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2012 CFR
2012-07-01
... designed to measure the total alkalinity in soda ash not in trona. The modified method referred to above... requirements. Section 98.293 provides three different procedures for emission calculations. The appropriate...
40 CFR 98.294 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2014 CFR
2014-07-01
... designed to measure the total alkalinity in soda ash not in trona. The modified method referred to above... requirements. Section 98.293 provides three different procedures for emission calculations. The appropriate...
Comprehensive Testing Guidelines to Increase Efficiency in INDOT Operations : [Technical Summary
DOT National Transportation Integrated Search
2012-01-01
When the Indiana Department of Transportation designs a pavement project, a decision for QC/QA (Quality Control/ Quality Assurance) or nonQC/QA is made solely based on the quantity of pavement materials to be used in the project. Once the pavement...
Hot mix asphalt voids acceptance review of QC/QA data 2000 through 2010.
DOT National Transportation Integrated Search
2011-10-01
This report analyzes the quality control/quality assurance (QC/QA) data for hot mix asphalt (HMA) using : voids acceptance as the testing criteria awarded in the years 2000 through 2010. Analysis of the overall : performance of the projects is accomp...
An Automatic Quality Control Pipeline for High-Throughput Screening Hit Identification.
Zhai, Yufeng; Chen, Kaisheng; Zhong, Yang; Zhou, Bin; Ainscow, Edward; Wu, Ying-Ta; Zhou, Yingyao
2016-09-01
The correction or removal of signal errors in high-throughput screening (HTS) data is critical to the identification of high-quality lead candidates. Although a number of strategies have been previously developed to correct systematic errors and to remove screening artifacts, they are not universally effective and still require a fair amount of human intervention. We introduce a fully automated quality control (QC) pipeline that can correct generic interplate systematic errors and remove intraplate random artifacts. The new pipeline was first applied to ~100 large-scale historical HTS assays; in silico analysis showed auto-QC led to a noticeably stronger structure-activity relationship. The method was further tested in several independent HTS runs, where QC results were sampled for experimental validation. Significantly increased hit confirmation rates were obtained after the QC steps, confirming that the proposed method was effective in enriching true-positive hits. An implementation of the algorithm is available to the screening community. © 2016 Society for Laboratory Automation and Screening.
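The abstract does not disclose the algorithm, so the sketch below substitutes a well-known intraplate correction, Tukey median polish with a robust z-like rescaling (the basis of the B-score), as a stand-in for the plate-correction step; plate dimensions, the injected gradient, and the hit threshold are all illustrative.

```python
import numpy as np

def median_polish(plate: np.ndarray, n_iter: int = 10) -> np.ndarray:
    """Remove row/column effects (edge effects, gradients) from one plate's
    signal matrix by Tukey median polish; returns the residual matrix."""
    resid = plate.astype(float).copy()
    for _ in range(n_iter):
        resid -= np.median(resid, axis=1, keepdims=True)  # row medians
        resid -= np.median(resid, axis=0, keepdims=True)  # column medians
    return resid

# Toy 16x24 (384-well) plate with a column gradient plus one genuine hit.
rng = np.random.default_rng(4)
plate = rng.normal(100, 5, (16, 24)) + np.linspace(0, 30, 24)  # drift
plate[4, 7] -= 60                                              # a real hit

scores = median_polish(plate)
mad = np.median(np.abs(scores - np.median(scores)))
b_like = scores / (1.4826 * mad)      # robust z-like score per well
print("flagged wells:", np.argwhere(b_like < -3))
```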
Non-monotonicity and divergent time scale in Axelrod model dynamics
NASA Astrophysics Data System (ADS)
Vazquez, F.; Redner, S.
2007-04-01
We study the evolution of the Axelrod model for cultural diversity, a prototypical non-equilibrium process that exhibits rich dynamics and a dynamic phase transition between diversity and an inactive state. We consider a simple version of the model in which each individual possesses two features that can assume q possibilities. Within a mean-field description in which each individual has just a few interaction partners, we find a phase transition at a critical value q_c between an active, diverse state for q < q_c and a frozen state. For q slightly below q_c, the density of active links is non-monotonic in time and the asymptotic approach to the steady state is controlled by a time scale that diverges as (q - q_c)^(-1/2).
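A minimal simulation sketch of the model, assuming a one-dimensional ring rather than the paper's mean-field interaction structure: F = 2 features with q traits each, interaction probability equal to the fractional overlap, and the density of active links (neighbors sharing some but not all features) as the observable.

```python
import numpy as np

def axelrod_ring(N=400, F=2, q=3, steps=100_000, seed=5):
    """Axelrod model on a ring: N sites, F features, q traits per feature.
    Returns the density of active links (neighbor pairs sharing some but
    not all features) after a fixed number of update attempts."""
    rng = np.random.default_rng(seed)
    state = rng.integers(0, q, size=(N, F))
    for _ in range(steps):
        i = int(rng.integers(N))
        j = (i + 1) % N if rng.random() < 0.5 else (i - 1) % N
        shared = state[i] == state[j]
        overlap = int(shared.sum())
        # Interact with probability overlap/F; adopt one differing trait.
        if 0 < overlap < F and rng.random() < overlap / F:
            k = int(rng.choice(np.flatnonzero(~shared)))
            state[i, k] = state[j, k]
    diffs = (state != np.roll(state, -1, axis=0)).sum(axis=1)
    return float(np.mean((diffs > 0) & (diffs < F)))

for q in (2, 5, 20):
    print(f"q = {q:2d}: active-link density = {axelrod_ring(q=q):.3f}")
```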
CARINA data synthesis project: pH data scale unification and cruise adjustments
NASA Astrophysics Data System (ADS)
Velo, A.; Pérez, F. F.; Lin, X.; Key, R. M.; Tanhua, T.; de La Paz, M.; van Heuven, S.; Jutterström, S.; Ríos, A. F.
2009-10-01
Data on carbon and carbon-relevant hydrographic and hydrochemical parameters from previously non-publicly available cruise data sets in the Arctic Mediterranean Seas (AMS), Atlantic and Southern Ocean have been retrieved and merged to a new database: CARINA (CARbon IN the Atlantic). These data have gone through rigorous quality control (QC) procedures to assure the highest possible quality and consistency. The data for most of the measured parameters in the CARINA database were objectively examined in order to quantify systematic differences in the reported values, i.e. secondary quality control. Systematic biases found in the data have been corrected in the data products, i.e. three merged data files with measured, calculated and interpolated data for each of the three CARINA regions; AMS, Atlantic and Southern Ocean. Out of a total of 188 cruise entries in the CARINA database, 59 reported pH measured values. Here we present details of the secondary QC on pH for the CARINA database. Procedures of quality control, including crossover analysis between cruises and inversion analysis of all crossover data are briefly described. Adjustments were applied to the pH values for 21 of the cruises in the CARINA dataset. With these adjustments the CARINA database is consistent both internally as well as with GLODAP data, an oceanographic data set based on the World Hydrographic Program in the 1990s. Based on our analysis we estimate the internal accuracy of the CARINA pH data to be 0.005 pH units. The CARINA data are now suitable for accurate assessments of, for example, oceanic carbon inventories and uptake rates and for model validation.
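The inversion step of such a crossover analysis can be illustrated with a toy least-squares problem: given deep-water crossover offsets d_ij ≈ c_i - c_j between cruise pairs, solve for per-cruise additive adjustments c under a zero-mean constraint. The numbers below are invented, and the real CARINA analysis additionally weights each crossover by its uncertainty.

```python
import numpy as np

# Toy secondary-QC inversion: pairwise crossover offsets between cruises
# (i, j, d_ij) are turned into per-cruise additive corrections c, fixing
# the mean correction to zero so the system is determined.
pairs = [(0, 1, +0.012), (1, 2, -0.008), (0, 2, +0.005)]  # invented offsets
n_cruises = 3

A = np.zeros((len(pairs) + 1, n_cruises))
b = np.zeros(len(pairs) + 1)
for row, (i, j, d) in enumerate(pairs):
    A[row, i], A[row, j], b[row] = 1.0, -1.0, d
A[-1, :] = 1.0          # constraint row: corrections sum to zero

c, *_ = np.linalg.lstsq(A, b, rcond=None)
print("per-cruise pH adjustments:", np.round(c, 4))
```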
CARINA TCO2 data in the Atlantic Ocean
NASA Astrophysics Data System (ADS)
Pierrot, D.; Brown, P.; van Heuven, S.; Tanhua, T.; Schuster, U.; Wanninkhof, R.; Key, R. M.
2010-01-01
Water column data of carbon and carbon-relevant hydrographic and hydrochemical parameters from 188 cruises in the Arctic, Atlantic and Southern Ocean have been retrieved and merged in a new data base: the CARINA (CARbon IN the Atlantic) Project. These data have gone through rigorous quality control (QC) procedures to assure the highest possible quality and consistency. Secondary quality control, which involved objective study of data in order to quantify systematic differences in the reported values, was performed for the pertinent parameters in the CARINA data base. Systematic biases in the data have been corrected in the data products. The products are three merged data files with measured, adjusted and interpolated data of all cruises for each of the three CARINA regions (Arctic, Atlantic and Southern Ocean). Ninety-eight cruises were conducted in the "Atlantic" defined as the region south of the Greenland-Iceland-Scotland Ridge and north of about 30° S. Here we report the details of the secondary QC which was done on the total dissolved inorganic carbon (TCO2) data and the adjustments that were applied to yield the final data product in the Atlantic. Procedures of quality control - including crossover analysis between stations and inversion analysis of all crossover data - are briefly described. Adjustments were applied to TCO2 measurements for 17 of the cruises in the Atlantic Ocean region. With these adjustments, the CARINA data base is consistent both internally as well as with GLODAP data, an oceanographic data set based on the WOCE Hydrographic Program in the 1990s, and is now suitable for accurate assessments of, for example, regional oceanic carbon inventories, uptake rates and model validation.
CARINA alkalinity data in the Atlantic Ocean
NASA Astrophysics Data System (ADS)
Velo, A.; Perez, F. F.; Brown, P.; Tanhua, T.; Schuster, U.; Key, R. M.
2009-08-01
Data on carbon and carbon-relevant hydrographic and hydrochemical parameters from previously non-publicly available cruise data sets in the Arctic, Atlantic and Southern Ocean have been retrieved and merged to a new database: CARINA (CARbon IN the Atlantic). These data have gone through rigorous quality control (QC) procedures to assure the highest possible quality and consistency. The data for most of the measured parameters in the CARINA data base were objectively examined in order to quantify systematic differences in the reported values, i.e. secondary quality control. Systematic biases found in the data have been corrected in the data products, i.e. three merged data files with measured, calculated and interpolated data for each of the three CARINA regions; Arctic, Atlantic and Southern Ocean. Out of a total of 188 cruise entries in the CARINA database, 98 were conducted in the Atlantic Ocean and of these, 75 cruises report alkalinity values. Here we present details of the secondary QC on alkalinity for the Atlantic Ocean part of CARINA. Procedures of quality control, including crossover analysis between cruises and inversion analysis of all crossover data are briefly described. Adjustments were applied to the alkalinity values for 16 of the cruises in the Atlantic Ocean region. With these adjustments the CARINA database is consistent both internally as well as with GLODAP data, an oceanographic data set based on the World Hydrographic Program in the 1990s. Based on our analysis we estimate the internal accuracy of the CARINA-ATL alkalinity data to be 3.3 μmol kg-1. The CARINA data are now suitable for accurate assessments of, for example, oceanic carbon inventories and uptake rates and for model validation.
CARINA: nutrient data in the Atlantic Ocean
NASA Astrophysics Data System (ADS)
Tanhua, T.; Brown, P. J.; Key, R. M.
2009-11-01
Data on carbon and carbon-relevant hydrographic and hydrochemical parameters from previously non-publicly available cruise data sets in the Arctic, Atlantic and Southern Ocean have been retrieved and merged to a new database: CARINA (CARbon IN the Atlantic). These data have gone through rigorous quality control (QC) procedures to assure the highest possible quality and consistency. The data for most of the measured parameters in the CARINA data base were objectively examined in order to quantify systematic differences in the reported values, i.e. secondary quality control. Systematic biases found in the data have been corrected in the data products, i.e. three merged data files with measured, calculated and interpolated data for each of the three CARINA regions; Arctic Mediterranean Seas, Atlantic and Southern Ocean. Out of a total of 188 cruise entries in the CARINA database, 98 were conducted in the Atlantic Ocean and of these 84 cruises report nitrate values, 79 silicate, and 78 phosphate. Here we present details of the secondary QC for nutrients for the Atlantic Ocean part of CARINA. Procedures of quality control, including crossover analysis between cruises and inversion analysis of all crossover data are briefly described. Adjustments were applied to the nutrient values for 43 of the cruises in the Atlantic Ocean region. With these adjustments the CARINA database is consistent both internally as well as with GLODAP data, an oceanographic data set based on the World Hydrographic Program in the 1990s (Key et al., 2004). Based on our analysis we estimate the internal accuracy of the CARINA-ATL nutrient data to be: nitrate 1.5%; phosphate 2.6%; silicate 3.1%. The CARINA data are now suitable for accurate assessments of, for example, oceanic carbon inventories and uptake rates and for model validation.
Dimech, Wayne; Karakaltsas, Marina; Vincini, Giuseppe A
2018-05-25
A general trend towards conducting infectious disease serology testing in centralized laboratories means that quality control (QC) principles used for clinical chemistry testing are applied to infectious disease testing. However, no systematic assessment of methods used to establish QC limits has been applied to infectious disease serology testing. A total of 103 QC data sets, obtained from six different infectious disease serology analytes, were parsed through standard methods for establishing statistical control limits, including guidelines from Public Health England, the USA Clinical and Laboratory Standards Institute (CLSI), the German Richtlinien der Bundesärztekammer (RiliBÄK) and the Australian QConnect. The percentage of QC results failing each method was compared. The percentage of data sets having more than 20% of QC results failing Westgard rules when the first 20 results were used to calculate the mean ± 2 standard deviations (SD) ranged from 3 (2.9%) for the R-4s rule to 66 (64.1%) for the 10x rule, whereas the percentage ranged from 0 (0%) for R-4s to 32 (40.5%) for 10x when the first 100 results were used to calculate the mean ± 2 SD. By contrast, the percentage of data sets with >20% of results failing the RiliBÄK control limits was 25 (24.3%). Only two data sets (1.9%) had more than 20% of results outside the QConnect limits. The failure rate of QC results using QConnect limits was thus more suitable for monitoring infectious disease serology testing than the UK Public Health, CLSI and RiliBÄK alternatives, which reported an unacceptably high percentage of failures across the 103 data sets.
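For concreteness, a small sketch of how such rule-failure rates can be tallied: it evaluates a few common Westgard rules (1-2s, 1-3s, a simplified R-4s on consecutive results, 10x) against limits computed from the first 20 results, on synthetic data. The rule set and its simplifications are assumptions, not the paper's exact protocol.

```python
import numpy as np

def westgard_flags(qc: np.ndarray, mean: float, sd: float) -> dict[str, int]:
    """Count violations of a few common Westgard rules against fixed
    mean/SD limits. R-4s is simplified to consecutive single-control
    results rather than paired controls within a run."""
    z = (qc - mean) / sd
    flags = {
        "1-2s": int(np.sum(np.abs(z) > 2)),
        "1-3s": int(np.sum(np.abs(z) > 3)),
        "R-4s": int(np.sum(np.abs(np.diff(z)) > 4)),
    }
    # 10x: ten consecutive results on the same side of the mean
    # (counted once per streak when the streak reaches length 10).
    side = np.sign(z)
    run, count10 = 1, 0
    for prev, cur in zip(side, side[1:]):
        run = run + 1 if (cur == prev and cur != 0) else 1
        if run == 10:
            count10 += 1
    flags["10x"] = count10
    return flags

rng = np.random.default_rng(6)
qc = rng.normal(1.0, 0.05, 100)        # toy QC results for one analyte
mean20, sd20 = qc[:20].mean(), qc[:20].std(ddof=1)   # limits from first 20
print(westgard_flags(qc, mean20, sd20))
```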
Hot mix asphalt voids acceptance review of QC/QA data 2000 through 2004.
DOT National Transportation Integrated Search
2006-07-01
This report analyzes the Quality Control/Quality Assurance (QC/QA) data for hot mix asphalt using voids acceptance as : the testing criteria for the years 2000 through 2004. Analysis of the overall quality of the HMA is accomplished by : reviewing th...
User-friendly solutions for microarray quality control and pre-processing on ArrayAnalysis.org
Eijssen, Lars M. T.; Jaillard, Magali; Adriaens, Michiel E.; Gaj, Stan; de Groot, Philip J.; Müller, Michael; Evelo, Chris T.
2013-01-01
Quality control (QC) is crucial for any scientific method producing data. Applying adequate QC introduces new challenges in the genomics field, where large amounts of data are produced with complex technologies. For DNA microarrays, specific algorithms for QC and pre-processing, including normalization, have been developed by the scientific community, especially for expression chips of the Affymetrix platform. Many of these have been implemented in the statistical scripting language R and are available from the Bioconductor repository. However, application is hampered by a lack of integrative tools that can be used by users of any experience level. To fill this gap, we developed a freely available tool for QC and pre-processing of Affymetrix gene expression results, extending, integrating and harmonizing functionality of Bioconductor packages. The tool can be easily accessed through a wizard-like web portal at http://www.arrayanalysis.org or downloaded for local use in R. The portal provides extensive documentation, including user guides, interpretation help with real output illustrations and detailed technical documentation. It assists newcomers to the field in performing state-of-the-art QC and pre-processing while offering data analysts an integral open-source package. Providing the scientific community with this easily accessible tool will allow improvements in data quality, data reuse, and the adoption of standards. PMID:23620278
40 CFR 98.184 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2013 CFR
2013-07-01
... you determine process CO2 emissions using the carbon mass balance procedure in § 98.183(b)(2)(i) and... Instrumental Determination of Carbon, Hydrogen, and Nitrogen in Laboratory Samples of Coal (incorporated by...
40 CFR 98.184 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2014 CFR
2014-07-01
... you determine process CO2 emissions using the carbon mass balance procedure in § 98.183(b)(2)(i) and... Instrumental Determination of Carbon, Hydrogen, and Nitrogen in Laboratory Samples of Coal (incorporated by...
40 CFR 98.114 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2013 CFR
2013-07-01
... requirements. If you determine annual process CO2 emissions using the carbon mass balance procedure in § 98.113... D5373-08 Standard Test Methods for Instrumental Determination of Carbon, Hydrogen, and Nitrogen in...
40 CFR 98.114 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2012 CFR
2012-07-01
... requirements. If you determine annual process CO2 emissions using the carbon mass balance procedure in § 98.113... D5373-08 Standard Test Methods for Instrumental Determination of Carbon, Hydrogen, and Nitrogen in...
40 CFR 98.184 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2012 CFR
2012-07-01
... you determine process CO2 emissions using the carbon mass balance procedure in § 98.183(b)(2)(i) and... Instrumental Determination of Carbon, Hydrogen, and Nitrogen in Laboratory Samples of Coal (incorporated by...
40 CFR 98.114 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2014 CFR
2014-07-01
... requirements. If you determine annual process CO2 emissions using the carbon mass balance procedure in § 98.113... D5373-08 Standard Test Methods for Instrumental Determination of Carbon, Hydrogen, and Nitrogen in...
40 CFR 98.184 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2011 CFR
2011-07-01
... you determine process CO2 emissions using the carbon mass balance procedure in § 98.183(b)(2)(i) and... Instrumental Determination of Carbon, Hydrogen, and Nitrogen in Laboratory Samples of Coal (incorporated by...
40 CFR 98.114 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2011 CFR
2011-07-01
... requirements. If you determine annual process CO2 emissions using the carbon mass balance procedure in § 98.113... D5373-08 Standard Test Methods for Instrumental Determination of Carbon, Hydrogen, and Nitrogen in...
Training on automated machine guidance.
DOT National Transportation Integrated Search
2009-05-01
"Beginning in 2006, WisDOT and the Construction Materials Support Center (CMSC) at UW-Madison worked together : to develop the specifications and the QA/QC procedures for GPS machine guidance on highway grading projects. These : specifications and pr...
Technical Note: Independent component analysis for quality assurance in functional MRI.
Astrakas, Loukas G; Kallistis, Nikolaos S; Kalef-Ezra, John A
2016-02-01
Independent component analysis (ICA) is an established method of analyzing human functional MRI (fMRI) data. Here, an ICA-based fMRI quality control (QC) tool for use with a commercial phantom was developed and used. In an attempt to assess the performance of the tool relative to preexisting alternatives, it was used seven weeks before and eight weeks after repair of a faulty gradient amplifier of a non-state-of-the-art MRI unit. More specifically, its performance was compared with the AAPM 100 acceptance testing and quality assurance protocol and with two fMRI QC protocols, proposed by Friedman et al. ["Report on a multicenter fMRI quality assurance protocol," J. Magn. Reson. Imaging 23, 827-839 (2006)] and Stocker et al. ["Automated quality assurance routines for fMRI data applied to a multicenter study," Hum. Brain Mapp. 25, 237-246 (2005)], respectively. The easily developed and applied ICA-based QC protocol provided fMRI QC indices and maps equally sensitive to fMRI instabilities as the indices and maps of the other established protocols. The ICA fMRI QC indices were highly correlated with indices of the other fMRI QC protocols and in some cases theoretically related to them. Three or four independent components with slowly varying time series are detected under normal conditions. ICA applied to phantom measurements is an easy and efficient tool for fMRI QC. Additionally, it can protect against misinterpretation of artifact components as human brain activations. Evaluating fMRI QC indices in the central region of a phantom is not always the optimal choice.
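A minimal sketch of the idea, assuming synthetic phantom data and scikit-learn's FastICA rather than the authors' implementation: decompose the run into components and summarize each time course with a crude stability index (lag-1 autocorrelation); the number of components and the index itself are illustrative choices.

```python
import numpy as np
from sklearn.decomposition import FastICA

# Synthetic phantom run: stable noise plus a slow drift and a transient
# spike standing in for scanner instabilities.
rng = np.random.default_rng(7)
n_vox, n_t = 500, 120
t = np.arange(n_t)
drift = 0.002 * t                      # slow scanner drift
spike = (t == 60) * 0.5                # a transient instability
data = rng.normal(0, 1, (n_vox, n_t))
data[:250] += drift                    # half the voxels see the drift
data[::10] += spike                    # a subset sees the spike

ica = FastICA(n_components=5, random_state=0, max_iter=500)
sources = ica.fit_transform(data.T)    # shape: (time, components)

for k in range(sources.shape[1]):
    s = sources[:, k]
    # Crude stability index: structured (drifting/oscillating) time
    # courses show high lag-1 autocorrelation; pure noise shows ~0.
    ac1 = np.corrcoef(s[:-1], s[1:])[0, 1]
    print(f"component {k}: lag-1 autocorr = {ac1:+.2f}")
```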
Molecular Characterization of Tick Salivary Gland Glutaminyl Cyclase
Adamson, Steven W.; Browning, Rebecca E.; Chao, Chien-Chung; Bateman, Robert C.; Ching, Wei-Mei; Karim, Shahid
2013-01-01
Glutaminyl cyclase (QC) catalyzes the cyclization of N-terminal glutamine residues into pyroglutamate. This post-translational modification extends the half-life of peptides and, in some cases, is essential in binding to their cognate receptor. Due to its potential role in the post-translational modification of tick neuropeptides, we report the molecular, biochemical and physiological characterization of salivary gland QC during the prolonged blood-feeding of the black-legged tick (Ixodes scapularis) and the gulf-coast tick (Amblyomma maculatum). QC sequences from I. scapularis and A. maculatum showed a high degree of amino acid identity to each other and other arthropods and residues critical for zinc-binding/catalysis (D159, E202, and H330) or intermediate stabilization (E201, W207, D248, D305, F325, and W329) are conserved. Analysis of QC transcriptional gene expression kinetics depicts an upregulation during the blood-meal of adult female ticks prior to fast feeding phases in both I. scapularis and A. maculatum suggesting a functional link with blood meal uptake. QC enzymatic activity was detected in saliva and extracts of tick salivary glands and midguts. Recombinant QC was shown to be catalytically active. Furthermore, knockdown of QC-transcript by RNA interference resulted in lower enzymatic activity, and small, unviable egg masses in both studied tick species as well as lower engorged tick weights for I. scapularis. These results suggest that the post-translational modification of neurotransmitters and other bioactive peptides by QC is critical to oviposition and potentially other physiological processes. Moreover, these data suggest that tick-specific QC-modified neurotransmitters/hormones or other relevant parts of this system could potentially be used as novel physiological targets for tick control. PMID:23770496
Comparison of quality control software tools for diffusion tensor imaging.
Liu, Bilan; Zhu, Tong; Zhong, Jianhui
2015-04-01
Image quality in diffusion tensor imaging (DTI) is critical for image interpretation, diagnostic accuracy and efficiency. However, DTI is susceptible to numerous detrimental artifacts that may impair the reliability and validity of the obtained data. Although many quality control (QC) software tools have been developed and are widely used, each with its own tradeoffs, there is still no general agreement on an image quality control routine for DTI, and the practical impact of these tradeoffs is not well studied. An objective comparison that identifies the pros and cons of each QC tool will help users make the best choice among tools for specific DTI applications. This study quantitatively compares the effectiveness of three popular QC tools: DTI Studio (Johns Hopkins University), DTIPrep (University of North Carolina at Chapel Hill, University of Iowa and University of Utah) and TORTOISE (National Institutes of Health). Both synthetic and in vivo human brain data were used to quantify the adverse effects of major DTI artifacts on tensor calculation, as well as the effectiveness of the different QC tools in identifying and correcting these artifacts. The technical basis of each tool was discussed, and the ways in which particular techniques affect the output of each tool were analyzed. The different functions and I/O formats that the three QC tools provide for building a general DTI processing pipeline and for integration with other popular image-processing tools were also discussed. Copyright © 2015 Elsevier Inc. All rights reserved.
The purpose of this SOP is to define the procedures used for the initial and periodic verification and validation of computer programs used during the Arizona NHEXAS project and the "Border" study. Keywords: Computers; Software; QA/QC.
The National Human Exposure Assessment Sur...
Quality Control Methodology Of A Surface Wind Observational Database In North Eastern North America
NASA Astrophysics Data System (ADS)
Lucio-Eceiza, Etor E.; Fidel González-Rouco, J.; Navarro, Jorge; Conte, Jorge; Beltrami, Hugo
2016-04-01
This work summarizes the design and application of a quality control (QC) procedure for an observational surface wind database located in north-eastern North America. The database consists of 526 sites (486 land stations and 40 buoys) with varying resolutions of hourly, 3-hourly and 6-hourly data, compiled from three different source institutions with uneven measurement units and changing measuring procedures, instrumentation and heights. The records span from 1953 to 2010. The QC process is composed of different phases focused either on problems related to the providing source institutions or on measurement errors. The first phases deal with problems often related to data recording and management: (1) a compilation stage dealing with the detection of typographical errors, decoding problems, site displacements and the unification of institutional practices; (2) detection of erroneous data-sequence duplications within a station or among different ones; (3) detection of errors related to physically unrealistic measurements. The last phases focus on instrumental errors: (4) problems related to low variability, placing particular emphasis on the detection of unrealistically low wind speed records with the help of regional references; (5) erroneous records related to high variability; (6) standardization of wind speed biases due to changing measurement heights, detection of wind speed biases on weekly to monthly timescales, and homogenization of wind direction records. As a result, around 1.7% of wind speed records and 0.4% of wind direction records have been deleted, for a combined total of 1.9% of removed records. Additionally, around 15.9% of wind speed records and 2.4% of wind direction data have also been corrected.
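Three of the checks described above reduce to short array operations. The sketch below flags physically unrealistic values, step-change spikes (high variability) and stuck-sensor constant runs (low variability) in an hourly wind-speed series; all thresholds are illustrative assumptions, not the values used in the database.

```python
import numpy as np

def qc_wind_speed(ws: np.ndarray,
                  max_speed: float = 75.0,   # m/s, unrealistic beyond this
                  max_step: float = 20.0,    # m/s change between hours
                  flat_run: int = 24) -> np.ndarray:
    """Return a boolean mask of suspect hourly wind-speed records.

    Implements three simple checks: physically unrealistic values,
    excessive step changes, and long constant runs (stuck sensor).
    """
    bad = (ws < 0) | (ws > max_speed)              # unrealistic values
    step = np.abs(np.diff(ws, prepend=ws[0]))
    bad |= step > max_step                         # spikes
    same = np.concatenate(([False], np.diff(ws) == 0))
    run = np.zeros(len(ws), dtype=int)
    for i in range(1, len(ws)):                    # length of current flat run
        run[i] = run[i - 1] + 1 if same[i] else 0
    bad |= run >= flat_run                         # stuck-value runs
    return bad

rng = np.random.default_rng(8)
ws = np.abs(rng.gamma(2.0, 2.5, 500))
ws[100:140] = 3.2          # a stuck sensor
ws[300] = 120.0            # an impossible value / spike
print(f"{qc_wind_speed(ws).sum()} suspect records flagged")
```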
Sharma, Kuldeep; Giri, Kalpeshkumar; Dhiman, Vinay; Dixit, Abhishek; Zainuddin, Mohd; Mullangi, Ramesh
2015-05-01
A highly sensitive, specific and rapid LC-ESI-MS/MS method has been developed and validated for the simultaneous quantification of methotrexate (MTX) and tofacitinib (TFB) in rat plasma (50 μL) using phenacetin as an internal standard (IS), as per the US Food and Drug Administration guidelines. After a solid-phase extraction procedure, the separation of the analytes and IS was performed on a Chromolith RP-18e column using an isocratic mobile phase of 5 mM ammonium acetate (pH 5.0) and acetonitrile at a ratio of 25:75 (v/v) with a flow gradient and a total run time of 3.5 min. Detection was performed in multiple reaction monitoring mode, using the transitions m/z 455.2 → 308.3, m/z 313.2 → 149.2 and m/z 180.3 → 110.2 for MTX, TFB and IS, respectively. The calibration curves were linear over the ranges of 0.49-91.0 and 0.40-74.4 ng/mL for MTX and TFB, respectively. The intra- and interday accuracy and precision values for MTX and TFB were <15% at the low, medium and high quality control (QC) levels and <20% at the lower limit of quantification. The validated assay was applied to derive the pharmacokinetic parameters for MTX and TFB after oral and intravenous dosing to rats. Copyright © 2014 John Wiley & Sons, Ltd.
Sampson, Maureen L; Gounden, Verena; van Deventer, Hendrik E; Remaley, Alan T
2016-02-01
The main drawback of the periodic analysis of quality control (QC) material is that test performance is not monitored in the periods between QC analyses, potentially leading to the reporting of faulty test results. The objective of this study was to develop a patient-based QC procedure for more timely detection of test errors. Results from a Chem-14 panel measured on the Beckman LX20 analyzer were used to develop the model. Each test result was predicted from the other 13 members of the panel by multiple regression, which resulted in correlation coefficients between the predicted and measured results of >0.7 for 8 of the 14 tests. A logistic regression model, which utilized the measured test result, the predicted test result, the day of the week and the time of day, was then developed for predicting test errors. The output of the logistic regression was tallied by a daily CUSUM approach and used to predict test errors, with a fixed specificity of 90%. The mean average run length (ARL) before error detection by CUSUM-Logistic Regression (CSLR) was 20, with a mean sensitivity of 97%, considerably shorter than the mean ARL of 53 (sensitivity 87.5%) for a simple prediction model that used only the measured result for error detection. A CUSUM-Logistic Regression analysis of patient laboratory data can thus be an effective approach for the rapid and sensitive detection of clinical laboratory errors. Published by Elsevier Inc.
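A toy end-to-end sketch of the CSLR idea on synthetic data: predict one analyte from the other panel members, convert (measured, predicted) pairs into an error probability with logistic regression, and accumulate that probability in a one-sided CUSUM. The drift and decision parameters are invented, the error labels are simulated for brevity, and unlike the published model this sketch omits the day-of-week and time-of-day terms.

```python
import numpy as np
from sklearn.linear_model import LinearRegression, LogisticRegression

rng = np.random.default_rng(9)
n, k = 2000, 13
panel = rng.normal(0, 1, (n, k))                     # 13 other panel analytes
target = panel @ rng.normal(0.3, 0.1, k) + rng.normal(0, 0.5, n)

reg = LinearRegression().fit(panel[:1000], target[:1000])   # training half
pred = reg.predict(panel[1000:])                     # predicted results
meas = target[1000:].copy()
meas[500:] += 1.5                                    # shift appears at #500

# Logistic error model on (measured, predicted); labels simulated here.
X = np.column_stack([meas, pred])
y = np.concatenate([np.zeros(500), np.ones(500)])    # 0 = ok, 1 = error
logit = LogisticRegression().fit(X, y)
p_err = logit.predict_proba(X)[:, 1]

# One-sided CUSUM on the error probability (drift 0.5, threshold 5.0).
cusum, alarm = 0.0, None
for i, p in enumerate(p_err):
    cusum = max(0.0, cusum + p - 0.5)
    if cusum > 5.0 and alarm is None:
        alarm = i
print(f"alarm raised at result #{alarm} (shift began at #500)")
```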
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bjarnason, T A; Department of Radiology, University of British Columbia, Vancouver; Yang, C J
2014-08-15
Measuring the CT collimation width and assessing the shape of the overall profile is a relatively straightforward quality control (QC) measure that impacts both image quality and patient dose, and is often required at acceptance and routine testing. Most CT facilities have access to computed radiography (CR) systems, so performing CT collimation profile assessments using CR plates requires no additional equipment. Previous studies have shown how to effectively use CR plates to measure the radiation profile width. However, a major limitation of the previous work is that the full dynamic range of CR detector plates is not used, since the CR processing technology reduces the dynamic range of the DICOM output to 2^10, requiring the sensitivity and latitude settings of the CR reader to be adjusted to prevent clipping of the CT profile data. Such adjustments to CR readers unnecessarily complicate the QC procedure. These clipping artefacts hinder the ability to accurately assess CT collimation width because the full-width at half maximum value of the penumbras is not properly determined if the maximum dose of the profile is not available. Furthermore, any inconsistencies in the radiation profile shape are lost if the profile plateau is clipped off. In this work we developed an open-source Matlab script for straightforward CT profile width measurements using raw CR data that also allows assessment of the profile shape without clipping, and applied this approach during CT QC.
Practical Shipbuilding Standards for Surface Preparation and Coatings
1979-07-01
...strong solvent and apply over last coat of epoxy within 48 hours. *Minimum Dry Film Thickness 12.0 SAFETY AND POLLUTION CONTROL 12.5 Safety solvents shall... Owner Inspection (3) QA/QC Dept. Inspectors. (4) Craft Inspectors (5) Craft Supervision Inspection Only (6) QA/QC Dept. Audit Only (7) Are...
Slice-thickness evaluation in CT and MRI: an alternative computerised procedure.
Acri, G; Tripepi, M G; Causa, F; Testagrossa, B; Novario, R; Vermiglio, G
2012-04-01
The efficient use of computed tomography (CT) and magnetic resonance imaging (MRI) equipment necessitates establishing adequate quality-control (QC) procedures. In particular, assessing the accuracy of slice thickness (ST) requires scan exploration of phantoms containing test objects (plane, cone or spiral). To simplify such procedures, a novel phantom and a computerised LabView-based procedure have been devised, enabling determination of the full width at half maximum (FWHM) in real time. The phantom consists of a polymethyl methacrylate (PMMA) box diagonally crossed by a PMMA septum dividing the box into two sections. The phantom images were acquired and processed using the LabView-based procedure. The LabView (LV) results were compared with those obtained by processing the same phantom images with commercial software, and the Fisher exact test (F test) was conducted on the resulting data sets to validate the proposed methodology. In all cases, there was no statistically significant variation between the two procedures, and the LV procedure can therefore be proposed as a valuable alternative to other commonly used procedures and be reliably used on any CT or MRI scanner.
Bosnjak, J; Ciraj-Bjelac, O; Strbac, B
2008-01-01
Application of a quality control (QC) programme is very important when optimisation of image quality and reduction of patient exposure are desired. QC surveys of diagnostic imaging equipment in Republika Srpska (an entity of Bosnia and Herzegovina) have been performed systematically since 2001. The presented results relate mostly to QC tests of X-ray tubes and generators for diagnostic radiology units in 92 radiology departments. In addition, the results include workplace monitoring and the use of personal protective devices for staff and patients. The results show improvements in the implementation of the QC programme within the period 2001-2005. More attention is also being given to appropriate maintenance of imaging equipment, which was one of the main problems in the past. Implementation of a QC programme is a continuous and complex process. To achieve good performance of imaging equipment, additional tests are to be introduced, along with image quality assessment and patient dosimetry. Training is very important in order to achieve these goals.
Laboratory quality management system: road to accreditation and beyond.
Wadhwa, V; Rai, S; Thukral, T; Chopra, M
2012-01-01
This review attempts to clarify the concepts of a Laboratory Quality Management System (Lab QMS) for a medical testing and diagnostic laboratory in a holistic way and hopes to expand the horizon beyond quality control (QC) and quality assurance. It provides insight into accreditation bodies and a glimpse of existing laboratory practices, but essentially it takes the reader through the journey of accreditation and, in the course of reading and understanding this document, prepares the laboratory for the same. Areas that have not been highlighted previously include the requirement for accreditation consultants, laboratory infrastructure and scope, applying for accreditation, and document preparation. This section is well supported with practical illustrations, the necessary tables, and exhaustive details such as the preparation of a standard operating procedure and a quality manual. The concept of training and privileging of staff is clarified, and a few of the QC exercises are dealt with in a novel way. Finally, practical advice for facing an actual third-party assessment, and the caution needed to prevent post-assessment pitfalls, is provided.
NASA Astrophysics Data System (ADS)
Sturtevant, C.; Hackley, S.; Lee, R.; Holling, G.; Bonarrigo, S.
2017-12-01
Quality assurance and control (QA/QC) is one of the most important yet challenging aspects of producing research-quality data. Data quality issues are multi-faceted, including sensor malfunctions, unmet theoretical assumptions, and measurement interference from humans or the natural environment. Tower networks such as Ameriflux, ICOS, and NEON continue to grow in size and sophistication, yet tools for robust, efficient, scalable QA/QC have lagged. Quality control remains a largely manual process heavily reliant on visual inspection of data. In addition, notes of measurement interference are often recorded on paper without an explicit pathway to data flagging. As such, an increase in network size requires a near-proportional increase in personnel devoted to QA/QC, quickly stressing the human resources available. We present a scalable QA/QC framework in development for NEON that combines the efficiency and standardization of automated checks with the power and flexibility of human review. This framework includes fast-response monitoring of sensor health, a mobile application for electronically recording maintenance activities, traditional point-based automated quality flagging, and continuous monitoring of quality outcomes and longer-term holistic evaluations. This framework maintains the traceability of quality information along the entirety of the data generation pipeline, and explicitly links field reports of measurement interference to quality flagging. Preliminary results show that data quality can be effectively monitored and managed for a multitude of sites with a small group of QA/QC staff. Several components of this framework are open-source, including an R-Shiny application for efficiently monitoring, synthesizing, and investigating data quality issues.
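As a concrete illustration of the point-based automated quality flagging mentioned above, the sketch below applies two classic per-sample checks, a plausible-range test and a step (spike) test, to a short sensor series. This is a hypothetical minimal example, not NEON's implementation; the thresholds and values are invented.

```python
from typing import List

def range_flags(values: List[float], lo: float, hi: float) -> List[bool]:
    """Flag samples falling outside a plausible physical range."""
    return [not (lo <= v <= hi) for v in values]

def step_flags(values: List[float], max_step: float) -> List[bool]:
    """Flag samples whose jump from the previous sample exceeds max_step."""
    flags = [False]  # the first sample has no predecessor to compare against
    for prev, curr in zip(values, values[1:]):
        flags.append(abs(curr - prev) > max_step)
    return flags

# Air temperature in degrees C with two implausible points (indices 3 and 5).
temps = [21.1, 21.3, 21.2, 35.9, 21.4, -60.0]
bad = [r or s for r, s in zip(range_flags(temps, -40.0, 50.0),
                              step_flags(temps, 5.0))]
# Index 4 is also flagged because the step back down exceeds the threshold.
print(bad)  # [False, False, False, True, True, True]
```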
Dudev, Todor; Devereux, Mike; Meuwly, Markus; Lim, Carmay; Piquemal, Jean-Philip; Gresh, Nohad
2015-02-15
The alkali metal cations in the series Li(+)-Cs(+) act as major partners in a diversity of biological processes and in bioinorganic chemistry. In this article, we present the results of their calibration in the context of the SIBFA polarizable molecular mechanics/dynamics procedure. It relies on quantum-chemistry (QC) energy-decomposition analyses of their monoligated complexes with representative O-, N-, S-, and Se- ligands, performed with the aug-cc-pVTZ(-f) basis set at the Hartree-Fock level. Close agreement with QC is obtained for each individual contribution, even though the calibration involves only a limited set of cation-specific parameters. This agreement is preserved in tests on polyligated complexes with four and six O- ligands, water and formamide, indicating the transferability of the procedure. Preliminary extensions to density functional theory calculations are reported. © 2014 Wiley Periodicals, Inc.
NASA Astrophysics Data System (ADS)
Brolin, Gustav; Sjögreen Gleisner, Katarina; Ljungberg, Michael
2013-05-01
In dynamic renal scintigraphy, the main interest is the redistribution of the radiopharmaceutical as a function of time. Quality control (QC) of renal procedures often relies on phantom experiments to compare image-based results with the measurement setup. A phantom with a realistic anatomy and a time-varying activity distribution is therefore desirable. This work describes a pharmacokinetic (PK) compartment model for 99mTc-MAG3, used for defining a dynamic whole-body activity distribution within a digital phantom (XCAT) from which accurate Monte Carlo (MC)-based images can be generated for QC. Each phantom structure is assigned a time-activity curve provided by the PK model, employing parameter values consistent with MAG3 pharmacokinetics. This approach ensures that the total amount of tracer in the phantom is preserved between time points, and it allows for modifications of the pharmacokinetics in a controlled fashion. By adjusting parameter values in the PK model, different clinically realistic scenarios can be mimicked regarding, e.g., the relative renal uptake and renal transit time. Using the MC code SIMIND, a complete set of renography images, including the effects of photon attenuation, scattering, limited spatial resolution and noise, is simulated. The obtained image data can be used to evaluate quantitative techniques and computer software in clinical renography.
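The conservation property described above (total tracer preserved between time points) falls out naturally when the compartment model is integrated via explicit inter-compartment flows. The sketch below is a generic linear three-compartment toy, not the authors' MAG3 model; the rate constants k_pk and k_kb are illustrative assumptions.

```python
import numpy as np

def simulate(k_pk=0.05, k_kb=0.03, dt=1.0, t_end=1800.0):
    """Euler steps of dP/dt = -k_pk*P; dK/dt = k_pk*P - k_kb*K; dB/dt = k_kb*K."""
    n = int(t_end / dt)
    P, K, B = np.empty(n), np.empty(n), np.empty(n)
    P[0], K[0], B[0] = 1.0, 0.0, 0.0  # all activity starts in the plasma pool
    for i in range(n - 1):
        flow_pk = k_pk * P[i] * dt   # plasma -> kidney transfer this step
        flow_kb = k_kb * K[i] * dt   # kidney -> bladder transfer this step
        P[i + 1] = P[i] - flow_pk
        K[i + 1] = K[i] + flow_pk - flow_kb
        B[i + 1] = B[i] + flow_kb
    return P, K, B

P, K, B = simulate()
# Because activity only moves between compartments, the total is preserved
# at every time point, which is the property the abstract emphasizes.
assert np.allclose(P + K + B, 1.0)
```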
Quality Control of Wind Data from 50-MHz Doppler Radar Wind Profiler
NASA Technical Reports Server (NTRS)
Vacek, Austin
2016-01-01
Upper-level wind profiles obtained from a 50-MHz Doppler Radar Wind Profiler (DRWP) instrument at Kennedy Space Center are incorporated in space launch vehicle design and day-of-launch operations to assess wind effects on the vehicle during ascent. Automated and manual quality control (QC) techniques are implemented to remove spurious data in the upper-level wind profiles caused by atmospheric and non-atmospheric artifacts over the 2010-2012 period of record (POR). By combining the new quality-controlled profiles with older profiles from 1997-2009, a robust database of upper-level wind characteristics will be constructed. Statistical analysis will determine the maximum, minimum, and 95th percentile of the wind components from the DRWP profiles over the recent POR and compare them against the older database. Additionally, this study identifies the specific QC flags triggered during the QC process to understand how much data is retained and removed from the profiles.
RNA-SeQC: RNA-seq metrics for quality control and process optimization.
DeLuca, David S; Levin, Joshua Z; Sivachenko, Andrey; Fennell, Timothy; Nazaire, Marc-Danie; Williams, Chris; Reich, Michael; Winckler, Wendy; Getz, Gad
2012-06-01
RNA-seq, the application of next-generation sequencing to RNA, provides transcriptome-wide characterization of cellular activity. Assessment of sequencing performance and library quality is critical to the interpretation of RNA-seq data, yet few tools exist to address this issue. We introduce RNA-SeQC, a program which provides key measures of data quality. These metrics include yield, alignment and duplication rates; GC bias, rRNA content, regions of alignment (exon, intron and intergenic), continuity of coverage, 3'/5' bias and count of detectable transcripts, among others. The software provides multi-sample evaluation of library construction protocols, input materials and other experimental parameters. The modularity of the software enables pipeline integration and the routine monitoring of key measures of data quality such as the number of alignable reads, duplication rates and rRNA contamination. RNA-SeQC allows investigators to make informed decisions about sample inclusion in downstream analysis. In summary, RNA-SeQC provides quality control measures critical to experiment design, process optimization and downstream computational analysis. See www.genepattern.org to run online, or www.broadinstitute.org/rna-seqc/ for a command line tool.
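Two of the metrics named above, duplication rate and GC content, are simple to state precisely. The toy sketch below computes them from raw read strings; it is not RNA-SeQC's implementation, which operates on aligned data.

```python
from collections import Counter

def duplication_rate(reads):
    """Fraction of reads that are exact copies of an earlier read."""
    counts = Counter(reads)
    duplicates = sum(c - 1 for c in counts.values())
    return duplicates / len(reads) if reads else 0.0

def gc_content(reads):
    """Fraction of G/C bases over all bases in all reads."""
    bases = "".join(reads)
    return sum(b in "GC" for b in bases) / len(bases) if bases else 0.0

reads = ["ACGT", "ACGT", "GGCC", "ATAT"]
print(duplication_rate(reads))  # 0.25 (one of four reads is a duplicate)
print(gc_content(reads))        # 0.5  (8 of 16 bases are G or C)
```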
The purpose of this SOP is to define the procedures used for the initial and periodic verification and validation of computer programs used during the Arizona NHEXAS project and the Border study. Keywords: Computers; Software; QA/QC.
The U.S.-Mexico Border Program is sponsored ...
Cirillo, Daniela M.; Hoffner, Sven; Ismail, Nazir A.; Kaur, Devinder; Lounis, Nacer; Metchock, Beverly; Pfyffer, Gaby E.; Venter, Amour
2016-01-01
The aim of this study was to establish standardized drug susceptibility testing (DST) methodologies and reference MIC quality control (QC) ranges for bedaquiline, a diarylquinoline antimycobacterial, used in the treatment of adults with multidrug-resistant tuberculosis. Two tier-2 QC reproducibility studies of bedaquiline DST were conducted in eight laboratories using Clinical and Laboratory Standards Institute (CLSI) guidelines. Agar dilution and broth microdilution methods were evaluated. Mycobacterium tuberculosis H37Rv was used as the QC reference strain. Bedaquiline MIC frequency, mode, and geometric mean were calculated. When resulting data occurred outside predefined CLSI criteria, the entire laboratory data set was excluded. For the agar dilution MIC, a 4-dilution QC range (0.015 to 0.12 μg/ml) centered around the geometric mean included 95.8% (7H10 agar dilution; 204/213 observations with one data set excluded) or 95.9% (7H11 agar dilution; 232/242) of bedaquiline MICs. For the 7H9 broth microdilution MIC, a 3-dilution QC range (0.015 to 0.06 μg/ml) centered around the mode included 98.1% (207/211, with one data set excluded) of bedaquiline MICs. Microbiological equivalence was demonstrated for bedaquiline MICs determined using 7H10 agar and 7H11 agar but not for bedaquiline MICs determined using 7H9 broth and 7H10 agar or 7H9 broth and 7H11 agar. Bedaquiline DST methodologies and MIC QC ranges against the H37Rv M. tuberculosis reference strain have been established: 0.015 to 0.12 μg/ml for the 7H10 and 7H11 agar dilution MICs and 0.015 to 0.06 μg/ml for the 7H9 broth microdilution MIC. These methodologies and QC ranges will be submitted to CLSI and EUCAST to inform future research and provide guidance for routine clinical bedaquiline DST in laboratories worldwide. PMID:27654337
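The range-setting logic described above (a fixed number of consecutive two-fold dilutions centered on the geometric mean or mode) can be sketched as follows. The dilution ladder and the observed MICs here are invented for illustration and are not the study's data.

```python
import math

LADDER = [0.008, 0.015, 0.03, 0.06, 0.12, 0.25]  # ug/ml, two-fold dilution steps

def centered_qc_range(mics, width):
    """Return `width` consecutive ladder dilutions centered on the geometric mean."""
    gmean = math.exp(sum(math.log(m) for m in mics) / len(mics))
    # Nearest ladder dilution to the geometric mean, compared on a log scale.
    center = min(range(len(LADDER)), key=lambda i: abs(math.log(LADDER[i] / gmean)))
    lo = max(0, center - (width - 1) // 2)
    return LADDER[lo:lo + width]

mics = [0.03] * 10 + [0.06] * 8 + [0.015] * 4
print(centered_qc_range(mics, 4))  # [0.015, 0.03, 0.06, 0.12]
```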
NASA Technical Reports Server (NTRS)
Orcutt, John M.; Brenton, James C.
2016-01-01
An accurate database of meteorological data is essential for designing any aerospace vehicle and for preparing launch commit criteria. Meteorological instrumentation was recently placed on the three Lightning Protection System (LPS) towers at Kennedy Space Center (KSC) launch complex 39B (LC-39B), providing a unique meteorological dataset at the launch complex over an extensive altitude range. Data records of temperature, dew point, relative humidity, wind speed, and wind direction are produced at 40, 78, 116, and 139 m on each tower. The Marshall Space Flight Center Natural Environments Branch (EV44) received an archive consisting of one-minute averaged measurements for the period of record January 2011 - April 2015. However, before the database could be used, EV44 needed to remove any erroneous data through a comprehensive quality control (QC) process. The QC process applied to the LPS towers' meteorological data is similar to other QC processes developed by EV44, which were used in the creation of meteorological databases for other towers at KSC, but has been modified specifically for use with the LPS tower database. The QC process first includes a check of each individual sensor, removing any unrealistic data and checking the temporal consistency of each variable. Next, data from all three sensors at each height are checked against each other, checked against climatology, and checked for sensors that erroneously report a constant value. Then, a vertical consistency check of each variable at each tower is completed. Last, the upwind sensor at each level is selected to minimize the influence of the towers and other structures at LC-39B on the measurements; the selection process for the upwind sensor implemented a study of tower-induced turbulence. This paper describes in detail the QC process, the QC results, and the attributes of the LPS towers meteorological database.
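Two of the individual-sensor checks described above, temporal consistency and constant-value (stuck-sensor) detection, might look roughly like the sketch below. The window size and jump threshold are illustrative placeholders, not the EV44 values.

```python
def spike_check(series, max_jump):
    """Indices where the minute-to-minute change exceeds max_jump."""
    return [i for i in range(1, len(series))
            if abs(series[i] - series[i - 1]) > max_jump]

def stuck_check(series, window):
    """Indices where the sensor has repeated one value for `window` samples."""
    return [i for i in range(window - 1, len(series))
            if len(set(series[i - window + 1:i + 1])) == 1]

wind_speed = [4.2, 4.5, 4.4, 12.0, 4.3, 7.7, 7.7, 7.7, 7.7]  # m/s, one-minute data
print(spike_check(wind_speed, 5.0))  # [3, 4]: the spike and the recovery from it
print(stuck_check(wind_speed, 4))    # [8]: four identical readings in a row
```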
Raef, A.
2009-01-01
The recent proliferation of the 3D reflection seismic method into the near-surface area of geophysical applications, especially in response to the emergence of the need to comprehensively characterize and monitor near-surface carbon dioxide sequestration in shallow saline aquifers around the world, justifies the emphasis on cost-effective and robust quality control and assurance (QC/QA) workflow of 3D seismic data preprocessing that is suitable for near-surface applications. The main purpose of our seismic data preprocessing QC is to enable the use of appropriate header information, data that are free of noise-dominated traces, and/or flawed vertical stacking in subsequent processing steps. In this article, I provide an account of utilizing survey design specifications, noise properties, first breaks, and normal moveout for rapid and thorough graphical QC/QA diagnostics, which are easy to apply and efficient in the diagnosis of inconsistencies. A correlated vibroseis time-lapse 3D-seismic data set from a CO2-flood monitoring survey is used for demonstrating QC diagnostics. An important by-product of the QC workflow is establishing the number of layers for a refraction statics model in a data-driven graphical manner that capitalizes on the spatial coverage of the 3D seismic data. © China University of Geosciences (Wuhan) and Springer-Verlag GmbH 2009.
The quality control theory of aging.
Ladiges, Warren
2014-01-01
The quality control (QC) theory of aging is based on the concept that aging is the result of a reduction in QC of cellular systems designed to maintain lifelong homeostasis. Four QC systems associated with aging are 1) inadequate protein processing in a distressed endoplasmic reticulum (ER); 2) histone deacetylase (HDAC) processing of genomic histones and gene silencing; 3) suppressed AMPK nutrient sensing with inefficient energy utilization and excessive fat accumulation; and 4) beta-adrenergic receptor (BAR) signaling and environmental and emotional stress. Reprogramming these systems to maintain efficiency and prevent aging would be a rational strategy for increased lifespan and improved health. The QC theory can be tested with a pharmacological approach using three well-known and safe, FDA-approved drugs: 1) phenyl butyric acid, a chemical chaperone that enhances ER function and is also an HDAC inhibitor, 2) metformin, which activates AMPK and is used to treat type 2 diabetes, and 3) propranolol, a beta blocker which inhibits BAR signaling and is used to treat hypertension and anxiety. A critical aspect of the QC theory, then, is that aging is associated with multiple cellular systems that can be targeted with drug combinations more effectively than with single drugs. But more importantly, these drug combinations will effectively prevent, delay, or reverse chronic diseases of aging that impose such a tremendous health burden on our society.
Unanticipated error in HbA(1c) measurement on the HLC-723 G7 analyzer.
van den Ouweland, Johannes M W; de Keijzer, Marinus H; van Daal, Henny
2010-04-01
Investigation of falsely elevated HbA(1c) measurements on the HLC-723 G7 analyzer. Comparison of HbA(1c) in blood samples that were diluted either in hemolysis reagent or in water. HbA(1c) results became falsely elevated when samples were diluted in hemolysis reagent, but not in water. QC procedures failed to detect this error because calibrator and QC samples were manually diluted in water, according to the manufacturer's instructions, whereas patient samples were automatically diluted using hemolysis reagent. After replacement of the instrument's sample loop and rotor seal, comparable HbA(1c) results were obtained irrespective of dilution with hemolysis reagent or water. This case illustrates the importance of treating calibrator and QC materials the same as routine patient samples in order to prevent unnoticed drift in patient HbA(1c) results. Copyright 2010 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
Office of Student Financial Aid Quality Improvement Program: Design and Implementation Plan.
ERIC Educational Resources Information Center
Advanced Technology, Inc., Reston, VA.
The purpose and direction of the Office of Student Financial Aid (OSFA) quality improvement program are described. The background and context for the Pell Grant quality control (QC) design study and the meaning of QC are reviewed. The general approach to quality improvement consists of the following elements: a strategic approach that enables OSFA…
FASTQ quality control dashboard
DOE Office of Scientific and Technical Information (OSTI.GOV)
2016-07-25
FQCDB builds upon the existing open-source software FastQC, implementing a modern web interface across parsed FastQC output. In addition, FQCDB is extensible as a web service to include additional plots of type line, boxplot, or heatmap across data formatted according to its guidelines. The interface is also configurable via a more readable JSON format, enabling customization by non-web programmers.
Quality Circles: An Innovative Program to Improve Military Hospitals
1982-08-01
quality control. However, Dr. Kaoru Ishikawa is credited with starting the first "Quality Control Circles" and registering them with the Japanese Union of... McGregor and Abraham Maslow into a unique style of management. In 1962 Dr. Ishikawa, a professor at Tokyo University, developed the QC concept based on... RECOMMENDATIONS: Conclusions: The QC concept has come a long way since Dr. Ishikawa gave it birth in 1962. It has left an enviable record of success along its...
Guillot, Sophie; Guiso, Nicole
2016-08-01
The French National Reference Centre (NRC) for Whooping Cough carried out an external quality control (QC) analysis in 2010 for the PCR diagnosis of whooping cough. The main objective of the study was to assess the impact of this QC in the participating laboratories through a repeat analysis in 2012. Copyright © 2016, American Society for Microbiology. All Rights Reserved.
NASA Astrophysics Data System (ADS)
Sicoe, G. M.; Belu, N.; Rachieru, N.; Nicolae, E. V.
2017-10-01
Presently, in the automotive industry, the tendency is to adapt continually to change and to reflect market trends in new products, which leads to customer satisfaction. Many quality techniques have been adopted in this field for the continuous improvement of product and process quality, and advantages have been gained. The present paper focuses on the possibilities offered by the use of the Quality Assurance Matrix (QAM) and Quality Control Story (QC Story) to provide the greatest protection against nonconformities in the production process, through a case study in the automotive industry. There is a direct relationship from the QAM to a QC Story analysis: the failures identified using the QAM are treated with the QC Story methodology. Using these methods will help to decrease PPM values and increase quality performance and customer satisfaction.
Focant, Jean-François; Eppe, Gauthier; Massart, Anne-Cécile; Scholl, Georges; Pirard, Catherine; De Pauw, Edwin
2006-10-13
We report on the use of a state-of-the-art method for the measurement of selected polychlorinated dibenzo-p-dioxins, polychlorinated dibenzofurans and polychlorinated biphenyls in human serum specimens. The sample preparation procedure is based on manual small size solid-phase extraction (SPE) followed by automated clean-up and fractionation using multi-sorbent liquid chromatography columns. SPE cartridges and all clean-up columns are disposable. Samples are processed in batches of 20 units, including one blank control (BC) sample and one quality control (QC) sample. The analytical measurement is performed using gas chromatography coupled to isotope dilution high-resolution mass spectrometry. The sample throughput corresponds to one series of 20 samples per day, from sample reception to data quality cross-check and reporting, once the procedure has been started and series of samples keep being produced. Four analysts are required to ensure proper performances of the procedure. The entire procedure has been validated under International Organization for Standardization (ISO) 17025 criteria and further tested over more than 1500 unknown samples during various epidemiological studies. The method is further discussed in terms of reproducibility, efficiency and long-term stability regarding the 35 target analytes. Data related to quality control and limit of quantification (LOQ) calculations are also presented and discussed.
Purba, Fredrick Dermawan; Hunfeld, Joke A M; Iskandarsyah, Aulia; Fitriana, Titi Sahidah; Sadarjoen, Sawitri S; Passchier, Jan; Busschbach, Jan J V
2017-05-01
In valuing health states using generic questionnaires such as the EQ-5D, there are unrevealed issues with the quality of the data collection. The aims were to describe the problems encountered during valuation and to evaluate a quality control report and subsequent retraining of interviewers in improving this valuation. Data from the first 266 respondents in an EQ-5D-5L valuation study were used. Interviewers were trained and answered questions regarding problems during these initial interviews. Thematic analysis was used, and individual feedback was provided. After completion of 98 interviews, a first quantitative quality control (QC) report was generated, followed by a 1-day retraining program. Subsequently, individual feedback was also given on the basis of follow-up QCs. The Wilcoxon signed-rank test was used to assess improvements based on 7 indicators of quality, as identified in the first QC and the QC conducted after a further 168 interviews. Interviewers encountered problems in recruiting respondents. The solutions provided were: optimization of the time of interview, the use of broader networks, and the use of different scripts to explain the project's goals to respondents. For problems in the interviewing process, the solutions applied were: developing the technical and personal skills of the interviewers and stimulating the respondents' thought processes. There were also technical problems related to hardware, software, and internet connections. There was an improvement in all 7 indicators of quality after the second QC. Training before and during a study, and individual feedback on the basis of a quantitative QC, can increase the validity of values obtained from generic questionnaires.
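For the paired before/after comparison described above, a Wilcoxon signed-rank test across the 7 quality indicators could be run as in the sketch below; the indicator scores are invented stand-ins, and scipy is assumed to be available.

```python
from scipy.stats import wilcoxon

# Paired scores for the 7 quality indicators, before and after retraining.
first_qc  = [0.62, 0.55, 0.71, 0.48, 0.66, 0.59, 0.52]
second_qc = [0.81, 0.74, 0.83, 0.69, 0.79, 0.77, 0.70]

# alternative="less" tests whether first_qc - second_qc is shifted below zero,
# i.e., whether the indicators improved at the second QC.
stat, p = wilcoxon(first_qc, second_qc, alternative="less")
print(f"W = {stat}, p = {p:.4f}")
```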
Wanja, Elizabeth; Achilla, Rachel; Obare, Peter; Adeny, Rose; Moseti, Caroline; Otieno, Victor; Morang'a, Collins; Murigi, Ephantus; Nyamuni, John; Monthei, Derek R; Ogutu, Bernhards; Buff, Ann M
2017-05-25
One objective of the Kenya National Malaria Strategy 2009-2017 is scaling access to prompt diagnosis and effective treatment. In 2013, a quality assurance (QA) pilot was implemented to improve accuracy of malaria diagnostics at selected health facilities in low-transmission counties of Kenya. Trends in malaria diagnostic and QA indicator performance during the pilot are described. From June to December 2013, 28 QA officers provided on-the-job training and mentoring for malaria microscopy, malaria rapid diagnostic tests and laboratory QA/quality control (QC) practices over four 1-day visits at 83 health facilities. QA officers observed and recorded laboratory conditions and practices and cross-checked blood slides for malaria parasite presence, and a portion of cross-checked slides were confirmed by reference laboratories. Eighty (96%) facilities completed the pilot. Among 315 personnel at pilot initiation, 13% (n = 40) reported malaria diagnostics training within the previous 12 months. Slide positivity ranged from 3 to 7%. Compared to the reference laboratory, microscopy sensitivity ranged from 53 to 96% and positive predictive value from 39 to 53% for facility staff and from 60 to 96% and 52 to 80%, respectively, for QA officers. Compared to reference, specificity ranged from 88 to 98% and negative predictive value from 98 to 99% for health-facility personnel and from 93 to 99% and 99%, respectively, for QA officers. The kappa value ranged from 0.48-0.66 for facility staff and 0.57-0.84 for QA officers compared to reference. The only significant test performance improvement observed for facility staff was for specificity from 88% (95% CI 85-90%) to 98% (95% CI 97-99%). QA/QC practices, including use of positive-control slides, internal and external slide cross-checking and recording of QA/QC activities, all increased significantly across the pilot (p < 0.001). Reference material availability also increased significantly; availability of six microscopy job aids and seven microscopy standard operating procedures increased by a mean of 32 percentage points (p < 0.001) and 38 percentage points (p < 0.001), respectively. Significant gains were observed in malaria QA/QC practices over the pilot. However, these advances did not translate into improved accuracy of malaria diagnostic performance perhaps because of the limited duration of the QA pilot implementation.
Gresh, Nohad; Perahia, David; de Courcy, Benoit; Foret, Johanna; Roux, Céline; El-Khoury, Lea; Piquemal, Jean-Philip; Salmon, Laurent
2016-12-15
Zn-metalloproteins are a major class of targets for drug design. They constitute a demanding testing ground for polarizable molecular mechanics/dynamics aimed at extending the realm of quantum chemistry (QC) to very long-duration molecular dynamics (MD). The reliability of such procedures needs to be demonstrated upon comparing the relative stabilities of competing candidate complexes of inhibitors with the recognition site stabilized in the course of MD. This could be necessary when no information is available regarding the experimental structure of the inhibitor-protein complex. Thus, this study bears on the phosphomannose isomerase (PMI) enzyme, considered as a potential therapeutic target for the treatment of several bacterial and parasitic diseases. We consider its complexes with 5-phospho-d-arabinonohydroxamate and three analog ligands differing by the number and location of their hydroxyl groups. We evaluate the energy accuracy expectable from a polarizable molecular mechanics procedure, SIBFA. This is done by comparisons with ab initio quantum-chemistry (QC) calculations in the following cases: (a) the complexes of the four ligands in three distinct structures extracted from the entire PMI-ligand energy-minimized structures, and totaling up to 264 atoms; (b) the solvation energies of several energy-minimized complexes of each ligand with a shell of 64 water molecules; (c) the conformational energy differences of each ligand in different conformations characterized in the course of energy-minimizations; and (d) the continuum solvation energies of the ligands in different conformations. The agreements with the QC results appear convincing. On these bases, we discuss the prospects of applying the procedure to ligand-macromolecule recognition problems. © 2016 Wiley Periodicals, Inc.
Impact of dose calibrators quality control programme in Argentina
NASA Astrophysics Data System (ADS)
Furnari, J. C.; de Cabrejas, M. L.; del C. Rotta, M.; Iglicki, F. A.; Milá, M. I.; Magnavacca, C.; Dima, J. C.; Rodríguez Pasqués, R. H.
1992-02-01
The national Quality Control (QC) programme for radionuclide calibrators started 12 years ago. Accuracy and the implementation of a QC programme were evaluated over all these years at 95 nuclear medicine laboratories where dose calibrators were in use. During all that time, the Metrology Group of CNEA has distributed 137Cs sealed sources to check stability and has been performing periodic "checking rounds" and postal surveys using unknown samples (external quality control). An account of the results of both methods is presented. At present, more than 65% of the dose calibrators measure activities with an error of less than 10%.
[Highly quality-controlled radiation therapy].
Shirato, Hiroki
2005-04-01
Advanced radiation therapy for intracranial disease has focused on set-up accuracy for the past 15 years. However, quality control of the prescribed dose is actually as important as tumor set-up in radiation therapy. Because of the complexity of three-dimensional radiation treatment planning systems in recent years, highly quality-controlled prescription of the dose has now been reappraised as the mainstream approach to improving the treatment outcome of radiation therapy for intracranial disease. The Japanese Committee for Quality Control of Radiation Therapy has developed fundamental requirements such as a QC committee in each hospital, a medical physicist, dosimetrists (QC members), and an external audit.
Network-Centric Quantum Communications
NASA Astrophysics Data System (ADS)
Hughes, Richard
2014-03-01
Single-photon quantum communications (QC) offers "future-proof" cryptographic security rooted in the laws of physics. Today's quantum-secured communications cannot be compromised by unanticipated future technological advances. But to date, QC has only existed in point-to-point instantiations that have limited ability to address the cyber security challenges of our increasingly networked world. In my talk I will describe a fundamentally new paradigm of network-centric quantum communications (NQC) that leverages the network to bring scalable, QC-based security to user groups that may have no direct user-to-user QC connectivity. With QC links only between each of N users and a trusted network node, NQC brings quantum security to N² user pairs, and to multi-user groups. I will describe a novel integrated photonics quantum smartcard ("QKarD") and its operation in a multi-node NQC test bed. The QKarDs are used to implement the quantum cryptographic protocols of quantum identification, quantum key distribution and quantum secret splitting. I will explain how these cryptographic primitives are used to provide key management for encryption, authentication, and non-repudiation for user-to-user communications. My talk will conclude with a description of a recent demonstration that QC can meet both the security and quality-of-service (latency) requirements for electric grid control commands and data. These requirements cannot be met simultaneously with present-day cryptography.
Gubler, Hanspeter; Clare, Nicholas; Galafassi, Laurent; Geissler, Uwe; Girod, Michel; Herr, Guy
2018-06-01
We describe the main characteristics of the Novartis Helios data analysis software system (Novartis, Basel, Switzerland) for plate-based screening and profiling assays, which was designed and built about 11 years ago. It has been in productive use for more than 10 years and is one of the important standard software applications running for a large user community at all Novartis Institutes for BioMedical Research sites globally. A high degree of automation is reached by embedding the data analysis capabilities into a software ecosystem that deals with the management of samples, plates, and result data files, including automated data loading. The application provides a series of analytical procedures, ranging from very simple to advanced, which can easily be assembled by users in very flexible ways. This also includes the automatic derivation of a large set of quality control (QC) characteristics at every step. Any of the raw, intermediate, and final results and QC-relevant quantities can be easily explored through linked visualizations. Links to global assay metadata management, data warehouses, and an electronic lab notebook system are in place. Automated transfer of relevant data to data warehouses and electronic lab notebook systems are also implemented.
ENVIRONMENTAL RELEASE OF ASBESTOS/SUBSTITUTES FROM COMMERCIAL PRODUCTS USE AND DISPOSAL
For the first time, the release of respirable asbestos fibers has been quantified in terms of standard mechanical forces using widely accepted methodology and specified QA/QC procedures. Both fabrication of new products from asbestos containing materials and repair or removal of ...
Disk diffusion quality control guidelines for NVP-PDF 713: a novel peptide deformylase inhibitor.
Anderegg, Tamara R; Jones, Ronald N
2004-01-01
NVP-PDF 713 is a peptide deformylase inhibitor that has emerged as a candidate for treating Gram-positive infections and selected Gram-negative species that commonly cause community-acquired respiratory tract infections. This report summarizes the results of a multi-center (seven participants) disk diffusion quality control (QC) investigation for NVP-PDF 713 using the guidelines of the National Committee for Clinical Laboratory Standards and the standardized disk diffusion method. A total of 420 NVP-PDF 713 zone diameter values were generated for each QC organism. The proposed zone diameter ranges contained 97.6-99.8% of the reported participant results and were: Staphylococcus aureus ATCC 25923 (25-35 mm), Streptococcus pneumoniae ATCC 49619 (30-37 mm), and Haemophilus influenzae ATCC 49247 (24-32 mm). These QC criteria for the disk diffusion method should be applied during the NVP-PDF 713 clinical trials to maximize test accuracy.
Development of the QA/QC Procedures for a Neutron Interrogation System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Obhodas, Jasmina; Sudac, Davorin; Valkovic, Vladivoj
In order to perform QA/QC procedures for a system dedicated to the neutron interrogation of objects for the presence of threat materials, one needs to perform measurements of reference materials (RM) having the same (or similar) atomic ratios as the real materials. It is well known that explosives, drugs, and various other benign materials contain chemical elements such as hydrogen, oxygen, carbon and nitrogen in distinctly different quantities. For example, a high carbon-to-oxygen ratio (C/O) is characteristic of drugs. Explosives can be differentiated by measurement of both C/O and nitrogen-to-oxygen (N/O) ratios. The C/N ratio of the chemical warfare agents, coupled with the measurement of elements such as fluorine and phosphorus, clearly differentiates them from conventional explosives. Correlations between theoretical values and experimental results obtained in laboratory conditions for C/O and N/C ratios of simulants of hexogen (RDX), TNT, DLM2, TATP, cocaine, heroin, yperite, tetranitromethane, peroxide methylethyl-ketone, nitromethane and ethyleneglycol dinitrate are presented. (authors)
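The ratio-based screening idea described above reduces, at its simplest, to thresholding measured elemental ratios. The sketch below is a toy decision rule with placeholder cutoffs, not the calibrated values from this work.

```python
def classify(c_o: float, n_o: float) -> str:
    """Rough material class from carbon/oxygen and nitrogen/oxygen ratios."""
    if c_o > 5.0:                  # drugs are carbon-rich relative to oxygen
        return "drug-like"
    if n_o > 0.3 and c_o < 2.0:    # explosives carry substantial nitrogen
        return "explosive-like"
    return "benign/unknown"

print(classify(c_o=8.0, n_o=0.1))   # drug-like
print(classify(c_o=1.2, n_o=0.6))   # explosive-like
print(classify(c_o=0.5, n_o=0.05))  # benign/unknown
```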
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yung, J; Stefan, W; Reeve, D
2015-06-15
Purpose: Phantom measurements allow the performance of magnetic resonance (MR) systems to be evaluated. The American Association of Physicists in Medicine (AAPM) Report No. 100, Acceptance Testing and Quality Assurance Procedures for MR Imaging Facilities, the American College of Radiology (ACR) MR Accreditation Program MR phantom testing, and the ACR MRI quality control (QC) program documents help to outline specific tests for establishing system performance baselines as well as system stability over time. Analyzing and processing tests from multiple systems can be time-consuming for medical physicists. Besides determining whether tests are within predetermined limits or criteria, monitoring longitudinal trends can also help prevent costly downtime of systems during clinical operation. In this work, a semi-automated QC program was developed to analyze and record measurements in a database that allows easy access to historical data. Methods: Image analysis was performed on 27 different MR systems of 1.5T and 3.0T field strengths from GE and Siemens manufacturers. Recommended measurements involved the ACR MRI Accreditation Phantom, spherical homogeneous phantoms, and a phantom with a uniform hole pattern. Measurements assessed geometric accuracy and linearity, position accuracy, image uniformity, signal, noise, ghosting, transmit gain, center frequency, and magnetic field drift. The program was designed with open-source tools, employing Linux, Apache, a MySQL database, and the Python programming language for the front and back end. Results: Processing time for each image is <2 seconds. Figures are produced to show the regions of interest (ROIs) used for analysis. Historical data can be reviewed to compare against previous years and to inspect for trends. Conclusion: An MRI quality assurance and QC program is necessary for maintaining high-quality, ACR MRI-accredited MR programs. A reviewable database of phantom measurements assists medical physicists with processing and monitoring of large datasets. Longitudinal data can reveal trends that, although within passing criteria, indicate underlying system issues.
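The record-and-review pattern described above (compute a phantom metric, store it, inspect the history for trends) can be sketched as follows. SQLite stands in for the MySQL backend, the percent-image-uniformity calculation is a simplification of the ACR small-ROI procedure, and the table and field names are invented.

```python
import sqlite3
import numpy as np

def percent_image_uniformity(roi: np.ndarray) -> float:
    """PIU from the max and min signal in the ROI (simplified ACR procedure)."""
    hi, lo = float(roi.max()), float(roi.min())
    return 100.0 * (1.0 - (hi - lo) / (hi + lo))

conn = sqlite3.connect("mr_qc.db")
conn.execute("CREATE TABLE IF NOT EXISTS qc (scan_date TEXT, scanner TEXT, piu REAL)")

# A synthetic phantom ROI stands in for pixel data extracted from a DICOM image.
roi = np.random.default_rng(0).normal(loc=500.0, scale=5.0, size=(32, 32))
conn.execute("INSERT INTO qc VALUES (?, ?, ?)",
             ("2015-06-01", "scanner-07", percent_image_uniformity(roi)))
conn.commit()

# Longitudinal review: pull one scanner's history to inspect for trends.
for scan_date, piu in conn.execute(
        "SELECT scan_date, piu FROM qc WHERE scanner = 'scanner-07'"):
    print(scan_date, round(piu, 1))
```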
NASA Astrophysics Data System (ADS)
Saavedra, Juan Alejandro
Quality Control (QC) and Quality Assurance (QA) strategies vary significantly across industries in the manufacturing sector depending on the product being built. Such strategies range from simple statistical analysis and process controls to the decision-making process of reworking, repairing, or scrapping defective product. This study proposes an optimal QC methodology for including rework stations in the manufacturing process by identifying the number and location of these workstations. The factors considered in optimizing these stations are cost, cycle time, reworkability, and rework benefit. The goal is to minimize the cost and cycle time of the process while increasing reworkability and rework benefit. The specific objectives of this study are: (1) to propose a cost estimation model that includes energy consumption, and (2) to propose an optimal QC methodology to identify the quantity and location of rework workstations. The cost estimation model includes energy consumption as part of the product direct cost and allows the user to calculate product direct cost as the quality sigma level of the process changes. This provides a benefit because a complete cost estimation calculation does not need to be performed every time the process yield changes. This cost estimation model is then used for the QC strategy optimization process. In order to propose a methodology that provides an optimal QC strategy, the possible factors that affect QC were evaluated. A screening Design of Experiments (DOE) was performed on seven initial factors and identified three significant factors; it also showed that one response variable was not required for the optimization process. A full factorial DOE was then performed to verify the significant factors obtained previously. The QC strategy optimization is performed with a Genetic Algorithm (GA), which evaluates many candidate solutions in order to obtain feasible optimal ones. The GA scores possible solutions on cost, cycle time, reworkability, and rework benefit, and, because this is a multi-objective optimization problem, it returns several possible solutions. The solutions are presented as chromosomes that clearly state the number and location of the rework stations. The user analyzes these solutions and selects one by deciding which of the four factors is most important for the product being manufactured or the company's objective. The major contribution of this study is to provide the user with a methodology to identify an effective and optimal QC strategy that incorporates the number and location of rework substations in order to minimize direct product cost and cycle time and maximize reworkability and rework benefit.
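A schematic version of the GA encoding described above, where a chromosome is a bit vector marking which candidate workstations receive a rework station, is sketched below. The per-station costs and benefits, and the scalarized fitness that folds the factors into a single score, are invented stand-ins for the study's models.

```python
import random

random.seed(1)
N_STATIONS = 8
COST    = [4, 2, 5, 3, 6, 2, 4, 3]  # per-station cost of adding rework capacity
BENEFIT = [5, 1, 6, 2, 7, 1, 3, 4]  # per-station rework benefit

def fitness(chrom):
    """Scalarized objective: reward benefit, penalize cost (weights are arbitrary)."""
    return sum(b for b, g in zip(BENEFIT, chrom) if g) \
         - 0.7 * sum(c for c, g in zip(COST, chrom) if g)

def crossover(a, b):
    cut = random.randrange(1, N_STATIONS)  # single-point crossover
    return a[:cut] + b[cut:]

def mutate(chrom, rate=0.1):
    return [1 - g if random.random() < rate else g for g in chrom]

pop = [[random.randint(0, 1) for _ in range(N_STATIONS)] for _ in range(30)]
for _ in range(50):
    pop.sort(key=fitness, reverse=True)
    elite = pop[:10]  # keep the best, breed the rest from them
    pop = elite + [mutate(crossover(*random.sample(elite, 2))) for _ in range(20)]

best = max(pop, key=fitness)
print(best, fitness(best))  # 1s mark where rework stations would be placed
```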
Huang, Kai-Fa; Liaw, Su-Sen; Huang, Wei-Lin; Chia, Cho-Yun; Lo, Yan-Chung; Chen, Yi-Ling; Wang, Andrew H.-J.
2011-01-01
Aberrant pyroglutamate formation at the N terminus of certain peptides and proteins, catalyzed by glutaminyl cyclases (QCs), is linked to some pathological conditions, such as Alzheimer disease. Recently, a glutaminyl cyclase (QC) inhibitor, PBD150, was shown to be able to reduce the deposition of pyroglutamate-modified amyloid-β peptides in brain of transgenic mouse models of Alzheimer disease, leading to a significant improvement of learning and memory in those transgenic animals. Here, we report the 1.05–1.40 Å resolution structures, solved by the sulfur single-wavelength anomalous dispersion phasing method, of the Golgi-luminal catalytic domain of the recently identified Golgi-resident QC (gQC) and its complex with PBD150. We also describe the high-resolution structures of secretory QC (sQC)-PBD150 complex and two other gQC-inhibitor complexes. gQC structure has a scaffold similar to that of sQC but with a relatively wider and negatively charged active site, suggesting a distinct substrate specificity from sQC. Upon binding to PBD150, a large loop movement in gQC allows the inhibitor to be tightly held in its active site primarily by hydrophobic interactions. Further comparisons of the inhibitor-bound structures revealed distinct interactions of the inhibitors with gQC and sQC, which are consistent with the results from our inhibitor assays reported here. Because gQC and sQC may play different biological roles in vivo, the different inhibitor binding modes allow the design of specific inhibitors toward gQC and sQC. PMID:21288892
40 CFR 98.334 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2010 CFR
2010-07-01
... used for accounting purposes, including direct measurement weighing or through the use of purchase records ... same plant instruments or procedures that are used for accounting purposes (such as weigh hoppers ... density and volume measurements, etc.). Record the total mass for the materials consumed each calendar ...
Code of Federal Regulations, 2010 CFR
2010-01-01
... the Department of Agriculture (Continued) FOOD AND NUTRITION SERVICE, DEPARTMENT OF AGRICULTURE FOOD... subpart B of this part to submit arguments either in support of or against the State agency's position. (d...
Code of Federal Regulations, 2011 CFR
2011-01-01
... the Department of Agriculture (Continued) FOOD AND NUTRITION SERVICE, DEPARTMENT OF AGRICULTURE FOOD... date established for the conclusion of any discovery pursuant to § 283.29, a motion that its appeal be...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ford, Eric C., E-mail: eford@uw.edu; Terezakis, Stephanie; Souranis, Annette
Purpose: To quantify the error-detection effectiveness of commonly used quality control (QC) measures. Methods: We analyzed incidents from 2007-2010 logged into voluntary in-house electronic incident learning systems at 2 academic radiation oncology clinics. None of the incidents resulted in patient harm. Each incident was graded for potential severity using the French Nuclear Safety Authority scoring scale; high potential severity incidents (score >3) were considered, along with a subset of 30 randomly chosen low severity incidents. Each report was evaluated to identify which of 15 common QC checks could have detected it. The effectiveness was calculated, defined as the percentage of incidents that each QC measure could detect, both for individual QC checks and for combinations of checks. Results: In total, 4407 incidents were reported, 292 of which had high potential severity. High- and low-severity incidents were detectable by 4.0 ± 2.3 (mean ± SD) and 2.6 ± 1.4 QC checks, respectively (P<.001). All individual checks were less than 50% sensitive with the exception of pretreatment plan review by a physicist (63%). An effectiveness of 97% was achieved with 7 checks used in combination and was not further improved with more checks. The combination of checks with the highest effectiveness includes physics plan review, physician plan review, Electronic Portal Imaging Device-based in vivo portal dosimetry, radiation therapist timeout, weekly physics chart check, the use of checklists, port films, and source-to-skin distance checks. Some commonly used QC checks, such as pretreatment intensity modulated radiation therapy QA, do not substantially add to the ability to detect errors in these data. Conclusions: The effectiveness of QC measures in radiation oncology depends sensitively on which checks are used and in which combinations. A small percentage of errors cannot be detected by any of the standard formal QC checks currently in broad use, suggesting that further improvements are needed. These data require confirmation with a broader incident-reporting database.
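The notion of effectiveness for combinations of checks invites a greedy approximation: repeatedly add the check that detects the most incidents not yet covered. The sketch below illustrates this on a made-up miniature incident/check matrix; it is not the study's analysis.

```python
def greedy_combination(detects, n_incidents, max_checks):
    """detects maps a check name to the set of incident ids it would catch."""
    covered, chosen = set(), []
    for _ in range(max_checks):
        best = max(detects, key=lambda c: len(detects[c] - covered))
        gain = detects[best] - covered
        if not gain:
            break
        chosen.append(best)
        covered |= gain
    return chosen, len(covered) / n_incidents  # combination effectiveness

detects = {
    "physics plan review":   {0, 1, 2, 3, 6},
    "physician plan review": {1, 4, 5},
    "in vivo dosimetry":     {2, 6, 7},
    "therapist timeout":     {8},
}
print(greedy_combination(detects, n_incidents=10, max_checks=3))
# (['physics plan review', 'physician plan review', 'in vivo dosimetry'], 0.8)
```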
Production of latex agglutination reagents for pneumococcal serotyping
2013-01-01
Background The current ‘gold standard’ for serotyping pneumococci is the Quellung test. This technique is laborious and requires a certain level of training to correctly perform. Commercial pneumococcal latex agglutination serotyping reagents are available, but these are expensive. In-house production of latex agglutination reagents can be a cost-effective alternative to using commercially available reagents. This paper describes a method for the production and quality control (QC) of latex reagents, including problem solving recommendations, for pneumococcal serotyping. Results Here we describe a method for the production of latex agglutination reagents based on the passive adsorption of antibodies to latex particles. Sixty-five latex agglutination reagents were made using the PneuCarriage Project (PCP) method, of which 35 passed QC. The other 30 reagents failed QC due to auto-agglutination (n=2), no reactivity with target serotypes (n=8) or cross-reactivity with non-target serotypes (n=20). Dilution of antisera resulted in a further 27 reagents passing QC. The remaining three reagents passed QC when prepared without centrifugation and wash steps. Protein estimates indicated that latex reagents that failed QC when prepared using the PCP method passed when made with antiserum containing ≤ 500 μg/ml of protein. Sixty-one nasopharyngeal isolates were serotyped with our in-house latex agglutination reagents, with the results showing complete concordance with the Quellung reaction. Conclusions The method described here to produce latex agglutination reagents allows simple and efficient serotyping of pneumococci and may be applicable to latex agglutination reagents for typing or identification of other microorganisms. We recommend diluting antisera or removing centrifugation and wash steps for any latex reagents that fail QC. Our latex reagents are cost-effective, technically undemanding to prepare and remain stable for long periods of time, making them ideal for use in low-income countries. PMID:23379961
NASA Astrophysics Data System (ADS)
Young, K.; Voemel, H.; Morris, D.
2015-12-01
In-situ measurement systems are used to monitor the atmosphere: instruments are located in the area of interest and are in direct contact with what is being measured. Dropsondes and radiosondes are instruments used to collect high-vertical-resolution profiles of the atmosphere. Dropsondes are deployed from aircraft and, as they descend, collect pressure, temperature and humidity data at a half-second rate, and GPS wind data at a quarter-second rate. Radiosondes collect high-resolution measurements of the atmosphere from the ground to approximately 30 kilometers; carried by a large helium-filled balloon, they ascend through the atmosphere measuring pressure, temperature, relative humidity, and GPS winds at a one-second rate. Advancements in atmospheric research, technology, and data assimilation techniques have driven the need for higher quality, higher resolution radiosonde and dropsonde data at an increasingly rapid rate. These data most notably represent a valuable resource for initializing numerical prediction models, calibrating and validating satellite retrieval techniques for atmospheric profiles, and for climatological research. The In-Situ Sensing Facility at NCAR has developed an extensive, multi-step process of quality control (QC). Traditionally, QC has been a time-intensive process that involves evaluating data products using a variety of visualization tools and statistical methods. With a greater need for real-time data in the field and a reduced turn-around time for final quality-controlled data, new and improved procedures for streamlining statistical analysis and QC are being implemented. Improvements have also been made on two fronts regarding implementation of a comprehensive data management plan. The first was ensuring ease of data accessibility through an intuitive centralized data archive system that both keeps a record of data users and assigns digital object identifiers to each unique data set. The second was to define the criteria needed for documentation and metadata so that data users have all of the relevant information needed to properly use and understand the complexities of these measurements.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Boellaard, Ronald, E-mail: r.boellaard@vumc.nl; European Association of Nuclear Medicine Research Ltd., Vienna 1060; European Association of Nuclear Medicine Physics Committee, Vienna 1060
2015-10-15
Purpose: Integrated positron emission tomography/magnetic resonance (PET/MR) systems derive the PET attenuation correction (AC) from dedicated MR sequences. While MR-AC performs reasonably well in clinical patient imaging, it may fail for phantom-based quality control (QC). The authors assess the applicability of different protocols for PET QC in multicenter PET/MR imaging. Methods: The National Electrical Manufacturers Association NU 2 2007 image quality phantom was imaged on three combined PET/MR systems: a Philips Ingenuity TF PET/MR, a Siemens Biograph mMR, and a GE SIGNA PET/MR (prototype) system. The phantom was filled according to the EANM FDG-PET/CT guideline 1.0 and scanned for 5 min over 1 bed. Two MR-AC imaging protocols were tested: standard clinical procedures and a dedicated protocol for phantom tests. Depending on the system, the dedicated phantom protocol employs a two-class (water and air) segmentation of the MR data or a CT-based template. Differences in attenuation- and SUV recovery coefficients (RC) are reported. PET/CT-based simulations were performed to simulate the various artifacts seen in the AC maps (μ-map) and their impact on the accuracy of phantom-based QC. Results: Clinical MR-AC protocols caused substantial errors and artifacts in the AC maps, resulting in underestimations of the reconstructed PET activity of up to 27%, depending on the PET/MR system. Using dedicated phantom MR-AC protocols, PET bias was reduced to −8%. Mean and max SUV RC met EARL multicenter PET performance specifications for most contrast objects, but only when using the dedicated phantom protocol. Simulations confirmed the bias in experimental data to be caused by incorrect AC maps resulting from the use of clinical MR-AC protocols. Conclusions: Phantom-based quality control of PET/MR systems in a multicenter, multivendor setting may be performed with sufficient accuracy, but only when dedicated phantom acquisition and processing protocols are used for attenuation correction.
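The recovery-coefficient bookkeeping behind the specifications mentioned above is straightforward, as the sketch below shows; the acceptance band and measured values are illustrative placeholders, since the actual EARL limits vary with sphere size.

```python
TRUE_KBQ_ML = 20.0  # activity concentration the spheres were filled to

def recovery_coefficient(measured_kbq_ml: float) -> float:
    """RC = measured activity concentration / true activity concentration."""
    return measured_kbq_ml / TRUE_KBQ_ML

def passes(rc: float, lo: float = 0.76, hi: float = 1.10) -> bool:
    """Illustrative acceptance band; real EARL limits depend on sphere size."""
    return lo <= rc <= hi

for sphere_mm, measured in [(37, 19.1), (22, 17.4), (10, 12.6)]:
    rc = recovery_coefficient(measured)
    print(f"{sphere_mm} mm sphere: RC = {rc:.2f}, pass = {passes(rc)}")
```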
MO-AB-210-03: Workshop [Advancements in high intensity focused ultrasound]
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lu, Z.
The goal of this ultrasound hands-on workshop is to demonstrate advancements in high intensity focused ultrasound (HIFU) and to demonstrate quality control (QC) testing in diagnostic ultrasound. HIFU is a therapeutic modality that uses ultrasound waves as carriers of energy. HIFU is used to focus a beam of ultrasound energy into a small volume at specific target locations within the body. The focused beam causes localized high temperatures and produces well-defined regions of necrosis. This completely non-invasive technology has great potential for tumor ablation and targeted drug delivery. At the workshop, attendees will see configurations, applications, and hands-on demonstrations with on-site instructors at separate stations. The involvement of medical physicists in diagnostic ultrasound imaging service is increasing due to QC and accreditation requirements. At the workshop, an array of ultrasound testing phantoms and ultrasound scanners will be provided for attendees to learn diagnostic ultrasound QC in a hands-on environment with live demonstrations of the techniques. Target audience: Medical physicists and other medical professionals in diagnostic imaging and radiation oncology with an interest in high-intensity focused ultrasound and in diagnostic ultrasound QC. Learning objectives: Learn ultrasound physics and safety for HIFU applications through live demonstrations; get an overview of the state of the art in HIFU technologies and equipment; gain familiarity with common elements of a quality control program for diagnostic ultrasound imaging; identify QC tools available for testing diagnostic ultrasound systems and learn how to use these tools. Supporting vendors for the HIFU and diagnostic ultrasound QC hands-on workshop: Philips Healthcare; Alpinion Medical Systems; Verasonics, Inc; Zonare Medical Systems, Inc; Computerized Imaging Reference Systems (CIRS), Inc.; GAMMEX, Inc.; Cablon Medical BV. Steffen Sammet: NIH/NCI grant 5R25CA132822, NIH/NINDS grant 5R25NS080949 and Philips Healthcare research grant C32.
MO-AB-210-02: Ultrasound Imaging and Therapy-Hands On Workshop
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sammet, S.
[Abstract identical to MO-AB-210-03 above.]
MO-AB-210-01: Ultrasound Imaging and Therapy-Hands On Workshop
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lu, Z.
[Abstract identical to MO-AB-210-03 above.]
NASA Astrophysics Data System (ADS)
Susskind, J.; Rosenberg, R. I.
2016-12-01
The GEOS-5 Data Assimilation System (DAS) generates a global analysis every six hours by combining the previous six-hour forecast for that time period with contemporaneous observations. These observations include in-situ observations as well as those taken by satellite-borne instruments, such as AIRS/AMSU on EOS Aqua and CrIS/ATMS on S-NPP. Operational data assimilation methodology assimilates observed channel radiances Ri for IR sounding instruments such as AIRS and CrIS, but only for those channels i in a given scene whose radiances are thought to be unaffected by clouds. A limitation of this approach is that radiances in most tropospheric sounding channels are affected by clouds under partial cloud cover conditions, which occur most of the time. The AIRS Science Team Version-6 retrieval algorithm generates cloud-cleared radiances (CCRs) for each channel in a given scene, which represent the radiances AIRS would have observed if the scene were cloud free, and then uses them to determine quality-controlled (QC'd) temperature profiles T(p) under all cloud conditions. There are potential advantages to assimilating either AIRS QC'd CCRs or QC'd T(p) instead of Ri, in that the spatial coverage of observations is greater under partial cloud cover. We tested these two alternate data assimilation approaches by running three parallel data assimilation experiments over different time periods using GEOS-5. Experiment 1 assimilated all observations as done operationally, Experiment 2 assimilated QC'd values of AIRS CCRs in place of AIRS radiances, and Experiment 3 assimilated QC'd values of T(p) in place of observed radiances. Assimilation of QC'd AIRS T(p) resulted in significant improvement in seven-day forecast skill compared to assimilation of CCRs or assimilation of observed radiances, especially in the Southern Hemisphere Extra-tropics.
Nowik, Patrik; Bujila, Robert; Poludniowski, Gavin; Fransson, Annette
2015-07-08
The purpose of this study was to develop a method of performing routine periodical quality controls (QC) of CT systems by automatically analyzing key performance indicators (KPIs), obtainable from images of manufacturers' quality assurance (QA) phantoms. A KPI pertains to a measurable or determinable QC parameter that is influenced by other underlying fundamental QC parameters. The established KPIs are based on relationships between existing QC parameters used in the annual testing program of CT scanners at the Karolinska University Hospital in Stockholm, Sweden. The KPIs include positioning, image noise, uniformity, homogeneity, the CT number of water, and the CT number of air. An application (MonitorCT) was developed to automatically evaluate phantom images in terms of the established KPIs. The developed methodology has been used for two years in clinical routine, where CT technologists perform daily scans of the manufacturer's QA phantom and automatically send the images to MonitorCT for KPI evaluation. In cases where results were out of tolerance, actions could be initiated in less than 10 min. Over the two-year period that MonitorCT has been active, 900 QC scans from two CT scanners were collected and analyzed. Two types of errors were registered in this period: a ring artifact was discovered with the image noise test, and a calibration error was detected multiple times with the CT number test. In both cases, results were outside the tolerances defined for MonitorCT, as well as by the vendor. Automated monitoring of KPIs is a powerful tool that can be used to supplement established QC methodologies. Medical physicists and other professionals concerned with the performance of a CT system will, using such methods, have access to comprehensive data on the current and historical (trend) status of the system such that swift actions can be taken in order to ensure the quality of the CT examinations, patient safety, and minimal disruption of service.
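MonitorCT itself is not publicly described at code level; the following Python sketch illustrates the flavor of such automated KPI checks, with the ROI positions, tolerances, and 512x512 image geometry all assumed for illustration:

    import numpy as np

    # Assumed tolerances, loosely in the spirit of the KPIs named in the abstract.
    TOLERANCES = {"water_hu": (-4.0, 4.0), "noise_sd": (0.0, 6.0), "uniformity_hu": (0.0, 4.0)}

    def roi_mean_sd(image, cy, cx, r):
        """Mean and SD inside a circular ROI centred at (cy, cx) with radius r pixels."""
        yy, xx = np.ogrid[:image.shape[0], :image.shape[1]]
        mask = (yy - cy) ** 2 + (xx - cx) ** 2 <= r ** 2
        return float(image[mask].mean()), float(image[mask].std())

    def evaluate_kpis(image):
        """Return {kpi: (value, within_tolerance)} for one axial water-phantom slice."""
        centre_mean, centre_sd = roi_mean_sd(image, 256, 256, 40)
        edge_means = [roi_mean_sd(image, cy, cx, 20)[0]
                      for cy, cx in ((90, 256), (422, 256), (256, 90), (256, 422))]
        kpis = {"water_hu": centre_mean,
                "noise_sd": centre_sd,
                "uniformity_hu": max(abs(m - centre_mean) for m in edge_means)}
        return {k: (v, TOLERANCES[k][0] <= v <= TOLERANCES[k][1]) for k, v in kpis.items()}

    # e.g. evaluate_kpis(np.random.normal(0.0, 4.0, (512, 512)))  # synthetic water slice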
NASA Astrophysics Data System (ADS)
Chan, S.; Billesbach, D. P.; Hanson, C. V.; Biraud, S.
2014-12-01
The AmeriFlux quality assurance and quality control (QA/QC) technical team conducts short-term (<2 weeks) intercomparisons using a portable eddy covariance system (PECS) to maintain high-quality data observations and data consistency across the AmeriFlux network (http://ameriflux.lbl.gov/). Site intercomparisons identify discrepancies between the in situ and portable measurements and calculated fluxes. Findings are jointly discussed by the site staff and the QA/QC team to improve the in situ observations. Despite the relatively short duration of an individual site intercomparison, the accumulated record of all site visits (numbering over 100 since 2002) is a unique dataset. The ability to deploy redundant sensors provides a rare opportunity to identify, quantify, and understand uncertainties in eddy covariance and ancillary measurements. We present a few specific case studies from QA/QC site visits to highlight and share new and relevant findings related to eddy covariance instrumentation and operation.
Brummel, Olaf; Waidhas, Fabian; Bauer, Udo; Wu, Yanlin; Bochmann, Sebastian; Steinrück, Hans-Peter; Papp, Christian; Bachmann, Julien; Libuda, Jörg
2017-07-06
The two valence isomers norbornadiene (NBD) and quadricyclane (QC) enable solar energy storage in a single-molecule system. We present a new photoelectrochemical infrared reflection absorption spectroscopy (PEC-IRRAS) experiment, which allows monitoring of the complete energy storage and release cycle by in situ vibrational spectroscopy. Both processes were investigated: the photochemical conversion from NBD to QC using the photosensitizer 4,4'-bis(dimethylamino)benzophenone (Michler's ketone, MK), and the electrochemically triggered cycloreversion from QC to NBD. Photochemical conversion was obtained with characteristic conversion times on the order of 500 ms. All experiments were performed under full potential control in a thin-layer configuration with a Pt(111) working electrode. The vibrational spectra of NBD, QC, and MK were analyzed in the fingerprint region, permitting quantitative analysis of the spectroscopic data. We determined selectivities for both the photochemical conversion and the electrochemical cycloreversion and identified the critical steps that limit the reversibility of the storage cycle.
Lourens, Chris; Lindegardh, Niklas; Barnes, Karen I.; Guerin, Philippe J.; Sibley, Carol H.; White, Nicholas J.
2014-01-01
Comprehensive assessment of antimalarial drug resistance should include measurements of antimalarial blood or plasma concentrations in clinical trials and in individual assessments of treatment failure so that true resistance can be differentiated from inadequate drug exposure. Pharmacometric modeling is necessary to assess pharmacokinetic-pharmacodynamic relationships in different populations to optimize dosing. To accomplish both effectively and to allow comparison of data from different laboratories, it is essential that drug concentration measurement is accurate. Proficiency testing (PT) of laboratory procedures is necessary for verification of assay results. Within the Worldwide Antimalarial Resistance Network (WWARN), the goal of the quality assurance/quality control (QA/QC) program is to facilitate and sustain high-quality antimalarial assays. The QA/QC program consists of an international PT program for pharmacology laboratories and a reference material (RM) program for the provision of antimalarial drug standards, metabolites, and internal standards for laboratory use. The RM program currently distributes accurately weighed quantities of antimalarial drug standards, metabolites, and internal standards to 44 pharmacology, in vitro, and drug quality testing laboratories. The pharmacology PT program has sent samples to eight laboratories in four rounds of testing. WWARN technical experts have provided advice for correcting identified problems to improve performance of subsequent analysis and ultimately improved the quality of data. Many participants have demonstrated substantial improvements over subsequent rounds of PT. The WWARN QA/QC program has improved the quality and value of antimalarial drug measurement in laboratories globally. It is a model that has potential to be applied to strengthening laboratories more widely and improving the therapeutics of other infectious diseases. PMID:24777099
NASA Technical Reports Server (NTRS)
Barbre, Robert E., Jr.
2012-01-01
This paper presents the process used by the Marshall Space Flight Center Natural Environments Branch (EV44) to quality control (QC) data from the Kennedy Space Center's 50-MHz Doppler Radar Wind Profiler (DRWP) for use in vehicle wind loads and steering commands. The database has been built to mitigate limitations of using the currently archived databases from weather balloons. The DRWP database contains wind measurements from approximately 2.7-18.6 km altitude at roughly five-minute intervals for the August 1997 to December 2009 period of record, and the extensive QC process was designed to remove spurious data from various forms of atmospheric and non-atmospheric artifacts. The QC process is largely based on DRWP literature, but two new algorithms have been developed to remove data contaminated by convection and by excessive first-guess propagations from the Median Filter First Guess Algorithm. In addition to describing the automated and manual QC process in detail, this paper describes the extent of the data retained. Roughly 58% of all possible wind observations exist in the database, with approximately 100 times as many complete profile sets existing relative to the EV44 balloon databases. This increased sample of near-continuous wind profile measurements may help increase launch availability by reducing the uncertainty of wind changes during launch countdown.
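The EV44 algorithms are described in the paper rather than distributed as code; a toy Python sketch of a generic median-filter consistency check of this kind (window length and deviation threshold are assumed values, not the EV44 settings):

    import numpy as np

    def median_filter_qc(wind, window=5, max_dev=8.0):
        """Flag wind samples that deviate too far from a running-median first guess.

        wind: 1-D array of wind components (m/s) at successive altitude gates."""
        wind = np.asarray(wind, dtype=float)
        half = window // 2
        padded = np.pad(wind, half, mode="edge")
        first_guess = np.array([np.median(padded[i:i + window])
                                for i in range(wind.size)])
        return np.abs(wind - first_guess) > max_dev   # True = suspect gate

    profile = [12.1, 12.4, 35.0, 12.8, 13.1]          # 35 m/s spike is an artifact
    print(median_filter_qc(profile))                  # [False False  True False False]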
Torres, Leticia; Liu, Yue; Guitreau, Amy; Yang, Huiping; Tiersch, Terrence R
2017-12-01
Quality control (QC) is essential for reproducible and efficient functioning of germplasm repositories. However, many biomedical fish models present significant QC challenges due to small body sizes (<5 cm) and miniscule sperm volumes (<5 μL). Using minimal volumes of sperm, we evaluated in zebrafish common QC endpoints as surrogates for fertilization success along sequential steps of cryopreservation. First, concentrations of calibration bead suspensions were evaluated with a Makler® counting chamber by using different sample volumes and mixing methods. For sperm analysis, samples were initially diluted at a 1:30 ratio with Hanks' balanced salt solution (HBSS). Motility was evaluated by using different ratios of sperm and activation medium, and membrane integrity was analyzed with flow cytometry at different concentrations. Concentration and sperm motility could be confidently estimated by using volumes as small as 1 μL, whereas membrane integrity required a minimum of 2 μL (at 1 × 10⁶ cells/mL). Thus, <5 μL of sperm suspension (after dilution to 30-150 μL with HBSS) was required to evaluate sperm quality by using three endpoints. Sperm quality assessment using a combination of complementary endpoints enhances QC efforts during cryopreservation, increasing reliability and reproducibility, and reducing waste of time and resources.
The importance of quality control in validating concentrations ...
A national-scale survey of 247 contaminants of emerging concern (CECs), including organic and inorganic chemical compounds, and microbial contaminants, was conducted in source and treated drinking water samples from 25 treatment plants across the United States. Multiple methods were used to determine these CECs, including six analytical methods to measure 174 pharmaceuticals, personal care products, and pesticides. A three-component quality assurance/quality control (QA/QC) program was designed for the subset of 174 CECs which allowed us to assess and compare performances of the methods used. The three components included: 1) a common field QA/QC protocol and sample design, 2) individual investigator-developed method-specific QA/QC protocols, and 3) a suite of 46 method comparison analytes that were determined in two or more analytical methods. Overall method performance for the 174 organic chemical CECs was assessed by comparing spiked recoveries in reagent, source, and treated water over a two-year period. In addition to the 247 CECs reported in the larger drinking water study, another 48 pharmaceutical compounds measured did not consistently meet predetermined quality standards. Methodologies that did not seem suitable for these analytes are overviewed. The need to exclude analytes based on method performance demonstrates the importance of additional QA/QC protocols. This paper compares the method performance of six analytical methods used to measure 174 emerging contaminants in source and treated drinking water samples.
Kim, Sung-Su; Choi, Hyun-Jeung; Kim, Jin Ju; Kim, M Sun; Lee, In-Seon; Byun, Bohyun; Jia, Lina; Oh, Myung Ryurl; Moon, Youngho; Park, Sarah; Choi, Joon-Seok; Chae, Seoung Wan; Nam, Byung-Ho; Kim, Jin-Soo; Kim, Jihun; Min, Byung Soh; Lee, Jae Seok; Won, Jae-Kyung; Cho, Soo Youn; Choi, Yoon-La; Shin, Young Kee
2018-01-11
In clinical translational research and molecular in vitro diagnostics, a major challenge in the detection of genetic mutations is overcoming artefactual results caused by the low quality of formalin-fixed paraffin-embedded tissue (FFPET)-derived DNA (FFPET-DNA). Here, we propose the use of an 'internal quality control (iQC) index' as a criterion for judging the minimum quality of DNA for PCR-based analyses. In a pre-clinical study comparing the results from droplet digital PCR-based EGFR mutation test (ddEGFR test) and qPCR-based EGFR mutation test (cobas EGFR test), iQC index ≥ 0.5 (iQC copies ≥ 500, using 3.3 ng of FFPET-DNA [1,000 genome equivalents]) was established, indicating that more than half of the input DNA was amplifiable. Using this criterion, we conducted a retrospective comparative clinical study of the ddEGFR and cobas EGFR tests for the detection of EGFR mutations in non-small cell lung cancer (NSCLC) FFPET-DNA samples. Compared with the cobas EGFR test, the ddEGFR test exhibited superior analytical performance and equivalent or higher clinical performance. Furthermore, the iQC index is a reliable indicator of the quality of FFPET-DNA and could be used to prevent incorrect diagnoses arising from low-quality samples.
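From the numbers in the abstract, the acceptance rule reduces to simple arithmetic; a small sketch (function names are ours, not from the assay):

    def iqc_index(iqc_copies, genome_equivalents=1000):
        """iQC index = amplifiable iQC copies / input genome equivalents."""
        return iqc_copies / genome_equivalents

    def dna_quality_ok(iqc_copies, genome_equivalents=1000, threshold=0.5):
        """Accept FFPET-DNA for PCR-based testing only if at least half of the
        input template is amplifiable (iQC index >= 0.5, per the abstract)."""
        return iqc_index(iqc_copies, genome_equivalents) >= threshold

    print(dna_quality_ok(500))   # True: 500 copies / 1,000 genome equivalents -> 0.50
    print(dna_quality_ok(320))   # False: index 0.32 -> reject to avoid artefactual calls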
40 CFR 98.174 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2013 CFR
2013-07-01
... emissions using the carbon mass balance procedure in § 98.173(b)(1), you must: (1) Except as provided in... Standard Test Methods for Instrumental Determination of Carbon, Hydrogen, and Nitrogen in Laboratory... Carbon, Sulfur, Nitrogen, and Oxygen in Steel, Iron, Nickel, and Cobalt Alloys by Various Combustion and...
40 CFR 98.174 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2012 CFR
2012-07-01
... emissions using the carbon mass balance procedure in § 98.173(b)(1), you must: (1) Except as provided in... Standard Test Methods for Instrumental Determination of Carbon, Hydrogen, and Nitrogen in Laboratory... Carbon, Sulfur, Nitrogen, and Oxygen in Steel, Iron, Nickel, and Cobalt Alloys by Various Combustion and...
40 CFR 98.174 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2011 CFR
2011-07-01
... emissions using the carbon mass balance procedure in § 98.173(b)(1), you must: (1) Except as provided in... Standard Test Methods for Instrumental Determination of Carbon, Hydrogen, and Nitrogen in Laboratory... Carbon, Sulfur, Nitrogen, and Oxygen in Steel, Iron, Nickel, and Cobalt Alloys by Various Combustion and...
40 CFR 98.244 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2014 CFR
2014-07-01
... procedures specified in § 98.34(c). (b) If you use the mass balance methodology in § 98.243(c), use the... Test Methods for Instrumental Determination of Carbon, Hydrogen, and Nitrogen in Petroleum Products and... Instrumental Determination of Carbon, Hydrogen, and Nitrogen in Laboratory Samples of Coal (incorporated by...
40 CFR 98.174 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2014 CFR
2014-07-01
... emissions using the carbon mass balance procedure in § 98.173(b)(1), you must: (1) Except as provided in... Methods for Instrumental Determination of Carbon, Hydrogen, and Nitrogen in Laboratory Samples of Coal..., Nitrogen, and Oxygen in Steel, Iron, Nickel, and Cobalt Alloys by Various Combustion and Fusion Techniques...
40 CFR 98.244 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2010 CFR
2010-07-01
... procedures specified in § 98.34(c). (b) If you use the mass balance methodology in § 98.243(c), use the...) Standard Test Methods for Instrumental Determination of Carbon, Hydrogen, and Nitrogen in Petroleum... for Instrumental Determination of Carbon, Hydrogen, and Nitrogen in Laboratory Samples of Coal...
40 CFR 136.6 - Method modifications and analytical requirements.
Code of Federal Regulations, 2010 CFR
2010-07-01
... person or laboratory using a test procedure (analytical method) in this Part. (2) Chemistry of the method... (analytical method) provided that the chemistry of the method or the determinative technique is not changed... prevent efficient recovery of organic pollutants and prevent the method from meeting QC requirements, the...
Establishment of QC/QA procedures for open-graded mixes : final report.
DOT National Transportation Integrated Search
1998-09-01
The State of Oregon has employed the use of porous concrete surfaces (E- and F-mixes) since the 1970s. The use of porous mixes has increased substantially in the past five years. Previously, no work had been done to evaluate whether the quality contr...
Individualized Quality Control Plan (IQCP): Is It Value-Added for Clinical Microbiology?
Miller, Melissa B.; Hindler, Janet
2015-01-01
The Centers for Medicare & Medicaid Services (CMS) recently published their Individualized Quality Control Plan (IQCP [https://www.cms.gov/regulations-and-guidance/legislation/CLIA/Individualized_Quality_Control_Plan_IQCP.html]), which will be the only option for quality control (QC) starting in January 2016 if laboratories choose not to perform Clinical Laboratory Improvement Act (CLIA) [U.S. Statutes at Large 81(1967):533] default QC. Laboratories will no longer be able to use “equivalent QC” (EQC) or the Clinical and Laboratory Standards Institute (CLSI) standards alone for quality control of their microbiology systems. The implementation of IQCP in clinical microbiology laboratories will most certainly be an added burden, the benefits of which are currently unknown. PMID:26447112
NASA Astrophysics Data System (ADS)
Hussmann, Stephan; Lau, Wing Y.; Chu, Terry; Grothof, Markus
2003-07-01
Traditionally, the measuring or monitoring systems of manufacturing industries use sensors, computers, and screens for quality control (QC). The acquired information is fed back to the control room by wires, which - for obvious reasons - are not suitable in many environments. This paper describes a method to solve this problem by employing the new Bluetooth technology to set up a completely new system, where a total wireless solution is made feasible. This new QC system allows several line scan cameras to be connected at once to a graphical user interface (GUI) that can monitor the production process. There are many Bluetooth devices available on the market, such as cell phones, headsets, printers, PDAs, etc. However, the detailed application is a novel implementation in the industrial QC area. This paper provides details about the Bluetooth standard and why it is used (network topologies, host controller interface, data rates, etc.), the Bluetooth implementation in the microcontroller of the line scan camera, and the GUI and its features.
Analysis of glycoprotein processing in the endoplasmic reticulum using synthetic oligosaccharides.
Ito, Yukishige; Takeda, Yoichi
2012-01-01
Protein quality control (QC) in the endoplasmic reticulum (ER) comprises many steps, including folding and transport of nascent proteins as well as degradation of misfolded proteins. Recent studies have revealed that high-mannose-type glycans play a pivotal role in the QC process. To gain knowledge about the molecular basis of this process with well-defined homogeneous compounds, we achieved a convergent synthesis of high-mannose-type glycans and their functionalized derivatives. We focused on analyses of UDP-Glc: glycoprotein glucosyltransferase (UGGT) and ER Glucosidase II, which play crucial roles in glycoprotein QC; however, their specificities remain unclear. In addition, we established an in vitro assay system mimicking the in vivo condition which is highly crowded because of the presence of various biomacromolecules.
Massin, Frédéric; Huili, Cai; Decot, Véronique; Stoltz, Jean-François; Bensoussan, Danièle; Latger-Cannard, Véronique
2015-01-01
Stem cells for autologous and allogeneic transplantation are obtained from several sources, including bone marrow, peripheral blood, and cord blood. Accurate enumeration of viable CD34+ hematopoietic stem cells (HSC) is routinely used in clinical settings, especially to monitor progenitor cell mobilization and apheresis. The number of viable CD34+ HSC has also been shown to be the most critical factor in haematopoietic engraftment. The International Society for Cellular Therapy currently recommends the use of a single-platform flow cytometry system using 7-AAD as a viability dye. To move routine analysis from a BD FACSCalibur™ instrument to a BD FACSCanto™ II in accordance with ISO 15189 standard guidelines, we defined laboratory performance data of the BD™ Stem Cell Enumeration (SCE) kit on a CE-IVD system comprising a BD FACSCanto™ II flow cytometer and the BD FACSCanto™ Clinical Software. InterQC™ software, a real-time internet laboratory QC management system developed by Vitro™ and distributed by Becton Dickinson™, was also tested to monitor daily QC data, to define the internal laboratory statistics, and to compare them with those of external laboratories. Precision was evaluated with BD™ Stem Cell Control (high and low) results and the InterQC™ software, which drew Levey-Jennings curves and generated numerical statistical parameters allowing detection of potential changes in system performance as well as interlaboratory comparisons. Repeatability, linearity, and lower limits of detection were obtained with routine samples from different origins. Agreement between the BD FACSCanto™ II system and the BD FACSCalibur™ system was tested on fresh peripheral blood, freeze-thawed apheresis, fresh bone marrow, and fresh cord blood samples. Instrument measurement and staining repeatability showed acceptable variability on the different samples tested. Intra- and inter-laboratory CVs in CD34+ cell absolute counts were consistent and reproducible. Linearity analysis, established between 2 and 329 cells/μL, showed a linear relation between expected and measured counts (R²=0.97). Linear regression and Bland-Altman representations showed an excellent correlation between the two systems on samples from different sources and allowed the transfer of routine analysis from the BD FACSCalibur™ to the BD FACSCanto™ II. The BD™ SCE kit provides an accurate measure of CD34+ HSC and can be used in daily routine to optimize the enumeration of hematopoietic CD34+ stem cells by flow cytometry. Moreover, the InterQC™ system appears to be a very useful tool for daily laboratory quality monitoring and thus for accreditation.
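InterQC's internals are proprietary; as a generic illustration of the Levey-Jennings computation it automates, here is a small Python sketch (the QC values and the 1-2s rule shown are illustrative):

    import statistics

    def levey_jennings_limits(qc_values):
        """Mean and +/-1, 2, 3 SD lines for a Levey-Jennings chart."""
        mean = statistics.mean(qc_values)
        sd = statistics.stdev(qc_values)
        return {f"{s:+d}SD": mean + s * sd for s in (-3, -2, -1, 0, 1, 2, 3)}

    def flag_1_2s(qc_values):
        """Classic 1-2s warning rule: any point beyond +/-2 SD of the mean."""
        mean = statistics.mean(qc_values)
        sd = statistics.stdev(qc_values)
        return [abs(v - mean) > 2 * sd for v in qc_values]

    counts = [51, 49, 50, 54, 48, 47, 52, 41]   # illustrative CD34+ counts, cells/uL
    print(levey_jennings_limits(counts))
    print(flag_1_2s(counts))                    # only the last run breaches 1-2s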
Eder, Anne F; Dy, Beth A; DeMerse, Barbara; Wagner, Stephen J; Stramer, Susan L; O'Neill, E Mary; Herron, Ross M
2017-12-01
Apheresis technology to collect platelet (PLT) components differs among devices. We evaluated the relationship of the plateletpheresis device with bacterial contamination and reported septic transfusion reactions. Plateletpheresis was performed using Amicus (Fenwal, a Fresenius Kabi Company) or Trima (Trima Accel, TerumoBCT) from 2010 to 2014. All donations used inlet-line sample diversion and were tested by quality control (QC; Day 1) aerobic culture. Rates of bacterial contamination and septic reactions to PLTs were calculated for both devices. During the 5-year study period, plateletpheresis collections using Amicus and Trima devices totaled 1,486,888 and 671,955 donations, respectively. The rate of confirmed-positive bacterial cultures of apheresis PLT donations was significantly higher with Amicus than with Trima (252 vs. 112 per 10⁶ donations [odds ratio {OR}, 2.3; 95% confidence interval {CI}, 1.8-2.9]). Septic transfusion reactions were caused by 30 apheresis PLT units from 25 contaminated Amicus procedures and three apheresis PLT units from three contaminated Trima procedures. The overall rate of septic reactions was significantly higher with apheresis PLT components collected with Amicus than with Trima (16.8 vs. 4.5 per 10⁶ donations [OR, 3.8; 95% CI, 1.1-12.5]). All apheresis PLT components implicated in septic transfusion reactions had negative QC culture results incubated through Day 5 (i.e., false negatives). Apheresis technology affects bacterial contamination of plateletpheresis collections. The device-specific, higher rate of confirmed-positive bacterial culture results also correlated with a significantly higher rate of reported septic transfusion reactions to apheresis PLTs. © 2017 AABB.
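The reported rates and odds ratio can be reproduced approximately by back-calculating counts from the published per-10⁶ rates (so the counts below are estimates, not the study's raw tallies):

    # Counts back-calculated from the published per-10^6 rates; approximate.
    amicus_pos, amicus_n = 375, 1_486_888    # ~252 per 10^6 donations
    trima_pos, trima_n = 75, 671_955         # ~112 per 10^6 donations

    rate_amicus = 1e6 * amicus_pos / amicus_n
    rate_trima = 1e6 * trima_pos / trima_n

    # Odds ratio for a confirmed-positive QC culture, Amicus vs. Trima.
    odds_ratio = (amicus_pos / (amicus_n - amicus_pos)) / (trima_pos / (trima_n - trima_pos))

    print(f"{rate_amicus:.0f} vs {rate_trima:.0f} per 10^6; OR = {odds_ratio:.1f}")
    # -> 252 vs 112 per 10^6; OR = 2.3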
QC-ART: A tool for real-time quality control assessment of mass spectrometry-based proteomics data.
Stanfill, Bryan A; Nakayasu, Ernesto S; Bramer, Lisa M; Thompson, Allison M; Ansong, Charles K; Clauss, Therese; Gritsenko, Marina A; Monroe, Matthew E; Moore, Ronald J; Orton, Daniel J; Piehowski, Paul D; Schepmoes, Athena A; Smith, Richard D; Webb-Robertson, Bobbie-Jo; Metz, Thomas O
2018-04-17
Liquid chromatography-mass spectrometry (LC-MS)-based proteomics studies of large sample cohorts can easily require from months to years to complete. Acquiring consistent, high-quality data in such large-scale studies is challenging because of normal variations in instrumentation performance over time, as well as artifacts introduced by the samples themselves, such as those due to collection, storage and processing. Existing quality control methods for proteomics data primarily focus on post-hoc analysis to remove low-quality data that would degrade downstream statistics; they are not designed to evaluate the data in near real-time, which would allow for interventions as soon as deviations in data quality are detected. In addition to flagging analyses that demonstrate outlier behavior, evaluating how the data structure changes over time can aid in understanding typical instrument performance or identify issues such as a degradation in data quality due to the need for instrument cleaning and/or re-calibration. To address this gap for proteomics, we developed Quality Control Analysis in Real-Time (QC-ART), a tool for evaluating data as they are acquired in order to dynamically flag potential issues with instrument performance or sample quality. QC-ART has similar accuracy as standard post-hoc analysis methods with the additional benefit of real-time analysis. We demonstrate the utility and performance of QC-ART in identifying deviations in data quality due to both instrument and sample issues in near real-time for LC-MS-based plasma proteomics analyses of a sample subset of The Environmental Determinants of Diabetes in the Young cohort. We also present a case where QC-ART facilitated the identification of oxidative modifications, which are often underappreciated in proteomic experiments. Published under license by The American Society for Biochemistry and Molecular Biology, Inc.
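QC-ART itself is implemented by the authors in a statistical environment; as a language-neutral toy illustration of the core idea (flagging each incoming run against a robust baseline in near real-time), consider this Python sketch with an assumed threshold:

    import statistics

    def realtime_flag(baseline, new_value, z_cut=3.5):
        """Flag a newly acquired QC metric against a robust baseline.

        Uses median/MAD rather than mean/SD so that earlier outliers do not
        inflate the limits. z_cut is an assumed threshold, not QC-ART's."""
        med = statistics.median(baseline)
        mad = statistics.median([abs(x - med) for x in baseline]) or 1e-9
        robust_z = 0.6745 * (new_value - med) / mad
        return abs(robust_z) > z_cut, robust_z

    peptide_ids = [4210, 4175, 4302, 4250, 4188]     # per-run metric (illustrative)
    flagged, z = realtime_flag(peptide_ids, 3100)    # sudden drop in identifications
    print(flagged, round(z, 1))                      # True -> inspect the instrument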
Li, Chunhua; Lu, Ling; Wu, Xianghong; Wang, Chuanxi; Bennett, Phil; Lu, Teng; Murphy, Donald
2009-08-01
In this study, we characterized the full-length genomic sequences of 13 distinct hepatitis C virus (HCV) genotype 4 isolates/subtypes: QC264/4b, QC381/4c, QC382/4d, QC193/4g, QC383/4k, QC274/4l, QC249/4m, QC97/4n, QC93/4o, QC139/4p, QC262/4q, QC384/4r and QC155/4t. These were amplified, using RT-PCR, from the sera of patients now residing in Canada, 11 of which were African immigrants. The resulting genomes varied between 9421 and 9475 nt in length and each contains a single ORF of 9018-9069 nt. The sequences showed nucleotide similarities of 77.3-84.3 % in comparison with subtypes 4a (GenBank accession no. Y11604) and 4f (EF589160) and 70.6-72.8 % in comparison with genotype 1 (M62321/1a, M58335/1b, D14853/1c, and 1?/AJ851228) reference sequences. These similarities were often higher than those currently defined by HCV classification criteria for subtype (75.0-80.0 %) and genotype (67.0-70.0 %) division, respectively. Further analyses of the complete and partial E1 and partial NS5B sequences confirmed these 13 'provisionally assigned subtypes'.
Zamengo, Luca; Frison, Giampietro; Tedeschi, Gianpaola; Frasson, Samuela
2016-08-01
Blood alcohol concentration is the most frequent analytical determination carried out in forensic toxicology laboratories worldwide. It is usually required to assess whether an offence has been committed by comparing blood alcohol levels with specified legal limits, which can vary widely among countries. Due to the possible serious legal consequences associated with non-compliant alcohol levels, measurement uncertainty should be carefully evaluated, along with other metrological aspects which can influence the final result. The whole procedure can be time-consuming and error-generating in routine practice, increasing the risks of unreliable assessments. A software application named Ethanol WorkBook (EtWB) was developed at the authors' laboratory by using Visual Basic for Applications and MS Excel®, with the aim of providing help to forensic analysts involved in blood alcohol determinations. The program can (i) calculate measurement uncertainties and decision limits with different methodologies; (ii) assess compliance to specification limits with a guard-band approach; (iii) manage quality control (QC) data and create control charts for QC samples; (iv) create control maps from real-case data archives; (v) provide laboratory reports with graphical outputs for elaborated data; and (vi) create comprehensive searchable case archives. A typical drink-driving case is presented and discussed to illustrate the importance of a metrological approach for reliable compliance assessment and to demonstrate the software's application in routine practice. The tool is made freely available to the scientific community on request. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
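EtWB itself is an Excel/VBA workbook; the guard-band logic it implements can be sketched as follows, with the legal limit, combined uncertainty, and coverage factor all chosen for illustration:

    def decision_limit(legal_limit, u_combined, k=1.64):
        """Guard-banded decision limit: only results exceeding the legal limit
        by more than the expanded uncertainty (k * u) are called non-compliant.
        k = 1.64 gives ~95% one-sided confidence; all values are illustrative."""
        return legal_limit + k * u_combined

    legal = 0.50        # g/L, example statutory limit (jurisdictions differ)
    u = 0.025           # g/L, combined standard uncertainty (assumed)
    limit = decision_limit(legal, u)
    for bac in (0.52, 0.56):
        verdict = "non-compliant" if bac > limit else "not proven above limit"
        print(f"BAC {bac:.2f} g/L vs decision limit {limit:.3f} g/L: {verdict}")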
From field notes to data portal - An operational QA/QC framework for tower networks
NASA Astrophysics Data System (ADS)
Sturtevant, C.; Hackley, S.; Meehan, T.; Roberti, J. A.; Holling, G.; Bonarrigo, S.
2016-12-01
Quality assurance and control (QA/QC) is one of the most important yet challenging aspects of producing research-quality data. This is especially so for environmental sensor networks collecting numerous high-frequency measurement streams at distributed sites. Here, the quality issues are multi-faceted, including sensor malfunctions, unmet theoretical assumptions, and measurement interference from the natural environment. To complicate matters, there are often multiple personnel managing different sites or different steps in the data flow. For large, centrally managed sensor networks such as NEON, the separation of field and processing duties is in the extreme. Tower networks such as Ameriflux, ICOS, and NEON continue to grow in size and sophistication, yet tools for robust, efficient, scalable QA/QC have lagged. Quality control remains a largely manual process relying on visual inspection of the data. In addition, notes of observed measurement interference or visible problems are often recorded on paper without an explicit pathway to data flagging during processing. As such, an increase in network size requires a near-proportional increase in personnel devoted to QA/QC, quickly stressing the human resources available. There is a need for a scalable, operational QA/QC framework that combines the efficiency and standardization of automated tests with the power and flexibility of visual checks, and includes an efficient communication pathway from field personnel to data processors to end users. Here we propose such a framework and an accompanying set of tools in development, including a mobile application template for recording tower maintenance and an R/Shiny application for efficiently monitoring and synthesizing data quality issues. This framework seeks to incorporate lessons learned from the Ameriflux community and provide tools to aid continued network advancements.
Integrative Blood Pressure Response to Upright Tilt Post Renal Denervation
Howden, Erin J.; East, Cara; Lawley, Justin S.; Stickford, Abigail S.L.; Verhees, Myrthe; Fu, Qi
2017-01-01
BACKGROUND: Whether renal denervation (RDN) in patients with resistant hypertension normalizes blood pressure (BP) regulation in response to routine cardiovascular stimuli such as upright posture is unknown. We conducted an integrative study of BP regulation in patients with resistant hypertension who had received RDN to characterize autonomic circulatory control. METHODS: Twelve patients (60 ± 9 [SD] years, n = 10 males) who participated in the Symplicity HTN-3 trial were studied and compared to two age-matched normotensive (Norm) and hypertensive (unmedicated, HTN) control groups. BP, heart rate (HR), cardiac output (Qc), muscle sympathetic nerve activity (MSNA), and neurohormonal variables were measured supine and during 30° (5 minutes) and 60° (20 minutes) head-up tilt (HUT). Total peripheral resistance (TPR) was calculated from mean arterial pressure and Qc. RESULTS: Despite treatment with RDN and 4.8 (range, 3-7) antihypertensive medications, the RDN group had significantly higher supine systolic BP compared to Norm and HTN (149 ± 15 vs. 118 ± 6 and 108 ± 8 mm Hg, P < 0.001). When supine, RDN had higher HR, TPR, MSNA, plasma norepinephrine, and effective arterial elastance compared to Norm. Plasma norepinephrine, Qc, and HR were also higher in RDN vs. HTN. During HUT, BP remained higher in RDN, due to increases in Qc, plasma norepinephrine, and aldosterone. CONCLUSION: We provide evidence of a possible mechanism by which BP remains elevated post RDN, with the observation of increased Qc and arterial stiffness, as well as plasma norepinephrine and aldosterone levels, at approximately 2 years post treatment. These findings may be the consequence of incomplete ablation of sympathetic renal nerves or be related to other factors. PMID:28338768
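The TPR definition used in the abstract is a one-line calculation; a sketch with assumed example values (units chosen for illustration):

    def total_peripheral_resistance(map_mmhg, qc_l_min):
        """TPR = mean arterial pressure / cardiac output (Qc).

        Returns mmHg.min/L; multiply by 80 to convert to dyn.s/cm^5."""
        return map_mmhg / qc_l_min

    map_mmhg = 110.0   # mean arterial pressure, assumed example value
    qc = 5.5           # cardiac output in L/min, assumed example value
    tpr = total_peripheral_resistance(map_mmhg, qc)
    print(f"TPR = {tpr:.1f} mmHg.min/L  (= {tpr * 80:.0f} dyn.s/cm^5)")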
Mammography dosimetry using an in-house developed polymethyl methacrylate phantom.
Sharma, Reena; Sharma, Sunil Dutt; Mayya, Y S; Chourasiya, G
2012-08-01
Phantom-based measurements in mammography are well established for quality assurance (QA) and quality control (QC) procedures involving equipment performance and comparisons of X-ray machines. Polymethyl methacrylate (PMMA) is among the most suitable materials for simulating the breast. For carrying out QA/QC exercises in India, a mammographic PMMA phantom with engraved slots for holding thermoluminescence dosemeters (TLD) has been developed. The radiation transmission property of the developed phantom was compared with that of commercially available phantoms to verify its suitability for mammography dosimetry. The breast entrance exposure (BEE), mean glandular dose (MGD), percentage depth dose (PDD), percentage surface dose distribution (PSDD), calibration testing of automatic exposure control (AEC), and density control function of a mammography machine were measured using this phantom. MGD was derived from the measured BEE following two different methodologies, and the results were compared. The PDD and PSDD measurements were carried out using LiF:Mg,Cu,P chips. The in-house phantom was found comparable with the commercially available phantoms. The differences in the MGD values derived using the two methods were in the range of 17.5-32.6 %. Measured depth ranges in the phantom lie between 0.32 and 0.40 cm for 75 % depth dose, 0.73 and 0.92 cm for 50 % depth dose, and 1.54 and 1.78 cm for 25 % depth dose. A higher PSDD value was observed towards the chest-wall edge of the phantom, which is due to the orientation of the cathode-anode axis along the chest wall to nipple direction. Results obtained for AEC configuration testing show that the observed mean optical density (O.D.) of the phantom image was 1.59 and that the O.D. difference for every successive increase in phantom thickness was within ±0.15 O.D. Under density control function testing, at the -2 and -1 density settings the variation in film image O.D. was within ±0.15 O.D. of the normal density setting '0', and at the +2 and +1 density settings it was within ±0.30 O.D. This study indicates that the locally made PMMA TLD-slot phantom can be used to measure various mammography QC parameters that are essential for better outcomes in mammography.
Sho, Shonan; Court, Colin M; Winograd, Paul; Lee, Sangjun; Hou, Shuang; Graeber, Thomas G; Tseng, Hsian-Rong; Tomlinson, James S
2017-07-01
Sequencing analysis of circulating tumor cells (CTCs) enables "liquid biopsy" to guide precision oncology strategies. However, this requires low-template whole genome amplification (WGA), which is prone to errors and biases from uneven amplification. Currently, quality control (QC) methods for WGA products, as well as the number of CTCs needed for reliable downstream sequencing, remain poorly defined. We sought to define strategies for selecting and generating optimal WGA products from low-template input as they relate to potential applications in precision oncology. Single pancreatic cancer cells (HPAF-II) were isolated using laser microdissection. WGA was performed using multiple displacement amplification (MDA), multiple annealing and looping-based amplification cycles (MALBAC), and PicoPLEX. The quality of amplified DNA products was assessed using a multiplex/RT-qPCR-based method that evaluates 8 cancer-related genes, and QC scores were assigned. We utilized this scoring system to assess the impact of de novo modifications to the WGA protocol. WGA products were subjected to Sanger sequencing, array comparative genomic hybridization (aCGH), and next-generation sequencing (NGS) to evaluate their performance in the respective downstream analyses, providing validation of the QC score. Single-cell WGA products exhibited significant sample-to-sample variability in amplified DNA quality as assessed by our 8-gene QC assay. Single-cell WGA products that passed the pre-analysis QC had lower amplification bias and improved aCGH/NGS performance metrics compared to single-cell WGA products that failed the QC. Increasing cellular input improved QC scores overall, but a WGA product that consistently passed the QC step required a starting input of at least 20 cells. Our modified WGA protocol effectively reduced this number, achieving reproducible high-quality WGA products from ≥5 cells as the starting template. A starting cellular input of 5 to 10 cells amplified using the modified WGA protocol achieved aCGH and NGS results that closely matched those of unamplified, batch genomic DNA. The modified WGA protocol coupled with the 8-gene QC serves as an effective strategy to enhance the quality of low-template WGA reactions. Furthermore, a threshold number of 5-10 cells is likely needed for a reliable WGA reaction and a product with high fidelity to the original starting template.
40 CFR 98.324 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2012 CFR
2012-07-01
... Procedures and Inspection Tracking System Handbook Number: PH-08-V-1, January 1, 2008 (incorporated by... paragraphs (d)(1) through (d)(2) of this section. (1) ASTM D1945-03, Standard Test Method for Analysis of... Reformed Gas by Gas Chromatography; ASTM D4891-89 (Reapproved 2006), Standard Test Method for Heating Value...
40 CFR 98.324 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2013 CFR
2013-07-01
... Procedures and Inspection Tracking System Handbook Number: PH-08-V-1, January 1, 2008 (incorporated by... paragraphs (d)(1) through (d)(2) of this section. (1) ASTM D1945-03, Standard Test Method for Analysis of... Reformed Gas by Gas Chromatography; ASTM D4891-89 (Reapproved 2006), Standard Test Method for Heating Value...
40 CFR 98.324 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2011 CFR
2011-07-01
... Procedures and Inspection Tracking System Handbook Number: PH-08-V-1, January 1, 2008 (incorporated by... ASTM D1945-03, Standard Test Method for Analysis of Natural Gas by Gas Chromatography; ASTM D1946-90... (Reapproved 2006), Standard Test Method for Heating Value of Gases in Natural Gas Range by Stoichiometric...
40 CFR 98.244 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2012 CFR
2012-07-01
... procedures specified in § 98.34(c). (b) If you use the mass balance methodology in § 98.243(c), use the... of Carbon, Hydrogen, and Nitrogen in Petroleum Products and Lubricants (incorporated by reference..., Hydrogen, and Nitrogen in Laboratory Samples of Coal (incorporated by reference, see § 98.7). (viii) Method...
40 CFR 98.244 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2013 CFR
2013-07-01
... procedures specified in § 98.34(c). (b) If you use the mass balance methodology in § 98.243(c), use the... of Carbon, Hydrogen, and Nitrogen in Petroleum Products and Lubricants (incorporated by reference..., Hydrogen, and Nitrogen in Laboratory Samples of Coal (incorporated by reference, see § 98.7). (viii) Method...
40 CFR 98.244 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2011 CFR
2011-07-01
... procedures specified in § 98.34(c). (b) If you use the mass balance methodology in § 98.243(c), use the... of Carbon, Hydrogen, and Nitrogen in Petroleum Products and Lubricants (incorporated by reference..., Hydrogen, and Nitrogen in Laboratory Samples of Coal (incorporated by reference, see § 98.7). (viii) Method...
40 CFR 98.54 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2010 CFR
2010-07-01
... in paragraphs (b)(1) through (b)(3) of this section. (1) EPA Method 320, Measurement of Vapor Phase...) Direct measurement (such as using flow meters or weigh scales). (2) Existing plant procedures used for accounting purposes. (d) You must conduct all required performance tests according to the methods in § 98.54...
40 CFR 98.464 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2012 CFR
2012-07-01
...) Determine DOC value of a waste stream by either using at least a 60-day anaerobic biodegradation test as... biodegradation test and determine the DOC value of a waste stream following the procedures and requirements in...-based standards organization to conduct a minimum of a 60-day anaerobic biodegradation test. Consensus...
NASA Astrophysics Data System (ADS)
Christianson, D. S.; Beekwilder, N.; Chan, S.; Cheah, Y. W.; Chu, H.; Dengel, S.; O'Brien, F.; Pastorello, G.; Sandesh, M.; Torn, M. S.; Agarwal, D.
2017-12-01
AmeriFlux is a network of scientists who independently collect eddy covariance and related environmental observations at over 250 locations across the Americas. As part of the AmeriFlux Management Project, the AmeriFlux Data Team manages standardization, collection, quality assurance / quality control (QA/QC), and distribution of data submitted by network members. To generate data products that are timely, QA/QC'd, and repeatable, and have traceable provenance, we developed a semi-automated data processing pipeline. The new pipeline consists of semi-automated format and data QA/QC checks. Results are communicated via on-line reports as well as an issue-tracking system. Data processing time has been reduced from 2-3 days to a few hours of manual review time, resulting in faster data availability from the time of data submission. The pipeline is scalable to the network level and has the following key features. (1) On-line results of the format QA/QC checks are available immediately for data provider review. This enables data providers to correct and resubmit data quickly. (2) The format QA/QC assessment includes an automated attempt to fix minor format errors. Data submissions that are formatted in the new AmeriFlux FP-In standard can be queued for the data QA/QC assessment, often with minimal delay. (3) Automated data QA/QC checks identify and communicate potentially erroneous data via online, graphical quick views that highlight observations with unexpected values, incorrect units, time drifts, invalid multivariate correlations, and/or radiation shadows. (4) Progress through the pipeline is integrated with an issue-tracking system that facilitates communications between data providers and the data processing team in an organized and searchable fashion. Through development of these and other features of the pipeline, we present solutions to challenges that include optimizing automated with manual processing, bridging legacy data management infrastructure with various software tools, and working across interdisciplinary and international science cultures. Additionally, we discuss results from community member feedback that helped refine QA/QC communications for efficient data submission and revision.
The Ocean Observatories Initiative Data Management and QA/QC: Lessons Learned and the Path Ahead
NASA Astrophysics Data System (ADS)
Vardaro, M.; Belabbassi, L.; Garzio, L. M.; Knuth, F.; Smith, M. J.; Kerfoot, J.; Crowley, M. F.
2016-02-01
The Ocean Observatories Initiative (OOI) is a multi-decadal, NSF-funded program that will provide long-term, near real-time cabled and telemetered measurements of climate variability, ocean circulation, ecosystem dynamics, air-sea exchange, seafloor processes, and plate-scale geodynamics. The OOI platforms consist of seafloor sensors, fixed moorings, and mobile assets containing over 700 operational instruments in the Atlantic and Pacific oceans. Rutgers University operates the Cyberinfrastructure (CI) component of the OOI, which acquires, processes and distributes data to scientists, researchers, educators and the public. It will also provide observatory mission command and control, data assessment and distribution, and long-term data management. The Rutgers Data Management Team consists of a data manager and four data evaluators, who are tasked with ensuring data completeness and quality, as well as interaction with OOI users to facilitate data delivery and utility. Here we will discuss the procedures developed to guide the data team workflow, the automated QC algorithms and human-in-the-loop (HITL) annotations that are used to flag suspect data (whether due to instrument failures, biofouling, or unanticipated events), system alerts and alarms, long-term data storage and CF (Climate and Forecast) standard compliance, and the lessons learned during construction and the first several months of OOI operations.
Batt, Angela L; Furlong, Edward T; Mash, Heath E; Glassmeyer, Susan T; Kolpin, Dana W
2017-02-01
A national-scale survey of 247 contaminants of emerging concern (CECs), including organic and inorganic chemical compounds, and microbial contaminants, was conducted in source and treated drinking water samples from 25 treatment plants across the United States. Multiple methods were used to determine these CECs, including six analytical methods to measure 174 pharmaceuticals, personal care products, and pesticides. A three-component quality assurance/quality control (QA/QC) program was designed for the subset of 174 CECs which allowed us to assess and compare performances of the methods used. The three components included: 1) a common field QA/QC protocol and sample design, 2) individual investigator-developed method-specific QA/QC protocols, and 3) a suite of 46 method comparison analytes that were determined in two or more analytical methods. Overall method performance for the 174 organic chemical CECs was assessed by comparing spiked recoveries in reagent, source, and treated water over a two-year period. In addition to the 247 CECs reported in the larger drinking water study, another 48 pharmaceutical compounds measured did not consistently meet predetermined quality standards. Methodologies that did not seem suitable for these analytes are overviewed. The need to exclude analytes based on method performance demonstrates the importance of additional QA/QC protocols. Published by Elsevier B.V.
A real-time automated quality control of rain gauge data based on multiple sensors
NASA Astrophysics Data System (ADS)
Qi, Y.; Zhang, J.
2013-12-01
Precipitation is one of the most important meteorological and hydrological variables. Automated rain gauge networks provide direct measurements of precipitation and have been used for numerous applications such as generating regional and national precipitation maps, calibrating remote sensing data, and validating hydrological and meteorological model predictions. Automated gauge observations are prone to a variety of error sources (instrument malfunction, transmission errors, format changes) and require careful quality control (QC). Many previous gauge QC techniques were based on neighborhood checks within the gauge network itself, and their effectiveness depends on gauge density and precipitation regime. The current study takes advantage of the multi-sensor data sources in the National Mosaic and Multi-Sensor QPE (NMQ/Q2) system and develops an automated gauge QC scheme based on the consistency of radar hourly QPEs and gauge observations. Error characteristics of radar and gauge as a function of the radar sampling geometry, precipitation regime, and the freezing-level height are considered. The new scheme was evaluated by comparing an NMQ national gauge-based precipitation product with independent manual gauge observations. Twelve heavy rainfall events from different seasons and areas of the United States were selected for the evaluation, and the results show that the new NMQ product with QC'd gauges has a more physically realistic spatial distribution than the old product. The new product also agrees much better statistically with the independent gauges.
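The operational NMQ/Q2 scheme is more elaborate than this, but the underlying radar-gauge consistency idea can be sketched as follows (both tolerances are assumed for illustration, not the operational settings):

    def gauge_radar_qc(gauge_mm, radar_mm, abs_tol=2.0, ratio_tol=3.0):
        """Flag an hourly gauge value that is inconsistent with collocated
        radar QPE. Tolerances are illustrative only."""
        diff = abs(gauge_mm - radar_mm)
        if diff <= abs_tol:                      # small absolute differences pass
            return False
        hi, lo = max(gauge_mm, radar_mm), max(min(gauge_mm, radar_mm), 0.01)
        return hi / lo > ratio_tol               # flag large relative disagreement

    print(gauge_radar_qc(0.0, 12.5))   # True: gauge likely clogged or stuck
    print(gauge_radar_qc(10.2, 8.9))   # False: consistent within tolerance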
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vile, D; Zhang, L; Cuttino, L
2016-06-15
Purpose: To create a quality assurance program based upon a risk-based assessment of a newly implemented SirSpheres Y-90 procedure. Methods: A process map was created for a newly implemented SirSpheres procedure at a community hospital. The process map documented each step of this collaborative procedure, as well as the roles and responsibilities of each member. From the process map, potential failure modes were determined, along with any current controls in place. From this list, a full failure mode and effects analysis (FMEA) was performed by grading each failure mode's likelihood of occurrence, likelihood of detection, and potential severity. These numbers were then multiplied to compute the risk priority number (RPN) for each potential failure mode. Failure modes were then ranked by their RPN. Additional controls were then added, with the failure modes corresponding to the highest RPNs taking priority. Results: A process map was created that succinctly outlined each step in the SirSpheres procedure in its current implementation. From this, 72 potential failure modes were identified and ranked according to their associated RPN. Quality assurance controls and safety barriers were then added, with the failure modes associated with the highest risk addressed first. Conclusion: A quality assurance program was created from a risk-based assessment of the SirSpheres process. Process mapping and FMEA were effective in identifying potential high-risk failure modes for this new procedure, which were prioritized for new quality assurance controls. TG 100 recommends the fault tree analysis methodology to design a comprehensive and effective QC/QM program, yet we found that simply introducing additional safety barriers to address high-RPN failure modes makes the whole process simpler and safer.
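The RPN ranking at the heart of the FMEA is straightforward to compute; a sketch with invented failure modes and gradings (not items from the hospital's actual process map):

    # Minimal FMEA ranking sketch; descriptions and 1-10 gradings are invented examples.
    failure_modes = [
        # (description, occurrence, detectability, severity)
        ("Wrong prescribed activity entered", 3, 5, 9),
        ("Catheter position shifts before delivery", 4, 6, 8),
        ("Dose vial mislabeled by supplier", 2, 7, 10),
    ]

    # RPN = occurrence x detectability x severity; rank highest first.
    ranked = sorted(((o * d * s, name) for name, o, d, s in failure_modes), reverse=True)
    for rpn, name in ranked:
        print(f"RPN {rpn:3d}  {name}")   # highest-RPN modes get new QC barriers first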
SU-D-201-04: Evaluation of Elekta Agility MLC Performance Using Statistical Process Control
DOE Office of Scientific and Technical Information (OSTI.GOV)
Meyers, SM; Balderson, MJ; Letourneau, D
2016-06-15
Purpose: To evaluate the performance and stability of the Elekta Agility MLC model using an automated quality control (QC) test in combination with statistical process control tools. Methods: Leaf positions were collected daily for 11 Elekta units over 5-19 months using the automated QC test, which analyzes 23 MV images to determine the location of MLC leaves relative to the radiation isocenter. The leaf positions are measured at 5 nominal positions, and images are acquired at collimator 0° and 180° to capture all MLC leaves in the field of view. Leaf positioning accuracy was assessed using individual and moving range control charts. Control limits were recomputed following MLC recalibration (which occurred 1-2 times for 4 units). Specification levels of ±0.5, ±1 and ±1.5 mm were tested. The mean and range of the duration between out-of-control and out-of-specification events were determined. Results: Leaf position varied little over time, as confirmed by very tight individual control limits (mean ±0.19 mm, range 0.09-0.44). Mean leaf position error was -0.03 mm (range -0.89-0.83). Due to sporadic out-of-control events, the mean in-control duration was 3.3 days (range 1-23). Data stayed within the ±1 mm specification for 205 days on average (range 3-372) and within ±1.5 mm for the entire date range. Measurements stayed within ±0.5 mm for 1 day on average (range 0-17); however, our MLC leaves were not calibrated to this level of accuracy. Conclusion: The Elekta Agility MLC model was found to perform with high stability, as evidenced by the tight control limits. The in-specification durations support the current recommendation of monthly MLC QC tests with a ±1 mm tolerance. Future work is ongoing to determine whether Agility performance can be optimized further by using high-frequency QC test results to drive recalibration frequency. Factors that can affect leaf positioning accuracy, including beam spot motion, leaf gain calibration, drifting leaves, and image artifacts, are under investigation.
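The abstract uses individual (I) and moving-range (MR) control charts; a minimal sketch of the standard I-MR limit computation is given below. The textbook constants 2.66 and 3.267 for moving ranges of subgroup size 2 are assumed, and the leaf-position data are invented, not the Agility measurements.

```python
# Hedged sketch: individual and moving-range (I-MR) control limits.
import statistics

errors = [-0.05, 0.02, -0.08, 0.01, -0.03, 0.04, -0.06, 0.00, -0.02, 0.03]  # mm, invented

mr = [abs(b - a) for a, b in zip(errors, errors[1:])]   # moving ranges, n = 2
mr_bar = statistics.mean(mr)
x_bar = statistics.mean(errors)

# Standard I-MR constants for subgroup size 2: 3/d2 = 2.66, D4 = 3.267
i_ucl, i_lcl = x_bar + 2.66 * mr_bar, x_bar - 2.66 * mr_bar
mr_ucl = 3.267 * mr_bar

print(f"I chart:  center={x_bar:+.3f}  limits=({i_lcl:+.3f}, {i_ucl:+.3f}) mm")
print(f"MR chart: center={mr_bar:.3f}  UCL={mr_ucl:.3f} mm")
# Points outside the I-chart limits are 'out-of-control' events; limits are
# recomputed after an MLC recalibration, as in the study.
```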
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, T; Graham, C L; Sundsmo, T
This procedure provides instructions for the calibration and use of the Canberra iSolo Low Background Alpha/Beta Counting System (iSolo) that is used for counting air filters and swipe samples. This detector is capable of providing radioisotope identification (e.g., it can discriminate between radon daughters and plutonium). This procedure includes step-by-step instructions for: (1) performing periodic or daily 'Background' and 'Efficiency QC' checks; (2) setting up the iSolo for counting swipes and air filters; (3) counting swipes and air filters for alpha and beta activity; and (4) annual calibration.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ma, Biao; Yamaguchi, Keiichi; Fukuoka, Mayuko
To accelerate the logical drug design procedure, we created the program "NAGARA," a plugin for PyMOL, and applied it to the discovery of small compounds called medical chaperones (MCs) that stabilize the cellular form of the prion protein (PrP^C). In NAGARA, we constructed a single platform to unify docking simulation (DS), free energy calculation by molecular dynamics (MD) simulation, and interfragment interaction energy (IFIE) calculation by quantum chemistry (QC) calculation. NAGARA also enables large-scale parallel computing via a convenient graphical user interface. Here, we demonstrated its performance and its broad applicability from drug discovery to lead optimization with full compatibility with various experimental methods including Western blotting (WB) analysis, surface plasmon resonance (SPR), and nuclear magnetic resonance (NMR) measurements. Combining DS and WB, we discovered anti-prion activities for two compounds and tegobuvir (TGV), a non-nucleoside non-structural protein NS5B polymerase inhibitor showing activity against hepatitis C virus genotype 1. Binding profiles predicted by MD and QC are consistent with those obtained by SPR and NMR. Free energy analyses showed that these compounds stabilize the PrP^C conformation by decreasing its conformational fluctuation. Because TGV has already been approved as a medicine, its extension to prion diseases is straightforward. Finally, we evaluated the affinities of the fragmented regions of TGV using QC and found a clue for its further optimization. By repeating WB, MD, and QC recursively, we were able to obtain the optimum lead structure. Highlights: • NAGARA integrates docking simulation, molecular dynamics, and quantum chemistry. • We found many compounds, e.g., tegobuvir (TGV), that exhibit anti-prion activities. • We obtained insights into the action mechanism of TGV as a medical chaperone. • Using QC, we obtained useful information for optimization of the lead compound, TGV. • NAGARA is a convenient platform for drug discovery and lead optimization.
Protecting the proteome: Eukaryotic cotranslational quality control pathways
2014-01-01
The correct decoding of messenger RNAs (mRNAs) into proteins is an essential cellular task. The translational process is monitored by several quality control (QC) mechanisms that recognize defective translation complexes in which ribosomes are stalled on substrate mRNAs. Stalled translation complexes occur when defects in the mRNA template, the translation machinery, or the nascent polypeptide arrest the ribosome during translation elongation or termination. These QC events promote the disassembly of the stalled translation complex and the recycling and/or degradation of the individual mRNA, ribosomal, and/or nascent polypeptide components, thereby clearing the cell of improper translation products and defective components of the translation machinery. PMID:24535822
Bujila, Robert; Poludniowski, Gavin; Fransson, Annette
2015-01-01
The purpose of this study was to develop a method of performing routine periodic quality controls (QC) of CT systems by automatically analyzing key performance indicators (KPIs) obtainable from images of manufacturers' quality assurance (QA) phantoms. A KPI pertains to a measurable or determinable QC parameter that is influenced by other underlying fundamental QC parameters. The established KPIs are based on relationships between existing QC parameters used in the annual testing program of CT scanners at the Karolinska University Hospital in Stockholm, Sweden. The KPIs include positioning, image noise, uniformity, homogeneity, the CT number of water, and the CT number of air. An application (MonitorCT) was developed to automatically evaluate phantom images in terms of the established KPIs. The developed methodology has been used for two years in clinical routine: CT technologists perform daily scans of the manufacturer's QA phantom and automatically send the images to MonitorCT for KPI evaluation. In cases where results were out of tolerance, actions could be initiated in less than 10 min. 900 QC scans from two CT scanners were collected and analyzed over the two-year period that MonitorCT has been active. Two types of errors were registered in this period: a ring artifact was discovered with the image noise test, and a calibration error was detected multiple times with the CT number test. In both cases, results were outside the tolerances defined for MonitorCT, as well as those defined by the vendor. Automated monitoring of KPIs is a powerful tool that can be used to supplement established QC methodologies. Medical physicists and other professionals concerned with the performance of a CT system will, using such methods, have access to comprehensive data on the current and historical (trend) status of the system, such that swift actions can be taken in order to ensure the quality of the CT examinations, patient safety, and minimal disruption of service. PACS numbers: 87.57.C-, 87.57.N-, 87.57.Q-. PMID:26219012
[IMPLEMENTATION OF A QUALITY MANAGEMENT SYSTEM IN A NUTRITION UNIT ACCORDING TO ISO 9001:2008].
Velasco Gimeno, Cristina; Cuerda Compés, Cristina; Alonso Puerta, Alba; Frías Soriano, Laura; Camblor Álvarez, Miguel; Bretón Lesmes, Irene; Plá Mestre, Rosa; Izquierdo Membrilla, Isabel; García-Peris, Pilar
2015-09-01
The implementation of quality management systems (QMS) in the health sector has made great progress in recent years and remains a key tool for the management and improvement of the services provided to patients. To describe the process of implementing a quality management system (QMS) according to the standard ISO 9001:2008 in a Nutrition Unit. The implementation began in October 2012. The Nutrition Unit was supported by the Hospital Preventive Medicine and Quality Management Service (PMQM). Initially, training sessions on QMS and ISO standards were held for staff. A Quality Committee (QC) was established with representation of the medical and nursing staff. Weekly meetings took place among members of the QC and PMQM to define processes, procedures and quality indicators. A 2-month follow-up of these documents was carried out after their validation. A total of 4 processes were identified and documented (Nutritional status assessment, Nutritional treatment, Monitoring of nutritional treatment, and Planning and control of oral feeding), along with 13 operating procedures in which all the activity of the Unit was described. The interactions among them were defined in the process map. Each process has associated specific quality indicators for measuring the state of the QMS and identifying opportunities for improvement. All the documents associated with requirements of ISO 9001:2008 were developed: quality policy, quality objectives, quality manual, document and record control, internal audit, nonconformities, and corrective and preventive actions. The Unit was certified by AENOR in April 2013. The implementation of a QMS causes a reorganization of the activities of the Unit in order to meet customers' expectations. Documenting these activities ensures a better understanding of the organization, defines the responsibilities of all staff, and brings better management of time and resources. A QMS also improves internal communication and is a motivational element. Exploring the satisfaction and expectations of patients makes it possible to include their views in the design of care processes. Copyright AULA MEDICA EDICIONES 2014. Published by AULA MEDICA. All rights reserved.
CARINA data synthesis project: pH data scale unification and cruise adjustments
NASA Astrophysics Data System (ADS)
Velo, A.; Pérez, F. F.; Lin, X.; Key, R. M.; Tanhua, T.; de La Paz, M.; Olsen, A.; van Heuven, S.; Jutterström, S.; Ríos, A. F.
2010-05-01
Data on carbon and carbon-relevant hydrographic and hydrochemical parameters from 188 previously non-publicly available cruise data sets in the Arctic Mediterranean Seas (AMS), Atlantic Ocean and Southern Ocean have been retrieved and merged into a new database: CARINA (CARbon IN the Atlantic Ocean). These data have gone through rigorous quality control (QC) procedures to assure the highest possible quality and consistency. The data for most of the measured parameters in the CARINA database were objectively examined in order to quantify systematic differences in the reported values. Systematic biases found in the data have been corrected in the data products: three merged data files with measured, calculated and interpolated data for each of the three CARINA regions (AMS, Atlantic Ocean and Southern Ocean). Of the 188 cruise entries in the CARINA database, 59 reported measured pH values. All reported pH data have been unified to the Sea-Water Scale (SWS) at 25 °C. Here we present details of the secondary QC of pH in the CARINA database and the scale unification to SWS at 25 °C. The pH scale has been converted for 36 cruises. Procedures of quality control, including crossover analysis between cruises and inversion analysis, are described. Adjustments were applied to the pH values for 21 of the cruises in the CARINA dataset. With these adjustments, the CARINA database is consistent both internally and with the GLODAP data, an oceanographic data set based on the World Hydrographic Program in the 1990s. Based on our analysis, we estimate the internal consistency of the CARINA pH data to be 0.005 pH units. The CARINA data are now suitable for accurate assessments of, for example, oceanic carbon inventories and uptake rates, for ocean acidification assessment and for model validation.
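The crossover analysis itself is not specified in the abstract; a minimal, hedged sketch of the underlying idea, comparing deep-water pH from two cruises at a crossover location, is shown below. The depth cutoff, the averaging, and the sample data are assumptions for illustration, not the CARINA procedure.

```python
# Hedged sketch of a crossover comparison: estimate the systematic pH offset
# between two cruises from their deep-water measurements at a crossover site.
import statistics

def deep_mean(profile, min_depth_m=1500.0):
    """Mean pH below min_depth_m, where natural variability is small."""
    values = [ph for depth, ph in profile if depth >= min_depth_m]
    return statistics.mean(values)

# (depth in m, pH) pairs, invented
cruise_a = [(1000, 7.920), (1600, 7.905), (2200, 7.901), (3000, 7.898)]
cruise_b = [(1200, 7.915), (1700, 7.911), (2400, 7.907), (3100, 7.904)]

offset = deep_mean(cruise_a) - deep_mean(cruise_b)
print(f"crossover offset: {offset:+.4f} pH units")
# An offset exceeding the target consistency (e.g., ~0.005 pH units) would
# suggest applying an additive adjustment to one cruise's pH values.
```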
General Quality Control (QC) Guidelines for SAM Methods
Learn more about quality control guidelines and recommendations for the analysis of samples using the methods listed in EPA's Selected Analytical Methods for Environmental Remediation and Recovery (SAM).
Spectrally high performing quantum cascade lasers
NASA Astrophysics Data System (ADS)
Toor, Fatima
Quantum cascade (QC) lasers are versatile semiconductor light sources that can be engineered to emit light of almost any wavelength in the mid- to far-infrared (IR) and terahertz region from 3 to 300 μm [1-5]. Furthermore, QC laser technology in the mid-IR range has great potential for applications in environmental, medical and industrial trace gas sensing [6-10], since several chemical vapors have strong rovibrational frequencies in this range and are uniquely identifiable by their absorption spectra through optical probing of absorption and transmission. Therefore, having a wide range of mid-IR wavelengths in a single QC laser source would greatly increase the specificity of QC laser-based spectroscopic systems, and also make them more compact and field deployable. This thesis presents work on several different approaches to multi-wavelength QC laser sources that take advantage of band-structure engineering and the unipolar nature of QC lasers. Also, since lasers with narrow linewidth are needed for chemical sensing, work is presented on a single-mode distributed feedback (DFB) QC laser. First, a compact four-wavelength QC laser source is presented, based on a 2-by-2 module design with two waveguides, each containing QC laser stacks for two different emission wavelengths: one with 7.0 μm/11.2 μm, and the other with 8.7 μm/12.0 μm. This is the first design of a four-wavelength QC laser source with widely different emission wavelengths that uses minimal optics and electronics. Second, since there are still several unknown factors that affect QC laser performance, results are presented on a first-ever study to determine the effects of waveguide side-wall roughness on QC laser performance using the two-wavelength waveguides. The results are consistent with Rayleigh scattering effects in the waveguides, with roughness affecting shorter wavelengths more than longer wavelengths. Third, a versatile time-multiplexed multi-wavelength QC laser system is presented that emits at λ = 10.8 μm for positive and λ = 8.6 μm for negative polarity current with microsecond time delay. Such a system is the first demonstration of a time- and wavelength-multiplexed system that uses a single QC laser. Fourth, work is presented on the design and fabrication of a single-mode distributed feedback (DFB) QC laser emitting at λ ≈ 7.7 μm to be used in a QC laser-based photoacoustic sensor. The DFB QC laser had a temperature tuning coefficient of 0.45 nm/K over a temperature range of 80 K to 320 K, and a side-mode suppression ratio of greater than 30 dB. Finally, a study of the lateral mode patterns of wide-ridge QC lasers is presented. The results include the observation of degenerate and non-degenerate lateral modes in wide-ridge QC lasers emitting at λ ≈ 5.0 μm. This study was conducted with the end goal of using wide-ridge QC lasers in a novel technique to spatiospectrally combine multiple transverse modes to obtain an ultra-high-power single-spot QC laser beam.
2012-09-30
Briefing for aircraft operations in Diego Garcia, with reports posted on the EOL field catalog in real time (http://catalog.eol.ucar.edu/cgi-bin/dynamo/report...index); • dropsonde data processing on all P3 flights and real-time QC/reporting to GTS; and • science summaries of aircraft missions posted on EOL ... Data analysis, work with EOL on data quality control (QC), and participation in the DYNAMO Sounding Workshop at EOL/NCAR from 6-7 February 2012.
40 CFR 98.344 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2012 CFR
2012-07-01
... minutes between samples and determine the methane composition of the landfill gas using one of the methods.... ER30OC09.136 Where: CCH4 = Methane concentration in the landfill gas (volume %) for use in Equation HH-4 of... procedures used to ensure the accuracy of the estimates of disposal quantities and, if applicable, gas flow...
40 CFR 98.344 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2014 CFR
2014-07-01
... minutes between samples and determine the methane composition of the landfill gas using one of the methods.... ER30OC09.136 Where: CCH4 = Methane concentration in the landfill gas (volume %) for use in Equation HH-4 of... landfill gas (volume %, dry basis). (f) The owner or operator shall document the procedures used to ensure...
40 CFR 98.344 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2013 CFR
2013-07-01
... minutes between samples and determine the methane composition of the landfill gas using one of the methods.... ER30OC09.136 Where: CCH4 = Methane concentration in the landfill gas (volume %) for use in Equation HH-4 of... procedures used to ensure the accuracy of the estimates of disposal quantities and, if applicable, gas flow...
NASA Astrophysics Data System (ADS)
Shoji, Yoshinori; Sato, Kazutoshi; Yabuki, Masanori; Tsuda, Toshitaka
2017-11-01
We installed two global navigation satellite system (GNSS) antennas on a research vessel, the RYOFU MARU of the Japan Meteorological Agency, and conducted experimental observations to assess GNSS-derived precipitable water vapor (PWV) from October 19, 2016, to August 6, 2017. One antenna was set on the mast (MAST), while the other antenna was set on the upper deck (DECK). The GNSS analysis was conducted using the precise point positioning (PPP) procedure with a real-time GNSS orbit. A quality control (QC) procedure based on the amount of zenith tropospheric delay (ZTD) time variation was proposed. After the QC was applied, the retrieved PWVs were compared to 77 radiosonde observations. The PWVs of MAST agreed with the radiosonde observations with a 1.7 mm root mean square (RMS) difference, a -0.7 mm bias, and a 3.6% rejection rate, while those of DECK showed a 3.2 mm RMS difference, a -0.8 mm bias, and a 15.7% rejection rate. The larger RMS and higher rejection rate of DECK imply a stronger multipath effect on the deck. The differences between the GNSS PWV and the radiosonde observations were compared to the atmospheric delay, the estimated altitude of the GNSS antenna, the vessel's moving speed, the wind speed, and the wave height. The atmospheric delay and the GNSS antenna altitude showed moderate correlation with the differences. The results suggest the kinematic PPP's potential for practical water vapor monitoring over oceans worldwide. At the same time, the negative biases growing with the PWV value and with the estimated antenna altitude suggest that separating the signal delay from the vertical coordinate becomes more difficult under high-humidity conditions.
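The paper's QC criterion is based on the amount of ZTD time variation; the exact threshold is not given in the abstract, so the sketch below uses an assumed rate limit to flag epochs whose ZTD changes too fast to be physically plausible.

```python
# Hedged sketch: reject epochs whose zenith tropospheric delay (ZTD) changes
# faster than a plausible rate. The 2.0 mm/min threshold is an assumption,
# not the value used in the study.
def qc_ztd(times_min, ztd_mm, max_rate_mm_per_min=2.0):
    """Return a keep/reject flag per epoch based on |dZTD/dt|."""
    keep = [True] * len(ztd_mm)
    last = 0                           # index of the last accepted epoch
    for i in range(1, len(ztd_mm)):
        dt = times_min[i] - times_min[last]
        rate = abs(ztd_mm[i] - ztd_mm[last]) / dt
        if rate > max_rate_mm_per_min:
            keep[i] = False            # jump too large: likely multipath/outlier
        else:
            last = i
    return keep

t = [0, 5, 10, 15, 20]                           # minutes
ztd = [2450.0, 2451.5, 2483.0, 2452.5, 2453.0]   # mm; epoch 2 jumps by >30 mm
print(qc_ztd(t, ztd))                            # [True, True, False, True, True]
```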
Glutaminyl Cyclase Knock-out Mice Exhibit Slight Hypothyroidism but No Hypogonadism
Schilling, Stephan; Kohlmann, Stephanie; Bäuscher, Christoph; Sedlmeier, Reinhard; Koch, Birgit; Eichentopf, Rico; Becker, Andreas; Cynis, Holger; Hoffmann, Torsten; Berg, Sabine; Freyse, Ernst-Joachim; von Hörsten, Stephan; Rossner, Steffen; Graubner, Sigrid; Demuth, Hans-Ulrich
2011-01-01
Glutaminyl cyclases (QCs) catalyze the formation of pyroglutamate (pGlu) residues at the N terminus of peptides and proteins. Hypothalamic pGlu hormones, such as thyrotropin-releasing hormone and gonadotropin-releasing hormone, are essential for the regulation of metabolism and fertility in the hypothalamic-pituitary-thyroid and -gonadal axes, respectively. Here, we analyzed the consequences of constitutive genetic QC ablation on endocrine functions and on the behavior of adult mice. Adult homozygous QC knock-out mice are fertile and behave indistinguishably from wild-type mice in tests of motor function, cognition, general activity, and ingestion behavior. The QC knock-out results in a dramatic drop of enzyme activity in the brain, especially in the hypothalamus, and in plasma. Other peripheral organs like liver and spleen still contain QC activity, which is most likely attributable to its homolog isoQC. The serum gonadotropin-releasing hormone, TSH, and testosterone concentrations were not changed by QC depletion. Serum thyroxine was decreased by 24% in homozygous QC knock-out animals, suggesting mild hypothyroidism. QC knock-out mice were indistinguishable from wild type with regard to blood glucose and glucose tolerance, thus differing significantly from reports of thyrotropin-releasing hormone knock-out mice. The results suggest a significant formation of the hypothalamic pGlu hormones by alternative mechanisms, such as spontaneous cyclization or conversion by isoQC. The different effects of QC depletion on the hypothalamic-pituitary-thyroid and -gonadal axes might indicate slightly different modes of substrate conversion by the two enzymes. The absence of significant abnormalities in QC knock-out mice suggests the presence of a therapeutic window for suppression of QC activity in current drug development. PMID:21330373
Oliveira, M; Lopez, G; Geambastiani, P; Ubeda, C
2018-05-01
A quality assurance (QA) program is a valuable tool for the continuous production of optimal-quality images. The aim of this paper is to assess newly developed automatic computer software for image quality (IQ) evaluation in fluoroscopy X-ray systems. Test object images were acquired using one fluoroscopy system, a Siemens Axiom Artis model (Siemens AG, Medical Solutions, Erlangen, Germany). The software was developed as an ImageJ plugin. Two image quality parameters were assessed: high-contrast spatial resolution (HCSR) and signal-to-noise ratio (SNR). The times required for manual and automatic image quality assessment procedures were compared. The paired t-test was used to assess the data; p values of less than 0.05 were considered significant. The Fluoro-QC software generated IQ evaluation results faster (mean = 0.31 ± 0.08 min) than the manual procedure (mean = 4.68 ± 0.09 min). The mean difference between techniques was 4.36 min. Discrepancies were identified in the region of interest (ROI) areas drawn manually, with evidence of user dependence. The new software presented the results of the two tests (HCSR = 3.06, SNR = 5.17) and also collected information from the DICOM header. Significant differences were not identified between manual and automatic measures of SNR (p value = 0.22) or HCSR (p value = 0.46). The Fluoro-QC software is a feasible, fast and free-to-use method for evaluating image quality parameters on fluoroscopy systems. Copyright © 2017 The College of Radiographers. Published by Elsevier Ltd. All rights reserved.
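The abstract does not define how the plugin measures SNR; a common ROI-based convention (mean signal divided by the standard deviation in a uniform background region) is sketched below as an assumption for illustration, on a synthetic image.

```python
# Hedged sketch: ROI-based signal-to-noise ratio on a test-object image.
# SNR = mean(signal ROI) / std(background ROI) is one common convention;
# the actual Fluoro-QC definition may differ.
import numpy as np

rng = np.random.default_rng(0)
image = rng.normal(100.0, 5.0, size=(512, 512))      # synthetic background
image[200:260, 200:260] += 40.0                      # synthetic contrast insert

signal_roi = image[210:250, 210:250]
background_roi = image[50:150, 50:150]

snr = signal_roi.mean() / background_roi.std(ddof=1)
print(f"SNR = {snr:.1f}")
```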
Satellite-Based Quantum Communications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hughes, Richard J; Nordholt, Jane E; McCabe, Kevin P
2010-09-20
Single-photon quantum communications (QC) offers the attractive feature of 'future proof', forward security rooted in the laws of quantum physics. Ground-based quantum key distribution (QKD) experiments in optical fiber have attained transmission ranges in excess of 200 km, but for larger distances we proposed a methodology for satellite-based QC. Over the past decade we have devised solutions to the technical challenges of satellite-to-ground QC, and we now have a clear concept for how space-based QC could be performed and potentially utilized within a trusted QKD network architecture. Functioning as a trusted QKD node, a QC satellite ('QC-sat') could deliver secret keys to the key stores of ground-based trusted QKD network nodes, to each of which multiple users are connected by optical fiber or free-space QC. A QC-sat could thereby extend quantum-secured connectivity to geographically disjoint domains, separated by continental or inter-continental distances. In this paper we describe our system concept that makes QC feasible with low-earth orbit (LEO) QC-sats (200 km to 2,000 km altitude orbits), and the results of link modeling of expected performance. Using the architecture that we have developed, LEO satellite-to-ground QKD will be feasible with secret bit yields of several hundred 256-bit AES keys per contact. With multiple ground sites separated by approximately 100 km, mitigation of cloudiness over any single ground site would be possible, potentially allowing multiple contact opportunities each day. The essential next step is an experimental QC-sat. A number of LEO platforms would be suitable, ranging from a dedicated, three-axis stabilized small satellite, to a secondary experiment on an imaging satellite, to the ISS. With one or more QC-sats, low-latency quantum-secured communications could then be provided to ground-based users on a global scale. Air-to-ground QC would also be possible.
Joint design of QC-LDPC codes for coded cooperation system with joint iterative decoding
NASA Astrophysics Data System (ADS)
Zhang, Shunwai; Yang, Fengfan; Tang, Lei; Ejaz, Saqib; Luo, Lin; Maharaj, B. T.
2016-03-01
In this paper, we investigate the joint design of quasi-cyclic low-density parity-check (QC-LDPC) codes for a coded cooperation system with joint iterative decoding at the destination. First, QC-LDPC codes based on the base matrix and exponent matrix are introduced, and then we describe two types of girth-4 cycles in the QC-LDPC codes employed by the source and relay. In the equivalent parity-check matrix corresponding to the jointly designed QC-LDPC codes employed by the source and relay, all girth-4 cycles, of both type I and type II, are cancelled. Theoretical analysis and numerical simulations show that the jointly designed QC-LDPC coded cooperation effectively combines cooperation gain and channel coding gain, and outperforms coded non-cooperation under the same conditions. Furthermore, the bit error rate performance of the coded cooperation employing jointly designed QC-LDPC codes is better than that of random LDPC codes and separately designed QC-LDPC codes over AWGN channels.
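As background for the girth-4 cancellation mentioned above, the standard algebraic condition on a QC-LDPC exponent matrix (Fossorier's condition) can be checked in a few lines: a length-4 cycle exists iff some 2x2 sub-array of exponents satisfies p(i1,j1) - p(i1,j2) + p(i2,j2) - p(i2,j1) = 0 (mod m), where m is the circulant size. The example matrices are invented; the paper's joint source-relay construction is not reproduced here.

```python
# Hedged sketch: detect girth-4 cycles in a QC-LDPC exponent matrix.
# A 4-cycle exists iff p[i1][j1] - p[i1][j2] + p[i2][j2] - p[i2][j1] == 0 (mod m)
# for some row pair (i1, i2) and column pair (j1, j2).
from itertools import combinations

def has_girth4(P, m):
    rows, cols = len(P), len(P[0])
    for i1, i2 in combinations(range(rows), 2):
        for j1, j2 in combinations(range(cols), 2):
            if (P[i1][j1] - P[i1][j2] + P[i2][j2] - P[i2][j1]) % m == 0:
                return True
    return False

m = 7                                  # circulant (lifting) size, invented
P_bad = [[0, 1, 2], [0, 1, 5]]         # columns 0,1 have equal row differences -> 4-cycle
P_good = [[0, 1, 2], [0, 2, 4]]        # all row differences distinct mod 7 -> girth >= 6
print(has_girth4(P_bad, m), has_girth4(P_good, m))   # True False
```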
NASA Astrophysics Data System (ADS)
Kwiatek, Grzegorz; Blanke, Aglaja; Olszewska, Dorota; Orlecka-Sikora, Beata; Lasocki, Stanisław; Kozlovskaya, Elena; Nevalainen, Jouni; Schmittbuhl, Jean; Grasso, Jean-Robert; Schaming, Marc; Bigarre, Pascal; Kinscher, Jannes-Lennart; Saccorotti, Gilberto; Garcia, Alexander; Cassidy, Nigel; Toon, Sam; Mutke, Grzegorz; Sterzel, Mariusz; Szepieniec, Tomasz
2017-04-01
The Thematic Core Service "Anthropogenic Hazards" (TCS AH) integrates data and provides various data services in the form of a complete e-research infrastructure for advanced analysis and geophysical modelling of anthropogenic hazards due to georesource exploitation. TCS AH is based on the prototype built in the framework of the IS-EPOS project POIG.02.03.00-14-090/13-00 (https://tcs.ah-epos.eu/) and is currently being further developed within the EPOS Implementation Phase (H2020-INFRADEV-1-2015-1, INFRADEV-3-2015). The TCS AH aims to have a measurable impact on innovative research and development by providing a comprehensive, wide-scale and high-quality research infrastructure available to the scientific community, industrial partners and the public. One of the main deliverables of TCS AH is access to numerous induced seismicity datasets called "episodes". An episode is defined as a comprehensive set of data describing a geophysical process induced or triggered by technological activity, which under certain circumstances can become hazardous for people, infrastructure and the environment. The episode is a time-correlated, standardized collection of geophysical, technological and other relevant geodata forming complete documentation of a seismogenic process. In addition to the 6 episodes already implemented during the previous phase of integration and the 3 episodes integrated within the SHEER project, at least 18 new episodes related to conventional hydrocarbon extraction, reservoir treatment, underground mining and geothermal energy production are currently being integrated into the TCS AH. The heterogeneous multi-disciplinary data from different episodes are subjected to an extensive quality control (QC) procedure composed of five steps and involving the collaborative work of the data providers, the quality control team and the IT team, supervised by the quality control manager with the aid of the Redmine platform. The first three steps of QC are performed at the local data center and include (1) transfer of episode data to the local data center, (2) data standardization and validation of formats, and (3) metadata preparation according to the TCS AH metadata scheme. The final two steps of QC are performed at the level of the TCS AH website and include (4) contextual analysis of data quality, followed by the appearance of the episode in the TCS AH maintenance area, and finally (5) episode publication on the TCS AH website.
Ruberu, S R; Langlois, G W; Masuda, M; Perera, S Kusum
2012-01-01
The receptor-binding assay (RBA) method for determining saxitoxin (STX) and its numerous analogues, which cause paralytic shellfish poisoning (PSP) in humans, was evaluated in a single-laboratory study. Each step of the assay preparation procedure, including the performance of the multi-detector TopCount® instrument, was evaluated for its contribution to method variability. The overall inherent RBA variability was determined to be 17%. Variability among the 12 detectors was observed; however, there was no reproducible pattern in detector performance, and the observed variability could be attributed to other factors, such as pipetting errors. In an attempt to reduce the number of plates rejected due to excessive variability in the method's quality control parameters, a statistical approach was evaluated using either Grubbs' test or Student's t-test for rejecting outliers in the measurement of triplicate wells. This approach improved the ratio of accepted versus rejected plates, saving the cost and time of rerunning the assay. However, the potential reduction in accuracy and the lack of improvement in precision suggest caution when using this approach. The current study recommends an alternate quality control procedure for accepting or rejecting plates in place of the criteria currently used in the published assay, or the alternative of outlier testing. The recommended procedure involves the development of control charts to monitor the critical parameters identified in the published method (QC sample, EC₅₀, slope of calibration curve), with the addition of a fourth critical parameter, the top value (100% binding) of the calibration curve.
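As an illustration of the outlier-rejection approach evaluated in the study, a minimal two-sided Grubbs' test is sketched below. With triplicate wells (N = 3) the test has very little power, which is consistent with the authors' caution; the alpha level and counts are invented.

```python
# Hedged sketch: two-sided Grubbs' test for a single outlier among replicate wells.
# G = max|x_i - mean| / sd, compared with the critical value from Student's t.
import math
from scipy import stats

def grubbs_outlier(x, alpha=0.05):
    n = len(x)
    mean = sum(x) / n
    sd = math.sqrt(sum((v - mean) ** 2 for v in x) / (n - 1))
    g = max(abs(v - mean) for v in x) / sd
    t = stats.t.ppf(1 - alpha / (2 * n), n - 2)
    g_crit = ((n - 1) / math.sqrt(n)) * math.sqrt(t**2 / (n - 2 + t**2))
    return g, g_crit, g > g_crit

wells = [1500.0, 1500.0, 2300.0]   # invented triplicate counts (cpm)
g, g_crit, reject = grubbs_outlier(wells)
print(f"G = {g:.4f}, critical = {g_crit:.4f}, outlier: {reject}")
# Note: for N = 3 the maximum attainable G is (N-1)/sqrt(N) ~ 1.1547, barely
# above the 5% critical value ~ 1.1543, so only near-degenerate triplicates
# are flagged -- one reason the abstract advises caution with this approach.
```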
NASA Technical Reports Server (NTRS)
Barbre, Robert E., Jr.
2015-01-01
This paper describes in detail the QC and splicing methodology for KSC's 50- and 915-MHz DRWP measurements that generates an extensive archive of vertically complete profiles from 0.20-18.45 km. The concurrent POR from each archive extends from April 2000 to December 2009. MSFC NE applies separate but similar QC processes to each of the 50- and 915-MHz DRWP archives. DRWP literature and data examination provide the basis for developing and applying the automated and manual QC processes on both archives. Depending on the month, the QC'ed 50- and 915-MHz DRWP archives retain 52-65% and 16-30% of the possible data, respectively. The 50- and 915-MHz DRWP QC archives retain 84-91% and 85-95%, respectively, of all the available data, provided that data exist in the non-QC'ed archives. Next, MSFC NE applies an algorithm to splice concurrent measurements from both DRWP sources. Last, MSFC NE generates a composite profile from the (up to) five available spliced profiles to effectively characterize boundary layer winds and to utilize all possible 915-MHz DRWP measurements at each timestamp. During a given month, roughly 23,000-32,000 complete profiles exist from 0.25-18.45 km in the composite profiles' archive, and approximately 5,000-27,000 complete profiles exist in an archive utilizing an individual 915-MHz DRWP. One can extract a variety of profile combinations (pairs, triplets, etc.) from this sample for a given application. The sample of vertically complete DRWP wind measurements not only gives launch vehicle customers greater confidence in loads and trajectory assessments versus using balloon output, but also provides flexibility to simulate different DOL situations across applicable altitudes. In addition to increasing sample size and providing more flexibility for DOL simulations in the vehicle design phase, the spliced DRWP database provides any upcoming launch vehicle program with the capability to utilize DRWP profiles on DOL to compute vehicle steering commands, provided the program applies the procedures that this report describes to new DRWP data on DOL. Decker et al. (2015) detail how SLS is proposing to use DRWP data and splicing techniques on DOL. Although automation could enhance the current DOL 50-MHz DRWP QC process and could streamline any future DOL 915-MHz DRWP QC and splicing process, the DOL community would still require manual intervention to ensure that the vehicle only uses valid profiles. If a program desires to use high spatial resolution profiles, the algorithm could randomly add high-frequency components to the DRWP profiles. The spliced DRWP database provides considerable flexibility in how one performs DOL simulations, and the algorithms that this report provides will assist the aerospace and atmospheric communities interested in utilizing the DRWP.
quantGenius: implementation of a decision support system for qPCR-based gene quantification.
Baebler, Špela; Svalina, Miha; Petek, Marko; Stare, Katja; Rotter, Ana; Pompe-Novak, Maruša; Gruden, Kristina
2017-05-25
Quantitative molecular biology remains a challenge for researchers due to inconsistent approaches for the control of errors in the final results. Due to several factors that can influence the final result, quantitative analysis and interpretation of qPCR data are still not trivial. Together with the development of high-throughput qPCR platforms, there is a need for a tool allowing for robust, reliable and fast nucleic acid quantification. We have developed "quantGenius" ( http://quantgenius.nib.si ), an open-access web application for reliable qPCR-based quantification of nucleic acids. The quantGenius workflow interactively guides the user through data import, quality control (QC) and calculation steps. The input is machine- and chemistry-independent. Quantification is performed using the standard curve approach, with normalization to one or several reference genes. The special feature of the application is the implementation of a user-guided, QC-based decision support system, based on qPCR standards, that takes into account pipetting errors, assay amplification efficiencies, limits of detection and quantification of the assays, as well as the control of PCR inhibition in individual samples. The intermediate calculations and final results are exportable in a data matrix suitable for further statistical analysis or visualization. We additionally compare the most important features of quantGenius with similar advanced software tools and illustrate the importance of a proper QC system in the analysis of qPCR data in two use cases. To our knowledge, quantGenius is the only qPCR data analysis tool that integrates QC-based decision support and will help scientists to obtain reliable results, which are the basis for biologically meaningful data interpretation.
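The standard-curve approach mentioned above can be summarized in a few lines: fit Cq against log10 of the known standard quantities, invert the fit for unknowns, and normalize to a reference gene. The sketch below is a generic illustration with invented numbers, not quantGenius code; the amplification efficiency E = 10^(-1/slope) - 1 is also reported, since the tool's QC uses assay efficiencies.

```python
# Hedged sketch of standard-curve qPCR quantification (not quantGenius itself).
import numpy as np

# Standards: known input quantities (copies) and measured Cq values (invented).
std_qty = np.array([1e6, 1e5, 1e4, 1e3, 1e2])
std_cq = np.array([15.1, 18.4, 21.8, 25.2, 28.6])

slope, intercept = np.polyfit(np.log10(std_qty), std_cq, 1)
efficiency = 10 ** (-1 / slope) - 1          # ~1.0 means 100% efficiency
print(f"slope={slope:.2f}, E={efficiency:.2%}")

def quantify(cq):
    """Invert the standard curve: Cq -> estimated quantity."""
    return 10 ** ((cq - intercept) / slope)

target_cq, reference_cq = 23.0, 20.0          # unknown sample (invented)
normalized = quantify(target_cq) / quantify(reference_cq)
print(f"target/reference = {normalized:.3f}")
```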
The purpose of this presentation is to present an overview of the quality control (QC) sections of a draft EPA document entitled, "Quality Assurance/Quality Control Guidance for Laboratories Performing PCR Analyses on Environmental Samples." This document has been prepared by th...
7 CFR 275.10 - Scope and purpose.
Code of Federal Regulations, 2010 CFR
2010-01-01
... to enhanced funding. (b) The objectives of quality control reviews are to provide: (1) A systematic... FOOD STAMP AND FOOD DISTRIBUTION PROGRAM PERFORMANCE REPORTING SYSTEM Quality Control (QC) Reviews... responsible for conducting quality control reviews. For food stamp quality control reviews, a sample of...
Environment-induced quantum coherence spreading of a qubit
NASA Astrophysics Data System (ADS)
Pozzobom, Mauro B.; Maziero, Jonas
2017-02-01
We make a thorough study of the spreading of quantum coherence (QC), as quantified by the l1-norm of coherence, when a qubit (a two-level quantum system) is subjected to noise quantum channels commonly appearing in quantum information science. We notice that QC is generally not conserved and that even incoherent initial states can lead to transitory system-environment QC. We show that for the amplitude damping channel the evolved total QC can be written as the sum of local and non-local parts, with the latter being equal to entanglement. On the other hand, for the phase damping channel (PDC) entanglement does not account for all non-local QC, with the gap between them depending on time and also on the qubit's initial state. Besides these issues, the possibility of and conditions for time invariance of QC are considered in the case of bit, phase, and bit-phase flip channels. Here we reveal the qualitative dynamical inequivalence between these channels and the PDC and show that the creation of system-environment entanglement does not necessarily imply the destruction of the qubit's QC. We also investigate the resources needed for non-local QC creation, showing that while the PDC requires initial coherence of the qubit, for some other channels nonzero population of the excited state (i.e., energy) is sufficient. Related to that, considering the depolarizing channel we notice the qubit's ability to act as a catalyst for the creation of joint QC and entanglement, without the need for nonzero initial QC or excited-state population.
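For reference, the l1-norm of coherence used above is simply the sum of the absolute values of the off-diagonal density-matrix elements; for a single qubit it reduces to 2|rho_01|. A minimal sketch with an invented state follows; the square-root phase-damping parameterization is one common convention (Nielsen and Chuang), assumed here for illustration.

```python
# Hedged sketch: l1-norm of coherence, C_l1(rho) = sum_{i != j} |rho_ij|.
import numpy as np

def l1_coherence(rho):
    off_diag = rho - np.diag(np.diag(rho))   # zero out the diagonal
    return np.abs(off_diag).sum()

# |+><+| state: maximal single-qubit coherence in the computational basis.
plus = np.array([[0.5, 0.5], [0.5, 0.5]], dtype=complex)
print(l1_coherence(plus))                    # 1.0

# Phase damping with parameter lam shrinks off-diagonals by sqrt(1 - lam)
# in one common Kraus parameterization.
lam = 0.36
damped = plus.copy()
damped[0, 1] *= np.sqrt(1 - lam)
damped[1, 0] *= np.sqrt(1 - lam)
print(l1_coherence(damped))                  # 0.8
```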
Development of the Quality Assurance/Quality Control Procedures for a Neutron Interrogation System
NASA Astrophysics Data System (ADS)
Obhođaš, Jasmina; Sudac, Davorin; Valković, Vladivoj
2016-06-01
In order to perform Quality Assurance/Quality Control (QA/QC) procedures for a system dedicated to the neutron interrogation of objects for the presence of threat materials, one needs to perform measurements of reference materials (RM), i.e., simulants having the same (or similar) atomic ratios as the real materials. It is well known that explosives, drugs, and various other benign materials contain chemical elements such as hydrogen, oxygen, carbon and nitrogen in distinctly different quantities. For example, a high carbon-to-oxygen ratio (C/O) is characteristic of drugs. Explosives can be differentiated by measurement of both C/O and nitrogen-to-oxygen (N/O) ratios. The C/N ratio of chemical warfare agents, coupled with the measurement of elements such as fluorine and phosphorus, clearly differentiates them from conventional explosives. Here we present the RM preparation, the calibration procedure, and the correlations attained between theoretical values and experimentally obtained results under laboratory conditions for the C/O and N/C ratios of prepared hexogen (RDX), TNT, DLM2, TATP, cocaine, heroin, yperite, tetranitromethane, peroxide methylethylketone, nitromethane and ethyleneglycol dinitrate simulants. We have shown that analyses of the gamma-ray spectra using a simple unfolding model developed for this purpose showed good agreement with the chemical formulas of the created simulants; thus the calibration quality was successfully verified.
BWR Steam Dryer Alternating Stress Assessment Procedures
DOE Office of Scientific and Technical Information (OSTI.GOV)
Morante, R. J.; Hambric, S. A.; Ziada, S.
2016-12-01
This report presents an overview of Boiling Water Reactor (BWR) steam dryer design; the fatigue cracking failures that occurred at the Quad Cities (QC) plants and their root causes; a history of BWR Extended Power Uprates (EPUs) in the USA; and a discussion of steam dryer modifications/replacements, alternating stress mechanisms on steam dryers, and structural integrity evaluations (static and alternating stress).
Announcement—guidance document for acquiring reliable data in ecological restoration projects
Stapanian, Martin A.; Rodriguez, Karen; Lewis, Timothy E.; Blume, Louis; Palmer, Craig J.; Walters, Lynn; Schofield, Judith; Amos, Molly M.; Bucher, Adam
2016-01-01
The Laurentian Great Lakes are undergoing intensive ecological restoration in Canada and the United States. In the United States, an interagency committee was formed to facilitate implementation of quality practices for federally funded restoration projects in the Great Lakes basin. The Committee's responsibilities include developing a guidance document that will provide a common approach to the application of quality assurance and quality control (QA/QC) practices for restoration projects. The document will serve as a “how-to” guide for ensuring data quality during each aspect of ecological restoration projects. In addition, the document will provide suggestions on linking QA/QC data with the routine project data and hints on creating detailed supporting documentation. Finally, the document will advocate integrating all components of the project, including QA/QC applications, into an overarching decision-support framework. The guidance document is expected to be released by the U.S. EPA Great Lakes National Program Office in 2017.
1994-03-04
[Unrecoverable OCR of a water-QC method-blank results table; only the analyte names (e.g., trichloroethylene (TCE), dichlorobutane), EPA method numbers (e.g., 8020, 8100), and percent values are discernible.]
Cendejas, Richard A; Phillips, Mark C; Myers, Tanya L; Taubman, Matthew S
2010-12-06
An external-cavity (EC) quantum cascade (QC) laser using optical feedback from a partial reflector is reported. With this configuration, the otherwise multi-mode emission of a Fabry-Perot QC laser was made single-mode with optical output powers exceeding 40 mW. A mode-hop-free tuning range of 2.46 cm⁻¹ was achieved by synchronously tuning the EC length and QC laser current. The linewidth of the partial-reflector EC-QC laser was measured for integration times from 100 μs to 4 s, and compared to a distributed feedback QC laser. Linewidths as small as 480 kHz were recorded for the EC-QC laser.
Depuydt, Christophe E; Arbyn, Marc; Benoy, Ina H; Vandepitte, Johan; Vereecken, Annie J; Bogers, Johannes J
2009-01-01
The objective of this prospective study was to compare the number of CIN2+ cases detected in negative cytology by different quality control (QC) methods: full rescreening, high-risk (HR) human papillomavirus (HPV)-targeted reviewing, and HR HPV detection. Randomly selected negative cytology detected by BD FocalPoint™ (NFR), by guided screening of the prescreened slides that needed further review (GS), and by manual screening (MS) was used, with a 3-year follow-up period available. Full rescreening of cytology detected only 23.5% of CIN2+ cases, whereas cytological rescreening of oncogenic-positive slides (high-risk HPV-targeted reviewing) detected 7 of 17 CIN2+ cases (41.2%). Quantitative real-time PCR for 15 oncogenic HPV types detected all CIN2+ cases. Relative sensitivity to detect histological CIN2+ was 0.24 for full rescreening, 0.41 for HR-targeted reviewing and 1.00 for HR HPV detection. In more than half of the reviewed negative cytological preparations associated with histological CIN2+ cases, no morphologically abnormal cells were detected despite a positive HPV test. The visual cut-off for the detection of abnormal cytology was established at 6.5 HR HPV copies/cell. High-risk HPV detection has a higher yield for the detection of CIN2+ cases than manual screening followed by 5% full review, or targeted reviewing of smears positive for oncogenic HPV types, and shows diagnostic properties that support its use as a QC procedure in cytology laboratories. PMID:18544049
Sobol, Wlad T
2002-01-01
A simple kinetic model that describes the time evolution of the chemical concentration of an arbitrary compound within the tank of an automatic film processor is presented. It provides insights into the kinetics of chemistry concentration inside the processor's tank; the results facilitate the tasks of processor tuning and quality control (QC). The model has successfully been used in several troubleshooting sessions of low-volume mammography processors for which maintaining consistent QC tracking was difficult due to fluctuations of bromide levels in the developer tank.
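The abstract does not give the model's equations; a plausible minimal form, treating the developer tank as a well-mixed volume with replenishment inflow and overflow, is sketched below purely as an assumption. Under these assumptions the concentration relaxes exponentially toward the replenisher concentration, and low film throughput keeps the tank chemistry off-target for long periods, consistent with the QC difficulties reported for low-volume processors.

```python
# Hedged sketch: well-mixed tank model (an assumed form, not Sobol's model).
#   V dC/dt = q * C_repl - q * C   =>   C(t) = C_r + (C0 - C_r) * exp(-q t / V)
import math

V = 25.0          # tank volume, liters (invented)
q = 0.06          # replenisher volume per film batch, liters (invented)
C_repl = 1.0      # replenisher concentration, normalized
C0 = 0.8          # starting tank concentration, normalized

def concentration(n_batches):
    """Concentration after n film batches, from the closed-form solution."""
    return C_repl + (C0 - C_repl) * math.exp(-q * n_batches / V)

for n in (0, 100, 500, 2000):
    print(f"after {n:4d} batches: C = {concentration(n):.3f}")
# Low film volume (small cumulative q*n) keeps C far from C_repl for a long
# time, which is one way QC tracking can drift in low-volume processors.
```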
Proximate Composition Analysis.
2016-01-01
The proximate composition of foods includes moisture, ash, lipid, protein and carbohydrate contents. These food components may be of interest in the food industry for product development, quality control (QC) or regulatory purposes. The analyses used may be rapid methods for QC or more accurate but time-consuming official methods. Sample collection and preparation must be considered carefully to ensure analysis of a homogeneous and representative sample and to obtain accurate results. Estimation methods for moisture content, ash value, crude lipid, total carbohydrates, starch, total free amino acids and total proteins are presented together in a clear, stepwise manner.
Deficiencies of product labeling directions for the preparation of radiopharmaceuticals.
Hung, Joseph C; Ponto, James A; Gadient, Katie R; Frie, Julia A; Aksamit, Carolyn M; Enquist, Cassandra L; Carrels, Katie E
2004-01-01
To identify potential deficiencies in product labeling (package insert) instructions for the preparation of radiopharmaceuticals. Preparation instructions, which include both reconstitution and quality control (QC) directions, as stated in the package inserts were evaluated for all commercially available reconstituted radiopharmaceuticals. Reviews of the package inserts were initially performed by each author, and then all identified deficiencies were compiled and evaluated by all authors. The preparation scenario for each package insert evaluated was based on a centralized nuclear pharmacy operation assuming typical support personnel, standard operating equipment, and workload. The instructions as stated in each package insert for the preparation (including QC) were rated as inadequate if a satisfactory preparation could not be prepared by a nuclear pharmacist or physician when instructions were followed exactly. Identified deficiencies in package insert instructions for the preparation of radiopharmaceuticals fell into the following five categories: (1) absent or incomplete directions (especially with regard to QC procedures); (2) restrictive directions (e.g., specific requirement to use designated needles, chromatography solvents, counting devices), (3) inconsistent directions (e.g., different reconstituted volumes for the same final drug product, unworkable expiration times); (4) impractical directions (e.g., unrealistically low reconstituted activity limits, dangerously high number of radiolabeled particles); and (5) vague directions (e.g., use of the words "should," "may," "recommend"). Manufacturers' directions for the preparation of radiopharmaceuticals often contain deficiencies and should be viewed as standard guidance rather than as requirements. Just as physicians are permitted to use U.S. Food and Drug Administration (FDA)-approved drugs for off-label indications, nuclear pharmacists should be allowed to use alternative methods for preparing radiopharmaceuticals, provided those methods have been validated to be as good as the stated directions and that the nuclear pharmacists do not engage in activities that fall outside the normal practice of pharmacy. Manufacturers, FDA, nuclear pharmacists, and nuclear physicians should work together to address identified deficiencies in package insert directions.
NASA Astrophysics Data System (ADS)
Bitew, M. M.; Goodrich, D. C.; Demaria, E.; Heilman, P.; Kautz, M. A.
2017-12-01
Walnut Gulch is a semi-arid experimental watershed and Long-Term Agroecosystem Research (LTAR) site managed by the USDA-ARS Southwest Watershed Research Center, for which high-resolution, long-term hydro-climatic data are available across its 150 km2 drainage area. In this study, we present an analysis of 50 years of continuous hourly rainfall data to evaluate runoff control and generation processes, with the goal of improving the QA-QC plans of Walnut Gulch and creating a high-quality dataset that is critical for reducing water balance uncertainties. Multiple linear regression models were developed to relate rainfall properties, runoff characteristics and watershed properties. The rainfall properties were summarized as event-based total depth, maximum intensity, duration, location of the storm center with respect to the outlet, and storm size normalized to watershed area. For each rainfall event, we characterized the rainfall-runoff interaction in terms of antecedent moisture condition (AMC), antecedent runoff condition (ARC), and runoff depth and duration. We summarized watershed properties such as contributing area, slope, shape, channel length, stream density, channel flow area, and the percent of the area in retention stock ponds for each of the nested catchments in Walnut Gulch. Evaluation of the model using basic and categorical statistics showed good predictive skill throughout the watersheds. The model produced correlation coefficients ranging from 0.4 to 0.94, Nash efficiency coefficients up to 0.77, and Kling-Gupta coefficients ranging from 0.4 to 0.98. The model correctly predicted 92% of all runoff-generating events and 98% of non-runoff events across all sub-watersheds in Walnut Gulch. The regression model also indicated good potential to complement the QA-QC procedures in place for the Walnut Gulch dataset publications developed since the 1960s, by identifying inconsistencies in rainfall-runoff relations.
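The skill scores cited above (Nash-Sutcliffe efficiency and Kling-Gupta efficiency) have standard definitions, sketched below on invented observed/simulated runoff series; this is generic scoring code, not the study's regression model.

```python
# Hedged sketch: Nash-Sutcliffe (NSE) and Kling-Gupta (KGE) efficiencies.
import numpy as np

def nse(obs, sim):
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def kge(obs, sim):
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    r = np.corrcoef(obs, sim)[0, 1]          # linear correlation
    alpha = sim.std() / obs.std()            # variability ratio
    beta = sim.mean() / obs.mean()           # bias ratio
    return 1.0 - np.sqrt((r - 1) ** 2 + (alpha - 1) ** 2 + (beta - 1) ** 2)

obs = [0.0, 1.2, 4.5, 2.1, 0.3, 0.0, 3.8]    # invented event runoff depths, mm
sim = [0.1, 1.0, 4.0, 2.5, 0.2, 0.0, 3.5]
print(f"NSE = {nse(obs, sim):.2f}, KGE = {kge(obs, sim):.2f}")
```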
Steuer, Andrea E; Forss, Anna-Maria; Dally, Annika M; Kraemer, Thomas
2014-11-01
In the context of driving under the influence of drugs (DUID), not only common drugs of abuse may have an influence, but also medications with similar mechanisms of action. Simultaneous quantification of a variety of drugs and medications relevant in this context allows faster and more effective analyses. Therefore, multi-analyte approaches have gained more and more popularity in recent years. Usually, calibration curves for such procedures contain a mixture of all analytes, which might lead to mutual interferences. In this study we investigated whether the use of such mixtures leads to reliable results for authentic samples containing only one or two analytes. Five hundred microliters of whole blood were extracted by routine solid-phase extraction (SPE, HCX). Analysis was performed on an ABSciex 3200 QTrap instrument with ESI+ in scheduled MRM mode. The method was fully validated according to international guidelines including selectivity, recovery, matrix effects, accuracy and precision, stabilities, and limit of quantification. The selected SPE provided recoveries >60% for all analytes except 6-monoacetylmorphine (MAM) with coefficients of variation (CV) below 15% or 20% for quality controls (QC) LOW and HIGH, respectively. Ion suppression >30% was found for benzoylecgonine, hydrocodone, hydromorphone, MDA, oxycodone, and oxymorphone at QC LOW, however CVs were always below 10% (n=6 different whole blood samples). Accuracy and precision criteria were fulfilled for all analytes except for MAM. Systematic investigation of accuracy determined for QC MED in a multi-analyte mixture compared to samples containing only single analytes revealed no relevant differences for any analyte, indicating that a multi-analyte calibration is suitable for the presented method. Comparison of approximately 60 samples to a former GC-MS method showed good correlation. The newly validated method was successfully applied to more than 1600 routine samples and 3 proficiency tests. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
Functional C1q is present in the skin mucus of Siberian sturgeon (Acipenser baerii).
Fan, Chunxin; Wang, Jian; Zhang, Xuguang; Song, Jiakun
2015-01-01
The skin mucus of fish acts as the first line of self-protection against pathogens in the aquatic environment and comprises a number of innate immune components. However, the presence of the critical classical complement component C1q, which links the innate and adaptive immune systems of mammalians, has not been explored in a primitive actinopterygian fish. In this study, we report that C1q is present in the skin mucus of the Siberian sturgeon (Acipenser baerii). The skin mucus was able to inhibit the growth of Escherichia coli. The bacteriostatic activity of the skin mucus was reduced by heating and by pre-incubation with EDTA or mouse anti-human C1q antibody. We also detected C1q protein in skin mucus using the western blot procedure and isolated a cDNA that encodes the Siberian sturgeon C1qC, which had 44.7-51.4% identity with C1qCs in teleosts and tetrapods. A phylogenetic analysis revealed that Siberian sturgeon C1qC lies at the root of the actinopterygian branch and is separate from the tetrapod branch. The C1qC transcript was expressed in many tissues as well as in skin. Our data indicate that C1q is present in the skin mucus of the Siberian sturgeon to protect against water-borne bacteria, and the C1qC found in the sturgeon may represent the primitive form of teleost and tetrapod C1qCs. © 2014 International Society of Zoological Sciences, Institute of Zoology/Chinese Academy of Sciences and Wiley Publishing Asia Pty Ltd.
Mendoza-Parra, Marco-Antonio; Saravaki, Vincent; Cholley, Pierre-Etienne; Blum, Matthias; Billoré, Benjamin; Gronemeyer, Hinrich
2016-01-01
We have established a certification system for antibodies to be used in chromatin immunoprecipitation assays coupled to massively parallel sequencing (ChIP-seq). This certification comprises a standardized ChIP procedure and the attribution of a numerical quality control indicator (QCi) to biological replicate experiments. The QCi computation is based on a universally applicable quality assessment that quantitates the global deviation of randomly sampled subsets of a ChIP-seq dataset from the original genome-aligned sequence reads. Comparison with a QCi database of >28,000 ChIP-seq assays was used to attribute quality grades (ranging from 'AAA' to 'DDD') to a given dataset. In the present report we used the numerical QC system to assess the factors influencing the quality of ChIP-seq assays, including the nature of the target, the sequencing depth and the commercial source of the antibody. We used this approach specifically to certify monoclonal and polyclonal antibodies obtained from Active Motif directed against the histone modification marks H3K4me3, H3K27ac and H3K9ac for ChIP-seq. The antibodies received grades from AAA to BBC (www.ngs-qc.org). We propose to attribute such quantitative grading to all antibodies marketed with the label "ChIP-seq grade".
Coda Wave Attenuation Characteristics for North Anatolian Fault Zone, Turkey
NASA Astrophysics Data System (ADS)
Sertcelik, Fadime; Guleroglu, Mehmet
2017-10-01
The North Anatolian Fault Zone, on which large earthquakes have occurred in the past, migrating regularly from east to west, is one of the most active faults in the world. The purpose of this study is to estimate the coda wave quality factor (Qc) for each of five sub-regions, determined according to the fault ruptures of these large earthquakes, and along the fault as a whole. 978 records have been analyzed at 1.5, 3, 6, 9, 12 and 18 Hz by the Single Backscattering Method. Along the fault, the variation of Qc with lapse time is given by Qc = (136±25)f^(0.96±0.027), Qc = (208±22)f^(0.85±0.02), and Qc = (307±28)f^(0.72±0.025) at 20, 30 and 40 s lapse times, respectively. The estimated average frequency-dependent quality factors over all lapse times are: Qc(f) = (189±26)f^(0.86±0.02) for the Karliova-Tokat region; Qc(f) = (216±19)f^(0.76±0.018) for the Tokat-Çorum region; Qc(f) = (232±18)f^(0.76±0.019) for the Çorum-Adapazari region; Qc(f) = (280±28)f^(0.79±0.021) for the Adapazari-Yalova region; and Qc(f) = (252±26)f^(0.81±0.022) for the Yalova-Gulf of Saros region. The coda wave quality factor over all lapse times and frequencies is Qc(f) = (206±15)f^(0.85±0.012) in the study area. The largest change of Qc with lapse time is found in the Yalova-Saros region, which may reflect a degree of heterogeneity that decreases rapidly toward the deep crust compared with the other sub-regions. Moreover, the highest Qc is calculated between Adapazari and Yalova, interpreted as a result of the seismic energy released by the 1999 Kocaeli Earthquake. No causal relationship could be established between the regional variation of Qc with frequency and lapse time and the migration of the large earthquakes. These results are interpreted to mean that the attenuation mechanism is affected both by regional heterogeneity and by whether the fault consists of a single strand or multiple strands.
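In the Single Backscattering Method (commonly attributed to Aki and Chouet, 1975), the bandpass-filtered coda envelope decays as A(t) = S · t^(-1) · exp(-π f t / Qc), so a straight-line fit of ln(A(t)·t) against lapse time t yields Qc from the slope -πf/Qc. The sketch below applies that fit to a synthetic envelope; the data and noise level are invented.

```python
# Hedged sketch: estimate coda Qc with the single backscattering model,
#   A(t) = S * t**-1 * exp(-pi * f * t / Qc)  =>  ln(A*t) = ln(S) - (pi*f/Qc)*t
import numpy as np

f = 6.0                          # center frequency of the band, Hz
Qc_true = 232.0                  # synthetic "true" quality factor
t = np.linspace(20.0, 40.0, 50)  # coda lapse times, s

rng = np.random.default_rng(1)
A = (1.0 / t) * np.exp(-np.pi * f * t / Qc_true) * rng.lognormal(0, 0.05, t.size)

slope, _ = np.polyfit(t, np.log(A * t), 1)
Qc_est = -np.pi * f / slope
print(f"estimated Qc = {Qc_est:.0f} (true {Qc_true:.0f})")
```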
Iqbal, Sahar; Mustansar, Tazeen
2017-03-01
Sigma is a metric that quantifies the performance of a process as a rate of defects per million opportunities. In clinical laboratories, sigma metric analysis is used to assess the performance of the laboratory process system. The sigma metric is also used as a quality management strategy to improve quality by addressing errors after their identification. The aim of this study is to evaluate the errors in the quality control of the analytical phase of the laboratory system by sigma metric analysis. For this purpose, sigma metrics were calculated for analytes using internal and external quality control as quality indicators. The results of the sigma metric analysis were used to identify gaps and the need for modification in the strategy of the laboratory quality control procedure. The sigma metric was calculated for the quality control program of ten clinical chemistry analytes, including glucose, chloride, cholesterol, triglyceride, HDL, albumin, direct bilirubin, total bilirubin, protein and creatinine, at two control levels. To calculate the sigma metric, imprecision and bias were calculated from internal and external quality control data, respectively. The minimum acceptable performance was set at 3 sigma. Westgard sigma rules were applied to customize the quality control procedure. The sigma level was acceptable (≥3) for glucose (L2), cholesterol, triglyceride, HDL, direct bilirubin and creatinine at both control levels. For the remaining analytes the sigma metric was <3. The lowest sigma value was found for chloride (1.1) at L2, and the highest for creatinine (10.1) at L3. HDL showed the highest sigma values at both control levels (8.8 and 8.0 at L2 and L3, respectively). We conclude that analytes with a sigma value <3 require strict monitoring and modification of the quality control procedure. In this study, the application of sigma rules provided a practical solution for an improved and focused design of the QC procedure.
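For reference, the sigma metric in this setting is computed from the allowable total error (TEa), the bias (here taken from external QC) and the imprecision (CV, from internal QC), all expressed in percent. A minimal sketch; the example numbers are illustrative, not values from the study:

```python
def sigma_metric(tea_pct, bias_pct, cv_pct):
    """Sigma metric for an analyte:
    (allowable total error - |bias|) / imprecision, all in percent."""
    return (tea_pct - abs(bias_pct)) / cv_pct

# Illustrative: TEa 10%, bias 2%, CV 2% -> sigma = 4.0 (acceptable, >= 3)
print(sigma_metric(10.0, 2.0, 2.0))
```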
DOE Office of Scientific and Technical Information (OSTI.GOV)
Colletti, Lisa Michelle
2016-02-19
In December of 2015, the old qualified Prim UV-Vis instrument used in the ANC122 and NF-ANC122 procedures failed its monthly calibration check. Attempts to get the system to pass by cleaning flow cells, replacing the light bulb and realigning the light path failed to produce sustained reproducible results from the system. As it could no longer pass QA/QC requirements, the decision was made to take it out of service. To replace the system, a previously purchased Thermo Fisher Scientific UV-VIS spectrometer (model Genesys 10S; Serial #: 2L5R059137) was retrieved from storage. This report shows that the system is performing as required and meeting all QC requirements listed in ANC122 and NF-ANC122.
Creating a comprehensive quality-controlled dataset of severe weather occurrence in Europe
NASA Astrophysics Data System (ADS)
Groenemeijer, P.; Kühne, T.; Liang, Z.; Holzer, A.; Feuerstein, B.; Dotzek, N.
2010-09-01
Ground-truth, quality-controlled data on severe weather occurrence are required for meaningful research on severe weather hazards. Such data are collected by observation networks of several authorities in Europe, most prominently the National Hydrometeorological Institutes (NHMSs). However, some events challenge the capabilities of such conventional networks by their isolated and short-lived nature. These rare and very localized but extreme events include thunderstorm wind gusts, large hail and tornadoes, and are poorly resolved by synoptic observations. Moreover, their detection by remote-sensing techniques such as radar and satellites is still in development and has proven to be difficult. The fact that all across Europe there are many people with a special personal or professional interest in such events, typically organized in associations, allows a different strategy to be pursued. Data delivered to the European Severe Weather Database (ESWD) are recorded and quality controlled by ESSL and a large number of partners, including the Hydrometeorological Institutes of Germany, Finland, Austria, Italy and Bulgaria. Additionally, nine associations of storm spotters and centres of expertise in these and other countries are involved. The two categories of organizations (NHMSs/other) have different privileges in the quality control procedure, which involves assigning a quality level of QC0+ (plausibility checked), QC1 (confirmed by reliable sources) or QC2 (verified) to each of the reports. Within the EWENT project funded by the EU 7th framework programme, the RegioExakt project funded by the German Ministry of Education and Research, and with support from the German Weather Service (DWD), several enhancements of the ESWD have been and will be carried out. Completed enhancements include the creation of an interface that allows partner organizations to upload data automatically, in the case of our German partner "Skywarn Germany" in near-real time. Moreover, the database's web interface has been translated into 14 European languages. At the time of writing, a nowcast mode for the web interface, which renders the ESWD a convenient tool for meteorologists in forecast centres, is being developed. In the near future, within the EWENT project, an extension of the dataset with several other isolated but extreme events, including avalanches, landslides, heavy snowfall and extremely powerful lightning flashes, is foreseen. The resulting ESWD dataset, which grows at a rate of 4000-5000 events per year, is used for a wide range of purposes, including the validation of remote-sensing techniques, forecast verification studies, projections of the future severe storm climate, and risk assessments. Its users include scientists working for EUMETSAT, NASA, NSSL, DLR, and several reinsurance companies.
7 CFR 275.10 - Scope and purpose.
Code of Federal Regulations, 2011 CFR
2011-01-01
... FOOD STAMP AND FOOD DISTRIBUTION PROGRAM PERFORMANCE REPORTING SYSTEM Quality Control (QC) Reviews... responsible for conducting quality control reviews. For food stamp quality control reviews, a sample of... terminated (called negative cases). Reviews shall be conducted on active cases to determine if households are...
Anderson, Nancy
2015-11-15
As of January 1, 2016, microbiology laboratories can choose to adopt a new quality control option, the Individualized Quality Control Plan (IQCP), under the Clinical Laboratory Improvement Amendments of 1988 (CLIA). This voluntary approach increases flexibility for meeting regulatory requirements and provides laboratories the opportunity to customize QC for their testing, in their unique environments, and by their testing personnel. IQCP is an all-inclusive approach to quality based on risk management to address potential errors in the total testing process. It includes three main steps: (1) performing a risk assessment, (2) developing a QC plan, and (3) monitoring the plan through quality assessment. Resources are available from the Centers for Medicare & Medicaid Services, Centers for Disease Control and Prevention, American Society for Microbiology, Clinical and Laboratory Standards Institute, and accrediting organizations, such as the College of American Pathologists and the Joint Commission, to assist microbiology laboratories in implementing IQCP.
NASA Astrophysics Data System (ADS)
Yuan, Jian-guo; Liang, Meng-qi; Wang, Yong; Lin, Jin-zhao; Pang, Yu
2016-03-01
A novel lower-complexity construction scheme of quasi-cyclic low-density parity-check (QC-LDPC) codes for optical transmission systems is proposed based on the structure of the parity-check matrix for the Richardson-Urbanke (RU) algorithm. Furthermore, a novel irregular QC-LDPC(4288, 4020) code with a high code rate of 0.937 is constructed by this scheme. The simulation analyses show that the net coding gain (NCG) of the novel irregular QC-LDPC(4288, 4020) code is 2.08 dB, 1.25 dB and 0.29 dB higher than those of the classic RS(255, 239) code, the LDPC(32640, 30592) code and the irregular QC-LDPC(3843, 3603) code, respectively, at a bit error rate (BER) of 10^-6. The irregular QC-LDPC(4288, 4020) code also has lower encoding/decoding complexity than the LDPC(32640, 30592) code and the irregular QC-LDPC(3843, 3603) code. The proposed QC-LDPC(4288, 4020) code is therefore well suited to the increasing requirements of high-speed optical transmission systems.
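The abstract does not reproduce the authors' construction, but the defining property of any QC-LDPC code is a parity-check matrix assembled from circulant permutation matrices (CPMs), which is what permits shift-register-based, low-complexity encoding. A generic sketch of that expansion step, with a toy base matrix chosen purely for illustration:

```python
import numpy as np

def cpm(z, shift):
    """z-by-z circulant permutation matrix: the identity cyclically
    shifted by `shift` columns; shift = -1 denotes the all-zero block."""
    if shift < 0:
        return np.zeros((z, z), dtype=np.uint8)
    return np.roll(np.eye(z, dtype=np.uint8), shift, axis=1)

def qc_ldpc_H(exponents, z):
    """Expand an exponent (base) matrix into a QC-LDPC parity-check
    matrix by replacing each entry with a z-by-z CPM or zero block."""
    return np.vstack([np.hstack([cpm(z, s) for s in row])
                      for row in exponents])

# Toy 2 x 4 base matrix with lifting size z = 5 -> 10 x 20 binary H.
E = [[0, 1, -1, 2],
     [3, -1, 0, 1]]
H = qc_ldpc_H(E, 5)
```

The quasi-cyclic block structure is what keeps encoding and decoding cheaper than for comparable random LDPC constructions.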
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hollister, R
QC sample results (daily background check drum and 100-gram SGS check drum) were within the acceptance criteria established by WIPP's Quality Assurance Objectives for TRU Waste Characterization. Replicate runs were performed on drum LL85501243TRU. Replicate measurement results are identical at the 95% confidence level as established by WIPP criteria. HWM NCAR No. 02-1000168 was issued on 17-Oct-2002 regarding a partially dislodged Cd sheet filter on the HPGe coaxial detector. This physical geometry occurred on 01-Oct-2002 and was not corrected until 10-Oct-2002, a period that includes the present batch run of drums. Per discussions among the Independent Technical Reviewer, Expert Reviewer and the Technical QA Supervisor, as well as in consultation with John Fleissner, Technical Point of Contact from Canberra, the analytical results are technically reliable. All QC standard runs during this period were in control. The data packet for SGS Batch 2002-13, generated using passive gamma-ray spectroscopy with the Pu Facility SGS unit, is technically reasonable. All QC samples are in compliance with established control limits. The batch data packet has been reviewed for correctness, completeness, consistency and compliance with WIPP's Quality Assurance Objectives and determined to be acceptable.
Sorich, Michael J; McKinnon, Ross A; Miners, John O; Winkler, David A; Smith, Paul A
2004-10-07
This study aimed to evaluate in silico models based on quantum chemical (QC) descriptors derived using the electronegativity equalization method (EEM) and to assess the use of QC properties to predict chemical metabolism by human UDP-glucuronosyltransferase (UGT) isoforms. Various EEM-derived QC molecular descriptors were calculated for known UGT substrates and nonsubstrates. Classification models were developed using support vector machine and partial least squares discriminant analysis. In general, the most predictive models were generated with the support vector machine. Combining QC and 2D descriptors (from previous work) using a consensus approach resulted in a statistically significant improvement in predictivity (to 84%) over both the QC and 2D models and the other methods of combining the descriptors. EEM-derived QC descriptors were shown to be both highly predictive and computationally efficient. It is likely that EEM-derived QC properties will be generally useful for predicting ADMET and physicochemical properties during drug discovery.
Jiang, Jian; James, Christopher A; Wong, Philip
2016-09-05
An LC-MS/MS method has been developed and validated for the determination of glycine in human cerebrospinal fluid (CSF). The validated method used artificial cerebrospinal fluid as a surrogate matrix for calibration standards. The calibration curve range for the assay was 100-10,000 ng/mL, and ¹³C₂,¹⁵N-glycine was used as an internal standard (IS). Pre-validation experiments were performed to demonstrate parallelism between the surrogate matrix and standard addition methods. The mean endogenous glycine concentration in pooled human CSF, determined on three days using artificial CSF as a surrogate matrix and by the method of standard addition, was 748 ± 30.6 and 768 ± 18.1 ng/mL, respectively. A percentage difference of -2.6% indicated that artificial CSF could be used as a surrogate calibration matrix for the determination of glycine in human CSF. Quality control (QC) samples, except the lower limit of quantitation (LLOQ) QC and low QC samples, were prepared by spiking glycine into aliquots of a pooled human CSF sample. The low QC sample was prepared from a separate pooled human CSF sample containing low endogenous glycine concentrations, while the LLOQ QC sample was prepared in artificial CSF. Standard addition was used extensively to evaluate matrix effects during validation. The validated method was used to determine endogenous glycine concentrations in human CSF samples. Incurred sample reanalysis demonstrated the reproducibility of the method. Copyright © 2016 Elsevier B.V. All rights reserved.
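The method of standard addition used here fits a line to instrument response versus concentration spiked into the authentic matrix and extrapolates to zero response; the endogenous level is the magnitude of the x-intercept. A minimal sketch with invented numbers (not the paper's data):

```python
import numpy as np

def standard_addition(added_conc, responses):
    """Endogenous concentration by standard addition: regress response
    on added concentration and return the x-intercept magnitude."""
    slope, intercept = np.polyfit(added_conc, responses, 1)
    return intercept / slope

added = np.array([0.0, 500.0, 1000.0, 2000.0])   # ng/mL spiked
resp = np.array([0.75, 1.25, 1.74, 2.76])        # peak-area ratio
print(standard_addition(added, resp))            # ~743 ng/mL
```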
Park, Sang Hyuk; Park, Chan-Jeoung; Kim, Mi-Jeong; Choi, Mi-Ok; Han, Min-Young; Cho, Young-Uk; Jang, Seongsoo
2014-12-01
We developed an interinstrument comparison method for automated hematology analyzers based on a 99th percentile coefficient of variation (CV) cutoff of daily means, and validated it in both patient samples and quality control (QC) materials. A total of 120 patient samples were obtained over 6 months. Data from the first 3 months were used to determine the 99th percentile CV cutoff values, and data obtained in the last 3 months were used to calculate acceptable ranges and rejection rates. Identical analyses were also performed using QC materials. Two-instrument comparisons were also performed, and the most appropriate allowable total error (ATE) values were determined. The rejection rates based on the 99th percentile cutoff values were within 10.00% and 9.30% for the patient samples and QC materials, respectively. The acceptable ranges of QC materials based on the currently used method were wider than those calculated from the 99th percentile CV cutoff values for most items. In two-instrument comparisons, 34.8% of all comparisons failed, and 87.0% of the failed comparisons became successful when 4 SD was applied as the ATE value instead of 3 SD. The 99th percentile CV cutoff value-derived daily acceptable ranges can be used as a real-time interinstrument comparison method in both patient samples and QC materials. Applying 4 SD as the ATE value can significantly reduce unnecessary follow-up recalibration in leukocyte differential counts, reticulocytes, and mean corpuscular volume. Copyright © by the American Society for Clinical Pathology.
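The comparison logic can be pictured as follows: each day, take every instrument's daily mean for an item, compute the between-instrument CV, collect those daily CVs over a training period, and set the cutoff at their 99th percentile; later days whose CV exceeds the cutoff are flagged. The sketch below uses Python/numpy with invented numbers and is a simplification of the published procedure.

```python
import numpy as np

def daily_cv(daily_means):
    """Between-instrument CV (%) of one day's means."""
    return 100.0 * np.std(daily_means, ddof=1) / np.mean(daily_means)

def cv_cutoff(training_days, pct=99):
    """Percentile cutoff of daily between-instrument CVs over a
    training period (e.g., the first 3 months)."""
    return np.percentile([daily_cv(d) for d in training_days], pct)

rng = np.random.default_rng(1)
training_days = [rng.normal(100.0, 1.5, size=2) for _ in range(90)]
cutoff = cv_cutoff(training_days)
today = np.array([100.2, 103.9])       # two instruments' daily means
flag = daily_cv(today) > cutoff        # True -> investigate / recalibrate
```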
Aris-Brosou, Stephane; Kim, James; Li, Li; Liu, Hui
2018-05-15
Vendors in the health care industry produce diagnostic systems that, through a secured connection, allow them to monitor performance almost in real time. However, challenges exist in analyzing and interpreting large volumes of noisy quality control (QC) data. As a result, some QC shifts may not be detected early enough by the vendor and instead lead a customer to complain. We hypothesized that a more proactive response could be designed by utilizing the collected QC data more efficiently. Our aim is therefore to help prevent customer complaints by predicting them based on the QC data collected by in vitro diagnostic systems. QC data from five selected in vitro diagnostic assays were combined with the corresponding database of customer complaints over a period of 90 days. A subset of these data over the last 45 days was also analyzed to assess how the length of the training period affects predictions. We defined a set of features used to train two classifiers, one based on decision trees and the other based on adaptive boosting, and assessed model performance by cross-validation. The cross-validations showed classification error rates close to zero for some assays with adaptive boosting when predicting the potential cause of customer complaints. Performance was improved by shortening the training period when the volume of complaints increased. Denoising filters that reduced the number of categories to predict further improved performance, as their application simplified the prediction problem. This novel approach to predicting customer complaints based on QC data may allow the diagnostic industry, the expected end user of our approach, to proactively identify potential product quality issues and fix them before receiving customer complaints. This represents a new step in the direction of using big data toward product quality improvement. ©Stephane Aris-Brosou, James Kim, Li Li, Hui Liu. Originally published in JMIR Medical Informatics (http://medinform.jmir.org), 15.05.2018.
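A minimal sketch of the modelling setup described here — an adaptive boosting classifier trained on engineered QC features and scored by cross-validation — might look like the following (scikit-learn). The feature table and labels are synthetic placeholders, since the paper's feature definitions are not reproduced in the abstract.

```python
import numpy as np
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import cross_val_score

# One row per assay/instrument/window; columns stand in for engineered
# QC features (e.g., level means, SDs, shift/trend statistics).
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 6))
# Toy label: whether a customer complaint followed the QC window.
y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(scale=0.5, size=500)) > 1.0

clf = AdaBoostClassifier(n_estimators=200, random_state=0)
print(cross_val_score(clf, X, y, cv=5).mean())  # cross-validated accuracy
```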
NASA Astrophysics Data System (ADS)
Bushnell, M.; Waldmann, C.; Hermes, J.; Tamburri, M.
2017-12-01
Many oceanographic observation groups create and maintain QA, QC, and best practices (BP) to ensure efficient and accurate data collection and to quantify data quality. Several entities - IOOS® QARTOD, AtlantOS, ACT, WMO/IOC JCOMM OCG - have joined forces to document existing practices, identify gaps, and support development of emerging techniques. While each group has a slightly different focus, many underlying QA/QC/BP needs can be quite common. QARTOD focuses upon real-time data QC and has produced manuals that address QC tests for eleven ocean variables. AtlantOS is a research and innovation project working towards the integration of ocean-observing activities across all disciplines in the Atlantic Basin. ACT brings together research institutions, resource managers, and private companies to foster the development and adoption of effective and reliable sensors for coastal, freshwater, and ocean environments. JCOMM promotes broad international coordination of oceanographic and marine meteorological observations, data management and services. Leveraging the existing efforts of these organizations is an efficient way to consolidate available information, develop new practices, and evaluate the use of ISO standards to judge the quality of measurements. ISO standards may offer accepted support for a framework for an ocean data quality management system, similar to the meteorological standards defined by WMO (https://www.wmo.int/pages/prog/arep/gaw/qassurance.html). We will first cooperatively develop a plan to create a QA/QC/BP manual. The resulting plan will describe the need for such a manual, the extent of the manual, the process used to engage the community in creating it, the maintenance of the resultant document, and how these things will be done. It will also investigate standards for metadata. The plan will subsequently be used to develop the QA/QC/BP manual, providing guidance which advances the standards adopted by IOOS, AtlantOS, JCOMM, and others.
77 FR 73611 - Submission for OMB Review; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-11
...: Negative Quality Control Review Schedule. OMB Control Number: 0584-0034. Summary of Collection: The legislative basis for the operation of the quality control system is provided by section 16 of the Food and Nutrition Act of 2008. State agencies are required to perform Quality Control (QC) reviews for the...
NASA Astrophysics Data System (ADS)
Acaba, K. J. C.; Cinco, L. D.; Melchor, J. N.
2016-03-01
Daily QC tests performed on screen-film mammography (SFM) equipment are essential to ensure that both the SFM unit and the film processor are working in a consistent manner. The Breast Imaging Unit of the USTH-Benavides Cancer Institute has been conducting QC following the test protocols in the IAEA Human Health Series No. 2 manual. However, the availability of a Leeds breast phantom (CRP E13039) in the facility made the task easier: instead of carrying out separate tests of AEC constancy and light sensitometry, a single exposure of the phantom accomplishes both tests. It was observed that measurements of mAs output and optical densities (ODs) using the Leeds TOR (MAX) phantom are comparable with those obtained from the usual conduct of the tests, taking into account the attenuation characteristic of the phantom. Image quality parameters such as low-contrast and high-contrast details were also evaluated from the phantom image. The authors recognize the usefulness of the phantom in determining technical factors that will help improve detection of the smallest pathological details in breast images. The phantom is also convenient for daily QC monitoring and economical, since fewer films are expended.
Testing and analysis of LWT and SCB properties of asphalt concrete mixtures.
DOT National Transportation Integrated Search
2016-04-01
Currently, Louisiana's Quality Control and Quality Assurance (QC/QA) practice for asphalt mixtures in pavement construction is mainly based on controlling properties of plant-produced mixtures that include gradation and asphalt content, voids f...
Embankment quality and assessment of moisture control implementation : tech transfer summary.
DOT National Transportation Integrated Search
2016-02-01
The motivation for this project was based on work by Iowa State University (ISU) researchers at a few recent grading projects that demonstrated embankments were being constructed outside moisture control limits, even though the contractor QC ...
Quality control and quality assurance of hot mix asphalt construction in Delaware.
DOT National Transportation Integrated Search
2006-07-01
In the mid-60s, the Federal Highway Administration began to encourage Departments of Transportation and Contractors toward the use of quality control and quality assurance (QA/QC) specifications, which are statistically based. For example,...
Variation of coda wave attenuation in the Alborz region and central Iran
NASA Astrophysics Data System (ADS)
Rahimi, H.; Motaghi, K.; Mukhopadhyay, S.; Hamzehloo, H.
2010-06-01
More than 340 earthquakes recorded by the Institute of Geophysics, University of Tehran (IGUT) short-period stations from 1996 to 2004 were analysed to estimate the S-coda attenuation in the Alborz region, the northern part of the Alpine-Himalayan orogen in western Asia, and in central Iran, the foreland of this orogen. The coda quality factor, Qc, was estimated using the single backscattering model in frequency bands of 1-25 Hz. In this research, the lateral and depth variations of Qc in the Alborz region and central Iran are studied. It is observed that in the Alborz region there is no significant lateral variation in Qc. The average frequency relation for this region is Qc = (79 ± 2)f^(1.07 ± 0.08). Two anomalous high-attenuation areas in central Iran are recognized around the stations LAS and RAZ. The average frequency relation for central Iran, excluding the values of these two stations, is Qc = (94 ± 2)f^(0.97 ± 0.12). To investigate the variation of attenuation with depth, the Qc value was calculated for 14 lapse times (25, 30, 35, ..., 90 s) for two data sets with epicentral distance ranges R < 100 km (data set 1) and 100 < R < 200 km (data set 2) in each area. It is observed that Qc increases with depth. However, the rate of increase of Qc with depth is not uniform in our study area. Beneath central Iran the rate of increase of Qc is greater at depths less than 100 km than at larger depths, indicating the existence of a high-attenuation anomalous structure under the lithosphere of central Iran. In addition, below ~180 km the Qc value does not vary much with depth under either study area, indicating the presence of a transparent mantle under both.
Hybrid spin and valley quantum computing with singlet-triplet qubits.
Rohling, Niklas; Russ, Maximilian; Burkard, Guido
2014-10-24
The valley degree of freedom in the electronic band structure of silicon, graphene, and other materials is often considered to be an obstacle for quantum computing (QC) based on electron spins in quantum dots. Here we show that control over the valley state opens new possibilities for quantum information processing. Combining qubits encoded in the singlet-triplet subspace of spin and valley states allows for universal QC using a universal two-qubit gate directly provided by the exchange interaction. We show how spin and valley qubits can be separated in order to allow for single-qubit rotations.
77 FR 3228 - Submission for OMB Review; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2012-01-23
..., Office of Management and Budget (OMB), [email protected] or fax (202) 395-5806 and to... it displays a currently valid OMB control number. Food and Nutrition Service Title: Quality Control... perform Quality Control (QC) review for the Supplemental Nutrition Assistance Program (SNAP). The FNS-380...
A quality control system for digital elevation data
NASA Astrophysics Data System (ADS)
Knudsen, Thomas; Kokkendorf, Simon; Flatman, Andrew; Nielsen, Thorbjørn; Rosenkranz, Brigitte; Keller, Kristian
2015-04-01
In connection with the introduction of a new version of the Danish national coverage Digital Elevation Model (DK-DEM), the Danish Geodata Agency has developed a comprehensive quality control (QC) and metadata production (MP) system for LiDAR point cloud data. The architecture of the system reflects its origin in a national mapping organization where raw data deliveries are typically outsourced to external suppliers. It also reflects a design decision of aiming, whenever conceivable, at full spatial coverage tests rather than scattered sample checks. Hence, the QC procedure is split in two phases: a reception phase and an acceptance phase. The primary aim of the reception phase is to do a quick assessment of things that can typically go wrong and which are relatively simple to check: data coverage, data density, strip adjustment. If a data delivery passes the reception phase, the QC continues with the acceptance phase, which checks five different aspects of the point cloud data: vertical accuracy, vertical precision, horizontal accuracy, horizontal precision, and point classification correctness. The vertical descriptors are comparatively simple to measure: the vertical accuracy is checked by direct comparison with previously surveyed patches, and the vertical precision is derived from the observed variance on well-defined flat surface patches. These patches are automatically derived from the road centerlines registered in FOT, the official Danish map database. The horizontal descriptors are less straightforward to measure, since potential reference material for direct comparison is typically expected to be less accurate than the LiDAR data. The solution selected is to compare photogrammetrically derived roof centerlines from FOT with LiDAR-derived roof centerlines. The latter are constructed by taking the 3D Hough transform of a point cloud patch defined by the photogrammetrical roof polygon; the LiDAR-derived roof centerline is then the intersection line of the two primary planes of the transformed data. Since the photogrammetrical and the LiDAR-derived roof centerline sets are independently derived, a low RMS difference indicates that both data sets are of very high accuracy. The horizontal precision is derived from a similar comparison between LiDAR-derived roof centerlines in the overlap zone of neighbouring flight strips. Contrary to the vertical and horizontal descriptors, the point classification correctness is neither geometric nor well defined. In this case we must introduce a human in the loop and present data in a form that is as useful as possible to this human. Hence, the QC system produces maps of suspicious patterns such as vegetation below buildings; points classified as buildings where no building is registered in the map database; building polygons from the map database without any building points; and buildings on roads. All elements of the QC process are carried out in smaller tiles (typically 1 km × 1 km) and are hence trivially parallelizable. Results from the parallel executing processes are collected in a geospatial database system (PostGIS), and the progress can be analyzed and visualized in a desktop GIS while the processes run. Implementation-wise, the system is based on open source components, primarily from the OSGeo stack (GDAL, PostGIS, QGIS, NumPy, SciPy, etc.). The system-specific code is also being open sourced. This open source distribution philosophy supports the parallel execution paradigm, since all available hardware can be utilized without any licensing problems. As yet, the system has only been used for QC of the first part of the new Danish elevation model. The experience has, however, been very positive. Especially notable is the utility of doing full spatial coverage tests (rather than scattered sample checks). This means that error detection and error reports are exactly as spatial as the point cloud data they concern, which makes it very easy for both data receiver and data provider to discuss and reason about the nature and causes of irregularities.
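Of the five acceptance-phase descriptors, the two vertical ones are simple enough to sketch directly: accuracy as the mean signed height difference against surveyed control patches, and precision as the height dispersion on flat, road-derived patches. A minimal per-patch illustration (Python/numpy; the array names are placeholders):

```python
import numpy as np

def vertical_accuracy(lidar_z, reference_z):
    """Mean signed difference between LiDAR heights and previously
    surveyed control-patch heights (per patch or per tile)."""
    return float(np.mean(np.asarray(lidar_z) - np.asarray(reference_z)))

def vertical_precision(flat_patch_z):
    """Height dispersion on a well-defined flat surface patch, e.g.,
    one derived from road centerlines in the map database."""
    return float(np.std(flat_patch_z, ddof=1))
```

In a system like the one described above, such statistics would be accumulated per 1 km × 1 km tile and written to the PostGIS database for map-based inspection.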
Szaszkó, Mária; Hajdú, István; Flachner, Beáta; Dobi, Krisztina; Magyar, Csaba; Simon, István; Lőrincz, Zsolt; Kapui, Zoltán; Pázmány, Tamás; Cseh, Sándor; Dormán, György
2017-02-01
A glutaminyl cyclase (QC) fragment library was in silico selected by disconnection of the structure of known QC inhibitors and by lead-like 2D virtual screening of the same set. The resulting fragment library (204 compounds) was acquired from commercial suppliers and pre-screened by differential scanning fluorimetry followed by functional in vitro assays. In this way, 10 fragment hits were identified ([Formula: see text]5 % hit rate, best inhibitory activity: 16 [Formula: see text]). The in vitro hits were then docked to the active site of QC, and the best scoring compounds were analyzed for binding interactions. Two fragments bound to different regions in a complementary manner, and thus, linking those fragments offered a rational strategy to generate novel QC inhibitors. Based on the structure of the virtual linked fragment, a 77-membered QC target focused library was selected from vendor databases and docked to the active site of QC. A PubChem search confirmed that the best scoring analogues are novel, potential QC inhibitors.
SRT Evaluation of AIRS Version-6.02 and Version-6.02 AIRS Only (6.02 AO) Products
NASA Technical Reports Server (NTRS)
Susskind, Joel; Iredell, Lena; Molnar, Gyula; Blaisdell, John
2012-01-01
Version-6 contains a number of significant improvements over Version-5. This report compares Version-6 products resulting from the advances listed below to those from Version-5. 1. Improved methodology to determine skin temperature (T(sub s)) and spectral emissivity (Epsilon(sub v)). 2. Use of a Neural-net start-up state. 3. Improvements which decrease the spurious negative Version-5 trend in tropospheric temperatures. 4. Improved QC methodology: Version-6 uses separate QC thresholds optimized for Data Assimilation (QC=0) and Climate applications (QC=0,1), respectively. 5. Channel-by-channel QC flags for clear-column radiances R-hat(sub tau). 6. Improved cloud parameter retrieval algorithm. 7. Improved OLR RTA. Our evaluation compared V6.02 and V6.02 AIRS Only (V6.02 AO) Quality Controlled products with those of Version-5.0. In particular we evaluated surface skin temperature T(sub s), surface spectral emissivity Epsilon(sub v), temperature profile T(p), water vapor profile q(p), OLR, OLR(sub CLR), effective cloud fraction alpha-Epsilon, and cloud-cleared radiances R-hat(sub tau). We conducted two types of evaluations. The first compared results on 7 focus days to collocated ECMWF truth. The seven focus days are: September 6, 2002; January 25, 2003; September 29, 2004; August 5, 2005; February 24, 2007; August 10, 2007; and May 30, 2010. In these evaluations, we show results for T(sub s), Epsilon(sub v), T(p), and q(p) in terms of yields, and RMS differences and biases with regard to ECMWF. We also show yield trends as well as bias trends of these quantities relative to ECMWF truth. We also show yields and accuracy of channel-by-channel QC'd values of R-hat(sub tau) for V6.02 and V6.02 AO; Version-5 did not contain channel-by-channel QC'd values of R-hat(sub tau). In the second type of evaluation, we compared V6.03 monthly mean Level-3 products to those of Version-5.0 for four different months (January, April, July, and October) in 3 different years (2003, 2007, and 2011). In particular, we compared V6.03 and V5.0 trends of T(p), q(p), alpha-Epsilon, OLR, and OLR(sub CLR) computed from the results for these 12 time periods.
Terahertz metasurface quantum-cascade VECSELs: theory and performance
Xu, Luyao; Curwen, Christopher; Chen, Daguan; ...
2017-04-12
A longstanding challenge for terahertz quantum-cascade (QC) lasers is achieving both high power and a high-quality beam pattern, due in part to their use of sub-wavelength metallic waveguides. Recently, the vertical-external-cavity surface-emitting laser (VECSEL) concept was demonstrated for the first time in the terahertz range and for a QC-laser. This is enabled by the development of an amplifying metasurface reflector capable of coupling incident free-space THz radiation to the QC-laser material such that it is amplified and re-radiated. The THz metasurface QC-VECSEL initiates a new approach for making QC-lasers with high power and excellent beam pattern. Furthermore, the ability to engineer the electromagnetic phase, amplitude, and polarization response of the metasurface enables lasers with new functionality. Our article provides an overview of the fundamental theory, design considerations, and recent results for high-performance THz QC-VECSELs.
PHABULOSA Controls the Quiescent Center-Independent Root Meristem Activities in Arabidopsis thaliana
Sebastian, Jose; Ryu, Kook Hui; Zhou, Jing; Tarkowská, Danuše; Tarkowski, Petr; Cho, Young-Hee; Yoo, Sang-Dong; Kim, Eun-Sol; Lee, Ji-Young
2015-01-01
Plant growth depends on stem cell niches in meristems. In the root apical meristem, the quiescent center (QC) cells form a niche together with the surrounding stem cells. Stem cells produce daughter cells that are displaced into a transit-amplifying (TA) domain of the root meristem. TA cells divide several times to provide cells for growth. SHORTROOT (SHR) and SCARECROW (SCR) are key regulators of the stem cell niche. Cytokinin controls TA cell activities in a dose-dependent manner. Although the regulatory programs in each compartment of the root meristem have been identified, it is still unclear how they coordinate one another. Here, we investigate how PHABULOSA (PHB), under the posttranscriptional control of SHR and SCR, regulates TA cell activities. The root meristem and growth defects in shr or scr mutants were significantly recovered in the shr phb or scr phb double mutant, respectively. This rescue in root growth occurs in the absence of a QC. Conversely, when the modified PHB, which is highly resistant to microRNA, was expressed throughout the stele of the wild-type root meristem, root growth became very similar to that observed in the shr mutant; however, the identity of the QC was unaffected. Interestingly, a moderate increase in PHB resulted in a root meristem phenotype similar to that observed following the application of high levels of cytokinin. Our protoplast assay and transgenic approach using ARR10 suggest that the depletion of TA cells by high PHB in the stele occurs via the repression of B-ARR activities. This regulatory mechanism seems to help to maintain cytokinin homeostasis in the meristem. Taken together, our study suggests that PHB can dynamically regulate TA cell activities in a QC-independent manner, and that the SHR-PHB pathway enables a robust root growth system by coordinating the stem cell niche and TA domain. PMID:25730098
Quevauviller, P; Bennink, D; Bøwadt, S
2001-05-01
It is now well recognised that the quality control (QC) of all types of analyses, including environmental analyses, depends on the appropriate use of reference materials. One of the ways to check the accuracy of methods is based on the use of Certified Reference Materials (CRMs), whereas other types of (non-certified) Reference Materials (RMs) are used for routine quality control (establishment of control charts) and interlaboratory testing (e.g. proficiency testing). The perception of these materials, in particular with respect to their production and use, differs widely according to various perspectives (e.g. RM producers, routine laboratories, researchers). This review discusses some critical aspects of RM use and production for the QC of environmental analyses and describes the new approach followed by the Measurements & Testing Generic Activity (European Commission) to tackle new research and production needs.
Palmer, Antony L; Pearson, Michael; Whittard, Paul; McHugh, Katie E; Eaton, David J
2016-12-01
To assess the status and practice of kilovoltage (kV) radiotherapy in the UK. 96% of the radiotherapy centres in the UK responded to a comprehensive survey. An analysis of the installed equipment base, patient numbers, clinical treatment sites, quality control (QC) testing and radiation dosimetry processes was undertaken. 73% of UK centres have at least one kV treatment unit, with 58 units installed across the UK. Although 35% of units are over 10 years old, 39% of units have been installed in the last 5 years. Approximately 6000 patients are treated with kV units in the UK each year, the most common site (44%) being basal cell carcinoma. A benchmark of QC practice in the UK is presented, against which individual centres can compare their procedures, frequency of testing and acceptable tolerance values. We propose the use of internal "notification" and "suspension" levels for analysis. All surveyed centres were using recommended Codes of Practice for kV dosimetry in the UK, approximately equal numbers using in-air and in-water methodologies for medium energy, with two-thirds of all centres citing "clinical relevance" as the reason for their choice of code. 64% of centres had hosted an external dosimetry audit within the last 3 years, with only one centre never having been independently audited. The majority of centres use locally measured applicator factors and published backscatter factors for treatments. Monitor unit calculations are performed using software in only 36% of centres. A comprehensive review of current kV practice in the UK is presented. Advances in knowledge: Data and discussion on contemporary kV radiotherapy in the UK, with a particular focus on physics aspects.
Habchi, Baninia; Alves, Sandra; Jouan-Rimbaud Bouveresse, Delphine; Appenzeller, Brice; Paris, Alain; Rutledge, Douglas N; Rathahao-Paris, Estelle
2018-01-01
Due to the presence of pollutants in the environment and food, the assessment of human exposure is required. This necessitates high-throughput approaches enabling large-scale analysis and, as a consequence, the use of high-performance analytical instruments to obtain highly informative metabolomic profiles. In this study, direct introduction mass spectrometry (DIMS) was performed using a Fourier transform ion cyclotron resonance (FT-ICR) instrument equipped with a dynamically harmonized cell. Data quality was evaluated based on mass resolving power (RP), mass measurement accuracy, and ion intensity drifts from repeated injections of a quality control sample (QC) throughout the analytical process. The large size of DIMS data entails the use of bioinformatic tools for the automatic selection of common ions found in all QC injections and for robustness assessment and correction of eventual technical drifts. RP values greater than 10^6 and mass measurement accuracies better than 1 ppm were obtained in broadband mode, resulting in the detection of isotopic fine structure. Hence, a very accurate relative isotopic mass defect (RΔm) value could be calculated, which significantly reduces the number of elemental composition (EC) candidates and greatly improves compound annotation. A very satisfactory estimate of the repeatability of both peak intensity and mass measurement was demonstrated. Although a non-negligible ion intensity drift was observed for negative ion mode data, a normalization procedure was easily applied to correct this phenomenon. This study illustrates the performance and robustness of the dynamically harmonized FT-ICR cell for performing large-scale high-throughput metabolomic analyses in routine conditions. Graphical abstract: Analytical performance of an FT-ICR instrument equipped with a dynamically harmonized cell.
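The sub-1 ppm figure refers to the standard parts-per-million mass error; at m/z 300, for example, a 1 ppm window is only ±0.0003. A one-line helper for completeness:

```python
def ppm_error(measured_mz, theoretical_mz):
    """Mass measurement accuracy in parts per million."""
    return 1e6 * (measured_mz - theoretical_mz) / theoretical_mz

print(ppm_error(300.0003, 300.0))  # 1.0 ppm
```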
Jaccoulet, E; Schweitzer-Chaput, A; Toussaint, B; Prognon, P; Caudron, E
2018-09-01
Compounding of monoclonal antibodies (mAbs) is constantly increasing in hospitals. Quality control (QC) of compounded mAbs, based on quantification and identification, is required to prevent potential errors, and a fast method is needed to manage outpatient chemotherapy administration. A simple and ultra-fast (less than 30 s) method using flow injection analysis combined with the least-squares matching method provided by the analyzer software was developed and evaluated for the routine hospital QC of three compounded mAbs: bevacizumab, infliximab and rituximab. The method was evaluated through qualitative and quantitative parameters. Preliminary analysis of the UV absorption and second-derivative spectra of the mAbs allowed us to adapt the analytical conditions to the therapeutic range of each mAb. In terms of quantitative QC, linearity, accuracy and precision were assessed as specified in ICH guidelines. Very satisfactory recovery was achieved, and the RSD (%) of the intermediate precision was less than 1.1%. Qualitative analytical parameters were also evaluated in terms of specificity, sensitivity and global precision through a confusion matrix. The results were concentration- and mAb-dependent, and excellent (100%) specificity and sensitivity were reached within a specific concentration range. Finally, routine application to "real life" samples (n = 209) from different batches of the three mAbs complied with the specifications of the quality control, i.e., excellent identification (100%) and concentrations within ±15% of the target belonging to the calibration range. The successful combination of second-derivative spectroscopy and the partial least-squares matching method demonstrates the interest of FIA for the ultra-fast QC of mAbs after compounding. Copyright © 2018 Elsevier B.V. All rights reserved.
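The matching step can be pictured as fitting each reference spectrum to the measured one with a single scale factor and keeping the best-fitting candidate; the scale then carries the concentration information. The sketch below is a generic least-squares illustration in Python/numpy, not the analyzer software's algorithm; it assumes all spectra share one wavelength grid and omits the second-derivative preprocessing.

```python
import numpy as np

def match_spectrum(sample, references):
    """Identify a mAb by least-squares matching of its UV spectrum
    against a library of reference spectra (same wavelength grid).
    Returns (identity, fitted scale ~ concentration ratio, residual)."""
    best = None
    for name, ref in references.items():
        scale = float(np.dot(ref, sample) / np.dot(ref, ref))
        residual = float(np.linalg.norm(sample - scale * ref))
        if best is None or residual < best[2]:
            best = (name, scale, residual)
    return best
```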
Zahedi, Atena; On, Vincent; Lin, Sabrina C; Bays, Brett C; Omaiye, Esther; Bhanu, Bir; Talbot, Prue
2016-01-01
There is a foundational need for quality control tools in stem cell laboratories engaged in basic research, regenerative therapies, and toxicological studies. These tools require automated methods for evaluating cell processes and quality during in vitro passaging, expansion, maintenance, and differentiation. In this paper, an unbiased, automated high-content profiling toolkit, StemCellQC, is presented that non-invasively extracts information on cell quality and cellular processes from time-lapse phase-contrast videos. Twenty four (24) morphological and dynamic features were analyzed in healthy, unhealthy, and dying human embryonic stem cell (hESC) colonies to identify those features that were affected in each group. Multiple features differed in the healthy versus unhealthy/dying groups, and these features were linked to growth, motility, and death. Biomarkers were discovered that predicted cell processes before they were detectable by manual observation. StemCellQC distinguished healthy and unhealthy/dying hESC colonies with 96% accuracy by non-invasively measuring and tracking dynamic and morphological features over 48 hours. Changes in cellular processes can be monitored by StemCellQC and predictions can be made about the quality of pluripotent stem cell colonies. This toolkit reduced the time and resources required to track multiple pluripotent stem cell colonies and eliminated handling errors and false classifications due to human bias. StemCellQC provided both user-specified and classifier-determined analysis in cases where the affected features are not intuitive or anticipated. Video analysis algorithms allowed assessment of biological phenomena using automatic detection analysis, which can aid facilities where maintaining stem cell quality and/or monitoring changes in cellular processes are essential. In the future StemCellQC can be expanded to include other features, cell types, treatments, and differentiating cells.
QUALITY ASSURANCE AND QUALITY CONTROL FOR WASTE CONTAINMENT FACILITIES. Project Summary
It is generally agreed that both quality assurance (QA) and quality control (QC) are essential to the proper installation and eventual performance of environmentally safe and secure waste containment systems. Even further, there are both manufacturing and construction aspects to...
"Gap hunting" to characterize clustered probe signals in Illumina methylation array data.
Andrews, Shan V; Ladd-Acosta, Christine; Feinberg, Andrew P; Hansen, Kasper D; Fallin, M Daniele
2016-01-01
The Illumina 450k array has been widely used in epigenetic association studies. Current quality-control (QC) pipelines typically remove certain sets of probes, such as those containing a SNP or with multiple mapping locations. An additional set of potentially problematic probes comprises those with DNA methylation distributions characterized by two or more distinct clusters separated by gaps. Data-driven identification of such probes may offer additional insights for downstream analyses. We developed a procedure, termed "gap hunting", to identify probes showing clustered distributions. Among 590 peripheral blood samples from the Study to Explore Early Development, we identified 11,007 "gap probes". The vast majority (9199) are likely attributable to an underlying SNP(s) or other variant in the probe, although SNP-affected probes exist that do not produce gap signals. Specific factors predict which SNPs lead to gap signals, including the type of nucleotide change, probe type, DNA strand, and overall methylation state. These expected effects are demonstrated in paired genotype and 450k data on the same samples. Gap probes can also serve as a surrogate for the local genetic sequence on a haplotype scale and can be used to adjust for population stratification. The characteristics of gap probes reflect potentially informative biology. QC pipelines may benefit from an efficient data-driven approach that "flags" gap probes, rather than filtering such probes, followed by careful interpretation of downstream association analyses. Our results should translate directly to the recently released Illumina EPIC array, given the similar chemistry and content design.
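The core of "gap hunting" — sorting a probe's beta values and splitting wherever consecutive values jump by more than a threshold — is straightforward to sketch. The thresholds below are illustrative; the paper's exact parameters are not reproduced here.

```python
import numpy as np

def is_gap_probe(betas, gap_threshold=0.05, min_cluster=3):
    """Flag a probe whose methylation (beta) distribution splits into
    two or more clusters separated by gaps larger than `gap_threshold`,
    requiring each cluster to hold at least `min_cluster` samples."""
    x = np.sort(np.asarray(betas, dtype=float))
    breaks = np.where(np.diff(x) > gap_threshold)[0]
    sizes = np.diff(np.concatenate(([0], breaks + 1, [x.size])))
    return breaks.size > 0 and np.count_nonzero(sizes >= min_cluster) >= 2
```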
Ananthula, Suryatheja; Janagam, Dileep R; Jamalapuram, Seshulatha; Johnson, James R; Mandrell, Timothy D; Lowe, Tao L
2015-10-15
A rapid, sensitive, selective and accurate LC/MS/MS method was developed for the quantitative determination of levonorgestrel (LNG) in rat plasma and further validated for specificity, linearity, accuracy, precision, sensitivity, matrix effect, recovery efficiency and stability. A liquid-liquid extraction procedure using a hexane:ethyl acetate mixture at an 80:20 v:v ratio was employed to efficiently extract LNG from rat plasma. A reversed-phase Luna C18(2) column (50 × 2.0 mm i.d., 3 μm) installed on an AB SCIEX Triple Quad™ 4500 LC/MS/MS system was used to perform the chromatographic separation. LNG was identified within 2 min with high specificity. A linear calibration curve was obtained over the 0.5-50 ng/mL concentration range. The developed method was validated for intra-day and inter-day accuracy and precision, whose values fell within acceptable limits. The matrix effect was found to be minimal. Recovery efficiency at three quality control (QC) concentrations, 0.5 (low), 5 (medium) and 50 (high) ng/mL, was found to be >90%. The stability of LNG at various stages of the experiment, including storage, extraction and analysis, was evaluated using QC samples, and the results showed that LNG was stable under all conditions. This validated method was successfully used to study the pharmacokinetics of LNG in rats after SubQ injection, demonstrating its applicability in relevant preclinical studies. Copyright © 2015 Elsevier B.V. All rights reserved.
Visualization and Quality Control Web Tools for CERES Products
NASA Astrophysics Data System (ADS)
Mitrescu, C.; Doelling, D. R.
2017-12-01
The NASA CERES project continues to provide the scientific community a wide variety of satellite-derived data products, such as observed TOA broadband shortwave and longwave fluxes, computed TOA and surface fluxes, as well as cloud, aerosol, and other atmospheric parameters. They encompass a wide range of temporal and spatial resolutions suited to specific applications. CERES data are used mostly by climate modeling communities but also by a wide variety of educational institutions. To better serve our users, a web-based Ordering and Visualization Tool (OVT) was developed using open source software such as Eclipse, Java, JavaScript, OpenLayers, Flot, Google Maps, Python, and others. Due to increased demand by our own scientists, we also implemented a series of specialized functions to be used in the process of CERES Data Quality Control (QC), such as 1- and 2-D histograms, anomalies and differences, temporal and spatial averaging, and side-by-side parameter comparison, which made the QC process far easier and faster, but more importantly far more portable. With the integration of ground-site observed surface fluxes, we further help the CERES project QC the CERES computed surface fluxes. An overview of the CERES OVT basic functions using open source software, as well as future steps in expanding its capabilities, will be presented at the meeting.
PACS 2000: quality control using the task allocation chart
NASA Astrophysics Data System (ADS)
Norton, Gary S.; Romlein, John R.; Lyche, David K.; Richardson, Ronald R., Jr.
2000-05-01
Medical imaging's technological evolution in the next century will continue to include Picture Archive and Communication Systems (PACS) and teleradiology. It is difficult to predict radiology's future in the new millennium, with both computed radiography and direct digital capture competing as the primary image acquisition methods for routine radiography. Changes in Computed Axial Tomography (CT) and Magnetic Resonance Imaging (MRI) continue to amaze the healthcare community. No matter how the acquisition, display, and archive functions change, Quality Control (QC) of the radiographic imaging chain will remain an important step in the imaging process. The Task Allocation Chart (TAC) is a tool that can be used in a medical facility's QC process to indicate the testing responsibilities of the image stakeholders and the medical informatics department. The TAC shows a grid of equipment to be serviced, tasks to be performed, and the organization assigned to perform each task. Additionally, skills, tasks, time, and references for each task can be provided. QC of the PACS must be stressed as a primary element of a PACS implementation. The TAC can be used to clarify responsibilities during warranty and paid maintenance periods. Establishing a TAC as part of a PACS implementation has a positive effect on patient care and clinical acceptance.
Autonomous Quality Control of Joint Orientation Measured with Inertial Sensors.
Lebel, Karina; Boissy, Patrick; Nguyen, Hung; Duval, Christian
2016-07-05
Clinical mobility assessment is traditionally performed in laboratories using complex and expensive equipment. The low accessibility of such equipment, combined with the emerging trend to assess mobility in free-living environments, creates a need for body-worn sensors (e.g., inertial measurement units, IMUs) that are capable of capturing the complexity of motor performance using meaningful measurements, such as joint orientation. However, the accuracy of joint orientation estimates from IMUs may be affected by the environment, the joint tracked, the type of motion performed, and its velocity. This study investigates a quality control (QC) process to assess the quality of orientation data based on features extracted from the raw inertial sensor signals. Joint orientation (trunk, hip, knee, ankle) of twenty participants was acquired by an optical motion capture system and IMUs during a variety of tasks (sit, sit-to-stand transition, walking, turning) performed under varying conditions (speed, environment). An artificial neural network was used to classify good and bad sequences of joint orientation, with a sensitivity and a specificity above 83%. This study confirms the feasibility of performing QC on IMU joint orientation data based on raw signal features. This innovative QC approach may be of particular interest in a big data context, such as remote monitoring of patients' mobility.
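A minimal sketch of the classification step described above, using a generic multilayer perceptron from scikit-learn on synthetic data; the features and labels are hypothetical stand-ins and do not reproduce the study's network or feature set.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

# Hypothetical feature matrix: one row per recorded sequence, columns are
# features extracted from raw accelerometer/gyroscope signals
# (e.g., mean angular rate, signal magnitude area, jerk statistics).
rng = np.random.default_rng(1)
X = rng.standard_normal((400, 12))
y = (X[:, 0] + 0.5 * X[:, 3] + 0.3 * rng.standard_normal(400)) > 0  # good/bad label

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
clf.fit(X_tr, y_tr)

# Sensitivity/specificity on held-out sequences.
pred = clf.predict(X_te)
tp = np.sum(pred & y_te); tn = np.sum(~pred & ~y_te)
fp = np.sum(pred & ~y_te); fn = np.sum(~pred & y_te)
print("sensitivity:", tp / (tp + fn), "specificity:", tn / (tn + fp))
```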
Kpaibe, André P S; Ben-Ameur, Randa; Coussot, Gaëlle; Ladner, Yoann; Montels, Jérôme; Ake, Michèle; Perrin, Catherine
2017-08-01
Snake venoms constitute a very promising resource for the development of new medicines. They are mainly composed of very complex peptide and protein mixtures, whose composition may vary significantly from batch to batch. This variability is a challenge for routine quality control (QC) in the pharmaceutical industry. In this paper, we report the use of capillary zone electrophoresis (CZE) for the development of an analytical fingerprint methodology to assess the quality of snake venoms. The analytical fingerprint concept is widely used for the QC of herbal drugs but has so far rarely been applied to venom QC. CZE was chosen for its intrinsic efficiency in the separation of protein and peptide mixtures. The analytical fingerprint methodology was first developed and evaluated for a particular snake venom, Lachesis muta. Optimal analysis conditions required the use of a PDADMAC capillary coating to avoid protein and peptide adsorption. The same analytical conditions were then applied to other snake venom species. Different electrophoretic profiles were obtained for each venom. Excellent repeatability and intermediate precision were observed for each batch. Analysis of different batches of the same species revealed inherent qualitative and quantitative composition variations of the venoms between individuals. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Origin of the concept of the quiescent centre of plant roots.
Barlow, Peter W
2016-09-01
Concepts in biology feed into general theories of growth, development and evolution of organisms and how they interact with the living and non-living components of their environment. A well-founded concept clarifies unsolved problems and serves as a focus for further research. One such example of a constructive concept in the plant sciences is that of the quiescent centre (QC). In anatomical terms, the QC is an inert group of cells maintained within the apex of plant roots. However, the evidence that established the presence of a QC accumulated only gradually, making use of strands of different types of observations, notably from geometrical-analytical anatomy, radioisotope labelling and autoradiography. In their turn, these strands contributed to other concepts: those of the mitotic cell cycle and of tissue-related cell kinetics. Another important concept to which the QC contributed was that of tissue homeostasis. The general principle of this last-mentioned concept is expressed by the QC in relation to the recovery of root growth following a disturbance to cell proliferation; the resulting activation of the QC provides new cells which not only repair the root meristem but also re-establish a new QC.
Scheltema, Richard A; Mann, Matthias
2012-06-01
With the advent of high-throughput mass spectrometry (MS)-based proteomics, the magnitude and complexity of the experiments performed have increased dramatically. Likewise, investments in chromatographic and MS instrumentation are a large proportion of the budget of proteomics laboratories. Guarding measurement quality and maximizing uptime of the LC-MS/MS systems therefore requires constant care despite automated workflows. We describe a real-time surveillance system, called SprayQc, that continuously monitors the status of the peripheral equipment to ensure that operational parameters are within an acceptable range. SprayQc is composed of multiple plug-in software components that use computer vision to analyze electrospray conditions, monitor the chromatographic device for stable backpressure, interact with a column oven to control pressure by temperature, and ensure that the mass spectrometer is still acquiring data. Action is taken when a failure condition has been detected, such as stopping the column oven and the LC flow, as well as automatically notifying the appropriate operator. Additionally, all defined metrics can be recorded, synchronized on retention time with the MS acquisition file, allowing for later inspection and providing valuable information for optimization. SprayQc has been extensively tested in our laboratory, supports third-party plug-in development, and is freely available for download from http://sourceforge.org/projects/sprayqc.
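The plug-in surveillance pattern described above can be sketched as follows; the class and function names are illustrative assumptions, not SprayQc's actual API.

```python
import time

class MonitorPlugin:
    """Base class for a peripheral monitor: report OK/failure each polling cycle."""
    name = "base"
    def check(self) -> bool:
        raise NotImplementedError

class BackpressureMonitor(MonitorPlugin):
    """Flags a failure when LC backpressure leaves an acceptable band."""
    name = "LC backpressure"
    def __init__(self, read_pressure, low=50.0, high=400.0):
        self.read_pressure, self.low, self.high = read_pressure, low, high
    def check(self) -> bool:
        return self.low <= self.read_pressure() <= self.high

def surveillance_loop(plugins, on_failure, poll_s=5.0, cycles=3):
    """Poll every plugin; on the first failure take action and stop."""
    for _ in range(cycles):
        for p in plugins:
            if not p.check():
                on_failure(p.name)   # e.g., stop LC flow, notify operator
                return
        time.sleep(poll_s)

# Usage with a fake pressure reader standing in for real hardware.
surveillance_loop(
    [BackpressureMonitor(read_pressure=lambda: 210.0)],
    on_failure=lambda name: print(f"FAILURE detected by {name}: stopping flow"),
    poll_s=0.01,
)
```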
High-throughput high-volume nuclear imaging for preclinical in vivo compound screening.
Macholl, Sven; Finucane, Ciara M; Hesterman, Jacob; Mather, Stephen J; Pauplis, Rachel; Scully, Deirdre; Sosabowski, Jane K; Jouannot, Erwan
2017-12-01
Preclinical single-photon emission computed tomography (SPECT)/CT imaging studies are hampered by low throughput and are hence typically found within small-volume feasibility studies. Here, imaging and image analysis procedures are presented that allow profiling of a large volume of radiolabelled compounds within a reasonably short total study time. Particular emphasis was put on quality control (QC) and on fast and unbiased image analysis. Two to three His-tagged proteins at a time were radiolabelled by the 99mTc-tricarbonyl methodology and injected intravenously (20 nmol/kg; 100 MBq; n = 3) into patient-derived xenograft (PDX) mouse models. Whole-body SPECT/CT images of three mice simultaneously were acquired 1, 4, and 24 h post-injection, extended to 48 h and/or by 0-2 h dynamic SPECT for pre-selected compounds. Organ uptake was quantified by automated multi-atlas and manual segmentations. Data were plotted automatically, quality controlled, and stored on a collaborative image management platform. Ex vivo uptake data were collected semi-automatically, and analysis was performed as for the imaging data. More than 500 single-animal SPECT images were acquired for 25 proteins over 5 weeks, eventually generating >3500 ROIs and >1000 items of tissue data. SPECT/CT images clearly visualized uptake in tumour and other tissues even at 48 h post-injection. Intersubject uptake variability was typically 13% (coefficient of variation, COV). Imaging results correlated well with ex vivo data. The large data set of tumour, background and systemic uptake/clearance data from 75 mice for 25 compounds allows identification of compounds of interest. The number of animals required was reduced considerably by longitudinal imaging compared to dissection experiments. All experimental work and analyses were accomplished within 3 months, a timescale expected to be compatible with drug development programmes. QC along all workflow steps, blinding of the imaging contract research organization to compound properties, and automation provide confidence in the data set. Additional ex vivo data were useful as a control but could be omitted from future studies in the same centre. For even larger compound libraries, radiolabelling could be expedited and the number of imaging time points adapted to increase weekly throughput. Multi-atlas segmentation could be expanded via SPECT/MRI; however, this would require an MRI-compatible mouse hotel. Finally, analysis of nuclear images of radiopharmaceuticals in clinical trials may benefit from the automated analysis procedures developed.
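As a small illustration of the intersubject variability metric quoted above, a sketch of a per-organ coefficient-of-variation calculation for n = 3 mice; the uptake values are invented for illustration.

```python
import numpy as np

# Hypothetical %ID/g uptake values for one compound, n = 3 mice per organ.
uptake = {
    "tumour": [4.2, 4.8, 4.5],
    "liver":  [9.1, 8.0, 8.7],
    "kidney": [12.3, 10.9, 11.6],
}

for organ, vals in uptake.items():
    v = np.asarray(vals, dtype=float)
    cov = 100 * v.std(ddof=1) / v.mean()   # intersubject coefficient of variation
    print(f"{organ:>7}: mean = {v.mean():5.2f} %ID/g, COV = {cov:4.1f}%")
```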
Energy transfer networks: Quasicontinuum photoluminescence linked to high densities of defects
Laurence, Ted A.; Ly, Sonny; Bude, Jeff D.; Baxamusa, Salmaan H.; Lepró, Xavier; Ehrmann, Paul
2017-11-06
In a series of studies related to laser-induced damage of optical materials and deposition of plastics, we discovered a broadly emitting photoluminescence with fast lifetimes that we termed quasicontinuum photoluminescence (QC-PL). Here, we suggest that a high density of optically active defects leads to QC-PL, where interactions between defects affect the temporal and spectral characteristics of both excitation and emission. We develop a model that predicts the temporal characteristics of QC-PL, based on energy transfer interactions between high densities of defects. Our model does not explain all spectral broadening and redshifts found in QC-PL, since we do not model spectral changes in defects due to proximity to other defects. However, we do provide an example of a well-defined system that exhibits the QC-PL characteristics of a distribution in shortened lifetimes and broadened, redshifted energy levels: an organic chromophore (fluorescein) that has been dried rapidly on a fused silica surface. Recently, we showed that regions of fused silica exposed to up to 1 billion high-fluence laser shots at 351 nm at subdamage fluences exhibit significant transmission losses at the surface. Here, we find that these laser-exposed regions also exhibit QC-PL. Increases in the density of induced defects on these laser-exposed surfaces, as measured by the local transmission loss, lead to decreases in the observed lifetime and redshifts in the spectrum of the QC-PL, consistent with our explanation for QC-PL. In conclusion, we have found QC-PL in an increasing variety of situations and materials, and we believe it is a phenomenon commonly found on surfaces and nanostructured materials.
Greene, Karen E.
1997-01-01
A study of the ambient ground-water quality in the vicinity of Naval Submarine Base (SUBASE) Bangor was conducted to provide the U.S. Navy with background levels of selected constituents. The Navy needs this information to plan and manage cleanup activities on the base. During March and April 1995, 136 water-supply wells were sampled for common ions, trace elements, and organic compounds; not all wells were sampled for all constituents. Man-made organic compounds were detected in only two of fifty wells, and the sources of these organic compounds were attributed to activities in the immediate vicinities of these off-base wells. Drinking water standards for trichloroethylene, iron, and manganese were exceeded in one of these wells, which was probably contaminated by an old local (off-base) dump. Ground water from wells open to the following hydrogeologic units (in order from shallow to deep) was investigated: the Vashon till confining unit (Qvt, three wells); the Vashon aquifer (Qva, 54 wells); the Upper confining unit (QC1, 16 wells); the Permeable interbeds within QC1 (QC1pi, 34 wells); and the Sea-level aquifer (QA1, 29 wells). The 50th and 90th percentile ambient background levels of 35 inorganic constituents were determined for each hydrogeologic unit. At least ten measurements were required for a constituent in each hydrogeologic unit for determination of ambient background levels, and data for three wells determined to be affected by localized activities were excluded from these analyses. The only drinking water standards exceeded by ambient background levels were secondary maximum contaminant levels for iron (300 micrograms per liter), in QC1 and QC1pi, and manganese (50 micrograms per liter), in all of the units. The 90th percentile values for arsenic in QC1pi, QA1, and for the entire study area are above 5 micrograms per liter, the Model Toxics Control Act Method A value for protecting drinking water, but well below the maximum contaminant level of 50 micrograms per liter for arsenic. The manganese standard was exceeded in 38 wells and the standard for iron was exceeded in 12 wells. Most of these wells were in QC1 or QC1pi and had dissolved oxygen concentrations of less than 1 milligram per liter and dissolved organic carbon concentrations greater than 1 milligram per liter. The dissolved oxygen concentration is generally lower in the deeper units, while pH increases; the recommended pH range of 6.5-8.5 standard units was exceeded in 9 wells. The common-ion chemistry was similar for all of the units.
Novelo-Casanova, D. A.; Lee, W.H.K.
1991-01-01
Using simulated coda waves, the resolution of the single-scattering model to extract coda Q (Qc) and its power-law frequency dependence was tested. The back-scattering model of Aki and Chouet (1975) and the single isotropic-scattering model of Sato (1977) were examined. The results indicate that: (1) the input Qc models are reasonably well approximated by the two methods; (2) almost equal Qc values are recovered when the techniques sample the same coda windows; (3) low-Qc models are well estimated in the frequency domain from the early and late parts of the coda; and (4) models with high Qc values are more accurately extracted from late coda measurements. © 1991 Birkhäuser Verlag.
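For readers unfamiliar with the technique, a minimal sketch of how Qc is commonly extracted under the Aki and Chouet (1975) single back-scattering model, in which the band-passed coda envelope decays as A(t) ∝ t⁻¹ exp(−πft/Qc); the envelope below is synthetic, not data from the study.

```python
import numpy as np

f = 6.0          # centre frequency of the band-passed coda (Hz)
Qc_true = 300.0  # Qc used to synthesize the envelope

# Synthetic coda envelope, A(t) = S * t^-1 * exp(-pi f t / Qc), with noise.
t = np.linspace(20.0, 60.0, 200)   # lapse times (s)
rng = np.random.default_rng(2)
A = 5e3 * t**-1 * np.exp(-np.pi * f * t / Qc_true) \
    * np.exp(0.05 * rng.standard_normal(t.size))

# Linearize: ln(A * t) = const - (pi f / Qc) * t, then fit the slope.
slope, _ = np.polyfit(t, np.log(A * t), 1)
Qc_est = -np.pi * f / slope
print(f"estimated Qc = {Qc_est:.0f} (true {Qc_true:.0f})")
```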
Duo, Jia; Dong, Huijin; DeSilva, Binodh; Zhang, Yan J
2013-07-01
Sample dilution and reagent pipetting are time-consuming steps in ligand-binding assays (LBAs). Traditional automation-assisted LBAs use assay-specific scripts that require labor-intensive script writing and user training. Five major script modules were developed in the Tecan Freedom EVO liquid handling software to facilitate automated sample preparation and the LBA procedure: sample dilution, sample minimum required dilution, standard/QC minimum required dilution, standard/QC/sample addition, and reagent addition. The modular design of the automation scripts allows users to assemble an automated assay with minimal script modification. The application of the template was demonstrated in three LBAs supporting discovery biotherapeutic programs. The results demonstrated that the modular scripts provide flexibility in adapting to various LBA formats and significant time savings in script writing and scientist training. Data generated by the automated process were comparable to those from the manual process, while bioanalytical productivity was significantly improved using the modular robotic scripts.
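The modular idea can be sketched as follows; the module and parameter names are illustrative placeholders, not the Tecan Freedom EVO scripting API.

```python
# Each function mirrors one of the five script modules described above; the
# bodies are placeholders standing in for the robot's pipetting commands.
def sample_dilution(plate, factor): ...
def sample_mrd(plate, mrd): ...            # sample minimum required dilution
def std_qc_mrd(plate, mrd): ...            # standard/QC minimum required dilution
def std_qc_sample_addition(src, dest): ...
def reagent_addition(dest, reagent, vol_ul): ...

def assemble_assay(plate, mrd, detection_reagent):
    """Assemble an automated LBA from reusable modules with minimal rewriting."""
    sample_mrd(plate, mrd)
    std_qc_mrd(plate, mrd)
    std_qc_sample_addition(src=plate, dest="assay_plate")
    reagent_addition("assay_plate", detection_reagent, vol_ul=50)

assemble_assay("plate_1", mrd=10, detection_reagent="ruthenium-conjugate")
```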
A unifying strain criterion for fracture of fibrous composite laminates
NASA Technical Reports Server (NTRS)
Poe, C. C., Jr.
1983-01-01
Fibrous composite materials, such as graphite/epoxy, are light, stiff, and strong. They have great potential for reducing weight in aircraft structures. However, to realize this potential, designers will have to know the fracture toughness of composite laminates in order to design damage-tolerant structures. In connection with the development of an economical testing procedure, there is a great need for a single fracture toughness parameter that can be used to predict the stress-intensity factor K(Q) for all laminates of interest to the designer. Poe and Sova (1980) derived a general fracture toughness parameter (Qc), which is a material constant. It defines the critical level of strains in the principal load-carrying plies. The present investigation is concerned with the calculation of values for the ratio of Qc to the ultimate tensile strain of the fibers. The data obtained indicate that this ratio is reasonably constant for layups that fail largely by self-similar crack extension.
Code of Federal Regulations, 2010 CFR
2010-04-01
...' Benefits EMPLOYMENT AND TRAINING ADMINISTRATION, DEPARTMENT OF LABOR QUALITY CONTROL IN THE FEDERAL-STATE... QC unit. The organizational location of this unit shall be positioned to maximize its objectivity, to... organizational conflict of interest. ...
78 FR 48766 - Petition for Waiver of Compliance
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-09
...'s Network Management Center in Montreal, QC, Canada. CP operates approximately six to eight trains a day over this segment. The trackage is operated under a Centralized Traffic Control system and...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-02-27
... Quality Control process for the Supplemental Nutrition Assistance Program and the FNS-248 will be removed... other forms of information technology. Comments may be sent to: Francis B. Heil, Chief, Quality Control... directed to Francis B. Heil, (703) 305-2442. SUPPLEMENTARY INFORMATION: Title: Negative Quality Control...
Hoang, Van-Hai; Tran, Phuong-Thao; Cui, Minghua; Ngo, Van T H; Ann, Jihyae; Park, Jongmi; Lee, Jiyoun; Choi, Kwanghyun; Cho, Hanyang; Kim, Hee; Ha, Hee-Jin; Hong, Hyun-Seok; Choi, Sun; Kim, Young-Ho; Lee, Jeewoo
2017-03-23
Glutaminyl cyclase (QC) has been implicated in the formation of toxic amyloid plaques by generating the N-terminal pyroglutamate of β-amyloid peptides (pGlu-Aβ) and thus may participate in the pathogenesis of Alzheimer's disease (AD). We designed a library of glutaminyl cyclase (QC) inhibitors based on the proposed binding mode of the preferred substrate, Aβ(3E-42). An in vitro structure-activity relationship study identified several excellent QC inhibitors demonstrating 5- to 40-fold increases in potency compared to a known QC inhibitor. When tested in mouse models of AD, compound 212 significantly reduced the brain concentrations of pyroform Aβ and total Aβ and restored cognitive functions. This potent Aβ-lowering effect was achieved by incorporating an additional binding region into our previously established pharmacophoric model, resulting in strong interactions with the carboxylate group of Glu327 in the QC binding site. Our study offers useful insights into designing novel QC inhibitors as a potential treatment option for AD.
Lin, Jou-Wei; Yang, Chen-Wei
2010-01-01
The objective of this study was to develop and validate an automated acquisition system to assess quality-of-care (QC) measures for cardiovascular diseases. This system, combining search and retrieval algorithms, was designed to extract QC measures from electronic discharge notes and to estimate the rates of attainment of the current standards of care. It was developed on patients with ST-segment elevation myocardial infarction and tested on patients with unstable angina/non-ST-segment elevation myocardial infarction, two diseases sharing almost the same QC measures. The system reached reasonable agreement (κ values) with medical experts, from 0.65 (early reperfusion rate) to 0.97 (β-blockers and lipid-lowering agents before discharge) for different QC measures in the test set, and was then applied to evaluate QC in patients who underwent coronary artery bypass grafting surgery. The results validated a new tool for reliably extracting QC measures for cardiovascular diseases. PMID:20442141
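A minimal sketch of the kind of rule-based extraction such a system performs; the note text and patterns are hypothetical, and real systems add richer lexicons and negation handling.

```python
import re

note = "Discharged on aspirin, atorvastatin, and metoprolol. LVEF 45%."

# One regex per QC measure; patterns here are illustrative only.
measures = {
    "beta_blocker":   r"\b(metoprolol|carvedilol|bisoprolol|atenolol)\b",
    "lipid_lowering": r"\b(atorvastatin|simvastatin|rosuvastatin|statin)\b",
    "antiplatelet":   r"\b(aspirin|clopidogrel)\b",
}

attained = {m: bool(re.search(p, note, flags=re.I)) for m, p in measures.items()}
print(attained)  # {'beta_blocker': True, 'lipid_lowering': True, 'antiplatelet': True}
```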
Wu, Vincent W.; Dana, Craig M.; Iavarone, Anthony T.; ...
2017-01-17
The breakdown of plant biomass to simple sugars is essential for the production of second-generation biofuels and high-value bioproducts. Currently, enzymes produced from filamentous fungi are used for deconstructing plant cell wall polysaccharides into fermentable sugars for biorefinery applications. A post-translational N-terminal pyroglutamate modification observed in some of these enzymes occurs when N-terminal glutamine or glutamate is cyclized to form a five-membered ring. This modification has been shown to confer resistance to thermal denaturation for CBH-1 and EG-1 cellulases. In mammalian cells, the formation of pyroglutamate is catalyzed by glutaminyl cyclases. Using the model filamentous fungus Neurospora crassa, we identified two genes (qc-1 and qc-2) that encode proteins homologous to mammalian glutaminyl cyclases. We show that qc-1 and qc-2 are essential for catalyzing the formation of an N-terminal pyroglutamate on CBH-1 and GH5-1. CBH-1 and GH5-1 produced in a Δqc-1 Δqc-2 mutant, and thus lacking the N-terminal pyroglutamate modification, showed greater sensitivity to thermal denaturation and, for GH5-1, susceptibility to proteolytic cleavage. QC-1 and QC-2 are endoplasmic reticulum (ER)-localized proteins. The pyroglutamate modification is predicted to occur in a number of additional fungal proteins that have diverse functions. The identification of glutaminyl cyclases in fungi may have implications for the production of lignocellulolytic enzymes, heterologous expression, and biotechnological applications revolving around protein stability.
MaNIDA: an operational infrastructure for shipborne data
NASA Astrophysics Data System (ADS)
Macario, Ana; Scientific MaNIDA Team
2013-04-01
The Marine Network for Integrated Data Access (MaNIDA) aims to build a sustainable e-Infrastructure to support discovery and re-use of data archived in a distributed network of data providers in Germany (see related abstracts in session ESSI1.2 and session ESSI2.2). Because one of the primary focuses of MaNIDA is the underway data acquired on board German academic research vessels, we address various issues related to cruise-level metadata, shiptrack navigation, sampling events conducted during the cruise (event logs), standardization of device-related (type, name, parameters) and place-related (gazetteer) vocabularies, QA/QC procedures (near-real-time and post-cruise validation, corrections, quality flags), as well as ingestion and management of contextual information (e.g., various types of cruise-related reports and project-related information). One of MaNIDA's long-term goals is to offer an integrative "one-stop-shop" framework for the management of, and access to, ship-related information based on international standards and interoperability. This access framework will be freely available and is intended for scientists, funding agencies and the public. The master "catalog" we are building currently contains information from 13 German academic research vessels and their respective cruises (to date ~1900 cruises, with an expected growth rate of ~150 cruises annually). Moreover, MaNIDA's operational infrastructure will additionally provide a direct pipeline to the SeaDataNet Cruise Summary Report Inventory, among others. In this presentation, we will focus on the extensions we are currently implementing to support automated acquisition and standardized transfer of various types of data from German research vessels to hosts on land. Our concept for nationwide common QA/QC procedures for various types of underway data (including a versioning concept) and common workflows will also be presented. The "linking" of cruise-related information with quality-controlled data and data products (e.g., digital terrain models), publications, cruise-related reports, people and other contextual information will additionally be shown in the framework of a prototype for R.V. Polarstern.
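A minimal sketch of the kind of automated range check such QA/QC procedures apply to underway data; the variable, thresholds, and flag codes are illustrative assumptions.

```python
import numpy as np

# Hypothetical underway thermosalinograph record: sea-surface temperature (deg C).
sst = np.array([12.1, 12.2, 12.1, 35.0, 12.3, np.nan, 12.2])

GOOD, SUSPECT, BAD, MISSING = 1, 3, 4, 9   # illustrative quality-flag codes

def qc_range(values, lo, hi):
    """Flag missing samples, then samples outside the physically plausible range."""
    flags = np.full(values.shape, GOOD, dtype=int)
    flags[np.isnan(values)] = MISSING
    with np.errstate(invalid="ignore"):
        flags[(values < lo) | (values > hi)] = BAD
    return flags

print(qc_range(sst, lo=-2.0, hi=32.0))  # -> [1 1 1 4 1 9 1]
```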
40 CFR 98.44 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Monitoring and QA/QC requirements. 98.44 Section 98.44 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Electricity Generation § 98.44 Monitoring and QA/QC...
40 CFR 98.44 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 21 2011-07-01 2011-07-01 false Monitoring and QA/QC requirements. 98.44 Section 98.44 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Electricity Generation § 98.44 Monitoring and QA/QC...
40 CFR 98.44 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 40 Protection of Environment 22 2013-07-01 2013-07-01 false Monitoring and QA/QC requirements. 98.44 Section 98.44 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Electricity Generation § 98.44 Monitoring and QA/QC...
40 CFR 98.44 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 40 Protection of Environment 21 2014-07-01 2014-07-01 false Monitoring and QA/QC requirements. 98.44 Section 98.44 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Electricity Generation § 98.44 Monitoring and QA/QC...
40 CFR 98.44 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 40 Protection of Environment 22 2012-07-01 2012-07-01 false Monitoring and QA/QC requirements. 98.44 Section 98.44 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Electricity Generation § 98.44 Monitoring and QA/QC...
40 CFR 98.64 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Monitoring and QA/QC requirements. 98.64 Section 98.64 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Aluminum Production § 98.64 Monitoring and QA/QC requirements...
40 CFR 98.64 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 21 2011-07-01 2011-07-01 false Monitoring and QA/QC requirements. 98.64 Section 98.64 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Aluminum Production § 98.64 Monitoring and QA/QC requirements...
40 CFR 98.84 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 21 2011-07-01 2011-07-01 false Monitoring and QA/QC requirements. 98.84 Section 98.84 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Cement Production § 98.84 Monitoring and QA/QC requirements...
References on EPA Quality Assurance Project Plans
Provides requirements for the conduct of quality management practices, including quality assurance (QA) and quality control (QC) activities, for all environmental data collection and environmental technology programs performed by or for this Agency.
Xu, Weichen; Jimenez, Rod Brian; Mowery, Rachel; Luo, Haibin; Cao, Mingyan; Agarwal, Nitin; Ramos, Irina; Wang, Xiangyang; Wang, Jihong
2017-10-01
During manufacturing and storage, therapeutic proteins are subject to various post-translational modifications (PTMs), such as isomerization, deamidation, oxidation, disulfide bond modifications and glycosylation. Certain PTMs may affect bioactivity, stability, or pharmacokinetic and pharmacodynamic profiles and are therefore classified as potential critical quality attributes (pCQAs). Identifying, monitoring and controlling these PTMs are usually key elements of the Quality by Design (QbD) approach. Traditionally, multiple analytical methods are utilized for these purposes, which is time-consuming and costly. In recent years, multi-attribute monitoring methods have been developed in the biopharmaceutical industry. However, these methods combine high-end mass spectrometry with complicated data analysis software, which can pose difficulties when implemented in a quality control (QC) environment. Here we report a multi-attribute method (MAM) using a Quadrupole Dalton (QDa) mass detector to selectively monitor and quantitate PTMs in a therapeutic monoclonal antibody. The result output from the QDa-based MAM is straightforward and automatic. Evaluation results indicate this method provides results comparable to the traditional assays. To ensure future application in the QC environment, the method was qualified according to the International Conference on Harmonization (ICH) guideline and applied in the characterization of drug substance and stability samples. The QDa-based MAM is shown to be an extremely useful tool for product and process characterization studies that facilitates understanding of process impact on multiple quality attributes, while being QC-friendly and cost-effective.
SU-E-T-103: Development and Implementation of Web Based Quality Control Software
DOE Office of Scientific and Technical Information (OSTI.GOV)
Studinski, R; Taylor, R; Angers, C
Purpose: Historically, many radiation medicine programs have maintained their Quality Control (QC) test results in paper records or Microsoft Excel worksheets. Both approaches present significant logistical challenges and are not predisposed to data review and approval. Our group's aim has been to develop and implement web-based software designed not just to record and store QC data in a centralized database, but to provide scheduling and data review tools to help manage a radiation therapy clinic's equipment QC program. Methods: The software was written in the Python programming language using the Django web framework. In order to promote collaboration and validation from other centres, the code was made open source and is freely available to the public via an online source code repository. The code was written to provide a common user interface for data entry, formalize the review and approval process, and offer automated data trending and process control analysis of test results. Results: As of February 2014, our installation of QATrack+ has 180 tests defined in its database and has collected ∼22 000 test results, all of which have been reviewed and approved by a physicist via QATrack+'s review tools. These results include records for quality control of Elekta accelerators, CT simulators, our brachytherapy programme, TomoTherapy and CyberKnife units. Currently, at least 5 other centres are known to be running QATrack+ clinically, forming the start of an international user community. Conclusion: QATrack+ has proven to be an effective tool for collecting radiation therapy QC data, allowing for rapid review and trending of data for a wide variety of treatment units. As free and open source software, all source code, documentation and a bug tracker are available to the public at https://bitbucket.org/tohccmedphys/qatrackplus/.
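A minimal sketch of the kind of Django schema such a tool might use; the model and field names are illustrative assumptions, not QATrack+'s actual database schema.

```python
from django.db import models

class TestDefinition(models.Model):
    """A QC test defined for a treatment unit (e.g., a daily output check)."""
    name = models.CharField(max_length=200)
    unit = models.CharField(max_length=100)
    tolerance = models.FloatField(help_text="allowed deviation from reference")

class TestResult(models.Model):
    """One performed instance of a test, entered via the web UI."""
    test = models.ForeignKey(TestDefinition, on_delete=models.CASCADE)
    value = models.FloatField()
    performed = models.DateTimeField(auto_now_add=True)
    reviewed = models.BooleanField(default=False)  # physicist approval step

    def out_of_tolerance(self, reference: float) -> bool:
        return abs(self.value - reference) > self.test.tolerance
```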
Maurer, Matthew J.; Spear, Eric D.; Yu, Allen T.; Lee, Evan J.; Shahzad, Saba; Michaelis, Susan
2016-01-01
Cellular protein quality control (PQC) systems selectively target misfolded or otherwise aberrant proteins for degradation by the ubiquitin-proteasome system (UPS). How cells discern abnormal from normal proteins remains incompletely understood, but involves in part the recognition between ubiquitin E3 ligases and degradation signals (degrons) that are exposed in misfolded proteins. PQC is compartmentalized in the cell, and a great deal has been learned in recent years about ER-associated degradation (ERAD) and nuclear quality control. In contrast, a comprehensive view of cytosolic quality control (CytoQC) has yet to emerge, and will benefit from the development of a well-defined set of model substrates. In this study, we generated an isogenic "degron library" in Saccharomyces cerevisiae consisting of short sequences appended to the C-terminus of a reporter protein, Ura3. About half of these degron-containing proteins are substrates of the integral membrane E3 ligase Doa10, which also plays a pivotal role in ERAD and some nuclear protein degradation. Notably, some of our degron fusion proteins exhibit dependence on the E3 ligase Ltn1/Rkr1 for degradation, apparently by a mechanism distinct from its known role in ribosomal quality control of translationally paused proteins. Ubr1 and San1, E3 ligases involved in the recognition of some misfolded CytoQC substrates, are largely dispensable for the degradation of our degron-containing proteins. Interestingly, the Hsp70/Hsp40 chaperone/cochaperones Ssa1,2 and Ydj1 are required for the degradation of all constructs tested. Taken together, the comprehensive degron library presented here provides an important resource of isogenic substrates for testing candidate PQC components and identifying new ones. PMID:27172186
DOE Office of Scientific and Technical Information (OSTI.GOV)
Method 1664 was developed by the United States Environmental Protection Agency Office of Science and Technology to replace previously used gravimetric procedures that employed Freon-113, a Class I CFC, as the extraction solvent for the determination of oil and grease and petroleum hydrocarbons. Method 1664 is a performance-based method applicable to aqueous matrices that requires the use of n-hexane as the extraction solvent and gravimetry as the determinative technique. In addition, QC procedures designed to monitor precision and accuracy have been incorporated into Method 1664.
40 CFR 98.424 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 40 Protection of Environment 22 2012-07-01 2012-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Suppliers of Carbon Dioxide § 98.424 Monitoring and QA/QC... determine quantity in accordance with this paragraph. (i) Reporters that supply CO2 in containers using...
40 CFR 98.424 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 40 Protection of Environment 22 2013-07-01 2013-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Suppliers of Carbon Dioxide § 98.424 Monitoring and QA/QC... determine quantity in accordance with this paragraph. (i) Reporters that supply CO2 in containers using...
40 CFR 98.334 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 40 Protection of Environment 22 2012-07-01 2012-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Zinc Production § 98.334 Monitoring and QA/QC requirements. If..., belt weigh feeders, weighed purchased quantities in shipments or containers, combination of bulk...
40 CFR 98.94 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 21 2011-07-01 2011-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Electronics Manufacturing § 98.94 Monitoring and QA/QC requirements. (a) For calendar year 2011 monitoring, you may follow the provisions in paragraphs (a)(1) through...
40 CFR 98.424 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 21 2011-07-01 2011-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Suppliers of Carbon Dioxide § 98.424 Monitoring and QA/QC... determine quantity in accordance with this paragraph. (i) Reporters that supply CO2 in containers using...
40 CFR 98.334 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 21 2011-07-01 2011-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Zinc Production § 98.334 Monitoring and QA/QC requirements. If..., belt weigh feeders, weighed purchased quantities in shipments or containers, combination of bulk...
40 CFR 98.334 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 40 Protection of Environment 21 2014-07-01 2014-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Zinc Production § 98.334 Monitoring and QA/QC requirements. If..., belt weigh feeders, weighed purchased quantities in shipments or containers, combination of bulk...
40 CFR 98.94 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 40 Protection of Environment 22 2013-07-01 2013-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Electronics Manufacturing § 98.94 Monitoring and QA/QC requirements. (a) For calendar year 2011 monitoring, you may follow the provisions in paragraphs (a)(1) through...
40 CFR 98.424 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 40 Protection of Environment 21 2014-07-01 2014-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Suppliers of Carbon Dioxide § 98.424 Monitoring and QA/QC... determine quantity in accordance with this paragraph. (i) Reporters that supply CO2 in containers using...
40 CFR 98.94 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 40 Protection of Environment 22 2012-07-01 2012-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Electronics Manufacturing § 98.94 Monitoring and QA/QC requirements. (a) For calendar year 2011 monitoring, you may follow the provisions in paragraphs (a)(1) through...
40 CFR 98.334 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 40 Protection of Environment 22 2013-07-01 2013-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Zinc Production § 98.334 Monitoring and QA/QC requirements. If..., belt weigh feeders, weighed purchased quantities in shipments or containers, combination of bulk...
Measurement of pulmonary capillary blood flow in infants by plethysmography.
Stocks, J; Costeloe, K; Winlove, C P; Godfrey, S
1977-01-01
An accurate method for measuring effective pulmonary capillary blood flow (Qc eff) in infants has been developed with an adaptation of the plethysmographic technique. Measurements were made on 19 preterm, 14 small-for-dates, and 7 full-term normal infants with a constant-volume whole-body plethysmograph in which the infant rebreathed nitrous oxide. There was a highly significant correlation between Qc eff and body weight, and this relationship was unaffected by premature delivery or intrauterine growth retardation. Mean Qc eff in preterm, small-for-dates, and full-term infants was 203, 208 and 197 ml·min⁻¹·kg⁻¹, respectively, with no significant differences between the groups. A significant negative correlation existed between Qc eff and haematocrit in the preterm infants. There was no relationship between weight-standardized Qc eff and postnatal age in any of the groups. With this technique, it was possible to readily recognise the presence of rapid recirculation (indicative of shunting) in several of the infants, suggesting that rebreathing methods for the assessment of Qc eff should not be applied indiscriminately during the neonatal period. By taking care to overcome the potential sources of technical error, it was possible to obtain highly reproducible results for Qc eff in infants over a wider age range than has previously been reported. PMID:838861
Quantum cascade transmitters for ultrasensitive chemical agent and explosives detection
NASA Astrophysics Data System (ADS)
Schultz, John F.; Taubman, Matthew S.; Harper, Warren W.; Williams, Richard M.; Myers, Tanya L.; Cannon, Bret D.; Sheen, David M.; Anheier, Norman C., Jr.; Allen, Paul J.; Sundaram, S. K.; Johnson, Bradley R.; Aker, Pamela M.; Wu, Ming C.; Lau, Erwin K.
2003-07-01
The small size, high power, promise of access to any wavelength between 3.5 and 16 microns, substantial tuning range about a chosen center wavelength, and general robustness of quantum cascade (QC) lasers provide opportunities for new approaches to ultra-sensitive chemical detection and other applications in the mid-wave infrared. PNNL is developing novel remote and sampling chemical sensing systems based on QC lasers, using QC lasers loaned by Lucent Technologies. In recent months laboratory cavity-enhanced sensing experiments have achieved absorption sensitivities of 8.5 × 10⁻¹¹ cm⁻¹ Hz⁻¹/², and the PNNL team has begun monostatic and bi-static frequency-modulated, differential absorption lidar (FM DIAL) experiments at ranges of up to 2.5 kilometers. In related work, PNNL and UCLA are developing miniature QC laser transmitters with the multiplexed tunable wavelengths, frequency and amplitude stability, modulation characteristics, and power levels needed for chemical sensing and other applications. Current miniaturization concepts envision coupling QC oscillators, QC amplifiers, frequency references, and detectors with miniature waveguides and waveguide-based modulators, isolators, and other devices formed from chalcogenide or other types of glass. Significant progress has been made on QC laser stabilization and amplification, and on the development and characterization of high-purity chalcogenide glasses, waveguide writing techniques, and waveguide metrology.
Goldwaser, Elodie; de Courcy, Benoit; Demange, Luc; Garbay, Christiane; Raynaud, Françoise; Hadj-Slimane, Reda; Piquemal, Jean-Philip; Gresh, Nohad
2014-11-01
We investigate the conformational properties of a potent inhibitor of neuropilin-1, a protein involved in cancer processes and macular degeneration. This inhibitor consists of four aromatic/conjugated fragments: a benzimidazole, a methylbenzene, a carboxythiourea, and a benzene-linker dioxane, and these fragments are all linked together by conjugated bonds. The calculations use the SIBFA polarizable molecular mechanics procedure. Prior to docking simulations, it is essential to ensure that variations in the ligand conformational energy upon rotations around its six main-chain torsional bonds are correctly represented (as compared to high-level ab initio quantum chemistry, QC). This is done in two successive calibration stages and one validation stage. In the latter, the minima identified following independent stepwise variations of each of the six main-chain torsion angles are used as starting points for energy minimization of all the torsion angles simultaneously. Single-point QC calculations of the minimized structures are then done to compare their relative energies ΔEconf to the SIBFA ones. We compare three different methods of deriving the multipoles and polarizabilities of the central, most critical moiety of the inhibitor: carboxythiourea (CTU). The representation that gives the best agreement with QC is the one that includes the effects of the mutual polarization energy Epol between the amide and thioamide moieties. This again highlights the critical role of this contribution. The implications and perspectives of these findings are discussed.
Quality Assurance and Quality Control Practices for Rehabilitation of Sewer and Water Mains
As part of the US Environmental Protection Agency (EPA)’s Aging Water Infrastructure Research Program, several areas of research are being pursued, including a review of quality assurance and quality control (QA/QC) practices and acceptance testing during the installation of reha...
Quality Assurance and Quality Control Practices For Rehabilitation of Sewer and Water Mains
As part of the US Environmental Protection Agency (EPA)’s Aging Water Infrastructure Research Program, several areas of research are being pursued including a review of quality assurance and quality control (QA/QC) practices and acceptance testing during the installation of rehab...
USDA-ARS?s Scientific Manuscript database
A multi-laboratory broth microdilution method trial was performed to standardize the specialized test conditions required for fish pathogens Flavobacterium columnare and F. pyschrophilum. Nine laboratories tested the quality control (QC) strains Escherichia coli ATCC 25922 and Aeromonas salmonicid...
7 CFR 283.2 - Scope and applicability.
Code of Federal Regulations, 2010 CFR
2010-01-01
... agencies of Food and Nutrition Service quality control (QC) claims for Fiscal Year ("FY") 1986 and... Agriculture Regulations of the Department of Agriculture (Continued) FOOD AND NUTRITION SERVICE, DEPARTMENT OF AGRICULTURE FOOD STAMP AND FOOD DISTRIBUTION PROGRAM APPEALS OF QUALITY CONTROL ("QC") CLAIMS General § 283.2...
NASA Astrophysics Data System (ADS)
Le, Loan T.
Over the span of more than 20 years of development, the Quantum Cascade (QC) laser has positioned itself as the most viable mid-infrared (mid-IR) light source. Today's QC lasers emit watts of continuous-wave power at room temperature. Despite significant progress, the mid-IR region remains vastly under-utilized. State-of-the-art QC lasers are found in high-power defense applications and in the detection of trace gases with narrow absorption lines. A large number of applications, however, do not require so much power, but rather a broadly tunable laser source to detect molecules with broad absorption features. As such, a QC laser that is broadly tunable over the entire biochemical fingerprinting region remains the missing link to markets such as non-invasive biomedical diagnostics, food safety, and stand-off detection in turbid media. In this thesis, we detail how we utilized the inherent flexibility of the QC design space to conceive a new type of laser with the potential to bridge that missing link of the QC laser to large commercial markets. Our design concept, the Super Cascade (SC) laser, works contrary to conventional laser design principles by supporting multiple independent optical transitions, each contributing to broadening the gain spectrum. We have demonstrated a room-temperature laser gain medium with electroluminescence spanning 3.3-12.5 μm and laser emission from 6.2-12.5 μm, the record spectral width for any solid-state laser gain medium. This gain bandwidth covers the entire biochemical fingerprinting region. The achievement of such a spectrally broad gain medium presents engineering challenges of how to optimally utilize the bandwidth. As of this work, a monolithically integrated array of Distributed Feedback QC (DFB-QC) lasers is one of the most promising ways to fully utilize the SC gain bandwidth. Therefore, in this thesis, we explore ways of improving the yield and ease of fabrication of DFB-QC lasers, including a re-examination of the role of current spreading in QC geometry.
The Surface Ocean CO2 Atlas: Stewarding Underway Carbon Data from Collection to Archival
NASA Astrophysics Data System (ADS)
O'Brien, K.; Smith, K. M.; Pfeil, B.; Landa, C.; Bakker, D. C. E.; Olsen, A.; Jones, S.; Shrestha, B.; Kozyr, A.; Manke, A. B.; Schweitzer, R.; Burger, E. F.
2016-02-01
The Surface Ocean CO2 Atlas (SOCAT, www.socat.info) is a quality-controlled, global surface ocean carbon dioxide (CO2) data set gathered on research vessels, ships of opportunity (SOOP) and buoys. To the degree feasible, SOCAT is comprehensive; it draws together and applies uniform QC procedures to all such observations made across the international community. The first version of SOCAT (version 1.5) was publicly released in September 2011 (Bakker et al., 2011) with 6.3 million observations. This was followed by the release of SOCAT version 2, expanded to over 10 million observations, in June 2013 (Bakker et al., 2013). Most recently, in September 2015, SOCAT version 3 was released, containing over 14 million observations spanning almost 60 years. The process of assembling, QC'ing and publishing V1.5 and V2 of SOCAT required an unsustainable level of manual effort. To ease the burden on data managers and data providers, the SOCAT community agreed to embark on an automated data ingestion process that would create a streamlined workflow to improve data stewardship from ingestion to quality control and from publishing to archival. To that end, for version 3 and beyond, the SOCAT automation team created a framework based upon standards and conventions that at the same time allows scientists to work in the data formats they feel most comfortable with (i.e., CSV files). This automated workflow provides several advantages: 1) data ingestion into uniform and standards-based file formats; 2) ease of data integration into a standard quality control system; 3) data ingestion and quality control can be performed in parallel; and 4) a uniform method of archiving carbon data and generating digital object identifiers (DOIs). In this presentation, we will discuss and demonstrate the SOCAT data ingestion dashboard and the quality control system. We will also discuss the standards, conventions, and tools that were leveraged to create a workflow that allows scientists to work in their own formats, yet provides a framework for creating high-quality data products on an annual basis, while meeting or exceeding data requirements for access, documentation and archival.
40 CFR 98.474 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 40 Protection of Environment 22 2013-07-01 2013-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Injection of Carbon Dioxide § 98.474 Monitoring and QA/QC.... (2) You must determine the quarterly mass or volume of contents in all containers if you receive CO2...
40 CFR 98.474 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 40 Protection of Environment 21 2014-07-01 2014-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Injection of Carbon Dioxide § 98.474 Monitoring and QA/QC.... (2) You must determine the quarterly mass or volume of contents in all containers if you receive CO2...
40 CFR 98.474 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 40 Protection of Environment 22 2012-07-01 2012-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Injection of Carbon Dioxide § 98.474 Monitoring and QA/QC.... (2) You must determine the quarterly mass or volume of contents in all containers if you receive CO2...
40 CFR 98.424 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Suppliers of Carbon Dioxide § 98.424 Monitoring and QA/QC... containers shall measure the mass in each CO2 container using weigh bills, scales, or load cells and sum the...
Study of quantum correlation swapping with relative entropy methods
NASA Astrophysics Data System (ADS)
Xie, Chuanmei; Liu, Yimin; Chen, Jianlan; Zhang, Zhanjun
2016-02-01
To generate long-distance shared quantum correlations (QCs) for information processing in future quantum networks, we recently proposed the concept of the QC repeater and its kernel technique, named QC swapping. In addition, we extensively studied QC swapping between two simple QC resources (i.e., a pair of Werner states) with four different methods of quantifying QCs (Xie et al. in Quantum Inf Process 14:653-679, 2015). In this paper, we continue to treat the same issue by employing three other methods associated with relative entropies, i.e., the MPSVW method (Modi et al. in Phys Rev Lett 104:080501, 2010), the Zhang method (arXiv:1011.4333 [quant-ph]) and the RS method (Rulli and Sarandy in Phys Rev A 84:042109, 2011). We first derive analytic expressions for all QCs that occur during the swapping process and then reveal their monotonicity and threshold properties. Importantly, we find that a long-distance shared QC can indeed be generated from two short-distance ones via QC swapping. In addition, we briefly compare our present results with our previous ones.
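For reference, the relative-entropy-based quantifiers employed by these three methods share the generic form below; this is a hedged summary using the standard definition of quantum relative entropy, not the papers' exact notation.

```latex
% Quantum relative entropy between states \rho and \sigma:
S(\rho \,\|\, \sigma) = \mathrm{Tr}\,\rho \log \rho - \mathrm{Tr}\,\rho \log \sigma .
% Relative-entropy-based measures quantify the QC of \rho as its minimal
% distance to a chosen set \mathcal{C} of classically correlated states:
Q(\rho) = \min_{\chi \in \mathcal{C}} S(\rho \,\|\, \chi).
```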
Quantum Testbeds Stakeholder Workshop (QTSW) Report meeting purpose and agenda.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hebner, Gregory A.
Quantum computing (QC) is a promising early-stage technology with the potential to provide scientific computing capabilities far beyond what is possible with even an exascale computer for specific problems of relevance to the Office of Science. These include (but are not limited to) materials modeling, molecular dynamics, and quantum chromodynamics. However, commercial QC systems are not yet available, and the technical maturity of current QC hardware, software, algorithms, and systems integration is woefully incomplete. Thus, there is a significant opportunity for DOE to define the technology building blocks and solve the system integration issues to enable a revolutionary tool. Once realized, QC will have world-changing impact on economic competitiveness, the scientific enterprise, and citizen well-being. Prior to this workshop, the DOE Office of Advanced Scientific Computing Research (ASCR) hosted a workshop in 2015 to explore QC scientific applications. The goal of that workshop was to assess the viability of QC technologies to meet the computational requirements in support of DOE's science and energy mission and to identify the potential impact of these technologies.
Lapse time and frequency-dependent coda wave attenuation for Delhi and its surrounding regions
NASA Astrophysics Data System (ADS)
Das, Rabin; Mukhopadhyay, Sagarika; Singh, Ravi Kant; Baidya, Pushap R.
2018-07-01
Attenuation of seismic wave energy in Delhi and its surrounding regions has been estimated using the coda of local earthquakes. Estimated quality factor (Qc) values are strongly dependent on frequency and lapse time. The frequency dependence of Qc has been estimated from the relationship Qc(f) = Q0·f^n for different lapse time window lengths. Q0 and n vary from 73 to 453 and from 0.97 to 0.63, respectively, for lapse time window lengths of 15 s to 90 s. The average estimated frequency-dependent relation for the entire region, for a window length of 30 s, is Qc(f) = (135 ± 8)·f^(0.96±0.02), where the average Qc value varies from 200 at 1.5 Hz to 1962 at 16 Hz. These values show that the region is seismically active and highly heterogeneous. The entire study region is divided into two sub-regions according to the geology of the area to investigate whether there is a spatial variation in attenuation characteristics in this region. It is observed that at smaller lapse times both regions have similar Qc values. However, at larger lapse times the rate of increase of Qc with frequency is larger for Region 2 than for Region 1. This is understandable, as Region 2 is closer to the tectonically more active Himalayan ranges and is seismically more active than Region 1. The difference in the variation of Qc with frequency for the two regions is such that at larger lapse times and higher frequencies Region 2 shows higher Qc than Region 1; for lower frequencies the opposite is true. This indicates a systematic variation in attenuation characteristics from the south (Region 1) to the north (Region 2) in the deeper part of the study area. This variation can be explained in terms of an increase in heat flow and a decrease in the age of the rocks from south to north.
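As a quick check on the reported power law, Qc(f) = Q0·f^n can be evaluated directly with the average 30 s window parameters; Q0 = 135 and n = 0.96 give roughly 199 at 1.5 Hz and about 1930 at 16 Hz, matching the quoted 200 and 1962 to within the rounding of Q0 and n. A minimal script:

    # Frequency-dependent coda quality factor Qc(f) = Q0 * f**n, using the
    # average parameters reported for the 30 s lapse-time window.
    Q0, n = 135.0, 0.96

    for f_hz in (1.5, 4.0, 8.0, 16.0):
        print(f"Qc({f_hz:4.1f} Hz) = {Q0 * f_hz ** n:7.1f}")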
Summation rules for a fully nonlocal energy-based quasicontinuum method
NASA Astrophysics Data System (ADS)
Amelang, J. S.; Venturini, G. N.; Kochmann, D. M.
2015-09-01
The quasicontinuum (QC) method coarse-grains crystalline atomic ensembles in order to bridge the scales from individual atoms to the micro- and mesoscales. Summation or quadrature rules, a crucial cornerstone of all QC techniques, efficiently approximate the thermodynamic quantities of interest. Here, we investigate summation rules for a fully nonlocal, energy-based QC method to approximate the total Hamiltonian of a crystalline atomic ensemble by a weighted sum over a small subset of all atoms in the crystal lattice. Our formulation does not conceptually differentiate between atomistic and coarse-grained regions and thus allows for seamless bridging without domain-coupling interfaces. We review traditional summation rules and discuss their strengths and weaknesses with a focus on energy approximation errors and spurious force artifacts. Moreover, we introduce summation rules which produce no residual or spurious force artifacts in centrosymmetric crystals in the large-element limit under arbitrary affine deformations in two dimensions (and marginal force artifacts in three dimensions), while allowing us to seamlessly bridge to full atomistics. Through a comprehensive suite of examples with spatially non-uniform QC discretizations in two and three dimensions, we compare the accuracy of the new scheme to various previous ones. Our results confirm that the new summation rules exhibit significantly smaller force artifacts and energy approximation errors. Our numerical benchmark examples include the calculation of elastic constants from completely random QC meshes and the inhomogeneous deformation of aggressively coarse-grained crystals containing nano-voids. In the elastic regime, we directly compare QC results to those of full atomistics to assess global and local errors in complex QC simulations. Going beyond elasticity, we illustrate the performance of the energy-based QC method with the new second-order summation rule with the help of nanoindentation examples with automatic mesh adaptation. Overall, our findings provide guidelines for the selection of summation rules for the fully nonlocal energy-based QC method.
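The core idea of a summation rule, replacing the full sum of site energies over all atoms by a weighted sum over a small set of sampling atoms, can be seen in one dimension. The toy below applies a smooth non-uniform strain to a harmonic chain and samples every 50th atom with weight 50; it is a cartoon of the concept only, not the paper's fully nonlocal 2D/3D rules, and all parameters are invented.

    import numpy as np

    # Harmonic chain of N atoms with a smooth, non-uniform displacement field.
    N, a, k = 10_001, 1.0, 1.0
    x = np.arange(N) * a
    x = x + 0.01 * np.sin(2.0 * np.pi * x / (N * a))

    # Site energies: each bond's energy 0.5*k*stretch^2, split between its ends.
    stretch = x[1:] - x[:-1] - a
    site_E = np.zeros(N)
    site_E[:-1] += 0.25 * k * stretch**2
    site_E[1:]  += 0.25 * k * stretch**2

    exact = site_E.sum()
    m = 50                                   # sample every m-th atom, weight m
    approx = m * site_E[::m].sum()
    print(f"exact {exact:.6e}  summation rule {approx:.6e}  "
          f"rel. error {abs(approx - exact) / exact:.2e}")

Because the strain field is smooth, the sampled sum tracks the exact energy closely; the paper's contribution is choosing sampling atoms and weights so that such errors, and the spurious forces they induce, stay small even for non-uniform meshes.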
Olijnyk, Nicholas V
2018-01-01
This study performed two phases of analysis to shed light on the performance and thematic evolution of China's quantum cryptography (QC) research. First, large-scale research publication metadata derived from QC research published from 2001-2017 was used to examine the research performance of China relative to that of global peers using established quantitative and qualitative measures. Second, this study identified the thematic evolution of China's QC research using co-word cluster network analysis, a computational science mapping technique. The results from the first phase indicate that over the past 17 years, China's performance has evolved dramatically, placing it in a leading position. Among the most significant findings is the exponential rate at which all of China's performance indicators (i.e., Publication Frequency, citation score, H-index) are growing. China's H-index (a normalized indicator) has surpassed all other countries' over the last several years. The second phase of analysis shows how China's main research focus has shifted among several QC themes, including quantum-key-distribution, photon-optical communication, network protocols, and quantum entanglement with an emphasis on applied research. Several themes were observed across time periods (e.g., photons, quantum-key-distribution, secret-messages, quantum-optics, quantum-signatures); some themes disappeared over time (e.g., computer-networks, attack-strategies, bell-state, polarization-state), while others emerged more recently (e.g., quantum-entanglement, decoy-state, unitary-operation). Findings from the first phase of analysis provide empirical evidence that China has emerged as the global driving force in QC. Considering China is the premier driving force in global QC research, findings from the second phase of analysis provide an understanding of China's QC research themes, which can provide clarity into how QC technologies might take shape. QC and science and technology policy researchers can also use these findings to trace previous research directions and plan future lines of research. PMID:29385151
Pavlov, Sergey S; Dmitriev, Andrey Yu; Frontasyeva, Marina V
The present status of development of software packages and equipment designed for automation of NAA at the reactor IBR-2 of FLNP, JINR, Dubna, RF, is described. The NAA database, construction of sample changers and software for automation of spectra measurement and calculation of concentrations are presented. Automation of QC procedures is integrated in the software developed. Details of the design are shown.
Adjustment of pesticide concentrations for temporal changes in analytical recovery, 1992–2010
Martin, Jeffrey D.; Eberle, Michael
2011-01-01
Recovery is the proportion of a target analyte that is quantified by an analytical method and is a primary indicator of the analytical bias of a measurement. Recovery is measured by analysis of quality-control (QC) water samples that have known amounts of target analytes added ("spiked" QC samples). For pesticides, recovery is the measured amount of pesticide in the spiked QC sample expressed as a percentage of the amount spiked, ideally 100 percent. Temporal changes in recovery have the potential to adversely affect time-trend analysis of pesticide concentrations by introducing trends in apparent environmental concentrations that are caused by trends in performance of the analytical method rather than by trends in pesticide use or other environmental conditions. This report presents data and models related to the recovery of 44 pesticides and 8 pesticide degradates (hereafter referred to as "pesticides") that were selected for a national analysis of time trends in pesticide concentrations in streams. Water samples were analyzed for these pesticides from 1992 through 2010 by gas chromatography/mass spectrometry. Recovery was measured by analysis of pesticide-spiked QC water samples. Models of recovery, based on robust, locally weighted scatterplot smooths (lowess smooths) of matrix spikes, were developed separately for groundwater and stream-water samples. The models of recovery can be used to adjust concentrations of pesticides measured in groundwater or stream-water samples to 100 percent recovery to compensate for temporal changes in the performance (bias) of the analytical method.
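The adjustment described above amounts to dividing each measured environmental concentration by the modeled fractional recovery at its analysis date. A minimal sketch of that idea, using a robust lowess smooth of spiked-QC recoveries; the data arrays, dates, and smoothing span are placeholders rather than values from the report:

    import numpy as np
    from statsmodels.nonparametric.smoothers_lowess import lowess

    # Hypothetical spiked-QC results: decimal year and measured recovery (%).
    spike_time     = np.array([1992.1, 1995.3, 1999.8, 2003.2, 2007.5, 2010.0])
    spike_recovery = np.array([96.0, 91.0, 85.0, 88.0, 93.0, 90.0])

    # Robust lowess smooth of recovery versus time (span is a guess).
    smooth = lowess(spike_recovery, spike_time, frac=0.6, return_sorted=True)

    # Model the recovery at each environmental sample's date, then adjust the
    # measured concentration to 100 percent recovery.
    sample_time = np.array([1996.0, 2005.5])
    sample_conc = np.array([0.042, 0.013])            # ug/L, measured
    modeled_rec = np.interp(sample_time, smooth[:, 0], smooth[:, 1])
    print(np.round(sample_conc * 100.0 / modeled_rec, 4))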
Countably QC-Approximating Posets
Mao, Xuxin; Xu, Luoshan
2014-01-01
As a generalization of countably C-approximating posets, the concept of countably QC-approximating posets is introduced. With the countably QC-approximating property, some characterizations of generalized completely distributive lattices and generalized countably approximating posets are given. The main results are as follows: (1) a complete lattice is generalized completely distributive if and only if it is countably QC-approximating and weakly generalized countably approximating; (2) a poset L having countably directed joins is generalized countably approximating if and only if the lattice σ_c(L)^op of all σ-Scott-closed subsets of L is weakly generalized countably approximating. PMID:25165730
Desaules, André
2012-11-01
It is crucial for environmental monitoring to fully control temporal bias, i.e., the distortion of real data evolution by bias that varies through time. Temporal bias cannot be fully controlled by statistics alone; it requires appropriate and sufficient metadata, which should be under rigorous and continuous quality assurance and control (QA/QC) to reliably document the degree of consistency of the monitoring system. All of the strategies presented for detecting and controlling temporal data bias (QA/QC, harmonisation/homogenisation/standardisation, the mass balance approach, the use of tracers and analogues, and the control of changing boundary conditions) rely on metadata. The Will Rogers phenomenon, due to subsequent reclassification, is introduced here as a particular source of temporal data bias in environmental monitoring. Sources and effects of temporal data bias are illustrated by examples from the Swiss soil monitoring network. An attempt to make a comprehensive compilation and assessment of the metadata required for soil contamination monitoring reveals that most metadata are still far from reliable. This leads to the conclusion that progress in environmental monitoring requires further development of the concept of environmental metadata for the sake of temporal data bias control, as a prerequisite for reliable interpretations and decisions.
NASA Astrophysics Data System (ADS)
Choi, Hyunwoo; Kim, Tae Geun; Shin, Changhwan
2017-06-01
A topological insulator (TI) is a new kind of material that exhibits unique electronic properties owing to its topological surface state (TSS). Previous studies focused on the transport properties of the TSS, since it can be used as the active channel layer in metal-oxide-semiconductor field-effect transistors (MOSFETs). However, a TI with a negative quantum capacitance (QC) effect can be used in the gate stack of MOSFETs, thereby facilitating the creation of ultra-low power electronics. Therefore, it is important to study the physics behind QC in TIs in the absence of any external magnetic field, at room temperature. We fabricated a simple capacitor structure using a TI (TI-capacitor: Au-TI-SiO2-Si), which shows clear evidence of QC at room temperature. In the capacitance-voltage (C-V) measurement, the total capacitance of the TI-capacitor increases in the accumulation regime, since QC is the dominant capacitive component in the series capacitor model (i.e., C_T^-1 = C_Q^-1 + C_SiO2^-1). Based on the QC model of two-dimensional electron systems, we quantitatively calculated the QC and observed that the simulated C-V curve theoretically supports the conclusion that the QC of the TI-capacitor originates from electron-electron interaction in the two-dimensional surface state of the TI.
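A quick numerical reading of the series model quoted above, C_T^-1 = C_Q^-1 + C_SiO2^-1: a negative C_Q pushes the total capacitance above the oxide capacitance alone, which is the enhancement observed in the accumulation regime. The per-area values below are assumptions for illustration, not measured values from the paper.

    # Series-capacitor model from the abstract: 1/C_T = 1/C_Q + 1/C_SiO2.
    # Capacitances in uF/cm^2; the numbers are illustrative only.
    def total_capacitance(c_q, c_ox):
        return 1.0 / (1.0 / c_q + 1.0 / c_ox)

    c_ox = 0.35
    print(total_capacitance(+2.0, c_ox))   # positive C_Q: C_T < C_ox
    print(total_capacitance(-2.0, c_ox))   # negative C_Q: C_T > C_ox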
NASA Astrophysics Data System (ADS)
Cha, Min Kyoung; Ko, Hyun Soo; Jung, Woo Young; Ryu, Jae Kwang; Choe, Bo-Young
2015-08-01
The Accuracy of registration between positron emission tomography (PET) and computed tomography (CT) images is one of the important factors for reliable diagnosis in PET/CT examinations. Although quality control (QC) for checking alignment of PET and CT images should be performed periodically, the procedures have not been fully established. The aim of this study is to determine optimal quality control (QC) procedures that can be performed at the user level to ensure the accuracy of PET/CT registration. Two phantoms were used to carry out this study: the American college of Radiology (ACR)-approved PET phantom and National Electrical Manufacturers Association (NEMA) International Electrotechnical Commission (IEC) body phantom, containing fillable spheres. All PET/CT images were acquired on a Biograph TruePoint 40 PET/CT scanner using routine protocols. To measure registration error, the spatial coordinates of the estimated centers of the target slice (spheres) was calculated independently for the PET and the CT images in two ways. We compared the images from the ACR-approved PET phantom to that from the NEMA IEC body phantom. Also, we measured the total time required from phantom preparation to image analysis. The first analysis method showed a total difference of 0.636 ± 0.11 mm for the largest hot sphere and 0.198 ± 0.09 mm for the largest cold sphere in the case of the ACR-approved PET phantom. In the NEMA IEC body phantom, the total difference was 3.720 ± 0.97 mm for the largest hot sphere and 4.800 ± 0.85 mm for the largest cold sphere. The second analysis method showed that the differences in the x location at the line profile of the lesion on PET and CT were (1.33, 1.33) mm for a bone lesion, (-1.26, -1.33) mm for an air lesion and (-1.67, -1.60) mm for a hot sphere lesion for the ACR-approved PET phantom. For the NEMA IEC body phantom, the differences in the x location at the line profile of the lesion on PET and CT were (-1.33, 4.00) mm for the air lesion and (1.33, -1.29) mm for a hot sphere lesion. These registration errors from this study were reasonable compared to the errors reported in previous studies. Meanwhile, the total time required from phantom preparation was 67.72 ± 4.50 min for the ACR-approved PET phantom and 96.78 ± 8.50 min for the NEMA IEC body phantom. When the registration errors and the lead times are considered, the method using the ACR-approved PET phantom was more practical and useful than the method using the NEMA IEC body phantom.
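For each target, the registration error reported above is essentially the Euclidean distance between the sphere centre estimated independently on the CT and PET images. A minimal sketch with made-up centre coordinates:

    import numpy as np

    # Hypothetical centre estimates (mm) of one phantom sphere, derived
    # independently from the CT and PET images; values are illustrative only.
    ct_center  = np.array([102.4, 98.7, 55.0])
    pet_center = np.array([102.9, 98.3, 55.4])

    # Registration error = Euclidean distance between the two estimates.
    print(f"{np.linalg.norm(pet_center - ct_center):.3f} mm")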
Material quality assurance risk assessment.
DOT National Transportation Integrated Search
2013-01-01
Over the past two decades the role of SHA has shifted from quality control (QC) of materials and : placement techniques to quality assurance (QA) and acceptance. The role of the Office of Materials : Technology (OMT) has been shifting towards assuran...
Long-term pavement performance indicators for failed materials.
DOT National Transportation Integrated Search
2016-04-01
State Transportation Agencies (STAs) use quality control/quality assurance (QC/QA) specifications to guide the testing and inspection of : road pavement construction. Although failed materials of pavement rarely occur in practice, it is critical to h...
Material quality assurance risk assessment : [summary].
DOT National Transportation Integrated Search
2013-01-01
With the shift from quality control (QC) of materials and placement techniques : to quality assurance (QA) and acceptance over the years, the role of the Office : of Materials Technology (OMT) has been shifting towards assurance of : material quality...
The April 1994 and October 1994 radon intercomparisons at EML
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fisenne, I.M.; George, A.C.; Perry, P.M.
1995-10-01
Quality assurance/quality control (QA/QC) are the backbone of many commercial and research processes and programs. QA/QC research tests the state of a functioning system, be it the production of manufactured goods or the ability to make accurate and precise measurements. The quality of radon measurements in the US has been tested under controlled conditions in semi-annual radon gas intercomparison exercises sponsored by the Environmental Measurements Laboratory (EML) since 1981. The two Calendar Year 1994 radon gas intercomparison exercises were conducted in the EML exposure chamber. Thirty-two groups, including US Federal facilities, USDOE contractors, national and state laboratories, universities and foreign institutions, participated in these exercises. The majority of the participants' results were within ±10% of the EML value at radon concentrations of 570 and 945 Bq m^-3.
Quality control in the year 2000.
Schade, B
1992-01-01
'Just-in-time' production is a prerequisite for a company to meet the challenges of competition. Manufacturing cycles have been so successfully optimized that release time now has become a significant factor. A vision for a major quality-control (QC) contribution to profitability in this decade seems to be the just-in-time release. Benefits will go beyond cost savings for lower inventory. The earlier detection of problems will reduce rejections and scrap. In addition, problem analysis and problem-solving will be easier. To achieve just-in-time release, advanced automated systems like robots will become the workhorses in QC for high volume pharmaceutical production. The requirements for these systems are extremely high in terms of quality, reliability and ruggedness. Crucial for the success might be advances in use of microelectronics for error checks, system recording, trouble shooting, etc. as well as creative new approaches (for example the use of redundant assay systems). PMID:18924930
Large-Scale Topographic Features on Venus: A Comparison by Geological Mapping in Four Quadrangles
NASA Astrophysics Data System (ADS)
Ivanov, M. A.; Head, J. W.
2002-05-01
We have conducted geological mapping in four quadrangles under the NASA program of geological mapping of Venus. Two quadrangles portray large equidimensional lowlands (Lavinia, V55, and Atalanta, V4, Planitiae), and the two other areas are characterized by a large corona (Quetzalpetlatl Corona, QC, V66) and by Lakshmi Planum (LP, V7). Geological mapping of these large-scale features allows for broad comparisons of both their typical structures and their sequences of events. The Planitiae share a number of similar characteristics. (1) Lavinia and Atalanta are broad quasi-circular lowlands 1-2 km deep. (2) The central portions of the basins lack both coronae and large volcanoes. (3) Belts of tectonic deformation characterize the central portions of the basins. (4) There is evidence in both lowlands that they subsided predominantly before the emplacement of regional plains. (5) Recent volcanism is shifted toward the periphery of the basins and occurred after, or at the late stages of, the formation of the lowlands. The above characteristics of the lowlands are better reconciled with the scenario in which their formation is due to broad-scale mantle downwelling that started relatively early in the visible geologic history of Venus. QC and LP are elevated structures roughly comparable in size. The formation of QC is commonly attributed to large-scale positive mantle diapirism, while the formation of LP remains controversial, and both mantle upwelling and downwelling models exist. QC and LP have similar characteristics, such as a broadly circular shape in plan view, association with regional highlands, relatively young associated volcanism, and a topographic moat bordering each from the north. Despite the above similarities, the striking differences between QC and LP are obvious too. LP is crowned by the highest mountain ranges on Venus, whereas QC is bordered from the north by a common belt of ridges. LP itself makes up a regional highland within the upland of Ishtar Terra, while QC produces a much less significant topographic anomaly against the background of the highland of Lada Terra. Highly deformed, tessera-like terrain apparently makes up the basement of LP, whereas QC formed in a tessera-free area. Volcanic activity is concentrated in the central portion of LP, while QC is a regionally important center of young volcanism. These differences, which probably cannot be accounted for by a simple difference in the size of LP and QC, suggest non-similar modes of formation of the two regional structures and do not favor the upwelling models of the formation of LP.
Chen, Haiming; Lu, Chuanjian; Liu, Huazhen; Wang, Maojie; Zhao, Hui; Yan, Yuhong; Han, Ling
2017-07-01
Quercetin (QC) is a dietary flavonoid abundant in many natural plants. A series of studies has shown that it exhibits several biological properties, including anti-inflammatory, anti-oxidant, cardio-protective, vasodilatory, liver-protective and anti-cancer activities. However, the possible therapeutic effect of QC on psoriasis has not previously been reported. The present study was undertaken to evaluate the potential beneficial effect of QC in psoriasis using an imiquimod (IMQ)-induced psoriasis-like mouse model, and to further elucidate its underlying mechanisms of action. The effects of QC on PASI scores, back temperature, histopathological changes, oxidative/anti-oxidative indexes, pro-inflammatory cytokines and the NF-κB pathway in IMQ-induced mice were investigated. Our results showed that QC could significantly reduce the PASI scores, decrease the temperature of the psoriasis-like lesions, and ameliorate the deteriorating histopathology in IMQ-induced mice. Moreover, QC effectively attenuated the levels of TNF-α, IL-6 and IL-17 in serum, increased the activities of GSH, CAT and SOD, and decreased the accumulation of MDA in skin tissue induced by IMQ in mice. The mechanism may be associated with the down-regulation of NF-κB, IKKα, NIK and RelB expression and the up-regulation of TRAF3, which are critically involved in the non-canonical NF-κB pathway. In conclusion, our present study demonstrated that QC had appreciable anti-psoriasis effects in IMQ-induced mice, and the underlying mechanism may involve the improvement of antioxidant and anti-inflammatory status and inhibition of the activation of NF-κB signaling. Hence, QC, a naturally occurring flavone with potent anti-psoriatic effects, has potential for further development as a candidate for psoriasis treatment. Copyright © 2017 Elsevier B.V. All rights reserved.
Improvement of the quality of work in a biochemistry laboratory via measurement system analysis.
Chen, Ming-Shu; Liao, Chen-Mao; Wu, Ming-Hsun; Lin, Chih-Ming
2016-10-31
An adequate and continuous monitoring of operational variations can effectively reduce the uncertainty and enhance the quality of laboratory reports. This study applied the evaluation rule of the measurement system analysis (MSA) method to estimate the quality of work conducted in a biochemistry laboratory. Using the gauge repeatability & reproducibility (GR&R) approach, variations in quality control (QC) data among medical technicians in conducting measurements of five biochemical items, namely, serum glucose (GLU), aspartate aminotransferase (AST), uric acid (UA), sodium (Na) and chloride (Cl), were evaluated. The measurements of the five biochemical items showed different levels of variance among the different technicians, with the variances in GLU measurements being higher than those for the other four items. The ratios of precision-to-tolerance (P/T) for Na, Cl and GLU were all above 0.5, implying inadequate gauge capability. The product variation contribution of Na was large (75.45% and 31.24% in normal and abnormal QC levels, respectively), which showed that the impact of insufficient usage of reagents could not be excluded. With regard to reproducibility, high contributions (of more than 30%) of variation for the selected items were found. These high operator variation levels implied that the possibility of inadequate gauge capacity could not be excluded. The analysis of variance (ANOVA) of GR&R showed that the operator variations in GLU measurements were significant (F=5.296, P=0.001 in the normal level and F=3.399, P=0.015 in the abnormal level, respectively). In addition to operator variations, product variations of Na were also significant for both QC levels. The heterogeneity of variance for the five technicians showed significant differences for the Na and Cl measurements in the normal QC level. The accuracy of QC for five technicians was identified for further operational improvement. This study revealed that MSA can be used to evaluate product and personnel errors and to improve the quality of work in a biochemical laboratory through proper corrective actions.
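A minimal GR&R-style calculation in the spirit of the study: estimate the repeatability (within-technician) and reproducibility (between-technician) variance components from replicate QC measurements, then form the precision-to-tolerance ratio. The measurements, tolerance limits, and the 6-sigma multiplier are assumptions for illustration, not the study's data.

    import numpy as np

    # Hypothetical QC results: rows = technicians, columns = replicates.
    data = np.array([[5.02, 5.05, 4.98],
                     [5.10, 5.12, 5.08],
                     [4.95, 4.99, 4.97]])
    n_ops, n_rep = data.shape
    op_means = data.mean(axis=1)

    # Repeatability: pooled within-technician variance.
    var_repeat = ((data - op_means[:, None]) ** 2).sum() / (n_ops * (n_rep - 1))
    # Reproducibility: between-technician component (method of moments).
    var_between = ((op_means - data.mean()) ** 2).sum() / (n_ops - 1)
    var_reprod = max(var_between - var_repeat / n_rep, 0.0)

    sigma_gauge = np.sqrt(var_repeat + var_reprod)
    USL, LSL = 5.5, 4.5                        # assumed tolerance limits
    print(f"P/T = {6 * sigma_gauge / (USL - LSL):.2f}")

By the criterion used in the study, a P/T ratio above 0.5 would flag inadequate gauge capability.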
Statistical analysis of QC data and estimation of fuel rod behaviour
NASA Astrophysics Data System (ADS)
Heins, L.; Groß, H.; Nissen, K.; Wunderlich, F.
1991-02-01
The behaviour of fuel rods while in reactor is influenced by many parameters. As far as fabrication is concerned, fuel pellet diameter and density, and inner cladding diameter are important examples. Statistical analyses of quality control data show a scatter of these parameters within the specified tolerances. At present it is common practice to use a combination of superimposed unfavorable tolerance limits (worst case dataset) in fuel rod design calculations. Distributions are not considered. The results obtained in this way are very conservative but the degree of conservatism is difficult to quantify. Probabilistic calculations based on distributions allow the replacement of the worst case dataset by a dataset leading to results with known, defined conservatism. This is achieved by response surface methods and Monte Carlo calculations on the basis of statistical distributions of the important input parameters. The procedure is illustrated by means of two examples.
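A sketch of the probabilistic alternative the abstract describes: draw the fabrication parameters from their QC-derived distributions, truncated to the specified tolerances, propagate them through a response function, and compare a high quantile of the result with the worst-case tolerance combination. The distributions, tolerance bands, and the simple gap response below are invented for illustration; the paper itself uses response surfaces fitted to fuel rod design calculations.

    import numpy as np

    rng = np.random.default_rng(seed=1)
    N = 100_000

    def truncated_normal(mean, sd, lo, hi, size):
        # Rejection-sample a normal restricted to the tolerance band [lo, hi].
        out = rng.normal(mean, sd, size)
        bad = (out < lo) | (out > hi)
        while bad.any():
            out[bad] = rng.normal(mean, sd, bad.sum())
            bad = (out < lo) | (out > hi)
        return out

    # Assumed fabrication parameters (mm): pellet outer and cladding inner diameter.
    pellet_od = truncated_normal(8.05, 0.01, 8.02, 8.08, N)
    clad_id   = truncated_normal(8.22, 0.01, 8.19, 8.25, N)

    gap = clad_id - pellet_od                  # simple stand-in response
    print(f"worst-case gap: {8.19 - 8.08:.3f} mm")
    print(f"0.1% quantile : {np.quantile(gap, 0.001):.3f} mm")

The 0.1% quantile sits well above the worst-case value, which illustrates the known, defined conservatism the abstract contrasts with superimposed tolerance limits.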
The Navy’s Quality Journey: Operational Implementation of TQL
1993-04-01
training. Dr. Kaoru Ishikawa, "Guide to Quality Control": "QC begins with education and ends with education. To implement TQC, we need to carry out...York: McGraw-Hill, 1986. 20. Ishikawa, Kaoru. What is Total Quality Control? Englewood Cliffs, NJ: Prentice-Hall, Inc., 1985. 21. Ishikawa, Kaoru
Stability of Tetrahydrocannabinol and Cannabidiol in Prepared Quality Control Medible Brownies.
Wolf, Carl E; Poklis, Justin L; Poklis, Alphonse
2017-03-01
The legalization of marijuana in the USA for both medicinal and recreational use has increased in the past few years. Currently, 24 states have legalized marijuana for medicinal use. The US Drug Enforcement Administration has classified marijuana as a Schedule I substance. The US Food and Drug Administration does not regulate formulations or packages of marijuana that are currently marketed in states that have legalized marijuana. Marijuana edibles, or "medibles", are typically packaged candies and baked goods consumed for medicinal as well as recreational marijuana use. They contain the major psychoactive drug in marijuana, delta-9-tetrahydrocannabinol (THC), and/or cannabidiol (CBD), which has reputed medicinal properties. Presented is a method for the preparation and application of THC- and CBD-containing brownies used as quality control (QC) material for the analysis of marijuana or cannabinoid baked medibles. The performance parameters of the assay, including possible matrix effects and cannabinoid stability in the brownie QC over time, are presented. It was determined that the process used to prepare and bake the brownie control material did not degrade the THC or CBD. The brownie matrix was found not to interfere with the analysis of THC or CBD. Ten commercially available brownie matrixes were evaluated for potential interferences; none were found to interfere with the analysis of THC or CBD. The laboratory-baked medible QC material was found to be stable at room temperature for at least 3 months. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Samuelson, John; Robbins, Phillips W.
2014-01-01
Asparagine-linked glycans (N-glycans) of medically important protists have much to tell us about the evolution of N-glycosylation and of N-glycan-dependent quality control (N-glycan QC) of protein folding in the endoplasmic reticulum. While host N-glycans are built upon a dolichol-pyrophosphate-linked precursor with 14 sugars (Glc3Man9GlcNAc2), protist N-glycan precursors vary from Glc3Man9GlcNAc2 (Acanthamoeba) to Man9GlcNAc2 (Trypanosoma) to Glc3Man5GlcNAc2 (Toxoplasma) to Man5GlcNAc2 (Entamoeba, Trichomonas, and Eimeria) to GlcNAc2 (Plasmodium and Giardia) to zero (Theileria). As related organisms have differing N-glycan lengths (e.g. Toxoplasma, Eimeria, Plasmodium, and Theileria), the present N-glycan variation is based upon secondary loss of Alg genes, which encode enzymes that add sugars to the N-glycan precursor. An N-glycan precursor with Man5GlcNAc2 is necessary but not sufficient for N-glycan QC, which is predicted by the presence of the UDP-glucose:glucosyltransferase (UGGT) plus calreticulin and/or calnexin. As many parasites lack glucose in their N-glycan precursor, UGGT product may be identified by inhibition of glucosidase II. The presence of an armless calnexin in Toxoplasma suggests secondary loss of N-glycan QC from coccidia. Positive selection for N-glycan sites occurs in secreted proteins of organisms with NG-QC and is based upon an increased likelihood of threonine but not serine in the second position versus asparagine. In contrast, there appears to be selection against N-glycan length in Plasmodium and N-glycan site density in Toxoplasma. Finally, there is suggestive evidence for N-glycan-dependent ERAD in Trichomonas, which glycosylates and degrades the exogenous reporter mutant carboxypeptidase Y (CPY*). PMID:25475176
NASA Astrophysics Data System (ADS)
Maity, H.; Biswas, A.; Bhattacharjee, A. K.; Pal, A.
In this paper, we propose a quantum cost (QC)-optimized design of a 4-bit reversible universal shift register (RUSR) using a reduced number of reversible logic gates. The proposed design is useful in quantum computing due to its low QC, small number of reversible logic gates, and low delay. The QC, number of gates, and garbage outputs (GOs) of the proposed design are 64, 8, and 16, respectively. Compared with the latest reported results, the QC is improved by 5.88% to 70.9% and the gate count by 60% to 83.33%.
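The quoted improvement ranges follow from the usual reduction metric (baseline - proposed)/baseline. The baseline values below are back-calculated from the percentages in the abstract and are therefore inferred, not values stated by the authors:

    # Percentage reduction relative to a prior (baseline) design.
    def improvement(baseline, proposed):
        return (baseline - proposed) / baseline * 100.0

    proposed_qc, proposed_gates = 64, 8
    for qc_base in (68, 220):        # implied prior quantum costs (inferred)
        print(f"QC {qc_base:3d} -> {improvement(qc_base, proposed_qc):.2f}%")
    for gate_base in (20, 48):       # implied prior gate counts (inferred)
        print(f"gates {gate_base:2d} -> {improvement(gate_base, proposed_gates):.2f}%")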
The Quasicontinuum Method: Overview, applications and current directions
NASA Astrophysics Data System (ADS)
Miller, Ronald E.; Tadmor, E. B.
2002-10-01
The Quasicontinuum (QC) Method, originally conceived and developed by Tadmor, Ortiz and Phillips [1] in 1996, has since seen a great deal of development and application by a number of researchers. The idea of the method is a relatively simple one. With the goal of modeling an atomistic system without explicitly treating every atom in the problem, the QC provides a framework whereby degrees of freedom are judiciously eliminated and force/energy calculations are expedited. This is combined with adaptive model refinement to ensure that full atomistic detail is retained in regions of the problem where it is required while continuum assumptions reduce the computational demand elsewhere. This article provides a review of the method, from its original motivations and formulation to recent improvements and developments. A summary of the important mechanics of materials results that have been obtained using the QC approach is presented. Finally, several related modeling techniques from the literature are briefly discussed. As an accompaniment to this paper, a website designed to serve as a clearinghouse for information on the QC method has been established at www.qcmethod.com. The site includes information on QC research, links to researchers, downloadable QC code and documentation.
NASA Astrophysics Data System (ADS)
Dirisu, Afusat Olayinka
Quantum Cascade (QC) lasers are intersubband light sources operating in the wavelength range of ~3 to 300 μm and are used in applications such as sensing (environmental, biological, and hazardous chemical), infrared countermeasures, and free-space infrared communications. The mid-infrared range (i.e., λ ~ 3-30 μm) is of particular importance in sensing because of the strong interaction of laser radiation with various chemical species, while in free-space communications the atmospheric windows of 3-5 μm and 8-12 μm are highly desirable for low-loss transmission. Some of the requirements of these applications include: (1) high output power for improved sensitivity; (2) high operating temperatures for compact and cost-effective systems; (3) wide tunability; (4) single-mode operation for high selectivity. In the past, available mid-infrared sources, such as lead-salt and solid-state lasers, were bulky, expensive, or emitted low output power. In recent years, QC lasers have been explored as cost-effective and compact sources because of their potential to satisfy and exceed all the above requirements. Also, the ultrafast carrier lifetimes of intersubband transitions in QC lasers are promising for high-bandwidth free-space infrared communication. This thesis focused on the improvement of QC lasers through the design and optimization of the laser cavity and characterization of the laser gain medium. The optimization of the laser cavity included (1) the design and fabrication of high-reflection Bragg gratings and subwavelength antireflection gratings, by focused ion beam milling, to achieve tunable, single-mode and high-power QC lasers, and (2) modeling of slab-coupled optical waveguide QC lasers for high-brightness output beams. The characterization of the QC laser gain medium was carried out using the single-pass transmission experiment, a sensitive measurement technique for probing the intersubband transitions and the electron distribution of QC lasers under different temperatures and applied bias conditions, unlike typical infrared measurement techniques that are restricted to non-functional devices. With the single-pass technique, a basic understanding of the physics behind the workings of the QC laser gain medium can be achieved, which is invaluable in the design of QC lasers with high output power and high operating temperatures.
Managing the Quality of Environmental Data in EPA Region 9
EPA Pacific Southwest, Region 9's Quality Assurance (QA) section's primary mission is to effectively oversee and carry out the Quality System and Quality Management Plan, and project-level quality assurance and quality control (QA/QC) activities.
QC/QA : evaluation of effectiveness in Kentucky.
DOT National Transportation Integrated Search
2008-06-30
Quality control and quality assurance in the highway industry is going through a cultural shift. There is a growing trend toward using the contractor data for acceptance and payment purpose. This has led to serious concerns about conflicts of interes...
Ensuring the reliability of stable isotope ratio data--beyond the principle of identical treatment.
Carter, J F; Fry, B
2013-03-01
The need for inter-laboratory comparability is crucial to facilitate the globalisation of scientific networks and the development of international databases to support scientific and criminal investigations. This article considers what lessons can be learned from a series of inter-laboratory comparison exercises organised by the Forensic Isotope Ratio Mass Spectrometry (FIRMS) network in terms of reference materials (RMs), the management of data quality, and technical limitations. The results showed that within-laboratory precision (repeatability) was generally good but between-laboratory accuracy (reproducibility) called for improvements. This review considers how stable isotope laboratories can establish a system of quality control (QC) and quality assurance (QA), emphasising issues of repeatability and reproducibility. For results to be comparable between laboratories, measurements must be traceable to the international δ-scales and, because isotope ratio measurements are reported relative to standards, a key aspect is the correct selection, calibration, and use of international and in-house RMs. The authors identify four principles which promote good laboratory practice. The principle of identical treatment by which samples and RMs are processed in an identical manner and which incorporates three further principles; the principle of identical correction (by which necessary corrections are identified and evenly applied), the principle of identical scaling (by which data are shifted and stretched to the international δ-scales), and the principle of error detection by which QC and QA results are monitored and acted upon. To achieve both good repeatability and good reproducibility it is essential to obtain RMs with internationally agreed δ-values. These RMs will act as the basis for QC and can be used to calibrate further in-house QC RMs tailored to the activities of specific laboratories. In-house QA standards must also be developed to ensure that QC-based calibrations and corrections lead to accurate results for samples. The δ-values assigned to RMs must be recorded and reported with all data. Reference materials must be used to determine what corrections are necessary for measured data. Each analytical sequence of samples must include both QC and QA materials which are subject to identical treatment during measurement and data processing. Results for these materials must be plotted, monitored, and acted upon. Periodically international RMs should be analysed as an in-house proficiency test to demonstrate results are accurate.
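The "shift and stretch" of the principle of identical scaling is, in practice, a two-point linear normalization between the measured and accepted δ-values of two reference materials. A minimal sketch with invented numbers, loosely in the style of a VSMOW/SLAP-type calibration:

    def normalize_delta(measured, rm1_meas, rm1_true, rm2_meas, rm2_true):
        # Two-point shift-and-stretch of raw delta values onto the
        # international scale anchored by two reference materials.
        stretch = (rm2_true - rm1_true) / (rm2_meas - rm1_meas)
        return rm1_true + (measured - rm1_meas) * stretch

    # Assumed accepted (true) and measured delta values of two RMs, in per mil.
    rm1_true, rm1_meas = 0.0, -0.8
    rm2_true, rm2_meas = -428.0, -423.5

    print(normalize_delta(-100.0, rm1_meas, rm1_true, rm2_meas, rm2_true))

Run within every analytical sequence, the same calibration applied to QA materials (treated identically but not used in the calibration) provides the error detection the fourth principle calls for.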
NASA Astrophysics Data System (ADS)
Huang, Sheng; Ao, Xiang; Li, Yuan-yuan; Zhang, Rui
2016-09-01
In order to meet the needs of the high-speed development of optical communication systems, a construction method for quasi-cyclic low-density parity-check (QC-LDPC) codes based on the multiplicative group of a finite field is proposed. The Tanner graph of the parity check matrix of the code constructed by this method has no cycles of length 4, which ensures that the obtained code has a good distance property. Simulation results show that at a bit error rate (BER) of 10^-6, in the same simulation environment, the net coding gain (NCG) of the proposed QC-LDPC(3780, 3540) code with a code rate of 93.7% is improved by 2.18 dB and 1.6 dB, respectively, compared with those of the RS(255, 239) code in ITU-T G.975 and the LDPC(32640, 30592) code in ITU-T G.975.1. In addition, the NCG of the proposed QC-LDPC(3780, 3540) code is 0.2 dB and 0.4 dB higher, respectively, than those of the SG-QC-LDPC(3780, 3540) code based on two different subgroups of a finite field and the AS-QC-LDPC(3780, 3540) code based on two arbitrary sets of a finite field. Thus, the proposed QC-LDPC(3780, 3540) code can be well applied in optical communication systems.
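Constructions of this family typically pick circulant-permutation-matrix (CPM) shift values from the multiplicative group of GF(p) and rely on the product structure of the shifts to exclude length-4 cycles. The abstract does not give the exact recipe, so the sketch below is a generic illustration under assumed parameters (the prime p, the group elements a and b, and the matrix dimensions are all invented); the girth check is Fossorier's standard condition.

    import numpy as np

    def cpm(shift, size):
        # Circulant permutation matrix: identity with columns cyclically shifted.
        return np.roll(np.eye(size, dtype=np.uint8), shift, axis=1)

    def qc_ldpc_H(p=17, a=3, b=5, J=3, L=5):
        # Shift values s[i][j] = a^i * b^j mod p from the multiplicative group
        # of GF(p); a generic recipe with assumed parameters.
        shifts = [[(pow(a, i, p) * pow(b, j, p)) % p for j in range(L)]
                  for i in range(J)]
        H = np.block([[cpm(s, p) for s in row] for row in shifts])
        return H, shifts

    def girth_at_least_six(shifts, p):
        # Fossorier's condition: no length-4 cycles iff, for all row pairs and
        # column pairs, s[i1][j1] - s[i1][j2] + s[i2][j2] - s[i2][j1] != 0 mod p.
        J, L = len(shifts), len(shifts[0])
        return all((shifts[i1][j1] - shifts[i1][j2]
                    + shifts[i2][j2] - shifts[i2][j1]) % p != 0
                   for i1 in range(J) for i2 in range(i1 + 1, J)
                   for j1 in range(L) for j2 in range(j1 + 1, L))

    H, shifts = qc_ldpc_H()
    print(H.shape, "girth >= 6:", girth_at_least_six(shifts, p=17))

Because the shifts factor as (a^i)·(b^j), the 4-cycle sum factors into (a^{i1} - a^{i2})(b^{j1} - b^{j2}) mod p, which is nonzero whenever the powers involved are distinct (i.e., the multiplicative orders of a and b exceed the matrix dimensions); this is the mechanism that keeps the Tanner graph free of length-4 cycles in such constructions.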