Quality Control (QC) System Development for the Pell Grant Program: A Conceptual Framework.
ERIC Educational Resources Information Center
Advanced Technology, Inc., Reston, VA.
The objectives of the Pell Grant quality control (QC) system and the general definition of QC are considered. Attention is also directed to: the objectives of the Stage II Pell Grant QC system design and testing project, the approach used to develop the QC system, and the interface of the QC system and the Pell Grant delivery system. The…
A Total Quality-Control Plan with Right-Sized Statistical Quality-Control.
Westgard, James O
2017-03-01
A new Clinical Laboratory Improvement Amendments option for risk-based quality-control (QC) plans became effective in January 2016. Called an Individualized QC Plan, this option requires the laboratory to perform a risk assessment, develop a QC plan, and implement a QC program to monitor ongoing performance of the QC plan. Difficulties in performing a risk assessment may limit the validity of an Individualized QC Plan. A better alternative is to develop a Total QC Plan including a right-sized statistical QC procedure to detect medically important errors. Westgard Sigma Rules provides a simple way to select the right control rules and the right number of control measurements. Copyright © 2016 Elsevier Inc. All rights reserved.
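As a rough illustration of the arithmetic behind this approach, the sketch below computes the sigma metric as (TEa - |bias|) / CV and maps it to a control-rule suggestion; the cutoffs and example figures are illustrative assumptions, not the published Westgard Sigma Rules charts.

```python
def sigma_metric(tea_pct, bias_pct, cv_pct):
    """Sigma metric on the percentage scale: (TEa - |bias|) / CV."""
    return (tea_pct - abs(bias_pct)) / cv_pct

def suggest_qc(sigma):
    """Very coarse rule selection in the spirit of Westgard Sigma Rules.

    Thresholds are illustrative assumptions for this sketch; consult the
    published Westgard Sigma Rules charts for actual recommendations.
    """
    if sigma >= 6:
        return "1-3s, n=2"                  # a single simple rule suffices
    if sigma >= 5:
        return "1-3s/2-2s/R-4s, n=2"
    if sigma >= 4:
        return "1-3s/2-2s/R-4s/4-1s, n=4"
    return "full multirule with n>=6 and/or more frequent QC"

# Example: allowable total error 10%, observed bias 1.5%, CV 1.7%
s = sigma_metric(10.0, 1.5, 1.7)
print(f"sigma = {s:.1f} -> {suggest_qc(s)}")
```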
Jones, A Kyle; Heintz, Philip; Geiser, William; Goldman, Lee; Jerjian, Khachig; Martin, Melissa; Peck, Donald; Pfeiffer, Douglas; Ranger, Nicole; Yorkston, John
2015-11-01
Quality control (QC) in medical imaging is an ongoing process and not just a series of infrequent evaluations of medical imaging equipment. The QC process involves designing and implementing a QC program, collecting and analyzing data, investigating results that are outside the acceptance levels for the QC program, and taking corrective action to bring these results back to an acceptable level. The QC process involves key personnel in the imaging department, including the radiologist, radiologic technologist, and the qualified medical physicist (QMP). The QMP performs detailed equipment evaluations and helps with oversight of the QC program, while the radiologic technologist is responsible for the day-to-day operation of the QC program. The continued need for ongoing QC in digital radiography has been highlighted in the scientific literature. The charge of this task group was to recommend consistency tests designed to be performed by a medical physicist or a radiologic technologist under the direction of a medical physicist to identify problems with an imaging system that need further evaluation by a medical physicist, including a fault tree to define actions that need to be taken when certain fault conditions are identified. The focus of this final report is the ongoing QC process, including rejected image analysis, exposure analysis, and artifact identification. These QC tasks are vital for the optimal operation of a department performing digital radiography.
Jayakody, Chatura; Hull-Ryde, Emily A
2016-01-01
Well-defined quality control (QC) processes are used to determine whether a certain procedure or action conforms to a widely accepted standard and/or set of guidelines, and are important components of any laboratory quality assurance program (Popa-Burke et al., J Biomol Screen 14: 1017-1030, 2009). In this chapter, we describe QC procedures useful for monitoring the accuracy and precision of laboratory instrumentation, most notably automated liquid dispensers. Two techniques, gravimetric QC and photometric QC, are highlighted in this chapter. When used together, these simple techniques provide a robust process for evaluating liquid handler accuracy and precision, and critically underpin high-quality research programs.
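A minimal sketch of the gravimetric technique described above, assuming dispenses of water near 20 °C (density about 0.9982 g/mL) and hypothetical balance readings; inaccuracy is the mean percent deviation from the target volume and imprecision is the CV of the replicates.

```python
import statistics

WATER_DENSITY_G_PER_ML = 0.9982  # ~20 degC; assumed for this sketch

def gravimetric_qc(masses_g, target_ul):
    """Convert replicate dispense masses to volumes; report accuracy and precision."""
    volumes_ul = [m / WATER_DENSITY_G_PER_ML * 1000.0 for m in masses_g]
    mean_v = statistics.mean(volumes_ul)
    cv_pct = statistics.stdev(volumes_ul) / mean_v * 100.0   # imprecision
    inaccuracy_pct = (mean_v - target_ul) / target_ul * 100.0
    return mean_v, inaccuracy_pct, cv_pct

# Hypothetical balance readings (g) for ten 100-uL dispenses
masses = [0.0999, 0.1002, 0.0998, 0.1001, 0.1000,
          0.0997, 0.1003, 0.0999, 0.1001, 0.1000]
mean_v, acc, cv = gravimetric_qc(masses, target_ul=100.0)
print(f"mean volume {mean_v:.2f} uL, inaccuracy {acc:+.2f}%, CV {cv:.2f}%")
```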
Results-driven approach to improving quality and productivity
John Dramm
2000-01-01
Quality control (QC) programs do not often realize their full potential. Elaborate and expensive QC programs can easily get sidetracked by the process of building a program with promises of "Someday, this will all pay off." Training employees in QC methods is no guarantee that quality will improve. Several documented cases show that such activity-centered efforts...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Angers, Crystal Plume; Bottema, Ryan; Buckley, Les
Purpose: Treatment unit uptime statistics are typically used to monitor radiation equipment performance. The Ottawa Hospital Cancer Centre has introduced the use of Quality Control (QC) test success as a quality indicator for equipment performance and overall health of the equipment QC program. Methods: Implemented in 2012, QATrack+ is used to record and monitor over 1100 routine machine QC tests each month for 20 treatment and imaging units (http://qatrackplus.com/). Using an SQL (structured query language) script, automated queries of the QATrack+ database are used to generate program metrics such as the number of QC tests executed and the percentage of tests passing, at tolerance or at action. These metrics are compared against machine uptime statistics already reported within the program. Results: Program metrics for 2015 show good correlation between pass rate of QC tests and uptime for a given machine. For the nine conventional linacs, the QC test success rate was consistently greater than 97%. The corresponding uptimes for these units are better than 98%. Machines that consistently show higher failure or tolerance rates in the QC tests have lower uptimes. This points to either poor machine performance requiring corrective action or to problems with the QC program. Conclusions: QATrack+ significantly improves the organization of QC data but can also aid in overall equipment management. Complementing machine uptime statistics with QC test metrics provides a more complete picture of overall machine performance and can be used to identify areas of improvement in the machine service and QC programs.
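The flavor of such an automated query can be sketched as follows; the table and column names are hypothetical stand-ins rather than QATrack+'s actual schema, and an in-memory SQLite database keeps the example self-contained.

```python
import sqlite3

# Hypothetical schema standing in for a QC database; QATrack+'s real schema
# differs, and a production site would query its PostgreSQL/MySQL backend.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE qc_result (unit TEXT, test TEXT, status TEXT, done DATE);
INSERT INTO qc_result VALUES
  ('linac1', 'output', 'ok',        '2015-06-01'),
  ('linac1', 'output', 'tolerance', '2015-06-02'),
  ('linac1', 'lasers', 'ok',        '2015-06-02'),
  ('linac2', 'output', 'action',    '2015-06-01'),
  ('linac2', 'output', 'ok',        '2015-06-02');
""")

# Per-unit test counts and pass/tolerance/action rates, analogous to the
# program metrics described in the abstract above.
for row in conn.execute("""
    SELECT unit,
           COUNT(*)                                     AS n_tests,
           100.0 * SUM(status = 'ok')        / COUNT(*) AS pct_ok,
           100.0 * SUM(status = 'tolerance') / COUNT(*) AS pct_tolerance,
           100.0 * SUM(status = 'action')    / COUNT(*) AS pct_action
    FROM qc_result GROUP BY unit ORDER BY unit"""):
    print(row)
```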
A Framework for a Quality Control System for Vendor/Processor Contracts.
ERIC Educational Resources Information Center
Advanced Technology, Inc., Reston, VA.
A framework for monitoring quality control (QC) of processor contracts administered by the Department of Education's Office of Student Financial Assistance (OSFA) is presented and applied to the Pell Grant program. Guidelines for establishing QC measures and standards are included, and the uses of a sampling procedure in the QC system are…
Preliminary Quality Control System Design for the Pell Grant Program.
ERIC Educational Resources Information Center
Advanced Technology, Inc., Reston, VA.
A preliminary design for a quality control (QC) system for the Pell Grant Program is proposed, based on the needs of the Office of Student Financial Assistance (OSFA). The applicability of the general design for other student aid programs administered by OSFA is also considered. The following steps included in a strategic approach to QC system…
76 FR 67315 - Supplemental Nutrition Assistance Program: Quality Control Error Tolerance Threshold
Federal Register 2010, 2011, 2012, 2013, 2014
2011-11-01
...This direct final rule is amending the Quality Control (QC) review error threshold in our regulations from $25.00 to $50.00. The purpose of raising the QC error threshold is to make permanent the temporary threshold change that was required by the American Recovery and Reinvestment Act of 2009. This change does not have an impact on the public. The QC system measures the accuracy of the eligibility system for the Supplemental Nutrition Assistance Program (SNAP).
Data-quality measures for stakeholder-implemented watershed-monitoring programs
Greve, Adrienne I.
2002-01-01
Community-based watershed groups, many of which collect environmental data, have steadily increased in number over the last decade. The data generated by these programs are often underutilized due to uncertainty in the quality of data produced. The incorporation of data-quality measures into stakeholder monitoring programs lends statistical validity to data. Data-quality measures are divided into three steps: quality assurance, quality control, and quality assessment. The quality-assurance step attempts to control sources of error that cannot be directly quantified. This step is part of the design phase of a monitoring program and includes clearly defined, quantifiable objectives, sampling sites that meet the objectives, standardized protocols for sample collection, and standardized laboratory methods. Quality control (QC) is the collection of samples to assess the magnitude of error in a data set due to sampling, processing, transport, and analysis. In order to design a QC sampling program, a series of issues needs to be considered: (1) potential sources of error, (2) the type of QC samples, (3) inference space, (4) the number of QC samples, and (5) the distribution of the QC samples. Quality assessment is the process of evaluating quality-assurance measures and analyzing the QC data in order to interpret the environmental data. Quality assessment has two parts: one that is conducted on an ongoing basis as the monitoring program is running, and one that is conducted during the analysis of environmental data. The discussion of the data-quality measures is followed by an example of their application to a monitoring program in the Big Thompson River watershed of northern Colorado.
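For the quality-assessment step, one standard calculation on the field-duplicate QC samples described above is the relative percent difference (RPD); a short sketch with invented concentration pairs and an assumed acceptance limit:

```python
def rpd(primary, duplicate):
    """Relative percent difference between a sample and its field duplicate."""
    return abs(primary - duplicate) / ((primary + duplicate) / 2.0) * 100.0

# Hypothetical field-duplicate pairs (mg/L) and an assumed acceptance limit;
# real programs set their own limits in the quality-assurance plan.
pairs = [(4.1, 4.3), (0.52, 0.61), (12.0, 11.4)]
RPD_LIMIT_PCT = 20.0

for p, d in pairs:
    value = rpd(p, d)
    flag = "ok" if value <= RPD_LIMIT_PCT else "exceeds limit"
    print(f"{p} vs {d}: RPD = {value:.1f}% ({flag})")
```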
Quality Assurance and Control Considerations in Environmental Measurements and Monitoring
NASA Astrophysics Data System (ADS)
Sedlet, Jacob
1982-06-01
Quality assurance and quality control have become accepted as essential parts of all environmental surveillance, measurements, and monitoring programs, both nuclear and non-nuclear. The same principles and details apply to each. It is primarily the final measurement technique that differs. As the desire and need to measure smaller amounts of pollutants with greater accuracy have increased, it has been recognized that quality assurance and control programs are cost-effective in achieving the expected results. Quality assurance (QA) consists of all the actions necessary to provide confidence in the results. Quality control (QC) is a part of QA, and consists of those actions and activities that permit the control of the individual steps in the environmental program. The distinction between the two terms is not always clearly defined, but a sharp division is not necessary. The essential principle of QA and QC is a commitment to high quality results. The essential components of a QA and QC program are a complete, written procedures manual for all parts of the environmental program, the use of standard or validated procedures, participation in applicable interlaboratory comparison or QA programs, replicate analysis and measurement, training of personnel, and a means of auditing or checking that the QA and QC programs are properly conducted. These components are discussed below in some detail.
Rosenbaum, Matthew W; Flood, James G; Melanson, Stacy E F; Baumann, Nikola A; Marzinke, Mark A; Rai, Alex J; Hayden, Joshua; Wu, Alan H B; Ladror, Megan; Lifshitz, Mark S; Scott, Mitchell G; Peck-Palmer, Octavia M; Bowen, Raffick; Babic, Nikolina; Sobhani, Kimia; Giacherio, Donald; Bocsi, Gregary T; Herman, Daniel S; Wang, Ping; Toffaletti, John; Handel, Elizabeth; Kelly, Kathleen A; Albeiroti, Sami; Wang, Sihe; Zimmer, Melissa; Driver, Brandon; Yi, Xin; Wilburn, Clayton; Lewandrowski, Kent B
2018-05-29
In the United States, minimum standards for quality control (QC) are specified in federal law under the Clinical Laboratory Improvement Amendment and its revisions. Beyond meeting this required standard, laboratories have flexibility to determine their overall QC program. We surveyed chemistry and immunochemistry QC procedures at 21 clinical laboratories within leading academic medical centers to assess if standardized QC practices exist for chemistry and immunochemistry testing. We observed significant variation and unexpected similarities in practice across laboratories, including QC frequency, cutoffs, number of levels analyzed, and other features. This variation in practice indicates an opportunity exists to establish an evidence-based approach to QC that can be generalized across institutions.
77 FR 75968 - Submission for OMB Review; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-26
... information unless it displays a currently valid OMB control number. Food and Nutrition Service Title: Quality... required to perform Quality Control (QC) review for the Supplemental Nutrition Assistance Program (SNAP). The FNS-380-1, Quality Control Review Schedule is for State use to collect both QC data and case...
jqcML: an open-source java API for mass spectrometry quality control data in the qcML format.
Bittremieux, Wout; Kelchtermans, Pieter; Valkenborg, Dirk; Martens, Lennart; Laukens, Kris
2014-07-03
The awareness that systematic quality control is an essential factor to enable the growth of proteomics into a mature analytical discipline has increased over the past few years. To this end, a controlled vocabulary and document structure, called qcML, have recently been proposed by Walzer et al. to store and disseminate quality-control metrics for mass-spectrometry-based proteomics experiments. To facilitate the adoption of this standardized quality control routine, we introduce jqcML, a Java application programming interface (API) for the qcML data format. First, jqcML provides a complete object model to represent qcML data. Second, jqcML provides the ability to read, write, and work in a uniform manner with qcML data from different sources, including the XML-based qcML file format and the relational database qcDB. Interaction with the XML-based file format is obtained through the Java Architecture for XML Binding (JAXB), while generic database functionality is obtained by the Java Persistence API (JPA). jqcML is released as open-source software under the permissive Apache 2.0 license and can be downloaded from https://bitbucket.org/proteinspector/jqcml .
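jqcML itself is a Java API, but because qcML is XML-based, the general idea of programmatic access can be sketched in a few lines of Python; the element and attribute names below are simplified assumptions for illustration, not the exact qcML schema.

```python
import xml.etree.ElementTree as ET

# A simplified, hypothetical qcML-like document; the real schema defines
# controlled-vocabulary accessions and a richer structure.
doc = """
<qcML>
  <runQuality ID="run_1">
    <qualityParameter name="MS1 spectra count" value="12034"/>
    <qualityParameter name="MS2 spectra count" value="48120"/>
  </runQuality>
</qcML>
"""

root = ET.fromstring(doc)
for run in root.findall("runQuality"):
    print("run:", run.get("ID"))
    for qp in run.findall("qualityParameter"):
        print(f"  {qp.get('name')} = {qp.get('value')}")
```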
Applying Sigma Metrics to Reduce Outliers.
Litten, Joseph
2017-03-01
Sigma metrics can be used to predict assay quality, allowing easy comparison of instrument quality and predicting which tests will require minimal quality control (QC) rules to monitor the performance of the method. A Six Sigma QC program can result in fewer controls and fewer QC failures for methods with a sigma metric of 5 or better. The higher the number of methods with a sigma metric of 5 or better, the lower the costs for reagents, supplies, and control material required to monitor the performance of the methods. Copyright © 2016 Elsevier Inc. All rights reserved.
Office of Student Financial Aid Quality Improvement Program: Design and Implementation Plan.
ERIC Educational Resources Information Center
Advanced Technology, Inc., Reston, VA.
The purpose and direction of the Office of Student Financial Aid (OSFA) quality improvement program are described. The background and context for the Pell Grant quality control (QC) design study and the meaning of QC are reviewed. The general approach to quality improvement consists of the following elements: a strategic approach that enables OSFA…
Analytical approaches to quality assurance and quality control in rangeland monitoring data
USDA-ARS?s Scientific Manuscript database
Producing quality data to support land management decisions is the goal of every rangeland monitoring program. However, the results of quality assurance (QA) and quality control (QC) efforts to improve data quality are rarely reported. The purpose of QA and QC is to prevent and describe non-sampling...
NASA Technical Reports Server (NTRS)
Robinson, Michael; Steiner, Matthias; Wolff, David B.; Ferrier, Brad S.; Kessinger, Cathy; Einaudi, Franco (Technical Monitor)
2000-01-01
The primary function of the TRMM Ground Validation (GV) Program is to create GV rainfall products that provide basic validation of satellite-derived precipitation measurements for select primary sites. A fundamental and extremely important step in creating high-quality GV products is radar data quality control. Quality control (QC) processing of TRMM GV radar data is based on some automated procedures, but the current QC algorithm is not fully operational and requires significant human interaction to assure satisfactory results. Moreover, the TRMM GV QC algorithm, even with continuous manual tuning, still cannot completely remove all types of spurious echoes. In an attempt to improve the current operational radar data QC procedures of the TRMM GV effort, an intercomparison of several QC algorithms has been conducted. This presentation will demonstrate how various radar data QC algorithms affect accumulated radar rainfall products. In all, six different QC algorithms will be applied to two months of WSR-88D radar data from Melbourne, Florida. Daily, five-day, and monthly accumulated radar rainfall maps will be produced for each quality-controlled data set. The QC algorithms will be evaluated and compared based on their ability to remove spurious echoes without removing significant precipitation. Strengths and weaknesses of each algorithm will be assessed based on their ability to mitigate both erroneous additions and reductions in rainfall accumulation from spurious echo contamination and true precipitation removal, respectively. Contamination from individual spurious echo categories will be quantified to further diagnose the abilities of each radar QC algorithm. Finally, a cost-benefit analysis will be conducted to determine if a more automated QC algorithm is a viable alternative to the current, labor-intensive QC algorithm employed by TRMM GV.
Krishnan, S; Webb, S; Henderson, A R; Cheung, C M; Nazir, D J; Richardson, H
1999-03-01
The Laboratory Proficiency Testing Program (LPTP) assesses the analytical performance of all licensed laboratories in Ontario. The LPTP Enzymes, Cardiac Markers, and Lipids Committee conducted a "Patterns of Practice" survey to assess the in-house quality control (QC) practices of laboratories in Ontario using cholesterol as the QC paradigm. The survey was questionnaire-based, seeking information on statistical calculations, software rules, review process, data retention, and so on. Copies of the in-house cholesterol QC graphs were requested. A total of 120 of 210 laboratories were randomly chosen to receive the questionnaires during 1995 and 1996; 115 laboratories responded, although some did not answer all questions. The majority calculate means and standard deviations (SD) every month, using anywhere from 4 to >100 data points. 65% use a fixed mean and SD, while 17% use means calculated from the previous month. A few use a floating or cumulative mean. Some laboratories that do not use fixed means use a fixed SD. About 90% use some form of statistical quality control rules. The most common rules used to detect random error are 1-3s/R-4s, while 2-2s/4-1s/10-x are used for systematic errors. About 20% did not assay any QC at levels >5.5 mmol/L. Quality control data are reviewed daily (technologists) and weekly and monthly (supervisors/directors). Most laboratories retain their QC records for up to 3 years on paper and magnetic media. On some QC graphs the mean and SD, QC product lot number, or reference to action logs are not apparent. Quality control practices in Ontario are, therefore, disappointing. Improvement is required in the use of clinically appropriate concentrations of QC material and in documentation on QC graphs.
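A minimal sketch of how the control rules named in this survey can be evaluated against a fixed mean and SD; it implements the 1-3s and 2-2s checks and a simplified consecutive-measurement form of R-4s, while production QC software would add 4-1s, 10-x, and across-level logic.

```python
def westgard_flags(values, mean, sd):
    """Flag violations of a few common Westgard rules for one control level."""
    z = [(v - mean) / sd for v in values]
    flags = []
    for i, zi in enumerate(z):
        if abs(zi) > 3:                              # 1-3s: random error
            flags.append((i, "1-3s"))
        if i >= 1 and zi > 2 and z[i - 1] > 2:       # 2-2s: systematic error
            flags.append((i, "2-2s (high)"))
        if i >= 1 and zi < -2 and z[i - 1] < -2:
            flags.append((i, "2-2s (low)"))
        if i >= 1 and abs(zi - z[i - 1]) > 4:        # R-4s, simplified to
            flags.append((i, "R-4s"))                # consecutive measurements
    return flags

# Hypothetical daily cholesterol control results (mmol/L), target 5.2 +/- 0.1
results = [5.21, 5.18, 5.42, 5.45, 5.19, 5.55, 4.80]
print(westgard_flags(results, mean=5.2, sd=0.1))
```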
Folks, Russell D; Garcia, Ernest V; Taylor, Andrew T
2007-03-01
Quantitative nuclear renography has numerous potential sources of error. We previously reported the initial development of a computer software module for comprehensively addressing the issue of quality control (QC) in the analysis of radionuclide renal images. The objective of this study was to prospectively test the QC software. The QC software works in conjunction with standard quantitative renal image analysis using a renal quantification program. The software saves a text file that summarizes QC findings as possible errors in user-entered values, calculated values that may be unreliable because of the patient's clinical condition, and problems relating to acquisition or processing. To test the QC software, a technologist not involved in software development processed 83 consecutive nontransplant clinical studies. The QC findings of the software were then tabulated. QC events were defined as technical (study descriptors that were out of range or were entered and then changed, unusually sized or positioned regions of interest, or missing frames in the dynamic image set) or clinical (calculated functional values judged to be erroneous or unreliable). Technical QC events were identified in 36 (43%) of 83 studies. Clinical QC events were identified in 37 (45%) of 83 studies. Specific QC events included starting the camera after the bolus had reached the kidney, dose infiltration, oversubtraction of background activity, and missing frames in the dynamic image set. QC software has been developed to automatically verify user input, monitor calculation of renal functional parameters, summarize QC findings, and flag potentially unreliable values for the nuclear medicine physician. Incorporation of automated QC features into commercial or local renal software can reduce errors and improve technologist performance and should improve the efficiency and accuracy of image interpretation.
Khanna, Niharika; Shaya, Fadia T; Chirikov, Viktor V; Sharp, David; Steffen, Ben
2016-01-01
We present data on quality of care (QC) improvement in 35 of 45 National Quality Forum metrics reported annually by 52 primary care practices recognized as patient-centered medical homes (PCMHs) that participated in the Maryland Multi-Payor Program from 2011 to 2013. We assigned QC metrics to (1) chronic, (2) preventive, and (3) mental health care domains. The study used a panel data design with no control group. Using longitudinal fixed-effects regressions, we modeled QC and case mix severity in a PCMH. Overall, 35 of 45 quality metrics reported by 52 PCMHs demonstrated improvement over 3 years, and case mix severity did not affect the achievement of quality improvement. From 2011 to 2012, QC increased by 0.14 (P < .01) for chronic, 0.15 (P < .01) for preventive, and 0.34 (P < .01) for mental health care domains; from 2012 to 2013 these domains increased by 0.03 (P = .06), 0.04 (P = .05), and 0.07 (P = .12), respectively. In univariate analyses, lower National Commission on Quality Assurance PCMH level was associated with higher QC for the mental health care domain, whereas case mix severity did not correlate with QC. In multivariate analyses, higher QC correlated with larger practices, greater proportion of older patients, and readmission visits. Rural practices had higher proportions of Medicaid patients, lower QC, and higher QC improvement in interaction analyses with time. The gains in QC in the chronic disease domain, the preventive care domain, and, most significantly, the mental health care domain were observed over time regardless of patient case mix severity. QC improvement was generally not modified by practice characteristics, except for rurality. © Copyright 2016 by the American Board of Family Medicine.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lu, Z.
The involvement of medical physicists in diagnostic ultrasound imaging service is increasing due to QC and accreditation requirements. The goal of this ultrasound hands-on workshop is to demonstrate quality control (QC) testing in diagnostic ultrasound and to provide updates on ACR ultrasound accreditation requirements. The first half of this workshop will include two presentations reviewing diagnostic ultrasound QA/QC and ACR ultrasound accreditation requirements. The second half of the workshop will include live demonstrations of basic QC tests. An array of ultrasound testing phantoms and ultrasound scanners will be available for attendees to learn diagnostic ultrasound QC in a hands-on environment with live demonstrations and on-site instructors. The targeted attendees are medical physicists in diagnostic imaging. Learning Objectives: (1) Gain familiarity with common elements of a QA/QC program for diagnostic ultrasound imaging; (2) identify QC tools available for testing diagnostic ultrasound systems and learn how to use these tools; (3) learn ACR ultrasound accreditation requirements. Jennifer Walter is an employee of the American College of Radiology on Ultrasound Accreditation.
WE-AB-206-01: Diagnostic Ultrasound Imaging Quality Assurance
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zagzebski, J.
WE-AB-206-00: Diagnostic QA/QC Hands-On Workshop
DOE Office of Scientific and Technical Information (OSTI.GOV)
WE-AB-206-02: ACR Ultrasound Accreditation: Requirements and Pitfalls
DOE Office of Scientific and Technical Information (OSTI.GOV)
Walter, J.
The purpose of this SOP is to outline the process of field quality assurance and quality control checks. This procedure was followed to ensure consistent data retrieval during the Arizona NHEXAS project and the Border study. Keywords: custody; QA/QC; field checks.
The U.S.-Mex...
Quality Circles: An Innovative Program to Improve Military Hospitals
1982-08-01
quality control. However, Dr. Kaoru Ishikawa is credited with starting the first "Quality Control Circles" and registering them with the Japanese Union of...McGregor and Abraham Maslow into a unique style of management. In 1962 Dr. Ishikawa, a professor at Tokyo University, developed the QC concept based on...RECOMMENDATIONS Conclusions The QC concept has come a long way since Dr. Ishikawa gave it birth in 1962. It has left an enviable record of success along its
NASA Technical Reports Server (NTRS)
Brenton, James C.; Barbre, Robert E., Jr.; Decker, Ryan K.; Orcutt, John M.
2018-01-01
The National Aeronautics and Space Administration's (NASA) Marshall Space Flight Center (MSFC) Natural Environments Branch (EV44) has provided atmospheric databases and analysis in support of space vehicle design and day-of-launch operations for NASA and commercial launch vehicle programs launching from the NASA Kennedy Space Center (KSC), co-located on the United States Air Force's Eastern Range (ER) at the Cape Canaveral Air Force Station. The ER complex is one of the most heavily instrumented sites in the United States, with over 31 towers measuring various atmospheric parameters on a continuous basis. An inherent challenge with large data sets is ensuring erroneous data are removed from databases, and thus excluded from launch vehicle design analyses. EV44 has put forth great effort in developing quality control (QC) procedures for individual meteorological instruments; however, no standard QC procedures for all databases currently exist, resulting in QC databases that have inconsistencies in variables, methodologies, and periods of record. The goal of this activity is to use the previous efforts by EV44 to develop a standardized set of QC procedures from which to build meteorological databases from KSC and the ER, while maintaining open communication with end users from the launch community to develop ways to improve, adapt, and grow the QC database. Details of the QC procedures will be described. As the rate of launches increases with additional launch vehicle programs, it is becoming more important that weather databases are continually updated and checked for data quality before use in launch vehicle design and certification analyses.
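As an illustration of the kinds of automated checks a standardized procedure like this typically includes, the sketch below applies a gross range test and a simple spike (rate-of-change) test to a hypothetical tower wind-speed series; the limits are invented, not EV44's actual criteria.

```python
def qc_flags(series, lo, hi, max_step):
    """Return (index, reason) pairs for a gross range test and a spike test."""
    flags = []
    for i, v in enumerate(series):
        if not (lo <= v <= hi):
            flags.append((i, f"range: {v} outside [{lo}, {hi}]"))
        if i > 0 and abs(v - series[i - 1]) > max_step:
            flags.append((i, f"spike: step {abs(v - series[i - 1]):.1f} > {max_step}"))
    return flags

# Hypothetical 1-minute tower wind speeds (m/s); -999.0 is a bad sentinel value
wind = [4.2, 4.5, 4.4, 19.8, 4.6, -999.0, 4.7]
for idx, reason in qc_flags(wind, lo=0.0, hi=75.0, max_step=10.0):
    print(f"sample {idx}: {reason}")
```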
Quality control and assurance for validation of DOS/I measurements
NASA Astrophysics Data System (ADS)
Cerussi, Albert; Durkin, Amanda; Kwong, Richard; Quang, Timothy; Hill, Brian; Tromberg, Bruce J.; MacKinnon, Nick; Mantulin, William W.
2010-02-01
Ongoing multi-center clinical trials are crucial for Biophotonics to gain acceptance in medical imaging. In these trials, quality control (QC) and assurance (QA) are key to success and provide "data insurance". Quality control and assurance deal with standardization, validation, and compliance of procedures, materials and instrumentation. Specifically, QC/QA involves systematic assessment of testing materials, instrumentation performance, standard operating procedures, data logging, analysis, and reporting. QC and QA are important for FDA accreditation and acceptance by the clinical community. Our Biophotonics research in the Network for Translational Research in Optical Imaging (NTROI) program for breast cancer characterization focuses on QA/QC issues primarily related to the broadband Diffuse Optical Spectroscopy and Imaging (DOS/I) instrumentation, because this is an emerging technology with limited standardized QC/QA in place. In the multi-center trial environment, we implement QA/QC procedures: 1. Standardize and validate calibration standards and procedures. (DOS/I technology requires both frequency domain and spectral calibration procedures using tissue simulating phantoms and reflectance standards, respectively.) 2. Standardize and validate data acquisition, processing and visualization (optimize instrument software-EZDOS; centralize data processing) 3. Monitor, catalog and maintain instrument performance (document performance; modularize maintenance; integrate new technology) 4. Standardize and coordinate trial data entry (from individual sites) into centralized database 5. Monitor, audit and communicate all research procedures (database, teleconferences, training sessions) between participants ensuring "calibration". This manuscript describes our ongoing efforts, successes and challenges implementing these strategies.
NASA Technical Reports Server (NTRS)
Brenton, J. C.; Barbre, R. E.; Decker, R. K.; Orcutt, J. M.
2018-01-01
The National Aeronautics and Space Administration's (NASA) Marshall Space Flight Center (MSFC) Natural Environments Branch (EV44) provides atmospheric databases and analysis in support of space vehicle design and day-of-launch operations for NASA and commercial launch vehicle programs launching from the NASA Kennedy Space Center (KSC), co-located on the United States Air Force's Eastern Range (ER) at the Cape Canaveral Air Force Station. The ER complex is one of the most heavily instrumented sites in the United States, with over 31 towers measuring various atmospheric parameters on a continuous basis. An inherent challenge with large datasets is ensuring erroneous data are removed from databases, and thus excluded from launch vehicle design analyses. EV44 has put forth great effort in developing quality control (QC) procedures for individual meteorological instruments; however, no standard QC procedures for all databases currently exist, resulting in QC databases that have inconsistencies in variables, development methodologies, and periods of record. The goal of this activity is to use the previous efforts to develop a standardized set of QC procedures from which to build meteorological databases from KSC and the ER, while maintaining open communication with end users from the launch community to develop ways to improve, adapt, and grow the QC database. Details of the QC procedures will be described. As the rate of launches increases with additional launch vehicle programs, it is becoming more important that weather databases are continually updated and checked for data quality before use in launch vehicle design and certification analyses.
7 CFR 275.10 - Scope and purpose.
Code of Federal Regulations, 2010 CFR
2010-01-01
... to enhanced funding. (b) The objectives of quality control reviews are to provide: (1) A systematic... FOOD STAMP AND FOOD DISTRIBUTION PROGRAM PERFORMANCE REPORTING SYSTEM Quality Control (QC) Reviews... responsible for conducting quality control reviews. For food stamp quality control reviews, a sample of...
CHALLENGES IN SETTING UP QUALITY CONTROL IN DIAGNOSTIC RADIOLOGY FACILITIES IN NIGERIA.
Inyang, S O; Egbe, N O; Ekpo, E
2015-01-01
The Nigerian Nuclear Regulatory Authority (NNRA) was established to regulate and control the use of radioactive and radiation-emitting sources in Nigeria. Quality control (QC) of diagnostic radiology equipment forms part of the fundamental requirements for the authorization of diagnostic radiology facilities in the country. Some quality control tests (output, exposure linearity, and reproducibility) were measured on the x-ray machines in the facilities that took part in the study. A questionnaire was developed to evaluate the frequencies at which QC tests were conducted in the facilities and the challenges in setting up QC. Results show great variation in the values of the QC parameters measured. Inadequate cooperation by facility management, lack of QC equipment, and insufficient staff are the major challenges in setting up QC in the facilities under study. The responses on the frequencies at which QC tests should be conducted did not correspond to the recommended standards, indicating that personnel were not familiar with QC implementation and may require further training on QC.
Development of QC Procedures for Ocean Data Obtained by National Research Projects of Korea
NASA Astrophysics Data System (ADS)
Kim, S. D.; Park, H. M.
2017-12-01
To establish a data management system for ocean data obtained by national research projects of the Ministry of Oceans and Fisheries of Korea, KIOST conducted standardization and development of QC procedures. After reviewing and analyzing the existing international and domestic ocean-data standards and QC procedures, draft standards and QC procedures were prepared. The proposed standards and QC procedures were reviewed and revised several times by experts in the field of oceanography and by academic societies. A technical report was prepared covering standards for 25 data items and 12 QC procedures for physical, chemical, biological, and geological data. The QC procedure for temperature and salinity data was set up by referring to the manuals published by GTSPP, ARGO, and IOOS QARTOD. It consists of 16 QC tests applicable to vertical profile data and time series data obtained in real-time mode and delayed mode. Three regional range tests to inspect annual, seasonal, and monthly variations were included in the procedure. Three programs were developed to calculate and provide upper and lower limits of temperature and salinity at depths from 0 to 1550 m. TS data from the World Ocean Database, ARGO, GTSPP, and in-house data of KIOST were analyzed statistically to calculate regional limits for the Northwest Pacific area. Based on this statistical analysis, the programs calculate regional ranges using the mean and standard deviation on three grid systems (3°, 1°, and 0.5°) and provide recommendations. The QC procedures for the 12 data items were set up during the 1st phase of the national program for data management (2012-2015) and are being applied to national research projects in the 2nd phase (2016-2019). The QC procedures will be revised by reviewing the results of QC application when the 2nd phase of the data management program is completed.
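The regional range idea can be sketched as follows, assuming a gridded monthly climatology of means and standard deviations and a +/-3 SD acceptance band; the grid resolution, multiplier, and numbers are illustrative assumptions.

```python
def grid_cell(lat, lon, res_deg):
    """Map a position to a (lat, lon) grid-cell key at the given resolution."""
    return (round(lat // res_deg * res_deg, 2), round(lon // res_deg * res_deg, 2))

# Hypothetical climatology: (cell, month) -> (mean, sd) for temperature (degC)
climatology = {((35.0, 125.0), 8): (24.1, 1.2)}

def regional_range_test(lat, lon, month, value, res_deg=1.0, k=3.0):
    """Pass if the value lies within mean +/- k*sd for its grid cell and month."""
    stats = climatology.get((grid_cell(lat, lon, res_deg), month))
    if stats is None:
        return "no climatology"       # cannot judge; flag for manual review
    mean, sd = stats
    return "pass" if mean - k * sd <= value <= mean + k * sd else "fail"

print(regional_range_test(35.4, 125.7, 8, 24.8))   # pass
print(regional_range_test(35.4, 125.7, 8, 31.0))   # fail (> mean + 3*sd)
```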
RNA-SeQC: RNA-seq metrics for quality control and process optimization.
DeLuca, David S; Levin, Joshua Z; Sivachenko, Andrey; Fennell, Timothy; Nazaire, Marc-Danie; Williams, Chris; Reich, Michael; Winckler, Wendy; Getz, Gad
2012-06-01
RNA-seq, the application of next-generation sequencing to RNA, provides transcriptome-wide characterization of cellular activity. Assessment of sequencing performance and library quality is critical to the interpretation of RNA-seq data, yet few tools exist to address this issue. We introduce RNA-SeQC, a program which provides key measures of data quality. These metrics include yield, alignment and duplication rates; GC bias, rRNA content, regions of alignment (exon, intron and intragenic), continuity of coverage, 3'/5' bias and count of detectable transcripts, among others. The software provides multi-sample evaluation of library construction protocols, input materials and other experimental parameters. The modularity of the software enables pipeline integration and the routine monitoring of key measures of data quality such as the number of alignable reads, duplication rates and rRNA contamination. RNA-SeQC allows investigators to make informed decisions about sample inclusion in downstream analysis. In summary, RNA-SeQC provides quality control measures critical to experiment design, process optimization and downstream computational analysis. See www.genepattern.org to run online, or www.broadinstitute.org/rna-seqc/ for a command line tool.
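RNA-SeQC is a Java program, but one of the metrics it reports, the duplication rate, can be approximated in a few lines with the pysam library; the BAM path is a placeholder and the read-filtering choices are simplified assumptions.

```python
import pysam  # pip install pysam

def duplication_rate(bam_path):
    """Fraction of primary mapped reads flagged as duplicates (simplified)."""
    mapped = duplicates = 0
    with pysam.AlignmentFile(bam_path, "rb") as bam:
        for read in bam:
            if read.is_unmapped or read.is_secondary or read.is_supplementary:
                continue
            mapped += 1
            if read.is_duplicate:
                duplicates += 1
    return duplicates / mapped if mapped else float("nan")

# Placeholder path; requires a coordinate-sorted, duplicate-marked BAM file
print(f"duplication rate: {duplication_rate('sample.bam'):.3f}")
```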
Quality control for federal clean water act and safe drinking water act regulatory compliance.
Askew, Ed
2013-01-01
QC sample results are required in order to have confidence in the results from analytical tests. Some of the AOAC water methods include specific QC procedures, frequencies, and acceptance criteria. These are considered to be the minimum controls needed to perform the method successfully. Some regulatory programs, such as those in 40 CFR Part 136.7, require additional QC or have alternative acceptance limits. Essential QC measures include method calibration, reagent standardization, assessment of each analyst's capabilities, analysis of blind check samples, determination of the method's sensitivity (method detection level or quantification limit), and daily evaluation of bias, precision, and the presence of laboratory contamination or other analytical interference. The details of these procedures, their performance frequency, and expected ranges of results are set out in this manuscript. The specific regulatory requirements of 40 CFR Part 136.7 for the Clean Water Act, the laboratory certification requirements of 40 CFR Part 141 for the Safe Drinking Water Act, and the ISO 17025 accreditation requirements under The NELAC Institute are listed.
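One of the essential measures listed above, determining the method detection limit, is conventionally computed from replicate low-level spikes as MDL = t(n-1, 0.99) x s, following the procedure of 40 CFR Part 136 Appendix B; a sketch with hypothetical replicate results:

```python
import statistics
from scipy.stats import t  # pip install scipy

def mdl(replicates, confidence=0.99):
    """Method detection limit: one-sided Student's t times the replicate SD."""
    n = len(replicates)
    s = statistics.stdev(replicates)
    return t.ppf(confidence, df=n - 1) * s

# Hypothetical seven low-level spike results (ug/L)
spikes = [1.9, 2.1, 2.0, 1.8, 2.2, 2.0, 1.9]
print(f"MDL = {mdl(spikes):.2f} ug/L")   # t(6, 0.99) = 3.143
```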
7 CFR 275.10 - Scope and purpose.
Code of Federal Regulations, 2011 CFR
2011-01-01
... FOOD STAMP AND FOOD DISTRIBUTION PROGRAM PERFORMANCE REPORTING SYSTEM Quality Control (QC) Reviews... responsible for conducting quality control reviews. For food stamp quality control reviews, a sample of... terminated (called negative cases). Reviews shall be conducted on active cases to determine if households are...
LOVE CANAL MONITORING PROGRAM. GCA QA/QC (QUALITY ASSURANCE/QUALITY CONTROL) SUMMARY REPORT
One of the most important responsibilities of the Love Canal prime contractor was the institution and maintenance of a quality assurance program. An important objective of the quality assurance program was to alert the subcontractors to the importance of high quality work on thei...
Lean Six Sigma in Health Care: Improving Utilization and Reducing Waste.
Almorsy, Lamia; Khalifa, Mohamed
2016-01-01
Healthcare costs have been increasing worldwide, mainly due to overutilization of resources. The savings potentially achievable from systematic, comprehensive, and cooperative reduction in waste are far higher than those from more direct and blunter cuts in care and coverage. At King Faisal Specialist Hospital and Research Center, inappropriate use and overutilization of the glucose test strips used for whole-blood glucose determination with glucometers was observed. The hospital implemented a project to improve their utilization. Using the Six Sigma DMAIC approach (Define, Measure, Analyze, Improve, and Control), an efficient practice was put in place, including updating the related internal policies and procedures and properly implementing an effective user training and competency check-off program. This resulted in decreasing unnecessary Quality Control (QC) runs from 13% to 4%, decreasing failed QC runs from 14% to 7%, and lowering the QC-to-patient-testing ratio from 24/76 to 19/81.
CUSTOMER/SUPPLIER ACCOUNTABILITY AND PROGRAM IMPLEMENTATION
Quality assurance (QA) and quality control (QC) are the basic components of a QA program, which is a fundamental quality management tool. The quality of outputs and services strongly depends on the caliber of the communications between the "customer" and the "supplier." Clear under...
23 CFR 650.313 - Inspection procedures.
Code of Federal Regulations, 2010 CFR
2010-04-01
...) Quality control and quality assurance. Assure systematic quality control (QC) and quality assurance (QA... periodic field review of inspection teams, periodic bridge inspection refresher training for program managers and team leaders, and independent review of inspection reports and computations. (h) Follow-up on...
ChronQC: a quality control monitoring system for clinical next generation sequencing.
Tawari, Nilesh R; Seow, Justine Jia Wen; Perumal, Dharuman; Ow, Jack L; Ang, Shimin; Devasia, Arun George; Ng, Pauline C
2018-05-15
ChronQC is a quality control (QC) tracking system for clinical implementation of next-generation sequencing (NGS). ChronQC generates time series plots for various QC metrics to allow comparison of current runs to historical runs. ChronQC has multiple features for tracking QC data including Westgard rules for clinical validity, laboratory-defined thresholds and historical observations within a specified time period. Users can record their notes and corrective actions directly onto the plots for long-term recordkeeping. ChronQC facilitates regular monitoring of clinical NGS to enable adherence to high quality clinical standards. ChronQC is freely available on GitHub (https://github.com/nilesh-tawari/ChronQC), Docker (https://hub.docker.com/r/nileshtawari/chronqc/) and the Python Package Index. ChronQC is implemented in Python and runs on all common operating systems (Windows, Linux and Mac OS X). tawari.nilesh@gmail.com or pauline.c.ng@gmail.com. Supplementary data are available at Bioinformatics online.
Root, Patsy; Hunt, Margo; Fjeld, Karla; Kundrat, Laurie
2014-01-01
Quality assurance (QA) and quality control (QC) data are required in order to have confidence in the results from analytical tests and the equipment used to produce those results. Some AOAC water methods include specific QA/QC procedures, frequencies, and acceptance criteria, but these are considered to be the minimum controls needed to perform a microbiological method successfully. Some regulatory programs, such as those at Code of Federal Regulations (CFR), Title 40, Part 136.7 for chemistry methods, require additional QA/QC measures beyond those listed in the method, which can also apply to microbiological methods. Essential QA/QC measures include sterility checks, reagent specificity and sensitivity checks, assessment of each analyst's capabilities, analysis of blind check samples, and evaluation of the presence of laboratory contamination and instrument calibration and checks. The details of these procedures, their performance frequency, and expected results are set out in this report as they apply to microbiological methods. The specific regulatory requirements of CFR Title 40 Part 136.7 for the Clean Water Act, the laboratory certification requirements of CFR Title 40 Part 141 for the Safe Drinking Water Act, and the International Organization for Standardization 17025 accreditation requirements under The NELAC Institute are also discussed.
77 FR 3228 - Submission for OMB Review; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2012-01-23
..., Office of Management and Budget (OMB), [email protected] or fax (202) 395-5806 and to... it displays a currently valid OMB control number. Food and Nutrition Service Title: Quality Control... perform Quality Control (QC) review for the Supplemental Nutrition Assistance Program (SNAP). The FNS-380...
NASA Technical Reports Server (NTRS)
Brenton, James C.; Barbre, Robert E.; Orcutt, John M.; Decker, Ryan K.
2018-01-01
The National Aeronautics and Space Administration's (NASA) Marshall Space Flight Center (MSFC) Natural Environments Branch (EV44) has provided atmospheric databases and analysis in support of space vehicle design and day-of-launch operations for NASA and commercial launch vehicle programs launching from the NASA Kennedy Space Center (KSC), co-located on the United States Air Force's Eastern Range (ER) at the Cape Canaveral Air Force Station. The ER is one of the most heavily instrumented sites in the United States, measuring various atmospheric parameters on a continuous basis. An inherent challenge with the large databases that EV44 receives from the ER is ensuring erroneous data are removed from the databases, and thus excluded from launch vehicle design analyses. EV44 has put forth great effort in developing quality control (QC) procedures for individual meteorological instruments; however, no standard QC procedures for all databases currently exist, resulting in QC databases that have inconsistencies in variables, methodologies, and periods of record. The goal of this activity is to use the previous efforts by EV44 to develop a standardized set of QC procedures from which to build flags within the meteorological databases from KSC and the ER, while maintaining open communication with end users from the launch community to develop ways to improve, adapt, and grow the QC database. Details of the QC checks are described. The flagged data points will be plotted in a graphical user interface (GUI) as part of a manual confirmation that the flagged data do indeed need to be removed from the archive. As the rate of launches increases with additional launch vehicle programs, more emphasis is being placed on continually updating and checking weather databases for data quality before use in launch vehicle design and certification analyses.
References on EPA Quality Assurance Project Plans
Provides requirements for the conduct of quality management practices, including quality assurance (QA) and quality control (QC) activities, for all environmental data collection and environmental technology programs performed by or for this Agency.
Levey-Jennings Analysis Uncovers Unsuspected Causes of Immunohistochemistry Stain Variability.
Vani, Kodela; Sompuram, Seshi R; Naber, Stephen P; Goldsmith, Jeffrey D; Fulton, Regan; Bogen, Steven A
Almost all clinical laboratory tests use objective, quantitative measures of quality control (QC), incorporating Levey-Jennings analysis and Westgard rules. Clinical immunohistochemistry (IHC) testing, in contrast, relies on subjective, qualitative QC review. The consequences of using Levey-Jennings analysis for QC assessment in clinical IHC testing are not known. To investigate this question, we conducted a 1- to 2-month pilot test wherein the QC for either human epidermal growth factor receptor 2 (HER-2) or progesterone receptor (PR) in 3 clinical IHC laboratories was quantified and analyzed with Levey-Jennings graphs. Moreover, conventional tissue controls were supplemented with a new QC comprised of HER-2 or PR peptide antigens coupled onto 8 μm glass beads. At institution 1, this more stringent analysis identified a decrease in the HER-2 tissue control that had escaped notice by subjective evaluation. The decrement was due to heterogeneity in the tissue control itself. At institution 2, we identified a 1-day sudden drop in the PR tissue control, also undetected by subjective evaluation, due to counterstain variability. At institution 3, a QC shift was identified, but only with 1 of 2 controls mounted on each slide. The QC shift was due to use of the instrument's selective reagent drop zones dispense feature. None of these events affected patient diagnoses. These case examples illustrate that subjective QC evaluation of tissue controls can detect gross assay failure but not subtle changes. The fact that QC issues arose from each site, and in only a pilot study, suggests that immunohistochemical stain variability may be an underappreciated problem.
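A minimal sketch of the Levey-Jennings presentation referred to here: control values plotted in run order against an assumed target mean with +/-1, 2, and 3 SD limit lines; the data are invented.

```python
import matplotlib.pyplot as plt

# Hypothetical daily results for one quantified IHC control (arbitrary units)
values = [101, 99, 103, 97, 100, 104, 95, 98, 102, 96]
mean, sd = 100.0, 2.0  # assumed target mean and SD for this control lot

fig, ax = plt.subplots()
ax.plot(range(1, len(values) + 1), values, marker="o")
for k in (1, 2, 3):
    ax.axhline(mean + k * sd, linestyle="--", linewidth=0.8)
    ax.axhline(mean - k * sd, linestyle="--", linewidth=0.8)
ax.axhline(mean, linewidth=1.2)
ax.set_xlabel("Run number")
ax.set_ylabel("Control value")
ax.set_title("Levey-Jennings chart (illustrative data)")
plt.show()
```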
SU-E-T-103: Development and Implementation of Web Based Quality Control Software
DOE Office of Scientific and Technical Information (OSTI.GOV)
Studinski, R; Taylor, R; Angers, C
Purpose: Historically many radiation medicine programs have maintained their Quality Control (QC) test results in paper records or Microsoft Excel worksheets. Both these approaches represent significant logistical challenges, and are not predisposed to data review and approval. It has been our group's aim to develop and implement web-based software designed not just to record and store QC data in a centralized database, but to provide scheduling and data review tools to help manage a radiation therapy clinic's equipment quality control program. Methods: The software was written in the Python programming language using the Django web framework. In order to promote collaboration and validation from other centres the code was made open source and is freely available to the public via an online source code repository. The code was written to provide a common user interface for data entry, formalize the review and approval process, and offer automated data trending and process control analysis of test results. Results: As of February 2014, our installation of QATrack+ has 180 tests defined in its database and has collected ~22,000 test results, all of which have been reviewed and approved by a physicist via QATrack+'s review tools. These results include records for quality control of Elekta accelerators, CT simulators, our brachytherapy programme, TomoTherapy and Cyberknife units. Currently at least 5 other centres are known to be running QATrack+ clinically, forming the start of an international user community. Conclusion: QATrack+ has proven to be an effective tool for collecting radiation therapy QC data, allowing for rapid review and trending of data for a wide variety of treatment units. As free and open source software, all source code, documentation and a bug tracker are available to the public at https://bitbucket.org/tohccmedphys/qatrackplus/.
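To give a concrete sense of this approach, here is a stripped-down Django model in the spirit of such a system; the model and field names are hypothetical and do not reproduce QATrack+'s actual schema (the snippet belongs inside a Django app).

```python
from django.db import models

class QCTest(models.Model):
    """A scheduled quality-control test defined for a treatment unit."""
    unit = models.CharField(max_length=50)    # e.g. "linac1"
    name = models.CharField(max_length=100)   # e.g. "daily output"
    tolerance = models.FloatField()           # warn beyond this deviation (%)
    action = models.FloatField()              # fail beyond this deviation (%)

class QCResult(models.Model):
    """A single recorded result, reviewed and approved by a physicist."""
    test = models.ForeignKey(QCTest, on_delete=models.CASCADE)
    value = models.FloatField()               # deviation from baseline (%), assumed convention
    performed = models.DateTimeField(auto_now_add=True)
    reviewed = models.BooleanField(default=False)

    def status(self):
        """Classify the result against its test's tolerance and action levels."""
        dev = abs(self.value)
        if dev > self.test.action:
            return "action"
        return "tolerance" if dev > self.test.tolerance else "ok"
```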
40 CFR 98.44 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Monitoring and QA/QC requirements. 98.44 Section 98.44 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Electricity Generation § 98.44 Monitoring and QA/QC...
40 CFR 98.44 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 21 2011-07-01 2011-07-01 false Monitoring and QA/QC requirements. 98.44 Section 98.44 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Electricity Generation § 98.44 Monitoring and QA/QC...
40 CFR 98.44 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 40 Protection of Environment 22 2013-07-01 2013-07-01 false Monitoring and QA/QC requirements. 98.44 Section 98.44 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Electricity Generation § 98.44 Monitoring and QA/QC...
40 CFR 98.44 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 40 Protection of Environment 21 2014-07-01 2014-07-01 false Monitoring and QA/QC requirements. 98.44 Section 98.44 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Electricity Generation § 98.44 Monitoring and QA/QC...
40 CFR 98.44 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 40 Protection of Environment 22 2012-07-01 2012-07-01 false Monitoring and QA/QC requirements. 98.44 Section 98.44 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Electricity Generation § 98.44 Monitoring and QA/QC...
40 CFR 98.64 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Monitoring and QA/QC requirements. 98.64 Section 98.64 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Aluminum Production § 98.64 Monitoring and QA/QC requirements...
40 CFR 98.64 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 21 2011-07-01 2011-07-01 false Monitoring and QA/QC requirements. 98.64 Section 98.64 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Aluminum Production § 98.64 Monitoring and QA/QC requirements...
40 CFR 98.84 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 21 2011-07-01 2011-07-01 false Monitoring and QA/QC requirements. 98.84 Section 98.84 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Cement Production § 98.84 Monitoring and QA/QC requirements...
Quality Control for Scoring Tests Administered in Continuous Mode: An NCME Instructional Module
ERIC Educational Resources Information Center
Allalouf, Avi; Gutentag, Tony; Baumer, Michal
2017-01-01
Quality control (QC) in testing is paramount. QC procedures for tests can be divided into two types. The first type, one that has been well researched, is QC for tests administered to large population groups on few administration dates using a small set of test forms (e.g., large-scale assessment). The second type is QC for tests, usually…
Federal Register 2010, 2011, 2012, 2013, 2014
2012-02-27
... Quality Control process for the Supplemental Nutrition Assistance Program and the FNS-248 will be removed... other forms of information technology. Comments may be sent to: Francis B. Heil, Chief, Quality Control... directed to Francis B. Heil, (703) 305-2442. SUPPLEMENTARY INFORMATION: Title: Negative Quality Control...
Cian, Francesco; Villiers, Elisabeth; Archer, Joy; Pitorri, Francesca; Freeman, Kathleen
2014-06-01
Quality control (QC) validation is an essential tool in total quality management of a veterinary clinical pathology laboratory. Cost analysis can be a valuable technique to help identify an appropriate QC procedure for the laboratory, although this has never been reported in veterinary medicine. The aim of this study was to determine the applicability of the Six Sigma Quality Cost Worksheets in the evaluation of possible candidate QC rules identified by QC validation. Three months of internal QC records were analyzed. EZ Rules 3 software was used to evaluate candidate QC procedures, and the costs associated with the application of different QC rules were calculated using the Six Sigma Quality Cost Worksheets. The costs associated with the current and the candidate QC rules were compared, and the cost savings were calculated. There was a significant saving when the candidate 1-2.5s, n = 3 rule was applied instead of the currently utilized 1-2s, n = 3 rule. The savings were 75% per year (£8232.5) based on re-evaluating all of the patient samples in addition to the controls, and 72% per year (£822.4) based on re-analyzing only the control materials. The savings were also shown to vary with the number of samples analyzed and with the number of daily QC procedures performed. These calculations demonstrate the importance of selecting an appropriate QC procedure, and the usefulness of the Six Sigma Quality Cost Worksheets in determining the most cost-effective rule(s) when several candidate rules are identified by QC validation. © 2014 American Society for Veterinary Clinical Pathology and European Society for Veterinary Clinical Pathology.
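The cost gap between the 1-2s and 1-2.5s rules is driven largely by their false-rejection rates on in-control runs. A back-of-the-envelope sketch of the kind of calculation the worksheets formalize, under a normal-theory approximation and with invented run counts and repeat-testing costs:

    from math import erf, sqrt

    def norm_cdf(z):
        return 0.5 * (1 + erf(z / sqrt(2)))

    def false_reject_prob(limit_sd, n):
        """P(at least one of n in-control observations exceeds +/- limit_sd)."""
        p_single = 2 * (1 - norm_cdf(limit_sd))
        return 1 - (1 - p_single) ** n

    runs_per_year = 365        # hypothetical: one QC run per day
    cost_per_rejection = 30.0  # hypothetical repeat-testing cost (GBP)

    for limit in (2.0, 2.5):
        pfr = false_reject_prob(limit, n=3)
        print(f"1-{limit}s, n=3: Pfr={pfr:.3f}, "
              f"annual cost ~ {pfr * runs_per_year * cost_per_rejection:.0f} GBP")

Widening the limit from 2s to 2.5s cuts the false-rejection probability per run from roughly 0.13 to 0.04, which is the mechanism behind the reported savings.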
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
The goal of this ultrasound hands-on workshop is to demonstrate advancements in high intensity focused ultrasound (HIFU) and to demonstrate quality control (QC) testing in diagnostic ultrasound. HIFU is a therapeutic modality that uses ultrasound waves as carriers of energy. HIFU is used to focus a beam of ultrasound energy into a small volume at specific target locations within the body. The focused beam causes localized high temperatures and produces well-defined regions of necrosis. This completely non-invasive technology has great potential for tumor ablation and targeted drug delivery. At the workshop, attendees will see configurations, applications, and hands-on demonstrations with on-site instructors at separate stations. The involvement of medical physicists in diagnostic ultrasound imaging service is increasing due to QC and accreditation requirements. At the workshop, an array of ultrasound testing phantoms and ultrasound scanners will be provided for attendees to learn diagnostic ultrasound QC in a hands-on environment with live demonstrations of the techniques. Target audience: Medical physicists and other medical professionals in diagnostic imaging and radiation oncology with an interest in high-intensity focused ultrasound and in diagnostic ultrasound QC. Learning Objectives: Learn ultrasound physics and safety for HIFU applications through live demonstrations. Get an overview of the state of the art in HIFU technologies and equipment. Gain familiarity with common elements of a quality control program for diagnostic ultrasound imaging. Identify QC tools available for testing diagnostic ultrasound systems and learn how to use these tools. Supporting vendors for the HIFU and diagnostic ultrasound QC hands-on workshop: Philips Healthcare; Alpinion Medical Systems; Verasonics, Inc; Zonare Medical Systems, Inc; Computerized Imaging Reference Systems (CIRS), Inc.; GAMMEX, Inc.; Cablon Medical BV. Steffen Sammet: NIH/NCI grant 5R25CA132822, NIH/NINDS grant 5R25NS080949 and Philips Healthcare research grant C32.
Quality Assurance and Quality Control Practices for Rehabilitation of Sewer and Water Mains
As part of the US Environmental Protection Agency (EPA)’s Aging Water Infrastructure Research Program, several areas of research are being pursued, including a review of quality assurance and quality control (QA/QC) practices and acceptance testing during the installation of reha...
7 CFR 283.2 - Scope and applicability.
Code of Federal Regulations, 2010 CFR
2010-01-01
... agencies of Food and Nutrition Service quality control (QC) claims for Fiscal Year ("FY") 1986 and... Agriculture Regulations of the Department of Agriculture (Continued) FOOD AND NUTRITION SERVICE, DEPARTMENT OF AGRICULTURE FOOD STAMP AND FOOD DISTRIBUTION PROGRAM APPEALS OF QUALITY CONTROL ("QC") CLAIMS General § 283.2...
The importance of quality control in validating concentrations ...
A national-scale survey of 247 contaminants of emerging concern (CECs), including organic and inorganic chemical compounds, and microbial contaminants, was conducted in source and treated drinking water samples from 25 treatment plants across the United States. Multiple methods were used to determine these CECs, including six analytical methods to measure 174 pharmaceuticals, personal care products, and pesticides. A three-component quality assurance/quality control (QA/QC) program was designed for the subset of 174 CECs which allowed us to assess and compare performances of the methods used. The three components included: 1) a common field QA/QC protocol and sample design, 2) individual investigator-developed method-specific QA/QC protocols, and 3) a suite of 46 method comparison analytes that were determined in two or more analytical methods. Overall method performance for the 174 organic chemical CECs was assessed by comparing spiked recoveries in reagent, source, and treated water over a two-year period. In addition to the 247 CECs reported in the larger drinking water study, another 48 pharmaceutical compounds measured did not consistently meet predetermined quality standards. Methodologies that did not seem suitable for these analytes are overviewed. The need to exclude analytes based on method performance demonstrates the importance of additional QA/QC protocols. This paper compares the method performance of six analytical methods used to measure 174 emer
Parvin, C A
1993-03-01
The error detection characteristics of quality-control (QC) rules that use control observations within a single analytical run are investigated. Unlike the evaluation of QC rules that span multiple analytical runs, most of the fundamental results regarding the performance of QC rules applied within a single analytical run can be obtained from statistical theory, without the need for simulation studies. The case of two control observations per run is investigated for ease of graphical display, but the conclusions can be extended to more than two control observations per run. Results are summarized in a graphical format that offers many interesting insights into the relations among the various QC rules. The graphs provide heuristic support to the theoretical conclusions that no QC rule is best under all error conditions, but the multirule that combines the mean rule and a within-run standard deviation rule offers an attractive compromise.
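As the abstract notes, the detection probability of a within-run rule follows directly from normal theory. A sketch under that standard model for a mean rule applied to n control observations; the specific rule (mean of n at a 3 SD-of-the-mean limit) and the 2 SD shift are illustrative choices, not the paper's exact cases:

    from math import erf, sqrt

    def norm_cdf(z):
        return 0.5 * (1 + erf(z / sqrt(2)))

    def p_detect_mean_rule(shift_sd, n, limit_z):
        """P(|mean of n obs| exceeds its limit) given a systematic shift.

        The run mean is normal with mean `shift_sd` (in SD units) and
        SD 1/sqrt(n), so the rejection limit rescales by sqrt(n).
        """
        z = shift_sd * sqrt(n)
        return 1 - (norm_cdf(limit_z - z) - norm_cdf(-limit_z - z))

    # Two control observations, mean rule at 3 SD of the mean,
    # systematic error of 2 SD of the method
    print(round(p_detect_mean_rule(2.0, n=2, limit_z=3.0), 3))  # ~0.432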
Purba, Fredrick Dermawan; Hunfeld, Joke A M; Iskandarsyah, Aulia; Fitriana, Titi Sahidah; Sadarjoen, Sawitri S; Passchier, Jan; Busschbach, Jan J V
2017-05-01
In valuing health states using generic questionnaires such as the EQ-5D, there are often unexamined issues with the quality of the data collection. The aims were to describe the problems encountered during valuation and to evaluate a quality control report and subsequent retraining of interviewers in improving this valuation. Data from the first 266 respondents in an EQ-5D-5L valuation study were used. Interviewers were trained and answered questions regarding problems during these initial interviews. Thematic analysis was used, and individual feedback was provided. After completion of 98 interviews, a first quantitative quality control (QC) report was generated, followed by a 1-day retraining program. Subsequently, individual feedback was also given on the basis of follow-up QCs. The Wilcoxon signed-rank test was used to assess improvements based on 7 indicators of quality, as identified in the first QC and the QC conducted after a further 168 interviews. Interviewers encountered problems in recruiting respondents. Solutions provided were: optimization of the time of interview, the use of broader networks, and the use of different scripts to explain the project's goals to respondents. For problems in the interviewing process, solutions applied were: developing the technical and personal skills of the interviewers and stimulating the respondents' thought processes. There were also technical problems related to hardware, software, and internet connections. There was an improvement in all 7 indicators of quality after the second QC. Training before and during a study, and individual feedback on the basis of a quantitative QC, can increase the validity of values obtained from generic questionnaires.
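The before/after comparison across paired quality indicators can be reproduced with a standard Wilcoxon signed-rank test (here via SciPy); the indicator scores below are invented for illustration:

    from scipy.stats import wilcoxon

    # Hypothetical scores on 7 quality indicators, first QC vs second QC
    first_qc  = [0.62, 0.55, 0.71, 0.48, 0.66, 0.59, 0.52]
    second_qc = [0.78, 0.64, 0.80, 0.61, 0.75, 0.70, 0.66]

    stat, p = wilcoxon(first_qc, second_qc)
    print(f"W={stat}, p={p:.3f}")  # small p suggests improvement across indicators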
Selecting Statistical Procedures for Quality Control Planning Based on Risk Management.
Yago, Martín; Alcover, Silvia
2016-07-01
According to the traditional approach to statistical QC planning, the performance of QC procedures is assessed in terms of their probability of rejecting an analytical run that contains critical-size errors (PEDC). Recently, the maximum expected increase in the number of unacceptable patient results reported during the presence of an undetected out-of-control error condition, Max E(NUF), has been proposed as an alternative QC performance measure because it relates more closely to the recent introduction of risk management concepts for QC planning in the clinical laboratory. We used a statistical model to investigate the relationship between PEDC and Max E(NUF) for simple QC procedures widely used in clinical laboratories, and to construct charts relating Max E(NUF) to the capability of the analytical process that allow for QC planning based on the risk of harm to a patient due to the report of erroneous results. A QC procedure shows nearly the same Max E(NUF) value when used for controlling analytical processes with the same capability, and there is a close relationship between PEDC and Max E(NUF) for simple QC procedures; therefore, the value of PEDC can be estimated from the value of Max E(NUF) and vice versa. QC procedures selected for their high PEDC value are also characterized by a low value of Max E(NUF). The PEDC value can be used to estimate the probability of patient harm, allowing for the selection of appropriate QC procedures in QC planning based on risk management. © 2016 American Association for Clinical Chemistry.
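The link between process capability and PEDC can be sketched numerically. This uses the conventional definitions (sigma metric = (TEa − |bias|)/SD; critical systematic error ΔSEcrit = sigma − 1.65 for a 5% maximum defect rate) and a simple 1-ks rule; these conventions and the example values are assumptions, not the authors' exact charts:

    from math import erf, sqrt

    def norm_cdf(z):
        return 0.5 * (1 + erf(z / sqrt(2)))

    def p_edc_single_rule(sigma, limit_z, n=1):
        """P(rejecting a run that contains the critical systematic error)
        for a 1-ks rule with n control observations."""
        delta_se_crit = sigma - 1.65  # critical shift, in SD units
        p_single = 1 - (norm_cdf(limit_z - delta_se_crit)
                        - norm_cdf(-limit_z - delta_se_crit))
        return 1 - (1 - p_single) ** n

    for sigma in (4.0, 5.0, 6.0):
        print(sigma, round(p_edc_single_rule(sigma, limit_z=3.0, n=2), 3))

As expected, the same rule detects the critical error far more reliably on a 6-sigma process than on a 4-sigma one, which is why capability drives rule selection.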
Announcement—guidance document for acquiring reliable data in ecological restoration projects
Stapanian, Martin A.; Rodriguez, Karen; Lewis, Timothy E.; Blume, Louis; Palmer, Craig J.; Walters, Lynn; Schofield, Judith; Amos, Molly M.; Bucher, Adam
2016-01-01
The Laurentian Great Lakes are undergoing intensive ecological restoration in Canada and the United States. In the United States, an interagency committee was formed to facilitate implementation of quality practices for federally funded restoration projects in the Great Lakes basin. The Committee's responsibilities include developing a guidance document that will provide a common approach to the application of quality assurance and quality control (QA/QC) practices for restoration projects. The document will serve as a “how-to” guide for ensuring data quality during each aspect of ecological restoration projects. In addition, the document will provide suggestions on linking QA/QC data with the routine project data and hints on creating detailed supporting documentation. Finally, the document will advocate integrating all components of the project, including QA/QC applications, into an overarching decision-support framework. The guidance document is expected to be released by the U.S. EPA Great Lakes National Program Office in 2017.
QA/QC in the laboratory. Session F
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hood, F.C.
1992-05-01
Quality assurance and quality control (QA/QC) of analytical chemistry laboratory activities are essential to the validity and usefulness of the resulting data. However, in themselves, conventional QA/QC measures will not always ensure that fraudulent data are not generated. Conventional QA/QC measures are based on the assumption that work will be done in good faith; to guard against fraudulent practices, QA/QC measures must be tailored to specific analysis protocols in anticipation of intentional misapplication of those protocols. Applying specific QA/QC measures to guard against fraudulent practices places an increased administrative burden on the analytical process; accordingly, in keeping with a graded QA philosophy, data quality objectives must be used to identify specific points of concern for special control, to minimize the administrative impact.
Gantner, Pierre; Mélard, Adeline; Damond, Florence; Delaugerre, Constance; Dina, Julia; Gueudin, Marie; Maillard, Anne; Sauné, Karine; Rodallec, Audrey; Tuaillon, Edouard; Plantier, Jean-Christophe; Rouzioux, Christine; Avettand-Fenoel, Véronique
2017-11-01
Viral reservoirs represent an important barrier to HIV cure. Accurate markers of HIV reservoirs are needed to develop multicenter studies. The aim of this multicenter quality control (QC) was to evaluate the inter-laboratory reproducibility of total HIV-1-DNA quantification. Ten laboratories of the ANRS-AC11 working group participated by quantifying HIV-DNA with a real-time qPCR assay (Biocentric) in four samples (QCMD). Good reproducibility was found between laboratories (standard deviation ≤ 0.2 log10 copies/10^6 PBMC) for the three positive QC that were correctly classified by each laboratory (QC1
Li, Junming; He, Zhiyao; Yu, Shui; Li, Shuangzhi; Ma, Qing; Yu, Yiyi; Zhang, Jialin; Li, Rui; Zheng, Yu; He, Gu; Song, Xiangrong
2012-10-01
In this study, quercetin (QC), which has cancer chemoprevention effects and anticancer potential, was loaded into polymeric micelles of methoxy poly(ethylene glycol)-cholesterol conjugate (mPEG-Chol) in order to increase its water solubility. mPEG-Chol, with a low critical micelle concentration (CMC) value (4.0 × 10⁻⁷ M to 13 × 10⁻⁷ M), was first synthesized via two steps of chemical modification of cholesterol by esterification, and QC was then incorporated into mPEG-Chol micelles by a self-assembly method. After the process parameters were optimized, QC-loaded micelles had high drug loading (3.66%), high entrapment efficiency (93.51%), and a nano-sized diameter (116 nm). DSC analysis demonstrated that QC had been incorporated non-covalently into the micelles and existed in an amorphous state or as a solid solution in the polymeric matrix. A freeze-dried formulation with the addition of 1% (w/v) mannitol as cryoprotectant was successfully developed for the long-term storage of QC-loaded micelles. Compared to free QC, QC-loaded micelles released QC more slowly. Moreover, the release of QC from micelles was slightly faster in PBS at pH 5 than in PBS at pH 7.4, which implied that QC-loaded micelles might be pH-sensitive and could thereby selectively deliver QC to tumor tissue with reduced side effects. Therefore, mPEG-Chol is a promising micellar vector for the controlled and targeted delivery of QC to tumors, and QC-loaded micelles are worth further investigation as a potential formulation for cancer chemoprevention and treatment.
Quality assurance and quality control of geochemical data—A primer for the research scientist
Geboy, Nicholas J.; Engle, Mark A.
2011-01-01
Geochemistry is a constantly expanding science. More and more, scientists are employing geochemical tools to help answer questions about the Earth and earth system processes. Scientists may assume that the responsibility of examining and assessing the quality of the geochemical data they generate is not theirs but rather that of the analytical laboratories to which their samples have been submitted. This assumption may be partially based on knowledge about internal and external quality assurance and quality control (QA/QC) programs in which analytical laboratories typically participate. Or there may be a perceived lack of time or resources to adequately examine data quality. Regardless of the reason, the lack of QA/QC protocols can lead to the generation and publication of erroneous data. Because the interpretations drawn from the data are primary products to U.S. Geological Survey (USGS) stakeholders, the consequences of publishing erroneous results can be significant. The principal investigator of a scientific study ultimately is responsible for the quality and interpretation of the project's findings, and thus must also play a role in the understanding, implementation, and presentation of QA/QC information about the data. Although occasionally ignored, QA/QC protocols apply not only to procedures in the laboratory but also in the initial planning of a research study and throughout the life of the project. Many of the tenets of developing a sound QA/QC program or protocols also parallel the core concepts of developing a good study: What is the main objective of the study? Will the methods selected provide data of enough resolution to answer the hypothesis? How should samples be collected? Are there known or unknown artifacts or contamination sources in the sampling and analysis methods? Assessing data quality requires communication between the scientists responsible for designing the study and those collecting samples, analyzing samples, treating data, and interpreting results. This primer has been developed to provide basic information and guidance about developing QA/QC protocols for geochemical studies. It is not intended to be a comprehensive guide but rather an introduction to key concepts tied to a list of relevant references for further reading. The guidelines are presented in stepwise order beginning with presampling considerations and continuing through final data interpretation. The goal of this primer is to outline basic QA/QC practices that scientists can use before, during, and after chemical analysis to ensure the validity of the data they collect with the goal of providing defendable results and conclusions.
A comprehensive quality control workflow for paired tumor-normal NGS experiments.
Schroeder, Christopher M; Hilke, Franz J; Löffler, Markus W; Bitzer, Michael; Lenz, Florian; Sturm, Marc
2017-06-01
Quality control (QC) is an important part of all NGS data analysis stages. Many available tools calculate QC metrics from different analysis steps of single-sample experiments (raw reads, mapped reads and variant lists). Multi-sample experiments, such as the sequencing of tumor-normal pairs, require additional QC metrics to ensure the validity of results. These multi-sample QC metrics still lack standardization. We therefore suggest a new workflow for QC of DNA sequencing of tumor-normal pairs. With this workflow, well-known single-sample QC metrics and additional metrics specific to tumor-normal pairs can be calculated. The segmentation into different tools offers high flexibility and allows reuse for other purposes. All tools produce qcML, a generic XML format for QC of -omics experiments. qcML uses quality metrics defined in an ontology, which was adapted for NGS. All QC tools are implemented in C++ and run under both Linux and Windows. Plotting requires Python 2.7 and matplotlib. The software is available under the 'GNU General Public License version 2' as part of the ngs-bits project: https://github.com/imgag/ngs-bits. christopher.schroeder@med.uni-tuebingen.de. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
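One widely used pair-level QC metric in such workflows is genotype concordance at common germline SNPs, which guards against sample swaps between the tumor and normal of a pair. The sketch below is a toy illustration of that idea, not the ngs-bits implementation; the genotype encoding and threshold are assumptions:

    def genotype_concordance(tumor_gt, normal_gt):
        """Fraction of shared SNP positions with identical genotype calls.

        tumor_gt / normal_gt: dict mapping (chrom, pos) -> genotype string.
        A low value for a putative tumor-normal pair suggests a sample swap.
        """
        shared = set(tumor_gt) & set(normal_gt)
        if not shared:
            return float("nan")
        matches = sum(tumor_gt[k] == normal_gt[k] for k in shared)
        return matches / len(shared)

    tumor  = {("1", 1000): "0/1", ("1", 2000): "1/1", ("2", 500): "0/0"}
    normal = {("1", 1000): "0/1", ("1", 2000): "0/1", ("2", 500): "0/0"}
    print(genotype_concordance(tumor, normal))  # ~0.67; investigate if unexpectedly low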
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yung, J; Stefan, W; Reeve, D
2015-06-15
Purpose: Phantom measurements allow the performance of magnetic resonance (MR) systems to be evaluated. The American Association of Physicists in Medicine (AAPM) Report No. 100, Acceptance Testing and Quality Assurance Procedures for MR Imaging Facilities, the American College of Radiology (ACR) MR Accreditation Program MR phantom testing, and the ACR MRI quality control (QC) program documents help to outline specific tests for establishing system performance baselines as well as system stability over time. Analyzing and processing tests from multiple systems can be time-consuming for medical physicists. Besides determining whether tests are within predetermined limits or criteria, monitoring longitudinal trends can also help prevent costly downtime of systems during clinical operation. In this work, a semi-automated QC program was developed to analyze and record measurements in a database that allows easy access to historical data. Methods: Image analysis was performed on 27 different MR systems of 1.5T and 3.0T field strengths from GE and Siemens. Recommended measurements involved the ACR MRI Accreditation Phantom, spherical homogeneous phantoms, and a phantom with a uniform hole pattern. Measurements assessed geometric accuracy and linearity, position accuracy, image uniformity, signal, noise, ghosting, transmit gain, center frequency, and magnetic field drift. The program was designed with open source tools, employing Linux, Apache, a MySQL database, and the Python programming language for the front and back ends. Results: Processing time for each image is <2 seconds. Figures are produced to show the regions of interest (ROIs) used for analysis. Historical data can be reviewed to compare previous years' data and to inspect for trends. Conclusion: An MRI quality assurance and QC program is necessary for maintaining high-quality, ACR-accredited MR programs. A reviewable database of phantom measurements assists medical physicists with processing and monitoring large datasets. Longitudinal data can reveal trends that, although within passing criteria, indicate underlying system issues.
TU-A-18C-01: ACR Accreditation Updates in CT, Ultrasound, Mammography and MRI
DOE Office of Scientific and Technical Information (OSTI.GOV)
Price, R; Berns, E; Hangiandreou, N
2014-06-15
A goal of an imaging accreditation program is to ensure adequate image quality, verify appropriate staff qualifications, and assure patient and personnel safety. Currently, more than 35,000 facilities in 10 modalities have been accredited by the American College of Radiology (ACR), making the ACR program one of the most prolific accreditation options in the U.S. In addition, the ACR is one of the accepted accreditations required by some state laws, CMS/MIPPA insurance, and others. Familiarity with the ACR accreditation process is therefore essential to clinical diagnostic medical physicists. Maintaining sufficient knowledge of the ACR program must include keeping up-to-date as the various modality requirements are refined to better serve the goals of the program and to accommodate newer technologies and practices. This session consists of presentations from authorities in four ACR accreditation modality programs: magnetic resonance imaging, mammography, ultrasound, and computed tomography. Each speaker will discuss the general components of the modality program and address any recent changes to the requirements. Learning Objectives: To understand the requirements of the ACR MR accreditation program. The discussion will include accreditation of whole-body general purpose magnets and dedicated extremity systems, as well as breast MRI accreditation. Anticipated updates to the ACR MRI Quality Control Manual will also be reviewed. To understand the current ACR MAP accreditation requirements and present the concepts and structure of the forthcoming ACR Digital Mammography QC Manual and Program. To understand the new requirements of the ACR ultrasound accreditation program, and the roles the physicist can play in annual equipment surveys and in setting up and supervising the routine QC program. To understand the requirements of the ACR CT accreditation program, including updates to the QC manual as well as updates through the FAQ process.
MO-AB-207-02: ACR Update in MR
DOE Office of Scientific and Technical Information (OSTI.GOV)
Price, R.
2015-06-15
A goal of an imaging accreditation program is to ensure adequate image quality, verify appropriate staff qualifications, and assure patient and personnel safety. Currently, more than 35,000 facilities in 10 modalities have been accredited by the American College of Radiology (ACR), making the ACR program one of the most prolific accreditation options in the U.S. In addition, ACR is one of the accepted accreditations required by some state laws, CMS/MIPPA insurance, and others. Familiarity with the ACR accreditation process is therefore essential to clinical diagnostic medical physicists. Maintaining sufficient knowledge of the ACR program must include keeping up-to-date as the various modality requirements are refined to better serve the goals of the program and to accommodate newer technologies and practices. This session consists of presentations from authorities in four ACR accreditation modality programs: magnetic resonance imaging, computed tomography, nuclear medicine, and mammography. Each speaker will discuss the general components of the modality program and address any recent changes to the requirements. Learning Objectives: To understand the requirements of the ACR MR accreditation program. The discussion will include accreditation of whole-body general purpose magnets and dedicated extremity systems, as well as breast MRI accreditation. Anticipated updates to the ACR MRI Quality Control Manual will also be reviewed. To understand the requirements of the ACR CT accreditation program, including updates to the QC manual as well as updates through the FAQ process. To understand the requirements of the ACR nuclear medicine accreditation program, and the role of the physicist in annual equipment surveys and the setup and supervision of the routine QC program. To understand the current ACR MAP accreditation requirements and present the concepts and structure of the forthcoming ACR Digital Mammography QC Manual and Program.
MO-AB-207-04: ACR Update in Mammography
DOE Office of Scientific and Technical Information (OSTI.GOV)
Berns, E.
2015-06-15
A goal of an imaging accreditation program is to ensure adequate image quality, verify appropriate staff qualifications, and assure patient and personnel safety. Currently, more than 35,000 facilities in 10 modalities have been accredited by the American College of Radiology (ACR), making the ACR program one of the most prolific accreditation options in the U.S. In addition, ACR is one of the accepted accreditations required by some state laws, CMS/MIPPA insurance, and others. Familiarity with the ACR accreditation process is therefore essential to clinical diagnostic medical physicists. Maintaining sufficient knowledge of the ACR program must include keeping up-to-date as the various modality requirements are refined to better serve the goals of the program and to accommodate newer technologies and practices. This session consists of presentations from authorities in four ACR accreditation modality programs: magnetic resonance imaging, computed tomography, nuclear medicine, and mammography. Each speaker will discuss the general components of the modality program and address any recent changes to the requirements. Learning Objectives: To understand the requirements of the ACR MR accreditation program. The discussion will include accreditation of whole-body general purpose magnets and dedicated extremity systems, as well as breast MRI accreditation. Anticipated updates to the ACR MRI Quality Control Manual will also be reviewed. To understand the requirements of the ACR CT accreditation program, including updates to the QC manual as well as updates through the FAQ process. To understand the requirements of the ACR nuclear medicine accreditation program, and the role of the physicist in annual equipment surveys and the setup and supervision of the routine QC program. To understand the current ACR MAP accreditation requirements and present the concepts and structure of the forthcoming ACR Digital Mammography QC Manual and Program.
MO-AB-207-01: ACR Update in CT
DOE Office of Scientific and Technical Information (OSTI.GOV)
McNitt-Gray, M.
2015-06-15
A goal of an imaging accreditation program is to ensure adequate image quality, verify appropriate staff qualifications, and assure patient and personnel safety. Currently, more than 35,000 facilities in 10 modalities have been accredited by the American College of Radiology (ACR), making the ACR program one of the most prolific accreditation options in the U.S. In addition, ACR is one of the accepted accreditations required by some state laws, CMS/MIPPA insurance, and others. Familiarity with the ACR accreditation process is therefore essential to clinical diagnostic medical physicists. Maintaining sufficient knowledge of the ACR program must include keeping up-to-date as the various modality requirements are refined to better serve the goals of the program and to accommodate newer technologies and practices. This session consists of presentations from authorities in four ACR accreditation modality programs: magnetic resonance imaging, computed tomography, nuclear medicine, and mammography. Each speaker will discuss the general components of the modality program and address any recent changes to the requirements. Learning Objectives: To understand the requirements of the ACR MR accreditation program. The discussion will include accreditation of whole-body general purpose magnets and dedicated extremity systems, as well as breast MRI accreditation. Anticipated updates to the ACR MRI Quality Control Manual will also be reviewed. To understand the requirements of the ACR CT accreditation program, including updates to the QC manual as well as updates through the FAQ process. To understand the requirements of the ACR nuclear medicine accreditation program, and the role of the physicist in annual equipment surveys and the setup and supervision of the routine QC program. To understand the current ACR MAP accreditation requirements and present the concepts and structure of the forthcoming ACR Digital Mammography QC Manual and Program.
MO-AB-207-00: ACR Update in MR, CT, Nuclear Medicine, and Mammography
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
A goal of an imaging accreditation program is to ensure adequate image quality, verify appropriate staff qualifications, and assure patient and personnel safety. Currently, more than 35,000 facilities in 10 modalities have been accredited by the American College of Radiology (ACR), making the ACR program one of the most prolific accreditation options in the U.S. In addition, ACR is one of the accepted accreditations required by some state laws, CMS/MIPPA insurance, and others. Familiarity with the ACR accreditation process is therefore essential to clinical diagnostic medical physicists. Maintaining sufficient knowledge of the ACR program must include keeping up-to-date as the various modality requirements are refined to better serve the goals of the program and to accommodate newer technologies and practices. This session consists of presentations from authorities in four ACR accreditation modality programs: magnetic resonance imaging, computed tomography, nuclear medicine, and mammography. Each speaker will discuss the general components of the modality program and address any recent changes to the requirements. Learning Objectives: To understand the requirements of the ACR MR accreditation program. The discussion will include accreditation of whole-body general purpose magnets and dedicated extremity systems, as well as breast MRI accreditation. Anticipated updates to the ACR MRI Quality Control Manual will also be reviewed. To understand the requirements of the ACR CT accreditation program, including updates to the QC manual as well as updates through the FAQ process. To understand the requirements of the ACR nuclear medicine accreditation program, and the role of the physicist in annual equipment surveys and the setup and supervision of the routine QC program. To understand the current ACR MAP accreditation requirements and present the concepts and structure of the forthcoming ACR Digital Mammography QC Manual and Program.
MO-AB-207-03: ACR Update in Nuclear Medicine
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harkness, B.
A goal of an imaging accreditation program is to ensure adequate image quality, verify appropriate staff qualifications, and assure patient and personnel safety. Currently, more than 35,000 facilities in 10 modalities have been accredited by the American College of Radiology (ACR), making the ACR program one of the most prolific accreditation options in the U.S. In addition, ACR is one of the accepted accreditations required by some state laws, CMS/MIPPA insurance, and others. Familiarity with the ACR accreditation process is therefore essential to clinical diagnostic medical physicists. Maintaining sufficient knowledge of the ACR program must include keeping up-to-date as the various modality requirements are refined to better serve the goals of the program and to accommodate newer technologies and practices. This session consists of presentations from authorities in four ACR accreditation modality programs: magnetic resonance imaging, computed tomography, nuclear medicine, and mammography. Each speaker will discuss the general components of the modality program and address any recent changes to the requirements. Learning Objectives: To understand the requirements of the ACR MR accreditation program. The discussion will include accreditation of whole-body general purpose magnets and dedicated extremity systems, as well as breast MRI accreditation. Anticipated updates to the ACR MRI Quality Control Manual will also be reviewed. To understand the requirements of the ACR CT accreditation program, including updates to the QC manual as well as updates through the FAQ process. To understand the requirements of the ACR nuclear medicine accreditation program, and the role of the physicist in annual equipment surveys and the setup and supervision of the routine QC program. To understand the current ACR MAP accreditation requirements and present the concepts and structure of the forthcoming ACR Digital Mammography QC Manual and Program.
Automated quality control in a file-based broadcasting workflow
NASA Astrophysics Data System (ADS)
Zhang, Lina
2014-04-01
Benefiting from developments in information and Internet technologies, television broadcasting is transforming from inefficient tape-based production and distribution to integrated file-based workflows. However, no matter how many changes take place, successful broadcasting still depends on the ability to deliver a consistently high-quality signal to the audience. After the transition from tape to file, traditional methods of manual quality control (QC) become inadequate, subjective, and inefficient. Based on China Central Television's fully file-based workflow at its new site, this paper introduces an automated QC test system for accurate detection of hidden faults in media content. It discusses the system framework and workflow control when automated QC is added. It puts forward a QC criterion and presents QC software that follows this criterion. It also reports experiments on QC speed using parallel processing and distributed computing. The performance of the test system shows that the adoption of automated QC can make production effective and efficient, and help the station achieve a competitive advantage in the media market.
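The parallel-processing speed-up mentioned above can be sketched with Python's standard library; `check_file` is a hypothetical stand-in for whatever per-file checks (black frames, audio silence, container errors) a real QC system would run, and the file names are invented:

    from concurrent.futures import ProcessPoolExecutor

    def check_file(path):
        """Placeholder per-file QC check; a real system would decode the
        media and test for black frames, silence, dropouts, etc."""
        return path, "PASS"

    def run_qc(paths, workers=4):
        # Fan the per-file checks out across worker processes
        with ProcessPoolExecutor(max_workers=workers) as pool:
            return dict(pool.map(check_file, paths))

    if __name__ == "__main__":
        results = run_qc([f"clip_{i:03d}.mxf" for i in range(8)])
        print(results)

Because each file is checked independently, throughput scales roughly with the number of workers until I/O or decode bandwidth saturates.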
Chow, Judith C; Watson, John G; Robles, Jerome; Wang, Xiaoliang; Chen, L-W Antony; Trimble, Dana L; Kohl, Steven D; Tropp, Richard J; Fung, Kochy K
2011-12-01
Accurate, precise, and valid organic and elemental carbon (OC and EC, respectively) measurements require more effort than the routine analysis of ambient aerosol and source samples. This paper documents the quality assurance (QA) and quality control (QC) procedures that should be implemented to ensure consistency of OC and EC measurements. Prior to field sampling, the appropriate filter substrate must be selected and tested for sampling effectiveness. Unexposed filters are pre-fired to remove contaminants and acceptance tested. After sampling, filters must be stored in the laboratory in clean, labeled containers under refrigeration (<4 °C) to minimize loss of semi-volatile OC. QA activities include participation in laboratory accreditation programs, external system audits, and interlaboratory comparisons. For thermal/optical carbon analyses, periodic QC tests include calibration of the flame ionization detector with different types of carbon standards, thermogram inspection, replicate analyses, quantification of trace oxygen concentrations (<100 ppmv) in the helium atmosphere, and calibration of the sample temperature sensor. These established QA/QC procedures are applicable to aerosol sampling and analysis for carbon and other chemical components.
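Replicate analyses of the kind listed among the QC tests are commonly judged against a relative percent difference (RPD) acceptance limit. A minimal sketch; the 10% limit and the example OC values are assumptions for illustration:

    def rpd(a, b):
        """Relative percent difference between duplicate measurements."""
        return abs(a - b) / ((a + b) / 2) * 100

    oc_rep1, oc_rep2 = 12.4, 11.8  # hypothetical duplicate OC results, ug/cm^2
    value = rpd(oc_rep1, oc_rep2)
    print(f"RPD = {value:.1f}% -> {'accept' if value <= 10 else 'reanalyze'}")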
Investigation of the Asphalt Pavement Analyzer (APA) testing program in Nebraska.
DOT National Transportation Integrated Search
2008-03-01
The asphalt pavement analyzer (APA) has been widely used to evaluate hot-mix asphalt (HMA) rutting potential in mix : design and quality control-quality assurance (QC-QA) applications, because the APA testing and its data analyses are : relatively si...
Batt, Angela L; Furlong, Edward T; Mash, Heath E; Glassmeyer, Susan T; Kolpin, Dana W
2017-02-01
A national-scale survey of 247 contaminants of emerging concern (CECs), including organic and inorganic chemical compounds, and microbial contaminants, was conducted in source and treated drinking water samples from 25 treatment plants across the United States. Multiple methods were used to determine these CECs, including six analytical methods to measure 174 pharmaceuticals, personal care products, and pesticides. A three-component quality assurance/quality control (QA/QC) program was designed for the subset of 174 CECs which allowed us to assess and compare performances of the methods used. The three components included: 1) a common field QA/QC protocol and sample design, 2) individual investigator-developed method-specific QA/QC protocols, and 3) a suite of 46 method comparison analytes that were determined in two or more analytical methods. Overall method performance for the 174 organic chemical CECs was assessed by comparing spiked recoveries in reagent, source, and treated water over a two-year period. In addition to the 247 CECs reported in the larger drinking water study, another 48 pharmaceutical compounds measured did not consistently meet predetermined quality standards. Methodologies that did not seem suitable for these analytes are overviewed. The need to exclude analytes based on method performance demonstrates the importance of additional QA/QC protocols. Published by Elsevier B.V.
Eight years of quality control in Bulgaria: impact on mammography practice.
Avramova-Cholakova, S; Lilkov, G; Kaneva, M; Terziev, K; Nakov, I; Mutkurov, N; Kovacheva, D; Ivanova, M; Vasilev, D
2015-07-01
The requirements for quality control (QC) in diagnostic radiology were introduced into Bulgarian legislation in 2005. Hospital medical physicists and several private medical physics groups provide QC services to radiology departments. The aim of this study was to analyse data from QC tests in mammography and to investigate the impact of the introduction of QC on mammography practice in the country. The study was coordinated by the National Centre of Radiobiology and Radiation Protection. All medical physics services were requested to fill in standardised forms with information about the most important parameters routinely measured during QC. All QC service providers responded. The results demonstrated a significant improvement in practice since the introduction of QC, with a reduction in established deviations from 65 % in the first year to 7 % in the last year. Systems that did not meet the acceptability criteria were suspended from use. The performance of automatic exposure control and digital detectors is not regularly tested because of the absence of requirements in the legislation. The need for updated guidance and training of medical physicists to reflect the change in technology was demonstrated. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Quality Control in Primary Schools: Progress from 2001-2006
ERIC Educational Resources Information Center
Hofman, Roelande H.; de Boom, Jan; Hofman, W. H. Adriaan
2010-01-01
This article presents findings of research into the quality control (QC) of schools from 2001-2006. In 2001 several targets for QC were set and the progress of 939 primary schools is presented. Furthermore, using cluster analysis, schools are classified into four QC-types that differ in their focus on school (self) evaluation and school…
WE-A-210-00: Educational: Diagnostic Ultrasound QA
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
This presentation will focus on the present role of ultrasound medical physics in clinical practices. The first part of the presentation will provide an overview of ultrasound QC methodologies and testing procedures. A brief review of ultrasound phantoms utilized in these testing procedures will be presented. The second part of the presentation will summarize ultrasound imaging technical standards and professional guidelines by the American College of Radiology (ACR), American Institute of Ultrasound in Medicine (AIUM), American Association of Physicists in Medicine (AAPM) and International Electrotechnical Commission (IEC). The current accreditation requirements by ACR and AIUM for ultrasound practices will be described and the practical aspects of implementing QC programs to be compliant with these requirements will be discussed. Learning Objectives: Achieve familiarity with common ultrasound QC test methods and ultrasound phantoms. Understand the coverage of the existing testing standards and professional guidelines on diagnostic ultrasound imaging. Learn what a medical physicist needs to know about ultrasound program accreditation and be able to implement ultrasound QC programs accordingly.
MO-AB-210-03: Workshop [Advancements in high intensity focused ultrasound]
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lu, Z.
The goal of this ultrasound hands-on workshop is to demonstrate advancements in high intensity focused ultrasound (HIFU) and to demonstrate quality control (QC) testing in diagnostic ultrasound. HIFU is a therapeutic modality that uses ultrasound waves as carriers of energy. HIFU is used to focus a beam of ultrasound energy into a small volume at specific target locations within the body. The focused beam causes localized high temperatures and produces well-defined regions of necrosis. This completely non-invasive technology has great potential for tumor ablation and targeted drug delivery. At the workshop, attendees will see configurations, applications, and hands-on demonstrations with on-site instructors at separate stations. The involvement of medical physicists in diagnostic ultrasound imaging service is increasing due to QC and accreditation requirements. At the workshop, an array of ultrasound testing phantoms and ultrasound scanners will be provided for attendees to learn diagnostic ultrasound QC in a hands-on environment with live demonstrations of the techniques. Target audience: Medical physicists and other medical professionals in diagnostic imaging and radiation oncology with an interest in high-intensity focused ultrasound and in diagnostic ultrasound QC. Learning Objectives: Learn ultrasound physics and safety for HIFU applications through live demonstrations. Get an overview of the state of the art in HIFU technologies and equipment. Gain familiarity with common elements of a quality control program for diagnostic ultrasound imaging. Identify QC tools available for testing diagnostic ultrasound systems and learn how to use these tools. Supporting vendors for the HIFU and diagnostic ultrasound QC hands-on workshop: Philips Healthcare; Alpinion Medical Systems; Verasonics, Inc; Zonare Medical Systems, Inc; Computerized Imaging Reference Systems (CIRS), Inc.; GAMMEX, Inc.; Cablon Medical BV. Steffen Sammet: NIH/NCI grant 5R25CA132822, NIH/NINDS grant 5R25NS080949 and Philips Healthcare research grant C32.
MO-AB-210-02: Ultrasound Imaging and Therapy-Hands On Workshop
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sammet, S.
The goal of this ultrasound hands-on workshop is to demonstrate advancements in high intensity focused ultrasound (HIFU) and to demonstrate quality control (QC) testing in diagnostic ultrasound. HIFU is a therapeutic modality that uses ultrasound waves as carriers of energy. HIFU is used to focus a beam of ultrasound energy into a small volume at specific target locations within the body. The focused beam causes localized high temperatures and produces well-defined regions of necrosis. This completely non-invasive technology has great potential for tumor ablation and targeted drug delivery. At the workshop, attendees will see configurations, applications, and hands-on demonstrations with on-site instructors at separate stations. The involvement of medical physicists in diagnostic ultrasound imaging service is increasing due to QC and accreditation requirements. At the workshop, an array of ultrasound testing phantoms and ultrasound scanners will be provided for attendees to learn diagnostic ultrasound QC in a hands-on environment with live demonstrations of the techniques. Target audience: Medical physicists and other medical professionals in diagnostic imaging and radiation oncology with an interest in high-intensity focused ultrasound and in diagnostic ultrasound QC. Learning Objectives: Learn ultrasound physics and safety for HIFU applications through live demonstrations. Get an overview of the state of the art in HIFU technologies and equipment. Gain familiarity with common elements of a quality control program for diagnostic ultrasound imaging. Identify QC tools available for testing diagnostic ultrasound systems and learn how to use these tools. Supporting vendors for the HIFU and diagnostic ultrasound QC hands-on workshop: Philips Healthcare; Alpinion Medical Systems; Verasonics, Inc; Zonare Medical Systems, Inc; Computerized Imaging Reference Systems (CIRS), Inc.; GAMMEX, Inc.; Cablon Medical BV. Steffen Sammet: NIH/NCI grant 5R25CA132822, NIH/NINDS grant 5R25NS080949 and Philips Healthcare research grant C32.
MO-AB-210-01: Ultrasound Imaging and Therapy-Hands On Workshop
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lu, Z.
The goal of this ultrasound hands-on workshop is to demonstrate advancements in high intensity focused ultrasound (HIFU) and to demonstrate quality control (QC) testing in diagnostic ultrasound. HIFU is a therapeutic modality that uses ultrasound waves as carriers of energy. HIFU is used to focus a beam of ultrasound energy into a small volume at specific target locations within the body. The focused beam causes localized high temperatures and produces well-defined regions of necrosis. This completely non-invasive technology has great potential for tumor ablation and targeted drug delivery. At the workshop, attendees will see configurations, applications, and hands-on demonstrations with on-site instructors at separate stations. The involvement of medical physicists in diagnostic ultrasound imaging service is increasing due to QC and accreditation requirements. At the workshop, an array of ultrasound testing phantoms and ultrasound scanners will be provided for attendees to learn diagnostic ultrasound QC in a hands-on environment with live demonstrations of the techniques. Target audience: Medical physicists and other medical professionals in diagnostic imaging and radiation oncology with an interest in high-intensity focused ultrasound and in diagnostic ultrasound QC. Learning Objectives: Learn ultrasound physics and safety for HIFU applications through live demonstrations. Get an overview of the state of the art in HIFU technologies and equipment. Gain familiarity with common elements of a quality control program for diagnostic ultrasound imaging. Identify QC tools available for testing diagnostic ultrasound systems and learn how to use these tools. Supporting vendors for the HIFU and diagnostic ultrasound QC hands-on workshop: Philips Healthcare; Alpinion Medical Systems; Verasonics, Inc; Zonare Medical Systems, Inc; Computerized Imaging Reference Systems (CIRS), Inc.; GAMMEX, Inc.; Cablon Medical BV. Steffen Sammet: NIH/NCI grant 5R25CA132822, NIH/NINDS grant 5R25NS080949 and Philips Healthcare research grant C32.
Nowik, Patrik; Bujila, Robert; Poludniowski, Gavin; Fransson, Annette
2015-07-08
The purpose of this study was to develop a method for performing routine periodic quality control (QC) of CT systems by automatically analyzing key performance indicators (KPIs) obtainable from images of manufacturers' quality assurance (QA) phantoms. A KPI is a measurable or determinable QC parameter that is influenced by other underlying fundamental QC parameters. The established KPIs are based on relationships between existing QC parameters used in the annual testing program of CT scanners at the Karolinska University Hospital in Stockholm, Sweden. The KPIs include positioning, image noise, uniformity, homogeneity, the CT number of water, and the CT number of air. An application (MonitorCT) was developed to automatically evaluate phantom images in terms of the established KPIs. The methodology has been used for two years in routine clinical practice: CT technologists perform daily scans of the manufacturer's QA phantom and automatically send the images to MonitorCT for KPI evaluation. In cases where results were out of tolerance, actions could be initiated in less than 10 min. Over the two-year period that MonitorCT has been active, 900 QC scans from two CT scanners have been collected and analyzed. Two types of errors were registered in this period: a ring artifact was discovered with the image noise test, and a calibration error was detected multiple times with the CT number test. In both cases, results were outside the tolerances defined for MonitorCT as well as by the vendor. Automated monitoring of KPIs is a powerful tool that can supplement established QC methodologies. Medical physicists and other professionals concerned with the performance of a CT system will, using such methods, have access to comprehensive data on the current and historical (trend) status of the system, so that swift actions can be taken to ensure the quality of CT examinations, patient safety, and minimal disruption of service.
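Two of the listed KPIs, image noise and uniformity, reduce to ROI statistics on an image of a homogeneous phantom. A numpy sketch under common definitions (noise = SD in a central ROI; uniformity = maximum deviation of peripheral ROI means from the central mean); the ROI geometry and the simulated water image are assumptions, not MonitorCT's algorithm:

    import numpy as np

    def roi_mean_sd(img, cy, cx, half=10):
        """Mean and SD over a square ROI centered at (cy, cx)."""
        roi = img[cy - half:cy + half, cx - half:cx + half]
        return float(roi.mean()), float(roi.std())

    def noise_and_uniformity(img):
        h, w = img.shape
        center, noise = roi_mean_sd(img, h // 2, w // 2)
        offsets = [(h // 4, w // 2), (3 * h // 4, w // 2),
                   (h // 2, w // 4), (h // 2, 3 * w // 4)]
        peripheral = [roi_mean_sd(img, y, x)[0] for y, x in offsets]
        uniformity = max(abs(m - center) for m in peripheral)
        return noise, uniformity

    # Hypothetical water-phantom image: 0 HU mean, 5 HU noise
    img = np.random.normal(0.0, 5.0, size=(256, 256))
    print(noise_and_uniformity(img))

Each daily result would then be compared against tolerances and appended to the trend database.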
Song, Ting; Li, Nan; Zarepisheh, Masoud; Li, Yongbao; Gautier, Quentin; Zhou, Linghong; Mell, Loren; Jiang, Steve; Cerviño, Laura
2016-01-01
Intensity-modulated radiation therapy (IMRT) currently plays an important role in radiotherapy, but treatment plan quality can vary significantly among institutions and planners. Treatment plan quality control (QC) is a necessary component for individual clinics to ensure that patients receive treatments with high therapeutic gain ratios. The voxel-weighting factor-based plan re-optimization mechanism has been shown to explore a larger Pareto surface (solution domain) and therefore increase the possibility of finding an optimal treatment plan. In this study, we incorporated additional modules into an in-house developed voxel weighting factor-based re-optimization algorithm, extending it into a highly automated and accurate IMRT plan QC tool (TPS-QC tool). After importing an under-assessment plan, the TPS-QC tool was able to generate a QC report within 2 minutes. This QC report contains the plan quality determination as well as information supporting the determination. Finally, IMRT plan quality can be controlled by approving quality-passed plans and replacing quality-failed plans using the TPS-QC tool. The feasibility and accuracy of the proposed TPS-QC tool were evaluated using 25 clinically approved cervical cancer patient IMRT plans and 5 manually created poor-quality IMRT plans. The results showed high consistency between the QC report quality determinations and the actual plan quality. In the 25 clinically approved cases that the TPS-QC tool identified as passed, a greater difference could be observed in dosimetric endpoints for organs at risk (OAR) than for the planning target volume (PTV), implying that better dose sparing could be achieved in OARs than in the PTV. The dose-volume histogram (DVH) curves of the TPS-QC tool's re-optimized plans satisfied the dosimetric criteria more frequently than did the under-assessment plans, and in the 5 poor-quality plans, previously unsatisfied dosimetric criteria could typically be met by the re-optimized plans without sacrificing other dosimetric endpoints. Besides being feasible and accurate, the proposed TPS-QC tool is user-friendly and easy to operate, both necessary characteristics for clinical use.
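The pass/fail determination in such a QC report ultimately reduces to comparing achieved dosimetric endpoints against criteria. A toy sketch of that final step; the endpoint names, values, and limits are invented and do not reflect the TPS-QC tool's actual criteria:

    # Hypothetical criteria: endpoint -> (achieved, limit, must_be_at_or_below)
    endpoints = {
        "PTV D95 (Gy)":    (44.8, 44.0, False),  # want >= limit
        "Rectum V40 (%)":  (38.0, 40.0, True),   # want <= limit
        "Bladder V45 (%)": (52.0, 50.0, True),
    }

    def qc_report(eps):
        """Print a pass/fail line per endpoint; any FAIL fails the plan."""
        for name, (val, limit, below) in eps.items():
            ok = val <= limit if below else val >= limit
            print(f"{name}: {val} vs {limit} -> {'pass' if ok else 'FAIL'}")

    qc_report(endpoints)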
222-S Laboratory Quality Assurance Plan. Revision 1
DOE Office of Scientific and Technical Information (OSTI.GOV)
Meznarich, H.K.
1995-07-31
This Quality Assurance Plan provides quality assurance (QA) guidance, regulatory QA requirements (e.g., 10 CFR 830.120), and quality control (QC) specifications for analytical services. This document follows the U.S. Department of Energy (DOE) issued Hanford Analytical Services Quality Assurance Plan (HASQAP). In addition, this document meets the objectives of the Quality Assurance Program provided in the WHC-CM-4-2, Section 2.1. Quality assurance elements required in the Guidelines and Specifications for Preparing Quality Assurance Program Plans (QAMS-004) and Interim Guidelines and Specifications for Preparing Quality Assurance Project Plans (QAMS-005) from the US Environmental Protection Agency (EPA) are covered throughout this document. A quality assurance index is provided in Appendix A. This document also provides and/or identifies the procedural information that governs laboratory operations. The personnel of the 222-S Laboratory and the Standards Laboratory, including managers, analysts, QA/QC staff, auditors, and support staff, shall use this document as guidance and instructions for their operational and quality assurance activities. Other organizations that conduct activities described in this document for the 222-S Laboratory shall follow this QA/QC document.
Evaluation of digital radiography practice using exposure index tracking
Zhou, Yifang; Allahverdian, Janet; Nute, Jessica L.; Lee, Christina
2016-01-01
Some digital radiography (DR) detectors and software allow for remote download of exam statistics, including image reject status, body part, projection, and exposure index (EI). The ability to have automated data collection from multiple DR units is conducive to a quality control (QC) program monitoring institutional radiographic exposures. We have implemented such a QC program with the goal of identifying outliers in machine radiation output and opportunities for improvement in radiation dose levels. We studied the QC records of four digital detectors in greater detail on a monthly basis for one year. Although individual patient entrance skin exposure varied, the radiation dose levels to the detectors were made consistent via phototimer recalibration. The exposure data stored on each digital detector were periodically downloaded in a spreadsheet format for analysis. EI median and standard deviation were calculated for each protocol (by body part) and EI histograms were created for torso protocols. When histograms of EI values for different units were compared, we observed differences up to 400 in average EI (representing 60% difference in radiation levels to the detector) between units nominally calibrated to the same EI. We identified distinct components of the EI distributions, which, in some cases, had mean EI values 300 apart. Peaks were observed at the current calibrated EI, a previously calibrated EI, and an EI representing computed radiography (CR) techniques. Our findings in this ongoing project have allowed us to make useful interventions, from emphasizing the use of phototimers instead of institutional memory of manual techniques to improvements in our phototimer calibration. We believe that this QC program can be implemented at other sites and can reveal problems with radiation levels in the aggregate that are difficult to identify on a case‐by‐case basis. PACS number(s): 87.59.bf PMID:27929507
Lourens, Chris; Lindegardh, Niklas; Barnes, Karen I.; Guerin, Philippe J.; Sibley, Carol H.; White, Nicholas J.
2014-01-01
Comprehensive assessment of antimalarial drug resistance should include measurements of antimalarial blood or plasma concentrations in clinical trials and in individual assessments of treatment failure so that true resistance can be differentiated from inadequate drug exposure. Pharmacometric modeling is necessary to assess pharmacokinetic-pharmacodynamic relationships in different populations to optimize dosing. To accomplish both effectively and to allow comparison of data from different laboratories, it is essential that drug concentration measurement is accurate. Proficiency testing (PT) of laboratory procedures is necessary for verification of assay results. Within the Worldwide Antimalarial Resistance Network (WWARN), the goal of the quality assurance/quality control (QA/QC) program is to facilitate and sustain high-quality antimalarial assays. The QA/QC program consists of an international PT program for pharmacology laboratories and a reference material (RM) program for the provision of antimalarial drug standards, metabolites, and internal standards for laboratory use. The RM program currently distributes accurately weighed quantities of antimalarial drug standards, metabolites, and internal standards to 44 pharmacology, in vitro, and drug quality testing laboratories. The pharmacology PT program has sent samples to eight laboratories in four rounds of testing. WWARN technical experts have provided advice for correcting identified problems to improve performance of subsequent analysis and ultimately improved the quality of data. Many participants have demonstrated substantial improvements over subsequent rounds of PT. The WWARN QA/QC program has improved the quality and value of antimalarial drug measurement in laboratories globally. It is a model that has potential to be applied to strengthening laboratories more widely and improving the therapeutics of other infectious diseases. PMID:24777099
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-12
... perfluorocarbon QA/QC quality assurance/quality control R&D research and development RFA Regulatory Flexibility... Climate Change.'' Joint Global Change Research Institute, Battelle Pacific Northwest Division. PNWD-3602... research, demonstration, and deployment programs throughout the world, are building confidence that...
7 CFR 275.12 - Review of active cases.
Code of Federal Regulations, 2014 CFR
2014-01-01
... Agriculture Regulations of the Department of Agriculture (Continued) FOOD AND NUTRITION SERVICE, DEPARTMENT OF AGRICULTURE FOOD STAMP AND FOOD DISTRIBUTION PROGRAM PERFORMANCE REPORTING SYSTEM Quality Control (QC) Reviews... letter from the Food and Nutrition Service to a State agency which contains comments on the State agency...
7 CFR 275.12 - Review of active cases.
Code of Federal Regulations, 2013 CFR
2013-01-01
... Agriculture Regulations of the Department of Agriculture (Continued) FOOD AND NUTRITION SERVICE, DEPARTMENT OF AGRICULTURE FOOD STAMP AND FOOD DISTRIBUTION PROGRAM PERFORMANCE REPORTING SYSTEM Quality Control (QC) Reviews... letter from the Food and Nutrition Service to a State agency which contains comments on the State agency...
7 CFR 275.12 - Review of active cases.
Code of Federal Regulations, 2012 CFR
2012-01-01
... Agriculture Regulations of the Department of Agriculture (Continued) FOOD AND NUTRITION SERVICE, DEPARTMENT OF AGRICULTURE FOOD STAMP AND FOOD DISTRIBUTION PROGRAM PERFORMANCE REPORTING SYSTEM Quality Control (QC) Reviews... letter from the Food and Nutrition Service to a State agency which contains comments on the State agency...
AfterQC: automatic filtering, trimming, error removing and quality control for fastq data.
Chen, Shifu; Huang, Tanxiao; Zhou, Yanqing; Han, Yue; Xu, Mingyan; Gu, Jia
2017-03-14
Some applications, especially clinical applications requiring highly accurate sequencing data, must contend with unavoidable sequencing errors. Several tools have been proposed to profile sequencing quality, but few of them can quantify or correct sequencing errors. This unmet requirement motivated us to develop AfterQC, a tool with functions to profile sequencing errors and correct most of them, plus highly automated quality control and data filtering features. Unlike most tools, AfterQC analyses the overlap of paired sequences in paired-end sequencing data. Based on this overlap analysis, AfterQC can detect and cut adapters and, furthermore, provides a novel function to correct erroneous bases in the overlapping regions. Another new feature is the detection and visualisation of sequencing bubbles, which are commonly found on flowcell lanes and may cause sequencing errors. Besides normal per-cycle quality and base-content plotting, AfterQC also provides features such as polyX filtering (removal of long sub-sequences of the same base X), automatic trimming, and k-mer-based strand bias profiling. For each single FastQ file or pair of FastQ files, AfterQC filters out bad reads, detects and eliminates sequencer bubble effects, trims reads at the front and tail, detects sequencing errors and corrects some of them, and finally outputs clean data and generates HTML reports with interactive figures. AfterQC can run in batch mode with multiprocess support; it can run with a single FastQ file, a single pair of FastQ files (for paired-end sequencing), or a folder, in which case all included FastQ files are processed automatically. Based on overlap analysis, AfterQC can estimate the sequencing error rate and profile the distribution of error transformations. The results of our error profiling tests show that the error distribution is highly platform dependent. Much more than just another new quality control (QC) tool, AfterQC is able to perform quality control, data filtering, error profiling and base correction automatically. Experimental results show that AfterQC can help to eliminate sequencing errors in paired-end sequencing data to provide much cleaner outputs, and consequently help to reduce false-positive variants, especially low-frequency somatic mutations. While providing rich configurable options, AfterQC can detect and set all the options automatically and requires no arguments in most cases.
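AfterQC's actual implementation lives in its own repository; the fragment below is only a simplified, hedged illustration of the core idea of overlap analysis for paired-end reads: find the best overlap between read 1 and the reverse complement of read 2, then resolve mismatches in favor of the higher-quality base call. Function names and thresholds are invented for the example.

```python
COMP = str.maketrans("ACGTN", "TGCAN")

def revcomp(seq):
    return seq.translate(COMP)[::-1]

def correct_pair(r1, q1, r2, q2, min_overlap=30, max_mismatch=5):
    """Find the best overlap between read 1 and the reverse complement of
    read 2; within it, replace each mismatched base with the call that has
    the higher Phred quality. Returns the (possibly corrected) reads."""
    r2rc, q2r = revcomp(r2), q2[::-1]
    best = None
    for offset in range(len(r1) - min_overlap + 1):
        span = min(len(r1) - offset, len(r2rc))
        if span < min_overlap:
            break
        mism = sum(a != b for a, b in zip(r1[offset:offset + span], r2rc[:span]))
        if mism <= max_mismatch and (best is None or mism < best[1]):
            best = (offset, mism)
    if best is None:
        return r1, r2                      # no confident overlap found
    offset = best[0]
    b1, b2 = list(r1), list(r2rc)
    span = min(len(r1) - offset, len(r2rc))
    for i in range(span):
        if b1[offset + i] != b2[i]:
            if q1[offset + i] >= q2r[i]:   # keep the higher-quality call
                b2[i] = b1[offset + i]
            else:
                b1[offset + i] = b2[i]
    return "".join(b1), revcomp("".join(b2))
```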
[Development of quality assurance/quality control web system in radiotherapy].
Okamoto, Hiroyuki; Mochizuki, Toshihiko; Yokoyama, Kazutoshi; Wakita, Akihisa; Nakamura, Satoshi; Ueki, Heihachi; Shiozawa, Keiko; Sasaki, Koji; Fuse, Masashi; Abe, Yoshihisa; Itami, Jun
2013-12-01
Our purpose is to develop a QA/QC (quality assurance/quality control) web system using HTML (HyperText Markup Language) and the server-side scripting language PHP (Hypertext Preprocessor), which can be useful as a tool to share information about QA/QC in radiotherapy. The system proposed in this study can be easily built in one's own institute, because HTML can be easily handled. There are two desired functions in a QA/QC web system: (i) to review the results of QA/QC for a radiotherapy machine, together with the manuals and reports necessary for routinely performing radiotherapy; by disclosing the results, transparency can be maintained; (ii) to present a protocol for QA/QC in one's own institute using pictures and movies relating to QA/QC, for simplicity's sake, which can also be used as an educational tool for junior radiation technologists and medical physicists. By using this system, not only administrators but also all staff involved in radiotherapy can obtain information about the conditions and accuracy of treatment machines through the QA/QC web system.
Real Time Quality Control Methods for Cued EMI Data Collection
2016-03-14
This project evaluated the effectiveness of in-field quality control (QC) procedures during cued electromagnetic induction (EMI) data collection.
Planning Risk-Based SQC Schedules for Bracketed Operation of Continuous Production Analyzers.
Westgard, James O; Bayat, Hassan; Westgard, Sten A
2018-02-01
To minimize patient risk, "bracketed" statistical quality control (SQC) is recommended in the new CLSI guidelines for SQC (C24-Ed4). Bracketed SQC requires that a QC event both precedes and follows (brackets) a group of patient samples. In optimizing a QC schedule, the frequency of QC or run size becomes an important planning consideration to maintain quality and also facilitate responsive reporting of results from continuous operation of high production analytic systems. Different plans for optimizing a bracketed SQC schedule were investigated on the basis of Parvin's model for patient risk and CLSI C24-Ed4's recommendations for establishing QC schedules. A Sigma-metric run size nomogram was used to evaluate different QC schedules for processes of different sigma performance. For high Sigma performance, an effective SQC approach is to employ a multistage QC procedure utilizing a "startup" design at the beginning of production and a "monitor" design periodically throughout production. Example QC schedules are illustrated for applications with measurement procedures having 6-σ, 5-σ, and 4-σ performance. Continuous production analyzers that demonstrate high σ performance can be effectively controlled with multistage SQC designs that employ a startup QC event followed by periodic monitoring or bracketing QC events. Such designs can be optimized to minimize the risk of harm to patients. © 2017 American Association for Clinical Chemistry.
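As a worked illustration of the Sigma-metric reasoning behind such planning (not the CLSI C24-Ed4 tables themselves), the sketch below computes the Sigma metric from allowable total error (TEa), bias, and CV, and maps it to a simplified paraphrase of Westgard Sigma Rules. The numeric examples are invented.

```python
def sigma_metric(tea_pct, bias_pct, cv_pct):
    """Sigma = (TEa - |bias|) / CV, all expressed in percent of the target."""
    return (tea_pct - abs(bias_pct)) / cv_pct

def suggested_sqc(sigma):
    # Simplified reading of Westgard Sigma Rules: higher sigma performance
    # permits simpler rules, fewer control measurements, and larger runs.
    if sigma >= 6:
        return "1:3s rule, N=2 per QC event, largest run sizes"
    if sigma >= 5:
        return "1:3s/2:2s/R:4s multirule, N=2"
    if sigma >= 4:
        return "add 4:1s, N=4 (e.g., 2 levels measured twice), shorter runs"
    return "full multirule with N>=6 and frequent QC; consider method improvement"

for tea, bias, cv in [(10.0, 1.0, 1.4), (10.0, 1.0, 1.8), (10.0, 2.0, 2.0)]:
    s = sigma_metric(tea, bias, cv)
    print(f"sigma = {s:.1f}: {suggested_sqc(s)}")
```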
NASA Astrophysics Data System (ADS)
Kapanen, Mika; Tenhunen, Mikko; Hämäläinen, Tuomo; Sipilä, Petri; Parkkinen, Ritva; Järvinen, Hannu
2006-07-01
Quality control (QC) data of radiotherapy linear accelerators, collected by Helsinki University Central Hospital between the years 2000 and 2004, were analysed. The goal was to provide information for the evaluation and elaboration of QC of accelerator outputs and to propose a method for QC data analysis. Short- and long-term drifts in outputs were quantified by fitting empirical mathematical models to the QC measurements. Normally, long-term drifts were well (≤1%) modelled by either a straight line or a single-exponential function. A drift of 2% occurred in 18 ± 12 months. The shortest drift times of only 2-3 months were observed for some new accelerators just after commissioning, but they stabilized during the first 2-3 years. The short-term reproducibility and the long-term stability of local constancy checks, carried out with a sealed plane-parallel ion chamber, were also estimated by fitting empirical models to the QC measurements. The reproducibility was 0.2-0.5%, depending on the positioning practice of a device. Long-term instabilities of about 0.3%/month were observed for some checking devices. The reproducibility of local absorbed dose measurements was estimated to be about 0.5%. The proposed empirical model fitting of QC data facilitates the recognition of erroneous QC measurements and abnormal output behaviour caused by malfunctions, offering a tool to improve dose control.
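The abstract names two empirical drift models; a minimal sketch of fitting the single-exponential one with SciPy is shown below, on synthetic monthly output data. The data, initial guesses, and the 2% action level are illustrative assumptions, not the hospital's measurements.

```python
import numpy as np
from scipy.optimize import curve_fit

def single_exp(t, a, tau):
    """Single-exponential drift model: a * (1 - exp(-t / tau))."""
    return a * (1.0 - np.exp(-t / tau))

# Synthetic monthly output-constancy results, % deviation from baseline.
t = np.arange(24.0)
y = 2.0 * (1 - np.exp(-t / 12.0)) + np.random.default_rng(7).normal(0, 0.15, t.size)

(a, tau), _ = curve_fit(single_exp, t, y, p0=(1.0, 10.0))
print(f"amplitude {a:.2f}%, time constant {tau:.1f} months")
if a > 2.0:
    # Time at which the fitted drift crosses a 2% action level.
    print(f"2% drift reached at {-tau * np.log(1 - 2.0 / a):.1f} months")
```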
Chaturvedi, Arvind K; Craft, Kristi J; Cardona, Patrick S; Rogers, Paul B; Canfield, Dennis V
2009-05-01
During toxicological evaluations of samples from fatally injured pilots involved in civil aviation accidents, a high degree of quality control/quality assurance (QC/QA) is maintained. Under this philosophy, the Federal Aviation Administration (FAA) started a forensic toxicology proficiency-testing (PT) program in July 1991. In continuation of the first seven years of the PT findings reported earlier, PT findings of the next seven years are summarized herein. Twenty-eight survey samples (12 urine, 9 blood, and 7 tissue homogenate) with/without alcohols/volatiles, drugs, and/or putrefactive amine(s) were submitted to an average of 31 laboratories, of which an average of 25 participants returned their results. Analytes in survey samples were correctly identified and quantitated by a large number of participants, but some false positives of concern were reported. It is anticipated that the FAA's PT program will continue to serve the forensic toxicology community through this important part of the QC/QA for laboratory accreditations.
You, Jun; Zhou, Jinping; Li, Qian; Zhang, Lina
2012-03-20
As a weak base, β-glycerophosphate (β-GP) was used to spontaneously initiate gelation of quaternized cellulose (QC) solutions at body temperature. The QC/β-GP solutions are flowable below or at room temperature but gel rapidly under physiological conditions. In order to clarify the sol-gel transition process of the QC/β-GP systems, the complex was investigated by dynamic viscoelastic measurements. The shear storage modulus (G') and loss modulus (G″) as a function of (1) concentration of β-GP (c(β-GP)), (2) concentration of QC (c(QC)), (3) degree of substitution (DS; i.e., the average number of substituted hydroxyl groups in the anhydroglucose unit) of QC, (4) viscosity-average molecular weight (M(η)) of QC, and (5) solvent medium were studied by the oscillatory rheology. The sol-gel transition temperature of QC/β-GP solutions decreased with an increase of c(QC) and c(β-GP), the M(η) of QC, and a decrease of the DS of QC and pH of the solvent. The sol-gel transition temperature and time could be easily controlled by adjusting the concentrations of QC and β-GP, M(η) and DS of QC, and the solvent medium. Gels formed after heating were irreversible; i.e., after cooling to lower temperature they could not be dissolved to become liquid again. The aggregation and entanglement of QC chains, electrostatic interaction, and hydrogen bonding between QC and β-GP were the main factors responsible for the irreversible sol-gel transition behavior of QC/β-GP systems.
Vassileva, J; Dimov, A; Slavchev, A; Karadjov, A
2005-01-01
Results from a Bulgarian patient dose survey in diagnostic radiology are presented. Reference levels for entrance surface dose (ESD) were 0.9 mGy for chest radiography (PA), 30 mGy for lumbar spine (Lat), 10 mGy for pelvis, 5 mGy for skull (AP), 3 mGy for skull (Lat) and 13 mGy for mammography. Quality control (QC) programmes were proposed for various areas of diagnostic radiology. Film processing QC warranted special attention. Proposed QC programmes included parameters to be tested, level of expertise needed and two action levels: remedial and suspension. Programmes were tested under clinical conditions to assess initial results and draw conclusions for further QC system development. On the basis of international experience, measurement protocols were developed for all parameters tested. QC equipment was provided as part of the PHARE project. A future problem for QC programme implementation may be the small number of medical physics experts in diagnostic radiology.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nelson, J; Christianson, O; Samei, E
Purpose: Flood-field uniformity evaluation is an essential element in the assessment of nuclear medicine (NM) gamma cameras. It serves as the central element of the quality control (QC) program, acquired and analyzed on a daily basis prior to clinical imaging. Uniformity images are traditionally analyzed using pixel value-based metrics, which often fail to capture subtle structure and patterns caused by changes in gamma camera performance, requiring additional visual inspection that is subjective and time-demanding. The goal of this project was to develop and implement a robust QC metrology for NM that is effective in identifying non-uniformity issues, reporting issues in a timely manner for efficient correction prior to clinical involvement, all incorporated into an automated, effortless workflow, and to characterize the program over a two-year period. Methods: A new quantitative uniformity analysis metric was developed based on 2D noise power spectrum metrology and confirmed by expert observer visual analysis. The metric, termed the Structured Noise Index (SNI), was then integrated into an automated program to analyze, archive, and report on daily NM QC uniformity images. The effectiveness of the program was evaluated over a period of 2 years. Results: The SNI metric successfully identified visually apparent non-uniformities overlooked by the pixel value-based analysis methods. Implementation of the program has resulted in non-uniformity identification in about 12% of daily flood images. In addition, due to the vigilance of staff response, the percentage of days exceeding the trigger value shows a decline over time. Conclusion: The SNI provides a robust quantification of gamma camera uniformity performance. It operates seamlessly across a fleet of multiple camera models. The automated process provides an effective workflow among the physicist, technologist, and clinical engineer. The reliability of this process has made it the preferred platform for NM uniformity analysis.
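The abstract does not give the SNI formula, so the following is only a rough, hypothetical analogue of noise-power-spectrum-based uniformity scoring: it measures what fraction of a flood image's noise power sits at low spatial frequencies, where structured non-uniformity concentrates. The band limit and normalization are placeholders, not the published metric.

```python
import numpy as np

def structured_noise_score(flood, band=0.1):
    """Crude SNI-like score: fraction of total noise power below a low
    spatial-frequency cutoff (cycles/pixel), after mean removal. White
    noise scores near the area fraction of the band; structure scores higher."""
    f = flood - flood.mean()
    nps = np.abs(np.fft.fftshift(np.fft.fft2(f))) ** 2
    fy = np.fft.fftshift(np.fft.fftfreq(nps.shape[0]))
    fx = np.fft.fftshift(np.fft.fftfreq(nps.shape[1]))
    r = np.hypot(*np.meshgrid(fy, fx, indexing="ij"))
    return nps[r < band].sum() / nps.sum()

rng = np.random.default_rng(0)
uniform = rng.poisson(1000, (256, 256)).astype(float)      # clean flood image
yy = np.arange(256)[:, None]
banded = uniform * (1 + 0.02 * np.sin(yy / 8.0))           # subtle horizontal bands
print(structured_noise_score(uniform), structured_noise_score(banded))
```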
Results of the Excreta Bioassay Quality Control Program for April 1, 2009 through March 31, 2010
DOE Office of Scientific and Technical Information (OSTI.GOV)
Antonio, Cheryl L.
2012-07-19
A total of 58 urine samples and 10 fecal samples were submitted during the report period (April 1, 2009 through March 31, 2010) to General Engineering Laboratories, South Carolina by the Hanford Internal Dosimetry Program (IDP) to check the accuracy, precision, and detection levels of their analyses. Urine analyses for Sr, 238Pu, 239Pu, 241Am, 243Am, 235U, 238U, and elemental uranium and fecal analyses for 241Am, 238Pu and 239Pu were tested this year, as well as four tissue samples for 238Pu, 239Pu, 241Am and 241Pu. The number of QC urine samples submitted during the report period represented 1.3% of the total samples submitted. In addition to the samples provided by IDP, GEL was also required to conduct their own QC program and submit the results of analyses to IDP. About 33% of the analyses processed by GEL during the third year of this contract were quality control samples. GEL tested the performance of 21 radioisotopes, all of which met or exceeded the specifications in the Statement of Work within statistical uncertainty (Table 4).
Results of The Excreta Bioassay Quality Control Program For April 1, 2010 Through March 31, 2011
DOE Office of Scientific and Technical Information (OSTI.GOV)
Antonio, Cheryl L.
2012-07-19
A total of 76 urine samples and 10 spiked fecal samples were submitted during the report period (April 1, 2010 through March 31, 2011) to GEL Laboratories, LLC in South Carolina by the Hanford Internal Dosimetry Program (IDP) to check the accuracy, precision, and detection levels of their analyses. Urine analyses for 14C, Sr, 238Pu, 239Pu, 241Am, 243Am, 235U, 238U, and 238U-mass and fecal analyses for 241Am, 238Pu and 239Pu were tested this year. The number of QC urine samples submitted during the report period represented 1.1% of the total samples submitted. In addition to the samples provided by IDP, GEL was also required to conduct their own QC program and submit the results of analyses to IDP. About 31% of the analyses processed by GEL during the first year of contract 112512 were quality control samples. GEL tested the performance of 23 radioisotopes, all of which met or exceeded the specifications in the Statement of Work within statistical uncertainty, except the slightly elevated relative bias for 243,244Cm (Table 4).
E-Quality in the Workplace: Quality Circles or Quality of Working Life Programs in the US.
ERIC Educational Resources Information Center
Savage, Grant T.; Romano, Richard
Quality Circle (QC) and Quality of Working Life (QWL) in the United States are similar in that both stress participative decision making, preserve management's prerogative to have the final say, and are voluntary. QC and QWL programs differ, however, in that labor unions are more involved in QWLs; QCs deal only with technical problems related to…
Quality control in urinalysis.
Takubo, T; Tatsumi, N
1999-01-01
Quality control (QC) has been introduced in laboratories, and QC surveys in urinalysis have been performed by the College of American Pathologists, the Japanese Association of Medical Technologists, the Osaka Medical Association, and manufacturers. A QC survey of urinalysis of synthetic urine using a reagent strip and instrument from the same manufacturer, and an automated urine cell analyser, provided satisfactory results among laboratories. A QC survey of urinalysis of synthetic urine using reagent strips and instruments from various manufacturers indicated differences in the determined values among manufacturers, and between manual and automated methods, because the reagent strips and instruments have different characteristics. A QC photo survey based on microscopic photos of urine sediment constituents indicated differences in the identification of cells among laboratories. These results indicate that it is necessary to standardize the reagent strip method, manual and automated methods, and synthetic urine.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, Joseph; Pirrung, Meg; McCue, Lee Ann
2017-06-09
FQC is software that facilitates quality control of FASTQ files by carrying out a QC protocol using FastQC, parsing results, and aggregating quality metrics into an interactive dashboard designed to richly summarize individual sequencing runs. The dashboard groups samples in dropdowns for navigation among the data sets, utilizes human-readable configuration files to manipulate the pages and tabs, and is extensible with CSV data.
Quantum key distribution using card, base station and trusted authority
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nordholt, Jane E.; Hughes, Richard John; Newell, Raymond Thorson
Techniques and tools for quantum key distribution ("QKD") between a quantum communication ("QC") card, base station and trusted authority are described herein. In example implementations, a QC card contains a miniaturized QC transmitter and couples with a base station. The base station provides a network connection with the trusted authority and can also provide electric power to the QC card. When coupled to the base station, after authentication by the trusted authority, the QC card acquires keys through QKD with the trusted authority. The keys can be used to set up secure communication, for authentication, for access control, or for other purposes. The QC card can be implemented as part of a smart phone or other mobile computing device, or the QC card can be used as a fillgun for distribution of the keys.
Quantum key distribution using card, base station and trusted authority
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nordholt, Jane Elizabeth; Hughes, Richard John; Newell, Raymond Thorson
Techniques and tools for quantum key distribution ("QKD") between a quantum communication ("QC") card, base station and trusted authority are described herein. In example implementations, a QC card contains a miniaturized QC transmitter and couples with a base station. The base station provides a network connection with the trusted authority and can also provide electric power to the QC card. When coupled to the base station, after authentication by the trusted authority, the QC card acquires keys through QKD with a trusted authority. The keys can be used to set up secure communication, for authentication, for access control, or for other purposes. The QC card can be implemented as part of a smart phone or other mobile computing device, or the QC card can be used as a fillgun for distribution of the keys.
NASA Astrophysics Data System (ADS)
Kawka, O. E.; Nelson, J. S.; Manalang, D.; Kelley, D. S.
2016-02-01
The Cabled Array component of the NSF-funded Ocean Observatories Initiative (OOI) provides access to real-time physical, chemical, geological, and biological data from water column and seafloor platforms/instruments at sites spanning the southern half of the Juan de Fuca Plate. The Quality Assurance (QA) program for OOI data is designed to ensure that data products meet OOI science requirements. This overall data QA plan establishes the guidelines for assuring OOI data quality and summarizes Quality Control (QC) protocols and procedures, based on best practices, which can be utilized to ensure the highest quality data across the OOI program. This presentation will highlight, specifically, the QA/QC approach being utilized for the OOI Cabled Array infrastructure and data and will include a summary of both shipboard and shore-based protocols currently in use. Aspects addressed will be pre-deployment instrument testing and calibration checks, post-deployment and pre-recovery field verification of data, and post-recovery "as-found" testing of instruments. Examples of QA/QC data will be presented and specific cases of cabled data will be discussed in the context of quality assessments and adjustment/correction of OOI datasets overall for inherent sensor drift and/or instrument fouling.
Yago, Martín
2017-05-01
QC planning based on risk management concepts can reduce the probability of harming patients due to an undetected out-of-control error condition. It does this by selecting appropriate QC procedures to decrease the number of erroneous results reported. The selection can be easily made by using published nomograms for simple QC rules when the out-of-control condition results in increased systematic error. However, increases in random error also occur frequently and are difficult to detect, which can result in erroneously reported patient results. A statistical model was used to construct charts for the 1ks and X̄/χ² rules. The charts relate the increase in the number of unacceptable patient results reported due to an increase in random error to the capability of the measurement procedure. They thus allow for QC planning based on the risk of patient harm due to the reporting of erroneous results. 1ks rules are simple, all-around rules. Their ability to deal with increases in within-run imprecision is minimally affected by the possible presence of significant, stable, between-run imprecision. X̄/χ² rules perform better when the number of controls analyzed during each QC event is increased to improve QC performance. Using nomograms simplifies the selection of statistical QC procedures to limit the number of erroneous patient results reported due to an increase in analytical random error. The selection largely depends on the presence or absence of stable between-run imprecision. © 2017 American Association for Clinical Chemistry.
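A small worked example of the underlying risk logic, under a Gaussian model with zero bias (an assumption for illustration, not the authors' exact model): when within-run random error inflates by a factor k, the expected fraction of reported results exceeding the allowable total error grows sharply.

```python
from math import erf, sqrt

def frac_unacceptable(tea_sd_units, k):
    """Fraction of results outside +/- TEa when the analytical SD is
    inflated by a factor k (standard Gaussian model, bias assumed zero)."""
    z = tea_sd_units / k
    return 1.0 - erf(z / sqrt(2.0))

# A procedure with TEa = 4 SD at stable performance, then with the
# random error doubled and tripled by an out-of-control condition.
for k in (1.0, 2.0, 3.0):
    print(f"SD x{k:.0f}: {frac_unacceptable(4.0, k):.3%} unacceptable results")
```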
7 CFR 275.13 - Review of negative cases.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 7 Agriculture 4 2011-01-01 2011-01-01 false Review of negative cases. 275.13 Section 275.13... AGRICULTURE FOOD STAMP AND FOOD DISTRIBUTION PROGRAM PERFORMANCE REPORTING SYSTEM Quality Control (QC) Reviews § 275.13 Review of negative cases. (a) General. A sample of actions to deny applications, or suspend or...
2004-07-01
sampler, project manager, data reviewer, statistician, risk assessor, assessment personnel, and laboratory QC manager. In addition, a complete copy of...sample • Corrective actions to be taken if the QC sample fails these criteria • A description of how the QC data and results are to be documented and...Intergovernmental Data Quality Task Force Uniform Federal Policy for Quality Assurance Project Plans Evaluating, Assessing, and Documenting
40 CFR 98.434 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 40 Protection of Environment 22 2012-07-01 2012-07-01 false Monitoring and QA/QC requirements. 98.434 Section 98.434 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Importers and Exporters of Fluorinated Greenhouse Gases...
Embankment quality and assessment of moisture control implementation.
DOT National Transportation Integrated Search
2016-02-01
A specification for contractor moisture quality control (QC) in roadway embankment construction has been in use for approximately 10 years in Iowa on about 190 projects. The use of this QC specification and the development of the soils certificatio...
Southam, Lorraine; Panoutsopoulou, Kalliope; Rayner, N William; Chapman, Kay; Durrant, Caroline; Ferreira, Teresa; Arden, Nigel; Carr, Andrew; Deloukas, Panos; Doherty, Michael; Loughlin, John; McCaskie, Andrew; Ollier, William E R; Ralston, Stuart; Spector, Timothy D; Valdes, Ana M; Wallis, Gillian A; Wilkinson, J Mark; Marchini, Jonathan; Zeggini, Eleftheria
2011-05-01
Imputation is an extremely valuable tool in conducting and synthesising genome-wide association studies (GWASs). Directly typed SNP quality control (QC) is thought to affect imputation quality. It is, therefore, common practice to use quality-controlled (QCed) data as an input for imputing genotypes. This study aims to determine the effect of commonly applied QC steps on imputation outcomes. We performed several iterations of imputing SNPs across chromosome 22 in a dataset consisting of 3177 samples with Illumina 610k (Illumina, San Diego, CA, USA) GWAS data, applying different QC steps each time. The imputed genotypes were compared with the directly typed genotypes. In addition, we investigated the correlation between alternatively QCed data. We also applied a series of post-imputation QC steps, balancing elimination of poorly imputed SNPs against information loss. We found that the difference in imputation outcome between the unQCed data and the fully QCed data was minimal. Our study shows that imputation of common variants is generally very accurate and robust to GWAS QC, which is not a major factor affecting imputation outcome. A minority of common-frequency SNPs with particular properties cannot be accurately imputed regardless of QC stringency. These findings may not generalise to the imputation of low-frequency and rare variants.
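A minimal sketch of the comparison at the heart of such a study: concordance between directly typed genotypes and hard-called imputed dosages. The data and the simple rounding rule below are illustrative assumptions, not the authors' pipeline.

```python
import numpy as np

def concordance(typed, imputed_dosage):
    """Fraction of SNP genotypes where the best-guess imputed call
    (rounded allele dosage) matches the directly typed 0/1/2 call."""
    best_guess = np.rint(imputed_dosage).astype(int)
    return float((best_guess == typed).mean())

rng = np.random.default_rng(3)
typed = rng.integers(0, 3, 1000)                          # 0/1/2 genotype calls
dosage = np.clip(typed + rng.normal(0, 0.2, 1000), 0, 2)  # near-accurate imputation
print(f"concordance: {concordance(typed, dosage):.3f}")
```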
The April 1994 and October 1994 radon intercomparisons at EML
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fisenne, I.M.; George, A.C.; Perry, P.M.
1995-10-01
Quality assurance/quality control (QA/QC) are the backbone of many commercial and research processes and programs. QA/QC research tests the state of a functioning system, be it the production of manufactured goods or the ability to make accurate and precise measurements. The quality of radon measurements in the US has been tested under controlled conditions in semi-annual radon gas intercomparison exercises sponsored by the Environmental Measurements Laboratory (EML) since 1981. The two Calendar Year 1994 radon gas intercomparison exercises were conducted in the EML exposure chamber. Thirty-two groups including US Federal facilities, USDOE contractors, national and state laboratories, universities and foreign institutions participated in these exercises. The majority of the participants' results were within ±10% of the EML value at radon concentrations of 570 and 945 Bq m⁻³.
Betsou, Fay; Bulla, Alexandre; Cho, Sang Yun; Clements, Judith; Chuaqui, Rodrigo; Coppola, Domenico; De Souza, Yvonne; De Wilde, Annemieke; Grizzle, William; Guadagni, Fiorella; Gunter, Elaine; Heil, Stacey; Hodgkinson, Verity; Kessler, Joseph; Kiehntopf, Michael; Kim, Hee Sung; Koppandi, Iren; Shea, Katheryn; Singh, Rajeev; Sobel, Marc; Somiari, Stella; Spyropoulos, Demetri; Stone, Mars; Tybring, Gunnel; Valyi-Nagy, Klara; Van den Eynden, Gert; Wadhwa, Lalita
2016-10-01
This technical report presents quality control (QC) assays that can be performed in order to qualify clinical biospecimens that have been biobanked for use in research. Some QC assays are specific to a disease area. Some QC assays are specific to a particular downstream analytical platform. When such a qualification is not possible, QC assays are presented that can be performed to stratify clinical biospecimens according to their biomolecular quality.
Jackson, David; Bramwell, David
2013-12-16
Proteomics technologies can be effective for the discovery and assay of protein forms altered with disease. However, few examples of successful biomarker discovery yet exist. Critical to addressing this is the widespread implementation of appropriate QC (quality control) methodology. Such QC should combine the rigour of clinical laboratory assays with a suitable treatment of the complexity of the proteome by targeting separate assignable causes of variation. We demonstrate an approach, metric and example workflow for users to develop such targeted QC rules systematically and objectively, using a publicly available plasma DIGE data set. Hierarchical clustering analysis of standard channels is first used to discover correlated groups of features corresponding to specific assignable sources of technical variation. These effects are then quantified using a statistical distance metric and followed on control charts. This allows measurement of process drift and the detection of runs that are outliers for any given effect. A known technical issue on originally rejected gels was detected, validating this approach, and relevant novel effects were also detected and classified effectively. Our approach was effective for 2-DE QC. Whilst we demonstrated this in a retrospective DIGE experiment, the principles would apply to ongoing QC and other proteomic technologies. This work asserts that properly carried out QC is essential to proteomics discovery experiments. Its significance is that it provides one possible novel framework for applying such methods, with a particular consideration of how to handle the complexity of the proteome. It not only focusses on 2-DE-based methodology but also demonstrates general principles. A combination of results and discussion based upon a publicly available data set is used to illustrate the approach and allows a structured discussion of factors that experimenters may wish to bear in mind in other situations. The demonstration is on retrospective data only for reasons of scope, but the principles applied are also important for ongoing QC, and this work serves as a step towards a later demonstration of that application. This article is part of a Special Issue entitled: Standardization and Quality Control in Proteomics. © 2013.
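The paper's exact distance metric is not reproduced here; as one plausible sketch of the "statistical distance followed on control charts" idea, the code below scores each run's feature-group vector by Mahalanobis distance from in-control training runs and flags runs beyond an empirical control limit. All data and the limit choice are illustrative.

```python
import numpy as np

def mahalanobis_chart(runs, train_idx):
    """Score each run by its Mahalanobis distance from the distribution of
    in-control training runs. runs: (n_runs, n_features) matrix of, e.g.,
    per-run means of one clustered group of standard-channel features."""
    train = runs[train_idx]
    mu = train.mean(axis=0)
    cov = np.cov(train, rowvar=False) + 1e-6 * np.eye(runs.shape[1])  # regularized
    inv = np.linalg.inv(cov)
    d = runs - mu
    return np.sqrt(np.einsum("ij,jk,ik->i", d, inv, d))

rng = np.random.default_rng(5)
runs = rng.normal(0, 1, (30, 4))
runs[27] += 4.0                            # simulate one technically deviant run
scores = mahalanobis_chart(runs, train_idx=np.arange(20))
limit = np.percentile(scores[:20], 99)     # empirical control limit (placeholder)
print("out-of-control runs:", np.where(scores > limit)[0])
```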
Mora, Patricia; Faulkner, Keith; Mahmoud, Ahmed M; Gershan, Vesna; Kausik, Aruna; Zdesar, Urban; Brandan, María-Ester; Kurt, Serap; Davidović, Jasna; Salama, Dina H; Aribal, Erkin; Odio, Clara; Chaturvedi, Arvind K; Sabih, Zahida; Vujnović, Saša; Paez, Diana; Delis, Harry
2018-04-01
The International Atomic Energy Agency (IAEA), through a Coordinated Research Project on "Enhancing Capacity for Early Detection and Diagnosis of Breast Cancer through Imaging", brought together a group of mammography radiologists, medical physicists, and radiographers to investigate current practices and improve procedures for the early detection of breast cancer by strengthening both the clinical and medical physics components. This paper addresses the medical physics component. The countries that participated in the CRP were Bosnia and Herzegovina, Costa Rica, Egypt, India, Kenya, the Frmr. Yug. Rep. of Macedonia, Mexico, Nigeria, Pakistan, Philippines, Slovenia, Turkey, Uganda, United Kingdom and Zambia. Ten institutions participated, applying IAEA quality control protocols to 9 digital and 3 analogue mammography systems. A spreadsheet for data collection was generated and distributed. Evaluation of image quality was done using the TOR MAX and DMAM2 Gold phantoms. QC tests on analogue equipment showed satisfactory results. QC tests performed on digital systems showed that improvements needed to be implemented, especially in thickness accuracy, signal-difference-to-noise ratio (SDNR) values at achievable levels, uniformity, and modulation transfer function (MTF). Mean glandular dose (MGD) was below internationally recommended levels for patient radiation protection. Evaluation of image quality by phantoms also indicated the need for improvement. Common activities facilitated improvement in mammography practice: medical physicists were trained in QC programs, infrastructure was improved and strengthened, and networking among medical physicists and radiologists took place and was maintained over time. IAEA QC protocols provided a uniform approach to QC measurements. Copyright © 2018 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
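As a small illustration of one of the measured quantities, SDNR can be estimated from a signal ROI and a background ROI; the image and ROI coordinates below are synthetic stand-ins, not IAEA protocol values.

```python
import numpy as np

def sdnr(image, signal_roi, background_roi):
    """Signal-difference-to-noise ratio from two ROIs, each given as a
    (row_slice, col_slice) index tuple."""
    sig, bg = image[signal_roi], image[background_roi]
    return abs(sig.mean() - bg.mean()) / bg.std(ddof=1)

rng = np.random.default_rng(2)
img = rng.normal(100.0, 5.0, (300, 300))      # synthetic uniform background
img[100:140, 100:140] += 12.0                 # simulated contrast insert
signal = (slice(105, 135), slice(105, 135))
background = (slice(10, 60), slice(10, 60))
print(f"SDNR = {sdnr(img, signal, background):.2f}")   # ~2.4 here
```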
SU-E-T-216: TPS QC Supporting Program by a Third-Party Evaluation Agency in Japan.
Fukata, K; Minemura, T; Kurokawa, C; Miyagishi, T; Itami, J
2012-06-01
To equalize the quality of radiation therapy in Japan by supporting quality control of radiation treatment planning systems. The Center for Cancer Control and Information Services in the National Cancer Center supports the QA-QC of the cancer core hospitals in Japan as a third-party evaluation agency. Recently, a program for assessing the quality of treatment planning systems (TPS) began as a part of our QA-QC supporting activities. In this program, a questionnaire about TPS was sent to 45 prefectural cancer core hospitals in Japan. The object of this questionnaire is to assess the proper commissioning, implementation, and application of TPSs. The contents of the questionnaire are as follows: 1) calculate MUs which deliver 1000 cGy to the point at SSD = 100 cm, 10 cm depth, with field sizes ranging from 5×5 to 30×30 cm², and obtain doses at several depths for the calculated MUs; 2) calculate MUs which deliver 1000 cGy to the point at SSD = 100 cm, 10 cm depth, for wedge fields whose angles are from 15 to 60 degrees, and obtain doses at several depths with the MUs; 3) calculate the MU which delivers 1000 cGy to the point at STD = 100 cm, 10 cm depth, with a 10×10 cm² field size, and obtain doses at several depths with the MU. In this program, 179 beam data sets from 44 facilities were collected. Data were compared in terms of dose per MU, output factor, wedge factor and TMR. It was found that 90% of the data agreed within 2%. The quality of the treatment planning systems was investigated through the questionnaire, including essential beam data. We compared 179 beam data sets in TPSs sent from 44 facilities, and 90% of the data showed good agreement. © 2012 American Association of Physicists in Medicine.
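The MU questions in the questionnaire reduce to the classic hand-calculation check MU = D / (dose per MU at reference × output factor × wedge factor × TMR). A minimal sketch with placeholder beam data follows; the numeric factors are illustrative, not values from the survey.

```python
def monitor_units(dose_cGy, dose_per_mu_ref=1.0, output_factor=1.0,
                  wedge_factor=1.0, tmr=1.0):
    """Hand-calculation check: MU needed to deliver dose_cGy at the
    calculation point, given reference output (cGy/MU) and correction factors."""
    return dose_cGy / (dose_per_mu_ref * output_factor * wedge_factor * tmr)

# 1000 cGy at 10 cm depth, 10x10 field, isocentric (STD = 100 cm) setup.
# Beam data below are placeholders, not measured values from the program.
mu = monitor_units(1000.0, dose_per_mu_ref=1.0, output_factor=1.0,
                   wedge_factor=1.0, tmr=0.78)
print(f"{mu:.0f} MU")   # ~1282 MU with these illustrative numbers
```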
Robust modular product family design
NASA Astrophysics Data System (ADS)
Jiang, Lan; Allada, Venkat
2001-10-01
This paper presents a modified Taguchi methodology to improve the robustness of modular product families against changes in customer requirements. The general research questions posed in this paper are: (1) How can a product family (PF) be designed effectively so that it is robust enough to accommodate future customer requirements? (2) How far into the future should designers look to design a robust product family? An example of a simplified vacuum product family is used to illustrate our methodology. In the example, customer requirements are selected as signal factors; future changes of customer requirements are selected as noise factors; an index called the quality characteristic (QC) is defined to evaluate the vacuum product family; and the module instance matrix (M) is selected as the control factor. Initially a relation between the objective function (QC) and the control factor (M) is established, and then the feasible M space is systematically explored using a simplex method to determine the optimum M and the corresponding QC values. Next, various noise levels at different time points are introduced into the system. For each noise level, the optimal values of M and QC are computed and plotted on a QC-chart. The tunable time period of the control factor (the module matrix, M) is computed using the QC-chart. The tunable time period represents the maximum time for which a given control factor can be used to satisfy current and future customer needs. Finally, a robustness index is used to break up the tunable time period into suitable time periods that designers should consider while designing product families.
Introducing Quality Control in the Chemistry Teaching Laboratory Using Control Charts
ERIC Educational Resources Information Center
Schazmann, Benjamin; Regan, Fiona; Ross, Mary; Diamond, Dermot; Paull, Brett
2009-01-01
Quality control (QC) measures are less prevalent in teaching laboratories than commercial settings possibly owing to a lack of commercial incentives or teaching resources. This article focuses on the use of QC assessment in the analytical techniques of high performance liquid chromatography (HPLC) and ultraviolet-visible spectroscopy (UV-vis) at…
The Development of Quality Control Genotyping Approaches: A Case Study Using Elite Maize Lines.
Chen, Jiafa; Zavala, Cristian; Ortega, Noemi; Petroli, Cesar; Franco, Jorge; Burgueño, Juan; Costich, Denise E; Hearne, Sarah J
2016-01-01
Quality control (QC) of germplasm identity and purity is a critical component of breeding and conservation activities. SNP genotyping technologies and increased availability of markers provide the opportunity to employ genotyping as a low-cost and robust component of this QC. In the public sector, available low-cost SNP QC genotyping methods have been developed from very limited panels of 1,000 to 1,500 markers, without broad selection of the most informative SNPs. Selection of optimal SNPs, definition of appropriate germplasm sampling, and platform selection impact logistical and resource-use considerations for breeding and conservation applications when mainstreaming QC. In order to address these issues, we evaluated the selection and use of SNPs for QC applications from large DArTSeq data sets generated from CIMMYT maize inbred lines (CMLs). Two QC genotyping strategies were developed: the first is a "rapid QC", employing a small number of SNPs to identify potential mislabeling of seed packages or plots; the second is a "broad QC", employing a larger number of SNPs to identify each germplasm entry and to measure heterogeneity. The optimal marker selection strategies combined selection of markers with high minor allele frequency, sampling of clustered SNPs in proportion to marker cluster distance, and selection of markers that maintain a uniform genomic distribution. The rapid and broad QC SNP panels selected using this approach were further validated using blind test assessments of related re-generation samples. The influence of sampling within each line was evaluated. Sampling 192 individuals would result in close to 100% probability of detecting a 5% contamination in the entry, and approximately a 98% probability of detecting a 2% contamination of the line. These results provide a framework for the establishment of QC genotyping. A comparison of financial and time costs for use of these approaches across different platforms is discussed, providing a framework for institutions involved in maize conservation and breeding to assess the resource-use effectiveness of QC genotyping. Application of these research findings, in combination with existing QC approaches, will ensure the regeneration, distribution, and use in breeding of true-to-type inbred germplasm. These findings also provide an effective approach to optimize SNP selection for QC genotyping in other species.
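The sampling claim follows from a simple binomial argument: the chance that a random sample of n individuals contains at least one off-type at contamination rate p is 1 - (1 - p)^n. A quick check reproduces the quoted figures:

```python
def detection_probability(n_sampled, contamination_rate):
    """Probability that at least one contaminant individual appears in a
    random sample of n plants from a line with the given off-type rate."""
    return 1.0 - (1.0 - contamination_rate) ** n_sampled

for rate in (0.05, 0.02):
    print(f"{rate:.0%} contamination, n=192: "
          f"{detection_probability(192, rate):.4f}")
# 5% -> ~1.0000 ("close to 100%"); 2% -> ~0.9793 ("approximately 98%")
```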
Cho, Min-Chul; Kim, So Young; Jeong, Tae-Dong; Lee, Woochang; Chun, Sail; Min, Won-Ki
2014-11-01
Verification of a new reagent lot's suitability is necessary to ensure that results for patients' samples are consistent before and after reagent lot changes. A typical procedure is to measure results of some patients' samples along with quality control (QC) materials. In this study, the results of patients' samples and QC materials across reagent lot changes were analysed. In addition, an approach to adjusting the QC target range upon reagent lot changes was proposed. Patients' sample and QC material results from 360 reagent lot change events involving 61 analytes and eight instrument platforms were analysed. The between-lot differences for the patients' samples (ΔP) and the QC materials (ΔQC) were tested by Mann-Whitney U tests. The size of the between-lot differences in the QC data was calculated as multiples of the standard deviation (SD). The ΔP and ΔQC values differed significantly in only 7.8% of the reagent lot change events. This frequency was not affected by the assay principle or the QC material source. One SD was proposed as the cutoff for maintaining the pre-existing target range after a reagent lot change. While non-commutable QC material results were infrequent in the present study, our data confirmed that QC materials have limited usefulness when assessing new reagent lots. Also, a 1 SD standard for establishing a new QC target range after a reagent lot change event was proposed. © The Author(s) 2014.
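A simplified sketch of this kind of lot-change check (not the authors' exact ΔP-versus-ΔQC analysis): compare results from the two lots with a Mann-Whitney U test and express the observed QC shift in SD multiples against the proposed 1 SD cutoff. All data are synthetic.

```python
import numpy as np
from scipy.stats import mannwhitneyu

def lot_change_check(old_lot, new_lot, qc_sd, sd_cutoff=1.0):
    """Compare results measured with the old and new reagent lots, and
    express the observed shift in multiples of the QC SD against the
    proposed 1 SD cutoff for keeping the existing target range."""
    p = mannwhitneyu(old_lot, new_lot, alternative="two-sided").pvalue
    shift_sd = abs(np.mean(new_lot) - np.mean(old_lot)) / qc_sd
    return p, shift_sd, shift_sd > sd_cutoff

rng = np.random.default_rng(11)
old = rng.normal(5.00, 0.2, 20)     # synthetic results, old lot
new = rng.normal(5.15, 0.2, 20)     # new lot reading slightly higher
print(lot_change_check(old, new, qc_sd=0.2))
```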
Quality control in urodynamics and the role of software support in the QC procedure.
Hogan, S; Jarvis, P; Gammie, A; Abrams, P
2011-11-01
This article aims to identify quality control (QC) best practice, to review published QC audits in order to identify how closely good practice is followed, and to carry out a market survey of the software features that support QC offered by urodynamics machines available in the UK. All UK distributors of urodynamic systems were contacted and asked to provide information on the software features relating to data quality of the products they supply. The results of the market survey show that the features offered by manufacturers differ greatly. Automated features, which can be turned off in most cases, include: cough recognition, detrusor contraction detection, and high pressure alerts. There are currently no systems that assess data quality based on published guidelines. A literature review of current QC guidelines for urodynamics was carried out; QC audits were included in the literature review to see how closely guidelines were being followed. This review highlights the fact that basic QC is not being carried out effectively by urodynamicists. Based on the software features currently available and the results of the literature review there is both the need and capacity for a greater degree of automation in relation to urodynamic data quality and accuracy assessment. Some progress has been made in this area and certain manufacturers have already developed automated cough detection. Copyright © 2011 Wiley Periodicals, Inc.
76 FR 51274 - Supplemental Nutrition Assistance Program: Major System Failures
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-18
... data mining as necessary to determine if losses are occurring in the process of issuing benefits. It is... further by using data mining techniques on States' data or analyzing QC data for error patterns that may... conjunction with an additional sample of cases. Data mining techniques may be employed when QC data cannot...
Quality control management and communication between radiologists and technologists.
Nagy, Paul G; Pierce, Benjamin; Otto, Misty; Safdar, Nabile M
2008-06-01
The greatest barrier to quality control (QC) in the digital imaging environment is the lack of communication and documentation between those who interpret images and those who acquire them. Paper-based QC methods are insufficient in a digital image management system, and problem workflow must be incorporated into reengineering efforts when migrating to a digital practice. The authors implemented a Web-based QC feedback tool to document and facilitate the communication of issues identified by radiologists. The goal was to promote a responsive and constructive tool that contributes to a culture of quality. The hypothesis was that by making it easier for radiologists to submit quality issues, the number of QC issues submitted would increase. The authors integrated their Web-based quality tracking system with a clinical picture archiving and communication system so that radiologists could report quality issues without disrupting clinical workflow. Graphical dashboarding techniques help supervisors use this database to identify the root causes of different types of issues. Over the initial 12-month rollout period, starting in the general section, the authors recorded a 20-fold increase in QC issues submitted by radiologists, accompanied by a rise in technologists' responsiveness to QC issues. For technologists with high numbers of QC issues, the incorporation of data from this tracking system proved useful in performance appraisals and in driving individual improvement. This tool is an example of the types of information technology innovations that can be leveraged to support QC in the digital imaging environment. Initial data suggest that the result is not only an improvement in quality but also higher satisfaction for both radiologists and technologists.
Pichler, Peter; Mazanek, Michael; Dusberger, Frederico; Weilnböck, Lisa; Huber, Christian G; Stingl, Christoph; Luider, Theo M; Straube, Werner L; Köcher, Thomas; Mechtler, Karl
2012-11-02
While the performance of liquid chromatography (LC) and mass spectrometry (MS) instrumentation continues to increase, applications such as analyses of complete or near-complete proteomes and quantitative studies require constant and optimal system performance. For this reason, research laboratories and core facilities alike are recommended to implement quality control (QC) measures as part of their routine workflows. Many laboratories perform sporadic quality control checks. However, successive and systematic longitudinal monitoring of system performance would be facilitated by dedicated automatic or semiautomatic software solutions that aid an effortless analysis and display of QC metrics over time. We present the software package SIMPATIQCO (SIMPle AuTomatIc Quality COntrol) designed for evaluation of data from LTQ Orbitrap, Q-Exactive, LTQ FT, and LTQ instruments. A centralized SIMPATIQCO server can process QC data from multiple instruments. The software calculates QC metrics supervising every step of data acquisition from LC and electrospray to MS. For each QC metric the software learns the range indicating adequate system performance from the uploaded data using robust statistics. Results are stored in a database and can be displayed in a comfortable manner from any computer in the laboratory via a web browser. QC data can be monitored for individual LC runs as well as plotted over time. SIMPATIQCO thus assists the longitudinal monitoring of important QC metrics such as peptide elution times, peak widths, intensities, total ion current (TIC) as well as sensitivity, and overall LC-MS system performance; in this way the software also helps identify potential problems. The SIMPATIQCO software package is available free of charge.
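The range-learning step described above can be illustrated with a small sketch using robust statistics (median and MAD), one standard way to derive an acceptable range that is insensitive to occasional bad runs; the multiplier k is an assumption, not SIMPATIQCO's actual setting.

```python
import numpy as np

def learn_qc_range(history, k=3.0):
    """Return (low, high) bounds learned robustly from historical QC data."""
    history = np.asarray(history, dtype=float)
    med = np.median(history)
    mad = np.median(np.abs(history - med))
    sigma = 1.4826 * mad  # MAD scaled to be consistent with the SD for normal data
    return med - k * sigma, med + k * sigma

# the outlying run (44.0) barely moves the learned range
low, high = learn_qc_range([31.2, 30.8, 31.5, 30.9, 31.1, 44.0, 31.0])
```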
Individualized Quality Control Plan (IQCP): Is It Value-Added for Clinical Microbiology?
Sharp, Susan E; Miller, Melissa B; Hindler, Janet
2015-12-01
The Centers for Medicare and Medicaid Services (CMS) recently published their Individualized Quality Control Plan (IQCP [https://www.cms.gov/regulations-and-guidance/legislation/CLIA/Individualized_Quality_Control_Plan_IQCP.html]), which will be the only option for quality control (QC) starting in January 2016 if laboratories choose not to perform Clinical Laboratory Improvement Act (CLIA) [U.S. Statutes at Large 81(1967):533] default QC. Laboratories will no longer be able to use "equivalent QC" (EQC) or the Clinical and Laboratory Standards Institute (CLSI) standards alone for quality control of their microbiology systems. The implementation of IQCP in clinical microbiology laboratories will almost certainly be an added burden, the benefits of which are currently unknown. Copyright © 2015, American Society for Microbiology. All Rights Reserved.
Shephard, Mark; Shephard, Anne; McAteer, Bridgit; Regnier, Tamika; Barancek, Kristina
2017-12-01
Diabetes is a major health problem for Australia's Aboriginal and Torres Strait Islander peoples. Point-of-care testing for haemoglobin A1c (HbA1c) has been the cornerstone of a long-standing program (QAAMS) to manage glycaemic control in Indigenous people with diabetes and, recently, to diagnose diabetes. The QAAMS quality management framework includes monthly testing of quality control (QC) and external quality assurance (EQA) samples. Key performance indicators of quality include imprecision (coefficient of variation [CV%]) and the percentage of acceptable results. This paper reports on the past 15 years of quality testing in QAAMS and examines the performance of HbA1c POC testing at the 6.5% cut-off recommended for diagnosis. The total number of HbA1c EQA results submitted from 2002 to 2016 was 29,093. The median imprecision for EQA testing by QAAMS device operators averaged 2.81% (SD 0.50; range 2.2 to 3.9%) from 2002 to 2016 and 2.44% (SD 0.22; range 2.2 to 2.9%) from 2009 to 2016. No significant difference was observed between the median imprecision achieved in QAAMS and by Australasian laboratories from 2002 to 2016 (p=0.05; two-tailed paired t-test) and from 2009 to 2016 (p=0.17; two-tailed paired t-test). For QC testing from 2009 to 2016, imprecision averaged 2.5% and 3.0% for the two levels of QC tested. The percentage of acceptable results averaged 90% for EQA testing from 2002 to 2016 and 96% for QC testing from 2009 to 2016. The DCA Vantage was able to measure a patient and an EQA sample with an HbA1c value close to 6.5% both accurately and precisely. HbA1c POC testing in QAAMS has remained analytically sound, matched the quality achieved by Australasian laboratories, and met profession-derived analytical goals for 15 years. Copyright © 2017 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
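The two key performance indicators used above, imprecision (CV%) and the percentage of acceptable results, are simple to compute from raw results; a minimal sketch with an illustrative acceptance tolerance follows.

```python
import numpy as np

def imprecision_cv(results):
    """Coefficient of variation: CV% = 100 * SD / mean."""
    results = np.asarray(results, dtype=float)
    return 100.0 * results.std(ddof=1) / results.mean()

def percent_acceptable(results, target, tolerance_pct=10.0):
    """Share of results within an allowable deviation from the target value."""
    results = np.asarray(results, dtype=float)
    ok = np.abs(results - target) <= (tolerance_pct / 100.0) * target
    return 100.0 * ok.mean()
```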
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oshiro, T; Donaghy, M; Slechta, A
Purpose: To determine if the flipped class format has an effect on examination results for a radiologic technologist (RT) program and to discuss the benefits of creating video resources. Methods: From 2001 to 2015, students took both a radiological physics and a quality control (QC) class as part of their didactic training. In 2005/2006, the creation of videos of didactic lectures and QC test demonstrations allowed for a flip in which content was studied at home while exercises and reviews were done in class. Final examinations were retrospectively reviewed from this timeframe. 12 multiple choice physics questions (MCP) and 5 short answer QC questions (SAQC) were common to pre- and post-flip exams. The RT program's ARRT exam scores were also obtained and compared to national averages. Results: In total, 36 lecture videos and 65 quality control videos were created for the flipped content. The data totalled ∼2.4 GB and were distributed to students via USB or CD media. For MCP questions, scores improved by 7.9% with the flipped format, and significance (Student's t-test, p<0.05) was found for 3 of the 12 questions. SAQC questions showed improvement of 14.6%, and significance was found for 2 of the 5 questions. Student enrollment increased from ∼14 (2001–2004) to ∼23 students (2005–15). Content was continuously added post-flip due to the efficiency of delivery. The QC class in 2003 covered 45 test setups in class, while 65 were covered with video segments in 2014. Flipped materials are currently being repurposed: in 2015, this video content was restructured into an ARRT exam review guide, and in 2016, the content was reorganized for fluoroscopy training for physicians. Conclusion: We believe that flipped classes can improve the efficiency of content delivery and improve student performance even with an increase in class size. This format allows for flexibility in learning as well as re-use in multiple applications.
Design, implementation, and quality control in the Pathways American-Indian multicenter trial
Stone, Elaine J.; Norman, James E.; Davis, Sally M.; Stewart, Dawn; Clay, Theresa E.; Caballero, Ben; Lohman, Timothy G.; Murray, David M.
2016-01-01
Background Pathways was the first multicenter American-Indian school-based study to test the effectiveness of an obesity prevention program promoting healthy eating and physical activity. Methods Pathways employed a nested cohort design in which 41 schools were randomized to intervention or control conditions and students within these schools were followed as a cohort (1,704 third graders at baseline). The study’s primary endpoint was percent body fat. Secondary endpoints were levels of fat in school lunches; time spent in physical activity; and knowledge, attitudes, and behaviors regarding diet and exercise. Quality control (QC) included design of data management systems which provided standardization and quality assurance of data collection and processing. Data QC procedures at study centers included manuals of operation, training and certification, and monitoring of performance. Process evaluation was conducted to monitor dose and fidelity of the interventions. Registration and tracking systems were used for students and schools. Results No difference in mean percent body fat at fifth grade was found between the intervention and control schools. Percent of calories from fat and saturated fat in school lunches was significantly reduced in the intervention schools as was total energy intake from 24-hour recalls. Significant increases in self-reported physical activity levels and knowledge of healthy behaviors were found for the intervention school students. Conclusions The Pathways study results provide evidence demonstrating the role schools can play in public health promotion. Its study design and QC systems and procedures provide useful models for other similar school based multi- or single-site studies. PMID:14636805
PHABULOSA Controls the Quiescent Center-Independent Root Meristem Activities in Arabidopsis thaliana
Sebastian, Jose; Ryu, Kook Hui; Zhou, Jing; Tarkowská, Danuše; Tarkowski, Petr; Cho, Young-Hee; Yoo, Sang-Dong; Kim, Eun-Sol; Lee, Ji-Young
2015-01-01
Plant growth depends on stem cell niches in meristems. In the root apical meristem, the quiescent center (QC) cells form a niche together with the surrounding stem cells. Stem cells produce daughter cells that are displaced into a transit-amplifying (TA) domain of the root meristem. TA cells divide several times to provide cells for growth. SHORTROOT (SHR) and SCARECROW (SCR) are key regulators of the stem cell niche. Cytokinin controls TA cell activities in a dose-dependent manner. Although the regulatory programs in each compartment of the root meristem have been identified, it is still unclear how they coordinate one another. Here, we investigate how PHABULOSA (PHB), under the posttranscriptional control of SHR and SCR, regulates TA cell activities. The root meristem and growth defects in shr or scr mutants were significantly recovered in the shr phb or scr phb double mutant, respectively. This rescue in root growth occurs in the absence of a QC. Conversely, when the modified PHB, which is highly resistant to microRNA, was expressed throughout the stele of the wild-type root meristem, root growth became very similar to that observed in the shr; however, the identity of the QC was unaffected. Interestingly, a moderate increase in PHB resulted in a root meristem phenotype similar to that observed following the application of high levels of cytokinin. Our protoplast assay and transgenic approach using ARR10 suggest that the depletion of TA cells by high PHB in the stele occurs via the repression of B-ARR activities. This regulatory mechanism seems to help to maintain the cytokinin homeostasis in the meristem. Taken together, our study suggests that PHB can dynamically regulate TA cell activities in a QC-independent manner, and that the SHR-PHB pathway enables a robust root growth system by coordinating the stem cell niche and TA domain. PMID:25730098
Operational quality control of daily precipitation using spatio-climatological consistency testing
NASA Astrophysics Data System (ADS)
Scherrer, S. C.; Croci-Maspoli, M.; van Geijtenbeek, D.; Naguel, C.; Appenzeller, C.
2010-09-01
Quality control (QC) of meteorological data is of utmost importance for climate-related decisions. The search for an effective automated QC of precipitation data has proven difficult, and many weather services, including MeteoSwiss, still rely mainly on manual inspection of daily precipitation. However, manpower limitations force many weather services to move towards less labour-intensive and more automated QC, with the challenge of keeping data quality high. In the last decade, several approaches have been presented to objectify daily precipitation QC. Here we present a spatio-climatological approach that will be implemented operationally at MeteoSwiss. It combines the information from the event-based spatial distribution of each day's precipitation field and the historical information on the interpolation error for different precipitation intensity intervals. Expert judgement shows that the system detects potential outliers very well (hardly any missed errors) without creating too many false alarms that need human inspection. 50-80% of all flagged values were classified as real errors by the data editor, much better than the roughly 15-20% achieved using standard spatial regression tests. The automatic redistribution of accumulated multi-day sums is also very helpful in the QC process. Manual inspection in operations can be reduced and the QC of precipitation substantially objectified.
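The spatio-climatological idea can be sketched as follows: interpolate each station's daily total from its neighbours and flag the observation when the residual exceeds the historical interpolation error for that intensity class. The interpolation scheme, class boundaries, and threshold factor below are simplifying assumptions.

```python
import numpy as np

def flag_precipitation(value, neighbour_values, neighbour_weights, error_by_class, k=3.0):
    """Return True if the daily precipitation observation is a potential outlier."""
    interpolated = np.average(neighbour_values, weights=neighbour_weights)
    # pick the historical interpolation-error SD for this intensity interval
    if interpolated < 1.0:
        sigma = error_by_class["dry"]
    elif interpolated < 10.0:
        sigma = error_by_class["light"]
    else:
        sigma = error_by_class["heavy"]
    return abs(value - interpolated) > k * sigma

suspect = flag_precipitation(
    value=58.0,                                   # mm, reported by the station
    neighbour_values=[4.2, 6.1, 3.8, 5.0],        # mm, surrounding stations
    neighbour_weights=[0.4, 0.3, 0.2, 0.1],       # e.g. inverse-distance weights
    error_by_class={"dry": 0.8, "light": 2.5, "heavy": 8.0},
)
```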
The Quality Control Circle: Is It for Education?
ERIC Educational Resources Information Center
Land, Arthur J.
From its start in Japan after World War II, the Quality Control Circle (Q.C.) approach to management and organizational operation evolved into what it is today: people doing similar work meeting regularly to identify, objectively analyze, and develop solutions to problems. The Q.C. approach meets Maslow's theory of motivation by inviting…
qcML: An Exchange Format for Quality Control Metrics from Mass Spectrometry Experiments
Walzer, Mathias; Pernas, Lucia Espona; Nasso, Sara; Bittremieux, Wout; Nahnsen, Sven; Kelchtermans, Pieter; Pichler, Peter; van den Toorn, Henk W. P.; Staes, An; Vandenbussche, Jonathan; Mazanek, Michael; Taus, Thomas; Scheltema, Richard A.; Kelstrup, Christian D.; Gatto, Laurent; van Breukelen, Bas; Aiche, Stephan; Valkenborg, Dirk; Laukens, Kris; Lilley, Kathryn S.; Olsen, Jesper V.; Heck, Albert J. R.; Mechtler, Karl; Aebersold, Ruedi; Gevaert, Kris; Vizcaíno, Juan Antonio; Hermjakob, Henning; Kohlbacher, Oliver; Martens, Lennart
2014-01-01
Quality control is increasingly recognized as a crucial aspect of mass spectrometry based proteomics. Several recent papers discuss relevant parameters for quality control and present applications to extract these from the instrumental raw data. What has been missing, however, is a standard data exchange format for reporting these performance metrics. We therefore developed the qcML format, an XML-based standard that follows the design principles of the related mzML, mzIdentML, mzQuantML, and TraML standards from the HUPO-PSI (Proteomics Standards Initiative). In addition to the XML format, we also provide tools for the calculation of a wide range of quality metrics as well as a database format and interconversion tools, so that existing LIMS systems can easily add relational storage of the quality control data to their existing schema. We here describe the qcML specification, along with possible use cases and an illustrative example of the subsequent analysis possibilities. All information about qcML is available at http://code.google.com/p/qcml. PMID:24760958
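For orientation, a qcML-like document can be emitted with a few lines of standard-library Python; the element and attribute names below are illustrative and should be checked against the published qcML schema and QC controlled vocabulary.

```python
import xml.etree.ElementTree as ET

# build a minimal qcML-like document for one run
qcml = ET.Element("qcML", version="0.0.8")
run = ET.SubElement(qcml, "runQuality", ID="run_1")
ET.SubElement(
    run, "qualityParameter",
    name="MS1 spectra count",   # illustrative metric name
    ID="qp_1", value="10240",
    cvRef="QC", accession="QC:0000006",  # accession assumed for illustration
)
ET.ElementTree(qcml).write("example.qcML", xml_declaration=True, encoding="utf-8")
```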
Quality Control of Meteorological Observations
NASA Technical Reports Server (NTRS)
Collins, William; Dee, Dick; Rukhovets, Leonid
1999-01-01
The problem of meteorological observation quality control (QC) was first formulated by L.S. Gandin at the Main Geophysical Observatory in the 1970s. Later, in 1988, L.S. Gandin began adapting his ideas on complex quality control (CQC) to the operational environment at the National Centers for Environmental Prediction. The CQC was first applied by L.S. Gandin and his colleagues to the detection and correction of errors in rawinsonde heights and temperatures using a complex of hydrostatic residuals. Later, a full complex of residuals, vertical and horizontal optimal interpolations, and baseline checks were added for the checking and correction of a wide range of meteorological variables. Some other of Gandin's ideas were applied and substantially developed at other meteorological centers. A new statistical QC was recently implemented in the Goddard Data Assimilation System. The central component of any quality control is a buddy check, which tests individual suspect observations against available nearby non-suspect observations. A novel feature of this test is that the error variances used for QC decisions are re-estimated on-line. As a result, the allowed tolerances for suspect observations can depend on local atmospheric conditions, and the system is better able to accept extreme values observed in deep cyclones, jet streams, and so on. The basic statements of this adaptive buddy check are described, and some results of the on-line QC, including moisture QC, are presented.
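The adaptive buddy check can be sketched compactly: a suspect value is compared against nearby non-suspect "buddies", with the tolerance derived from an error variance re-estimated on-line from the local data, so legitimate extremes widen the tolerance. The factor k and the variance floor are assumptions.

```python
import numpy as np

def buddy_check(suspect_value, buddy_values, k=4.0, floor_var=0.25):
    """Return True if the suspect observation passes the buddy check."""
    buddies = np.asarray(buddy_values, dtype=float)
    background = buddies.mean()
    # re-estimate the local error variance on-line from the buddies themselves,
    # so tolerances widen in active weather and tighten in quiet conditions
    local_var = max(buddies.var(ddof=1), floor_var)
    tolerance = k * np.sqrt(local_var)
    return abs(suspect_value - background) <= tolerance
```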
Proteomics Quality Control: Quality Control Software for MaxQuant Results.
Bielow, Chris; Mastrobuoni, Guido; Kempa, Stefan
2016-03-04
Mass spectrometry-based proteomics coupled to liquid chromatography has matured into an automated, high-throughput technology, producing data on the scale of multiple gigabytes per instrument per day. Consequently, automated quality control (QC) and quality analysis (QA) capable of detecting measurement bias, verifying consistency, and avoiding propagation of error is paramount for instrument operators and scientists in charge of downstream analysis. We have developed an R-based QC pipeline called Proteomics Quality Control (PTXQC) for bottom-up LC-MS data generated by the MaxQuant software pipeline. PTXQC creates a QC report containing a comprehensive and powerful set of QC metrics, augmented with automated scoring functions. The automated scores are collated to create an overview heatmap at the beginning of the report, giving valuable guidance also to nonspecialists. Our software supports a wide range of experimental designs, including stable isotope labeling by amino acids in cell culture (SILAC), tandem mass tags (TMT), and label-free data. Furthermore, we introduce new metrics to score MaxQuant's Match-between-runs (MBR) functionality, by which peptide identifications can be transferred across Raw files based on accurate retention time and m/z. Last but not least, PTXQC is easy to install and use and represents the first QC software capable of processing MaxQuant result tables. PTXQC is freely available at https://github.com/cbielow/PTXQC.
GTKDynamo: a PyMOL plug-in for QC/MM hybrid potential simulations
Bachega, José Fernando R.; Timmers, Luís Fernando S.M.; Assirati, Lucas; Bachega, Leonardo R.; Field, Martin J.; Wymore, Troy
2014-01-01
Hybrid quantum chemical (QC)/molecular mechanical (MM) potentials are very powerful tools for molecular simulation. They are especially useful for studying processes in condensed phase systems, such as chemical reactions, that involve a relatively localized change in electronic structure and where the surrounding environment contributes to these changes but can be represented with more computationally efficient functional forms. Despite their utility, however, these potentials are not always straightforward to apply since the extent of significant electronic structure changes occurring in the condensed phase process may not be intuitively obvious. To facilitate their use we have developed an open-source graphical plug-in, GTKDynamo, that links the PyMOL visualization program and the pDynamo QC/MM simulation library. This article describes the implementation of GTKDynamo and its capabilities and illustrates its application to QC/MM simulations. PMID:24137667
Analysis of CrIS/ATMS Using AIRS Version-7 Retrieval and QC Methodology
NASA Technical Reports Server (NTRS)
Susskind, Joel; Kouvaris, Louis; Blaisdell, John M.; Iredell, Lena
2017-01-01
The objective of this research is to develop and implement an algorithm to analyze a long-term data record of CrIS/ATMS observations so as to produce monthly mean gridded Level-3 products which are consistent with, and will serve as a seamless follow-on to, those of AIRS Version-7. We feel the best way to achieve this result is to analyze CrIS/ATMS data using retrieval and Quality Control (QC) methodologies which are scientifically equivalent to those used in AIRS Version-7. We developed and implemented a single retrieval program that uses as input either AIRS/AMSU or CrIS/ATMS radiance observations and has appropriate switches that take into account the spectral and radiometric differences between CrIS and AIRS. Our methodology is called CHART (Climate Heritage AIRS Retrieval Technique).
Haga, Yoshihiro; Chida, Koichi; Inaba, Yohei; Kaga, Yuji; Meguro, Taiichiro; Zuguchi, Masayuki
2016-02-01
As the use of diagnostic X-ray equipment with flat panel detectors (FPDs) has increased, so has the importance of proper management of FPD systems. To ensure quality control (QC) of FPD systems, an easy method for evaluating FPD imaging performance for both stationary and moving objects is required. Until now, simple rotatable QC phantoms have not been available for easy evaluation of FPD performance (spatial resolution and dynamic range) when imaging moving objects. We developed a QC phantom for this purpose. It consists of three thicknesses of copper and a rotatable test pattern of piano wires of various diameters. Initial tests confirmed its stable performance. Our moving phantom is very useful for QC of FPD images of moving objects because it enables easy visual evaluation of imaging performance (spatial resolution and dynamic range).
Quality control and conduct of genome-wide association meta-analyses.
Winkler, Thomas W; Day, Felix R; Croteau-Chonka, Damien C; Wood, Andrew R; Locke, Adam E; Mägi, Reedik; Ferreira, Teresa; Fall, Tove; Graff, Mariaelisa; Justice, Anne E; Luan, Jian'an; Gustafsson, Stefan; Randall, Joshua C; Vedantam, Sailaja; Workalemahu, Tsegaselassie; Kilpeläinen, Tuomas O; Scherag, André; Esko, Tonu; Kutalik, Zoltán; Heid, Iris M; Loos, Ruth J F
2014-05-01
Rigorous organization and quality control (QC) are necessary to facilitate successful genome-wide association meta-analyses (GWAMAs) of statistics aggregated across multiple genome-wide association studies. This protocol provides guidelines for (i) organizational aspects of GWAMAs, and for (ii) QC at the study file level, the meta-level across studies and the meta-analysis output level. Real-world examples highlight issues experienced and solutions developed by the GIANT Consortium that has conducted meta-analyses including data from 125 studies comprising more than 330,000 individuals. We provide a general protocol for conducting GWAMAs and carrying out QC to minimize errors and to guarantee maximum use of the data. We also include details for the use of a powerful and flexible software package called EasyQC. Precise timings will be greatly influenced by consortium size. For consortia of comparable size to the GIANT Consortium, this protocol takes a minimum of about 10 months to complete.
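One representative study-file-level check from this kind of protocol (implemented in tools such as EasyQC) is verifying that each reported p-value matches the p-value implied by the reported effect size and standard error; large deviations point to file or analysis errors. The sketch below is generic, not EasyQC's code.

```python
import numpy as np
from scipy.stats import norm

def p_z_consistency(beta, se, reported_p, tol_log10=0.5):
    """Flag rows whose reported p-value disagrees with the p implied by beta/se."""
    z = np.asarray(beta, dtype=float) / np.asarray(se, dtype=float)
    expected_p = np.clip(2.0 * norm.sf(np.abs(z)), 1e-300, 1.0)
    reported_p = np.clip(np.asarray(reported_p, dtype=float), 1e-300, 1.0)
    # deviation on the log10 scale; the tolerance is an illustrative choice
    return np.abs(np.log10(expected_p) - np.log10(reported_p)) > tol_log10
```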
AutoLock: a semiautomated system for radiotherapy treatment plan quality control.
Dewhurst, Joseph M; Lowe, Matthew; Hardy, Mark J; Boylan, Christopher J; Whitehurst, Philip; Rowbottom, Carl G
2015-05-08
A semiautomated system for radiotherapy treatment plan quality control (QC), named AutoLock, is presented. AutoLock is designed to augment treatment plan QC by automatically checking aspects of treatment plans that are well suited to computational evaluation, whilst summarizing more subjective aspects in the form of a checklist. The treatment plan must pass all automated checks and all checklist items must be acknowledged by the planner as correct before the plan is finalized. Thus AutoLock uniquely integrates automated treatment plan QC, an electronic checklist, and plan finalization. In addition to reducing the potential for the propagation of errors, the integration of AutoLock into the plan finalization workflow has improved efficiency at our center. Detailed audit data are presented, demonstrating that the treatment plan QC rejection rate fell by around a third following the clinical introduction of AutoLock.
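The plan-finalization logic described above reduces to a simple gate: every automated check must pass and every checklist item must be acknowledged before the plan can be locked. A minimal sketch, with illustrative check names, follows.

```python
def can_finalize(plan, automated_checks, checklist_acknowledged):
    """Return (ok, failed checks, unacknowledged items) for a treatment plan."""
    failures = [name for name, check in automated_checks.items() if not check(plan)]
    unacknowledged = [item for item, ok in checklist_acknowledged.items() if not ok]
    return (not failures and not unacknowledged), failures, unacknowledged

checks = {
    "prescription dose matches protocol": lambda p: p["dose_gy"] == p["protocol_dose_gy"],
    "fractionation within tolerance": lambda p: 1 <= p["fractions"] <= 40,
}
acks = {"target contours reviewed": True, "plan name follows convention": True}
ok, failed, pending = can_finalize(
    {"dose_gy": 60.0, "protocol_dose_gy": 60.0, "fractions": 30}, checks, acks
)
```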
ERIC Educational Resources Information Center
Espy, John; And Others
A project was conducted to field test selected first- and second-year courses in a postsecondary nuclear quality assurance/quality control (QA/QC) technician curriculum and to develop the teaching/learning modules for seven technical specialty courses remaining in the QA/QC technician curriculum. The field testing phase of the project involved the…
Lelental, Natalia; Brandner, Sebastian; Kofanova, Olga; Blennow, Kaj; Zetterberg, Henrik; Andreasson, Ulf; Engelborghs, Sebastiaan; Mroczko, Barbara; Gabryelewicz, Tomasz; Teunissen, Charlotte; Mollenhauer, Brit; Parnetti, Lucilla; Chiasserini, Davide; Molinuevo, Jose Luis; Perret-Liaudet, Armand; Verbeek, Marcel M; Andreasen, Niels; Brosseron, Frederic; Bahl, Justyna M C; Herukka, Sanna-Kaisa; Hausner, Lucrezia; Frölich, Lutz; Labonte, Anne; Poirier, Judes; Miller, Anne-Marie; Zilka, Norbert; Kovacech, Branislav; Urbani, Andrea; Suardi, Silvia; Oliveira, Catarina; Baldeiras, Ines; Dubois, Bruno; Rot, Uros; Lehmann, Sylvain; Skinningsrud, Anders; Betsou, Fay; Wiltfang, Jens; Gkatzima, Olymbia; Winblad, Bengt; Buchfelder, Michael; Kornhuber, Johannes; Lewczuk, Piotr
2016-03-01
Assay-vendor-independent quality control (QC) samples for neurochemical dementia diagnostics (NDD) biomarkers are so far commercially unavailable. This requires that NDD laboratories prepare their own QC samples, for example by pooling leftover cerebrospinal fluid (CSF) samples. Our aim was to prepare and test alternative matrices for QC samples that could facilitate intra- and inter-laboratory QC of the NDD biomarkers. Three matrices were validated in this study: (A) human pooled CSF, (B) Aβ peptides spiked into human prediluted plasma, and (C) Aβ peptides spiked into a solution of bovine serum albumin in phosphate-buffered saline. All matrices were also tested after supplementation with an antibacterial agent (sodium azide). We analyzed short- and long-term stability of the biomarkers with ELISA and chemiluminescence (Fujirebio Europe, MSD, IBL International) and performed an inter-laboratory variability study. NDD biomarkers turned out to be stable in almost all samples stored at the tested conditions for up to 14 days, as well as in samples stored deep-frozen (at −80°C) for up to one year. Sodium azide did not influence biomarker stability. Inter-center variability of the samples sent at room temperature (pooled CSF, freeze-dried CSF, and four artificial matrices) was comparable to the results obtained on deep-frozen samples in other large-scale projects. Our results suggest that it is possible to replace self-made, CSF-based QC samples with large batches of QC material prepared with artificial peptides and matrices. This would greatly facilitate intra- and inter-laboratory QC schedules for NDD measurements.
Ford, Eric C; Terezakis, Stephanie; Souranis, Annette; Harris, Kendra; Gay, Hiram; Mutic, Sasa
2012-11-01
To quantify the error-detection effectiveness of commonly used quality control (QC) measures, we analyzed incidents from 2007-2010 logged into voluntary in-house electronic incident-learning systems at 2 academic radiation oncology clinics. None of the incidents resulted in patient harm. Each incident was graded for potential severity using the French Nuclear Safety Authority scoring scale; high-potential-severity incidents (score >3) were considered, along with a subset of 30 randomly chosen low-severity incidents. Each report was evaluated to identify which of 15 common QC checks could have detected it. The effectiveness was calculated, defined as the percentage of incidents that each QC measure could detect, both for individual QC checks and for combinations of checks. In total, 4407 incidents were reported, 292 of which had high potential severity. High- and low-severity incidents were detectable by 4.0 ± 2.3 (mean ± SD) and 2.6 ± 1.4 QC checks, respectively (P<.001). All individual checks were less than 50% sensitive, with the exception of pretreatment plan review by a physicist (63%). An effectiveness of 97% was achieved with 7 checks used in combination and was not further improved with more checks. The combination of checks with the highest effectiveness includes physics plan review, physician plan review, Electronic Portal Imaging Device-based in vivo portal dosimetry, radiation therapist timeout, weekly physics chart check, the use of checklists, port films, and source-to-skin distance checks. Some commonly used QC checks, such as pretreatment intensity modulated radiation therapy QA, do not substantially add to the ability to detect errors in these data. The effectiveness of QC measures in radiation oncology depends sensitively on which checks are used and in which combinations. A small percentage of errors cannot be detected by any of the standard formal QC checks currently in broad use, suggesting that further improvements are needed. These data require confirmation with a broader incident-reporting database. Copyright © 2012 Elsevier Inc. All rights reserved.
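The effectiveness calculation reported above can be sketched directly: each incident carries the set of QC checks that could have caught it, and a combination's effectiveness is the fraction of incidents detectable by at least one member. The incident data below are placeholders for shape only.

```python
from itertools import combinations

def effectiveness(incidents, checks):
    """Percent of incidents detectable by at least one check in `checks`."""
    caught = sum(1 for detectable in incidents if detectable & set(checks))
    return 100.0 * caught / len(incidents)

# each incident lists the checks that could have detected it (illustrative)
incidents = [
    {"physics plan review"},
    {"physician plan review", "therapist timeout"},
    {"weekly chart check", "port films"},
]
all_checks = {"physics plan review", "physician plan review",
              "therapist timeout", "weekly chart check", "port films"}
# find the pair of checks with the highest combined effectiveness
best_pair = max(combinations(sorted(all_checks), 2),
                key=lambda pair: effectiveness(incidents, pair))
```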
Kadowaki, Hisae; Satrimafitrah, Pasjan; Takami, Yasunari; Nishitoh, Hideki
2018-05-09
The maintenance of endoplasmic reticulum (ER) homeostasis is essential for cell function. ER stress-induced pre-emptive quality control (ERpQC) helps alleviate the burden to a stressed ER by limiting further protein loading. We have previously reported the mechanisms of ERpQC, which includes a rerouting step and a degradation step. Under ER stress conditions, Derlin family proteins (Derlins), which are components of ER-associated degradation, reroute specific ER-targeting proteins to the cytosol. Newly synthesized rerouted polypeptides are degraded via the cytosolic chaperone Bag6 and the AAA-ATPase p97 in the ubiquitin-proteasome system. However, the mechanisms by which ER-targeting proteins are rerouted from the ER translocation pathway to the cytosolic degradation pathway and how the E3 ligase ubiquitinates ERpQC substrates remain unclear. Here, we show that ERpQC substrates are captured by the carboxyl-terminus region of Derlin-1 and ubiquitinated by the HRD1 E3 ubiquitin ligase prior to degradation. Moreover, HRD1 forms a large ERpQC-related complex composed of Sec61α and Derlin-1 during ER stress. These findings indicate that the association of the degradation factor HRD1 with the translocon and the rerouting factor Derlin-1 may be necessary for the smooth and effective clearance of ERpQC substrates.
Develop a Methodology to Evaluate the Effectiveness of QC/QA Specifications (Phase II)
DOT National Transportation Integrated Search
1998-08-01
The Texas Department of Transportation (TxDOT) has been implementing statistically based quality control/quality assurance (QC/QA) specifications for hot mix asphalt concrete pavements since the early 1990s. These specifications have been continuousl...
A Benchmark Study on Error Assessment and Quality Control of CCS Reads Derived from the PacBio RS
Jiao, Xiaoli; Zheng, Xin; Ma, Liang; Kutty, Geetha; Gogineni, Emile; Sun, Qiang; Sherman, Brad T.; Hu, Xiaojun; Jones, Kristine; Raley, Castle; Tran, Bao; Munroe, David J.; Stephens, Robert; Liang, Dun; Imamichi, Tomozumi; Kovacs, Joseph A.; Lempicki, Richard A.; Huang, Da Wei
2013-01-01
PacBio RS, a newly emerging third-generation DNA sequencing platform, is based on a real-time, single-molecule, nano-nitch sequencing technology that can generate very long reads (up to 20 kb), in contrast to the shorter reads produced by first- and second-generation sequencing technologies. As a new platform, it is important to assess the sequencing error rate, as well as the quality control (QC) parameters associated with the PacBio sequence data. In this study, a mixture of 10 previously characterized, closely related DNA amplicons was sequenced using the PacBio RS sequencing platform. After aligning Circular Consensus Sequence (CCS) reads derived from the above sequencing experiment to the known reference sequences, we found that the median error rate was 2.5% without read QC, improving to 1.3% with an SVM-based multi-parameter QC method. In addition, a de novo assembly was used as a downstream application to evaluate the effects of different QC approaches. This benchmark study indicates that even though CCS reads are error-corrected in post-processing, it is still necessary to perform appropriate QC on CCS reads in order to produce successful downstream bioinformatics analytical results. PMID:24179701
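A generic sketch of SVM-based multi-parameter read QC in the spirit of the study: train a classifier on reads labelled good/bad from per-read features (e.g., number of passes, mean base quality, read length), then filter new CCS reads. Features, labels, and data below are illustrative, not the study's.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X_train = rng.normal(size=(200, 3))       # columns: passes, mean Q, read length
y_train = (X_train[:, 0] + X_train[:, 1] > 0).astype(int)  # toy labels

# scale features, then fit an RBF-kernel SVM
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X_train, y_train)

X_new = rng.normal(size=(5, 3))
keep = clf.predict(X_new) == 1            # reads passing the SVM QC filter
```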
Hong, Sung Kuk; Choi, Seung Jun; Shin, Saeam; Lee, Wonmok; Pinto, Naina; Shin, Nari; Lee, Kwangjun; Hong, Seong Geun; Kim, Young Ah; Lee, Hyukmin; Kim, Heejung; Song, Wonkeun; Lee, Sun Hwa; Yong, Dongeun; Lee, Kyungwon; Chong, Yunsop
2015-11-01
Quality control (QC) processes are performed in the majority of clinical microbiology laboratories to ensure the performance of microbial identification and antimicrobial susceptibility testing using ATCC strains. Obtaining these ATCC strains involves inconveniences, including the purchase cost of the strains and the shipping time required. This study focused on constructing a database of reference strains for QC processes using domestic bacterial strains, concentrating primarily on antimicrobial susceptibility testing. Three strains (Escherichia coli, Pseudomonas aeruginosa, and Staphylococcus aureus) that showed legible results in preliminary testing were selected. The minimal inhibitory concentrations (MICs) and zone diameters (ZDs) of eight antimicrobials for each strain were determined according to CLSI M23. All resulting MIC and ZD ranges included at least 95% of the data. The ZD QC ranges obtained by using the CLSI method were less than 12 mm, and the MIC QC ranges extended no more than five dilutions. This study is a preliminary attempt to construct a bank of Korean QC strains. With further studies, a positive outcome toward cost and time reduction can be anticipated.
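The "at least 95% of the data" criterion for a QC range can be sketched in a few lines; rounding the zone-diameter bounds to whole millimetres is an assumption for illustration.

```python
import numpy as np

def zd_qc_range(zone_diameters, coverage=0.95):
    """Return an integer (low, high) mm range containing >= `coverage` of results."""
    zd = np.asarray(zone_diameters, dtype=float)
    lo, hi = np.quantile(zd, [(1 - coverage) / 2, 1 - (1 - coverage) / 2])
    return int(np.floor(lo)), int(np.ceil(hi))
```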
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, Joseph; Pirrung, Meg; McCue, Lee Ann
FQC is software that facilitates large-scale quality control of FASTQ files by carrying out a QC protocol, parsing results, and aggregating quality metrics within and across experiments into an interactive dashboard. The dashboard utilizes human-readable configuration files to manipulate the pages and tabs, and is extensible with CSV data.
ITC Guidelines on Quality Control in Scoring, Test Analysis, and Reporting of Test Scores
ERIC Educational Resources Information Center
Allalouf, Avi
2014-01-01
The Quality Control (QC) Guidelines are intended to increase the efficiency, precision, and accuracy of the scoring, analysis, and reporting process of testing. The QC Guidelines focus on large-scale testing operations where multiple forms of tests are created for use on set dates. However, they may also be used for a wide variety of other testing…
Létourneau, Daniel; Wang, An; Amin, Md Nurul; Pearce, Jim; McNiven, Andrea; Keller, Harald; Norrlinger, Bernhard; Jaffray, David A
2014-12-01
High-quality radiation therapy using highly conformal dose distributions and image-guided techniques requires optimum machine delivery performance. In this work, a monitoring system for multileaf collimator (MLC) performance, integrating semiautomated MLC quality control (QC) tests and statistical process control tools, was developed. The MLC performance monitoring system was used for almost a year on two commercially available MLC models. Control charts were used to establish MLC performance and assess test frequency required to achieve a given level of performance. MLC-related interlocks and servicing events were recorded during the monitoring period and were investigated as indicators of MLC performance variations. The QC test developed as part of the MLC performance monitoring system uses 2D megavoltage images (acquired using an electronic portal imaging device) of 23 fields to determine the location of the leaves with respect to the radiation isocenter. The precision of the MLC performance monitoring QC test and the MLC itself was assessed by detecting the MLC leaf positions on 127 megavoltage images of a static field. After initial calibration, the MLC performance monitoring QC test was performed 3-4 times/week over a period of 10-11 months to monitor positional accuracy of individual leaves for two different MLC models. Analysis of test results was performed using individuals control charts per leaf with control limits computed based on the measurements as well as two sets of specifications of ± 0.5 and ± 1 mm. Out-of-specification and out-of-control leaves were automatically flagged by the monitoring system and reviewed monthly by physicists. MLC-related interlocks reported by the linear accelerator and servicing events were recorded to help identify potential causes of nonrandom MLC leaf positioning variations. The precision of the MLC performance monitoring QC test and the MLC itself was within ± 0.22 mm for most MLC leaves and the majority of the apparent leaf motion was attributed to beam spot displacements between irradiations. The MLC QC test was performed 193 and 162 times over the monitoring period for the studied units and recalibration had to be repeated up to three times on one of these units. For both units, rate of MLC interlocks was moderately associated with MLC servicing events. The strongest association with the MLC performance was observed between the MLC servicing events and the total number of out-of-control leaves. The average elapsed time for which the number of out-of-specification or out-of-control leaves was within a given performance threshold was computed and used to assess adequacy of MLC test frequency. A MLC performance monitoring system has been developed and implemented to acquire high-quality QC data at high frequency. This is enabled by the relatively short acquisition time for the images and automatic image analysis. The monitoring system was also used to record and track the rate of MLC-related interlocks and servicing events. MLC performances for two commercially available MLC models have been assessed and the results support monthly test frequency for widely accepted ± 1 mm specifications. Higher QC test frequency is however required to maintain tighter specification and in-control behavior.
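The per-leaf individuals control chart can be sketched with the standard I-chart construction, where the control limits come from the mean and the average moving range, alongside the fixed ±0.5 and ±1 mm specification limits used in the study; the sketch below is generic, not the authors' code.

```python
import numpy as np

def individuals_limits(positions_mm):
    """Return (center, lcl, ucl) for an individuals (I) control chart."""
    x = np.asarray(positions_mm, dtype=float)
    mr_bar = np.mean(np.abs(np.diff(x)))   # average moving range
    center = x.mean()
    half_width = 2.66 * mr_bar             # 3-sigma-equivalent width for I-charts
    return center, center - half_width, center + half_width

def out_of_spec(positions_mm, spec_mm=1.0):
    """Flag leaf positions (relative to expected, in mm) beyond the spec limit."""
    x = np.asarray(positions_mm, dtype=float)
    return np.abs(x) > spec_mm
```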
Development of concrete QC/QA specifications for highway construction in Kentucky.
DOT National Transportation Integrated Search
2001-08-01
There is a growing trend toward quality-based specifications in highway construction. A large number of quality control/quality assurance (QC/QA) specifications shift the responsibility of day-to-day testing from the state DOH to the contractor. This...
Portland cement concrete pavement review of QC/QA data 2000 through 2009.
DOT National Transportation Integrated Search
2011-04-01
This report analyzes the Quality Control/Quality Assurance (QC/QA) data for Portland cement concrete pavement : (PCCP) awarded in the years 2000 through 2009. Analysis of the overall performance of the projects is accomplished by : reviewing the Calc...
Brown, Joseph; Pirrung, Meg; McCue, Lee Ann
2017-06-09
FQC is software that facilitates quality control of FASTQ files by carrying out a QC protocol using FastQC, parsing results, and aggregating quality metrics into an interactive dashboard designed to richly summarize individual sequencing runs. The dashboard groups samples in dropdowns for navigation among the data sets, utilizes human-readable configuration files to manipulate the pages and tabs, and is extensible with CSV data. FQC is implemented in Python 3 and Javascript and is maintained under an MIT license. Documentation and source code are available at: https://github.com/pnnl/fqc . joseph.brown@pnnl.gov. © The Author(s) 2017. Published by Oxford University Press.
McClure, Matthew C.; McCarthy, John; Flynn, Paul; McClure, Jennifer C.; Dair, Emma; O'Connell, D. K.; Kearney, John F.
2018-01-01
A major use of genetic data is parentage verification and identification, as inaccurate pedigrees negatively affect genetic gain. Since 2012 the international standard for single nucleotide polymorphism (SNP) verification in Bos taurus cattle has been the ISAG SNP panels. While these ISAG panels provide an increased level of parentage accuracy over microsatellite markers (MS), they can validate the wrong parent at ≤1% misconcordance rate levels, indicating that more SNP are needed if a more accurate pedigree is required. With rapidly increasing numbers of cattle being genotyped in Ireland, representing 61 B. taurus breeds from a wide range of farm types (beef/dairy, AI/pedigree/commercial, purebred/crossbred, and large to small herd sizes), the Irish Cattle Breeding Federation (ICBF) analyzed different SNP densities and determined that a minimum of ≥500 SNP is needed to consistently predict only one set of parents at a ≤1% misconcordance rate. For parentage validation and prediction, ICBF uses 800 SNP (ICBF800) selected based on SNP clustering quality, ISAG200 inclusion, call rate (CR), and minor allele frequency (MAF) in the Irish cattle population. Large datasets require sample and SNP quality control (QC). Most publications deal only with SNP QC via CR, MAF, parent-progeny conflicts, and Hardy-Weinberg deviation, but not with sample QC. We report here parentage, SNP QC, and genomic sample QC pipelines to deal with the unique challenges of >1 million genotypes from a national herd, such as SNP genotype errors from mis-tagging of animals, lab errors, farm errors, and multiple other issues that can arise. We divide the pipeline into two parts: a Genotype QC and an Animal QC pipeline. The Genotype QC identifies samples with low call rate, missing or mixed genotype classes (no BB genotype or ABTG alleles present), and low genotype frequencies. The Animal QC handles situations where the genotype might not belong to the listed individual by identifying: >1 non-matching genotypes per animal, SNP duplicates, sex and breed prediction mismatches, parentage and progeny validation results, and other situations. The Animal QC pipeline makes use of the ICBF800 SNP set where appropriate to identify errors in a computationally efficient yet highly accurate manner. PMID:29599798
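A compact sketch of the Genotype QC checks named above (low call rate, missing genotype classes, low genotype frequencies) over a SNP-by-sample matrix coded 0/1/2 with -1 for no-calls; thresholds are illustrative, not ICBF's.

```python
import numpy as np

def snp_genotype_qc(genotypes, min_call_rate=0.90, min_class_freq=0.001):
    """Return per-SNP QC flags for a (n_snps, n_samples) genotype matrix."""
    g = np.asarray(genotypes)
    called = g >= 0                                 # -1 marks a no-call
    call_rate = called.mean(axis=1)
    flags = {"low_call_rate": call_rate < min_call_rate}
    n_called = called.sum(axis=1).clip(min=1)
    # flag SNPs where an expected genotype class is essentially absent
    for cls, name in [(0, "AA"), (1, "AB"), (2, "BB")]:
        freq = (g == cls).sum(axis=1) / n_called
        flags[f"missing_{name}_class"] = freq < min_class_freq
    return flags
```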
Quality control and quality assurance in genotypic data for genome-wide association studies
Laurie, Cathy C.; Doheny, Kimberly F.; Mirel, Daniel B.; Pugh, Elizabeth W.; Bierut, Laura J.; Bhangale, Tushar; Boehm, Frederick; Caporaso, Neil E.; Cornelis, Marilyn C.; Edenberg, Howard J.; Gabriel, Stacy B.; Harris, Emily L.; Hu, Frank B.; Jacobs, Kevin; Kraft, Peter; Landi, Maria Teresa; Lumley, Thomas; Manolio, Teri A.; McHugh, Caitlin; Painter, Ian; Paschall, Justin; Rice, John P.; Rice, Kenneth M.; Zheng, Xiuwen; Weir, Bruce S.
2011-01-01
Genome-wide scans of nucleotide variation in human subjects are providing an increasing number of replicated associations with complex disease traits. Most of the variants detected have small effects and, collectively, they account for a small fraction of the total genetic variance. Very large sample sizes are required to identify and validate findings. In this situation, even small sources of systematic or random error can cause spurious results or obscure real effects. The need for careful attention to data quality has been appreciated for some time in this field, and a number of strategies for quality control and quality assurance (QC/QA) have been developed. Here we extend these methods and describe a system of QC/QA for genotypic data in genome-wide association studies. This system includes some new approaches that (1) combine analysis of allelic probe intensities and called genotypes to distinguish gender misidentification from sex chromosome aberrations, (2) detect autosomal chromosome aberrations that may affect genotype calling accuracy, (3) infer DNA sample quality from relatedness and allelic intensities, (4) use duplicate concordance to infer SNP quality, (5) detect genotyping artifacts from dependence of Hardy-Weinberg equilibrium (HWE) test p-values on allelic frequency, and (6) demonstrate sensitivity of principal components analysis (PCA) to SNP selection. The methods are illustrated with examples from the ‘Gene Environment Association Studies’ (GENEVA) program. The results suggest several recommendations for QC/QA in the design and execution of genome-wide association studies. PMID:20718045
Comprehensive Testing Guidelines to Increase Efficiency in INDOT Operations : [Technical Summary
DOT National Transportation Integrated Search
2012-01-01
When the Indiana Department of Transportation designs a pavement project, a decision for QC/QA (Quality Control/Quality Assurance) or non-QC/QA is made solely based on the quantity of pavement materials to be used in the project. Once the pavement...
Hot mix asphalt voids acceptance review of QC/QA data 2000 through 2010.
DOT National Transportation Integrated Search
2011-10-01
This report analyzes the quality control/quality assurance (QC/QA) data for hot mix asphalt (HMA) using : voids acceptance as the testing criteria awarded in the years 2000 through 2010. Analysis of the overall : performance of the projects is accomp...
An Automatic Quality Control Pipeline for High-Throughput Screening Hit Identification.
Zhai, Yufeng; Chen, Kaisheng; Zhong, Yang; Zhou, Bin; Ainscow, Edward; Wu, Ying-Ta; Zhou, Yingyao
2016-09-01
The correction or removal of signal errors in high-throughput screening (HTS) data is critical to the identification of high-quality lead candidates. Although a number of strategies have been previously developed to correct systematic errors and to remove screening artifacts, they are not universally effective and still require a fair amount of human intervention. We introduce a fully automated quality control (QC) pipeline that can correct generic interplate systematic errors and remove intraplate random artifacts. The new pipeline was first applied to ~100 large-scale historical HTS assays; in silico analysis showed that auto-QC led to a noticeably stronger structure-activity relationship. The method was further tested in several independent HTS runs, where QC results were sampled for experimental validation. Significantly increased hit confirmation rates were obtained after the QC steps, confirming that the proposed method was effective in enriching true-positive hits. An implementation of the algorithm is available to the screening community. © 2016 Society for Laboratory Automation and Screening.
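One standard technique for removing plate-level systematic errors of the kind described, offered here as a generic illustration rather than the paper's pipeline, is Tukey's median polish, which strips row and column effects (e.g., edge or dispenser gradients) from each plate's signal matrix.

```python
import numpy as np

def median_polish(plate, n_iter=10):
    """Return residuals of a plate after removing row/column median effects."""
    resid = np.array(plate, dtype=float)
    for _ in range(n_iter):
        # alternately sweep out row medians and column medians
        resid -= np.median(resid, axis=1, keepdims=True)
        resid -= np.median(resid, axis=0, keepdims=True)
    return resid
```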
Non-monotonicity and divergent time scale in Axelrod model dynamics
NASA Astrophysics Data System (ADS)
Vazquez, F.; Redner, S.
2007-04-01
We study the evolution of the Axelrod model for cultural diversity, a prototypical non-equilibrium process that exhibits rich dynamics and a dynamic phase transition between diversity and an inactive state. We consider a simple version of the model in which each individual possesses two features that can assume q possibilities. Within a mean-field description in which each individual has just a few interaction partners, we find a phase transition at a critical value q_c between an active, diverse state for q < q_c and a frozen state. For q ≲ q_c, the density of active links is non-monotonic in time and the asymptotic approach to the steady state is controlled by a time scale that diverges as (q - q_c)^(-1/2).
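To make the model dynamics concrete, a toy Monte Carlo version of the two-feature Axelrod model on a ring is sketched below; the lattice, system size, and the "active link" measurement (neighbor pairs sharing exactly one of the two features) are illustrative assumptions, since the paper itself analyzes a mean-field version.

```python
import numpy as np

rng = np.random.default_rng(0)

def axelrod_ring(n_agents=200, q=5, steps=200_000):
    state = rng.integers(0, q, size=(n_agents, 2))    # two features per agent
    for _ in range(steps):
        i = rng.integers(n_agents)
        j = (i + 1) % n_agents                        # right neighbor on the ring
        shared = state[i] == state[j]
        overlap = shared.sum()
        # Interaction probability equals the similarity overlap/F = 1/2 here;
        # identical (overlap == 2) or fully distinct (overlap == 0) pairs are inert.
        if overlap == 1 and rng.random() < 0.5:
            k = int(np.flatnonzero(~shared)[0])       # copy the differing feature
            state[i, k] = state[j, k]
    nbr = np.roll(state, -1, axis=0)
    return np.mean((state == nbr).sum(axis=1) == 1)   # density of active links

print(axelrod_ring())  # decreases toward 0 as the system freezes for large q
```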
Dimech, Wayne; Karakaltsas, Marina; Vincini, Giuseppe A
2018-05-25
A general trend towards conducting infectious disease serology testing in centralized laboratories means that quality control (QC) principles used for clinical chemistry testing are applied to infectious disease testing. However, no systematic assessment of methods used to establish QC limits has been applied to infectious disease serology testing. A total of 103 QC data sets, obtained from six different infectious disease serology analytes, were parsed through standard methods for establishing statistical control limits, including guidelines from Public Health England, the USA Clinical and Laboratory Standards Institute (CLSI), the German Richtlinien der Bundesärztekammer (RiliBÄK) and the Australian QConnect. The percentage of QC results failing each method was compared. The percentage of data sets having more than 20% of QC results failing Westgard rules when the first 20 results were used to calculate the mean ± 2 standard deviations (SD) ranged from 3 (2.9%) for R4S to 66 (64.1%) for the 10X rule, whereas it ranged from 0 (0%) for R4S to 32 (40.5%) for 10X when the first 100 results were used to calculate the mean ± 2 SD. By contrast, the percentage of data sets with >20% failing the RiliBÄK control limits was 25 (24.3%). Only two data sets (1.9%) had more than 20% of results outside the QConnect Limits. These failure rates indicate that QConnect Limits are more applicable for monitoring infectious disease serology testing than the Public Health England, CLSI and RiliBÄK alternatives, which reported an unacceptably high percentage of failures across the 103 data sets.
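Two of the rule families compared above are easy to state in code. The sketch below derives mean ± 2 SD limits from a baseline of the first 20 (or 100) results and applies simple 10X (ten consecutive results on one side of the mean) and R4S-style (two consecutive results spanning more than 4 SD) checks; the rule definitions follow common Westgard usage and are assumptions here, not the paper's parsing code.

```python
import numpy as np

def qc_limits(baseline):
    """Mean and 2-SD control limits from a baseline run of QC results."""
    mean, sd = np.mean(baseline), np.std(baseline, ddof=1)
    return mean, mean - 2 * sd, mean + 2 * sd

def failures(results, baseline_n=20):
    x = np.asarray(results, dtype=float)
    mean, lo, hi = qc_limits(x[:baseline_n])
    sd = (hi - mean) / 2
    out_2sd = (x < lo) | (x > hi)                    # 1-2s style warning
    side = np.sign(x - mean)
    ten_x = [i for i in range(9, len(x))
             if abs(side[i - 9:i + 1].sum()) == 10]  # 10X: same side 10 times
    r_4s = [i for i in range(1, len(x))
            if abs(x[i] - x[i - 1]) > 4 * sd]        # R4S: 4-SD swing
    return out_2sd.mean(), ten_x, r_4s
```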
A situational analysis of breast cancer early detection services in Trinidad and Tobago.
Badal, Kimberly; Rampersad, Fidel; Warner, Wayne A; Toriola, Adetunji T; Mohammed, Hamish; Scheffel, Harold-Alexis; Ali, Rehanna; Moosoodeen, Murrie; Konduru, Siva; Russel, Adaila; Haraksingh, Rajini
2018-01-01
A situational analysis of breast cancer (BC) early detection services was carried out to investigate whether Trinidad and Tobago (T&T) has the framework for successful organized national screening. An online survey was designed to assess the availability, accessibility, quality control and assurance (QC&A), and monitoring and evaluation (M&E) mechanisms for public and private BC early detection. A focus group with local radiologists (n = 3) was held to identify unaddressed challenges and make recommendations for improvement. Major public hospitals offer free detection services with wait times of 1-6 months for an appointment. Private institutions offer mammograms for TTD$240 (USD$37) at minimum with same day service. Both sectors report a lack of trained staff. Using 1.2 mammograms per 10,000 women ≥40 years as sufficient, the public sector's rate of 0.19 mammograms per 10,000 women ≥40 years for screening and diagnosis is inadequate. Program M&E mechanisms, QC&A guidelines for machinery use, delays in receipt of pathology reports, and unreliable drug access are further unaddressed challenges. T&T must first strengthen its human and physical resources, implement M&E and QC&A measures, strengthen cancer care, and address other impediments to BC early detection before investing in nationally organized BC screening.
An introduction to statistical process control in research proteomics.
Bramwell, David
2013-12-16
Statistical process control is a well-established and respected method which provides a general-purpose and consistent framework for monitoring and improving the quality of a process. It is routinely used in many industries where the quality of final products is critical and is often required in clinical diagnostic laboratories [1,2]. To date, the methodology has been little utilised in research proteomics. It has been shown to be capable of delivering quantitative QC procedures for qualitative clinical assays [3], making it an ideal methodology to apply to this area of biological research. To introduce statistical process control as an objective strategy for quality control and show how it could be used to benefit proteomics researchers and enhance the quality of the results they generate. We demonstrate that rules which provide basic quality control are easy to derive and implement and could have a major impact on data quality for many studies. Statistical process control is a powerful tool for investigating and improving proteomics research work-flows. The process of characterising measurement systems and defining control rules forces the exploration of key questions that can lead to significant improvements in performance. This work asserts that QC is essential to proteomics discovery experiments. Every experimenter must know the current capabilities of their measurement system and have an objective means for tracking and ensuring that performance. Proteomic analysis work-flows are complicated and multi-variate. QC is critical for clinical chemistry measurements and huge strides have been made in ensuring the quality and validity of results in clinical biochemistry labs. This work introduces some of these QC concepts and works to bridge their use from single-analyte QC to applications in multi-analyte systems. This article is part of a Special Issue entitled: Standardization and Quality Control in Proteomics. Copyright © 2013 The Author. Published by Elsevier B.V. All rights reserved.
Hot mix asphalt voids acceptance review of QC/QA data 2000 through 2004.
DOT National Transportation Integrated Search
2006-07-01
This report analyzes the Quality Control/Quality Assurance (QC/QA) data for hot mix asphalt using voids acceptance as the testing criteria for the years 2000 through 2004. Analysis of the overall quality of the HMA is accomplished by reviewing th...
The Electric Vehicle Alternative.
1981-06-01
[List of tables: Air Training Command EV Demonstration Program; Computation for Derivation of the Combined Reliability.] ...zinc batteries wear out quickly because the zinc they use gets dissipated in their charging/discharging cycle. GM plans to have such problems solved by 1985... with how the G&W battery controls the release of poisonous chlorine gas in the case of an accident. Unlike the lead-acid battery, the zinc...
2015-05-01
Guidance document: Cost-Effective, Ultra-Sensitive Groundwater Monitoring for Site Remediation and Management: Standard Operating Procedures (authors: Halden, R.U., and Roll, I.B.). Recoverable fragment: "...in consultation with the site management. 4.0 DATA TYPES AND QUALITY CONTROL: A sampling plan must account for the collection, handling, and..."
Operational Processing of Ground Validation Data for the Tropical Rainfall Measuring Mission
NASA Technical Reports Server (NTRS)
Kulie, Mark S.; Robinson, Mike; Marks, David A.; Ferrier, Brad S.; Rosenfeld, Danny; Wolff, David B.
1999-01-01
The Tropical Rainfall Measuring Mission (TRMM) satellite was successfully launched in November 1997. A primary goal of TRMM is to sample tropical rainfall using the first active spaceborne precipitation radar. To validate TRMM satellite observations, a comprehensive Ground Validation (GV) Program has been implemented for this mission. A key component of GV is the analysis and quality control of meteorological ground-based radar data from four primary sites: Melbourne, FL; Houston, TX; Darwin, Australia; and Kwajalein Atoll, RMI. As part of the TRMM GV effort, the Joint Center for Earth Systems Technology (JCET) at the University of Maryland, Baltimore County, has been tasked with developing and implementing an operational system to quality control (QC), archive, and provide data for subsequent rainfall product generation from the four primary GV sites. This paper provides an overview of the JCET operational environment. A description of the QC algorithm and performance, in addition to the data flow procedure between JCET and the TRMM Science and Data Information System (TSDIS), are presented. The impact of quality-controlled data on higher-level rainfall and reflectivity products will also be addressed. Finally, a brief description of JCET's expanded role in producing reference rainfall products will be discussed.
Hesari, Nikou; Kıratlı Yılmazçoban, Nursel; Elzein, Mohamad; Alum, Absar; Abbaszadegan, Morteza
2017-01-03
Rapid bacterial detection using biosensors is a novel approach for microbiological testing applications. Validation of such methods is an obstacle in the adoption of new bio-sensing technologies for water testing. Therefore, establishing a quality assurance and quality control (QA/QC) plan is essential to demonstrate accuracy and reliability of the biosensor method for the detection of E. coli in drinking water samples. In this study, different reagents and assay conditions including temperatures, holding time, E. coli strains and concentrations, dissolving agents, salinity and pH effects, quality of substrates of various suppliers of 4-methylumbelliferyl glucuronide (MUG), and environmental water samples were included in the QA/QC plan and used in the assay optimization and documentation. Furthermore, the procedural QA/QC for the monitoring of drinking water samples was established to validate the performance of the biosensor platform for the detection of E. coli using a culture-based standard technique. Implementing the developed QA/QC plan, the same level of precision and accuracy was achieved using both the standard and the biosensor methods. The established procedural QA/QC for the biosensor will provide a reliable tool for a near real-time monitoring of E. coli in drinking water samples to both industry and regulatory authorities.
User-friendly solutions for microarray quality control and pre-processing on ArrayAnalysis.org
Eijssen, Lars M. T.; Jaillard, Magali; Adriaens, Michiel E.; Gaj, Stan; de Groot, Philip J.; Müller, Michael; Evelo, Chris T.
2013-01-01
Quality control (QC) is crucial for any scientific method producing data. Applying adequate QC introduces new challenges in the genomics field where large amounts of data are produced with complex technologies. For DNA microarrays, specific algorithms for QC and pre-processing including normalization have been developed by the scientific community, especially for expression chips of the Affymetrix platform. Many of these have been implemented in the statistical scripting language R and are available from the Bioconductor repository. However, application is hampered by a lack of integrative tools that can be used by users of any experience level. To fill this gap, we developed a freely available tool for QC and pre-processing of Affymetrix gene expression results, extending, integrating and harmonizing functionality of Bioconductor packages. The tool can be easily accessed through a wizard-like web portal at http://www.arrayanalysis.org or downloaded for local use in R. The portal provides extensive documentation, including user guides, interpretation help with real output illustrations and detailed technical documentation. It assists newcomers to the field in performing state-of-the-art QC and pre-processing while offering data analysts an integral open-source package. Providing the scientific community with this easily accessible tool will help improve data quality, data reuse and the adoption of standards. PMID:23620278
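As a flavor of the array-level QC metrics such pipelines report, the sketch below computes Relative Log Expression (RLE) summaries per array with numpy; arrays whose RLE distribution is off-center or unusually wide are QC suspects. This is a generic illustration only, not code from arrayanalysis.org, which is built on R/Bioconductor packages.

```python
import numpy as np

def rle_summary(log_intensity):
    """log_intensity: (n_probes, n_arrays) matrix of log2 expression values."""
    ref = np.median(log_intensity, axis=1, keepdims=True)   # per-probe reference
    rle = log_intensity - ref
    return {
        "median": np.median(rle, axis=0),   # should sit near 0 for each array
        "iqr": np.subtract(*np.percentile(rle, [75, 25], axis=0)),
    }
```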
Technical Note: Independent component analysis for quality assurance in functional MRI.
Astrakas, Loukas G; Kallistis, Nikolaos S; Kalef-Ezra, John A
2016-02-01
Independent component analysis (ICA) is an established method of analyzing human functional MRI (fMRI) data. Here, an ICA-based fMRI quality control (QC) tool for use with a commercial phantom was developed and used. In an attempt to assess the performance of the tool relative to preexisting alternative tools, it was used seven weeks before and eight weeks after repair of a faulty gradient amplifier of a non-state-of-the-art MRI unit. More specifically, its performance was compared with the AAPM Report 100 acceptance testing and quality assurance protocol and two fMRI QC protocols, proposed by Friedman et al. ["Report on a multicenter fMRI quality assurance protocol," J. Magn. Reson. Imaging 23, 827-839 (2006)] and Stocker et al. ["Automated quality assurance routines for fMRI data applied to a multicenter study," Hum. Brain Mapp. 25, 237-246 (2005)], respectively. The easily developed and applied ICA-based QC protocol provided fMRI QC indices and maps as sensitive to fMRI instabilities as the indices and maps of the other established protocols. The ICA fMRI QC indices were highly correlated with indices of the other fMRI QC protocols and in some cases theoretically related to them. Three or four independent components with slowly varying time series are detected under normal conditions. ICA applied on phantom measurements is an easy and efficient tool for fMRI QC. Additionally, it can protect against misinterpretations of artifact components as human brain activations. Evaluating fMRI QC indices in the central region of a phantom is not always the optimal choice.
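A minimal version of the decomposition underlying such a tool can be sketched with scikit-learn's FastICA: extract a few components from a phantom time series and measure how much of each component's power is slowly varying. The data shapes and the drift criterion below are illustrative assumptions, not the published protocol.

```python
import numpy as np
from sklearn.decomposition import FastICA

def phantom_ica(series, n_components=4):
    """series: (n_timepoints, n_voxels) phantom fMRI time series."""
    ica = FastICA(n_components=n_components, random_state=0, max_iter=1000)
    time_courses = ica.fit_transform(series)      # (n_timepoints, n_components)
    spatial_maps = ica.components_                # (n_components, n_voxels)
    # Fraction of spectral power below an assumed 0.01 cycles/frame: components
    # dominated by slow variation are the "slowly varying" ones described above.
    spectra = np.abs(np.fft.rfft(time_courses, axis=0)) ** 2
    freqs = np.fft.rfftfreq(time_courses.shape[0])
    slow_fraction = spectra[freqs < 0.01].sum(axis=0) / spectra.sum(axis=0)
    return time_courses, spatial_maps, slow_fraction
```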
Molecular Characterization of Tick Salivary Gland Glutaminyl Cyclase
Adamson, Steven W.; Browning, Rebecca E.; Chao, Chien-Chung; Bateman, Robert C.; Ching, Wei-Mei; Karim, Shahid
2013-01-01
Glutaminyl cyclase (QC) catalyzes the cyclization of N-terminal glutamine residues into pyroglutamate. This post-translational modification extends the half-life of peptides and, in some cases, is essential in binding to their cognate receptor. Due to its potential role in the post-translational modification of tick neuropeptides, we report the molecular, biochemical and physiological characterization of salivary gland QC during the prolonged blood-feeding of the black-legged tick (Ixodes scapularis) and the gulf-coast tick (Amblyomma maculatum). QC sequences from I. scapularis and A. maculatum showed a high degree of amino acid identity to each other and other arthropods and residues critical for zinc-binding/catalysis (D159, E202, and H330) or intermediate stabilization (E201, W207, D248, D305, F325, and W329) are conserved. Analysis of QC transcriptional gene expression kinetics depicts an upregulation during the blood-meal of adult female ticks prior to fast feeding phases in both I. scapularis and A. maculatum suggesting a functional link with blood meal uptake. QC enzymatic activity was detected in saliva and extracts of tick salivary glands and midguts. Recombinant QC was shown to be catalytically active. Furthermore, knockdown of QC-transcript by RNA interference resulted in lower enzymatic activity, and small, unviable egg masses in both studied tick species as well as lower engorged tick weights for I. scapularis. These results suggest that the post-translational modification of neurotransmitters and other bioactive peptides by QC is critical to oviposition and potentially other physiological processes. Moreover, these data suggest that tick-specific QC-modified neurotransmitters/hormones or other relevant parts of this system could potentially be used as novel physiological targets for tick control. PMID:23770496
Comparison of quality control software tools for diffusion tensor imaging.
Liu, Bilan; Zhu, Tong; Zhong, Jianhui
2015-04-01
Image quality of diffusion tensor imaging (DTI) is critical for image interpretation, diagnostic accuracy and efficiency. However, DTI is susceptible to numerous detrimental artifacts that may impair the reliability and validity of the obtained data. Although many quality control (QC) software tools have been developed and are widely used, each with different tradeoffs, there is still no general agreement on an image QC routine for DTI, and the practical impact of these tradeoffs is not well studied. An objective comparison that identifies the pros and cons of each of the QC tools will be helpful for users making the best choice among tools for specific DTI applications. This study aims to quantitatively compare the effectiveness of three popular QC tools: DTI Studio (Johns Hopkins University), DTIPrep (University of North Carolina at Chapel Hill, University of Iowa and University of Utah) and TORTOISE (National Institutes of Health). Both synthetic and in vivo human brain data were used to quantify the adverse effects of major DTI artifacts on tensor calculation as well as the effectiveness of the different QC tools in identifying and correcting these artifacts. The technical basis of each tool was discussed, and the ways in which particular techniques affect the output of each of the tools were analyzed. The different functions and I/O formats that the three QC tools provide for building a general DTI processing pipeline and for integration with other popular image processing tools were also discussed. Copyright © 2015 Elsevier Inc. All rights reserved.
Practical Shipbuilding Standards for Surface Preparation and Coatings
1979-07-01
...strong solvent and apply over last coat of epoxy within 48 hours. *Minimum Dry Film Thickness. 12.0 SAFETY AND POLLUTION CONTROL. 12.5 Safety solvents shall... Owner Inspection; (3) QA/QC Dept. Inspectors; (4) Craft Inspectors; (5) Craft Supervision Inspection Only; (6) QA/QC Dept. Audit Only; (7) Are...
Bosnjak, J; Ciraj-Bjelac, O; Strbac, B
2008-01-01
Application of a quality control (QC) programme is very important when optimisation of image quality and reduction of patient exposure are desired. QC surveys of diagnostic imaging equipment in the Republic of Srpska (an entity of Bosnia and Herzegovina) have been systematically performed since 2001. The presented results are mostly related to the QC test results of X-ray tubes and generators for diagnostic radiology units in 92 radiology departments. In addition, results include workplace monitoring and usage of personal protective devices for staff and patients. The presented results showed improvements in the implementation of the QC programme within the period 2001-2005. Also, more attention is given to appropriate maintenance of imaging equipment, which was one of the main problems in the past. Implementation of a QC programme is a continuous and complex process. To achieve good performance of imaging equipment, additional tests are to be introduced, along with image quality assessment and patient dosimetry. Training is very important in order to achieve these goals.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Létourneau, Daniel, E-mail: daniel.letourneau@rmp.uh.on.ca; McNiven, Andrea; Keller, Harald
2014-12-15
Purpose: High-quality radiation therapy using highly conformal dose distributions and image-guided techniques requires optimum machine delivery performance. In this work, a monitoring system for multileaf collimator (MLC) performance, integrating semiautomated MLC quality control (QC) tests and statistical process control tools, was developed. The MLC performance monitoring system was used for almost a year on two commercially available MLC models. Control charts were used to establish MLC performance and assess test frequency required to achieve a given level of performance. MLC-related interlocks and servicing events were recorded during the monitoring period and were investigated as indicators of MLC performance variations. Methods: The QC test developed as part of the MLC performance monitoring system uses 2D megavoltage images (acquired using an electronic portal imaging device) of 23 fields to determine the location of the leaves with respect to the radiation isocenter. The precision of the MLC performance monitoring QC test and the MLC itself was assessed by detecting the MLC leaf positions on 127 megavoltage images of a static field. After initial calibration, the MLC performance monitoring QC test was performed 3–4 times/week over a period of 10–11 months to monitor positional accuracy of individual leaves for two different MLC models. Analysis of test results was performed using individuals control charts per leaf with control limits computed based on the measurements as well as two sets of specifications of ±0.5 and ±1 mm. Out-of-specification and out-of-control leaves were automatically flagged by the monitoring system and reviewed monthly by physicists. MLC-related interlocks reported by the linear accelerator and servicing events were recorded to help identify potential causes of nonrandom MLC leaf positioning variations. Results: The precision of the MLC performance monitoring QC test and the MLC itself was within ±0.22 mm for most MLC leaves and the majority of the apparent leaf motion was attributed to beam spot displacements between irradiations. The MLC QC test was performed 193 and 162 times over the monitoring period for the studied units and recalibration had to be repeated up to three times on one of these units. For both units, the rate of MLC interlocks was moderately associated with MLC servicing events. The strongest association with the MLC performance was observed between the MLC servicing events and the total number of out-of-control leaves. The average elapsed time for which the number of out-of-specification or out-of-control leaves was within a given performance threshold was computed and used to assess adequacy of MLC test frequency. Conclusions: A MLC performance monitoring system has been developed and implemented to acquire high-quality QC data at high frequency. This is enabled by the relatively short acquisition time for the images and automatic image analysis. The monitoring system was also used to record and track the rate of MLC-related interlocks and servicing events. MLC performances for two commercially available MLC models have been assessed and the results support monthly test frequency for widely accepted ±1 mm specifications. Higher QC test frequency is however required to maintain tighter specification and in-control behavior.
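The per-leaf individuals control chart described above can be sketched directly: limits follow the standard moving-range estimate (center ± 2.66 × mean moving range, roughly ±3σ), alongside the fixed ±0.5/±1 mm specification checks. The data layout below is an illustrative assumption, not the authors' monitoring system.

```python
import numpy as np

def leaf_chart(offsets_mm, spec_mm=1.0):
    """offsets_mm: 1-D history of one leaf's measured position error (mm)."""
    x = np.asarray(offsets_mm, dtype=float)
    mr_bar = np.mean(np.abs(np.diff(x)))            # average moving range
    center = x.mean()
    ucl = center + 2.66 * mr_bar                    # individuals-chart limits
    lcl = center - 2.66 * mr_bar
    out_of_control = (x > ucl) | (x < lcl)          # statistical signal
    out_of_spec = np.abs(x) > spec_mm               # fixed specification check
    return center, (lcl, ucl), out_of_control, out_of_spec
```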
NASA Technical Reports Server (NTRS)
Barbre, Robert E., Jr.
2015-01-01
This paper describes in detail the QC and splicing methodology for KSC's 50- and 915-MHz DRWP measurements that generates an extensive archive of vertically complete profiles from 0.20-18.45 km. The concurrent POR from each archive extends from April 2000 to December 2009. MSFC NE applies separate but similar QC processes to each of the 50- and 915-MHz DRWP archives. DRWP literature and data examination provide the basis for developing and applying the automated and manual QC processes on both archives. Depending on the month, the QC'ed 50- and 915-MHz DRWP archives retain 52-65% and 16-30% of the possible data, respectively. The 50- and 915-MHz DRWP QC archives retain 84-91% and 85-95%, respectively, of all the available data provided that data exist in the non- QC'ed archives. Next, MSFC NE applies an algorithm to splice concurrent measurements from both DRWP sources. Last, MSFC NE generates a composite profile from the (up to) five available spliced profiles to effectively characterize boundary layer winds and to utilize all possible 915-MHz DRWP measurements at each timestamp. During a given month, roughly 23,000-32,000 complete profiles exist from 0.25-18.45 km from the composite profiles' archive, and approximately 5,000- 27,000 complete profiles exist from an archive utilizing an individual 915-MHz DRWP. One can extract a variety of profile combinations (pairs, triplets, etc.) from this sample for a given application. The sample of vertically complete DRWP wind measurements not only gives launch vehicle customers greater confidence in loads and trajectory assessments versus using balloon output, but also provides flexibility to simulate different DOL situations across applicable altitudes. In addition to increasing sample size and providing more flexibility for DOL simulations in the vehicle design phase, the spliced DRWP database provides any upcoming launch vehicle program with the capability to utilize DRWP profiles on DOL to compute vehicle steering commands, provided the program applies the procedures that this report describes to new DRWP data on DOL. Decker et al. (2015) details how SLS is proposing to use DRWP data and splicing techniques on DOL. Although automation could enhance the current DOL 50-MHz DRWP QC process and could streamline any future DOL 915-MHz DRWP QC and splicing process, the DOL community would still require manual intervention to ensure that the vehicle only uses valid profiles. If a program desires to use high spatial resolution profiles, then the algorithm could randomly add high-frequency components to the DRWP profiles. The spliced DRWP database provides lots of flexibility in how one performs DOL simulations, and the algorithms that this report provides will assist the aerospace and atmospheric communities that are interested in utilizing the DRWP.
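The splice step can be illustrated with a simple blend: take the 915-MHz profile at low altitudes, the 50-MHz profile aloft, and linearly weight the two across an assumed overlap band. The altitude band, shared grid, and blending rule below are illustrative assumptions; the report defines the operational algorithm.

```python
import numpy as np

def splice(alt_km, u_915, u_50, blend=(2.0, 3.0)):
    """All arrays share the altitude grid alt_km; NaN marks missing data."""
    lo, hi = blend
    w = np.clip((alt_km - lo) / (hi - lo), 0.0, 1.0)   # 0 below band, 1 above
    u = (1 - w) * u_915 + w * u_50                     # linear blend in the band
    # Outside the overlap band, fall back to whichever profiler has data.
    u = np.where(np.isnan(u_50), u_915, u)
    u = np.where(np.isnan(u_915), u_50, u)
    return u
```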
NASA Astrophysics Data System (ADS)
Sturtevant, C.; Hackley, S.; Lee, R.; Holling, G.; Bonarrigo, S.
2017-12-01
Quality assurance and control (QA/QC) is one of the most important yet challenging aspects of producing research-quality data. Data quality issues are multi-faceted, including sensor malfunctions, unmet theoretical assumptions, and measurement interference from humans or the natural environment. Tower networks such as Ameriflux, ICOS, and NEON continue to grow in size and sophistication, yet tools for robust, efficient, scalable QA/QC have lagged. Quality control remains a largely manual process heavily relying on visual inspection of data. In addition, notes of measurement interference are often recorded on paper without an explicit pathway to data flagging. As such, an increase in network size requires a near-proportional increase in personnel devoted to QA/QC, quickly stressing the human resources available. We present a scalable QA/QC framework in development for NEON that combines the efficiency and standardization of automated checks with the power and flexibility of human review. This framework includes fast-response monitoring of sensor health, a mobile application for electronically recording maintenance activities, traditional point-based automated quality flagging, and continuous monitoring of quality outcomes and longer-term holistic evaluations. This framework maintains the traceability of quality information along the entirety of the data generation pipeline, and explicitly links field reports of measurement interference to quality flagging. Preliminary results show that data quality can be effectively monitored and managed for a multitude of sites with a small group of QA/QC staff. Several components of this framework are open-source, including an R-Shiny application for efficiently monitoring, synthesizing, and investigating data quality issues.
Quality Control of Wind Data from 50-MHz Doppler Radar Wind Profiler
NASA Technical Reports Server (NTRS)
Vacek, Austin
2016-01-01
Upper-level wind profiles obtained from a 50-MHz Doppler Radar Wind Profiler (DRWP) instrument at Kennedy Space Center are incorporated in space launch vehicle design and day-of-launch operations to assess wind effects on the vehicle during ascent. Automated and manual quality control (QC) techniques are implemented to remove spurious data in the upper-level wind profiles caused from atmospheric and non-atmospheric artifacts over the 2010-2012 period of record (POR). By adding the new quality controlled profiles with older profiles from 1997-2009, a robust database will be constructed of upper-level wind characteristics. Statistical analysis will determine the maximum, minimum, and 95th percentile of the wind components from the DRWP profiles over recent POR and compare against the older database. Additionally, this study identifies specific QC flags triggered during the QC process to understand how much data is retained and removed from the profiles.
NHEXAS PHASE I ARIZONA STUDY--STANDARD OPERATING PROCEDURE FOR FORM QA/QC CHECKS (UA-C-2.0)
The purpose of this SOP is to outline the process of Field Quality Assurance and Quality Control checks. This procedure was followed to ensure consistent data retrieval during the Arizona NHEXAS project and the "Border" study. Keywords: custody; QA/QC; field checks.
The Nation...
Delis, H; Christaki, K; Healy, B; Loreti, G; Poli, G L; Toroi, P; Meghzifene, A
2017-09-01
Quality control (QC), according to ISO definitions, represents the most basic level of quality. It is considered to be a snapshot of the performance or the characteristics of a product or service, in order to verify that it complies with the requirements. Although it is usually believed that "the role of medical physicists in Diagnostic Radiology is QC", this view not only limits the contribution of medical physicists but is also no longer adequate to meet the needs of Diagnostic Radiology in terms of quality. To assure quality practices, more organized activities and efforts are required in the modern era of diagnostic radiology. The complete system of QC is just one element of a comprehensive quality assurance (QA) program that aims at ensuring that the requirements of quality of a product or service will consistently be fulfilled. A comprehensive quality system starts even before the procurement of any equipment, as the needs analysis and the development of specifications are important components under the QA framework. Further expanding this framework of QA, a comprehensive Quality Management System can provide additional benefits to a Diagnostic Radiology service. Harmonized policies and procedures and elements such as a mission statement or job descriptions can provide clarity and consistency in the services provided, enhancing the outcome and representing a solid platform for quality improvement. The International Atomic Energy Agency (IAEA) promotes this comprehensive quality approach in diagnostic imaging and especially supports the field of comprehensive clinical audits as a tool for quality improvement. Copyright © 2017 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
Large-Scale Topographic Features on Venus: A Comparison by Geological Mapping in Four Quadrangles
NASA Astrophysics Data System (ADS)
Ivanov, M. A.; Head, J. W.
2002-05-01
We have conducted geological mapping in four quadrangles under the NASA program of geological mapping of Venus. Two quadrangles portray large equidimensional lowlands (Lavinia Planitia, V55, and Atalanta Planitia, V4), and two more areas are characterized by a large corona (Quetzalpetlatl Corona, QC, V66) and by Lakshmi Planum (LP, V7). Geological mapping of these large-scale features allows for their broad comparison by both sets of typical structures and sequences of events. The planitiae share a number of similar characteristics. (1) Lavinia and Atalanta are broad quasi-circular lowlands 1-2 km deep. (2) The central portions of the basins lack both coronae and large volcanoes. (3) Belts of tectonic deformation characterize the central portions of the basins. (4) There is evidence in both lowlands that they subsided predominantly before the emplacement of regional plains. (5) Recent volcanism is shifted toward the periphery of the basins and occurred after, or at the late stages of, the formation of the lowlands. The above characteristics of the lowlands are better reconciled with the scenario in which their formation is due to broad-scale mantle downwelling that started relatively early in the visible geologic history of Venus. QC and LP are elevated structures roughly comparable in size. The formation of QC is commonly attributed to large-scale positive mantle diapirism, while the formation of LP remains controversial and both mantle upwelling and downwelling models exist. QC and LP have similar characteristics such as a broadly circular plan-view shape, association with regional highlands, associated relatively young volcanism, and a topographic moat bordering both QC and LP from the north. Despite the above similarities, the striking differences between QC and LP are obvious too. LP is crowned by the highest mountain ranges on Venus, while QC is bordered from the north by a common belt of ridges. LP itself makes up a regional highland within the upland of Ishtar Terra, while QC produces a much less significant topographic anomaly on the background of the highland of Lada Terra. Highly deformed, tessera-like terrain apparently makes up the basement of LP, whereas QC formed in a tessera-free area. Volcanic activity is concentrated in the central portion of LP, while QC is a regionally important center of young volcanism. These differences, which probably cannot be accounted for by a simple difference in the size of LP and QC, suggest non-similar modes of formation of the two regional structures and do not favor the upwelling models of the formation of LP.
Cirillo, Daniela M.; Hoffner, Sven; Ismail, Nazir A.; Kaur, Devinder; Lounis, Nacer; Metchock, Beverly; Pfyffer, Gaby E.; Venter, Amour
2016-01-01
The aim of this study was to establish standardized drug susceptibility testing (DST) methodologies and reference MIC quality control (QC) ranges for bedaquiline, a diarylquinoline antimycobacterial, used in the treatment of adults with multidrug-resistant tuberculosis. Two tier-2 QC reproducibility studies of bedaquiline DST were conducted in eight laboratories using Clinical and Laboratory Standards Institute (CLSI) guidelines. Agar dilution and broth microdilution methods were evaluated. Mycobacterium tuberculosis H37Rv was used as the QC reference strain. Bedaquiline MIC frequency, mode, and geometric mean were calculated. When resulting data occurred outside predefined CLSI criteria, the entire laboratory data set was excluded. For the agar dilution MIC, a 4-dilution QC range (0.015 to 0.12 μg/ml) centered around the geometric mean included 95.8% (7H10 agar dilution; 204/213 observations with one data set excluded) or 95.9% (7H11 agar dilution; 232/242) of bedaquiline MICs. For the 7H9 broth microdilution MIC, a 3-dilution QC range (0.015 to 0.06 μg/ml) centered around the mode included 98.1% (207/211, with one data set excluded) of bedaquiline MICs. Microbiological equivalence was demonstrated for bedaquiline MICs determined using 7H10 agar and 7H11 agar but not for bedaquiline MICs determined using 7H9 broth and 7H10 agar or 7H9 broth and 7H11 agar. Bedaquiline DST methodologies and MIC QC ranges against the H37Rv M. tuberculosis reference strain have been established: 0.015 to 0.12 μg/ml for the 7H10 and 7H11 agar dilution MICs and 0.015 to 0.06 μg/ml for the 7H9 broth microdilution MIC. These methodologies and QC ranges will be submitted to CLSI and EUCAST to inform future research and provide guidance for routine clinical bedaquiline DST in laboratories worldwide. PMID:27654337
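Because MICs fall on a twofold dilution series, the summary statistics above are naturally computed on log2 values. The sketch below shows a hedged version of that arithmetic: geometric mean and mode, plus a coverage check of a proposed QC range; the 95% acceptance threshold in the comment is an illustrative assumption.

```python
import numpy as np

def mic_summary(mics):
    """Geometric mean and mode of MICs observed on a log2 dilution series."""
    logs = np.log2(np.asarray(mics, dtype=float))
    geo_mean = 2 ** logs.mean()
    values, counts = np.unique(mics, return_counts=True)
    mode = values[counts.argmax()]
    return geo_mean, mode

def coverage(mics, lo, hi):
    """Fraction of MIC observations inside the proposed QC range [lo, hi]."""
    x = np.asarray(mics, dtype=float)
    return np.mean((x >= lo) & (x <= hi))   # e.g. require >= 0.95 to accept

print(coverage([0.03, 0.03, 0.06, 0.015, 0.12, 0.06], 0.015, 0.12))  # -> 1.0
```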
NASA Technical Reports Server (NTRS)
Orcutt, John M.; Brenton, James C.
2016-01-01
An accurate database of meteorological data is essential for designing any aerospace vehicle and for preparing launch commit criteria. Meteorological instrumentation was recently placed on the three Lightning Protection System (LPS) towers at Kennedy Space Center (KSC) launch complex 39B (LC-39B), providing a unique meteorological dataset at the launch complex over an extensive altitude range. Data records of temperature, dew point, relative humidity, wind speed, and wind direction are produced at 40, 78, 116, and 139 m at each tower. The Marshall Space Flight Center Natural Environments Branch (EV44) received an archive that consists of one-minute averaged measurements for the period of record of January 2011 - April 2015. However, before the received database could be used, EV44 needed to remove any erroneous data from within the database through a comprehensive quality control (QC) process. The QC process applied to the LPS towers' meteorological data is similar to other QC processes developed by EV44, which were used in the creation of meteorological databases for other towers at KSC. The QC process utilized in this study has been modified specifically for use with the LPS tower database. The QC process first includes a check of each individual sensor. This check includes removing any unrealistic data and checking the temporal consistency of each variable. Next, data from all three sensors at each height are checked against each other, checked against climatology, and checked for sensors that erroneously report a constant value. Then, a vertical consistency check of each variable at each tower is completed. Last, the upwind sensor at each level is selected to minimize the influence of the towers and other structures at LC-39B on the measurements. The selection process for the upwind sensor implemented a study of tower-induced turbulence. This paper describes in detail the QC process, QC results, and the attributes of the LPS towers meteorological database.
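Two of the single-sensor checks described above translate directly into code: a gross range test for unrealistic values and a persistence test for sensors stuck at a constant value. The limits and window length below are illustrative assumptions, not the EV44 criteria.

```python
import numpy as np

def range_check(x, lo, hi):
    """Flag samples outside physically plausible limits (True = flagged)."""
    return (x < lo) | (x > hi)

def persistence_check(x, window=60, eps=1e-3):
    """Flag samples inside any window-length run with (max - min) < eps."""
    x = np.asarray(x, dtype=float)
    flags = np.zeros(len(x), dtype=bool)
    for i in range(len(x) - window + 1):
        seg = x[i:i + window]
        if np.nanmax(seg) - np.nanmin(seg) < eps:
            flags[i:i + window] = True
    return flags

# Example: a stuck sensor (constant 21.37 C) followed by normal variability.
temps = np.r_[np.full(120, 21.37), 20 + np.random.default_rng(1).normal(size=120)]
bad = range_check(temps, -10.0, 45.0) | persistence_check(temps)
```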
Raef, A.
2009-01-01
The recent proliferation of the 3D reflection seismic method into the near-surface area of geophysical applications, especially in response to the emerging need to comprehensively characterize and monitor near-surface carbon dioxide sequestration in shallow saline aquifers around the world, justifies the emphasis on a cost-effective and robust quality control and assurance (QC/QA) workflow for 3D seismic data preprocessing that is suitable for near-surface applications. The main purpose of our seismic data preprocessing QC is to enable the use of appropriate header information, data that are free of noise-dominated traces, and/or flawed vertical stacking in subsequent processing steps. In this article, I provide an account of utilizing survey design specifications, noise properties, first breaks, and normal moveout for rapid and thorough graphical QC/QA diagnostics, which are easy to apply and efficient in the diagnosis of inconsistencies. A correlated vibroseis time-lapse 3D-seismic data set from a CO2-flood monitoring survey is used to demonstrate the QC diagnostics. An important by-product of the QC workflow is establishing the number of layers for a refraction statics model in a data-driven graphical manner that capitalizes on the spatial coverage of the 3D seismic data. © China University of Geosciences (Wuhan) and Springer-Verlag GmbH 2009.
The quality control theory of aging.
Ladiges, Warren
2014-01-01
The quality control (QC) theory of aging is based on the concept that aging is the result of a reduction in QC of cellular systems designed to maintain lifelong homeostasis. Four QC systems associated with aging are 1) inadequate protein processing in a distressed endoplasmic reticulum (ER); 2) histone deacetylase (HDAC) processing of genomic histones and gene silencing; 3) suppressed AMPK nutrient sensing with inefficient energy utilization and excessive fat accumulation; and 4) beta-adrenergic receptor (BAR) signaling and environmental and emotional stress. Reprogramming these systems to maintain efficiency and prevent aging would be a rational strategy for increased lifespan and improved health. The QC theory can be tested with a pharmacological approach using three well-known and safe, FDA-approved drugs: 1) phenyl butyric acid, a chemical chaperone that enhances ER function and is also an HDAC inhibitor, 2) metformin, which activates AMPK and is used to treat type 2 diabetes, and 3) propranolol, a beta blocker which inhibits BAR signaling and is used to treat hypertension and anxiety. A critical aspect of the QC theory, then, is that aging is associated with multiple cellular systems that can be targeted with drug combinations more effectively than with single drugs. But more importantly, these drug combinations will effectively prevent, delay, or reverse chronic diseases of aging that impose such a tremendous health burden on our society.
Mapp, Latisha; Klonicki, Patricia; Takundwa, Prisca; Hill, Vincent R; Schneeberger, Chandra; Knee, Jackie; Raynor, Malik; Hwang, Nina; Chambers, Yildiz; Miller, Kenneth; Pope, Misty
2015-11-01
The U.S. Environmental Protection Agency's (EPA) Water Laboratory Alliance (WLA) currently uses ultrafiltration (UF) for concentration of biosafety level 3 (BSL-3) agents from large volumes (up to 100 L) of drinking water prior to analysis. Most UF procedures require comprehensive training and practice to achieve and maintain proficiency. As a result, there was a critical need to develop quality control (QC) criteria. Because select agents are difficult to work with and pose a significant safety hazard, QC criteria were developed using surrogates, including Enterococcus faecalis and Bacillus atrophaeus. This article presents the results from the QC criteria development study and results from a subsequent demonstration exercise in which E. faecalis was used to evaluate proficiency using UF to concentrate large-volume drinking water samples. Based on preliminary testing, EPA Method 1600 and Standard Methods 9218, for E. faecalis and B. atrophaeus, respectively, were selected for use during the QC criteria development study. The QC criteria established for Method 1600 were used to assess laboratory performance during the demonstration exercise. Based on the results of the QC criteria study, E. faecalis and B. atrophaeus can be used effectively to demonstrate and maintain proficiency using ultrafiltration. Published by Elsevier B.V.
Impacts of Intelligent Automated Quality Control on a Small Animal APD-Based Digital PET Scanner
NASA Astrophysics Data System (ADS)
Charest, Jonathan; Beaudoin, Jean-François; Bergeron, Mélanie; Cadorette, Jules; Arpin, Louis; Lecomte, Roger; Brunet, Charles-Antoine; Fontaine, Réjean
2016-10-01
Stable system performance is mandatory to warrant the accuracy and reliability of biological results relying on small animal positron emission tomography (PET) imaging studies. This simple requirement sets the ground for imposing routine quality control (QC) procedures to keep PET scanners at a reliable optimal performance level. However, such procedures can become burdensome for scanner operators to implement, especially taking into account the increasing number of data acquisition channels in newer-generation PET scanners. In systems using pixel detectors to achieve enhanced spatial resolution and contrast-to-noise ratio (CNR), the QC workload rapidly increases to unmanageable levels due to the number of independent channels involved. An artificial intelligence based QC system, referred to as Scanner Intelligent Diagnosis for Optimal Performance (SIDOP), was proposed to help reduce the QC workload by performing automatic channel fault detection and diagnosis. SIDOP consists of four high-level modules that employ machine learning methods to perform their tasks: Parameter Extraction, Channel Fault Detection, Fault Prioritization, and Fault Diagnosis. Ultimately, SIDOP submits a prioritized faulty channel list to the operator and proposes actions to correct them. To validate that SIDOP can perform QC procedures adequately, it was deployed on a LabPET™ scanner and multiple performance metrics were extracted. After multiple corrections of sub-optimal scanner settings, an 8.5% (with a 95% confidence interval (CI) of [7.6, 9.3]) improvement in the CNR, a 17.0% (CI: [15.3, 18.7]) decrease of the uniformity percentage standard deviation, and a 6.8% gain in global sensitivity were observed. These results confirm that SIDOP can indeed be of assistance in performing QC procedures and restore performance to optimal figures.
Automated radiotherapy treatment plan integrity verification
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yang Deshan; Moore, Kevin L.
2012-03-15
Purpose: In our clinic, physicists spend from 15 to 60 min to verify the physical and dosimetric integrity of radiotherapy plans before presentation to radiation oncology physicians for approval. The purpose of this study was to design and implement a framework to automate as many elements of this quality control (QC) step as possible. Methods: A comprehensive computer application was developed to carry out a majority of these verification tasks in the Philips PINNACLE treatment planning system (TPS). This QC tool functions based on both PINNACLE scripting elements and PERL sub-routines. The core of this technique is the method of dynamic scripting, which involves a PERL programming module that is flexible and powerful for treatment plan data handling. Run-time plan data are collected, saved into temporary files, and analyzed against standard values and predefined logical rules. The results were summarized in a hypertext markup language (HTML) report that is displayed to the user. Results: This tool has been in clinical use for over a year. The occurrence frequency of technical problems, which would cause delays and suboptimal plans, has been reduced since clinical implementation. Conclusions: In addition to drastically reducing the set of human-driven logical comparisons, this QC tool also accomplished some tasks that are otherwise either quite laborious or impractical for humans to verify, e.g., identifying conflicts amongst IMRT optimization objectives.
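The rule-driven idea generalizes beyond any one TPS: run a list of named logical checks against run-time plan data and render pass/fail rows as an HTML report. The plan fields and rules below are illustrative assumptions sketched in Python, whereas the published tool operates on PINNACLE scripting output via PERL subroutines.

```python
def verify_plan(plan):
    """Apply named logical checks to a plan dict; return an HTML summary."""
    checks = [
        ("Dose per fraction set", lambda p: p["dose_cgy"] > 0),
        ("Fractions in normal range", lambda p: 1 <= p["fractions"] <= 45),
        ("No flagged gantry angles", lambda p: all(g not in (170, 190)
                                                  for g in p["gantry_angles"])),
    ]
    rows = []
    for name, rule in checks:
        try:
            ok = bool(rule(plan))
        except KeyError:
            ok = False                       # missing data counts as a failure
        rows.append(f"<tr><td>{name}</td><td>{'PASS' if ok else 'FAIL'}</td></tr>")
    return "<table>" + "".join(rows) + "</table>"

print(verify_plan({"dose_cgy": 200, "fractions": 30, "gantry_angles": [0, 90, 270]}))
```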
FASTQ quality control dashboard
DOE Office of Scientific and Technical Information (OSTI.GOV)
2016-07-25
FQCDB builds on the existing open-source software FastQC, implementing a modern web interface across parsed FastQC output. In addition, FQCDB is extensible as a web service to include additional plots of type line, boxplot, or heatmap across data formatted according to its guidelines. The interface is also configurable via a readable JSON format, enabling customization by non-web programmers.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Günther, Markus, E-mail: markus.guenther@tu-berlin.de; Geißler, Gesa; Köppel, Johann
As there is no one-and-only concept on how to precisely define and establish quality control (QC) or quality assurance (QA) in the making of environmental assessments (EA), this paper presents selected features of international approaches that address quality in EA systems in the USA, the Netherlands, Canada, and the United Kingdom. Based on explanative case studies, we highlight the embedding of specific quality control features within the EA systems, the objectives and processes, and relevant transparency challenges. Such features of QC/QA approaches can be considered in cases where substantial quality control and assurance efforts are still missing. Yet further research needs to be conducted on the efficacy of these approaches, which remains beyond the scope of this study. - Highlights: • We present four tools for quality control and assurance from different EA systems. • Approaches vary in institutional setting, objectives, procedures, and transparency. • Highlighted features might provide guidance in cases where QC/QA is still lacking.
Guillot, Sophie; Guiso, Nicole
2016-08-01
The French National Reference Centre (NRC) for Whooping Cough carried out an external quality control (QC) analysis in 2010 for the PCR diagnosis of whooping cough. The main objective of the study was to assess the impact of this QC in the participating laboratories through a repeat analysis in 2012. Copyright © 2016, American Society for Microbiology. All Rights Reserved.
NASA Astrophysics Data System (ADS)
Sicoe, G. M.; Belu, N.; Rachieru, N.; Nicolae, E. V.
2017-10-01
Presently, in the automotive industry, the tendency is to adapt continuously to change and to reflect market trends in new products, which leads to customer satisfaction. Many quality techniques have been adopted in this field for the continuous improvement of product and process quality, and advantages have been gained. The present paper focuses on the possibilities offered by the use of the Quality Assurance Matrix (QAM) and Quality Control Story (QC Story) to provide the greatest protection against nonconformities in the production process, through a case study in the automotive industry. There is a direct relationship from the QAM to a QC Story analysis: the failures identified using the QAM are treated with the QC Story methodology. Using these methods will help to decrease PPM values and will increase quality performance and customer satisfaction.
Financial Recruitment Incentive Programs for Nursing Personnel in Canada.
Mathews, Maria; Ryan, Dana
2015-03-01
Financial incentives are increasingly offered to recruit nursing personnel to work in underserved communities. The authors describe and compare the characteristics of federal, provincial and territorial financial recruitment incentive programs for registered nurses (RNs), nurse practitioners (NPs), licensed practical nurses (LPNs), registered practical nurses or registered psychiatric nurses. The authors identified incentive programs from government, health ministry and student aid websites and by contacting program officials. Only government-funded recruitment programs providing funding beyond the normal employee wages and benefits and requiring a service commitment were included. The authors excluded programs offered by hospitals, regional or private firms, and programs that rewarded retention. All provinces and territories except QC and NB offer financial recruitment incentive programs for RNs; six provinces (BC, AB, SK, ON, QC and NL) offer programs for NPs, and NL offers a program for LPNs. Programs include student loan forgiveness, tuition forgiveness, education bursaries, signing bonuses and relocation expenses. Programs target trainees, recent graduates and new hires. Funding and service requirements vary by program, and service requirements are not always commensurate with funding levels. This snapshot of government-funded recruitment incentives provides program managers with data to compare and improve nursing workforce recruitment initiatives. Copyright © 2015 Longwoods Publishing.
Impact of dose calibrators quality control programme in Argentina
NASA Astrophysics Data System (ADS)
Furnari, J. C.; de Cabrejas, M. L.; del C. Rotta, M.; Iglicki, F. A.; Milá, M. I.; Magnavacca, C.; Dima, J. C.; Rodríguez Pasqués, R. H.
1992-02-01
The national Quality Control (QC) programme for radionuclide calibrators started 12 years ago. Accuracy and the implementation of a QC programme were evaluated over all these years at 95 nuclear medicine laboratories where dose calibrators were in use. During all that time, the Metrology Group of CNEA has distributed 137Cs sealed sources to check stability and has been performing periodic "checking rounds" and postal surveys using unknown samples (external quality control). An account of the results of both methods is presented. At present, more than 65% of the dose calibrators measure activities with an error of less than 10%.
[Highly quality-controlled radiation therapy].
Shirato, Hiroki
2005-04-01
Advanced radiation therapy for intracranial disease has focused on set-up accuracy for the past 15 years. However, quality control of the prescribed dose is actually as important as the tumor set-up in radiation therapy. Because of the complexity of three-dimensional radiation treatment planning systems in recent years, highly quality-controlled prescription of the dose has now been reappraised as the mainstream approach to improving the treatment outcome of radiation therapy for intracranial disease. The Japanese Committee for Quality Control of Radiation Therapy has developed fundamental requirements such as a QC committee in each hospital, a medical physicist, dosimetrists (QC members), and an external audit.
Network-Centric Quantum Communications
NASA Astrophysics Data System (ADS)
Hughes, Richard
2014-03-01
Single-photon quantum communications (QC) offers "future-proof" cryptographic security rooted in the laws of physics. Today's quantum-secured communications cannot be compromised by unanticipated future technological advances. But to date, QC has only existed in point-to-point instantiations that have limited ability to address the cyber security challenges of our increasingly networked world. In my talk I will describe a fundamentally new paradigm of network-centric quantum communications (NQC) that leverages the network to bring scalable, QC-based security to user groups that may have no direct user-to-user QC connectivity. With QC links only between each of N users and a trusted network node, NQC brings quantum security to N² user pairs, and to multi-user groups. I will describe a novel integrated photonics quantum smartcard ("QKarD") and its operation in a multi-node NQC test bed. The QKarDs are used to implement the quantum cryptographic protocols of quantum identification, quantum key distribution and quantum secret splitting. I will explain how these cryptographic primitives are used to provide key management for encryption, authentication, and non-repudiation for user-to-user communications. My talk will conclude with a description of a recent demonstration that QC can meet both the security and quality-of-service (latency) requirements for electric grid control commands and data. These requirements cannot be met simultaneously with present-day cryptography.
IAEA support to medical physics in nuclear medicine.
Meghzifene, Ahmed; Sgouros, George
2013-05-01
Through its programmatic efforts and its publications, the International Atomic Energy Agency (IAEA) has helped define the role and responsibilities of the nuclear medicine physicist in the practice of nuclear medicine. This paper describes the initiatives that the IAEA has undertaken to support medical physics in nuclear medicine. In 1984, the IAEA provided guidance on how to ensure that the equipment used for detecting, imaging, and quantifying radioactivity is functioning properly (Technical Document [TECDOC]-137, "Quality Control of Nuclear Medicine Instruments"). An updated version of IAEA-TECDOC-137 was issued in 1991 as IAEA-TECDOC-602, and this included new chapters on scanner-computer systems and single-photon emission computed tomography systems. Nuclear medicine physics was introduced as a part of a project on radiation imaging and radioactivity measurements in the 2002-2003 IAEA biennium program in Dosimetry and Medical Radiation Physics. Ten years later, IAEA activities in this field have expanded to cover quality assurance (QA) and quality control (QC) of nuclear medicine equipment, education and clinical training, professional recognition of the role of medical physicists in nuclear medicine physics, and finally, the coordination of research and development activities in internal dosimetry. As a result of these activities, the IAEA has received numerous requests to support the development and implementation of QA or QC programs for radioactivity measurements in nuclear medicine in many Member States. During the last 5 years, support was provided to 20 Member States through the IAEA's technical cooperation programme. The IAEA has also supported education and clinical training of medical physicists. This type of support has been essential for the development and expansion of the medical physics profession, especially in low- and middle-income countries. The need for basic as well as specialized clinical training in medical physics was identified as a priority for healthcare providers in many countries. The IAEA's response to the increasing need for training has been twofold. Through its regular program, priority is given to the development of standardized syllabi and education and clinical training guides. Through its technical cooperation programme, support is given for setting up national medical physics education and clinical training programs. In addition, fellowships are granted to professionals working in the field for specialized training, and workshops are organized at the national and regional level on specialized topics of nuclear medicine physics. To support on-the-job training, the IAEA has also set up a gamma camera laboratory in Seibersdorf, Austria. The laboratory is equipped with QC tools and equipment, and radioisotopes are procured when training events are held. About 2-3 specialized courses are held every year for medical physicists at the IAEA gamma camera laboratory. In the area of research and development, the IAEA supports, through its coordinated research projects, new initiatives in quantitative nuclear medicine and internal dosimetry. The future of nuclear medicine is driven by advances in instrumentation, by the ever increasing availability of computing power and data storage, and by the development of new radiopharmaceuticals for molecular imaging and therapy. Future developments in nuclear medicine are partially driven by, and will influence, nuclear medicine physics and medical physics.
To summarize, the IAEA has established a number of programs to support nuclear medicine physics and will continue to do so through its coordinated research activities, education and training in clinical medical physics, and through programs and meetings to promote standardization and harmonization of QA or QC procedures for imaging and treatment of patients. Copyright © 2013 Elsevier Inc. All rights reserved.
Disk diffusion quality control guidelines for NVP-PDF 713: a novel peptide deformylase inhibitor.
Anderegg, Tamara R; Jones, Ronald N
2004-01-01
NVP-PDF 713 is a peptide deformylase inhibitor that has emerged as a candidate for treating Gram-positive infections and selected Gram-negative species that commonly cause community-acquired respiratory tract infections. This report summarizes the results of a multi-center (seven participants) disk diffusion quality control (QC) investigation for NVP-PDF 713 using guidelines of the National Committee for Clinical Laboratory Standards and the standardized disk diffusion method. A total of 420 NVP-PDF 713 zone diameter values were generated for each QC organism. The proposed zone diameter ranges contained 97.6-99.8% of the reported participant results and were: Staphylococcus aureus ATCC 25923 (25-35 mm), Streptococcus pneumoniae ATCC 49619 (30-37 mm), and Haemophilus influenzae ATCC 49247 (24-32 mm). These QC criteria for the disk diffusion method should be applied during the NVP-PDF 713 clinical trials to maximize test accuracy.
Francy, D.S.; Jones, A.L.; Myers, Donna N.; Rowe, G.L.; Eberle, Michael; Sarver, K.M.
1998-01-01
The U.S. Geological Survey (USGS), Water Resources Division (WRD), requires that quality-assurance/quality-control (QA/QC) activities be included in any sampling and analysis program. Operational QA/QC procedures address local needs while incorporating national policies. Therefore, specific technical policies were established for all activities associated with water-quality projects being done by the Ohio District. The policies described in this report provide Ohio District personnel, cooperating agencies, and others with a reference manual on QA/QC procedures that are followed in collecting and analyzing water-quality samples and reporting water-quality information in the Ohio District. The project chief, project support staff, District Water-Quality Specialist, and District Laboratory Coordinator are all involved in planning and implementing QA/QC activities at the district level. The District Chief and other district-level managers provide oversight, and the Regional Water-Quality Specialist, Office of Water Quality (USGS headquarters), and the Branch of Quality Systems within the Office of Water Quality create national QA/QC policies and provide assistance to District personnel. In the literature, the quality of all measurement data is expressed in terms of precision, variability, bias, accuracy, completeness, representativeness, and comparability. In the Ohio District, bias and variability will be used to describe quality-control data generated from samples in the field and laboratory. Each project chief must plan for implementation and financing of QA/QC activities necessary to achieve data-quality objectives. At least 15 percent of the total project effort must be directed toward QA/QC activities. Of this total, 5-10 percent will be used for collection and analysis of quality-control samples. This is an absolute minimum, and more may be required based on project objectives. Proper techniques must be followed in the collection and processing of surface-water, ground-water, biological, precipitation, bed-sediment, bedload, suspended-sediment, and solid-phase samples. These techniques are briefly described in this report and are extensively documented. The reference documents listed in this report will be kept by the District librarian and District Water-Quality Specialist and updated regularly so that they are available to all District staff. Proper handling and documentation before, during, and after field activities are essential to ensure the integrity of the sample and to correct erroneous reporting of data results. Field sites are to be properly identified and entered into the data base before field data-collection activities begin. During field activities, field notes are to be completed and sample bottles appropriately labeled and stored. After field activities, all paperwork is to be completed promptly and samples transferred to the laboratory within allowable holding times. All equipment used by District personnel for the collection and processing of water-quality samples is to be properly operated, maintained, and calibrated by project personnel. This includes equipment for onsite measurement of water-quality characteristics (temperature, specific conductance, pH, dissolved oxygen, alkalinity, acidity, and turbidity) and equipment and instruments used for biological sampling. The District Water-Quality Specialist and District Laboratory Coordinator are responsible for preventive maintenance and calibration of equipment in the Ohio District laboratory.
The USGS National Water Quality Laboratory in Arvada, Colo., is the primary source of analytical services for most project work done by the Ohio District. Analyses done at the Ohio District laboratory are usually those that must be completed within a few hours of sample collection. Contract laboratories or other USGS laboratories are sometimes used instead of the NWQL or the Ohio District laboratory. When a contract laboratory is used, the projec
Fatemi, Ali; Taghizadeh, Somayeh; Yang, Claus Chunli; R. Kanakamedala, Madhava; Morris, Bart; Vijayakumar, Srinivasan
2017-01-01
Purpose Magnetic resonance (MR) images are necessary for accurate contouring of intracranial targets, determination of gross target volume and evaluation of organs at risk during stereotactic radiosurgery (SRS) treatment planning procedures. Many centers use magnetic resonance imaging (MRI) simulators or regular diagnostic MRI machines for SRS treatment planning; while both types of machine require two stages of quality control (QC), both machine- and patient-specific, before use for SRS, no accepted guidelines for such QC currently exist. This article describes appropriate machine-specific QC procedures for SRS applications. Methods and materials We describe the adaptation of American College of Radiology (ACR)-recommended QC tests using an ACR MRI phantom for SRS treatment planning. In addition, commercial Quasar MRID3D and Quasar GRID3D phantoms were used to evaluate the effects of static magnetic field (B0) inhomogeneity, gradient nonlinearity, and a Leksell G frame (SRS frame) and its accessories on geometrical distortion in MR images. Results QC procedures found distortions in the X-direction (Maximum = 3.5 mm, Mean = 0.91 mm, Standard deviation = 0.67 mm, >2.5 mm (%) = 2), in the Y-direction (Maximum = 2.51 mm, Mean = 0.52 mm, Standard deviation = 0.39 mm, >2.5 mm (%) = 0) and in the Z-direction (Maximum = 13.1 mm, Mean = 2.38 mm, Standard deviation = 2.45 mm, >2.5 mm (%) = 34), with <1 mm distortion at a head-sized region of interest. MR images acquired using a Leksell G frame and localization devices showed a mean absolute deviation of 2.3 mm from isocenter. The results of modified ACR tests were all within recommended limits, and baseline measurements have been defined for regular weekly QC tests. Conclusions With appropriate QC procedures in place, it is possible to routinely obtain clinically useful MR images suitable for SRS treatment planning purposes. MRI examination for SRS planning can benefit from the improved localization and planning possible with the superior image quality and soft tissue contrast achieved under optimal conditions. PMID:29487771
NASA Astrophysics Data System (ADS)
Saavedra, Juan Alejandro
Quality Control (QC) and Quality Assurance (QA) strategies vary significantly across industries in the manufacturing sector depending on the product being built. Such strategies range from simple statistical analysis and process controls to decision-making processes for reworking, repairing, or scrapping defective product. This study proposes an optimal QC methodology that includes rework stations in the manufacturing process by identifying the amount and location of these workstations. The factors considered to optimize these stations are cost, cycle time, reworkability and rework benefit. The goal is to minimize the cost and cycle time of the process, but increase the reworkability and rework benefit. The specific objectives of this study are: (1) to propose a cost estimation model that includes energy consumption, and (2) to propose an optimal QC methodology to identify the quantity and location of rework workstations. The cost estimation model includes energy consumption as part of the product direct cost. The cost estimation model developed allows the user to calculate product direct cost as the quality sigma level of the process changes. This provides a benefit because a complete cost estimation calculation does not need to be performed every time the process yield changes. This cost estimation model is then used for the QC strategy optimization process. In order to propose a methodology that provides an optimal QC strategy, the possible factors that affect QC were evaluated. A screening Design of Experiments (DOE) was performed on seven initial factors and identified three significant factors; it also showed that one response variable was not required for the optimization process. A full factorial DOE was then performed to verify the significant factors obtained previously. The QC strategy optimization is performed through a Genetic Algorithm (GA), which allows the evaluation of several candidate solutions in order to obtain feasible optimal solutions. The GA evaluates possible solutions based on cost, cycle time, reworkability and rework benefit. Because this is a multi-objective optimization problem, it provides several possible solutions. The solutions are presented as chromosomes that clearly state the amount and location of the rework stations. The user analyzes these solutions and selects one by deciding which of the four factors is most important for the product being manufactured or the company's objective. The major contribution of this study is to provide the user with a methodology to identify an effective and optimal QC strategy that incorporates the number and location of rework substations in order to minimize direct product cost and cycle time, and maximize reworkability and rework benefit.
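A minimal sketch of the kind of GA described above, with invented per-station costs, cycle times, reworkabilities, and benefits, and with the four objectives collapsed into one scalarized fitness (the study itself treats this as a multi-objective problem yielding several solutions):

```python
import random

random.seed(1)

N_STATIONS = 10
# Hypothetical per-station attributes (placeholders, not data from the study):
COST = [random.uniform(1, 5) for _ in range(N_STATIONS)]
CYCLE = [random.uniform(0.5, 2) for _ in range(N_STATIONS)]
REWORKABILITY = [random.uniform(0, 1) for _ in range(N_STATIONS)]
BENEFIT = [random.uniform(0, 3) for _ in range(N_STATIONS)]

def fitness(chrom):
    """Minimize cost and cycle time; maximize reworkability and rework benefit."""
    cost = sum(c for c, g in zip(COST, chrom) if g)
    cycle = sum(t for t, g in zip(CYCLE, chrom) if g)
    rework = sum(r for r, g in zip(REWORKABILITY, chrom) if g)
    benefit = sum(b for b, g in zip(BENEFIT, chrom) if g)
    return -cost - cycle + rework + benefit  # simple weighted scalarization

def evolve(pop_size=30, generations=50, p_mut=0.05):
    pop = [[random.randint(0, 1) for _ in range(N_STATIONS)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]  # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, N_STATIONS)  # one-point crossover
            child = a[:cut] + b[cut:]
            child = [1 - g if random.random() < p_mut else g for g in child]
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
print("rework stations at:", [i for i, g in enumerate(best) if g])
```

Each chromosome is a bit string whose set bits mark which candidate workstations receive a rework substation, mirroring the solution encoding the abstract describes.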
Huang, Kai-Fa; Liaw, Su-Sen; Huang, Wei-Lin; Chia, Cho-Yun; Lo, Yan-Chung; Chen, Yi-Ling; Wang, Andrew H.-J.
2011-01-01
Aberrant pyroglutamate formation at the N terminus of certain peptides and proteins, catalyzed by glutaminyl cyclases (QCs), is linked to some pathological conditions, such as Alzheimer disease. Recently, a glutaminyl cyclase (QC) inhibitor, PBD150, was shown to be able to reduce the deposition of pyroglutamate-modified amyloid-β peptides in brain of transgenic mouse models of Alzheimer disease, leading to a significant improvement of learning and memory in those transgenic animals. Here, we report the 1.05–1.40 Å resolution structures, solved by the sulfur single-wavelength anomalous dispersion phasing method, of the Golgi-luminal catalytic domain of the recently identified Golgi-resident QC (gQC) and its complex with PBD150. We also describe the high-resolution structures of secretory QC (sQC)-PBD150 complex and two other gQC-inhibitor complexes. gQC structure has a scaffold similar to that of sQC but with a relatively wider and negatively charged active site, suggesting a distinct substrate specificity from sQC. Upon binding to PBD150, a large loop movement in gQC allows the inhibitor to be tightly held in its active site primarily by hydrophobic interactions. Further comparisons of the inhibitor-bound structures revealed distinct interactions of the inhibitors with gQC and sQC, which are consistent with the results from our inhibitor assays reported here. Because gQC and sQC may play different biological roles in vivo, the different inhibitor binding modes allow the design of specific inhibitors toward gQC and sQC. PMID:21288892
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ford, Eric C., E-mail: eford@uw.edu; Terezakis, Stephanie; Souranis, Annette
Purpose: To quantify the error-detection effectiveness of commonly used quality control (QC) measures. Methods: We analyzed incidents from 2007-2010 logged into a voluntary in-house, electronic incident learning system at 2 academic radiation oncology clinics. None of the incidents resulted in patient harm. Each incident was graded for potential severity using the French Nuclear Safety Authority scoring scale; high potential severity incidents (score >3) were considered, along with a subset of 30 randomly chosen low severity incidents. Each report was evaluated to identify which of 15 common QC checks could have detected it. The effectiveness was calculated, defined as the percentage of incidents that each QC measure could detect, both for individual QC checks and for combinations of checks. Results: In total, 4407 incidents were reported, 292 of which had high potential severity. High- and low-severity incidents were detectable by 4.0 ± 2.3 (mean ± SD) and 2.6 ± 1.4 QC checks, respectively (P<.001). All individual checks were less than 50% sensitive with the exception of pretreatment plan review by a physicist (63%). An effectiveness of 97% was achieved with 7 checks used in combination and was not further improved with more checks. The combination of checks with the highest effectiveness includes physics plan review, physician plan review, Electronic Portal Imaging Device-based in vivo portal dosimetry, radiation therapist timeout, weekly physics chart check, the use of checklists, port films, and source-to-skin distance checks. Some commonly used QC checks such as pretreatment intensity modulated radiation therapy QA do not substantially add to the ability to detect errors in these data. Conclusions: The effectiveness of QC measures in radiation oncology depends sensitively on which checks are used and in which combinations. A small percentage of errors cannot be detected by any of the standard formal QC checks currently in broad use, suggesting that further improvements are needed. These data require confirmation with a broader incident-reporting database.
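The combination analysis amounts to a coverage computation: map each incident to the subset of checks able to catch it, then score a combination by the fraction of incidents caught by at least one member. A toy sketch with invented incident data, built greedily (similar in spirit to, but not necessarily identical with, the authors' procedure):

```python
# Each incident lists the QC checks that could have detected it (toy data).
incidents = [
    {"physics_plan_review"},
    {"physics_plan_review", "physician_plan_review"},
    {"in_vivo_dosimetry", "weekly_chart_check"},
    {"therapist_timeout"},
    {"checklists", "port_films"},
    set(),  # an incident no standard check detects
]

all_checks = set().union(*incidents)

def effectiveness(combo):
    """Fraction of incidents detectable by at least one check in the combo."""
    hit = sum(1 for inc in incidents if inc & combo)
    return hit / len(incidents)

# Greedy: repeatedly add the check that raises coverage the most.
combo = set()
while len(combo) < 7:
    best = max(all_checks - combo, key=lambda c: effectiveness(combo | {c}), default=None)
    if best is None or effectiveness(combo | {best}) == effectiveness(combo):
        break
    combo.add(best)
print(combo, f"effectiveness = {effectiveness(combo):.0%}")
```

The empty set in the toy data plays the role of the paper's residual errors that no standard formal QC check can catch, which caps effectiveness below 100%.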
Production of latex agglutination reagents for pneumococcal serotyping
2013-01-01
Background The current ‘gold standard’ for serotyping pneumococci is the Quellung test. This technique is laborious and requires a certain level of training to correctly perform. Commercial pneumococcal latex agglutination serotyping reagents are available, but these are expensive. In-house production of latex agglutination reagents can be a cost-effective alternative to using commercially available reagents. This paper describes a method for the production and quality control (QC) of latex reagents, including problem solving recommendations, for pneumococcal serotyping. Results Here we describe a method for the production of latex agglutination reagents based on the passive adsorption of antibodies to latex particles. Sixty-five latex agglutination reagents were made using the PneuCarriage Project (PCP) method, of which 35 passed QC. The other 30 reagents failed QC due to auto-agglutination (n=2), no reactivity with target serotypes (n=8) or cross-reactivity with non-target serotypes (n=20). Dilution of antisera resulted in a further 27 reagents passing QC. The remaining three reagents passed QC when prepared without centrifugation and wash steps. Protein estimates indicated that latex reagents that failed QC when prepared using the PCP method passed when made with antiserum containing ≤ 500 μg/ml of protein. Sixty-one nasopharyngeal isolates were serotyped with our in-house latex agglutination reagents, with the results showing complete concordance with the Quellung reaction. Conclusions The method described here to produce latex agglutination reagents allows simple and efficient serotyping of pneumococci and may be applicable to latex agglutination reagents for typing or identification of other microorganisms. We recommend diluting antisera or removing centrifugation and wash steps for any latex reagents that fail QC. Our latex reagents are cost-effective, technically undemanding to prepare and remain stable for long periods of time, making them ideal for use in low-income countries. PMID:23379961
Rossum, Huub H van; Kemperman, Hans
2017-07-26
General application of a moving average (MA) as continuous analytical quality control (QC) for routine chemistry assays has failed due to the lack of a simple method for optimizing MAs. A new method was applied to optimize the MA for routine chemistry and was evaluated in daily practice as a continuous analytical QC instrument. MA procedures were optimized using an MA bias detection simulation procedure. Optimization was graphically supported by bias detection curves. Next, all optimal MA procedures that contributed to the quality assurance were run for 100 consecutive days, and MA alarms generated during working hours were investigated. Optimized MA procedures were applied for 24 chemistry assays. During this evaluation, 303,871 MA values and 76 MA alarms were generated. Of all alarms, 54 (71%) were generated during office hours. Of these, 41 were further investigated; the causes were ion selective electrode (ISE) failure (1), calibration failure not detected by QC due to improper QC settings (1), possible bias (significant difference with the other analyzer) (10), non-human materials analyzed (2), extreme result(s) of a single patient (2), pre-analytical error (1), no cause identified (20), and no conclusion possible (4). MA was implemented in daily practice as a continuous QC instrument for 24 routine chemistry assays. In our setup, requiring follow-up of each MA alarm produced a manageable number of alarms, many of which proved valuable. For the management of MA alarms, several applications/requirements in the MA management software will simplify the use of MA procedures.
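As a rough illustration of the mechanism (not the authors' optimization method), a moving average of consecutive patient results can be tested against limits derived from the assay's in-control mean and SD; the window size and multiplier below are the kind of parameters a bias-detection simulation would tune:

```python
from collections import deque

def ma_qc(results, window=20, mean0=140.0, sd0=3.0, k=1.0):
    """Flag when the moving average of patient results drifts beyond
    mean0 +/- k*sd0/sqrt(window). mean0/sd0 describe the in-control
    distribution; window and k are hypothetical tuning parameters."""
    buf = deque(maxlen=window)
    limit = k * sd0 / window ** 0.5
    alarms = []
    for i, x in enumerate(results):
        buf.append(x)
        if len(buf) == window:
            ma = sum(buf) / window
            if abs(ma - mean0) > limit:
                alarms.append((i, ma))
    return alarms
```

Calling ma_qc on a day's result stream returns the indices where the average drifted out of bounds, i.e., candidate MA alarms for follow-up.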
NASA Astrophysics Data System (ADS)
Susskind, J.; Rosenberg, R. I.
2016-12-01
The GEOS-5 Data Assimilation System (DAS) generates a global analysis every six hours by combining the previous six hour forecast for that time period with contemporaneous observations. These observations include in-situ observations as well as those taken by satellite borne instruments, such as AIRS/AMSU on EOS Aqua and CrIS/ATMS on S-NPP. Operational data assimilation methodology assimilates observed channel radiances Ri for IR sounding instruments such as AIRS and CrIS, but only for those channels i in a given scene whose radiances are thought to be unaffected by clouds. A limitation of this approach is that radiances in most tropospheric sounding channels are affected by clouds under partial cloud cover conditions, which occurs most of the time. The AIRS Science Team Version-6 retrieval algorithm generates cloud cleared radiances (CCR's) for each channel in a given scene, which represent the radiances AIRS would have observed if the scene were cloud free, and then uses them to determine quality controlled (QC'd) temperature profiles T(p) under all cloud conditions. There are potential advantages to assimilate either AIRS QC'd CCR's or QC'd T(p) instead of Ri in that the spatial coverage of observations is greater under partial cloud cover. We tested these two alternate data assimilation approaches by running three parallel data assimilation experiments over different time periods using GEOS-5. Experiment 1 assimilated all observations as done operationally, Experiment 2 assimilated QC'd values of AIRS CCRs in place of AIRS radiances, and Experiment 3 assimilated QC'd values of T(p) in place of observed radiances. Assimilation of QC'd AIRS T(p) resulted in significant improvement in seven day forecast skill compared to assimilation of CCR's or assimilation of observed radiances, especially in the Southern Hemisphere Extra-tropics.
NASA Astrophysics Data System (ADS)
Chan, S.; Billesbach, D. P.; Hanson, C. V.; Biraud, S.
2014-12-01
The AmeriFlux quality assurance and quality control (QA/QC) technical team conducts short term (<2 weeks) intercomparisons using a portable eddy covariance system (PECS) to maintain high quality data observations and data consistency across the AmeriFlux network (http://ameriflux.lbl.gov/). Site intercomparisons identify discrepancies between the in situ and portable measurements and calculated fluxes. Findings are jointly discussed by the site staff and the QA/QC team to improve the in situ observations. Despite the relatively short duration of an individual site intercomparison, the accumulated record of all site visits (numbering over 100 since 2002) is a unique dataset. The ability to deploy redundant sensors provides a rare opportunity to identify, quantify, and understand uncertainties in eddy covariance and ancillary measurements. We present a few specific case studies from QA/QC site visits to highlight and share new and relevant findings related to eddy covariance instrumentation and operation.
Brummel, Olaf; Waidhas, Fabian; Bauer, Udo; Wu, Yanlin; Bochmann, Sebastian; Steinrück, Hans-Peter; Papp, Christian; Bachmann, Julien; Libuda, Jörg
2017-07-06
The two valence isomers norbornadiene (NBD) and quadricyclane (QC) enable solar energy storage in a single molecule system. We present a new photoelectrochemical infrared reflection absorption spectroscopy (PEC-IRRAS) experiment, which allows monitoring of the complete energy storage and release cycle by in situ vibrational spectroscopy. Both processes were investigated, the photochemical conversion from NBD to QC using the photosensitizer 4,4'-bis(dimethylamino)benzophenone (Michler's ketone, MK) and the electrochemically triggered cycloreversion from QC to NBD. Photochemical conversion was obtained with characteristic conversion times on the order of 500 ms. All experiments were performed under full potential control in a thin-layer configuration with a Pt(111) working electrode. The vibrational spectra of NBD, QC, and MK were analyzed in the fingerprint region, permitting quantitative analysis of the spectroscopic data. We determined selectivities for both the photochemical conversion and the electrochemical cycloreversion and identified the critical steps that limit the reversibility of the storage cycle.
Phase 2 Site Investigations Report. Volume 3 of 3: Appendices
1994-09-01
Phase II Site Investigations Report, Volume III of III: Appendices. Fort Devens Sudbury Training Annex, Massachusetts, September 1994. Contract No. ... laboratory quality control (QC) samples collected during field investigations at the Sudbury Training Annex of Fort Devens, Massachusetts. The QC ... returned to its original condition. E & E performed this procedure for each monitoring well tested during the 1993 slug testing activities at Fort Devens.
NASA Technical Reports Server (NTRS)
Barbre, Robert E., Jr.
2012-01-01
This paper presents the process used by the Marshall Space Flight Center Natural Environments Branch (EV44) to quality control (QC) data from the Kennedy Space Center's 50-MHz Doppler Radar Wind Profiler (DRWP) for use in vehicle wind loads and steering commands. The database has been built to mitigate limitations of using the currently archived databases from weather balloons. The DRWP database contains wind measurements from approximately 2.7-18.6 km altitude at roughly five minute intervals for the August 1997 to December 2009 period of record, and the extensive QC process was designed to remove spurious data from various forms of atmospheric and non-atmospheric artifacts. The QC process is largely based on DRWP literature, but two new algorithms have been developed to remove data contaminated by convection and excessive first guess propagations from the Median Filter First Guess Algorithm. In addition to describing the automated and manual QC process in detail, this paper describes the extent of the data retained. Roughly 58% of all possible wind observations exist in the database, with approximately 100 times as many complete profile sets existing relative to the EV44 balloon databases. This increased sample of near-continuous wind profile measurements may help increase launch availability by reducing the uncertainty of wind changes during launch countdown.
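The published DRWP algorithms are instrument-specific, but the flavor of an automated profile consistency check can be sketched generically: compare each range gate's wind against a running median of its vertical neighbors and flag large departures (thresholds invented; this is not the EV44 algorithm):

```python
import numpy as np

def flag_outlier_gates(u, half_width=3, max_dev=8.0):
    """Flag wind values (m/s) deviating from the running vertical median
    of neighboring gates by more than max_dev. Generic median-consistency
    check; half_width and max_dev are illustrative choices."""
    u = np.asarray(u, dtype=float)
    flags = np.zeros(u.size, dtype=bool)
    for i in range(u.size):
        lo, hi = max(0, i - half_width), min(u.size, i + half_width + 1)
        neighbors = np.delete(u[lo:hi], i - lo)  # exclude the gate itself
        if neighbors.size and abs(u[i] - np.median(neighbors)) > max_dev:
            flags[i] = True
    return flags
```

In an operational pipeline this kind of automated screen would be followed by the manual review step the paper describes, since convective contamination can look locally self-consistent.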
Torres, Leticia; Liu, Yue; Guitreau, Amy; Yang, Huiping; Tiersch, Terrence R
2017-12-01
Quality control (QC) is essential for reproducible and efficient functioning of germplasm repositories. However, many biomedical fish models present significant QC challenges due to small body sizes (<5 cm) and miniscule sperm volumes (<5 μL). Using minimal volumes of sperm, we used zebrafish to evaluate common QC endpoints as surrogates for fertilization success along sequential steps of cryopreservation. First, concentrations of calibration bead suspensions were evaluated with a Makler® counting chamber by using different sample volumes and mixing methods. For sperm analysis, samples were initially diluted at a 1:30 ratio with Hanks' balanced salt solution (HBSS). Motility was evaluated by using different ratios of sperm and activation medium, and membrane integrity was analyzed with flow cytometry at different concentrations. Concentration and sperm motility could be confidently estimated by using volumes as small as 1 μL, whereas membrane integrity required a minimum of 2 μL (at 1 × 10⁶ cells/mL). Thus, <5 μL of sperm suspension (after dilution to 30-150 μL with HBSS) was required to evaluate sperm quality by using three endpoints. Sperm quality assessment using a combination of complementary endpoints enhances QC efforts during cryopreservation, increasing reliability and reproducibility, and reducing waste of time and resources.
Kim, Sung-Su; Choi, Hyun-Jeung; Kim, Jin Ju; Kim, M Sun; Lee, In-Seon; Byun, Bohyun; Jia, Lina; Oh, Myung Ryurl; Moon, Youngho; Park, Sarah; Choi, Joon-Seok; Chae, Seoung Wan; Nam, Byung-Ho; Kim, Jin-Soo; Kim, Jihun; Min, Byung Soh; Lee, Jae Seok; Won, Jae-Kyung; Cho, Soo Youn; Choi, Yoon-La; Shin, Young Kee
2018-01-11
In clinical translational research and molecular in vitro diagnostics, a major challenge in the detection of genetic mutations is overcoming artefactual results caused by the low-quality of formalin-fixed paraffin-embedded tissue (FFPET)-derived DNA (FFPET-DNA). Here, we propose the use of an 'internal quality control (iQC) index' as a criterion for judging the minimum quality of DNA for PCR-based analyses. In a pre-clinical study comparing the results from droplet digital PCR-based EGFR mutation test (ddEGFR test) and qPCR-based EGFR mutation test (cobas EGFR test), iQC index ≥ 0.5 (iQC copies ≥ 500, using 3.3 ng of FFPET-DNA [1,000 genome equivalents]) was established, indicating that more than half of the input DNA was amplifiable. Using this criterion, we conducted a retrospective comparative clinical study of the ddEGFR and cobas EGFR tests for the detection of EGFR mutations in non-small cell lung cancer (NSCLC) FFPET-DNA samples. Compared with the cobas EGFR test, the ddEGFR test exhibited superior analytical performance and equivalent or higher clinical performance. Furthermore, iQC index is a reliable indicator of the quality of FFPET-DNA and could be used to prevent incorrect diagnoses arising from low-quality samples.
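Under the calibration stated in the abstract (3.3 ng of human FFPET-DNA ≈ 1,000 genome equivalents, i.e., roughly 3.3 pg per haploid genome), the pass criterion reduces to a simple ratio; a minimal sketch:

```python
PG_PER_HAPLOID_GENOME = 3.3  # ~3.3 pg of DNA per haploid human genome

def iqc_index(amplifiable_copies: float, dna_input_ng: float) -> float:
    """Fraction of input genome equivalents that proved amplifiable."""
    genome_equivalents = dna_input_ng * 1000.0 / PG_PER_HAPLOID_GENOME
    return amplifiable_copies / genome_equivalents

# 500 amplifiable copies from 3.3 ng (1,000 genome equivalents) -> index 0.5
assert abs(iqc_index(500, 3.3) - 0.5) < 1e-9
print("pass" if iqc_index(620, 3.3) >= 0.5 else "fail")
```

The function name and interface are hypothetical; the paper's iQC copies come from a PCR-based measurement, not from a direct count.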
SU-E-T-473: A Patient-Specific QC Paradigm Based On Trajectory Log Files and DICOM Plan Files
DOE Office of Scientific and Technical Information (OSTI.GOV)
DeMarco, J; McCloskey, S; Low, D
Purpose: To evaluate a remote QC tool for monitoring treatment machine parameters and treatment workflow. Methods: The Varian TrueBeam™ linear accelerator is a digital machine that records machine axis parameters and MLC leaf positions as a function of delivered monitor unit or control point. This information is saved to a binary trajectory log file for every treatment or imaging field in the patient treatment session. A MATLAB analysis routine was developed to parse the trajectory log files for a given patient, compare the expected versus actual machine and MLC positions as well as perform a cross-comparison with the DICOM-RT plan file exported from the treatment planning system. The parsing routine sorts the trajectory log files based on the time and date stamp and generates a sequential report file listing treatment parameters and provides a match relative to the DICOM-RT plan file. Results: The trajectory log parsing routine was compared against a standard record and verify listing for patients undergoing initial IMRT dosimetry verification and weekly and final chart QC. The complete treatment course was independently verified for 10 patients of varying treatment site and a total of 1267 treatment fields were evaluated including pre-treatment imaging fields where applicable. In the context of IMRT plan verification, eight prostate SBRT plans with 4-arcs per plan were evaluated based on expected versus actual machine axis parameters. The average value for the maximum RMS MLC error was 0.067±0.001mm and 0.066±0.002mm for leaf bank A and B respectively. Conclusion: A real-time QC analysis program was tested using trajectory log files and DICOM-RT plan files. The parsing routine is efficient and able to evaluate all relevant machine axis parameters during a patient treatment course including MLC leaf positions and table positions at time of image acquisition and during treatment.
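The heart of such a tool is an elementwise comparison of expected versus actual axis values per control point. A simplified sketch (assuming positions already decoded from the binary log into arrays; not Varian's file format) computes the per-leaf RMS error and its maximum, the statistic quoted above:

```python
import numpy as np

def max_rms_leaf_error(expected, actual):
    """expected/actual: (n_control_points, n_leaves) leaf positions in mm,
    as decoded from a trajectory log. Returns the worst per-leaf RMS error."""
    err = np.asarray(actual, float) - np.asarray(expected, float)
    rms_per_leaf = np.sqrt((err ** 2).mean(axis=0))
    return rms_per_leaf.max()

# Toy check: 100 control points, 60 leaves, ~0.05 mm positioning noise
rng = np.random.default_rng(0)
planned = rng.uniform(-50, 50, size=(100, 60))
delivered = planned + rng.normal(0, 0.05, size=planned.shape)
print(f"max RMS leaf error: {max_rms_leaf_error(planned, delivered):.3f} mm")
```

The same comparison pattern extends to gantry angle, collimator angle, and table position, with the DICOM-RT plan supplying the expected values for the cross-check.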
Individualized Quality Control Plan (IQCP): Is It Value-Added for Clinical Microbiology?
Miller, Melissa B.; Hindler, Janet
2015-01-01
The Centers for Medicare & Medicaid Services (CMS) recently published their Individualized Quality Control Plan (IQCP [https://www.cms.gov/regulations-and-guidance/legislation/CLIA/Individualized_Quality_Control_Plan_IQCP.html]), which will be the only option for quality control (QC) starting in January 2016 if laboratories choose not to perform Clinical Laboratory Improvement Act (CLIA) [U.S. Statutes at Large 81(1967):533] default QC. Laboratories will no longer be able to use “equivalent QC” (EQC) or the Clinical and Laboratory Standards Institute (CLSI) standards alone for quality control of their microbiology systems. The implementation of IQCP in clinical microbiology laboratories will most certainly be an added burden, the benefits of which are currently unknown. PMID:26447112
NASA Astrophysics Data System (ADS)
Hussmann, Stephan; Lau, Wing Y.; Chu, Terry; Grothof, Markus
2003-07-01
Traditionally, the measuring or monitoring systems of manufacturing industries use sensors, computers and screens for their quality control (Q.C.). The acquired information is fed back to the control room by wires, which - for obvious reasons - are not suitable in many environments. This paper describes a method to solve this problem by employing the new Bluetooth technology to set up a complete new system, where a total wireless solution is made feasible. This new Q.C. system allows several line scan cameras to be connected at once to a graphical user interface (GUI) that can monitor the production process. There are many Bluetooth devices available on the market such as cell phones, headsets, printers, PDAs, etc. However, the detailed application is a novel implementation in the industrial Q.C. area. This paper contains more details about the Bluetooth standard and why it is used (network topologies, host controller interface, data rates, etc.), the Bluetooth implementation in the microcontroller of the line scan camera, and the GUI and its features.
Analysis of glycoprotein processing in the endoplasmic reticulum using synthetic oligosaccharides.
Ito, Yukishige; Takeda, Yoichi
2012-01-01
Protein quality control (QC) in the endoplasmic reticulum (ER) comprises many steps, including folding and transport of nascent proteins as well as degradation of misfolded proteins. Recent studies have revealed that high-mannose-type glycans play a pivotal role in the QC process. To gain knowledge about the molecular basis of this process with well-defined homogeneous compounds, we achieved a convergent synthesis of high-mannose-type glycans and their functionalized derivatives. We focused on analyses of UDP-Glc: glycoprotein glucosyltransferase (UGGT) and ER Glucosidase II, which play crucial roles in glycoprotein QC; however, their specificities remain unclear. In addition, we established an in vitro assay system mimicking the in vivo condition which is highly crowded because of the presence of various biomacromolecules.
Wei, Ling; Shi, Jianfeng; Afari, George; Bhattacharyya, Sibaprasad
2014-01-01
Panitumumab is a fully human monoclonal antibody approved for the treatment of epidermal growth factor receptor (EGFR) positive colorectal cancer. Recently, panitumumab has been radiolabeled with 89Zr and evaluated for its potential to be used as immuno-positron emission tomography (PET) probe for EGFR positive cancers. Interesting preclinical results published by several groups of researchers have prompted us to develop a robust procedure for producing clinical-grade 89Zr-panitumumab as an immuno-PET probe to evaluate EGFR-targeted therapy. In this process, clinical-grade panitumumab is bio-conjugated with desferrioxamine chelate and subsequently radiolabeled with 89Zr resulting in high radiochemical yield (>70%, n=3) and purity (>98%, n=3). All quality control (QC) tests were performed according to United States Pharmacopeia specifications. QC tests showed that 89Zr-panitumumab met all specifications for human injection. Herein, we describe a step-by-step method for the facile synthesis and QC tests of 89Zr-panitumumab for medical use. The entire process of bioconjugation, radiolabeling, and all QC tests will take about 5 h. Because the synthesis is fully manual, two rapid, in-process QC tests have been introduced to make the procedure robust and error free. PMID:24448743
QC-ART: A tool for real-time quality control assessment of mass spectrometry-based proteomics data.
Stanfill, Bryan A; Nakayasu, Ernesto S; Bramer, Lisa M; Thompson, Allison M; Ansong, Charles K; Clauss, Therese; Gritsenko, Marina A; Monroe, Matthew E; Moore, Ronald J; Orton, Daniel J; Piehowski, Paul D; Schepmoes, Athena A; Smith, Richard D; Webb-Robertson, Bobbie-Jo; Metz, Thomas O
2018-04-17
Liquid chromatography-mass spectrometry (LC-MS)-based proteomics studies of large sample cohorts can easily require from months to years to complete. Acquiring consistent, high-quality data in such large-scale studies is challenging because of normal variations in instrumentation performance over time, as well as artifacts introduced by the samples themselves, such as those due to collection, storage and processing. Existing quality control methods for proteomics data primarily focus on post-hoc analysis to remove low-quality data that would degrade downstream statistics; they are not designed to evaluate the data in near real-time, which would allow for interventions as soon as deviations in data quality are detected. In addition to flagging analyses that demonstrate outlier behavior, evaluating how the data structure changes over time can aid in understanding typical instrument performance or identify issues such as a degradation in data quality due to the need for instrument cleaning and/or re-calibration. To address this gap for proteomics, we developed Quality Control Analysis in Real-Time (QC-ART), a tool for evaluating data as they are acquired in order to dynamically flag potential issues with instrument performance or sample quality. QC-ART has similar accuracy as standard post-hoc analysis methods with the additional benefit of real-time analysis. We demonstrate the utility and performance of QC-ART in identifying deviations in data quality due to both instrument and sample issues in near real-time for LC-MS-based plasma proteomics analyses of a sample subset of The Environmental Determinants of Diabetes in the Young cohort. We also present a case where QC-ART facilitated the identification of oxidative modifications, which are often underappreciated in proteomic experiments. Published under license by The American Society for Biochemistry and Molecular Biology, Inc.
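QC-ART's statistics are richer than this, but the real-time idea can be illustrated with a robust z-score: score each newly acquired run's QC metrics against a baseline window of recent acceptable runs using medians and MADs, flagging deviations as they arrive (a minimal sketch, not the QC-ART algorithm):

```python
import numpy as np

def robust_flags(baseline, new_run, z_cut=3.5):
    """baseline: (n_runs, n_metrics) QC metrics from recent acceptable runs.
    new_run: (n_metrics,) metrics of the run just acquired.
    Returns a boolean flag per metric (True = deviates from baseline)."""
    baseline = np.asarray(baseline, float)
    med = np.median(baseline, axis=0)
    mad = np.median(np.abs(baseline - med), axis=0) + 1e-12  # avoid div by 0
    z = 0.6745 * (np.asarray(new_run, float) - med) / mad    # ~N(0,1) scale
    return np.abs(z) > z_cut
```

Because the baseline window slides forward as new acceptable runs accumulate, this style of check also tracks the slow drift in "typical" instrument behavior that the abstract highlights.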
Li, Chunhua; Lu, Ling; Wu, Xianghong; Wang, Chuanxi; Bennett, Phil; Lu, Teng; Murphy, Donald
2009-08-01
In this study, we characterized the full-length genomic sequences of 13 distinct hepatitis C virus (HCV) genotype 4 isolates/subtypes: QC264/4b, QC381/4c, QC382/4d, QC193/4g, QC383/4k, QC274/4l, QC249/4m, QC97/4n, QC93/4o, QC139/4p, QC262/4q, QC384/4r and QC155/4t. These were amplified, using RT-PCR, from the sera of patients now residing in Canada, 11 of which were African immigrants. The resulting genomes varied between 9421 and 9475 nt in length and each contains a single ORF of 9018-9069 nt. The sequences showed nucleotide similarities of 77.3-84.3 % in comparison with subtypes 4a (GenBank accession no. Y11604) and 4f (EF589160) and 70.6-72.8 % in comparison with genotype 1 (M62321/1a, M58335/1b, D14853/1c, and 1?/AJ851228) reference sequences. These similarities were often higher than those currently defined by HCV classification criteria for subtype (75.0-80.0 %) and genotype (67.0-70.0 %) division, respectively. Further analyses of the complete and partial E1 and partial NS5B sequences confirmed these 13 'provisionally assigned subtypes'.
From field notes to data portal - An operational QA/QC framework for tower networks
NASA Astrophysics Data System (ADS)
Sturtevant, C.; Hackley, S.; Meehan, T.; Roberti, J. A.; Holling, G.; Bonarrigo, S.
2016-12-01
Quality assurance and control (QA/QC) is one of the most important yet challenging aspects of producing research-quality data. This is especially so for environmental sensor networks collecting numerous high-frequency measurement streams at distributed sites. Here, the quality issues are multi-faceted, including sensor malfunctions, unmet theoretical assumptions, and measurement interference from the natural environment. To complicate matters, there are often multiple personnel managing different sites or different steps in the data flow. For large, centrally managed sensor networks such as NEON, the separation of field and processing duties is in the extreme. Tower networks such as Ameriflux, ICOS, and NEON continue to grow in size and sophistication, yet tools for robust, efficient, scalable QA/QC have lagged. Quality control remains a largely manual process relying on visual inspection of the data. In addition, notes of observed measurement interference or visible problems are often recorded on paper without an explicit pathway to data flagging during processing. As such, an increase in network size requires a near-proportional increase in personnel devoted to QA/QC, quickly stressing the human resources available. There is a need for a scalable, operational QA/QC framework that combines the efficiency and standardization of automated tests with the power and flexibility of visual checks, and includes an efficient communication pathway from field personnel to data processors to end users. Here we propose such a framework and an accompanying set of tools in development, including a mobile application template for recording tower maintenance and an R/shiny application for efficiently monitoring and synthesizing data quality issues. This framework seeks to incorporate lessons learned from the Ameriflux community and provide tools to aid continued network advancements.
Integrative Blood Pressure Response to Upright Tilt Post Renal Denervation
Howden, Erin J.; East, Cara; Lawley, Justin S.; Stickford, Abigail S.L.; Verhees, Myrthe; Fu, Qi
2017-01-01
Abstract BACKGROUND Whether renal denervation (RDN) in patients with resistant hypertension normalizes blood pressure (BP) regulation in response to routine cardiovascular stimuli such as upright posture is unknown. We conducted an integrative study of BP regulation in patients with resistant hypertension who had received RDN to characterize autonomic circulatory control. METHODS Twelve patients (60 ± 9 [SD] years, n = 10 males) who participated in the Symplicity HTN-3 trial were studied and compared to 2 age-matched normotensive (Norm) and hypertensive (unmedicated, HTN) control groups. BP, heart rate (HR), cardiac output (Qc), muscle sympathetic nerve activity (MSNA), and neurohormonal variables were measured supine, and 30° (5 minutes) and 60° (20 minutes) head-up-tilt (HUT). Total peripheral resistance (TPR) was calculated from mean arterial pressure and Qc. RESULTS Despite treatment with RDN and 4.8 (range, 3–7) antihypertensive medications, the RDN had significantly higher supine systolic BP compared to Norm and HTN (149 ± 15 vs. 118 ± 6, 108 ± 8 mm Hg, P < 0.001). When supine, RDN had higher HR, TPR, MSNA, plasma norepinephrine, and effective arterial elastance compared to Norm. Plasma norepinephrine, Qc, and HR were also higher in the RDN vs. HTN. During HUT, BP remained higher in the RDN, due to increases in Qc, plasma norepinephrine, and aldosterone. CONCLUSION We provide evidence of a possible mechanism by which BP remains elevated post RDN, with the observation of increased Qc and arterial stiffness, as well as plasma norepinephrine and aldosterone levels at approximately 2 years post treatment. These findings may be the consequence of incomplete ablation of sympathetic renal nerves or be related to other factors. PMID:28338768
Quality assurance and quality control in mammography: a review of available guidance worldwide.
Reis, Cláudia; Pascoal, Ana; Sakellaris, Taxiarchis; Koutalonis, Manthos
2013-10-01
Review available guidance for quality assurance (QA) in mammography and discuss its contribution to harmonise practices worldwide. Literature search was performed on different sources to identify guidance documents for QA in mammography available worldwide in international bodies, healthcare providers, professional/scientific associations. The guidance documents identified were reviewed and a selection was compared for type of guidance (clinical/technical), technology and proposed QA methodologies focusing on dose and image quality (IQ) performance assessment. Fourteen protocols (targeted at conventional and digital mammography) were reviewed. All included recommendations for testing acquisition, processing and display systems associated with mammographic equipment. All guidance reviewed highlighted the importance of dose assessment and testing the Automatic Exposure Control (AEC) system. Recommended tests for assessment of IQ showed variations in the proposed methodologies. Recommended testing focused on assessment of low-contrast detection, spatial resolution and noise. QC of image display is recommended following the American Association of Physicists in Medicine guidelines. The existing QA guidance for mammography is derived from key documents (American College of Radiology and European Union guidelines) and proposes similar tests despite the variations in detail and methodologies. Studies reported on QA data should provide detail on experimental technique to allow robust data comparison. Countries aiming to implement a mammography/QA program may select/prioritise the tests depending on available technology and resources.
• An effective QA program should be practical to implement in a clinical setting.
• QA should address the various stages of the imaging chain: acquisition, processing and display.
• AEC system QC testing is simple to implement and provides information on equipment performance.
Sho, Shonan; Court, Colin M; Winograd, Paul; Lee, Sangjun; Hou, Shuang; Graeber, Thomas G; Tseng, Hsian-Rong; Tomlinson, James S
2017-07-01
Sequencing analysis of circulating tumor cells (CTCs) enables "liquid biopsy" to guide precision oncology strategies. However, this requires low-template whole genome amplification (WGA) that is prone to errors and biases from uneven amplifications. Currently, quality control (QC) methods for WGA products, as well as the number of CTCs needed for reliable downstream sequencing, remain poorly defined. We sought to define strategies for selecting and generating optimal WGA products from low-template input as it relates to their potential applications in precision oncology strategies. Single pancreatic cancer cells (HPAF-II) were isolated using laser microdissection. WGA was performed using multiple displacement amplification (MDA), multiple annealing and looping based amplification (MALBAC) and PicoPLEX. Quality of amplified DNA products were assessed using a multiplex/RT-qPCR based method that evaluates for 8-cancer related genes and QC-scores were assigned. We utilized this scoring system to assess the impact of de novo modifications to the WGA protocol. WGA products were subjected to Sanger sequencing, array comparative genomic hybridization (aCGH) and next generation sequencing (NGS) to evaluate their performances in respective downstream analyses providing validation of the QC-score. Single-cell WGA products exhibited a significant sample-to-sample variability in amplified DNA quality as assessed by our 8-gene QC assay. Single-cell WGA products that passed the pre-analysis QC had lower amplification bias and improved aCGH/NGS performance metrics when compared to single-cell WGA products that failed the QC. Increasing the number of cellular input resulted in improved QC-scores overall, but a resultant WGA product that consistently passed the QC step required a starting cellular input of at least 20-cells. Our modified-WGA protocol effectively reduced this number, achieving reproducible high-quality WGA products from ≥5-cells as a starting template. A starting cellular input of 5 to 10-cells amplified using the modified-WGA achieved aCGH and NGS results that closely matched that of unamplified, batch genomic DNA. The modified-WGA protocol coupled with the 8-gene QC serve as an effective strategy to enhance the quality of low-template WGA reactions. Furthermore, a threshold number of 5-10 cells are likely needed for a reliable WGA reaction and product with high fidelity to the original starting template.
20 CFR 602.21 - Standard methods and procedures.
Code of Federal Regulations, 2010 CFR
2010-04-01
..., (2) Use a questionnaire, prescribed by the Department, which is designed to obtain such data as the Department deems necessary for the operation of the QC program; require completion of the questionnaire by...
NASA Astrophysics Data System (ADS)
Christianson, D. S.; Beekwilder, N.; Chan, S.; Cheah, Y. W.; Chu, H.; Dengel, S.; O'Brien, F.; Pastorello, G.; Sandesh, M.; Torn, M. S.; Agarwal, D.
2017-12-01
AmeriFlux is a network of scientists who independently collect eddy covariance and related environmental observations at over 250 locations across the Americas. As part of the AmeriFlux Management Project, the AmeriFlux Data Team manages standardization, collection, quality assurance / quality control (QA/QC), and distribution of data submitted by network members. To generate data products that are timely, QA/QC'd, and repeatable, and have traceable provenance, we developed a semi-automated data processing pipeline. The new pipeline consists of semi-automated format and data QA/QC checks. Results are communicated via on-line reports as well as an issue-tracking system. Data processing time has been reduced from 2-3 days to a few hours of manual review time, resulting in faster data availability from the time of data submission. The pipeline is scalable to the network level and has the following key features. (1) On-line results of the format QA/QC checks are available immediately for data provider review. This enables data providers to correct and resubmit data quickly. (2) The format QA/QC assessment includes an automated attempt to fix minor format errors. Data submissions that are formatted in the new AmeriFlux FP-In standard can be queued for the data QA/QC assessment, often with minimal delay. (3) Automated data QA/QC checks identify and communicate potentially erroneous data via online, graphical quick views that highlight observations with unexpected values, incorrect units, time drifts, invalid multivariate correlations, and/or radiation shadows. (4) Progress through the pipeline is integrated with an issue-tracking system that facilitates communications between data providers and the data processing team in an organized and searchable fashion. Through development of these and other features of the pipeline, we present solutions to challenges that include optimizing automated with manual processing, bridging legacy data management infrastructure with various software tools, and working across interdisciplinary and international science cultures. Additionally, we discuss results from community member feedback that helped refine QA/QC communications for efficient data submission and revision.
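A single automated format check in such a pipeline might look like the simplified sketch below: verify required variables and attempt the kind of minor automated repair described in feature (2) (the column names and alias table are illustrative, not the FP-In standard):

```python
REQUIRED = ["TIMESTAMP_START", "TIMESTAMP_END", "FC", "LE", "H"]
ALIASES = {"TIMESTAMP_BEGIN": "TIMESTAMP_START", "Fc": "FC"}  # fixable minor errors

def format_qc(header):
    """Return (fixed_header, issues); unfixable problems are reported
    so they can be routed to the issue-tracking system."""
    fixed, issues = [], []
    for name in header:
        if name in ALIASES:
            issues.append(f"renamed {name} -> {ALIASES[name]}")
            name = ALIASES[name]
        fixed.append(name)
    for req in REQUIRED:
        if req not in fixed:
            issues.append(f"missing required column {req}")
    return fixed, issues

print(format_qc(["TIMESTAMP_BEGIN", "TIMESTAMP_END", "Fc", "LE", "H"]))
```

In the pipeline described above, a submission whose issues are all auto-fixable would queue directly for the data QA/QC stage; anything else generates an online report back to the data provider.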
A real-time automated quality control of rain gauge data based on multiple sensors
NASA Astrophysics Data System (ADS)
qi, Y.; Zhang, J.
2013-12-01
Precipitation is one of the most important meteorological and hydrological variables. Automated rain gauge networks provide direct measurements of precipitation and have been used for numerous applications such as generating regional and national precipitation maps, calibrating remote sensing data, and validating hydrological and meteorological model predictions. Automated gauge observations are prone to a variety of error sources (instrument malfunction, transmission errors, format changes), and require careful quality control (QC). Many previous gauge QC techniques were based on neighborhood checks within the gauge network itself, and their effectiveness is dependent on gauge densities and precipitation regimes. The current study takes advantage of the multi-sensor data sources in the National Mosaic and Multi-Sensor QPE (NMQ/Q2) system and develops an automated gauge QC scheme based on the consistency of radar hourly QPEs and gauge observations. Error characteristics of radar and gauge as a function of the radar sampling geometry, precipitation regimes, and the freezing level height are considered. The new scheme was evaluated by comparing an NMQ national gauge-based precipitation product with independent manual gauge observations. Twelve heavy rainfall events from different seasons and areas of the United States were selected for the evaluation, and the results show that the new NMQ product with QC'ed gauges has a more physically realistic spatial distribution than the old product. The new product also agrees much better statistically with the independent gauges.
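A bare-bones version of a radar-consistency gauge check might look like the following (thresholds invented; the operational scheme additionally conditions on radar sampling geometry, precipitation regime, and freezing-level height):

```python
def qc_gauges(gauge_mm, radar_mm, abs_tol=2.0, rel_tol=0.5):
    """Flag hourly gauge totals inconsistent with collocated radar QPE.
    A gauge fails when |gauge - radar| exceeds both an absolute floor (mm)
    and a fraction of the larger of the two values."""
    flags = {}
    for gid, g in gauge_mm.items():
        r = radar_mm.get(gid)
        if r is None:
            flags[gid] = "no radar coverage"
            continue
        diff = abs(g - r)
        if diff > abs_tol and diff > rel_tol * max(g, r):
            flags[gid] = f"inconsistent (gauge={g}, radar={r})"
    return flags

# A stuck gauge reporting 0.0 under heavy radar-observed rain gets flagged:
print(qc_gauges({"G1": 10.0, "G2": 0.0}, {"G1": 9.1, "G2": 7.4}))
```

The stuck-at-zero case is the classic failure this cross-sensor check catches that gauge-neighborhood methods miss in sparse networks.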
The purpose of this SOP is to define the procedures used for the initial and periodic verification and validation of computer programs used during the Arizona NHEXAS project and the Border study. Keywords: Computers; Software; QA/QC.
The U.S.-Mexico Border Program is sponsored ...
SU-D-201-04: Evaluation of Elekta Agility MLC Performance Using Statistical Process Control
DOE Office of Scientific and Technical Information (OSTI.GOV)
Meyers, SM; Balderson, MJ; Letourneau, D
2016-06-15
Purpose: To evaluate the performance and stability of the Elekta Agility MLC model using an automated quality control (QC) test in combination with statistical process control tools. Methods: Leaf positions were collected daily for 11 Elekta units over 5-19 months using the automated QC test, which analyzes 23 MV images to determine the location of MLC leaves relative to the radiation isocenter. The leaf positions are measured at 5 nominal positions, and images are acquired at collimator 0° and 180° to capture all MLC leaves in the field-of-view. Leaf positioning accuracy was assessed using individual and moving range control charts. Control limits were recomputed following MLC recalibration (which occurred 1-2 times for 4 units). Specification levels of ±0.5, ±1 and ±1.5 mm were tested. The mean and range of duration between out-of-control and out-of-specification events were determined. Results: Leaf position varied little over time, as confirmed by very tight individual control limits (mean ±0.19 mm, range 0.09-0.44). Mean leaf position error was −0.03 mm (range −0.89 to 0.83). Due to sporadic out-of-control events, the mean in-control duration was 3.3 days (range 1-23). Data stayed within the ±1 mm specification for 205 days on average (range 3-372) and within ±1.5 mm for the entire date range. Measurements stayed within ±0.5 mm for 1 day on average (range 0-17); however, our MLC leaves were not calibrated to this level of accuracy. Conclusion: The Elekta Agility MLC model was found to perform with high stability, as evidenced by the tight control limits. The in-specification durations support the current recommendation of monthly MLC QC tests with a ±1 mm tolerance. Future work is ongoing to determine if Agility performance can be optimized further using high-frequency QC test results to drive recalibration frequency. Factors that can affect leaf positioning accuracy, including beam spot motion, leaf gain calibration, drifting leaves, and image artifacts, are under investigation.
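The individual and moving range (I-MR) chart limits used above follow a standard construction. A brief sketch, assuming the usual I-MR constants (d2 = 1.128, D4 = 3.267 for span-2 moving ranges) and simulated leaf-position errors rather than the study's data:

```python
import numpy as np

def imr_limits(x):
    """Individual and moving-range control limits for a stream of daily values."""
    x = np.asarray(x, dtype=float)
    mr = np.abs(np.diff(x))                 # moving range of consecutive points
    sigma = mr.mean() / 1.128               # sigma estimate from mean moving range (d2)
    center = x.mean()
    i_limits = (center - 3 * sigma, center + 3 * sigma)
    mr_limits = (0.0, 3.267 * mr.mean())    # D4 upper limit for the MR chart
    return i_limits, mr_limits

# Simulated daily leaf-position errors (mm), echoing the reported scale:
rng = np.random.default_rng(0)
errors = rng.normal(-0.03, 0.06, size=60)
print(imr_limits(errors))
```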
Protecting the proteome: Eukaryotic cotranslational quality control pathways
2014-01-01
The correct decoding of messenger RNAs (mRNAs) into proteins is an essential cellular task. The translational process is monitored by several quality control (QC) mechanisms that recognize defective translation complexes in which ribosomes are stalled on substrate mRNAs. Stalled translation complexes occur when defects in the mRNA template, the translation machinery, or the nascent polypeptide arrest the ribosome during translation elongation or termination. These QC events promote the disassembly of the stalled translation complex and the recycling and/or degradation of the individual mRNA, ribosomal, and/or nascent polypeptide components, thereby clearing the cell of improper translation products and defective components of the translation machinery. PMID:24535822
General Quality Control (QC) Guidelines for SAM Methods
Learn more about quality control guidelines and recommendations for the analysis of samples using the methods listed in EPA's Selected Analytical Methods for Environmental Remediation and Recovery (SAM).
Spectrally high performing quantum cascade lasers
NASA Astrophysics Data System (ADS)
Toor, Fatima
Quantum cascade (QC) lasers are versatile semiconductor light sources that can be engineered to emit light of almost any wavelength in the mid- to far-infrared (IR) and terahertz region from 3 to 300 μm [1-5]. Furthermore, QC laser technology in the mid-IR range has great potential for applications in environmental, medical and industrial trace gas sensing [6-10], since several chemical vapors have strong rovibrational frequencies in this range and are uniquely identifiable by their absorption spectra through optical probing of absorption and transmission. Therefore, having a wide range of mid-IR wavelengths in a single QC laser source would greatly increase the specificity of QC laser-based spectroscopic systems, and also make them more compact and field deployable. This thesis presents work on several different approaches to multi-wavelength QC laser sources that take advantage of band-structure engineering and the uni-polar nature of QC lasers. Also, since lasers with narrow linewidth are needed for chemical sensing, work is presented on a single-mode distributed feedback (DFB) QC laser. First, a compact four-wavelength QC laser source is presented, based on a 2-by-2 module design with two waveguides, each containing QC laser stacks for two different emission wavelengths: one with 7.0 μm/11.2 μm, and the other with 8.7 μm/12.0 μm. This is the first design of a four-wavelength QC laser source with widely different emission wavelengths that uses minimal optics and electronics. Second, since there are still several unknown factors that affect QC laser performance, results are presented from the first study conducted to determine the effects of waveguide side-wall roughness on QC laser performance, using the two-wavelength waveguides. The results are consistent with Rayleigh scattering effects in the waveguides, with roughness affecting shorter wavelengths more than longer wavelengths. Third, a versatile time-multiplexed multi-wavelength QC laser system is presented that emits at λ = 10.8 μm for positive and λ = 8.6 μm for negative polarity current with microsecond time delay. Such a system is the first demonstration of a time- and wavelength-multiplexed system that uses a single QC laser. Fourth, work is presented on the design and fabrication of a single-mode distributed feedback (DFB) QC laser emitting at λ ≈ 7.7 μm, to be used in a QC laser-based photoacoustic sensor. The DFB QC laser had a temperature tuning coefficient of 0.45 nm/K over a temperature range of 80 K to 320 K, and a side mode suppression ratio of greater than 30 dB. Finally, a study of the lateral mode patterns of wide-ridge QC lasers is presented. The results include the observation of degenerate and non-degenerate lateral modes in wide-ridge QC lasers emitting at λ ≈ 5.0 μm. This study was conducted with the end goal of using wide-ridge QC lasers in a novel technique to spatiospectrally combine multiple transverse modes to obtain an ultra-high-power single-spot QC laser beam.
2012-09-30
briefing for aircraft operations in Diego Garcia, reports posted on EOL field catalog in realtime (http://catalog.eol.ucar.edu/cgi-bin/dynamo/report...index); • Dropsonde data processing on all P3 flights and realtime QC/reporting to GTS; and • Science summary of aircraft missions posted on EOL ...data analysis, worked with EOL on data quality control (QC), participated in the DYNAMO Sounding Workshop at EOL/NCAR from 6-7 February 2012
Srivastava, Praveen; Moorthy, Ganesh S; Gross, Robert; Barrett, Jeffrey S
2013-01-01
A selective and highly sensitive method for the determination of the non-nucleoside reverse transcriptase inhibitor (NNRTI) efavirenz in human plasma has been developed and fully validated, based on high performance liquid chromatography tandem mass spectrometry (LC-MS/MS). Sample preparation involved protein precipitation followed by one-to-one dilution with water. The analyte, efavirenz, was separated by high performance liquid chromatography and detected with tandem mass spectrometry in negative ionization mode with multiple reaction monitoring. Efavirenz and ¹³C₆-efavirenz (internal standard) were detected via the MRM transitions m/z 314.20→243.90 and m/z 320.20→249.90, respectively. A gradient program was used to elute the analytes, using 0.1% formic acid in water and 0.1% formic acid in acetonitrile as mobile phase solvents at a flow-rate of 0.3 mL/min. The total run time was 5 min, and the retention times for the internal standard (¹³C₆-efavirenz) and efavirenz were approximately 2.6 min. The calibration curves showed linearity (coefficient of regression, r>0.99) over the concentration range of 1.0-2,500 ng/mL. The intraday precision, based on the standard deviation of replicates, was 9.24% at the lower limit of quantification (LLOQ) and ranged from 2.41% to 6.42% for quality control (QC) samples, with accuracy of 112% for the LLOQ and 100-111% for QC samples. The interday precision was 12.3% for the LLOQ and 3.03-9.18% for QC samples, and the accuracy was 108% for the LLOQ and 95.2-108% for QC samples. Stability studies showed that efavirenz was stable under the expected conditions for sample preparation and storage. The lower limit of quantification for efavirenz was 1 ng/mL. The analytical method showed excellent sensitivity, precision, and accuracy. This method is robust and is being successfully applied for therapeutic drug monitoring and pharmacokinetic studies in HIV-infected patients.
75 FR 8031 - Submission for OMB Review; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2010-02-23
... program participants. Also, these forms are essential part of the accounting system used by the subject... legislative basis for the operation of the QC system. Need and Use of the Information: The Food and Nutrition...
Glutaminyl Cyclase Knock-out Mice Exhibit Slight Hypothyroidism but No Hypogonadism
Schilling, Stephan; Kohlmann, Stephanie; Bäuscher, Christoph; Sedlmeier, Reinhard; Koch, Birgit; Eichentopf, Rico; Becker, Andreas; Cynis, Holger; Hoffmann, Torsten; Berg, Sabine; Freyse, Ernst-Joachim; von Hörsten, Stephan; Rossner, Steffen; Graubner, Sigrid; Demuth, Hans-Ulrich
2011-01-01
Glutaminyl cyclases (QCs) catalyze the formation of pyroglutamate (pGlu) residues at the N terminus of peptides and proteins. Hypothalamic pGlu hormones, such as thyrotropin-releasing hormone and gonadotropin-releasing hormone, are essential for regulation of metabolism and fertility in the hypothalamic pituitary thyroid and gonadal axes, respectively. Here, we analyzed the consequences of constitutive genetic QC ablation on endocrine functions and on the behavior of adult mice. Adult homozygous QC knock-out mice are fertile and behave indistinguishably from wild type mice in tests of motor function, cognition, general activity, and ingestion behavior. The QC knock-out results in a dramatic drop of enzyme activity in the brain, especially in the hypothalamus, and in plasma. Other peripheral organs like liver and spleen still contain QC activity, which is most likely caused by its homolog isoQC. The serum gonadotropin-releasing hormone, TSH, and testosterone concentrations were not changed by QC depletion. The serum thyroxine was decreased by 24% in homozygous QC knock-out animals, suggesting a mild hypothyroidism. QC knock-out mice were indistinguishable from wild type with regard to blood glucose and glucose tolerance, thus differing significantly from reports of thyrotropin-releasing hormone knock-out mice. The results suggest a significant formation of the hypothalamic pGlu hormones by alternative mechanisms, such as spontaneous cyclization or conversion by isoQC. The different effects of QC depletion on the hypothalamic pituitary thyroid and gonadal axes might indicate slightly different modes of substrate conversion by the two enzymes. The absence of significant abnormalities in QC knock-out mice suggests the presence of a therapeutic window for suppression of QC activity in current drug development. PMID:21330373
Satellite-Based Quantum Communications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hughes, Richard J; Nordholt, Jane E; McCabe, Kevin P
2010-09-20
Single-photon quantum communications (QC) offers the attractive feature of 'future-proof' forward security rooted in the laws of quantum physics. Ground-based quantum key distribution (QKD) experiments in optical fiber have attained transmission ranges in excess of 200 km, but for larger distances we proposed a methodology for satellite-based QC. Over the past decade we have devised solutions to the technical challenges of satellite-to-ground QC, and we now have a clear concept for how space-based QC could be performed and potentially utilized within a trusted QKD network architecture. Functioning as a trusted QKD node, a QC satellite ('QC-sat') could deliver secret keys to the key stores of ground-based trusted QKD network nodes, to each of which multiple users are connected by optical fiber or free-space QC. A QC-sat could thereby extend quantum-secured connectivity to geographically disjoint domains, separated by continental or inter-continental distances. In this paper we describe our system concept that makes QC feasible with low-earth orbit (LEO) QC-sats (200-km to 2,000-km altitude orbits), and the results of link modeling of expected performance. Using the architecture that we have developed, LEO satellite-to-ground QKD will be feasible with secret bit yields of several hundred 256-bit AES keys per contact. With multiple ground sites separated by ~100 km, mitigation of cloudiness over any single ground site would be possible, potentially allowing multiple contact opportunities each day. The essential next step is an experimental QC-sat. A number of LEO platforms would be suitable, ranging from a dedicated, three-axis stabilized small satellite, to a secondary experiment on an imaging satellite, to the ISS. With one or more QC-sats, low-latency quantum-secured communications could then be provided to ground-based users on a global scale. Air-to-ground QC would also be possible.
Joint design of QC-LDPC codes for coded cooperation system with joint iterative decoding
NASA Astrophysics Data System (ADS)
Zhang, Shunwai; Yang, Fengfan; Tang, Lei; Ejaz, Saqib; Luo, Lin; Maharaj, B. T.
2016-03-01
In this paper, we investigate the joint design of quasi-cyclic low-density parity-check (QC-LDPC) codes for a coded cooperation system with joint iterative decoding at the destination. First, QC-LDPC codes based on the base matrix and exponent matrix are introduced, and then we describe two types of girth-4 cycles in the QC-LDPC codes employed by the source and relay. In the equivalent parity-check matrix corresponding to the jointly designed QC-LDPC codes employed by the source and relay, all girth-4 cycles of both type I and type II are cancelled. Theoretical analysis and numerical simulations show that the jointly designed QC-LDPC coded cooperation effectively combines the cooperation gain and the channel coding gain, and outperforms coded non-cooperation under the same conditions. Furthermore, the bit error rate performance of the coded cooperation employing jointly designed QC-LDPC codes is better than those of random LDPC codes and separately designed QC-LDPC codes over AWGN channels.
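For readers unfamiliar with the exponent-matrix construction mentioned above, the following sketch expands an exponent matrix into a binary QC-LDPC parity-check matrix built from circulant permutation blocks. The tiny exponent matrix and block size below are invented for illustration; they are not a code from the paper.

```python
import numpy as np

def expand_qc_ldpc(exponent_matrix, z):
    """Expand an exponent matrix into a QC-LDPC parity-check matrix.

    Each entry e >= 0 becomes the z-by-z identity cyclically shifted by e
    columns; an entry of -1 becomes the z-by-z zero block.
    """
    E = np.asarray(exponent_matrix)
    rows, cols = E.shape
    H = np.zeros((rows * z, cols * z), dtype=np.uint8)
    identity = np.eye(z, dtype=np.uint8)
    for i in range(rows):
        for j in range(cols):
            if E[i, j] >= 0:
                H[i*z:(i+1)*z, j*z:(j+1)*z] = np.roll(identity, E[i, j], axis=1)
    return H

H = expand_qc_ldpc([[0, 1, -1], [2, -1, 0]], z=4)  # toy example
print(H.shape)  # (8, 12)
```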
Evaluation of methods to reduce background using the Python-based ELISA_QC program.
Webster, Rose P; Cohen, Cinder F; Saeed, Fatima O; Wetzel, Hanna N; Ball, William J; Kirley, Terence L; Norman, Andrew B
2018-05-01
Almost all immunological approaches [immunohistochemistry, enzyme-linked immunosorbent assay (ELISA), Western blot] that are used to quantitate specific proteins have had to address high backgrounds due to non-specific reactivity. We report here for the first time a quantitative comparison of methods for reduction of the background of commercial biotinylated antibodies using the Python-based ELISA_QC program. This is demonstrated using a recombinant humanized anti-cocaine monoclonal antibody. Several approaches, such as adjustment of the incubation time and the concentration of blocking agent, as well as the dilution of secondary antibodies, have been explored to address this issue. In this report, systematic comparisons of two different methods, contrasted with other more traditional methods to address this problem, are provided. Addition of heparin (HP) at 1 μg/ml to the wash buffer prior to addition of the secondary biotinylated antibody reduced the elevated background absorbance values (from a mean of 0.313 ± 0.015 to 0.137 ± 0.002). A novel immunodepletion (ID) method also reduced the background (from a mean of 0.331 ± 0.010 to 0.146 ± 0.013). Overall, the ID method generated results at each concentration of the ELISA standard curve that were more similar to those obtained with standard lot 1 than did the HP method, as analyzed by the Python-based ELISA_QC program. We conclude that the ID method, while more laborious, provides the best solution to resolve the high background seen with specific lots of biotinylated secondary antibody. Copyright © 2018. Published by Elsevier B.V.
quantGenius: implementation of a decision support system for qPCR-based gene quantification.
Baebler, Špela; Svalina, Miha; Petek, Marko; Stare, Katja; Rotter, Ana; Pompe-Novak, Maruša; Gruden, Kristina
2017-05-25
Quantitative molecular biology remains a challenge for researchers due to inconsistent approaches for control of errors in the final results. Due to several factors that can influence the final result, quantitative analysis and interpretation of qPCR data are still not trivial. Together with the development of high-throughput qPCR platforms, there is a need for a tool allowing for robust, reliable and fast nucleic acid quantification. We have developed "quantGenius" ( http://quantgenius.nib.si ), an open-access web application for a reliable qPCR-based quantification of nucleic acids. The quantGenius workflow interactively guides the user through data import, quality control (QC) and calculation steps. The input is machine- and chemistry-independent. Quantification is performed using the standard curve approach, with normalization to one or several reference genes. The special feature of the application is the implementation of user-guided QC-based decision support system, based on qPCR standards, that takes into account pipetting errors, assay amplification efficiencies, limits of detection and quantification of the assays as well as the control of PCR inhibition in individual samples. The intermediate calculations and final results are exportable in a data matrix suitable for further statistical analysis or visualization. We additionally compare the most important features of quantGenius with similar advanced software tools and illustrate the importance of proper QC system in the analysis of qPCR data in two use cases. To our knowledge, quantGenius is the only qPCR data analysis tool that integrates QC-based decision support and will help scientists to obtain reliable results which are the basis for biologically meaningful data interpretation.
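The standard-curve quantification with reference-gene normalization described above can be summarized in a few lines. The sketch below is illustrative of the general approach, not quantGenius internals; the Cq values and quantities are invented.

```python
import numpy as np

def quantify(cq_samples, cq_standards, log10_quantities):
    """Estimate quantities from Cq values via a standard curve (sketch).

    Fits Cq = slope * log10(quantity) + intercept on the standards and
    inverts it for the samples; amplification efficiency follows from
    the slope (1.0 corresponds to a perfectly efficient assay).
    """
    slope, intercept = np.polyfit(log10_quantities, cq_standards, 1)
    efficiency = 10 ** (-1.0 / slope) - 1.0
    quantities = 10 ** ((np.asarray(cq_samples) - intercept) / slope)
    return quantities, efficiency

# Normalize a target gene to a reference gene measured on the same samples:
target, eff_t = quantify([24.1, 26.3], [30.0, 26.7, 23.4], [1.0, 2.0, 3.0])
reference, eff_r = quantify([20.2, 20.4], [28.0, 24.7, 21.4], [1.0, 2.0, 3.0])
print(target / reference)   # normalized relative quantities
```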
The purpose of this presentation is to present an overview of the quality control (QC) sections of a draft EPA document entitled, "Quality Assurance/Quality Control Guidance for Laboratories Performing PCR Analyses on Environmental Samples." This document has been prepared by th...
Kaufmann-Kolle, Petra; Szecsenyi, Joachim; Broge, Björn; Haefeli, Walter Emil; Schneider, Antonius
2011-01-01
The purpose of this cluster-randomised controlled trial was to evaluate the efficacy of quality circles (QCs) working either with general data-based feedback or with an open benchmark within the field of asthma care and drug-drug interactions. Twelve QCs, involving 96 general practitioners from 85 practices, were randomised. Six QCs worked with traditional anonymous feedback and six with an open benchmark. Two QC meetings supported with feedback reports were held, covering the topics "drug-drug interactions" and "asthma"; in both cases discussions were guided by a trained moderator. Outcome measures included health-related quality of life and patient satisfaction with treatment, asthma severity and number of potentially inappropriate drug combinations, as well as the general practitioners' satisfaction with the performance of the QC. A significant improvement in the treatment of asthma was observed in both trial arms. However, there was only a slight improvement regarding inappropriate drug combinations. There were no relevant differences between the open benchmark quality circles (B-QCs) and the traditional quality circles (T-QCs). The physicians' satisfaction with the QC performance was significantly higher in the T-QCs. General practitioners seem to take a critical view of open benchmarking in quality circles. Caution should be used when implementing benchmarking in a quality circle, as it did not improve healthcare when compared to the traditional procedure with anonymised comparisons. Copyright © 2011. Published by Elsevier GmbH.
Environment-induced quantum coherence spreading of a qubit
NASA Astrophysics Data System (ADS)
Pozzobom, Mauro B.; Maziero, Jonas
2017-02-01
We make a thorough study of the spreading of quantum coherence (QC), as quantified by the l1-norm QC, when a qubit (a two-level quantum system) is subjected to noisy quantum channels commonly appearing in quantum information science. We notice that QC is generally not conserved and that even incoherent initial states can lead to transitory system-environment QC. We show that for the amplitude damping channel the evolved total QC can be written as the sum of local and non-local parts, with the latter being equal to entanglement. On the other hand, for the phase damping channel (PDC) entanglement does not account for all non-local QC, with the gap between them depending on time and also on the qubit's initial state. Besides these issues, the possibility and conditions for time invariance of QC are considered in the case of bit, phase, and bit-phase flip channels. Here we reveal the qualitative dynamical inequivalence between these channels and the PDC and show that the creation of system-environment entanglement does not necessarily imply the destruction of the qubit's QC. We also investigate the resources needed for non-local QC creation, showing that while the PDC requires initial coherence of the qubit, for some other channels non-zero population of the excited state (i.e., energy) is sufficient. Related to that, considering the depolarizing channel we notice the qubit's ability to act as a catalyst for the creation of joint QC and entanglement, without need for nonzero initial QC or excited state population.
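The l1-norm coherence used above has a one-line definition: the sum of the absolute values of the off-diagonal elements of the density matrix. A minimal sketch:

```python
import numpy as np

def l1_coherence(rho):
    """l1-norm of coherence: sum of |rho_ij| over all off-diagonal elements."""
    rho = np.asarray(rho, dtype=complex)
    return np.abs(rho).sum() - np.abs(np.diag(rho)).sum()

# The maximally coherent qubit state |+><+| has C_l1 = 1:
plus = np.array([[0.5, 0.5], [0.5, 0.5]])
print(l1_coherence(plus))  # 1.0
```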
1994-03-04
[Scanned water-QC method blank data table; the compound names (e.g., trichloroethylene (TCE), dichlorobutane), method codes, and recovery percentages are too garbled in the scan to reconstruct reliably.]
Cendejas, Richard A; Phillips, Mark C; Myers, Tanya L; Taubman, Matthew S
2010-12-06
An external-cavity (EC) quantum cascade (QC) laser using optical feedback from a partial-reflector is reported. With this configuration, the otherwise multi-mode emission of a Fabry-Perot QC laser was made single-mode with optical output powers exceeding 40 mW. A mode-hop free tuning range of 2.46 cm(-1) was achieved by synchronously tuning the EC length and QC laser current. The linewidth of the partial-reflector EC-QC laser was measured for integration times from 100 μs to 4 seconds, and compared to a distributed feedback QC laser. Linewidths as small as 480 kHz were recorded for the EC-QC laser.
Sobol, Wlad T
2002-01-01
A simple kinetic model that describes the time evolution of the chemical concentration of an arbitrary compound within the tank of an automatic film processor is presented. It provides insights into the kinetics of chemistry concentration inside the processor's tank; the results facilitate the tasks of processor tuning and quality control (QC). The model has successfully been used in several troubleshooting sessions of low-volume mammography processors for which maintaining consistent QC tracking was difficult due to fluctuations of bromide levels in the developer tank.
Proximate Composition Analysis.
2016-01-01
The proximate composition of foods includes moisture, ash, lipid, protein and carbohydrate contents. These food components may be of interest in the food industry for product development, quality control (QC) or regulatory purposes. Analyses used may be rapid methods for QC or more accurate but time-consuming official methods. Sample collection and preparation must be considered carefully to ensure analysis of a homogeneous and representative sample, and to obtain accurate results. Estimation methods of moisture content, ash value, crude lipid, total carbohydrates, starch, total free amino acids and total proteins are put together in a lucid manner.
Coda Wave Attenuation Characteristics for North Anatolian Fault Zone, Turkey
NASA Astrophysics Data System (ADS)
Sertcelik, Fadime; Guleroglu, Mehmet
2017-10-01
North Anatolian Fault Zone, on which large earthquakes have occurred in the past, migrates regularly from east to west, and it is one of the most active faults in the world. The purpose of this study is to estimate the coda wave quality factor (Qc) along the fault for each of five sub-regions that were delineated according to the fault ruptures of these large earthquakes. 978 records have been analyzed at frequencies of 1.5, 3, 6, 9, 12 and 18 Hz by the Single Backscattering Method. Along the fault, the variations in Qc with lapse time are Qc = (136±25)f^(0.96±0.027), Qc = (208±22)f^(0.85±0.02) and Qc = (307±28)f^(0.72±0.025) at 20, 30 and 40 s lapse times, respectively. The estimated average frequency-dependent quality factors over all lapse times are: Qc(f) = (189±26)f^(0.86±0.02) for the Karliova-Tokat region; Qc(f) = (216±19)f^(0.76±0.018) for the Tokat-Çorum region; Qc(f) = (232±18)f^(0.76±0.019) for the Çorum-Adapazari region; Qc(f) = (280±28)f^(0.79±0.021) for the Adapazari-Yalova region; and Qc(f) = (252±26)f^(0.81±0.022) for the Yalova-Gulf of Saros region. The coda wave quality factor over all lapse times and frequencies is Qc(f) = (206±15)f^(0.85±0.012) in the study area. The largest change of Qc with lapse time is found in the Yalova-Saros region, which may reflect a degree of heterogeneity that decreases more rapidly towards the deep crust than in the other sub-regions. Moreover, the highest Qc is calculated between Adapazari and Yalova, interpreted as a consequence of the seismic energy released by the 1999 Kocaeli Earthquake. However, no causal relationship could be established between the regional variation of Qc with frequency and lapse time and the migration of the large earthquakes. These results are interpreted as indicating that the attenuation mechanism is affected both by regional heterogeneity and by whether the fault consists of a single strand or multiple strands.
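Under the single backscattering model used here, coda amplitude decays as A(t, f) = S(f) t^(-1) exp(-π f t / Qc), so Qc follows from a straight-line fit of ln(A·t) against lapse time t. A minimal sketch with synthetic data (the envelope-extraction and windowing steps of a real analysis are omitted):

```python
import numpy as np

def coda_qc(t, amplitude, f):
    """Estimate the coda quality factor Qc with the single backscattering model.

    Model: A(t, f) = S(f) * t**-1 * exp(-pi * f * t / Qc), so a straight-line
    fit of ln(A * t) versus lapse time t has slope -pi * f / Qc.
    """
    t = np.asarray(t, dtype=float)
    y = np.log(np.asarray(amplitude, dtype=float) * t)
    slope, _ = np.polyfit(t, y, 1)
    return -np.pi * f / slope

# Synthetic check: a coda generated with Qc = 200 at f = 6 Hz is recovered.
t = np.linspace(25.0, 60.0, 100)
a = t**-1 * np.exp(-np.pi * 6.0 * t / 200.0)
print(coda_qc(t, a, f=6.0))  # ~200
```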
Generation of Polar Semi-Saturated Bicyclic Pyrazoles for Fragment-Based Drug Discovery Campaigns.
Luise, Nicola; Wyatt, Paul
2018-05-07
Synthesising polar semi-saturated bicyclic heterocycles can lead to better starting points for fragment-based drug discovery (FBDD) programs. This communication highlights the application of diverse chemistry to construct bicyclic systems from a common intermediate, where pyrazole, a privileged heteroaromatic able to bind effectively to biological targets, is fused to diverse saturated counterparts. The generated fragments can be further developed either after confirmation of their binding pose or early in the process, as their synthetic intermediates. Essential quality control (QC) for selection of small molecules to add to a fragment library is discussed. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Anderson, Nancy
2015-11-15
As of January 1, 2016, microbiology laboratories can choose to adopt a new quality control option, the Individualized Quality Control Plan (IQCP), under the Clinical Laboratory Improvement Amendments of 1988 (CLIA). This voluntary approach increases flexibility for meeting regulatory requirements and provides laboratories the opportunity to customize QC for their testing in their unique environments and by their testing personnel. IQCP is an all-inclusive approach to quality based on risk management to address potential errors in the total testing process. It includes three main steps: (1) performing a risk assessment, (2) developing a QC plan, and (3) monitoring the plan through quality assessment. Resources are available from the Centers for Medicare & Medicaid Services, Centers for Disease Control and Prevention, American Society for Microbiology, Clinical and Laboratory Standards Institute, and accrediting organizations, such as the College of American Pathologists and Joint Commission, to assist microbiology laboratories implementing IQCP.
NASA Astrophysics Data System (ADS)
Yuan, Jian-guo; Liang, Meng-qi; Wang, Yong; Lin, Jin-zhao; Pang, Yu
2016-03-01
A novel lower-complexity construction scheme of quasi-cyclic low-density parity-check (QC-LDPC) codes for optical transmission systems is proposed based on the structure of the parity-check matrix for the Richardson-Urbanke (RU) algorithm. Furthermore, a novel irregular QC-LDPC(4 288, 4 020) code with a high code rate of 0.937 is constructed by this novel construction scheme. The simulation analyses show that the net coding gain (NCG) of the novel irregular QC-LDPC(4 288, 4 020) code is respectively 2.08 dB, 1.25 dB and 0.29 dB more than those of the classic RS(255, 239) code, the LDPC(32 640, 30 592) code and the irregular QC-LDPC(3 843, 3 603) code at a bit error rate (BER) of 10^-6. The irregular QC-LDPC(4 288, 4 020) code has lower encoding/decoding complexity compared with the LDPC(32 640, 30 592) code and the irregular QC-LDPC(3 843, 3 603) code. The proposed novel QC-LDPC(4 288, 4 020) code is thus well suited to the increasing development requirements of high-speed optical transmission systems.
Development of an Operational TS Dataset Production System for the Data Assimilation System
NASA Astrophysics Data System (ADS)
Kim, Sung Dae; Park, Hyuk Min; Kim, Young Ho; Park, Kwang Soon
2017-04-01
An operational TS (temperature and salinity) dataset production system was developed to provide near real-time data to the data assimilation system periodically. It collects the latest 15 days' TS data for the northwestern Pacific area (20°N - 55°N, 110°E - 150°E), applies QC tests to the archived data and supplies them to numerical prediction models of KIOST (Korea Institute of Ocean Science and Technology). The latest real-time TS data are collected from the Argo GDAC and GTSPP data servers every week. Argo data are downloaded from the /latest_data directory of the Argo GDAC. Because many duplicated data exist when all profile data are extracted from all Argo netCDF files, a DB system is used to avoid duplication. All metadata (float ID, location, observation date and time, etc.) of all Argo floats are stored in the database, and a Matlab program was developed to manipulate the DB data, check for duplication, and exclude duplicated data. GTSPP data are downloaded from the /realtime directory of the GTSPP data service. The latest data except Argo data are extracted from the original data. Another Matlab program was coded to inspect all collected data using 10 QC tests and produce the final dataset, which can be used by the assimilation system. Three regional range tests to inspect annual, seasonal and monthly variations are included in the QC procedures. A C program was developed to provide regional ranges to data managers; it can calculate upper and lower limits of temperature and salinity at depths from 0 to 1,550 m. The final TS dataset contains the latest 15 days' TS data in netCDF format. It is updated every week and transmitted to numerical modelers at KIOST for operational use.
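The regional range tests mentioned above reduce to comparing each observation against climatological bounds for its depth, region, and season. A minimal sketch; the bound values here are placeholders, not the operational KIOST limits.

```python
import numpy as np

def regional_range_qc(values, lower, upper):
    """Regional range test: flag observations outside climatological bounds.

    lower and upper are depth-dependent limit arrays (e.g., monthly
    temperature bounds for a regional box); the numbers in the example
    below are illustrative placeholders.
    """
    v = np.asarray(values, dtype=float)
    good = (v >= np.asarray(lower)) & (v <= np.asarray(upper))
    return good  # True = pass, False = flag for review

temp_profile = [28.4, 21.0, 9.7, 3.2, -4.0]     # degC at increasing depth
lo = [0.0, 0.0, -2.0, -2.0, -2.0]
hi = [35.0, 30.0, 25.0, 15.0, 8.0]
print(regional_range_qc(temp_profile, lo, hi))  # last value fails
```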
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hollister, R
QC sample results (daily background check drum and 100-gram SGS check drum) were within the acceptance criteria established by WIPP's Quality Assurance Objectives for TRU Waste Characterization. Replicate runs were performed on drum LL85501243TRU. Replicate measurement results are identical at the 95% confidence level as established by WIPP criteria. HWM NCAR No. 02-1000168 was issued on 17-Oct-2002 regarding a partially dislodged Cd sheet filter on the HPGe coaxial detector. This physical geometry occurred on 01-Oct-2002 and was not corrected until 10-Oct-2002, a period that includes the present batch run of drums. Per discussions among the Independent Technical Reviewer, Expert Reviewer and the Technical QA Supervisor, as well as in consultation with John Fleissner, Technical Point of Contact from Canberra, the analytical results are technically reliable. All QC standard runs during this period were in control. The data packet for SGS Batch 2002-13, generated using passive gamma-ray spectroscopy with the Pu Facility SGS unit, is technically reasonable. All QC samples are in compliance with established control limits. The batch data packet has been reviewed for correctness, completeness, consistency and compliance with WIPP's Quality Assurance Objectives and determined to be acceptable.
Sorich, Michael J; McKinnon, Ross A; Miners, John O; Winkler, David A; Smith, Paul A
2004-10-07
This study aimed to evaluate in silico models based on quantum chemical (QC) descriptors derived using the electronegativity equalization method (EEM) and to assess the use of QC properties to predict chemical metabolism by human UDP-glucuronosyltransferase (UGT) isoforms. Various EEM-derived QC molecular descriptors were calculated for known UGT substrates and nonsubstrates. Classification models were developed using support vector machine and partial least squares discriminant analysis. In general, the most predictive models were generated with the support vector machine. Combining QC and 2D descriptors (from previous work) using a consensus approach resulted in a statistically significant improvement in predictivity (to 84%) over both the QC and 2D models and the other methods of combining the descriptors. EEM-derived QC descriptors were shown to be both highly predictive and computationally efficient. It is likely that EEM-derived QC properties will be generally useful for predicting ADMET and physicochemical properties during drug discovery.
Jiang, Jian; James, Christopher A; Wong, Philip
2016-09-05
A LC-MS/MS method has been developed and validated for the determination of glycine in human cerebrospinal fluid (CSF). The validated method used artificial cerebrospinal fluid as a surrogate matrix for calibration standards. The calibration curve range for the assay was 100-10,000 ng/mL, and ¹³C₂,¹⁵N-glycine was used as an internal standard (IS). Pre-validation experiments were performed to demonstrate parallelism with surrogate matrix and standard addition methods. The mean endogenous glycine concentration in a pooled human CSF sample, determined on three days by using artificial CSF as a surrogate matrix and by the method of standard addition, was found to be 748±30.6 and 768±18.1 ng/mL, respectively. A percentage difference of -2.6% indicated that artificial CSF could be used as a surrogate calibration matrix for the determination of glycine in human CSF. Quality control (QC) samples, except the lower limit of quantitation (LLOQ) QC and low QC samples, were prepared by spiking glycine into aliquots of a pooled human CSF sample. The low QC sample was prepared from a separate pooled human CSF sample containing low endogenous glycine concentrations, while the LLOQ QC sample was prepared in artificial CSF. Standard addition was used extensively to evaluate matrix effects during validation. The validated method was used to determine the endogenous glycine concentrations in human CSF samples. Incurred sample reanalysis demonstrated the reproducibility of the method. Copyright © 2016 Elsevier B.V. All rights reserved.
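The method of standard addition used above has a compact numerical core: spike aliquots of the sample with known amounts of analyte, regress response on the added amount, and read the endogenous level from the x-intercept. A minimal sketch with simulated numbers (chosen to echo the ~750 ng/mL level reported, not the actual study data):

```python
import numpy as np

def standard_addition(added, response):
    """Endogenous concentration via the method of standard addition (sketch).

    With response = slope * (added + endogenous), the x-intercept of the
    fitted line is -endogenous, i.e. endogenous = intercept / slope.
    """
    slope, intercept = np.polyfit(added, response, 1)
    return intercept / slope

# Simulated: endogenous ~750 ng/mL, spikes of 0/500/1000/2000 ng/mL.
added = np.array([0.0, 500.0, 1000.0, 2000.0])
print(standard_addition(added, 0.004 * (added + 750.0)))  # ~750
```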
Park, Sang Hyuk; Park, Chan-Jeoung; Kim, Mi-Jeong; Choi, Mi-Ok; Han, Min-Young; Cho, Young-Uk; Jang, Seongsoo
2014-12-01
We developed and validated an interinstrument comparison method for automatic hematology analyzers based on the 99th percentile coefficient of variation (CV) cutoff of daily means, using both patient samples and quality control (QC) materials. A total of 120 patient samples were obtained over 6 months. Data from the first 3 months were used to determine the 99th percentile CV cutoff values, and data obtained in the last 3 months were used to calculate acceptable ranges and rejection rates. Identical analyses were also performed using QC materials. Two-instrument comparisons were also performed, and the most appropriate allowable total error (ATE) values were determined. The rejection rates based on the 99th percentile cutoff values were within 10.00% and 9.30% for the patient samples and QC materials, respectively. The acceptable ranges of QC materials based on the currently used method were wider than those calculated from the 99th percentile CV cutoff values for most items. In two-instrument comparisons, 34.8% of all comparisons failed, and 87.0% of failed comparisons were successful when 4 SD was applied as the ATE value instead of 3 SD. The 99th percentile CV cutoff value-derived daily acceptable ranges can be used as a real-time interinstrument comparison method for both patient samples and QC materials. Applying 4 SD as the ATE value can significantly reduce unnecessary follow-up recalibration in the leukocyte differential counts, reticulocytes, and mean corpuscular volume. Copyright © by the American Society for Clinical Pathology.
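A sketch of how a 99th percentile CV cutoff and the resulting daily acceptable range might be derived follows. This is an illustrative reading of the procedure, with simulated data; the exact computation and per-item handling in the study may differ.

```python
import numpy as np

def cv_cutoff(daily_means, q=99):
    """99th-percentile CV cutoff from a training set of daily means.

    daily_means: 2-D array, one row per day, one column per instrument.
    The daily CV (%) across instruments is computed for each day and the
    q-th percentile of those CVs becomes the acceptance cutoff.
    """
    m = np.asarray(daily_means, dtype=float)
    cvs = 100.0 * m.std(axis=1, ddof=1) / m.mean(axis=1)
    return np.percentile(cvs, q)

def acceptable_range(todays_means, cutoff):
    """Acceptable band for today's inter-instrument comparison."""
    mean = np.mean(todays_means)
    half_width = mean * cutoff / 100.0
    return mean - half_width, mean + half_width

rng = np.random.default_rng(1)
training = rng.normal(7.0, 0.05, size=(90, 2))   # 90 days, 2 analyzers
print(acceptable_range([7.02, 6.97], cv_cutoff(training)))
```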
Effects of Data Quality on the Characterization of Aerosol Properties from Multiple Sensors
NASA Technical Reports Server (NTRS)
Petrenko, Maksym; Ichoku, Charles; Leptoukh, Gregory
2011-01-01
Cross-comparison of aerosol properties between ground-based and spaceborne measurements is an important validation technique that helps to investigate the uncertainties of aerosol products acquired using spaceborne sensors. However, it has been shown that even minor differences in the cross-characterization procedure may significantly impact the results of such validation. Of particular consideration is the quality assurance / quality control (QA/QC) information - auxiliary data indicating a "confidence" level (e.g., Bad, Fair, Good, Excellent, etc.) conferred by the retrieval algorithms on the produced data. Depending on the treatment of available QA/QC information, a cross-characterization procedure has the potential of filtering out invalid data points, such as uncertain or erroneous retrievals, which tend to reduce the credibility of such comparisons. However, under certain circumstances, even high QA/QC values may not fully guarantee the quality of the data. For example, retrievals in proximity of a cloud might be particularly perplexing for an aerosol retrieval algorithm, resulting in invalid data that, nonetheless, could be assigned a high QA/QC confidence. In this presentation, we will study the effects of several QA/QC parameters on the cross-characterization of aerosol properties between the data acquired by multiple spaceborne sensors. We will utilize the Multi-sensor Aerosol Products Sampling System (MAPSS), which provides a consistent platform for multi-sensor comparison, including collocation with measurements acquired by the ground-based Aerosol Robotic Network (AERONET). The multi-sensor spaceborne data analyzed include those acquired by the Terra-MODIS, Aqua-MODIS, Terra-MISR, Aura-OMI, Parasol-POLDER, and Calipso-CALIOP satellite instruments.
Aris-Brosou, Stephane; Kim, James; Li, Li; Liu, Hui
2018-05-15
Vendors in the health care industry produce diagnostic systems that, through a secured connection, allow them to monitor performance almost in real time. However, challenges exist in analyzing and interpreting large volumes of noisy quality control (QC) data. As a result, some QC shifts may not be detected early enough by the vendor, but may instead lead a customer to complain. This study hypothesized that a more proactive response could be designed by utilizing the collected QC data more efficiently. Our aim is therefore to help prevent customer complaints by predicting them based on the QC data collected by in vitro diagnostic systems. QC data from five select in vitro diagnostic assays were combined with the corresponding database of customer complaints over a period of 90 days. A subset of these data over the last 45 days was also analyzed to assess how the length of the training period affects predictions. We defined a set of features used to train two classifiers, one based on decision trees and the other based on adaptive boosting, and assessed model performance by cross-validation. The cross-validations showed classification error rates close to zero for some assays with adaptive boosting when predicting the potential cause of customer complaints. Performance was improved by shortening the training period when the volume of complaints increased. Denoising filters that reduced the number of categories to predict further improved performance, as their application simplified the prediction problem. This novel approach to predicting customer complaints based on QC data may allow the diagnostic industry, the expected end user of our approach, to proactively identify potential product quality issues and fix these before receiving customer complaints. This represents a new step in the direction of using big data toward product quality improvement. © Stephane Aris-Brosou, James Kim, Li Li, Hui Liu. Originally published in JMIR Medical Informatics (http://medinform.jmir.org), 15.05.2018.
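A skeleton of the adaptive-boosting workflow described above, using scikit-learn, is shown below. The feature matrix is simulated and the feature definitions (e.g., mean shift, drift slope, out-of-range rate per assay/day) are assumptions; the study's actual features and labels are not public in this abstract.

```python
import numpy as np
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import cross_val_score

# Hypothetical features: one row per assay/day summarizing the QC stream,
# with a label indicating whether a customer complaint followed. Simulated.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 4))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=500) > 1).astype(int)

clf = AdaBoostClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)   # 5-fold cross-validation
print("mean CV accuracy:", scores.mean())
```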
NASA Astrophysics Data System (ADS)
Bushnell, M.; Waldmann, C.; Hermes, J.; Tamburri, M.
2017-12-01
Many oceanographic observation groups create and maintain QA, QC, and best practices (BP) to ensure efficient and accurate data collection and quantify quality. Several entities - IOOS® QARTOD, AtlantOS, ACT, WMO/IOC JCOMM OCG - have joined forces to document existing practices, identify gaps, and support development of emerging techniques. While each group has a slightly different focus, many underlying QA/QC/BP needs can be quite common. QARTOD focuses upon real-time data QC, and has produced manuals that address QC tests for eleven ocean variables. AtlantOS is a research and innovation project working towards the integration of ocean-observing activities across all disciplines in the Atlantic Basin. ACT brings together research institutions, resource managers, and private companies to foster the development and adoption of effective and reliable sensors for coastal, freshwater, and ocean environments. JCOMM promotes broad international coordination of oceanographic and marine meteorological observations and data management and services. Leveraging existing efforts of these organizations is an efficient way to consolidate available information, develop new practices, and evaluate the use of ISO standards to judge the quality of measurements. ISO standards may offer accepted support for a framework for an ocean data quality management system, similar to the meteorological standards defined by WMO (https://www.wmo.int/pages/prog/arep/gaw/qassurance.html). We will first cooperatively develop a plan to create a QA/QC/BP manual. The resulting plan will describe the need for such a manual, the extent of the manual, the process used to engage the community in creating it, the maintenance of the resultant document, and how these things will be done. It will also investigate standards for metadata. The plan will subsequently be used to develop the QA/QC/BP manual, providing guidance which advances the standards adopted by IOOS, AtlantOS, JCOMM, and others.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Malkoske, Kyle; Nielsen, Michelle; Brown, Erika
A close partnership between the Canadian Partnership for Quality Radiotherapy (CPQR) and the Canadian Organization of Medical Physicists' (COMP) Quality Assurance and Radiation Safety Advisory Committee (QARSAC) has resulted in the development of a suite of Technical Quality Control (TQC) Guidelines for radiation treatment equipment, which outline specific performance objectives and criteria that equipment should meet in order to assure an acceptable level of radiation treatment quality. The framework includes consolidation of existing guidelines and/or literature by expert reviewers, structured stages of public review, external field-testing and ratification by COMP. The adopted framework for the development and maintenance of the TQCs ensures that the guidelines incorporate input from the medical physics community during development, measures the workload required to perform the QC tests outlined in each TQC, and keeps the documents relevant (i.e. "living documents") through subsequent planned reviews and updates. This presentation will show the Multi-Leaf Linear Accelerator document as an example of how feedback and cross-national collaboration achieve a robust guidance document. During field-testing, each technology was tested at multiple centres in a variety of clinic environments. As part of the defined feedback, workload data were captured, yielding the average time associated with the testing defined in each TQC document. As a result, for a medium-sized centre comprising 6 linear accelerators and a comprehensive brachytherapy program, we evaluate the physics workload at 1.5 full-time equivalent physicists per year to complete all QC tests listed in this suite.
Zamengo, Luca; Frison, Giampietro; Tedeschi, Gianpaola; Frasson, Samuela
2016-08-01
Blood alcohol concentration is the most frequent analytical determination carried out in forensic toxicology laboratories worldwide. It is usually required to assess whether an offence has been committed by comparing blood alcohol levels with specified legal limits, which can vary widely among countries. Due to the possible serious legal consequences associated with non-compliant alcohol levels, measurement uncertainty should be carefully evaluated, along with other metrological aspects which can influence the final result. The whole procedure can be time-consuming and error-generating in routine practice, increasing the risk of unreliable assessments. A software application named Ethanol WorkBook (EtWB) was developed at the authors' laboratory using the Visual Basic for Applications language and MS Excel(®), with the aim of helping forensic analysts involved in blood alcohol determinations. The program can (i) calculate measurement uncertainties and decision limits with different methodologies; (ii) assess compliance with specification limits using a guard-band approach; (iii) manage quality control (QC) data and create control charts for QC samples; (iv) create control maps from real-case data archives; (v) provide laboratory reports with graphical outputs for elaborated data and (vi) create comprehensive searchable case archives. A typical example of a drink-driving case is presented and discussed to illustrate the importance of a metrological approach for reliable compliance assessment and to demonstrate the software's application in routine practice. The tool is made freely available to the scientific community on request. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
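The guard-band approach in item (ii) reduces to shifting the decision threshold above the legal limit by a multiple of the measurement uncertainty. A minimal sketch, assuming a one-sided normal guard band; the coverage factor, limit, and uncertainty values are illustrative, not EtWB defaults.

```python
def decision_limit(legal_limit, u_combined, k=1.64):
    """Guard-banded decision limit for compliance assessment (sketch).

    A result is reported as exceeding the legal limit only if it exceeds
    the limit plus k times the combined standard uncertainty (k = 1.64
    gives ~95% one-sided confidence under a normal error model).
    """
    return legal_limit + k * u_combined

# Example: 0.50 g/L legal limit, combined standard uncertainty 0.02 g/L.
print(decision_limit(0.50, 0.02))   # 0.5328 g/L
```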
40 CFR 136.7 - Quality assurance and quality control.
Code of Federal Regulations, 2014 CFR
2014-07-01
... quality control elements, where applicable, into the laboratory's documented standard operating procedure... quality control elements must be clearly documented in the written standard operating procedure for each... Methods contains QA/QC procedures in the Part 1000 section of the Standard Methods Compendium. The...
40 CFR 136.7 - Quality assurance and quality control.
Code of Federal Regulations, 2013 CFR
2013-07-01
... quality control elements, where applicable, into the laboratory's documented standard operating procedure... quality control elements must be clearly documented in the written standard operating procedure for each... Methods contains QA/QC procedures in the Part 1000 section of the Standard Methods Compendium. The...
40 CFR 136.7 - Quality assurance and quality control.
Code of Federal Regulations, 2012 CFR
2012-07-01
... quality control elements, where applicable, into the laboratory's documented standard operating procedure... quality control elements must be clearly documented in the written standard operating procedure for each... Methods contains QA/QC procedures in the Part 1000 section of the Standard Methods Compendium. The...
77 FR 73611 - Submission for OMB Review; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-11
...: Negative Quality Control Review Schedule. OMB Control Number: 0584-0034. Summary of Collection: The legislative basis for the operation of the quality control system is provided by section 16 of the Food and Nutrition Act of 2008. State agencies are required to perform Quality Control (QC) reviews for the...
NASA Astrophysics Data System (ADS)
Acaba, K. J. C.; Cinco, L. D.; Melchor, J. N.
2016-03-01
Daily QC tests performed on screen film mammography (SFM) equipment are essential to ensure that both the SFM unit and the film processor are working in a consistent manner. The Breast Imaging Unit of USTH-Benavides Cancer Institute has been conducting QC following the test protocols in the IAEA Human Health Series No. 2 manual. However, the availability of the Leeds breast phantom (CRP E13039) in the facility made the task easier: instead of carrying out separate tests of AEC constancy and light sensitometry, a single exposure of the phantom accomplishes the two tests. It was observed that measurements of mAs output and optical densities (ODs) made using the Leeds TOR (MAX) phantom are comparable with those obtained from the usual conduct of the tests, taking into account the attenuation characteristic of the phantom. Image quality parameters such as low-contrast and high-contrast details were also evaluated from the phantom image. The authors recognize the usefulness of the phantom in determining technical factors that will help improve detection of the smallest pathological details in breast images. The phantom is also convenient for daily QC monitoring and economical, since fewer films are expended.
Evaluation of peak picking quality in LC-MS metabolomics data.
Brodsky, Leonid; Moussaieff, Arieh; Shahaf, Nir; Aharoni, Asaph; Rogachev, Ilana
2010-11-15
The output of LC-MS metabolomics experiments consists of mass-peak intensities identified through a peak-picking/alignment procedure. Besides imperfections in biological samples and instrumentation, data accuracy is highly dependent on the applied algorithms and their parameters. Consequently, quality control (QC) is essential for further data analysis. Here, we present a QC approach that is based on discrepancies between replicate samples. First, the quantile normalization of per-sample log-signal distributions is applied to each group of biologically homogeneous samples. Next, the overall quality of each replicate group is characterized by the Z-transformed correlation coefficients between samples. This general QC allows a tuning of the procedure's parameters which minimizes the inter-replicate discrepancies in the generated output. Subsequently, an in-depth QC measure detects local neighborhoods on a template of aligned chromatograms that are enriched by divergences between intensity profiles of replicate samples. These neighborhoods are determined through a segmentation algorithm. The retention time (RT)-m/z positions of the neighborhoods with local divergences are indicative of either: incorrect alignment of chromatographic features, technical problems in the chromatograms, or to a true biological discrepancy between replicates for particular metabolites. We expect this method to aid in the accurate analysis of metabolomics data and in the development of new peak-picking/alignment procedures.
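Two building blocks of this QC approach, quantile normalization of per-sample log-signal distributions and Fisher z-transformed replicate correlations, can be sketched compactly. The code below is an illustrative reading of those steps, not the authors' implementation.

```python
import numpy as np

def quantile_normalize(log_signals):
    """Quantile-normalize per-sample log-signal distributions (columns = samples)."""
    x = np.asarray(log_signals, dtype=float)
    order = np.argsort(x, axis=0)
    ranks = np.argsort(order, axis=0)            # rank of each value in its column
    mean_quantiles = np.sort(x, axis=0).mean(axis=1)
    return mean_quantiles[ranks]                 # replace values by mean quantiles

def replicate_quality(log_signals):
    """Fisher z-transformed pairwise correlations between replicate samples."""
    r = np.corrcoef(np.asarray(log_signals, dtype=float), rowvar=False)
    iu = np.triu_indices_from(r, k=1)
    return np.arctanh(np.clip(r[iu], -0.999999, 0.999999))

rng = np.random.default_rng(2)
peaks = rng.normal(10.0, 2.0, size=(1000, 3))    # 1000 peaks, 3 replicates
norm = quantile_normalize(peaks)
print(replicate_quality(norm))                   # one z value per replicate pair
```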
Testing and analysis of LWT and SCB properties of asphalt concrete mixtures.
DOT National Transportation Integrated Search
2016-04-01
Currently, Louisiana's Quality Control and Quality Assurance (QC/QA) practice for asphalt mixtures in pavement construction is mainly based on controlling properties of plant-produced mixtures that include gradation and asphalt content, voids f...
Embankment quality and assessment of moisture control implementation : tech transfer summary.
DOT National Transportation Integrated Search
2016-02-01
The motivation for this project was based on work by Iowa State University (ISU) researchers at a few recent grading projects that demonstrated embankments were being constructed outside moisture control limits, even though the contractor QC ...
Quality control and quality assurance of hot mix asphalt construction in Delaware.
DOT National Transportation Integrated Search
2006-07-01
Since the mid-60s, the Federal Highway Administration has encouraged Departments of Transportation and Contractors to use quality control and quality assurance (QA/QC) specifications, which are statistically based. For example,...
Duo, Jia; Dong, Huijin; DeSilva, Binodh; Zhang, Yan J
2013-07-01
Sample dilution and reagent pipetting are time-consuming steps in ligand-binding assays (LBAs). Traditional automation-assisted LBAs use assay-specific scripts that require labor-intensive script writing and user training. Five major script modules were developed in Tecan Freedom EVO liquid handling software to facilitate the automated sample preparation and LBA procedure: sample dilution, sample minimum required dilution, standard/QC minimum required dilution, standard/QC/sample addition, and reagent addition. The modular design of the automation scripts allows users to assemble an automated assay with minimal script modification. The application of the template was demonstrated in three LBAs to support discovery biotherapeutic programs. The results demonstrated that the modular scripts provided flexibility in adapting to various LBA formats and significant time savings in script writing and scientist training. Data generated by the automated process were comparable to those generated by the manual process, while bioanalytical productivity was significantly improved using the modular robotic scripts.
Variation of coda wave attenuation in the Alborz region and central Iran
NASA Astrophysics Data System (ADS)
Rahimi, H.; Motaghi, K.; Mukhopadhyay, S.; Hamzehloo, H.
2010-06-01
More than 340 earthquakes recorded by the Institute of Geophysics, University of Tehran (IGUT) short period stations from 1996 to 2004 were analysed to estimate the S-coda attenuation in the Alborz region, the northern part of the Alpine-Himalayan orogen in western Asia, and in central Iran, which is the foreland of this orogen. The coda quality factor, Qc, was estimated using the single backscattering model in frequency bands of 1-25 Hz. In this research, lateral and depth variations of Qc in the Alborz region and central Iran are studied. It is observed that the Alborz region shows no significant lateral variation in Qc. The average frequency relation for this region is Qc = (79 ± 2)f^(1.07 ± 0.08). Two anomalous high-attenuation areas in central Iran are recognized around the stations LAS and RAZ. The average frequency relation for central Iran, excluding the values of these two stations, is Qc = (94 ± 2)f^(0.97 ± 0.12). To investigate the attenuation variation with depth, the Qc value was calculated for 14 lapse times (25, 30, 35, ..., 90 s) for two data sets having epicentral distance ranges R < 100 km (data set 1) and 100 < R < 200 km (data set 2) in each area. It is observed that Qc increases with depth. However, the rate of increase of Qc with depth is not uniform in our study area. Beneath central Iran the rate of increase of Qc is greater at depths less than 100 km compared to that at larger depths, indicating the existence of a high-attenuation anomalous structure under the lithosphere of central Iran. In addition, below ~180 km, the Qc value does not vary much with depth under either study area, indicating the presence of a transparent mantle under them.
Hybrid spin and valley quantum computing with singlet-triplet qubits.
Rohling, Niklas; Russ, Maximilian; Burkard, Guido
2014-10-24
The valley degree of freedom in the electronic band structure of silicon, graphene, and other materials is often considered to be an obstacle for quantum computing (QC) based on electron spins in quantum dots. Here we show that control over the valley state opens new possibilities for quantum information processing. Combining qubits encoded in the singlet-triplet subspace of spin and valley states allows for universal QC using a universal two-qubit gate directly provided by the exchange interaction. We show how spin and valley qubits can be separated in order to allow for single-qubit rotations.
Liu, Zhao; Zheng, Chaorong; Wu, Yue
2017-09-01
Wind profilers have been widely adopted to observe wind field information in the atmosphere for different purposes, but the accuracy of their observations is limited by various noise sources and disturbances and hence needs further improvement. In this paper, data measured under strong wind conditions using a 1290-MHz boundary layer profiler (BLP) are quality controlled via a composite quality control (QC) procedure proposed by the authors. Then, through comparison with data measured by radiosonde flights (balloon observations), the critical thresholds in the composite QC procedure, including the consensus average threshold T1 and the vertical shear threshold T3, are systematically discussed, and the performance of the BLP operated under precipitation is also evaluated. It is found that, to ensure high accuracy and a high data collectable rate, the optimal range of subsets is determined to be 4 m/s. Although the number of data rejected by the combined algorithm of vertical shear examination and small median test is quite limited, the algorithm proves quite useful for recognizing outliers with large discrepancies, and the optimal wind shear threshold T3 can be recommended as 5 m s⁻¹ per 100 m. During patchy precipitation, the quality of data measured by the four oblique beams (using the DBS measuring technique) can still be ensured. After the BLP data are quality controlled by the composite QC procedure, the output shows good agreement with the balloon observations.
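The two thresholds discussed above lend themselves to a compact illustration. The sketch below is a simplified reading of such a composite QC procedure, not the authors' algorithm: a consensus average that keeps the largest cluster of estimates agreeing within a window, and a shear check that flags levels exceeding T3. Function names and the test data are assumptions.

```python
import numpy as np

def consensus_average(estimates, window=4.0):
    """Keep the largest subset of velocity estimates that agree within
    `window` (m/s) and return its mean; a simplified consensus average."""
    v = np.sort(np.asarray(estimates, dtype=float))
    best = max((v[(v >= lo) & (v <= lo + window)] for lo in v), key=len)
    return best.mean()

def shear_flags(speed, height, t3=5.0):
    """Flag levels whose vertical wind shear exceeds t3 (m/s per 100 m)."""
    shear = np.abs(np.diff(speed) / np.diff(height)) * 100.0
    return np.concatenate([[False], shear > t3])

gate = [9.8, 10.1, 10.3, 10.2, 17.5, 9.9]   # repeated estimates for one gate
print(consensus_average(gate))               # outlier 17.5 excluded
print(shear_flags(np.array([8.0, 9.0, 16.0, 10.5]),
                  np.array([100.0, 200.0, 300.0, 400.0])))
```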
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ma, Biao; Yamaguchi, Keiichi; Fukuoka, Mayuko
To accelerate the logical drug design procedure, we created the program "NAGARA," a plugin for PyMOL, and applied it to the discovery of small compounds called medical chaperones (MCs) that stabilize the cellular form of a prion protein (PrP^C). In NAGARA, we constructed a single platform to unify the docking simulation (DS), free energy calculation by molecular dynamics (MD) simulation, and interfragment interaction energy (IFIE) calculation by quantum chemistry (QC) calculation. NAGARA also enables large-scale parallel computing via a convenient graphical user interface. Here, we demonstrated its performance and its broad applicability from drug discovery to lead optimization, with full compatibility with various experimental methods including Western blotting (WB) analysis, surface plasmon resonance (SPR), and nuclear magnetic resonance (NMR) measurements. Combining DS and WB, we discovered anti-prion activities for two compounds and tegobuvir (TGV), a non-nucleoside non-structural protein NS5B polymerase inhibitor showing activity against hepatitis C virus genotype 1. Binding profiles predicted by MD and QC are consistent with those obtained by SPR and NMR. Free energy analyses showed that these compounds stabilize the PrP^C conformation by decreasing the conformational fluctuation of the PrP^C. Because TGV has already been approved as a medicine, its extension to prion diseases is straightforward. Finally, we evaluated the affinities of the fragmented regions of TGV using QC and found a clue for its further optimization. By repeating WB, MD, and QC recursively, we were able to obtain the optimum lead structure. Highlights: • NAGARA integrates docking simulation, molecular dynamics, and quantum chemistry. • We found many compounds, e.g., tegobuvir (TGV), that exhibit anti-prion activities. • We obtained insights into the action mechanism of TGV as a medical chaperone. • Using QC, we obtained useful information for optimization of the lead compound, TGV. • NAGARA is a convenient platform for drug discovery and lead optimization.
Lee, Sejoon; Lee, Soohyun; Ouellette, Scott; Park, Woong-Yang; Lee, Eunjung A; Park, Peter J
2017-06-20
In many next-generation sequencing (NGS) studies, multiple samples or data types are profiled for each individual. An important quality control (QC) step in these studies is to ensure that datasets from the same subject are properly paired. Given the heterogeneity of data types, file types and sequencing depths in a multi-dimensional study, a robust program that provides a standardized metric for genotype comparisons would be useful. Here, we describe NGSCheckMate, a user-friendly software package for verifying sample identities from FASTQ, BAM or VCF files. This tool uses a model-based method to compare allele read fractions at known single-nucleotide polymorphisms, considering depth-dependent behavior of similarity metrics for identical and unrelated samples. Our evaluation shows that NGSCheckMate is effective for a variety of data types, including exome sequencing, whole-genome sequencing, RNA-seq, ChIP-seq, targeted sequencing and single-cell whole-genome sequencing, with a minimal requirement for sequencing depth (>0.5X). An alignment-free module can be run directly on FASTQ files for a quick initial check. We recommend using this software as a QC step in NGS studies. https://github.com/parklab/NGSCheckMate. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.
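A highly simplified version of the underlying idea, matching samples by correlating variant allele fractions (VAFs) at shared SNPs, can be sketched as follows. NGSCheckMate's actual model adjusts its threshold for sequencing depth; this toy uses a fixed, assumed cutoff and synthetic genotypes.

```python
import numpy as np

def vaf_match(vaf_a, vaf_b, cutoff=0.61):
    """Pearson correlation of variant allele fractions at shared SNPs.
    The cutoff value is an assumption for this sketch; the real tool
    chooses depth-dependent thresholds."""
    r = np.corrcoef(vaf_a, vaf_b)[0, 1]
    return r, bool(r > cutoff)

rng = np.random.default_rng(1)
truth = rng.choice([0.0, 0.5, 1.0], size=200)             # genotypes at 200 SNPs
same = np.clip(truth + rng.normal(0, 0.05, 200), 0, 1)    # same subject, noisy
other = rng.choice([0.0, 0.5, 1.0], size=200)             # unrelated subject
print(vaf_match(truth, same))    # high r -> datasets properly paired
print(vaf_match(truth, other))   # low r  -> likely sample swap
```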
The Ocean Observatories Initiative Data Management and QA/QC: Lessons Learned and the Path Ahead
NASA Astrophysics Data System (ADS)
Vardaro, M.; Belabbassi, L.; Garzio, L. M.; Knuth, F.; Smith, M. J.; Kerfoot, J.; Crowley, M. F.
2016-02-01
The Ocean Observatories Initiative (OOI) is a multi-decadal, NSF-funded program that will provide long-term, near real-time cabled and telemetered measurements of climate variability, ocean circulation, ecosystem dynamics, air-sea exchange, seafloor processes, and plate-scale geodynamics. The OOI platforms consist of seafloor sensors, fixed moorings, and mobile assets containing over 700 operational instruments in the Atlantic and Pacific oceans. Rutgers University operates the Cyberinfrastructure (CI) component of the OOI, which acquires, processes and distributes data to scientists, researchers, educators and the public. It will also provide observatory mission command and control, data assessment and distribution, and long-term data management. The Rutgers Data Management Team consists of a data manager and four data evaluators, who are tasked with ensuring data completeness and quality, as well as interaction with OOI users to facilitate data delivery and utility. Here we will discuss the procedures developed to guide the data team workflow, the automated QC algorithms and human-in-the-loop (HITL) annotations that are used to flag suspect data (whether due to instrument failures, biofouling, or unanticipated events), system alerts and alarms, long-term data storage and CF (Climate and Forecast) standard compliance, and the lessons learned during construction and the first several months of OOI operations.
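Automated QC algorithms of the kind mentioned above often start from simple per-datum range tests. A minimal sketch in the style of a QARTOD-like gross range test follows; the flag convention and thresholds are illustrative assumptions, not the OOI implementation.

```python
import numpy as np

# QARTOD-style flag convention assumed for illustration:
# 1 = pass, 3 = suspect, 4 = fail.
def gross_range_test(x, fail_lo, fail_hi, suspect_lo, suspect_hi):
    """Flag values outside operator ranges (suspect) or outside the
    sensor's physical limits (fail)."""
    flags = np.ones(len(x), dtype=int)
    flags[(x < suspect_lo) | (x > suspect_hi)] = 3
    flags[(x < fail_lo) | (x > fail_hi)] = 4
    return flags

sst = np.array([12.1, 12.3, 37.0, 11.9, -9.5])   # degrees C, invented data
print(gross_range_test(sst, fail_lo=-5, fail_hi=35, suspect_lo=0, suspect_hi=30))
# -> [1 1 4 1 4]
```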
Szaszkó, Mária; Hajdú, István; Flachner, Beáta; Dobi, Krisztina; Magyar, Csaba; Simon, István; Lőrincz, Zsolt; Kapui, Zoltán; Pázmány, Tamás; Cseh, Sándor; Dormán, György
2017-02-01
A glutaminyl cyclase (QC) fragment library was selected in silico by disconnection of the structures of known QC inhibitors and by lead-like 2D virtual screening of the same set. The resulting fragment library (204 compounds) was acquired from commercial suppliers and pre-screened by differential scanning fluorimetry followed by functional in vitro assays. In this way, 10 fragment hits were identified (≈5% hit rate; best inhibitory activity: 16 [Formula: see text]). The in vitro hits were then docked to the active site of QC, and the best-scoring compounds were analyzed for binding interactions. Two fragments bound to different regions in a complementary manner, and thus linking those fragments offered a rational strategy to generate novel QC inhibitors. Based on the structure of the virtual linked fragment, a 77-membered QC-target-focused library was selected from vendor databases and docked to the active site of QC. A PubChem search confirmed that the best-scoring analogues are novel, potential QC inhibitors.
SRT Evaluation of AIRS Version-6.02 and Version-6.02 AIRS Only (6.02 AO) Products
NASA Technical Reports Server (NTRS)
Susskind, Joel; Iredell, Lena; Molnar, Gyula; Blaisdell, John
2012-01-01
Version-6 contains a number of significant improvements over Version-5. This report compares Version-6 products resulting from the advances listed below to those from Version-5. 1. Improved methodology to determine skin temperature (T_s) and spectral emissivity (ε_v). 2. Use of a Neural-net start-up state. 3. Improvements which decrease the spurious negative Version-5 trend in tropospheric temperatures. 4. Improved QC methodology: Version-6 uses separate QC thresholds optimized for Data Assimilation (QC=0) and Climate applications (QC=0,1), respectively. 5. Channel-by-channel clear-column radiance (R̂_τ) QC flags. 6. Improved cloud parameter retrieval algorithm. 7. Improved OLR RTA. Our evaluation compared V6.02 and V6.02 AIRS Only (V6.02 AO) Quality Controlled products with those of Version-5.0. In particular, we evaluated surface skin temperature T_s, surface spectral emissivity ε_v, temperature profile T(p), water vapor profile q(p), OLR, OLR_CLR, effective cloud fraction α_ε, and cloud-cleared radiances R̂_τ. We conducted two types of evaluations. The first compared results on 7 focus days to collocated ECMWF truth. The seven focus days are: September 6, 2002; January 25, 2003; September 29, 2004; August 5, 2005; February 24, 2007; August 10, 2007; and May 30, 2010. In these evaluations, we show results for T_s, ε_v, T(p), and q(p) in terms of yields, RMS differences, and biases with regard to ECMWF. We also show yield trends as well as bias trends of these quantities relative to ECMWF truth, and the yields and accuracy of channel-by-channel QC'd values of R̂_τ for V6.02 and V6.02 AO; Version-5 did not contain channel-by-channel QC'd values of R̂_τ. In the second type of evaluation, we compared V6.03 monthly mean Level-3 products to those of Version-5.0 for four different months (January, April, July, and October) in three different years (2003, 2007, and 2011). In particular, we compared V6.03 and V5.0 trends of T(p), q(p), α_ε, OLR, and OLR_CLR computed based on results for these 12 time periods.
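In practice, applying the two QC thresholds described in item 4 is a masking operation on per-retrieval flags. A small illustrative sketch (the array contents are invented):

```python
import numpy as np

# Per-retrieval QC flags in the Version-6 convention: 0 = best, 1 = good,
# 2 = rejected. The skin-temperature values below are invented.
qc = np.array([0, 1, 2, 0, 1, 2, 0])
t_skin = np.array([288.1, 290.4, 275.0, 301.2, 285.7, 260.3, 295.5])

da_mask = qc == 0        # Data Assimilation applications use QC=0 only
clim_mask = qc <= 1      # Climate applications accept QC=0,1

print(f"DA mean: {t_skin[da_mask].mean():.1f} K (yield {da_mask.mean():.0%})")
print(f"Climate mean: {t_skin[clim_mask].mean():.1f} K (yield {clim_mask.mean():.0%})")
```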
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xu, Luyao; Curwen, Christopher; Chen, Daguan
A longstanding challenge for terahertz quantum-cascade (QC) lasers is achieving both high power and a high-quality beam pattern; this is due in part to their use of sub-wavelength metallic waveguides. Recently, the vertical-external-cavity surface-emitting laser (VECSEL) concept was demonstrated for the first time in the terahertz range and for a QC-laser. This is enabled by the development of an amplifying metasurface reflector capable of coupling incident free-space THz radiation to the QC-laser material such that it is amplified and re-radiated. The THz metasurface QC-VECSEL initiates a new approach for making QC-lasers with high power and excellent beam patterns. Furthermore, the ability to engineer the electromagnetic phase, amplitude, and polarization response of the metasurface enables lasers with new functionality. Our article provides an overview of the fundamental theory, design considerations, and recent results for high-performance THz QC-VECSELs.
Terahertz metasurface quantum-cascade VECSELs: theory and performance
Xu, Luyao; Curwen, Christopher; Chen, Daguan; ...
2017-04-12
A longstanding challenge for terahertz quantum-cascade (QC) lasers is achieving both high power and a high-quality beam pattern; this is due in part to their use of sub-wavelength metallic waveguides. Recently, the vertical-external-cavity surface-emitting laser (VECSEL) concept was demonstrated for the first time in the terahertz range and for a QC-laser. This is enabled by the development of an amplifying metasurface reflector capable of coupling incident free-space THz radiation to the QC-laser material such that it is amplified and re-radiated. The THz metasurface QC-VECSEL initiates a new approach for making QC-lasers with high power and excellent beam patterns. Furthermore, the ability to engineer the electromagnetic phase, amplitude, and polarization response of the metasurface enables lasers with new functionality. Our article provides an overview of the fundamental theory, design considerations, and recent results for high-performance THz QC-VECSELs.
Details on the verification test design, measurement test procedures, and Quality assurance/Quality Control (QA/QC) procedures can be found in the test plan titled Testing and Quality Assurance Plan, MIRATECH Corporation GECO 3100 Air/Fuel Ratio Controller (SRI 2001). It can be d...
Quevauviller, P; Bennink, D; Bøwadt, S
2001-05-01
It is now well recognised that the quality control (QC) of all types of analyses, including environmental analyses, depends on the appropriate use of reference materials. One way to check the accuracy of methods is based on the use of Certified Reference Materials (CRMs), whereas other types of (non-certified) Reference Materials (RMs) are used for routine quality control (establishment of control charts) and interlaboratory testing (e.g. proficiency testing). The perception of these materials, in particular with respect to their production and use, differs widely according to various perspectives (e.g. RM producers, routine laboratories, researchers). This review discusses some critical aspects of RM use and production for the QC of environmental analyses and describes the new approach followed by the Measurements & Testing Generic Activity (European Commission) to tackle new research and production needs.
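The routine control-chart use of RMs mentioned above can be illustrated with a generic Levey-Jennings-style check: limits are set from an in-control history of RM results, and new results are flagged against them. This is a minimal sketch under assumed limits and rules, which vary by laboratory.

```python
import numpy as np

def control_chart_flags(history, new_values, k_warn=2.0, k_action=3.0):
    """Flag new RM results against mean +/- k*SD limits established
    from an in-control history (generic Levey-Jennings-style chart)."""
    mu, sd = np.mean(history), np.std(history, ddof=1)
    z = (np.asarray(new_values) - mu) / sd
    return ["action" if abs(v) > k_action else
            "warning" if abs(v) > k_warn else
            "in-control" for v in z]

history = [10.1, 9.9, 10.0, 10.2, 9.8, 10.0, 10.1, 9.9]   # RM results, mg/L
print(control_chart_flags(history, [10.05, 10.3, 9.0]))
# -> ['in-control', 'warning', 'action']
```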
Srivastava, Praveen; Moorthy, Ganesh S.; Gross, Robert; Barrett, Jeffrey S.
2013-01-01
A selective and highly sensitive method for the determination of the non-nucleoside reverse transcriptase inhibitor (NNRTI) efavirenz in human plasma has been developed and fully validated based on high performance liquid chromatography tandem mass spectrometry (LC-MS/MS). Sample preparation involved protein precipitation followed by one-to-one dilution with water. The analyte, efavirenz, was separated by high performance liquid chromatography and detected with tandem mass spectrometry in negative ionization mode with multiple reaction monitoring (MRM). Efavirenz and 13C6-efavirenz (internal standard) were detected via the MRM transitions m/z 314.20 → 243.90 and m/z 320.20 → 249.90, respectively. A gradient program was used to elute the analytes, with 0.1% formic acid in water and 0.1% formic acid in acetonitrile as mobile phase solvents at a flow rate of 0.3 mL/min. The total run time was 5 min, and the retention times for the internal standard (13C6-efavirenz) and efavirenz were approximately 2.6 min. The calibration curves showed linearity (coefficient of regression, r > 0.99) over the concentration range of 1.0-2,500 ng/mL. The intraday precision, based on the standard deviation of replicates, was 9.24% at the lower limit of quantification (LLOQ) and ranged from 2.41% to 6.42% for quality control (QC) samples, with accuracy of 112% for the LLOQ and 100-111% for the QC samples. The interday precision was 12.3% for the LLOQ and 3.03-9.18% for the QC samples, and the accuracy was 108% for the LLOQ and 95.2-108% for the QC samples. Stability studies showed that efavirenz was stable under the expected conditions for sample preparation and storage. The lower limit of quantification for efavirenz was 1 ng/mL. The analytical method showed excellent sensitivity, precision, and accuracy. This method is robust and is being successfully applied for therapeutic drug monitoring and pharmacokinetic studies in HIV-infected patients. PMID:23755102
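The linearity acceptance reported above (r > 0.99, with back-calculated accuracy checks) corresponds to a weighted regression over the calibration standards. A sketch with invented peak-area ratios follows; 1/x weighting is a common choice for ranges this wide, though the paper does not state its weighting scheme.

```python
import numpy as np

# Nominal standards (ng/mL) spanning 1.0-2500 ng/mL and invented
# peak-area ratios measured for them.
conc = np.array([1.0, 5.0, 25.0, 100.0, 500.0, 1000.0, 2500.0])
ratio = np.array([0.011, 0.052, 0.249, 1.02, 4.95, 10.1, 24.8])

w = np.sqrt(1.0 / conc)                    # 1/x weighting of squared residuals
slope, intercept = np.polyfit(conc, ratio, 1, w=w)
r = np.corrcoef(conc, ratio)[0, 1]
back_calc = (ratio - intercept) / slope    # back-calculated concentrations
accuracy = 100.0 * back_calc / conc        # percent of nominal

print(f"r = {r:.4f} (acceptance: r > 0.99)")
print(np.round(accuracy, 1))               # typically required within 85-115%
```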
Jaccoulet, E; Schweitzer-Chaput, A; Toussaint, B; Prognon, P; Caudron, E
2018-09-01
Compounding of monoclonal antibodies (mAbs) is constantly increasing in hospitals. Quality control (QC) of compounded mAbs, based on quantification and identification, is required to prevent potential errors, and a fast method is needed to manage outpatient chemotherapy administration. A simple and ultra-fast (less than 30 s) method using flow injection analysis (FIA) combined with a least-squares matching method provided by the analyzer software was developed and evaluated for the routine hospital QC of three compounded mAbs: bevacizumab, infliximab and rituximab. The method was evaluated through qualitative and quantitative parameters. Preliminary analysis of the UV absorption and second-derivative spectra of the mAbs allowed us to adapt analytical conditions according to the therapeutic range of each mAb. In terms of quantitative QC, linearity, accuracy and precision were assessed as specified in ICH guidelines. Very satisfactory recovery was achieved, and the RSD (%) of the intermediate precision was less than 1.1%. Qualitative analytical parameters were also evaluated in terms of specificity, sensitivity and global precision through a confusion matrix. Results were concentration- and mAb-dependent, and excellent (100%) specificity and sensitivity were reached within a specific concentration range. Finally, routine application to "real life" samples (n = 209) from different batches of the three mAbs complied with the specifications of the quality control, i.e. excellent identification (100%) and concentrations within ±15% of the target within the calibration range. The successful combination of second-derivative spectroscopy and the least-squares matching method demonstrates the interest of FIA for the ultra-fast QC of mAbs after compounding. Copyright © 2018 Elsevier B.V. All rights reserved.
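A least-squares matching step of the general kind described above can be sketched as follows: the sample spectrum is compared against each reference spectrum after a best-fit scaling, and the smallest residual wins. The spectra below are synthetic Gaussians, not real mAb spectra, and this is not the analyzer vendor's algorithm.

```python
import numpy as np

def identify(spectrum, references):
    """Assign the sample to the reference giving the smallest least-squares
    residual after a best-fit scaling (concentration-independent match)."""
    scores = {}
    for name, ref in references.items():
        scale = np.dot(ref, spectrum) / np.dot(ref, ref)
        scores[name] = float(np.sum((spectrum - scale * ref) ** 2))
    return min(scores, key=scores.get), scores

wl = np.linspace(240, 320, 81)                            # wavelength grid, nm
refs = {"bevacizumab": np.exp(-((wl - 278) / 12.0) ** 2), # synthetic shapes
        "rituximab":   np.exp(-((wl - 281) / 10.0) ** 2)}
sample = 0.8 * refs["rituximab"] + np.random.default_rng(2).normal(0, 0.01, wl.size)
print(identify(sample, refs)[0])                          # -> 'rituximab'
```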
Diffusion imaging quality control via entropy of principal direction distribution.
Farzinfar, Mahshid; Oguz, Ipek; Smith, Rachel G; Verde, Audrey R; Dietrich, Cheryl; Gupta, Aditya; Escolar, Maria L; Piven, Joseph; Pujol, Sonia; Vachet, Clement; Gouttard, Sylvain; Gerig, Guido; Dager, Stephen; McKinstry, Robert C; Paterson, Sarah; Evans, Alan C; Styner, Martin A
2013-11-15
Diffusion MR imaging has received increasing attention in the neuroimaging community, as it yields new insights into the microstructural organization of white matter that are not available with conventional MRI techniques. While the technology has enormous potential, diffusion MRI suffers from a unique and complex set of image quality problems, limiting the sensitivity of studies and reducing the accuracy of findings. Furthermore, the acquisition time for diffusion MRI is longer than conventional MRI due to the need for multiple acquisitions to obtain directionally encoded Diffusion Weighted Images (DWI). This leads to increased motion artifacts, reduced signal-to-noise ratio (SNR), and increased proneness to a wide variety of artifacts, including eddy-current and motion artifacts, "venetian blind" artifacts, as well as slice-wise and gradient-wise inconsistencies. Such artifacts mandate stringent Quality Control (QC) schemes in the processing of diffusion MRI data. Most existing QC procedures are conducted in the DWI domain and/or on a voxel level, but our own experiments show that these methods often do not fully detect and eliminate certain types of artifacts, often only visible when investigating groups of DWIs or a derived diffusion model, such as the most-employed diffusion tensor imaging (DTI). Here, we propose a novel regional QC measure in the DTI domain that employs the entropy of the regional distribution of the principal directions (PD). The PD entropy quantifies the scattering and spread of the principal diffusion directions and is invariant to the patient's position in the scanner. A high entropy value indicates that the PDs are distributed relatively uniformly, while a low entropy value indicates the presence of clusters in the PD distribution. The novel QC measure is intended to complement the existing set of QC procedures by detecting and correcting residual artifacts. Such residual artifacts cause directional bias in the measured PD and are here called dominant direction artifacts. Experiments show that our automatic method can reliably detect and potentially correct such artifacts, especially the ones caused by the vibrations of the scanner table during the scan. The results further indicate the usefulness of this method for general quality assessment in DTI studies. Copyright © 2013 Elsevier Inc. All rights reserved.
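At its core, the proposed measure reduces to binning the principal directions and computing a Shannon entropy. A simplified numpy sketch follows; it uses equal-angle rather than equal-area binning and a global rather than regional computation, both simplifications of the measure described in the paper.

```python
import numpy as np

def pd_entropy(directions, n_bins=16):
    """Shannon entropy of principal diffusion directions. Directions are
    axial (v and -v equivalent), so vectors are folded onto the upper
    hemisphere before binning by spherical angle. Uniformly spread PDs
    give high entropy; a dominant direction gives low entropy."""
    v = np.array(directions, dtype=float)
    v /= np.linalg.norm(v, axis=1, keepdims=True)
    v[v[:, 2] < 0] *= -1.0                        # fold to upper hemisphere
    theta = np.arccos(np.clip(v[:, 2], -1.0, 1.0))
    phi = np.arctan2(v[:, 1], v[:, 0])
    hist, _, _ = np.histogram2d(theta, phi, bins=n_bins,
                                range=[[0.0, np.pi / 2], [-np.pi, np.pi]])
    p = hist.ravel() / hist.sum()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

rng = np.random.default_rng(3)
uniform = rng.normal(size=(5000, 3))              # roughly isotropic PDs
clustered = uniform * [0.05, 0.05, 1.0]           # dominant-direction artifact
print(pd_entropy(uniform), ">", pd_entropy(clustered))
```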
Diffusion imaging quality control via entropy of principal direction distribution
Oguz, Ipek; Smith, Rachel G.; Verde, Audrey R.; Dietrich, Cheryl; Gupta, Aditya; Escolar, Maria L.; Piven, Joseph; Pujol, Sonia; Vachet, Clement; Gouttard, Sylvain; Gerig, Guido; Dager, Stephen; McKinstry, Robert C.; Paterson, Sarah; Evans, Alan C.; Styner, Martin A.
2013-01-01
Diffusion MR imaging has received increasing attention in the neuroimaging community, as it yields new insights into the microstructural organization of white matter that are not available with conventional MRI techniques. While the technology has enormous potential, diffusion MRI suffers from a unique and complex set of image quality problems, limiting the sensitivity of studies and reducing the accuracy of findings. Furthermore, the acquisition time for diffusion MRI is longer than conventional MRI due to the need for multiple acquisitions to obtain directionally encoded Diffusion Weighted Images (DWI). This leads to increased motion artifacts, reduced signal-to-noise ratio (SNR), and increased proneness to a wide variety of artifacts, including eddy-current and motion artifacts, “venetian blind” artifacts, as well as slice-wise and gradient-wise inconsistencies. Such artifacts mandate stringent Quality Control (QC) schemes in the processing of diffusion MRI data. Most existing QC procedures are conducted in the DWI domain and/or on a voxel level, but our own experiments show that these methods often do not fully detect and eliminate certain types of artifacts, often only visible when investigating groups of DWIs or a derived diffusion model, such as the most-employed diffusion tensor imaging (DTI). Here, we propose a novel regional QC measure in the DTI domain that employs the entropy of the regional distribution of the principal directions (PD). The PD entropy quantifies the scattering and spread of the principal diffusion directions and is invariant to the patient's position in the scanner. A high entropy value indicates that the PDs are distributed relatively uniformly, while a low entropy value indicates the presence of clusters in the PD distribution. The novel QC measure is intended to complement the existing set of QC procedures by detecting and correcting residual artifacts. Such residual artifacts cause directional bias in the measured PD and are here called dominant direction artifacts. Experiments show that our automatic method can reliably detect and potentially correct such artifacts, especially the ones caused by the vibrations of the scanner table during the scan. The results further indicate the usefulness of this method for general quality assessment in DTI studies. PMID:23684874
Lin, Sabrina C.; Bays, Brett C.; Omaiye, Esther; Bhanu, Bir; Talbot, Prue
2016-01-01
There is a foundational need for quality control tools in stem cell laboratories engaged in basic research, regenerative therapies, and toxicological studies. These tools require automated methods for evaluating cell processes and quality during in vitro passaging, expansion, maintenance, and differentiation. In this paper, an unbiased, automated high-content profiling toolkit, StemCellQC, is presented that non-invasively extracts information on cell quality and cellular processes from time-lapse phase-contrast videos. Twenty-four (24) morphological and dynamic features were analyzed in healthy, unhealthy, and dying human embryonic stem cell (hESC) colonies to identify those features that were affected in each group. Multiple features differed in the healthy versus unhealthy/dying groups, and these features were linked to growth, motility, and death. Biomarkers were discovered that predicted cell processes before they were detectable by manual observation. StemCellQC distinguished healthy and unhealthy/dying hESC colonies with 96% accuracy by non-invasively measuring and tracking dynamic and morphological features over 48 hours. Changes in cellular processes can be monitored by StemCellQC and predictions can be made about the quality of pluripotent stem cell colonies. This toolkit reduced the time and resources required to track multiple pluripotent stem cell colonies and eliminated handling errors and false classifications due to human bias. StemCellQC provided both user-specified and classifier-determined analysis in cases where the affected features are not intuitive or anticipated. Video analysis algorithms allowed assessment of biological phenomena using automatic detection analysis, which can aid facilities where maintaining stem cell quality and/or monitoring changes in cellular processes are essential. In the future StemCellQC can be expanded to include other features, cell types, treatments, and differentiating cells. PMID:26848582
Zahedi, Atena; On, Vincent; Lin, Sabrina C; Bays, Brett C; Omaiye, Esther; Bhanu, Bir; Talbot, Prue
2016-01-01
There is a foundational need for quality control tools in stem cell laboratories engaged in basic research, regenerative therapies, and toxicological studies. These tools require automated methods for evaluating cell processes and quality during in vitro passaging, expansion, maintenance, and differentiation. In this paper, an unbiased, automated high-content profiling toolkit, StemCellQC, is presented that non-invasively extracts information on cell quality and cellular processes from time-lapse phase-contrast videos. Twenty-four (24) morphological and dynamic features were analyzed in healthy, unhealthy, and dying human embryonic stem cell (hESC) colonies to identify those features that were affected in each group. Multiple features differed in the healthy versus unhealthy/dying groups, and these features were linked to growth, motility, and death. Biomarkers were discovered that predicted cell processes before they were detectable by manual observation. StemCellQC distinguished healthy and unhealthy/dying hESC colonies with 96% accuracy by non-invasively measuring and tracking dynamic and morphological features over 48 hours. Changes in cellular processes can be monitored by StemCellQC and predictions can be made about the quality of pluripotent stem cell colonies. This toolkit reduced the time and resources required to track multiple pluripotent stem cell colonies and eliminated handling errors and false classifications due to human bias. StemCellQC provided both user-specified and classifier-determined analysis in cases where the affected features are not intuitive or anticipated. Video analysis algorithms allowed assessment of biological phenomena using automatic detection analysis, which can aid facilities where maintaining stem cell quality and/or monitoring changes in cellular processes are essential. In the future StemCellQC can be expanded to include other features, cell types, treatments, and differentiating cells.
QUALITY ASSURANCE AND QUALITY CONTROL FOR WASTE CONTAINMENT FACILITIES. Project Summary
It is generally agreed that both quality assurance (QA) and quality control (QC) are essential to the proper installation and eventual performance of environmentally safe and secure waste containment systems. Even further, there are both manufacturing and construction aspects to...
DOT National Transportation Integrated Search
2013-11-01
Current roadway quality control and quality acceptance (QC/QA) procedures for the Louisiana Department of Transportation and : Development (LADOTD) include coring for thickness, density, and air voids in hot mix asphalt (HMA) pavements and thickness ...
HANDBOOK: QUALITY ASSURANCE/QUALITY CONTROL (QA/QC) PROCEDURES FOR HAZARDOUS WASTE INCINERATION
Resource Conservation and Recovery Act regulations for hazardous waste incineration require trial burns by permit applicants. A Quality Assurance Project Plan (QAPjP) must accompany a trial burn plan with appropriate quality assurance/quality control procedures. Guidance on the prepa...
Rabe, Fran; Kadidlo, Diane; Van Orsow, Lisa; McKenna, David
2013-10-01
Qualification of a cord blood bank (CBB) is a complex process that includes evaluation of multiple aspects of donor screening and testing, processing, accreditation and approval by professional cell therapy groups, and results of received cord blood units. The University of Minnesota Medical Center Cell Therapy Laboratory has established a CBB vendor qualification process to ensure the CBB meets established regulatory and quality requirements. The deployed qualification of CBBs is based on retrospective and prospective review of the CBB. Forty-one CBBs were evaluated retrospectively: seven CBBs were disqualified based on failed quality control (QC) results. Eight CBBs did not meet the criteria for retrospective qualification because fewer than 3 cord blood units were received and the CBB was not accredited. As of March 2012, three US and one non-US CBBs have been qualified prospectively. One CBB withdrew from the qualification process after successful completion of the comprehensive survey and subsequent failure of the provided QC unit to pass the minimum criteria. One CBB failed the prospective qualification process based on processing methods that were revealed during the paper portion of the evaluation. A CBB qualification process is necessary for a transplant center to manage the qualification of the large number of CBBs needed to support an umbilical cord blood transplantation program. A transplant center that has utilized cord blood for a number of years before implementation of a qualification process should use a retrospective qualification process along with a prospective process. © 2013 American Association of Blood Banks.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Engels, J.
The Environmental Restoration (ER) Program was established for the investigation and remediation of inactive US Department of Energy (DOE) sites and facilities that have been declared surplus in terms of their previous uses. The purpose of this document is to specify ER requirements for quality control (QC) of analytical data. Activities throughout all phases of the investigation may affect the quality of the final data product and thus are subject to control specifications. Laboratory control is emphasized in this document; field concerns will be addressed in a companion document. Energy Systems, in its role of technical coordinator and at the request of DOE-OR, extends the application of these requirements to all participants in ER activities. Because every instance and concern may not be addressed in this document, participants are encouraged to discuss any questions with the ER Quality Assurance (QA) Office, the Analytical Environmental Support Group (AESG), or the Analytical Project Office (APO).
Visualization and Quality Control Web Tools for CERES Products
NASA Astrophysics Data System (ADS)
Mitrescu, C.; Doelling, D. R.
2017-12-01
The NASA CERES project continues to provide the scientific community with a wide variety of satellite-derived data products, such as observed TOA broadband shortwave and longwave fluxes, computed TOA and surface fluxes, as well as cloud, aerosol, and other atmospheric parameters. They encompass a wide range of temporal and spatial resolutions, suited to specific applications. CERES data are used mostly by climate modeling communities but also by a wide variety of educational institutions. To better serve our users, a web-based Ordering and Visualization Tool (OVT) was developed using Open Source Software such as Eclipse, Java, JavaScript, OpenLayers, Flot, Google Maps, Python, and others. Due to increased demand by our own scientists, we also implemented a series of specialized functions to be used in the process of CERES Data Quality Control (QC), such as 1-D and 2-D histograms, anomalies and differences, temporal and spatial averaging, and side-by-side parameter comparison, which made the QC process far easier and faster, but more importantly far more portable. With the integration of ground-site observed surface fluxes, we further enable the CERES project to QC the CERES computed surface fluxes. An overview of the CERES OVT basic functions using Open Source Software, as well as future steps in expanding its capabilities, will be presented at the meeting.
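Most of the specialized QC functions listed above are short operations on gridded products. A small numpy sketch with an invented monthly flux field (shapes and values are assumptions, not CERES data):

```python
import numpy as np

# An invented monthly TOA flux field (months x lat x lon), W/m^2.
flux = np.random.default_rng(4).normal(240.0, 8.0, size=(12, 180, 360))

hist1d, edges = np.histogram(flux, bins=50)      # 1-D QC histogram
climatology = flux.mean(axis=0)                  # temporal average
anomaly = flux - climatology                     # per-month anomaly maps
difference = flux[-1] - flux[0]                  # side-by-side comparison

# 2-D histogram pairing two months of the same parameter:
hist2d, xe, ye = np.histogram2d(flux[0].ravel(), flux[-1].ravel(), bins=40)
print(hist1d.sum(), float(anomaly.mean()), hist2d.sum())
```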
PACS 2000: quality control using the task allocation chart
NASA Astrophysics Data System (ADS)
Norton, Gary S.; Romlein, John R.; Lyche, David K.; Richardson, Ronald R., Jr.
2000-05-01
Medical imaging's technological evolution in the next century will continue to include Picture Archive and Communication Systems (PACS) and teleradiology. It is difficult to predict radiology's future in the new millennium, with both computed radiography and direct digital capture competing as the primary image acquisition methods for routine radiography. Changes in Computed Axial Tomography (CT) and Magnetic Resonance Imaging (MRI) continue to amaze the healthcare community. No matter how the acquisition, display, and archive functions change, Quality Control (QC) of the radiographic imaging chain will remain an important step in the imaging process. The Task Allocation Chart (TAC) is a tool that can be used in a medical facility's QC process to indicate the testing responsibilities of the image stakeholders and the medical informatics department. The TAC shows a grid of equipment to be serviced, tasks to be performed, and the organization assigned to perform each task. Additionally, skills, tasks, time, and references for each task can be provided. QC of the PACS must be stressed as a primary element of a PACS implementation. The TAC can be used to clarify responsibilities during warranty and paid maintenance periods. Establishing a TAC as part of a PACS implementation has a positive effect on patient care and clinical acceptance.
Autonomous Quality Control of Joint Orientation Measured with Inertial Sensors.
Lebel, Karina; Boissy, Patrick; Nguyen, Hung; Duval, Christian
2016-07-05
Clinical mobility assessment is traditionally performed in laboratories using complex and expensive equipment. The low accessibility to such equipment, combined with the emerging trend to assess mobility in a free-living environment, creates a need for body-worn sensors (e.g., inertial measurement units-IMUs) that are capable of measuring the complexity in motor performance using meaningful measurements, such as joint orientation. However, accuracy of joint orientation estimates using IMUs may be affected by environment, the joint tracked, type of motion performed and velocity. This study investigates a quality control (QC) process to assess the quality of orientation data based on features extracted from the raw inertial sensors' signals. Joint orientation (trunk, hip, knee, ankle) of twenty participants was acquired by an optical motion capture system and IMUs during a variety of tasks (sit, sit-to-stand transition, walking, turning) performed under varying conditions (speed, environment). An artificial neural network was used to classify good and bad sequences of joint orientation with a sensitivity and a specificity above 83%. This study confirms the possibility to perform QC on IMU joint orientation data based on raw signal features. This innovative QC approach may be of particular interest in a big data context, such as for remote-monitoring of patients' mobility.
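In outline, the QC process above is feature extraction from raw inertial signals followed by a binary classifier. The sketch below uses scikit-learn's MLPClassifier on synthetic sequences; the feature set, network size, and data are illustrative assumptions, not the study's configuration.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

def features(seq):
    """Raw-signal features per sequence: mean, SD, range, and mean
    absolute first difference (illustrative choices only)."""
    return [seq.mean(), seq.std(), seq.max() - seq.min(),
            np.abs(np.diff(seq)).mean()]

rng = np.random.default_rng(5)
good = [rng.normal(1.0, 0.2, 200) for _ in range(150)]    # stable sequences
bad = [rng.normal(1.0, 0.2, 200) + rng.normal(0.0, 1.5, 200)
       for _ in range(150)]                               # corrupted sequences
X = np.array([features(s) for s in good + bad])
y = np.array([0] * 150 + [1] * 150)                       # 1 = bad orientation data

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000,
                    random_state=0).fit(X_tr, y_tr)
print(f"held-out accuracy: {clf.score(X_te, y_te):.2f}")
```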
Kpaibe, André P S; Ben-Ameur, Randa; Coussot, Gaëlle; Ladner, Yoann; Montels, Jérôme; Ake, Michèle; Perrin, Catherine
2017-08-01
Snake venoms constitute a very promising resource for the development of new medicines. They are mainly composed of very complex peptide and protein mixtures, whose composition may vary significantly from batch to batch. This latter consideration is a challenge for routine quality control (QC) in the pharmaceutical industry. In this paper, we report the use of capillary zone electrophoresis (CZE) for the development of an analytical fingerprint methodology to assess the quality of snake venoms. The analytical fingerprint concept is widely used for the QC of herbal drugs but has so far rarely been applied to venom QC. CZE was chosen for its intrinsic efficiency in the separation of protein and peptide mixtures. The analytical fingerprint methodology was first developed and evaluated for a particular snake venom, Lachesis muta. Optimal analysis conditions required the use of a PDADMAC capillary coating to avoid protein and peptide adsorption. The same analytical conditions were then applied to other snake venom species. Different electrophoretic profiles were obtained for each venom. Excellent repeatability and intermediate precision were observed for each batch. Analysis of different batches of the same species revealed inherent qualitative and quantitative composition variations of the venoms between individuals. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Origin of the concept of the quiescent centre of plant roots.
Barlow, Peter W
2016-09-01
Concepts in biology feed into general theories of growth, development and evolution of organisms and how they interact with the living and non-living components of their environment. A well-founded concept clarifies unsolved problems and serves as a focus for further research. One such example of a constructive concept in the plant sciences is that of the quiescent centre (QC). In anatomical terms, the QC is an inert group of cells maintained within the apex of plant roots. However, the evidence that established the presence of a QC accumulated only gradually, making use of strands of different types of observations, notably from geometrical-analytical anatomy, radioisotope labelling and autoradiography. In their turn, these strands contributed to other concepts: those of the mitotic cell cycle and of tissue-related cell kinetics. Another important concept to which the QC contributed was that of tissue homeostasis. The general principle of this last-mentioned concept is expressed by the QC in relation to the recovery of root growth following a disturbance to cell proliferation; the resulting activation of the QC provides new cells which not only repair the root meristem but also re-establish a new QC.
Scheltema, Richard A; Mann, Matthias
2012-06-01
With the advent of high-throughput mass spectrometry (MS)-based proteomics, the magnitude and complexity of the performed experiments has increased dramatically. Likewise, investments in chromatographic and MS instrumentation are a large proportion of the budget of proteomics laboratories. Guarding measurement quality and maximizing uptime of the LC-MS/MS systems therefore requires constant care despite automated workflows. We describe a real-time surveillance system, called SprayQc, that continuously monitors the status of the peripheral equipment to ensure that operational parameters are within an acceptable range. SprayQc is composed of multiple plug-in software components that use computer vision to analyze electrospray conditions, monitor the chromatographic device for stable backpressure, interact with a column oven to control pressure by temperature, and ensure that the mass spectrometer is still acquiring data. Action is taken when a failure condition has been detected, such as stopping the column oven and the LC flow, as well as automatically notifying the appropriate operator. Additionally, all defined metrics can be recorded synchronized on retention time with the MS acquisition file, allowing for later inspection and providing valuable information for optimization. SprayQc has been extensively tested in our laboratory, supports third-party plug-in development, and is freely available for download from http://sourceforge.org/projects/sprayqc .
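At its core, each surveillance plug-in of the kind described above reduces to a monitored metric, an acceptance window, and an alert-plus-shutdown action. A generic Python sketch of that pattern follows; it is not SprayQc's plug-in API, and the threshold values are assumptions.

```python
def notify(message):
    """Stand-in for an operator notification (e-mail/SMS in a real system)."""
    print("ALERT:", message)

def monitor_backpressure(readings, lo=150.0, hi=400.0):
    """Check successive LC backpressure readings (bar) against an
    acceptance window; on failure, alert and stop. A real system would
    also halt the LC flow and park the column oven."""
    for p in readings:
        if not lo <= p <= hi:
            notify(f"backpressure {p:.0f} bar outside [{lo:.0f}, {hi:.0f}]")
            return False
    return True

print(monitor_backpressure([245.0, 251.0, 248.0]))   # stable run -> True
print(monitor_backpressure([245.0, 470.0]))          # blockage -> alert, False
```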
DOT National Transportation Integrated Search
2013-11-01
Current roadway quality control and quality acceptance (QC/QA) procedures for the Louisiana Department of Transportation and Development : (LADOTD) include coring for thickness, density, and air voids in hot mix asphalt (HMA) pavements and thickness ...
Assessment of in-situ test technology for construction control of base courses and embankments.
DOT National Transportation Integrated Search
2004-05-01
With the coming move from an empirical to mechanistic-empirical pavement design, it is essential to improve the quality control/quality assurance (QC/QA) procedures of compacted materials from a density-based criterion to a stiffness/strength-based c...
Operational CryoSat Product Quality Assessment
NASA Astrophysics Data System (ADS)
Mannan, Rubinder; Webb, Erica; Hall, Amanda; Bouzinac, Catherine
2013-12-01
The performance and quality of the CryoSat data products are routinely assessed by the Instrument Data quality Evaluation and Analysis Service (IDEAS). This information is then conveyed to the scientific and user community in order to allow them to utilise CryoSat data with confidence. This paper presents details of the Quality Control (QC) activities performed for CryoSat products under the IDEAS contract. Details of the different QC procedures and tools deployed by IDEAS to assess the quality of operational data are presented. The latest updates to the Instrument Processing Facility (IPF) for the Fast Delivery Marine (FDM) products and the future update to Baseline-C are discussed.
Energy transfer networks: Quasicontinuum photoluminescence linked to high densities of defects
Laurence, Ted A.; Ly, Sonny; Bude, Jeff D.; ...
2017-11-06
In a series of studies related to laser-induced damage of optical materials and deposition of plastics, we discovered a broadly emitting photoluminescence with fast lifetimes that we termed quasicontinuum photoluminescence (QC-PL). Here, we suggest that a high density of optically active defects leads to QC-PL, where interactions between defects affect the temporal and spectral characteristics of both excitation and emission. We develop a model that predicts the temporal characteristics of QC-PL, based on energy transfer interactions between high densities of defects. Our model does not explain all spectral broadening and redshifts found in QC-PL, since we do not model spectral changes in defects due to proximity to other defects. However, we do provide an example of a well-defined system that exhibits the QC-PL characteristics of a distribution in shortened lifetimes and broadened, redshifted energy levels: an organic chromophore (fluorescein) that has been dried rapidly on a fused silica surface. Recently, we showed that regions of fused silica exposed to up to 1 billion high-fluence laser shots at 351 nm at subdamage fluences exhibit significant transmission losses at the surface. Here, we find that these laser-exposed regions also exhibit QC-PL. Increases in the density of induced defects on these laser-exposed surfaces, as measured by the local transmission loss, lead to decreases in the observed lifetime and redshifts in the spectrum of the QC-PL, consistent with our explanation for QC-PL. In conclusion, we have found QC-PL in an increasing variety of situations and materials, and we believe it is a phenomenon commonly found on surfaces and nanostructured materials.
Energy transfer networks: Quasicontinuum photoluminescence linked to high densities of defects
DOE Office of Scientific and Technical Information (OSTI.GOV)
Laurence, Ted A.; Ly, Sonny; Bude, Jeff D.
In a series of studies related to laser-induced damage of optical materials and deposition of plastics, we discovered a broadly emitting photoluminescence with fast lifetimes that we termed quasicontinuum photoluminescence (QC-PL). Here, we suggest that a high density of optically active defects leads to QC-PL, where interactions between defects affect the temporal and spectral characteristics of both excitation and emission. We develop a model that predicts the temporal characteristics of QC-PL, based on energy transfer interactions between high densities of defects. Our model does not explain all spectral broadening and redshifts found in QC-PL, since we do not model spectral changes in defects due to proximity to other defects. However, we do provide an example of a well-defined system that exhibits the QC-PL characteristics of a distribution in shortened lifetimes and broadened, redshifted energy levels: an organic chromophore (fluorescein) that has been dried rapidly on a fused silica surface. Recently, we showed that regions of fused silica exposed to up to 1 billion high-fluence laser shots at 351 nm at subdamage fluences exhibit significant transmission losses at the surface. Here, we find that these laser-exposed regions also exhibit QC-PL. Increases in the density of induced defects on these laser-exposed surfaces, as measured by the local transmission loss, lead to decreases in the observed lifetime and redshifts in the spectrum of the QC-PL, consistent with our explanation for QC-PL. In conclusion, we have found QC-PL in an increasing variety of situations and materials, and we believe it is a phenomenon commonly found on surfaces and nanostructured materials.
Energy transfer networks: Quasicontinuum photoluminescence linked to high densities of defects
NASA Astrophysics Data System (ADS)
Laurence, Ted A.; Ly, Sonny; Bude, Jeff D.; Baxamusa, Salmaan H.; Lepró, Xavier; Ehrmann, Paul
2017-11-01
In a series of studies related to laser-induced damage of optical materials and deposition of plastics, we discovered a broadly emitting photoluminescence with fast lifetimes that we termed quasicontinuum photoluminescence (QC-PL). Here, we suggest that a high density of optically active defects leads to QC-PL, where interactions between defects affect the temporal and spectral characteristics of both excitation and emission. We develop a model that predicts the temporal characteristics of QC-PL, based on energy transfer interactions between high densities of defects. Our model does not explain all spectral broadening and redshifts found in QC-PL, since we do not model spectral changes in defects due to proximity to other defects. However, we do provide an example of a well-defined system that exhibits the QC-PL characteristics of a distribution in shortened lifetimes and broadened, redshifted energy levels: an organic chromophore (fluorescein) that has been dried rapidly on a fused silica surface. Recently, we showed that regions of fused silica exposed to up to 1 billion high-fluence laser shots at 351 nm at subdamage fluences exhibit significant transmission losses at the surface. Here, we find that these laser-exposed regions also exhibit QC-PL. Increases in the density of induced defects on these laser-exposed surfaces, as measured by the local transmission loss, lead to decreases in the observed lifetime and redshifts in the spectrum of the QC-PL, consistent with our explanation for QC-PL. We have found QC-PL in an increasing variety of situations and materials, and we believe it is a phenomenon commonly found on surfaces and nanostructured materials.
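The lifetime shortening reported in the entries above follows naturally from an energy-transfer picture. A toy Monte Carlo sketch, assuming Förster-type transfer rates k = (1/τ0)(R0/r)^6 to randomly placed neighbouring defects; the parameter values are arbitrary and this is not the authors' model code.

```python
import numpy as np

def mean_lifetime(density, r0=2.0, tau0=3.0, box=20.0, n_trials=2000, seed=6):
    """Monte Carlo mean emission lifetime of a defect surrounded by randomly
    placed neighbours; Forster-type transfer rates (1/tau0)(R0/r)^6 add to
    the intrinsic decay rate 1/tau0. Units are arbitrary."""
    rng = np.random.default_rng(seed)
    counts = rng.poisson(density * box**3, n_trials)
    taus = np.empty(n_trials)
    for i, n in enumerate(counts):
        pos = rng.uniform(-box / 2, box / 2, size=(n, 3))
        r = np.linalg.norm(pos, axis=1)
        r = r[r > 0.3]                        # exclude unphysical overlaps
        k = (1.0 / tau0) * (1.0 + np.sum((r0 / r) ** 6))
        taus[i] = 1.0 / k
    return taus.mean()

for rho in (0.001, 0.01, 0.1):                # defect number density
    print(rho, round(mean_lifetime(rho), 3))  # lifetime shortens with density
```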
Lierman, Sylvie; De Sutter, Petra; Dhont, Marc; Van der Elst, Josiane
2007-10-01
To submit different glove brands to double-quality control tests using mouse embryo assay (MEA) and the human sperm motility assay (HuSMA). Operator protection against infectious body fluid contamination is a safety issue in assisted reproductive technology (ART). When using gloves in the ART laboratory, toxic substances can be transmitted to culture media, even during brief contact. Quality control study of gloves in ART. University hospital-based infertility center. Seven- to 8-week-old female B6D2F1 hybrid mice. We tested two surgical, two cleanroom, and six examination glove brands. Only gloves brands that passed both HuSMA and MEA were submitted to further QC using zona-free and/or cryopreserved MEA. Sperm motility index, two-cell and blastocyst development, blastocyst total cell number. Quality control by MEA and HuSMA identified two glove brands to be nontoxic. Our study shows that gloves used in ART can be toxic and should be tested as part of an ongoing quality control program.
Greene, Karen E.
1997-01-01
A study of the ambient ground-water quality in the vicinity of Naval Submarine Base (SUBASE) Bangor was conducted to provide the U.S. Navy with background levels of selected constituents. The Navy needs this information to plan and manage cleanup activities on the base. During March and April 1995, 136 water-supply wells were sampled for common ions, trace elements, and organic compounds; not all wells were sampled for all constituents. Man-made organic compounds were detected in only two of fifty wells, and the sources of these organic compounds were attributed to activities in the immediate vicinities of these off-base wells. Drinking water standards for trichloroethylene, iron, and manganese were exceeded in one of these wells, which was probably contaminated by an old local (off-base) dump. Ground water from wells open to the following hydrogeologic units (in order from shallow to deep) was investigated: the Vashon till confining unit (Qvt, three wells); the Vashon aquifer (Qva, 54 wells); the Upper confining unit (QC1, 16 wells); the Permeable interbeds within QC1 (QC1pi, 34 wells); and the Sea-level aquifer (QA1, 29 wells). The 50th and 90th percentile ambient background levels of 35 inorganic constituents were determined for each hydrogeologic unit. At least ten measurements were required for a constituent in each hydrogeologic unit for determination of ambient background levels, and data for three wells determined to be affected by localized activities were excluded from these analyses. The only drinking water standards exceeded by ambient background levels were secondary maximum contaminant levels for iron (300 micrograms per liter), in QC1 and QC1pi, and manganese (50 micrograms per liter), in all of the units. The 90th percentile values for arsenic in QC1pi, QA1, and the entire study area are above 5 micrograms per liter, the Model Toxics Control Act Method A value for protecting drinking water, but well below the maximum contaminant level of 50 micrograms per liter for arsenic. The manganese standard was exceeded in 38 wells and the standard for iron was exceeded in 12 wells. Most of these wells were in QC1 or QC1pi and had dissolved oxygen concentrations of less than 1 milligram per liter and dissolved organic carbon concentrations greater than 1 milligram per liter. The dissolved oxygen concentration is generally lower in the deeper units, while pH increases; the recommended pH range of 6.5-8.5 standard units was exceeded in 9 wells. The common-ion chemistry was similar for all of the units.
Novelo-Casanova, D. A.; Lee, W.H.K.
1991-01-01
Using simulated coda waves, the resolution of the single-scattering model to extract coda Q (Qc) and its power-law frequency dependence was tested. The back-scattering model of Aki and Chouet (1975) and the single isotropic-scattering model of Sato (1977) were examined. The results indicate that: (1) the input Qc models are reasonably well approximated by the two methods; (2) almost equal Qc values are recovered when the techniques sample the same coda windows; (3) low-Qc models are well estimated in the frequency domain from the early and late parts of the coda; and (4) models with high Qc values are more accurately extracted from late coda measurements. © 1991 Birkhäuser Verlag.
Code of Federal Regulations, 2010 CFR
2010-04-01
...' Benefits EMPLOYMENT AND TRAINING ADMINISTRATION, DEPARTMENT OF LABOR QUALITY CONTROL IN THE FEDERAL-STATE... QC unit. The organizational location of this unit shall be positioned to maximize its objectivity, to... organizational conflict of interest. ...
78 FR 48766 - Petition for Waiver of Compliance
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-09
...'s Network Management Center in Montreal, QC, Canada. CP operates approximately six to eight trains a day over this segment. The trackage is operated under a Centralized Traffic Control system and...
CARINA data synthesis project: pH data scale unification and cruise adjustments
NASA Astrophysics Data System (ADS)
Velo, A.; Pérez, F. F.; Lin, X.; Key, R. M.; Tanhua, T.; de La Paz, M.; van Heuven, S.; Jutterström, S.; Ríos, A. F.
2009-10-01
Data on carbon and carbon-relevant hydrographic and hydrochemical parameters from previously non-publicly available cruise data sets in the Arctic Mediterranean Seas (AMS), Atlantic and Southern Ocean have been retrieved and merged to a new database: CARINA (CARbon IN the Atlantic). These data have gone through rigorous quality control (QC) procedures to assure the highest possible quality and consistency. The data for most of the measured parameters in the CARINA database were objectively examined in order to quantify systematic differences in the reported values, i.e. secondary quality control. Systematic biases found in the data have been corrected in the data products, i.e. three merged data files with measured, calculated and interpolated data for each of the three CARINA regions; AMS, Atlantic and Southern Ocean. Out of a total of 188 cruise entries in the CARINA database, 59 reported pH measured values. Here we present details of the secondary QC on pH for the CARINA database. Procedures of quality control, including crossover analysis between cruises and inversion analysis of all crossover data, are briefly described. Adjustments were applied to the pH values for 21 of the cruises in the CARINA dataset. With these adjustments the CARINA database is consistent both internally as well as with GLODAP data, an oceanographic data set based on the World Hydrographic Program in the 1990s. Based on our analysis we estimate the internal accuracy of the CARINA pH data to be 0.005 pH units. The CARINA data are now suitable for accurate assessments of, for example, oceanic carbon inventories and uptake rates and for model validation.
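The crossover analysis mentioned in this and the following CARINA entries compares deep-water values of two cruises at nearby stations, where temporal variability is small. A minimal sketch with synthetic profiles follows; the real procedure uses weighted offsets and an inversion across all crossovers rather than a single mean difference.

```python
import numpy as np

def crossover_offset(z_a, ph_a, z_b, ph_b, z_min=1500.0, dz=50.0):
    """Mean deep-water difference (cruise A minus cruise B) at a crossover:
    both profiles are interpolated onto a common depth grid below z_min,
    where temporal variability is small."""
    z = np.arange(z_min, min(z_a.max(), z_b.max()), dz)
    return float((np.interp(z, z_a, ph_a) - np.interp(z, z_b, ph_b)).mean())

z = np.array([0.0, 500.0, 1000.0, 1500.0, 2000.0, 3000.0, 4000.0])
ph_a = np.array([8.10, 7.95, 7.80, 7.75, 7.74, 7.73, 7.72])
ph_b = ph_a + 0.008                # cruise B biased high by 0.008 pH units
print(f"offset = {crossover_offset(z, ph_a, z, ph_b):.3f}")   # -> -0.008
```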
CARINA TCO2 data in the Atlantic Ocean
NASA Astrophysics Data System (ADS)
Pierrot, D.; Brown, P.; van Heuven, S.; Tanhua, T.; Schuster, U.; Wanninkhof, R.; Key, R. M.
2010-01-01
Water column data of carbon and carbon-relevant hydrographic and hydrochemical parameters from 188 cruises in the Arctic, Atlantic and Southern Ocean have been retrieved and merged in a new data base: the CARINA (CARbon IN the Atlantic) Project. These data have gone through rigorous quality control (QC) procedures to assure the highest possible quality and consistency. Secondary quality control, which involved objective study of data in order to quantify systematic differences in the reported values, was performed for the pertinent parameters in the CARINA data base. Systematic biases in the data have been corrected in the data products. The products are three merged data files with measured, adjusted and interpolated data of all cruises for each of the three CARINA regions (Arctic, Atlantic and Southern Ocean). Ninety-eight cruises were conducted in the "Atlantic" defined as the region south of the Greenland-Iceland-Scotland Ridge and north of about 30° S. Here we report the details of the secondary QC which was done on the total dissolved inorganic carbon (TCO2) data and the adjustments that were applied to yield the final data product in the Atlantic. Procedures of quality control - including crossover analysis between stations and inversion analysis of all crossover data - are briefly described. Adjustments were applied to TCO2 measurements for 17 of the cruises in the Atlantic Ocean region. With these adjustments, the CARINA data base is consistent both internally as well as with GLODAP data, an oceanographic data set based on the WOCE Hydrographic Program in the 1990s, and is now suitable for accurate assessments of, for example, regional oceanic carbon inventories, uptake rates and model validation.
CARINA alkalinity data in the Atlantic Ocean
NASA Astrophysics Data System (ADS)
Velo, A.; Perez, F. F.; Brown, P.; Tanhua, T.; Schuster, U.; Key, R. M.
2009-08-01
Data on carbon and carbon-relevant hydrographic and hydrochemical parameters from previously non-publicly available cruise data sets in the Arctic, Atlantic and Southern Ocean have been retrieved and merged into a new database: CARINA (CARbon IN the Atlantic). These data have gone through rigorous quality control (QC) procedures to assure the highest possible quality and consistency. The data for most of the measured parameters in the CARINA database were objectively examined in order to quantify systematic differences in the reported values, i.e. secondary quality control. Systematic biases found in the data have been corrected in the data products, i.e. three merged data files with measured, calculated and interpolated data for each of the three CARINA regions: Arctic, Atlantic and Southern Ocean. Out of a total of 188 cruise entries in the CARINA database, 98 were conducted in the Atlantic Ocean and of these, 75 cruises report alkalinity values. Here we present details of the secondary QC on alkalinity for the Atlantic Ocean part of CARINA. Procedures of quality control, including crossover analysis between cruises and inversion analysis of all crossover data, are briefly described. Adjustments were applied to the alkalinity values for 16 of the cruises in the Atlantic Ocean region. With these adjustments the CARINA database is consistent both internally as well as with GLODAP data, an oceanographic data set based on the WOCE Hydrographic Program in the 1990s. Based on our analysis we estimate the internal accuracy of the CARINA-ATL alkalinity data to be 3.3 μmol kg^-1. The CARINA data are now suitable for accurate assessments of, for example, oceanic carbon inventories and uptake rates and for model validation.
CARINA: nutrient data in the Atlantic Ocean
NASA Astrophysics Data System (ADS)
Tanhua, T.; Brown, P. J.; Key, R. M.
2009-11-01
Data on carbon and carbon-relevant hydrographic and hydrochemical parameters from previously non-publicly available cruise data sets in the Arctic, Atlantic and Southern Ocean have been retrieved and merged into a new database: CARINA (CARbon IN the Atlantic). These data have gone through rigorous quality control (QC) procedures to assure the highest possible quality and consistency. The data for most of the measured parameters in the CARINA database were objectively examined in order to quantify systematic differences in the reported values, i.e. secondary quality control. Systematic biases found in the data have been corrected in the data products, i.e. three merged data files with measured, calculated and interpolated data for each of the three CARINA regions: Arctic Mediterranean Seas, Atlantic and Southern Ocean. Out of a total of 188 cruise entries in the CARINA database, 98 were conducted in the Atlantic Ocean and of these 84 cruises report nitrate values, 79 silicate, and 78 phosphate. Here we present details of the secondary QC for nutrients for the Atlantic Ocean part of CARINA. Procedures of quality control, including crossover analysis between cruises and inversion analysis of all crossover data, are briefly described. Adjustments were applied to the nutrient values for 43 of the cruises in the Atlantic Ocean region. With these adjustments the CARINA database is consistent both internally as well as with GLODAP data, an oceanographic data set based on the WOCE Hydrographic Program in the 1990s (Key et al., 2004). Based on our analysis we estimate the internal accuracy of the CARINA-ATL nutrient data to be: nitrate 1.5%; phosphate 2.6%; silicate 3.1%. The CARINA data are now suitable for accurate assessments of, for example, oceanic carbon inventories and uptake rates and for model validation.
Hoang, Van-Hai; Tran, Phuong-Thao; Cui, Minghua; Ngo, Van T H; Ann, Jihyae; Park, Jongmi; Lee, Jiyoun; Choi, Kwanghyun; Cho, Hanyang; Kim, Hee; Ha, Hee-Jin; Hong, Hyun-Seok; Choi, Sun; Kim, Young-Ho; Lee, Jeewoo
2017-03-23
Glutaminyl cyclase (QC) has been implicated in the formation of toxic amyloid plaques by generating the N-terminal pyroglutamate of β-amyloid peptides (pGlu-Aβ) and thus may participate in the pathogenesis of Alzheimer's disease (AD). We designed a library of glutaminyl cyclase (QC) inhibitors based on the proposed binding mode of the preferred substrate, Aβ3E-42. An in vitro structure-activity relationship study identified several excellent QC inhibitors demonstrating 5- to 40-fold increases in potency compared to a known QC inhibitor. When tested in mouse models of AD, compound 212 significantly reduced the brain concentrations of pyroform Aβ and total Aβ and restored cognitive functions. This potent Aβ-lowering effect was achieved by incorporating an additional binding region into our previously established pharmacophoric model, resulting in strong interactions with the carboxylate group of Glu327 in the QC binding site. Our study offers useful insights in designing novel QC inhibitors as a potential treatment option for AD.
Lin, Jou-Wei; Yang, Chen-Wei
2010-01-01
The objective of this study was to develop and validate an automated acquisition system to assess quality of care (QC) measures for cardiovascular diseases. This system, combining searching and retrieval algorithms, was designed to extract QC measures from electronic discharge notes and to estimate attainment rates against the current standards of care. It was developed on patients with ST-segment elevation myocardial infarction and tested on patients with unstable angina/non-ST-segment elevation myocardial infarction, two diseases sharing almost the same QC measures. In the test set, the system reached reasonable agreement with medical experts (κ values from 0.65 for early reperfusion rate to 0.97 for β-blockers and lipid-lowering agents before discharge) across the different QC measures, and it was then applied to evaluate QC in patients who underwent coronary artery bypass grafting surgery. The results validated a new tool to reliably extract QC measures for cardiovascular diseases. PMID:20442141
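For reference, the κ statistic quoted above is Cohen's chance-corrected agreement. A minimal computation from a 2x2 confusion table (the counts are hypothetical, not the study's):

```python
# Cohen's kappa from a 2x2 confusion table of system vs. expert judgments.
def cohens_kappa(a: int, b: int, c: int, d: int) -> float:
    """a, d = agreements (yes/yes, no/no); b, c = disagreements."""
    n = a + b + c + d
    p_obs = (a + d) / n
    p_exp = ((a + b) * (a + c) + (c + d) * (b + d)) / n**2
    return (p_obs - p_exp) / (1 - p_exp)

print(cohens_kappa(a=40, b=5, c=3, d=52))  # ~0.84
```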
Wu, Vincent W.; Dana, Craig M.; Iavarone, Anthony T.; ...
2017-01-17
The breakdown of plant biomass to simple sugars is essential for the production of second-generation biofuels and high-value bioproducts. Currently, enzymes produced from filamentous fungi are used for deconstructing plant cell wall polysaccharides into fermentable sugars for biorefinery applications. A post-translational N-terminal pyroglutamate modification observed in some of these enzymes occurs when N-terminal glutamine or glutamate is cyclized to form a five-membered ring. This modification has been shown to confer resistance to thermal denaturation for CBH-1 and EG-1 cellulases. In mammalian cells, the formation of pyroglutamate is catalyzed by glutaminyl cyclases. Using the model filamentous fungus Neurospora crassa, we identified two genes (qc-1 and qc-2) that encode proteins homologous to mammalian glutaminyl cyclases. We show that qc-1 and qc-2 are essential for catalyzing the formation of an N-terminal pyroglutamate on CBH-1 and GH5-1. CBH-1 and GH5-1 produced in a Δqc-1 Δqc-2 mutant, and thus lacking the N-terminal pyroglutamate modification, showed greater sensitivity to thermal denaturation and, for GH5-1, susceptibility to proteolytic cleavage. QC-1 and QC-2 are endoplasmic reticulum (ER)-localized proteins. The pyroglutamate modification is predicted to occur in a number of additional fungal proteins that have diverse functions. The identification of glutaminyl cyclases in fungi may have implications for production of lignocellulolytic enzymes, heterologous expression, and biotechnological applications revolving around protein stability.
Revision 2 of the Enbridge Quality Assurance Project Plan
This Quality Assurance Project Plan (QAPP) presents Revision 2 of the organization, objectives, planned activities, and specific quality assurance/quality control (QA/QC) procedures associated with the Enbridge Marshall Pipeline Release Project.
DOT National Transportation Integrated Search
2009-07-01
Current roadway quality control and quality acceptance (QC/QA) procedures for Louisiana include coring for thickness, density, and air void checks in hot mix asphalt (HMA) pavements and thickness and compressive strength for Portland cement con...
Xu, Weichen; Jimenez, Rod Brian; Mowery, Rachel; Luo, Haibin; Cao, Mingyan; Agarwal, Nitin; Ramos, Irina; Wang, Xiangyang; Wang, Jihong
2017-10-01
During manufacturing and storage, therapeutic proteins are subject to various post-translational modifications (PTMs), such as isomerization, deamidation, oxidation, disulfide bond modifications and glycosylation. Certain PTMs may affect bioactivity, stability or the pharmacokinetic and pharmacodynamic profile and are therefore classified as potential critical quality attributes (pCQAs). Identifying, monitoring and controlling these PTMs are usually key elements of the Quality by Design (QbD) approach. Traditionally, multiple analytical methods are utilized for these purposes, which is time-consuming and costly. In recent years, multi-attribute monitoring methods have been developed in the biopharmaceutical industry. However, these methods combine high-end mass spectrometry with complicated data analysis software, which could pose difficulty when implemented in a quality control (QC) environment. Here we report a multi-attribute method (MAM) using a Quadrupole Dalton (QDa) mass detector to selectively monitor and quantitate PTMs in a therapeutic monoclonal antibody. The result output from the QDa-based MAM is straightforward and automatic. Evaluation results indicate that this method provides results comparable to those of the traditional assays. To ensure future application in the QC environment, this method was qualified according to the International Conference on Harmonization (ICH) guideline and applied in the characterization of drug substance and stability samples. The QDa-based MAM is shown to be an extremely useful tool for product and process characterization studies that facilitates facile understanding of process impact on multiple quality attributes, while being QC-friendly and cost-effective.
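The core MAM quantitation reduces to simple arithmetic once peak areas are extracted: each attribute is reported as the modified fraction of a peptide's total signal. A minimal sketch, with hypothetical peptide names and XIC areas rather than values from this study:

```python
# Relative abundance of a modification from extracted-ion-chromatogram
# (XIC) peak areas. Peptide names and areas are hypothetical.
xic_areas = {
    "HC_T12": {"unmodified": 9.2e6, "deamidated": 3.1e5},
    "HC_T25": {"unmodified": 4.7e6, "oxidized": 6.4e4},
}

def percent_modified(areas: dict) -> float:
    """Modified area as a percentage of total (modified + unmodified)."""
    unmod = areas["unmodified"]
    mod = sum(v for key, v in areas.items() if key != "unmodified")
    return 100.0 * mod / (mod + unmod)

for peptide, areas in xic_areas.items():
    print(f"{peptide}: {percent_modified(areas):.2f}% modified")
```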
Position paper: recommendations for a digital mammography quality assurance program V4.0.
Heggie, J C P; Barnes, P; Cartwright, L; Diffey, J; Tse, J; Herley, J; McLean, I D; Thomson, F J; Grewal, R K; Collins, L T
2017-09-01
In 2001 the ACPSEM published a position paper on quality assurance in screen-film mammography, which was subsequently adopted as a basis for the quality assurance programs of both the Royal Australian and New Zealand College of Radiologists (RANZCR) and of BreastScreen Australia. Since then the clinical implementation of digital mammography has been realised and it has become evident that existing screen-film protocols were not appropriate to assure the required image quality needed for reliable diagnosis or to address the new dose implications resulting from digital technology. In addition, the advantages and responsibilities inherent in teleradiology are most critical in mammography and also need to be addressed. The current document is the result of a review of current overseas practice and local experience in these areas. At this time the technology of digital imaging is undergoing significant development and there is still a lack of full international consensus about some of the detailed quality control (QC) tests that should be included in quality assurance (QA) programs. This document describes the current status of digital mammography QA and recommends test procedures that may be suitable in the Australasian environment. For completeness, this document also includes a review of the QA programs required for the various types of digital biopsy units used in mammography. In the future, international harmonisation of digital quality assurance in mammography and changes in the technology may require a review of this document. Version 2.0 represented the first of these updates, with key changes relating to image quality evaluation, ghost image evaluation and interpretation of signal-to-noise ratio measurements. In Version 3.0, some significant changes were introduced in light of further experience gained in testing digital mammography equipment. In Version 4.0, further changes have been made; most notably, digital breast tomosynthesis (DBT) testing and QC have been addressed. Some additional testing for conventional projection imaging has been added so that sites may have the capability to undertake dose surveys to confirm compliance with diagnostic reference levels (DRLs) that may be established at the national or state level. A key recommendation is that dosimetry calculations are now to be undertaken using the methodology of Dance et al. Some minor changes to existing facility QC tests have been made to ensure the suggested procedures align with those most recently adopted by the Royal Australian and New Zealand College of Radiologists and BreastScreen Australia. Future updates of this document may be provided as deemed necessary in electronic format on the ACPSEM's website (https://www.acpsem.org.au/whatacpsemdoes/standards-position-papers; see also http://www.ranzcr.edu.au/quality-a-safety/radiology/practice-quality-activities/mqap).
Maurer, Matthew J.; Spear, Eric D.; Yu, Allen T.; Lee, Evan J.; Shahzad, Saba; Michaelis, Susan
2016-01-01
Cellular protein quality control (PQC) systems selectively target misfolded or otherwise aberrant proteins for degradation by the ubiquitin-proteasome system (UPS). How cells discern abnormal from normal proteins remains incompletely understood, but involves in part the recognition between ubiquitin E3 ligases and degradation signals (degrons) that are exposed in misfolded proteins. PQC is compartmentalized in the cell, and a great deal has been learned in recent years about ER-associated degradation (ERAD) and nuclear quality control. In contrast, a comprehensive view of cytosolic quality control (CytoQC) has yet to emerge, and will benefit from the development of a well-defined set of model substrates. In this study, we generated an isogenic “degron library” in Saccharomyces cerevisiae consisting of short sequences appended to the C-terminus of a reporter protein, Ura3. About half of these degron-containing proteins are substrates of the integral membrane E3 ligase Doa10, which also plays a pivotal role in ERAD and some nuclear protein degradation. Notably, some of our degron fusion proteins exhibit dependence on the E3 ligase Ltn1/Rkr1 for degradation, apparently by a mechanism distinct from its known role in ribosomal quality control of translationally paused proteins. Ubr1 and San1, E3 ligases involved in the recognition of some misfolded CytoQC substrates, are largely dispensable for the degradation of our degron-containing proteins. Interestingly, the Hsp70/Hsp40 chaperone/cochaperones Ssa1,2 and Ydj1, are required for the degradation of all constructs tested. Taken together, the comprehensive degron library presented here provides an important resource of isogenic substrates for testing candidate PQC components and identifying new ones. PMID:27172186
40 CFR 98.424 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 40 Protection of Environment 22 2012-07-01 2012-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Suppliers of Carbon Dioxide § 98.424 Monitoring and QA/QC... determine quantity in accordance with this paragraph. (i) Reporters that supply CO2 in containers using...
40 CFR 98.424 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 40 Protection of Environment 22 2013-07-01 2013-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Suppliers of Carbon Dioxide § 98.424 Monitoring and QA/QC... determine quantity in accordance with this paragraph. (i) Reporters that supply CO2 in containers using...
40 CFR 98.334 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 40 Protection of Environment 22 2012-07-01 2012-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Zinc Production § 98.334 Monitoring and QA/QC requirements. If..., belt weigh feeders, weighed purchased quantities in shipments or containers, combination of bulk...
40 CFR 98.94 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 21 2011-07-01 2011-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Electronics Manufacturing § 98.94 Monitoring and QA/QC requirements. (a) For calendar year 2011 monitoring, you may follow the provisions in paragraphs (a)(1) through...
40 CFR 98.424 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 21 2011-07-01 2011-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Suppliers of Carbon Dioxide § 98.424 Monitoring and QA/QC... determine quantity in accordance with this paragraph. (i) Reporters that supply CO2 in containers using...
40 CFR 98.334 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 21 2011-07-01 2011-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Zinc Production § 98.334 Monitoring and QA/QC requirements. If..., belt weigh feeders, weighed purchased quantities in shipments or containers, combination of bulk...
40 CFR 98.334 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 40 Protection of Environment 21 2014-07-01 2014-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Zinc Production § 98.334 Monitoring and QA/QC requirements. If..., belt weigh feeders, weighed purchased quantities in shipments or containers, combination of bulk...
40 CFR 98.94 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 40 Protection of Environment 22 2013-07-01 2013-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Electronics Manufacturing § 98.94 Monitoring and QA/QC requirements. (a) For calendar year 2011 monitoring, you may follow the provisions in paragraphs (a)(1) through...
40 CFR 98.424 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 40 Protection of Environment 21 2014-07-01 2014-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Suppliers of Carbon Dioxide § 98.424 Monitoring and QA/QC... determine quantity in accordance with this paragraph. (i) Reporters that supply CO2 in containers using...
40 CFR 98.94 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 40 Protection of Environment 22 2012-07-01 2012-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Electronics Manufacturing § 98.94 Monitoring and QA/QC requirements. (a) For calendar year 2011 monitoring, you may follow the provisions in paragraphs (a)(1) through...
40 CFR 98.334 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 40 Protection of Environment 22 2013-07-01 2013-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Zinc Production § 98.334 Monitoring and QA/QC requirements. If..., belt weigh feeders, weighed purchased quantities in shipments or containers, combination of bulk...
40 CFR 98.94 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 40 Protection of Environment 21 2014-07-01 2014-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Electronics Manufacturing § 98.94 Monitoring and QA/QC...-specific heel factors for each container type for each gas used, according to the procedures in paragraphs...
Measurement of pulmonary capillary blood flow in infants by plethysmography.
Stocks, J; Costeloe, K; Winlove, C P; Godfrey, S
1977-01-01
An accurate method for measuring effective pulmonary capillary blood flow (Qc eff) in infants has been developed with an adaptation of the plethysmographic technique. Measurements were made on 19 preterm, 14 small-for-dates, and 7 full-term normal infants with a constant-volume whole-body plethysmograph in which the infant rebreathed nitrous oxide. There was a highly significant correlation between Qc eff and body weight, and this relationship was unaffected by premature delivery or intrauterine growth retardation. Mean Qc eff in preterm, small-for-dates, and full-term infants was 203, 208 and 197 ml min^-1 kg^-1, respectively, with no significant differences between the groups. A significant negative correlation existed between Qc eff and haematocrit in the preterm infants. There was no relationship between weight-standardized Qc eff and postnatal age in any of the groups. With this technique, it was possible to readily recognise the presence of rapid recirculation (indicative of shunting) in several of the infants, suggesting that rebreathing methods for the assessment of Qc eff should not be applied indiscriminately during the neonatal period. By taking care to overcome the potential sources of technical error, it was possible to obtain highly reproducible results of Qc eff in infants over a wider age range than has been previously reported. PMID:838861
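The calculation underlying such rebreathing measurements is the Fick principle for a soluble gas: Qc eff equals the N2O uptake rate divided by the amount of N2O carried away per millilitre of blood. The sketch below is a rough illustration under that assumption; the solubility figure is the commonly cited Ostwald coefficient for N2O in blood, and the remaining numbers are invented:

```python
# Fick-principle arithmetic behind the plethysmographic measurement: blood
# flow equals the N2O uptake rate divided by the N2O carried away per ml of
# blood (solubility times alveolar partial pressure). Illustrative numbers.
v_n2o = 21.0    # ml N2O absorbed per minute, from the plethysmograph record
alpha = 0.47    # Ostwald solubility of N2O in blood at 37 C (ml/ml/atm)
p_alv = 0.15    # alveolar N2O partial pressure, atm

qc_eff = v_n2o / (alpha * p_alv)   # ml of blood per minute
weight_kg = 1.5                    # hypothetical preterm infant
print(f"Qc eff = {qc_eff:.0f} ml/min = {qc_eff / weight_kg:.0f} ml/min/kg")
```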
Quantum cascade transmitters for ultrasensitive chemical agent and explosives detection
NASA Astrophysics Data System (ADS)
Schultz, John F.; Taubman, Matthew S.; Harper, Warren W.; Williams, Richard M.; Myers, Tanya L.; Cannon, Bret D.; Sheen, David M.; Anheier, Norman C., Jr.; Allen, Paul J.; Sundaram, S. K.; Johnson, Bradley R.; Aker, Pamela M.; Wu, Ming C.; Lau, Erwin K.
2003-07-01
The small size, high power, promise of access to any wavelength between 3.5 and 16 microns, substantial tuning range about a chosen center wavelength, and general robustness of quantum cascade (QC) lasers provide opportunities for new approaches to ultra-sensitive chemical detection and other applications in the mid-wave infrared. PNNL is developing novel remote and sampling chemical sensing systems based on QC lasers, using QC lasers loaned by Lucent Technologies. In recent months laboratory cavity-enhanced sensing experiments have achieved absorption sensitivities of 8.5 × 10^-11 cm^-1 Hz^-1/2, and the PNNL team has begun monostatic and bi-static frequency modulated, differential absorption lidar (FM DIAL) experiments at ranges of up to 2.5 kilometers. In related work, PNNL and UCLA are developing miniature QC laser transmitters with the multiplexed tunable wavelengths, frequency and amplitude stability, modulation characteristics, and power levels needed for chemical sensing and other applications. Current miniaturization concepts envision coupling QC oscillators, QC amplifiers, frequency references, and detectors with miniature waveguides and waveguide-based modulators, isolators, and other devices formed from chalcogenide or other types of glass. Significant progress has been made on QC laser stabilization and amplification, and on development and characterization of high-purity chalcogenide glasses, waveguide writing techniques, and waveguide metrology.
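To put the quoted sensitivity in context, Beer-Lambert arithmetic converts a noise-equivalent absorption into an approximate detection limit. Only the sensitivity figure below comes from the text; the cross-section, bandwidth and number density are assumptions:

```python
import math

# Back-of-envelope detection limit via Beer-Lambert (alpha = sigma * N).
nea = 8.5e-11                  # noise-equivalent absorption, cm^-1 Hz^-1/2
bandwidth = 1.0                # detection bandwidth, Hz
alpha_min = nea * math.sqrt(bandwidth)   # minimum detectable absorption, cm^-1

sigma = 1.0e-18                # peak absorption cross-section, cm^2 (assumed)
n_min = alpha_min / sigma      # minimum detectable molecules per cm^3
n_total = 2.5e19               # molecules per cm^3 at ~1 atm, 296 K
print(f"detection limit = {n_min / n_total * 1e12:.1f} ppt")
```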
NASA Technical Reports Server (NTRS)
Fargion, Giulietta S.; Barnes, Robert; McClain, Charles
2001-01-01
The purpose of this technical report is to provide current documentation of the Sensor Intercomparison and Merger for Biological and Interdisciplinary Oceanic Studies (SIMBIOS) Project Office activities on in situ aerosol optical thickness (i.e., protocols, and data QC and analysis). This documentation is necessary to ensure that critical information is relayed to the scientific community and NASA management. This critical information includes the technical difficulties and challenges of validating and combining ocean color data from an array of independent satellite systems to form consistent and accurate global bio-optical time series products. This technical report is not meant as a substitute for scientific literature. Instead, it will provide a ready and responsive vehicle for the multitude of technical reports issued by an operational project.
DOT National Transportation Integrated Search
2008-04-01
The objective of this study was to develop resilient modulus prediction models for possible application in the quality control/quality assurance (QC/QA) procedures during and after the construction of pavement layers. Field and laboratory testing pro...
USDA-ARS?s Scientific Manuscript database
A multi-laboratory broth microdilution method trial was performed to standardize the specialized test conditions required for the fish pathogens Flavobacterium columnare and F. psychrophilum. Nine laboratories tested the quality control (QC) strains Escherichia coli ATCC 25922 and Aeromonas salmonicid...
DOT National Transportation Integrated Search
2011-06-01
The main objective of this study is to investigate the use of the semi-circular bend (SCB) test as a quality assurance/quality control (QA/QC) measure for field construction. Comparison of fracture properties from the SCB test and fatigue beam te...
Sun, Yujia; Lan, Xianyong; Lei, Chuzhao; Zhang, Chunlei; Chen, Hong
2015-06-01
The aim of this study was to examine the association of cofilin 2 (CFL2) gene polymorphisms with growth traits in Chinese Qinchuan (QC) cattle. Three single nucleotide polymorphisms (SNPs) were identified in the bovine CFL2 gene using DNA sequencing and (forced) PCR-RFLP methods: a missense mutation (NC_007319.5:g.C2213G) in exon 4, a synonymous mutation (NC_007319.5:g.T1694A) in exon 4, and a mutation (NC_007319.5:g.G1500A) in intron 2. In addition, we evaluated the haplotype frequencies and linkage disequilibrium coefficients of the three sequence variants in 488 QC cattle. All three SNPs in QC cattle belonged to an intermediate level of genetic diversity (0.25
NASA Astrophysics Data System (ADS)
Le, Loan T.
Over the span of more than 20 years of development, the Quantum Cascade (QC) laser has positioned itself as the most viable mid-infrared (mid-IR) light source. Today's QC lasers emit watts of continuous-wave power at room temperature. Despite significant progress, the mid-IR region remains vastly under-utilized. State-of-the-art QC lasers are found in high-power defense applications and detection of trace gases with narrow absorption lines. A large number of applications, however, do not require so much power, but rather a broadly tunable laser source to detect molecules with broad absorption features. As such, a QC laser that is broadly tunable over the entire biochemical fingerprinting region remains the missing link to markets such as non-invasive biomedical diagnostics, food safety, and stand-off detection in turbid media. In this thesis, we detail how we utilized the inherent flexibility of the QC design space to conceive a new type of laser with the potential to bridge that missing link of the QC laser to large commercial markets. Our design concept, the Super Cascade (SC) laser, works contrary to conventional laser design principles by supporting multiple independent optical transitions, each contributing to broadening the gain spectrum. We have demonstrated a room-temperature laser gain medium with electroluminescence spanning 3.3-12.5 μm and laser emission from 6.2-12.5 μm, the record spectral width for any solid-state laser gain medium. This gain bandwidth covers the entire biochemical fingerprinting region. The achievement of such a spectrally broad gain medium presents engineering challenges of how to optimally utilize the bandwidth. As of this work, a monolithically integrated array of Distributed Feedback QC (DFB-QC) lasers is one of the most promising ways to fully utilize the SC gain bandwidth. Therefore, in this thesis, we explore ways of improving the yield and ease of fabrication of DFB-QC lasers, including a re-examination of the role of current spreading in QC geometry.
DOT National Transportation Integrated Search
2010-06-01
This manual provides information and recommended procedures to be utilized by an agency's Weigh-in-Motion (WIM) Office Data Analyst to perform validation and quality control (QC) checks of WIM traffic data. This manual focuses on data generated by ...
Analysis of QA procedures at the Oregon Department of Transportation.
DOT National Transportation Integrated Search
2010-06-01
This research explored the Oregon Department of Transportation (ODOT) practice of Independent Assurance (IA), for validation of the contractor's test methods, and Verification, for validation of the contractor's Quality Control (QC) data. The...
40 CFR 98.474 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 40 Protection of Environment 22 2013-07-01 2013-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Injection of Carbon Dioxide § 98.474 Monitoring and QA/QC.... (2) You must determine the quarterly mass or volume of contents in all containers if you receive CO2...
40 CFR 98.474 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 40 Protection of Environment 21 2014-07-01 2014-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Injection of Carbon Dioxide § 98.474 Monitoring and QA/QC.... (2) You must determine the quarterly mass or volume of contents in all containers if you receive CO2...
40 CFR 98.474 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 40 Protection of Environment 22 2012-07-01 2012-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Injection of Carbon Dioxide § 98.474 Monitoring and QA/QC.... (2) You must determine the quarterly mass or volume of contents in all containers if you receive CO2...
40 CFR 98.424 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Suppliers of Carbon Dioxide § 98.424 Monitoring and QA/QC... containers shall measure the mass in each CO2 container using weigh bills, scales, or load cells and sum the...
Study of quantum correlation swapping with relative entropy methods
NASA Astrophysics Data System (ADS)
Xie, Chuanmei; Liu, Yimin; Chen, Jianlan; Zhang, Zhanjun
2016-02-01
To generate long-distance shared quantum correlations (QCs) for information processing in future quantum networks, recently we proposed the concept of QC repeater and its kernel technique named QC swapping. Besides, we extensively studied the QC swapping between two simple QC resources (i.e., a pair of Werner states) with four different methods to quantify QCs (Xie et al. in Quantum Inf Process 14:653-679, 2015). In this paper, we continue to treat the same issue by employing other three different methods associated with relative entropies, i.e., the MPSVW method (Modi et al. in Phys Rev Lett 104:080501, 2010), the Zhang method (arXiv:1011.4333 [quant-ph]) and the RS method (Rulli and Sarandy in Phys Rev A 84:042109, 2011). We first derive analytic expressions of all QCs which occur during the swapping process and then reveal their properties about monotonicity and threshold. Importantly, we find that a long-distance shared QC can be generated from two short-distance ones via QC swapping indeed. In addition, we simply compare our present results with our previous ones.
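The three methods compared here are all built on the quantum relative entropy S(ρ‖σ) = Tr[ρ(log2 ρ - log2 σ)]; the full QC measures additionally minimize such a distance over a suitable set of reference states. A minimal numerical sketch for a two-qubit Werner state, with an arbitrary mixing parameter:

```python
import numpy as np
from scipy.linalg import logm

def relative_entropy(rho: np.ndarray, sigma: np.ndarray) -> float:
    """Quantum relative entropy S(rho||sigma) = Tr[rho (log2 rho - log2 sigma)].

    Assumes the support of rho lies within the support of sigma;
    otherwise the quantity diverges.
    """
    log2 = lambda m: logm(m) / np.log(2.0)
    return float(np.real(np.trace(rho @ (log2(rho) - log2(sigma)))))

# A two-qubit Werner state, the resource used in the cited QC-swapping
# studies (the mixing parameter 0.8 is arbitrary here).
psi_minus = np.array([0.0, 1.0, -1.0, 0.0]) / np.sqrt(2.0)
rho_w = 0.8 * np.outer(psi_minus, psi_minus) + 0.2 / 4.0 * np.eye(4)

# Distance to the maximally mixed state (a simple reference, not the
# minimization over classical states that the full QC measures perform).
print(relative_entropy(rho_w, np.eye(4) / 4.0))  # ~1.15 bits
```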
Quantum Testbeds Stakeholder Workshop (QTSW) Report meeting purpose and agenda.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hebner, Gregory A.
Quantum computing (QC) is a promising early-stage technology with the potential to provide scientific computing capabilities far beyond what is possible with even an Exascale computer in specific problems of relevance to the Office of Science. These include (but are not limited to) materials modeling, molecular dynamics, and quantum chromodynamics. However, commercial QC systems are not yet available, and the technical maturity of current QC hardware, software, algorithms, and systems integration is woefully incomplete. Thus, there is a significant opportunity for DOE to define the technology building blocks and solve the system integration issues to enable a revolutionary tool. Once realized, QC will have world-changing impact on economic competitiveness, the scientific enterprise, and citizen well-being. Prior to this workshop, DOE's Office of Advanced Scientific Computing Research (ASCR) hosted a workshop in 2015 to explore QC scientific applications. The goal of that workshop was to assess the viability of QC technologies to meet the computational requirements in support of DOE's science and energy mission and to identify the potential impact of these technologies.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tandon, Lav; Kuhn, Kevin J; Drake, Lawrence R
Los Alamos National Laboratory's (LANL) Actinide Analytical Chemistry (AAC) group has been in existence since the Manhattan Project. It maintains a complete set of analytical capabilities for performing complete characterization (elemental assay, isotopic composition, and metallic and non-metallic trace impurities) of uranium and plutonium samples in different forms. For a majority of the customers there are strong quality assurance (QA) and quality control (QC) objectives, including the highest accuracy and precision with well-defined uncertainties associated with the analytical results. Los Alamos participates in various international and national programs, such as the Plutonium Metal Exchange Program, New Brunswick Laboratory's (NBL's) Safeguards Measurement Evaluation Program (SME) and several other inter-laboratory round-robin exercises, to monitor and evaluate the data quality generated by AAC. These programs also provide independent verification of analytical measurement capabilities, and allow any technical problems with analytical measurements to be identified and corrected. This presentation will focus on key analytical capabilities for destructive analysis in AAC and also comparative data between LANL and peer groups for Pu assay and isotopic analysis.
panelcn.MOPS: Copy-number detection in targeted NGS panel data for clinical diagnostics.
Povysil, Gundula; Tzika, Antigoni; Vogt, Julia; Haunschmid, Verena; Messiaen, Ludwine; Zschocke, Johannes; Klambauer, Günter; Hochreiter, Sepp; Wimmer, Katharina
2017-07-01
Targeted next-generation-sequencing (NGS) panels have largely replaced Sanger sequencing in clinical diagnostics. They allow for the detection of copy-number variations (CNVs) in addition to single-nucleotide variants and small insertions/deletions. However, existing computational CNV detection methods have shortcomings regarding accuracy, quality control (QC), incidental findings, and user-friendliness. We developed panelcn.MOPS, a novel pipeline for detecting CNVs in targeted NGS panel data. Using data from 180 samples, we compared panelcn.MOPS with five state-of-the-art methods. With panelcn.MOPS leading the field, most methods achieved comparably high accuracy. panelcn.MOPS reliably detected CNVs ranging in size from part of a region of interest (ROI), to whole genes, which may comprise all ROIs investigated in a given sample. The latter is enabled by analyzing reads from all ROIs of the panel, but presenting results exclusively for user-selected genes, thus avoiding incidental findings. Additionally, panelcn.MOPS offers QC criteria not only for samples, but also for individual ROIs within a sample, which increases the confidence in called CNVs. panelcn.MOPS is freely available both as R package and standalone software with graphical user interface that is easy to use for clinical geneticists without any programming experience. panelcn.MOPS combines high sensitivity and specificity with user-friendliness rendering it highly suitable for routine clinical diagnostics. © 2017 The Authors. Human Mutation published by Wiley Periodicals, Inc.
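panelcn.MOPS itself is built on a mixture-of-Poissons read-count model; the toy sketch below shows only the general idea behind coverage-based CNV screening in panel data (normalized read-count ratios against a reference set), with entirely hypothetical counts and thresholds:

```python
import numpy as np

# Illustrative read-count-ratio CNV screen for one sample against reference
# samples. ROI names and counts are hypothetical.
roi_names = ["NF1_ex1", "NF1_ex2", "NF1_ex3", "NF1_ex4"]
sample = np.array([512.0, 498.0, 260.0, 530.0])     # sample read counts
reference = np.array([[505, 510, 495, 520],
                      [480, 470, 505, 500],
                      [530, 525, 515, 540]], dtype=float)

# Normalize by total coverage to remove library-size effects.
sample_n = sample / sample.sum()
ref_n = reference / reference.sum(axis=1, keepdims=True)

ratio = sample_n / ref_n.mean(axis=0)  # ~1.0 diploid, ~0.5 het del, ~1.5 dup
for name, r in zip(roi_names, ratio):
    call = "deletion" if r < 0.75 else "duplication" if r > 1.25 else "normal"
    print(f"{name}: ratio={r:.2f} -> {call}")
```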
Lapse time and frequency-dependent coda wave attenuation for Delhi and its surrounding regions
NASA Astrophysics Data System (ADS)
Das, Rabin; Mukhopadhyay, Sagarika; Singh, Ravi Kant; Baidya, Pushap R.
2018-07-01
Attenuation of seismic wave energy in Delhi and its surrounding regions has been estimated using the coda of local earthquakes. The estimated quality factor (Qc) values are strongly dependent on frequency and lapse time. The frequency dependence of Qc has been estimated from the relationship Qc(f) = Q0 f^n for different lapse-time window lengths; Q0 and n vary from 73 to 453 and from 0.97 to 0.63, respectively, for lapse-time window lengths of 15 s to 90 s. The average estimated frequency-dependent relation for the entire region is Qc(f) = (135 ± 8) f^(0.96 ± 0.02) for a window length of 30 s, where the average Qc value varies from 200 at 1.5 Hz to 1962 at 16 Hz. These values show that the region is seismically active and highly heterogeneous. The entire study region is divided into two sub-regions according to the geology of the area to investigate whether there is a spatial variation in attenuation characteristics in this region. It is observed that at smaller lapse times both regions have similar Qc values. However, at larger lapse times the rate of increase of Qc with frequency is larger for Region 2 than for Region 1. This is understandable, as Region 2 is closer to the tectonically more active Himalayan ranges and is seismically more active than Region 1. The difference in the variation of Qc with frequency between the two regions is such that at larger lapse times and higher frequencies Region 2 shows higher Qc than Region 1; at lower frequencies the opposite holds. This indicates a systematic variation in attenuation characteristics from the south (Region 1) to the north (Region 2) in the deeper part of the study area. This variation can be explained in terms of an increase in heat flow and a decrease in the age of the rocks from south to north.
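Fitting the power law Qc(f) = Q0 f^n reported above is a linear regression in log-log space. A short sketch with illustrative frequency/Qc pairs chosen to roughly match the quoted averages:

```python
import numpy as np

# Fit Qc(f) = Q0 * f^n via ln Qc = ln Q0 + n * ln f.
f = np.array([1.5, 3.0, 6.0, 12.0, 16.0])            # Hz
qc = np.array([200.0, 390.0, 760.0, 1480.0, 1962.0])  # illustrative coda Q

n, log_q0 = np.polyfit(np.log(f), np.log(qc), deg=1)
q0 = np.exp(log_q0)
print(f"Qc(f) = {q0:.0f} * f^{n:.2f}")               # close to 135 * f^0.96
```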
Summation rules for a fully nonlocal energy-based quasicontinuum method
NASA Astrophysics Data System (ADS)
Amelang, J. S.; Venturini, G. N.; Kochmann, D. M.
2015-09-01
The quasicontinuum (QC) method coarse-grains crystalline atomic ensembles in order to bridge the scales from individual atoms to the micro- and mesoscales. A crucial cornerstone of all QC techniques, summation or quadrature rules efficiently approximate the thermodynamic quantities of interest. Here, we investigate summation rules for a fully nonlocal, energy-based QC method to approximate the total Hamiltonian of a crystalline atomic ensemble by a weighted sum over a small subset of all atoms in the crystal lattice. Our formulation does not conceptually differentiate between atomistic and coarse-grained regions and thus allows for seamless bridging without domain-coupling interfaces. We review traditional summation rules and discuss their strengths and weaknesses with a focus on energy approximation errors and spurious force artifacts. Moreover, we introduce summation rules which produce no residual or spurious force artifacts in centrosymmetric crystals in the large-element limit under arbitrary affine deformations in two dimensions (and marginal force artifacts in three dimensions), while allowing us to seamlessly bridge to full atomistics. Through a comprehensive suite of examples with spatially non-uniform QC discretizations in two and three dimensions, we compare the accuracy of the new scheme to various previous ones. Our results confirm that the new summation rules exhibit significantly smaller force artifacts and energy approximation errors. Our numerical benchmark examples include the calculation of elastic constants from completely random QC meshes and the inhomogeneous deformation of aggressively coarse-grained crystals containing nano-voids. In the elastic regime, we directly compare QC results to those of full atomistics to assess global and local errors in complex QC simulations. Going beyond elasticity, we illustrate the performance of the energy-based QC method with the new second-order summation rule by the help of nanoindentation examples with automatic mesh adaptation. Overall, our findings provide guidelines for the selection of summation rules for the fully nonlocal energy-based QC method.
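The essence of a summation rule is to replace the full lattice sum E = Σi Ei by a weighted sum E ≈ Σs ws Es over a small set of sampling atoms. The toy one-dimensional sketch below conveys the idea only; its sampling pattern and weights are ad hoc, not the paper's optimized rules:

```python
import numpy as np

# Toy 1-D illustration: approximate the total energy of a harmonic chain by
# a weighted sum over every 50th atom. The weights are ad hoc; real QC
# summation rules are chosen to suppress spurious force artifacts.
n = 1001
x = np.linspace(0.0, 1.0, n)
u = 0.01 * np.sin(np.pi * x)      # smooth displacement field
k = 1.0                           # spring constant

def site_energy(i: int) -> float:
    """Half of each adjacent bond's harmonic energy, attributed to site i."""
    e = 0.0
    if i > 0:
        e += 0.25 * k * (u[i] - u[i - 1]) ** 2
    if i < n - 1:
        e += 0.25 * k * (u[i + 1] - u[i]) ** 2
    return e

exact = sum(site_energy(i) for i in range(n))

samples = np.arange(0, n, 50)              # 21 sampling atoms
weights = np.full(samples.size, 50.0)      # each represents ~50 sites
weights[[0, -1]] = 25.5                    # end correction; weights sum to n
approx = float(weights @ np.array([site_energy(i) for i in samples]))
print(f"exact={exact:.4e}  approx={approx:.4e}")
```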
Olijnyk, Nicholas V
2018-01-01
This study performed two phases of analysis to shed light on the performance and thematic evolution of China's quantum cryptography (QC) research. First, large-scale research publication metadata derived from QC research published from 2001-2017 was used to examine the research performance of China relative to that of global peers using established quantitative and qualitative measures. Second, this study identified the thematic evolution of China's QC research using co-word cluster network analysis, a computational science mapping technique. The results from the first phase indicate that over the past 17 years, China's performance has evolved dramatically, placing it in a leading position. Among the most significant findings is the exponential rate at which all of China's performance indicators (i.e., Publication Frequency, citation score, H-index) are growing. China's H-index (a normalized indicator) has surpassed all other countries' over the last several years. The second phase of analysis shows how China's main research focus has shifted among several QC themes, including quantum-key-distribution, photon-optical communication, network protocols, and quantum entanglement with an emphasis on applied research. Several themes were observed across time periods (e.g., photons, quantum-key-distribution, secret-messages, quantum-optics, quantum-signatures); some themes disappeared over time (e.g., computer-networks, attack-strategies, bell-state, polarization-state), while others emerged more recently (e.g., quantum-entanglement, decoy-state, unitary-operation). Findings from the first phase of analysis provide empirical evidence that China has emerged as the global driving force in QC. Considering China is the premier driving force in global QC research, findings from the second phase of analysis provide an understanding of China's QC research themes, which can provide clarity into how QC technologies might take shape. QC and science and technology policy researchers can also use these findings to trace previous research directions and plan future lines of research.
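Of the indicators tracked in this study, the H-index has the simplest definition: the largest h such that h publications each have at least h citations. A minimal implementation with hypothetical citation counts:

```python
def h_index(citations: list[int]) -> int:
    """Largest h such that at least h papers have >= h citations each."""
    h = 0
    for rank, cites in enumerate(sorted(citations, reverse=True), start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Hypothetical citation counts for one country's QC publications.
print(h_index([45, 30, 22, 10, 9, 8, 3, 1]))  # -> 6
```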
Adjustment of pesticide concentrations for temporal changes in analytical recovery, 1992–2010
Martin, Jeffrey D.; Eberle, Michael
2011-01-01
Recovery is the proportion of a target analyte that is quantified by an analytical method and is a primary indicator of the analytical bias of a measurement. Recovery is measured by analysis of quality-control (QC) water samples that have known amounts of target analytes added ("spiked" QC samples). For pesticides, recovery is the measured amount of pesticide in the spiked QC sample expressed as a percentage of the amount spiked, ideally 100 percent. Temporal changes in recovery have the potential to adversely affect time-trend analysis of pesticide concentrations by introducing trends in apparent environmental concentrations that are caused by trends in performance of the analytical method rather than by trends in pesticide use or other environmental conditions. This report presents data and models related to the recovery of 44 pesticides and 8 pesticide degradates (hereafter referred to as "pesticides") that were selected for a national analysis of time trends in pesticide concentrations in streams. Water samples were analyzed for these pesticides from 1992 through 2010 by gas chromatography/mass spectrometry. Recovery was measured by analysis of pesticide-spiked QC water samples. Models of recovery, based on robust, locally weighted scatterplot smooths (lowess smooths) of matrix spikes, were developed separately for groundwater and stream-water samples. The models of recovery can be used to adjust concentrations of pesticides measured in groundwater or stream-water samples to 100 percent recovery to compensate for temporal changes in the performance (bias) of the analytical method.
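Operationally, an adjustment of this kind evaluates the recovery model at a sample's collection date and rescales the measured concentration to 100% recovery. A sketch with hypothetical spike results, using statsmodels' lowess as a stand-in for the report's robust smooths:

```python
import numpy as np
from statsmodels.nonparametric.smoothers_lowess import lowess

# Hypothetical spiked-QC results: decimal year vs. measured recovery (%).
spike_time = np.array([1992.1, 1994.6, 1997.2, 2000.8, 2004.3, 2008.9])
recovery_pct = np.array([92.0, 88.0, 81.0, 85.0, 95.0, 103.0])

# Lowess smooth of recovery through time (returns sorted (x, fitted) pairs).
smooth = lowess(recovery_pct, spike_time, frac=0.6, return_sorted=True)

def adjust(conc: float, when: float) -> float:
    """Scale a measured concentration to 100% recovery at time `when`."""
    modeled = np.interp(when, smooth[:, 0], smooth[:, 1])
    return conc * 100.0 / modeled

print(adjust(0.042, 1998.0))  # e.g., micrograms per liter, scaled up
```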
Countably QC-Approximating Posets
Mao, Xuxin; Xu, Luoshan
2014-01-01
As a generalization of countably C-approximating posets, the concept of countably QC-approximating posets is introduced. With the countably QC-approximating property, some characterizations of generalized completely distributive lattices and generalized countably approximating posets are given. The main results are as follows: (1) a complete lattice is generalized completely distributive if and only if it is countably QC-approximating and weakly generalized countably approximating; (2) a poset L having countably directed joins is generalized countably approximating if and only if the lattice σ_c(L)^op of all σ-Scott-closed subsets of L is weakly generalized countably approximating. PMID:25165730
Desaules, André
2012-11-01
It is crucial for environmental monitoring to fully control temporal bias, i.e., the distortion of real data evolution by bias that varies through time. Temporal bias cannot be fully controlled by statistics alone but requires appropriate and sufficient metadata, which should be under rigorous and continuous quality assurance and control (QA/QC) to reliably document the degree of consistency of the monitoring system. All of the strategies presented for detecting and controlling temporal data bias (QA/QC, harmonisation/homogenisation/standardisation, the mass balance approach, the use of tracers and analogues, and the control of changing boundary conditions) rely on metadata. The Will Rogers phenomenon, which arises from subsequent reclassification, is introduced here as a particular source of temporal data bias in environmental monitoring. Sources and effects of temporal data bias are illustrated by examples from the Swiss soil monitoring network. An attempt at a comprehensive compilation and assessment of the metadata required for soil contamination monitoring reveals that most metadata are still far from reliable. This leads to the conclusion that progress in environmental monitoring requires further development of the concept of environmental metadata for the sake of temporal data bias control, as a prerequisite for reliable interpretations and decisions.
Valid internal standard technique for arson detection based on gas chromatography-mass spectrometry.
Salgueiro, Pedro A S; Borges, Carlos M F; Bettencourt da Silva, Ricardo J N
2012-09-28
The most popular procedures for the detection of residues of accelerants in fire debris are those published by the American Society for Testing and Materials (ASTM E1412-07 and E1618-10). The most critical stages of these tests are the conservation of fire debris from sampling to the laboratory, the extraction of accelerant residues from the debris onto activated charcoal strips (ACS) and from those into the final solvent, as well as the analysis of the sample extract by gas chromatography-mass spectrometry (GC-MS) and the interpretation of the instrumental signal. This work proposes a strategy, based on internal standard additions, for checking the quality of sample conservation, the transfer of accelerant residues to the final solvent, and the GC-MS analysis. The internal standards range from a highly volatile compound, for checking debris conservation, to a compound of low volatility, for checking GC-MS repeatability. The developed quality control (QC) parameters are not affected by GC-MS sensitivity variation, and, specifically, the GC-MS performance control is not affected by ACS adsorption saturation that may mask test performance deviations. The proposed QC procedure proved adequate for checking GC-MS repeatability, ACS extraction and sample conservation since: (1) the standard additions are affected by negligible uncertainty and (2) the observed dispersion of the QC parameters is fit for its intended use. Copyright © 2012 Elsevier B.V. All rights reserved.
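One way to read the proposed strategy is as a set of recovery ratios referenced to the least volatile internal standard, so that a uniform change in GC-MS sensitivity divides out. The sketch below is an interpretation along those lines; the standard names, responses and acceptance limits are hypothetical:

```python
# Staged internal-standard check: each standard monitors one stage
# (conservation -> extraction -> GC-MS), and referencing every recovery to
# the least volatile standard cancels overall sensitivity drift.
measured = {"IS_volatile": 0.42, "IS_mid": 0.88, "IS_low_volatile": 0.97}
expected = {"IS_volatile": 1.00, "IS_mid": 1.00, "IS_low_volatile": 1.00}

ref = measured["IS_low_volatile"] / expected["IS_low_volatile"]
for name in ("IS_volatile", "IS_mid"):
    ratio = (measured[name] / expected[name]) / ref  # sensitivity-independent
    status = "OK" if 0.7 <= ratio <= 1.3 else "stage failure"
    print(f"{name}: {ratio:.2f} ({status})")
```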
A method to establish seismic noise baselines for automated station assessment
McNamara, D.E.; Hutt, C.R.; Gee, L.S.; Benz, H.M.; Buland, R.P.
2009-01-01
We present a method for quantifying station noise baselines and characterizing the spectral shape of out-of-nominal noise sources. Our intent is to automate this method in order to ensure that only the highest-quality data are used in rapid earthquake products at NEIC. In addition, the station noise baselines provide a valuable tool to support the quality control of GSN and ANSS backbone data and metadata. The procedures addressed here are currently in development at the NEIC, and work is underway to understand how quickly changes from nominal can be observed and used within the NEIC processing framework. The spectral methods and software used to compute station baselines and described herein (PQLX) can be useful to both permanent and portable seismic station operators. Applications include: general seismic station and data quality control (QC), evaluation of instrument responses, assessment of near real-time communication system performance, characterization of site cultural noise conditions, and evaluation of sensor vault design, as well as assessment of gross network capabilities (McNamara et al. 2005). Future PQLX development plans include incorporating station baselines for automated QC methods and automating station status report generation and notification based on user-defined QC parameters. The PQLX software is available through the USGS (http://earthquake.usgs.gov/research/software/pqlx.php) and IRIS (http://www.iris.edu/software/pqlx/).
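A minimal sketch of the percentile-baseline idea behind this kind of automated QC, assuming a stack of hourly power spectral density (PSD) estimates is already available (synthetic data; not the PQLX implementation):

    import numpy as np

    rng = np.random.default_rng(0)
    # 1000 hourly PSD estimates across 50 frequency bands, in dB (synthetic).
    psds = -140.0 + 10.0 * rng.standard_normal((1000, 50))

    # Station baseline = per-frequency percentiles of the PSD distribution.
    p10, p90 = np.percentile(psds, [10, 90], axis=0)

    # Flag a new PSD that leaves the nominal band at many frequencies.
    new_psd = np.full(50, -120.0)
    frac_out = np.mean((new_psd < p10) | (new_psd > p90))
    print(f"fraction of frequency bands out of nominal: {frac_out:.2f}")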
NASA Astrophysics Data System (ADS)
Choi, Hyunwoo; Kim, Tae Geun; Shin, Changhwan
2017-06-01
A topological insulator (TI) is a new kind of material that exhibits unique electronic properties owing to its topological surface state (TSS). Previous studies focused on the transport properties of the TSS, since it can be used as the active channel layer in metal-oxide-semiconductor field-effect transistors (MOSFETs). However, a TI with a negative quantum capacitance (QC) effect can be used in the gate stack of MOSFETs, thereby facilitating the creation of ultra-low power electronics. Therefore, it is important to study the physics behind the QC in TIs in the absence of any external magnetic field, at room temperature. We fabricated a simple capacitor structure using a TI (TI-capacitor: Au-TI-SiO2-Si), which shows clear evidence of QC at room temperature. In the capacitance-voltage (C-V) measurement, the total capacitance of the TI-capacitor increases in the accumulation regime, since QC is the dominant capacitive component in the series capacitor model (i.e., C_T^-1 = C_Q^-1 + C_SiO2^-1). Based on the QC model of two-dimensional electron systems, we quantitatively calculated the QC, and observed that the simulated C-V curve theoretically supports the conclusion that the QC of the TI-capacitor originates from electron-electron interaction in the two-dimensional surface state of the TI.
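A quick numerical reading of the series-capacitor model quoted above (the capacitance values are placeholders, not the device's measured parameters):

    # Series-capacitor model from the abstract: 1/C_T = 1/C_Q + 1/C_SiO2.
    def total_capacitance(c_q, c_ox):
        return 1.0 / (1.0 / c_q + 1.0 / c_ox)

    c_ox = 3.45e-7   # F/cm^2, hypothetical oxide capacitance
    for c_q in (1e-7, 1e-6, 1e-5):   # quantum capacitance rising with carrier density
        print(c_q, total_capacitance(c_q, c_ox))
    # As C_Q grows in accumulation, C_T approaches C_SiO2 from below, which is
    # why the measured total capacitance increases in that regime.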
Material quality assurance risk assessment.
DOT National Transportation Integrated Search
2013-01-01
Over the past two decades the role of SHA has shifted from quality control (QC) of materials and placement techniques to quality assurance (QA) and acceptance. The role of the Office of Materials Technology (OMT) has been shifting towards assuran...
Data Validation & Laboratory Quality Assurance for Region 9
In all hazardous site investigations it is essential to know the quality of the data used for decision-making purposes. Validation of data requires that appropriate quality assurance and quality control (QA/QC) procedures be followed.
Long-term pavement performance indicators for failed materials.
DOT National Transportation Integrated Search
2016-04-01
State Transportation Agencies (STAs) use quality control/quality assurance (QC/QA) specifications to guide the testing and inspection of road pavement construction. Although failed materials of pavement rarely occur in practice, it is critical to h...
Material quality assurance risk assessment : [summary].
DOT National Transportation Integrated Search
2013-01-01
With the shift from quality control (QC) of materials and placement techniques to quality assurance (QA) and acceptance over the years, the role of the Office of Materials Technology (OMT) has been shifting towards assurance of material quality...
Quality control in the year 2000
Schade, Bernd
1992-01-01
‘Just-in-time’ production is a prerequisite for a company to meet the challenges of competition. Manufacturing cycles have been so successfully optimized that release time now has become a significant factor. A vision for a major quality-control (QC) contribution to profitability in this decade seems to be the just-in-time release. Benefits will go beyond cost savings for lower inventory. The earlier detection of problems will reduce rejections and scrap. In addition, problem analysis and problem-solving will be easier. To achieve just-in-time release, advanced automated systems like robots will become the workhorses in QC for high volume pharmaceutical production. The requirements for these systems are extremely high in terms of quality, reliability and ruggedness. Crucial for the success might be advances in use of microelectronics for error checks, system recording, trouble shooting, etc. as well as creative new approaches (for example the use of redundant assay systems). PMID:18924930
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sailer, S.J.
This Quality Assurance Project Plan (QAPJP) specifies the quality of data necessary and the characterization techniques employed at the Idaho National Engineering Laboratory (INEL) to meet the objectives of the Department of Energy (DOE) Waste Isolation Pilot Plant (WIPP) Transuranic Waste Characterization Quality Assurance Program Plan (QAPP) requirements. This QAPJP is written to conform with the requirements and guidelines specified in the QAPP and the associated documents referenced in the QAPP. This QAPJP is one of a set of five interrelated QAPJPs that describe the INEL Transuranic Waste Characterization Program (TWCP). Each of the five facilities participating in the TWCP has a QAPJP that describes the activities applicable to that particular facility. This QAPJP describes the roles and responsibilities of the Idaho Chemical Processing Plant (ICPP) Analytical Chemistry Laboratory (ACL) in the TWCP. Data quality objectives and quality assurance objectives are explained. Sample analysis procedures and associated quality assurance measures are also addressed; these include: sample chain of custody; data validation, usability and reporting; documentation and records; audits and assessments; laboratory QC samples; and instrument testing, inspection, maintenance and calibration. Finally, administrative quality control measures, such as document control, control of nonconformances, variances and QA status reporting, are described.
It's Time--To Reveal the Whitlam Institute within the University of Western Sydney
ERIC Educational Resources Information Center
Curach, Liz
2005-01-01
The Whitlam Institute within the University of Western Sydney is a centre for public dialogue and progress, with the Whitlam Prime Ministerial Collection inspiring its programs. The collection, both physical and virtual, was established in 2002, drawing upon primary source material made available or donated by the Hon E G Whitlam AC QC, and…
The purpose of this SOP is to describe the procedures undertaken to calculate sampling weights. The sampling weights are needed to obtain weighted statistics of the study data. This SOP uses data that have been properly coded and certified with appropriate QA/QC procedures by th...
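For illustration only, a minimal sketch of the weighted-statistics step that such sampling weights feed into, assuming design weights are inverse inclusion probabilities (all numbers are hypothetical, not from the SOP):

    # Design-based sampling weights as inverse inclusion probabilities,
    # then a weighted mean (Horvitz-Thompson style estimate). Illustrative only.
    import numpy as np

    incl_prob = np.array([0.10, 0.10, 0.02, 0.05])  # hypothetical inclusion probabilities
    y = np.array([3.2, 4.1, 5.0, 2.7])              # measured values for sampled units

    w = 1.0 / incl_prob                             # sampling weights
    weighted_mean = np.sum(w * y) / np.sum(w)
    print(f"weighted mean = {weighted_mean:.3f}")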
The purpose of this SOP is to define the procedures used for the initial and periodic verification and validation of computer programs used during the Arizona NHEXAS project and the "Border" study. Keywords: Computers; Software; QA/QC.
The National Human Exposure Assessment Sur...
Chen, Haiming; Lu, Chuanjian; Liu, Huazhen; Wang, Maojie; Zhao, Hui; Yan, Yuhong; Han, Ling
2017-07-01
Quercetin (QC) is a dietary flavonoid abundant in many natural plants. A series of studies has shown that it exhibits several biological properties, including anti-inflammatory, anti-oxidant, cardio-protective, vasodilatory, liver-protective and anti-cancer activities. However, so far the possible therapeutic effect of QC on psoriasis has not been reported. The present study was undertaken to evaluate the potential beneficial effect of QC in psoriasis using an imiquimod (IMQ)-induced psoriasis-like mouse model, and to further elucidate its underlying mechanisms of action. Effects of QC on PASI scores, back temperature, histopathological changes, oxidative/anti-oxidative indexes, pro-inflammatory cytokines and the NF-κB pathway in IMQ-induced mice were investigated. Our results showed that QC could significantly reduce the PASI scores, decrease the temperature of the psoriasis-like lesions, and ameliorate the deteriorating histopathology in IMQ-induced mice. Moreover, QC effectively attenuated levels of TNF-α, IL-6 and IL-17 in serum, increased activities of GSH, CAT and SOD, and decreased the accumulation of MDA in skin tissue induced by IMQ in mice. The mechanism may be associated with the down-regulation of NF-κB, IKKα, NIK and RelB expression and up-regulation of TRAF3, which were critically involved in the non-canonical NF-κB pathway. In conclusion, our present study demonstrated that QC had appreciable anti-psoriasis effects in IMQ-induced mice, and the underlying mechanism may involve the improvement of antioxidant and anti-inflammatory status and inhibition of the activation of NF-κB signaling. Hence, QC, a naturally occurring flavone with potent anti-psoriatic effects, has the potential for further development as a candidate for psoriasis treatment. Copyright © 2017 Elsevier B.V. All rights reserved.
Improvement of the quality of work in a biochemistry laboratory via measurement system analysis.
Chen, Ming-Shu; Liao, Chen-Mao; Wu, Ming-Hsun; Lin, Chih-Ming
2016-10-31
An adequate and continuous monitoring of operational variations can effectively reduce the uncertainty and enhance the quality of laboratory reports. This study applied the evaluation rule of the measurement system analysis (MSA) method to estimate the quality of work conducted in a biochemistry laboratory. Using the gauge repeatability & reproducibility (GR&R) approach, variations in quality control (QC) data among medical technicians in conducting measurements of five biochemical items, namely, serum glucose (GLU), aspartate aminotransferase (AST), uric acid (UA), sodium (Na) and chloride (Cl), were evaluated. The measurements of the five biochemical items showed different levels of variance among the different technicians, with the variances in GLU measurements being higher than those for the other four items. The ratios of precision-to-tolerance (P/T) for Na, Cl and GLU were all above 0.5, implying inadequate gauge capability. The product variation contribution of Na was large (75.45% and 31.24% in normal and abnormal QC levels, respectively), which showed that the impact of insufficient usage of reagents could not be excluded. With regard to reproducibility, high contributions (of more than 30%) of variation for the selected items were found. These high operator variation levels implied that the possibility of inadequate gauge capacity could not be excluded. The analysis of variance (ANOVA) of GR&R showed that the operator variations in GLU measurements were significant (F=5.296, P=0.001 in the normal level and F=3.399, P=0.015 in the abnormal level, respectively). In addition to operator variations, product variations of Na were also significant for both QC levels. The heterogeneity of variance for the five technicians showed significant differences for the Na and Cl measurements in the normal QC level. The accuracy of QC for five technicians was identified for further operational improvement. This study revealed that MSA can be used to evaluate product and personnel errors and to improve the quality of work in a biochemical laboratory through proper corrective actions.
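A compact sketch of the GR&R arithmetic used in this kind of study, on synthetic QC data for a single material: repeatability is the within-technician variance, reproducibility the between-technician variance, and P/T compares the gauge spread with the tolerance (the 0.5 threshold is the abstract's criterion; all numbers below are invented):

    import numpy as np

    data = {  # technician -> repeated QC measurements of one material (synthetic)
        "A": [100.1, 100.4, 99.8],
        "B": [101.2, 101.0, 101.5],
        "C": [99.5, 99.9, 99.7],
    }
    reps = np.array(list(data.values()))
    var_repeat = reps.var(axis=1, ddof=1).mean()             # within technicians
    var_reprod = max(reps.mean(axis=1).var(ddof=1)
                     - var_repeat / reps.shape[1], 0.0)      # between technicians
    sigma_grr = np.sqrt(var_repeat + var_reprod)

    tolerance = 8.0                                          # hypothetical spec width
    print(f"P/T = {6 * sigma_grr / tolerance:.2f}")          # >0.5 flags inadequate gauge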
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hoisak, J; Manger, R; Dragojevic, I
Purpose: To perform a failure mode and effects analysis (FMEA) of the process for treating superficial skin cancers with the Xoft Axxent electronic brachytherapy (eBx) system, given the recent introduction of expanded quality control (QC) initiatives at our institution. Methods: A process map was developed listing all steps in superficial treatments with Xoft eBx, from the initial patient consult to the completion of the treatment course. The process map guided the FMEA to identify the failure modes for each step in the treatment workflow and assign Risk Priority Numbers (RPN), calculated as the product of the failure mode’s probability of occurrence (O), severity (S) and lack of detectability (D). FMEA was done with and without the inclusion of recent QC initiatives such as increased staffing, physics oversight, standardized source calibration, treatment planning and documentation. The failure modes with the highest RPNs were identified and contrasted before and after introduction of the QC initiatives. Results: Based on the FMEA, the failure modes with the highest RPN were related to source calibration, treatment planning, and patient setup/treatment delivery. The introduction of additional physics oversight, standardized planning and safety initiatives such as checklists and time-outs reduced the RPNs of these failure modes. High-risk failure modes that could be mitigated with improved hardware and software interlocks were identified. Conclusion: The FMEA analysis identified the steps in the treatment process presenting the highest risk. The introduction of enhanced QC initiatives mitigated the risk of some of these failure modes by decreasing their probability of occurrence and increasing their detectability. This analysis demonstrates the importance of well-designed QC policies, procedures and oversight in a Xoft eBx programme for treatment of superficial skin cancers. Unresolved high-risk failure modes highlight the need for non-procedural quality initiatives such as improved planning software and more robust hardware interlock systems.
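A minimal sketch of the RPN bookkeeping described above; the failure modes and O/S/D scores are invented for illustration, not taken from the study:

    # RPN = O * S * D on 1-10 scales, ranked highest-risk first.
    failure_modes = [
        ("source calibration error",   4, 9, 6),
        ("treatment planning mistake", 3, 8, 5),
        ("patient setup deviation",    5, 7, 4),
        ("documentation omission",     6, 3, 2),
    ]
    ranked = sorted(failure_modes, key=lambda m: m[1] * m[2] * m[3], reverse=True)
    for name, o, s, d in ranked:
        print(f"RPN={o * s * d:3d}  {name}")
    # A QC initiative that lowers O or D (e.g., a checklist) directly lowers the RPN.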
The Navy’s Quality Journey: Operational Implementation of TQL
1993-04-01
training. Dr. Kaoru Ishikawa, "Guide to Quality Control": "QC begins with education and ends with education. To implement TQC, we need to carry out..." ...York: McGraw-Hill, 1986. 20. Ishikawa, Kaoru. What is Total Quality Control? Englewood Cliffs, NJ: Prentice-Hall, Inc., 1985. 21. Ishikawa, Kaoru...
Stability of Tetrahydrocannabinol and Cannabidiol in Prepared Quality Control Medible Brownies.
Wolf, Carl E; Poklis, Justin L; Poklis, Alphonse
2017-03-01
The legalization of marijuana in the USA for both medicinal and recreational use has increased in the past few years. Currently, 24 states have legalized marijuana for medicinal use. The US Drug Enforcement Administration has classified marijuana as a Schedule I substance. The US Food and Drug Administration does not regulate formulations or packages of marijuana that are currently marketed in states that have legalized marijuana. Marijuana edibles, or "medibles," are typically packaged candies and baked goods consumed for medicinal as well as recreational marijuana use. They contain the major psychoactive drug in marijuana, delta-9-tetrahydrocannabinol (THC), and/or cannabidiol (CBD), which has reputed medicinal properties. Presented is a method for the preparation and application of THC- and CBD-containing brownies used as quality control (QC) material for the analysis of marijuana or cannabinoid baked medibles. The performance parameters of the assay, including possible matrix effects and cannabinoid stability in the brownie QC over time, are presented. It was determined that the process used to prepare and bake the brownie control material did not degrade the THC or CBD. The brownie matrix was found not to interfere with the analysis of THC or CBD. Ten commercially available brownie matrices were evaluated for potential interferences; none of them were found to interfere with the analysis of THC or CBD. The laboratory-baked medible QC material was found to be stable at room temperature for at least 3 months. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Pellerin, D; Charbonneau, E; Fadul-Pacheco, L; Soucy, O; Wattiaux, M A
2017-10-01
Our objective was to explore the trade-offs between economic performance (farm net income, FNI) and environmental outcomes (whole-farm P and N balances) of dairy farms in Wisconsin (WI; United States) and Québec (QC; Canada). An Excel-based linear program model (N-CyCLES; nutrient cycling: crops, livestock, environment, and soil) was developed to optimize feeding, cropping, and manure management as a single unit of management. In addition to FNI and P and N balances, model outputs included (1) the mix of up to 9 home-grown and 17 purchased feeds for up to 5 animal groups, (2) the mix of up to 5 crop rotations in up to 5 land units, and (3) the mix of up to 7 fertilizers (solid and liquid manure and 5 commercial fertilizers) to allocate to each land unit. The model was parameterized with NRC nutritional guidelines and regional nutrient management planning rules. Simulations were conducted on a typical WI farm of 107 cows and 151 ha of cropland and a Southern QC farm of 87 cows and 142 ha of cropland, and all results were expressed per kg of fat- and protein-corrected milk (FPCM). In the absence of constraints on P and N balances, maximum FNI was 0.12 and 0.11 $/kg of FPCM for WI and QC, respectively, with P and N balances of 1.05 and 14.29 g/kg of FPCM in WI but 0.60 and 15.70 g/kg of FPCM in QC. The achievable reduction (balance at maximum FNI minus balance when the simulation objective was to minimize P or N balance) was 0.31 and 0.54 g of P/kg of FPCM (29 and 89% reduction), but 2.37 and 3.31 g of N/kg of FPCM (17 and 24% reduction) in WI and QC, respectively. Among other factors, differences in animal units per hectare and reliance on biological N fixation may have contributed to lower achievable reductions of whole-farm balances in WI compared with QC. Subsequent simulations to maximize FNI under increasing constraints on nutrient balances revealed that it was possible to reduce P balance, N balance, and both together by up to 33% without a substantial effect on FNI. Partial reduction in P balance reduced N balance (synergetic effect) in WI, but increased N balance (antagonistic effect) in QC. In contrast, reducing N balance increased P balance in both regions, albeit in different magnitudes. The regional comparison highlighted the importance of site-specific conditions on modeling outcomes. This study demonstrated that even when recommended guidelines are followed for herd nutrition and crop fertilization, the optimization of herd feeding, cropping, and manure spreading as a single unit of management may help identify management options that preserve FNI while substantially reducing whole-farm nutrient balance. Copyright © 2017 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
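A toy linear program in the spirit of the whole-farm optimization described above, maximizing income subject to a cap on the P balance (all coefficients are invented; the real N-CyCLES model is far larger):

    from scipy.optimize import linprog

    # Decision variables: amounts of two purchased feeds per kg FPCM.
    income = [0.04, 0.06]       # $ contribution per unit of each feed
    p_load = [0.8, 1.6]         # g P balance contributed per unit
    p_cap = 1.0                 # allowed P balance, g/kg FPCM

    res = linprog(c=[-i for i in income],        # linprog minimizes, so negate
                  A_ub=[p_load], b_ub=[p_cap],
                  bounds=[(0, 1), (0, 1)])
    print(res.x, -res.fun)      # optimal feed mix and the resulting income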
Samuelson, John; Robbins, Phillips W.
2014-01-01
Asparagine-linked glycans (N-glycans) of medically important protists have much to tell us about the evolution of N-glycosylation and of N-glycan-dependent quality control (N-glycan QC) of protein folding in the endoplasmic reticulum. While host N-glycans are built upon a dolichol-pyrophosphate-linked precursor with 14 sugars (Glc3Man9GlcNAc2), protist N-glycan precursors vary from Glc3Man9GlcNAc2 (Acanthamoeba) to Man9GlcNAc2 (Trypanosoma) to Glc3Man5GlcNAc2 (Toxoplasma) to Man5GlcNAc2 (Entamoeba, Trichomonas, and Eimeria) to GlcNAc2 (Plasmodium and Giardia) to zero (Theileria). As related organisms have differing N-glycan lengths (e.g. Toxoplasma, Eimeria, Plasmodium, and Theileria), the present N-glycan variation is based upon secondary loss of Alg genes, which encode enzymes that add sugars to the N-glycan precursor. An N-glycan precursor with Man5GlcNAc2 is necessary but not sufficient for N-glycan QC, which is predicted by the presence of the UDP-glucose:glucosyltransferase (UGGT) plus calreticulin and/or calnexin. As many parasites lack glucose in their N-glycan precursor, UGGT product may be identified by inhibition of glucosidase II. The presence of an armless calnexin in Toxoplasma suggests secondary loss of N-glycan QC from coccidia. Positive selection for N-glycan sites occurs in secreted proteins of organisms with NG-QC and is based upon an increased likelihood of threonine but not serine in the second position versus asparagine. In contrast, there appears to be selection against N-glycan length in Plasmodium and N-glycan site density in Toxoplasma. Finally, there is suggestive evidence for N-glycan-dependent ERAD in Trichomonas, which glycosylates and degrades the exogenous reporter mutant carboxypeptidase Y (CPY*). PMID:25475176
Bergallo, M; Costa, C; Tarallo, S; Daniele, R; Merlino, C; Segoloni, G P; Negro Ponzi, A; Cavallo, R
2006-06-01
The human cytomegalovirus (HCMV) is an important pathogen in immunocompromised patients, such as transplant recipients. The use of sensitive and rapid diagnostic assays can have a great impact on antiviral prophylaxis, therapy monitoring and diagnosing active disease. Quantification of HCMV DNA may additionally have prognostic value and guide routine management. The aim of this study was to develop a reliable internally-controlled quantitative-competitive PCR (QC-PCR) for the detection and quantification of HCMV DNA viral load in peripheral blood and compare it with other methods: the HCMV pp65 antigenaemia assay in the leukocyte fraction, the HCMV viraemia, both routinely employed in our laboratory, and the nucleic acid sequence-based amplification (NASBA) for detection of HCMV pp67-mRNA. Quantitative-competitive PCR is a procedure for nucleic acid quantification based on co-amplification of competitive templates, the target DNA and a competitor functioning as internal standard. In particular, a standard curve is generated by amplifying 10^2 to 10^5 copies of target pCMV-435 plasmid with 10^4 copies of competitor pCMV-C plasmid. Clinical samples derived from 40 kidney transplant patients were tested by spiking 10^4 copies of pCMV-C into the PCR mix as internal control, and comparing results with the standard curve. Of the 40 patients studied, 39 (97.5%) were positive for HCMV DNA by QC-PCR. While the correlations between the number of pp65-positive cells and the number of HCMV DNA genome copies/mL, and between the former and pp67-mRNA positivity, were statistically significant, there was no significant correlation between HCMV DNA viral load assayed by QC-PCR and HCMV viraemia. The QC-PCR assay could detect from 10^2 to over 10^7 copies of HCMV DNA with a range of linearity between 10^2 and 10^5 genomes.
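A sketch of the quantification step, assuming band intensities for target and competitor are available: the standard curve relates the log signal ratio to log input copies at a fixed 10^4-copy competitor spike (the numbers here are synthetic, not the study's data):

    import numpy as np

    copies = np.array([1e2, 1e3, 1e4, 1e5])         # standards (target plasmid)
    ratio = np.array([0.011, 0.098, 1.02, 9.7])     # synthetic band-intensity ratios

    slope, intercept = np.polyfit(np.log10(copies), np.log10(ratio), 1)

    def quantify(sample_ratio):
        """Invert the curve for a clinical sample co-amplified with 1e4 competitors."""
        return 10 ** ((np.log10(sample_ratio) - intercept) / slope)

    print(f"{quantify(0.35):.0f} HCMV DNA copies")  # hypothetical sample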
NASA Astrophysics Data System (ADS)
Maity, H.; Biswas, A.; Bhattacharjee, A. K.; Pal, A.
In this paper, we propose the design of a quantum cost (QC) optimized 4-bit reversible universal shift register (RUSR) using a reduced number of reversible logic gates. The proposed design is very useful in quantum computing due to its low QC, small number of reversible logic gates, and short delay. The QC, number of gates, and garbage outputs (GOs) are 64, 8, and 16, respectively, for the proposed work. The improvement over previous work is also presented: compared with the latest reported results, the QC is improved by 5.88% to 70.9% and the number of gates by 60% to 83.33%.
The Quasicontinuum Method: Overview, applications and current directions
NASA Astrophysics Data System (ADS)
Miller, Ronald E.; Tadmor, E. B.
2002-10-01
The Quasicontinuum (QC) Method, originally conceived and developed by Tadmor, Ortiz and Phillips [1] in 1996, has since seen a great deal of development and application by a number of researchers. The idea of the method is a relatively simple one. With the goal of modeling an atomistic system without explicitly treating every atom in the problem, the QC provides a framework whereby degrees of freedom are judiciously eliminated and force/energy calculations are expedited. This is combined with adaptive model refinement to ensure that full atomistic detail is retained in regions of the problem where it is required while continuum assumptions reduce the computational demand elsewhere. This article provides a review of the method, from its original motivations and formulation to recent improvements and developments. A summary of the important mechanics of materials results that have been obtained using the QC approach is presented. Finally, several related modeling techniques from the literature are briefly discussed. As an accompaniment to this paper, a website designed to serve as a clearinghouse for information on the QC method has been established at www.qcmethod.com. The site includes information on QC research, links to researchers, downloadable QC code and documentation.
Control of plant stem cell function by conserved interacting transcriptional regulators
Zhou, Yun; Liu, Xing; Engstrom, Eric M.; Nimchuk, Zachary L.; Pruneda-Paz, Jose L.; Tarr, Paul T.; Yan, An; Kay, Steve A.; Meyerowitz, Elliot M.
2014-01-01
SUMMARY Plant stem cells in the shoot apical meristem (SAM) and root apical meristem (RAM) provide for postembryonic development of above-ground tissues and roots, respectively, while secondary vascular stem cells sustain vascular development [1–4]. WUSCHEL (WUS), a homeodomain transcription factor expressed in the rib meristem of the SAM, is a key regulatory factor controlling stem cell populations in the Arabidopsis SAM [5,6] and is thought to establish the shoot stem cell niche via a feedback circuit with the CLAVATA3 (CLV3) peptide signaling pathway [7]. WUSCHEL-RELATED HOMEOBOX5 (WOX5), specifically expressed in the root quiescent center (QC), defines QC identity and functions interchangeably with WUS in control of shoot and root stem cell niches [8]. WOX4, expressed in Arabidopsis procambial cells, defines the vascular stem cell niche [9–11]. WUS/WOX family proteins are evolutionarily and functionally conserved throughout the plant kingdom [12] and emerge as key actors in the specification and maintenance of stem cells within all meristems [13]. However, the nature of the genetic regime in stem cell niches that centers on WOX gene function has been elusive, and molecular links underlying conserved WUS/WOX function in stem cell niches remain unknown. Here we demonstrate that the Arabidopsis HAIRY MERISTEM (HAM) family transcription regulators act as conserved interacting co-factors with WUS/WOX proteins. HAM and WUS share common targets in vivo and their physical interaction is important in driving downstream transcriptional programs and in promoting shoot stem cell proliferation. Differences in the overlapping expression patterns of WOX and HAM family members underlie the formation of diverse stem cell niche locations, and the HAM family is essential for all of these stem cell niches. These findings establish a new framework for the control of stem cell production during plant development. PMID:25363783
NASA Astrophysics Data System (ADS)
Dirisu, Afusat Olayinka
Quantum Cascade (QC) lasers are intersubband light sources operating in the wavelength range of ~3 to 300 μm and are used in applications such as sensing (environmental, biological, and hazardous chemical), infrared countermeasures, and free-space infrared communications. The mid-infrared range (i.e., λ ~ 3-30 μm) is of particular importance in sensing because of the strong interaction of laser radiation with various chemical species, while in free-space communications the atmospheric windows of 3-5 μm and 8-12 μm are highly desirable for low-loss transmission. Some of the requirements of these applications include: (1) high output power for improved sensitivity; (2) high operating temperatures for compact and cost-effective systems; (3) wide tunability; (4) single-mode operation for high selectivity. In the past, available mid-infrared sources, such as the lead-salt and solid-state lasers, were bulky, expensive, or emitted low output power. In recent years, QC lasers have been explored as cost-effective and compact sources because of their potential to satisfy and exceed all the above requirements. Also, the ultrafast carrier lifetimes of intersubband transitions in QC lasers are promising for high-bandwidth free-space infrared communication. This thesis was focused on the improvement of QC lasers through the design and optimization of the laser cavity and characterization of the laser gain medium. The optimization of the laser cavity included (1) the design and fabrication of high-reflection Bragg gratings and subwavelength antireflection gratings, by focused ion beam milling, to achieve tunable, single-mode and high-power QC lasers, and (2) modeling of slab-coupled optical waveguide QC lasers for high-brightness output beams. The characterization of the QC laser gain medium was carried out using the single-pass transmission experiment, a sensitive measurement technique for probing the intersubband transitions and the electron distribution of QC lasers under different temperatures and applied bias conditions, unlike typical infrared measurement techniques that are restricted to non-functional devices. With the single-pass technique, a basic understanding of the physics behind the workings of the QC laser gain medium can be achieved, which is invaluable in the design of QC lasers with high output power and high operating temperatures.
Managing the Quality of Environmental Data in EPA Region 9
EPA Pacific Southwest, Region 9's Quality Assurance (QA) section's primary mission is to effectively oversee and carry out the Quality System and Quality Management Plan, and project-level quality assurance and quality control (QA/QC) activities.
Implementation of GPS controlled highway construction equipment phase II.
DOT National Transportation Integrated Search
2008-01-01
"During 2006, WisDOT and the Construction Materials and Support Center at UW-Madison worked together to develop : a specification and QC/QA procedures for GPS machine guidance on highway construction grading operations. These : specifications and pro...
Implementation of GPS controlled highway construction equipment, phase III.
DOT National Transportation Integrated Search
2009-02-01
Beginning in 2006, WisDOT and the Construction Material and Support Center (CMSC) at UW-Madison worked together to develop the specifications and the QA/QC procedures for GPS machine guidance on highway grading projects. These specifications and ...
QC/QA : evaluation of effectiveness in Kentucky.
DOT National Transportation Integrated Search
2008-06-30
Quality control and quality assurance in the highway industry is going through a cultural shift. There is a growing trend toward using the contractor data for acceptance and payment purposes. This has led to serious concerns about conflicts of interes...
QSPIN: A High Level Java API for Quantum Computing Experimentation
NASA Technical Reports Server (NTRS)
Barth, Tim
2017-01-01
QSPIN is a high-level Java API for experimentation with QC models used in the calculation of Ising spin glass ground states and related quadratic unconstrained binary optimization (QUBO) problems. The Java API is intended to facilitate research in advanced QC algorithms such as hybrid quantum-classical solvers, automatic selection of constraint and optimization parameters, and techniques for the correction and mitigation of model and solution errors. QSPIN includes high-level solver objects tailored to the D-Wave quantum annealing architecture that implement hybrid quantum-classical algorithms [Booth et al.] for solving large problems on small quantum devices, elimination of variables via roof duality, and classical computing optimization methods such as GPU-accelerated simulated annealing and tabu search for comparison. A test suite of documented NP-complete applications ranging from graph coloring, covering, and partitioning to integer programming and scheduling is provided to demonstrate current capabilities.
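For intuition about the problem class QSPIN targets, here is a brute-force QUBO ground-state search over a toy matrix (this is not the QSPIN API; real solvers use annealing hardware, simulated annealing, or tabu search):

    # Tiny brute-force QUBO solver: Q is symmetric and the energy is x^T Q x
    # over binary vectors x. Exhaustive search is only feasible for small n.
    import itertools
    import numpy as np

    Q = np.array([[-1.0,  2.0,  0.0],
                  [ 2.0, -1.0,  2.0],
                  [ 0.0,  2.0, -1.0]])   # small illustrative QUBO matrix

    best = min((np.array(x) @ Q @ np.array(x), x)
               for x in itertools.product((0, 1), repeat=3))
    print(f"ground state {best[1]} with energy {best[0]:.1f}")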
Asthma Education and Intervention Program: Partnership for Asthma Trigger-Free Homes (PATH)
2010-02-01
...manually on pencil and paper forms, and then entered into our electronic database program, Checkbox. All data were double-checked upon entry, and additional QC was randomly performed for 5% of the data (e.g., comparing paper survey responses to Checkbox entries), as well as on an "as required" basis... [Survey-form residue: a room-by-room checklist (entryway, bathroom, kitchen, living room, dining room, bedrooms 1-2) for an unvented gas oven/dryer/heater present in the home.]
Ensuring the reliability of stable isotope ratio data--beyond the principle of identical treatment.
Carter, J F; Fry, B
2013-03-01
The need for inter-laboratory comparability is crucial to facilitate the globalisation of scientific networks and the development of international databases to support scientific and criminal investigations. This article considers what lessons can be learned from a series of inter-laboratory comparison exercises organised by the Forensic Isotope Ratio Mass Spectrometry (FIRMS) network in terms of reference materials (RMs), the management of data quality, and technical limitations. The results showed that within-laboratory precision (repeatability) was generally good but between-laboratory accuracy (reproducibility) called for improvements. This review considers how stable isotope laboratories can establish a system of quality control (QC) and quality assurance (QA), emphasising issues of repeatability and reproducibility. For results to be comparable between laboratories, measurements must be traceable to the international δ-scales and, because isotope ratio measurements are reported relative to standards, a key aspect is the correct selection, calibration, and use of international and in-house RMs. The authors identify four principles which promote good laboratory practice. The principle of identical treatment by which samples and RMs are processed in an identical manner and which incorporates three further principles; the principle of identical correction (by which necessary corrections are identified and evenly applied), the principle of identical scaling (by which data are shifted and stretched to the international δ-scales), and the principle of error detection by which QC and QA results are monitored and acted upon. To achieve both good repeatability and good reproducibility it is essential to obtain RMs with internationally agreed δ-values. These RMs will act as the basis for QC and can be used to calibrate further in-house QC RMs tailored to the activities of specific laboratories. In-house QA standards must also be developed to ensure that QC-based calibrations and corrections lead to accurate results for samples. The δ-values assigned to RMs must be recorded and reported with all data. Reference materials must be used to determine what corrections are necessary for measured data. Each analytical sequence of samples must include both QC and QA materials which are subject to identical treatment during measurement and data processing. Results for these materials must be plotted, monitored, and acted upon. Periodically international RMs should be analysed as an in-house proficiency test to demonstrate results are accurate.
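A minimal sketch of the "identical scaling" step named above: a two-point shift-and-stretch that maps measured δ values onto the international scale using two RMs with agreed values (all numbers are illustrative, not certified RM values):

    def normalize(delta_measured, rm1_meas, rm1_true, rm2_meas, rm2_true):
        """Two-point linear normalization to the international delta-scale."""
        stretch = (rm2_true - rm1_true) / (rm2_meas - rm1_meas)
        return rm1_true + stretch * (delta_measured - rm1_meas)

    # e.g., two carbon RMs bracketing the samples (hypothetical values):
    print(normalize(-26.1, rm1_meas=-29.9, rm1_true=-30.03,
                           rm2_meas=2.1, rm2_true=1.95))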
NASA Astrophysics Data System (ADS)
Huang, Sheng; Ao, Xiang; Li, Yuan-yuan; Zhang, Rui
2016-09-01
In order to meet the needs of the high-speed development of optical communication systems, a construction method of quasi-cyclic low-density parity-check (QC-LDPC) codes based on the multiplicative group of a finite field is proposed. The Tanner graph of the parity check matrix of the code constructed by this method has no cycle of length 4, which ensures that the obtained code has a good distance property. Simulation results show that at a bit error rate (BER) of 10^-6, in the same simulation environment, the net coding gain (NCG) of the proposed QC-LDPC(3 780, 3 540) code with the code rate of 93.7% is improved by 2.18 dB and 1.6 dB respectively compared with those of the RS(255, 239) code in ITU-T G.975 and the LDPC(32 640, 30 592) code in ITU-T G.975.1. In addition, the NCG of the proposed QC-LDPC(3 780, 3 540) code is respectively 0.2 dB and 0.4 dB higher compared with those of the SG-QC-LDPC(3 780, 3 540) code based on two different subgroups of a finite field and the AS-QC-LDPC(3 780, 3 540) code based on two arbitrary sets of a finite field. Thus, the proposed QC-LDPC(3 780, 3 540) code can be well applied in optical communication systems.
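A generic sketch of the multiplicative-group recipe, not the paper's exact construction: circulant shift exponents are derived from products of powers of two elements of the multiplicative group of GF(q), and each shift is expanded into a circulant permutation matrix; girth and distance control require a careful choice of the elements and matrix dimensions:

    import numpy as np

    q = 13                              # small prime; circulants are q x q
    a, b = 3, 2                         # elements of the multiplicative group of GF(q)
    rows, cols = 3, 4

    # Exponent (shift) matrix built from the multiplicative group.
    shift = [[(pow(a, i, q) * pow(b, j, q)) % q for j in range(cols)]
             for i in range(rows)]

    def circulant(s, m):
        """m x m identity cyclically shifted by s columns (a permutation matrix)."""
        return np.roll(np.eye(m, dtype=int), s, axis=1)

    H = np.block([[circulant(s, q) for s in row] for row in shift])
    print(H.shape)                      # (rows*q, cols*q) parity-check matrix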
Development of a portable quality control application using a tablet-type electronic device.
Ono, Tomohiro; Miyabe, Yuki; Akimoto, Mami; Mukumoto, Nobutaka; Ishihara, Yoshitomo; Nakamura, Mitsuhiro; Mizowaki, Takashi
2018-03-01
Our aim was to develop a portable quality control (QC) application using a thermometer, a barometer, an angle gauge, and a range finder implemented in a tablet-type consumer electronic device (CED) and to assess the accuracies of the measurements made. The QC application was programmed using Java and OpenCV libraries. First, temperature and atmospheric pressure were measured over 30 days using the temperature and pressure sensors of the CED and compared with those measured by a double-tube thermometer and a digital barometer. Second, the angle gauge was developed using the accelerometer of the CED. The roll and pitch angles of the CED were measured from 0 to 90° at intervals of 10° in the clockwise (CW) and counterclockwise (CCW) directions. The values were compared with those measured by a digital angle gauge. Third, a range finder was developed using the tablet's built-in camera and image-processing capabilities. Surrogate markers were detected by the camera and their positions converted to actual positions using a homographic transformation method. Fiducial markers were placed on a treatment couch and moved 100 mm in 10-mm steps in both the lateral and longitudinal directions. The values were compared with those measured by the digital output of the treatment couch. The differences between CED values and those of other devices were compared by calculating means ± standard deviations (SDs). The means ± SDs of the differences in temperature and atmospheric pressure were -0.07 ± 0.25°C and 0.05 ± 0.10 hPa, respectively. The mean ± SD of the differences in angle was -0.17 ± 0.87° (0.15 ± 0.23° excluding the 90° angle). The mean ± SD of the differences in distance was 0.01 ± 0.07 mm in both the lateral and longitudinal directions. Our portable QC application was accurate and may be used instead of standard measuring devices. Our portable CED is efficient and simple when used in the field of medical physics. © 2018 American Association of Physicists in Medicine.
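A sketch of the range-finder idea using OpenCV, on which the application is built: a homography fitted to four reference markers maps camera pixels to couch-plane millimetres (all coordinates below are made up):

    import numpy as np
    import cv2

    # Four reference markers: detected pixel corners and their known positions.
    pix = np.float32([[102, 88], [530, 95], [525, 410], [98, 402]])
    mm = np.float32([[0, 0], [200, 0], [200, 150], [0, 150]])

    H, _ = cv2.findHomography(pix, mm)            # pixel -> couch-plane mapping

    marker_pix = np.float32([[[315, 248]]])       # a tracked fiducial, shape (1,1,2)
    marker_mm = cv2.perspectiveTransform(marker_pix, H)
    print(marker_mm.ravel())                      # couch-plane position in mm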
NASA Astrophysics Data System (ADS)
Kurokawa, Ami; Doshida, Tomoki; Hagihara, Yukito; Suzuki, Hiroshi; Takai, Kenichi
2018-05-01
Though intergranular (IG) and quasi-cleavage (QC) fractures have been widely recognized as typical fracture modes of hydrogen-induced cracking in high-strength steels, the main controlling factor has remained unclear. In the present study, the dependence of this factor on hydrogen content was examined through the fracture-mode transition from QC to IG at the crack initiation site in tempered martensitic steels. Two kinds of tempered martensitic steels were prepared to vary the grain-boundary cohesive force through different precipitation states of Fe3C on the prior γ grain boundaries. The high-Si (H-Si) steel has a small amount of Fe3C on the prior austenite grain boundaries, whereas the low-Si (L-Si) steel has a large amount of Fe3C sheets on the grain boundaries. The fracture modes and initiation sites were observed using FE-SEM (Field Emission Scanning Electron Microscopy). The crack initiation sites of the H-Si steel showed QC fracture at the notch tip under various hydrogen contents, while the crack initiation of the L-Si steel changed from QC fracture at the notch tip to QC and IG fractures from approximately 10 µm ahead of the notch tip with increasing hydrogen content. For the L-Si steel, two possibilities are considered: either the QC or the IG fracture occurred first, or the two occurred simultaneously. Furthermore, the principal stress and equivalent plastic strain distributions near the notch tip were calculated with FEM (Finite Element Method) analysis. The plastic strain was maximal at the notch tip and the principal stress was maximal at approximately 10 µm from the notch tip. The positions of QC and IG fracture initiation observed by FE-SEM correspond to the positions of maximum strain and stress obtained with FEM, respectively. These findings indicate that the main factors causing hydrogen-induced cracking differ between QC and IG fractures.
Coda Q and its Frequency Dependence in the Eastern Himalayan and Indo-Burman Plate Boundary Systems
NASA Astrophysics Data System (ADS)
Mitra, S.; Kumar, A.
2015-12-01
We use broadband waveform data for 305 local earthquakes from the Eastern Himalayan and Indo-Burman plate boundary systems to model the seismic attenuation in NE India. We measure the decay in amplitude of coda waves at discrete frequencies (between 1 and 12 Hz) to evaluate the quality factor (Qc) as a function of frequency. We combine these measurements to evaluate the frequency dependence of Qc of the form Qc(f) = Qo f^η, where Qo is the quality factor at 1 Hz and η is the frequency dependence. Computed Qo values range from 80 to 360 and η ranges from 0.85 to 1.45. To study the lateral variation in Qo and η, we regionalise the Qc by combining all source-receiver measurements using a back-projection algorithm. For a single-backscatter model, the coda waves sample an elliptical area with the epicenter and receiver at the two foci. We parameterize the region using square grids. The algorithm calculates the overlap in area and distributes Qc in the sampled grids using the average Qc as the boundary value. This is done in an iterative manner, by minimising the misfit between the observed and computed Qc within each grid. This process is repeated for all frequencies, and η is computed for each grid by combining Qc for all frequencies. Our results reveal strong variation in Qo and η across NE India. The highest Qo are in the Bengal Basin (210-280) and the Indo-Burman subduction zone (300-360). The Shillong Plateau and Mikir Hills have intermediate Qo (~160) and the lowest Qo (~80) is observed in the Naga fold thrust belt. This variation in Qo demarcates the boundary between the continental crust beneath the Shillong Plateau and Mikir Hills and the transitional to oceanic crust beneath the Bengal Basin and Indo-Burman subduction zone. The thick pile of sedimentary strata in the Naga fold thrust belt results in the low Qo. The frequency dependence (η) of Qc across NE India is observed to be very high, with regions of high Qo being associated with relatively higher η.
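Estimating Qo and η from measured Qc values is a linear fit in log-log space, since log Qc = log Qo + η log f; a minimal sketch with synthetic numbers standing in for the 1-12 Hz coda estimates:

    import numpy as np

    f = np.array([1.0, 2.0, 4.0, 8.0, 12.0])          # Hz
    qc = np.array([160.0, 330.0, 690.0, 1400.0, 2100.0])  # synthetic Qc(f) values

    # Slope of the log-log fit is eta; the intercept is log(Qo).
    eta, log_qo = np.polyfit(np.log(f), np.log(qc), 1)
    print(f"Qo = {np.exp(log_qo):.0f}, eta = {eta:.2f}")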
Field correlation of PQI gauge with nuclear density gauge: phase 1.
DOT National Transportation Integrated Search
2006-12-01
Traditionally, the Oklahoma Department of Transportation (ODOT) uses a nuclear density gauge as a quality control (QC) and quality assurance (QA) tool for in-place density. The nuclear-based devices, however, tend to have problems associated with lic...
THE MAQC PROJECT: ESTABLISHING QC METRICS AND THRESHOLDS FOR MICROARRAY QUALITY CONTROL
Microarrays represent a core technology in pharmacogenomics and toxicogenomics; however, before this technology can successfully and reliably be applied in clinical practice and regulatory decision-making, standards and quality measures need to be developed. The Microarray Qualit...
Stapanian, Martin A.; Lewis, Timothy E; Palmer, Craig J.; Middlebrook Amos, Molly
2016-01-01
Unlike most laboratory studies, rigorous quality assurance/quality control (QA/QC) procedures may be lacking in ecosystem restoration (“ecorestoration”) projects, despite legislative mandates in the United States. This is due, in part, to ecorestoration specialists making the false assumption that some types of data (e.g. discrete variables such as species identification and abundance classes) are not subject to evaluations of data quality. Moreover, emergent behavior manifested by complex, adapting, and nonlinear organizations responsible for monitoring the success of ecorestoration projects tend to unconsciously minimize disorder, QA/QC being an activity perceived as creating disorder. We discuss similarities and differences in assessing precision and accuracy for field and laboratory data. Although the concepts for assessing precision and accuracy of ecorestoration field data are conceptually the same as laboratory data, the manner in which these data quality attributes are assessed is different. From a sample analysis perspective, a field crew is comparable to a laboratory instrument that requires regular “recalibration,” with results obtained by experts at the same plot treated as laboratory calibration standards. Unlike laboratory standards and reference materials, the “true” value for many field variables is commonly unknown. In the laboratory, specific QA/QC samples assess error for each aspect of the measurement process, whereas field revisits assess precision and accuracy of the entire data collection process following initial calibration. Rigorous QA/QC data in an ecorestoration project are essential for evaluating the success of a project, and they provide the only objective “legacy” of the dataset for potential legal challenges and future uses.
Sekhavati, Mohammad H; Mesgaran, Mohsen Danesh; Nassiri, Mohammad R; Mohammadabadi, Tahereh; Rezaii, Farkhondeh; Fani Maleki, Adham
2009-10-01
This paper describes the design and validation of a quantitative competitive polymerase chain reaction (QC-PCR) assay using PCR primers targeting the rRNA locus of rumen fungi and a standard-control DNA. In order to test the efficiency of this method for quantifying anaerobic rumen fungi, the method was evaluated in vitro by comparison with an assay based on measuring cell-wall chitin. Changes in fungal growth were studied when the fungi were grown in vitro on either untreated (US) or sodium hydroxide-treated wheat straw (TS). Results showed that rumen fungal growth was significantly higher in treated samples compared with untreated during the 12 d incubation (P<0.05), and plotting the chitin assay's results against the competitive PCR's showed a high positive correlation (R² ≥ 0.87). The lower mean coefficients of variation for repeatability of the QC-PCR method relative to the chitin assay demonstrated the greater reliability of this new approach. Finally, the efficiency of this method was investigated in vivo. Samples of rumen fluid were collected from four fistulated Holstein steers which were fed four different diets (basal diet, high starch, high sucrose and starch plus sucrose) in rotation. The results of QC-PCR showed that addition of these non-structural carbohydrates to the basal diets caused a significant decrease in rumen anaerobic fungal biomass. The QC-PCR method appears to be reliable and can be used for rumen samples.
De Clercq, K; Goris, N; Barnett, P V; MacKay, D K
2008-01-01
Over the last decade, international trade in animals and animal products has been liberalized, and confidence in this global trade can increase only if appropriate control measures are applied. As foot-and-mouth disease (FMD) diagnostics will play an essential role in this respect, the Food and Agriculture Organization European Commission for the Control of Foot-and-Mouth Disease (EUFMD) co-ordinates, in collaboration with the European Commission, several programmes to increase the quality of FMD diagnostics. A quality assurance (QA) system is deemed essential for laboratories involved in certifying absence of FMDV or antibodies against the virus. Therefore, laboratories are encouraged to validate their diagnostic tests fully and to install a continuous quality control (QC) monitoring system. Knowledge of the performance characteristics of diagnostics is essential to interpret results correctly and to calculate sample rates in regional surveillance campaigns. Different aspects of QA/QC of classical and new FMD virological and serological diagnostics are discussed with respect to the EU FMD directive (2003/85/EC). We recommend accepting trade certificates only from laboratories participating in international proficiency testing on a regular basis.
QA/QC requirements for physical properties sampling and analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Innis, B.E.
1993-07-21
This report presents results of an assessment of the available information concerning US Environmental Protection Agency (EPA) quality assurance/quality control (QA/QC) requirements and guidance applicable to sampling, handling, and analyzing physical parameter samples at Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) investigation sites. Geotechnical testing laboratories measure the following physical properties of soil and sediment samples collected during CERCLA remedial investigations (RI) at the Hanford Site: moisture content, grain size by sieve, grain size by hydrometer, specific gravity, bulk density/porosity, saturated hydraulic conductivity, moisture retention, unsaturated hydraulic conductivity, and permeability of rocks by flowing air. Geotechnical testing laboratories also measure the following chemical parameters of soil and sediment samples collected during Hanford Site CERCLA RI: calcium carbonate and saturated column leach testing. Physical parameter data are used for (1) characterization of vadose and saturated zone geology and hydrogeology, (2) selection of monitoring well screen sizes, (3) support of modeling and analysis of the vadose and saturated zones, and (4) engineering design. The objectives of this report are to determine the QA/QC levels accepted in EPA Region 10 for the sampling, handling, and analysis of soil samples for physical parameters during CERCLA RI.
Dionne, Shannon G.; Granato, Gregory E.; Tana, Cameron K.
1999-01-01
A readily accessible archive of information that is valid, current, and technically defensible is needed to make informed highway-planning, design, and management decisions. The National Highway Runoff Water-Quality Data and Methodology Synthesis (NDAMS) is a cataloging and assessment of the documentation of information relevant to highway-runoff water quality available in published reports. The report review process is based on the NDAMS review sheet, which was designed by the USGS with input from the FHWA, State transportation agencies, and the regulatory community. The report-review process is designed to determine the technical merit of the existing literature in terms of current requirements for data documentation, data quality, quality assurance and quality control (QA/QC), and technical issues that may affect the use of historical data. To facilitate the review process, the NDAMS review sheet is divided into 12 sections: (1) administrative review information, (2) investigation and report information, (3) temporal information, (4) location information, (5) water-quality-monitoring information, (6) sample-handling methods, (7) constituent information, (8) sampling focus and matrix, (9) flow monitoring methods, (10) field QA/QC, (11) laboratory, and (12) uncertainty/error analysis. This report describes the NDAMS report reviews and metadata documentation methods and provides an overview of the approach and of the quality-assurance and quality-control program used to implement the review process. Detailed information, including a glossary of relevant terms, a copy of the report-review sheets, and report-review instructions, is completely documented in a series of three appendixes included with this report. Therefore, the reviews are repeatable and the methods can be used by transportation research organizations to catalog new reports as they are published.
1984-09-01
[Table-of-contents residue: "T-tests Between QC and Control Groups on the Pretest"; "T-tests Between Full-term QC and Control Groups on the Posttest"; "Static Group Designs"; "Pretest/Posttest Designs"; "Nonequivalent Control Group".] ...the two groups differed at the pretest in terms of self-rated job performance and job involvement. At the posttest, one significant result emerged.
A single fracture toughness parameter for fibrous composite laminates
NASA Technical Reports Server (NTRS)
Poe, C. C., Jr.
1981-01-01
A general fracture toughness parameter Qc was previously derived and verified to be a material constant, independent of layup, for centrally cracked boron/aluminum composite specimens. The specimens were made with various proportions of 0° and ±45° plies. A limited amount of data indicated that the ratio Qc/εtuf, where εtuf is the ultimate tensile strain of the fibers, might be a constant for all composite laminates, regardless of material and layup. In that case, a single value of Qc/εtuf could be used to predict the fracture toughness of all fibrous composite laminates from only the elastic constants and εtuf. Values of Qc/εtuf were calculated for centrally cracked specimens made from graphite/polyimide, graphite/epoxy, E-glass/epoxy, boron/epoxy, and S-glass graphite/epoxy materials with numerous layups. Within ordinary scatter, the data indicate that Qc/εtuf is a constant for all laminates that did not split extensively at the crack tips or have other deviate failure modes.
Cynis, Holger; Hoffmann, Torsten; Friedrich, Daniel; Kehlen, Astrid; Gans, Kathrin; Kleinschmidt, Martin; Rahfeld, Jens-Ulrich; Wolf, Raik; Wermann, Michael; Stephan, Anett; Haegele, Monique; Sedlmeier, Reinhard; Graubner, Sigrid; Jagla, Wolfgang; Müller, Anke; Eichentopf, Rico; Heiser, Ulrich; Seifert, Franziska; Quax, Paul H A; de Vries, Margreet R; Hesse, Isabel; Trautwein, Daniela; Wollert, Ulrich; Berg, Sabine; Freyse, Ernst-Joachim; Schilling, Stephan; Demuth, Hans-Ulrich
2011-01-01
Acute and chronic inflammatory disorders are characterized by detrimental cytokine and chemokine expression. Frequently, the chemotactic activity of cytokines depends on a modified N-terminus of the polypeptide. Among those, the N-terminus of monocyte chemoattractant protein 1 (CCL2 and MCP-1) is modified to a pyroglutamate (pE-) residue protecting against degradation in vivo. Here, we show that the N-terminal pE-formation depends on glutaminyl cyclase activity. The pE-residue increases stability against N-terminal degradation by aminopeptidases and improves receptor activation and signal transduction in vitro. Genetic ablation of the glutaminyl cyclase iso-enzymes QC (QPCT) or isoQC (QPCTL) revealed a major role of isoQC for pE1-CCL2 formation and monocyte infiltration. Consistently, administration of QC-inhibitors in inflammatory models, such as thioglycollate-induced peritonitis reduced monocyte infiltration. The pharmacologic efficacy of QC/isoQC-inhibition was assessed in accelerated atherosclerosis in ApoE3*Leiden mice, showing attenuated atherosclerotic pathology following chronic oral treatment. Current strategies targeting CCL2 are mainly based on antibodies or spiegelmers. The application of small, orally available inhibitors of glutaminyl cyclases represents an alternative therapeutic strategy to treat CCL2-driven disorders such as atherosclerosis/restenosis and fibrosis. PMID:21774078
A novel construction method of QC-LDPC codes based on CRT for optical communications
NASA Astrophysics Data System (ADS)
Yuan, Jian-guo; Liang, Meng-qi; Wang, Yong; Lin, Jin-zhao; Pang, Yu
2016-05-01
A novel construction method of quasi-cyclic low-density parity-check (QC-LDPC) codes is proposed based on the Chinese remainder theorem (CRT). The method can not only increase the code length without reducing the girth, but also greatly enhance the code rate, so it is easy to construct a high-rate code. The simulation results show that at a bit error rate (BER) of 10^-7, the net coding gain (NCG) of the regular QC-LDPC(4851,4546) code is respectively 2.06 dB, 1.36 dB, 0.53 dB and 0.31 dB more than those of the classic RS(255,239) code in ITU-T G.975, the LDPC(32640,30592) code in ITU-T G.975.1, the QC-LDPC(3664,3436) code constructed by the improved combining construction method based on the CRT, and the irregular QC-LDPC(3843,3603) code constructed by the construction method based on the Galois field (GF(q)) multiplicative group. Furthermore, all these five codes have the same code rate of 0.937. Therefore, the regular QC-LDPC(4851,4546) code constructed by the proposed construction method has excellent error-correction performance and is more suitable for optical transmission systems.
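Whatever the construction recipe, a QC-LDPC parity-check matrix is assembled from circulant permutation blocks: a small base matrix of shift exponents is expanded by a factor z. The sketch below shows this generic expansion step with an illustrative base matrix; it is not the CRT construction of the paper.

import numpy as np

def expand_qc_ldpc(base: np.ndarray, z: int) -> np.ndarray:
    """Expand a base matrix of circulant shift exponents into a binary
    parity-check matrix H. Entry s >= 0 becomes the z x z identity
    cyclically shifted by s columns; entry -1 becomes the zero block."""
    I = np.eye(z, dtype=np.uint8)
    rows = []
    for r in base:
        blocks = [np.zeros((z, z), np.uint8) if s < 0 else np.roll(I, s, axis=1)
                  for s in r]
        rows.append(np.hstack(blocks))
    return np.vstack(rows)

# Illustrative 2 x 4 base matrix (shift exponents), expansion factor z = 5
B = np.array([[0, 1, 2, -1],
              [3, -1, 0, 4]])
H = expand_qc_ldpc(B, 5)  # 10 x 20 binary parity-check matrix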
Stable Isotopes, Quantum Computing and Consciousness
NASA Astrophysics Data System (ADS)
Berezin, Alexander A.
2000-10-01
Recent proposals of quantum computing/computers (QC) based on nuclear spins suggest that consciousness (CON) activity may be related (assisted) to a subset of 13C atoms incorporated randomly, or quasirandomly, in neural structures. Consider two DNA chains. Even if they are completely identical chemically (same sequence of codons), the patterns of 12C and 13C isotopes in them are different (a possible origin of personal individuality). Perhaps it is the subsystem of nuclear spins of the 13C "sublattice" which forms a dynamical system capable of QC and on which CON is "spanned". Some issues related to this hypothesis are: (1) existence of CON-driven positional correlations among 13C atoms, (2) motion (hopping) of 13C via enhanced neutron tunneling, cf. the quantum "anti-Zeno effect", (3) possible optimization of the concentration of QC-active 13C atoms above their standard isotopic abundance, (4) characteristic time-scales for operation of a 13C-based QC (perhaps a broad range of scales), (5) reflection of the QC dynamics of 13C on CON, (6) the possibility that a 13C-based QC operates "above" the level of "regular" CON (perhaps Jungian sub/super-CON), (7) isotopicity as a connector to a universal Library of Patterns ("Platonic World"), (8) self-stabilization of coherence in the 13C (sub)system. Some of these questions are, in principle, experimentally addressable through shifting of isotopic abundances.
Riddick, L; Simbanin, C
2001-01-01
EPA is conducting a National Study of Chemical Residues in Lake Fish Tissue. The study involves five analytical laboratories, multiple sampling teams from each of the 47 participating states, several tribes, all 10 EPA Regions and several EPA program offices, with input from other federal agencies. To fulfill study objectives, state and tribal sampling teams are voluntarily collecting predator and bottom-dwelling fish from approximately 500 randomly selected lakes over a 4-year period. The fish will be analyzed for more than 300 pollutants. The long-term nature of the study, combined with the large number of participants, created several QA challenges: (1) controlling variability among sampling activities performed by different sampling teams from more than 50 organizations over a 4-year period; (2) controlling variability in lab processes over a 4-year period; (3) generating results that will meet the primary study objectives for use by OW statisticians; (4) generating results that will meet the undefined needs of more than 50 participating organizations; and (5) devising a system for evaluating and defining data quality and for reporting data quality assessments concurrently with the data to ensure that assessment efforts are streamlined and that assessments are consistent among organizations. This paper describes the QA program employed for the study and presents an interim assessment of the program's effectiveness.
Unique and Conserved Features of the Barley Root Meristem
Kirschner, Gwendolyn K.; Stahl, Yvonne; Von Korff, Maria; Simon, Rüdiger
2017-01-01
Plant root growth is enabled by root meristems that harbor the stem cell niches as a source of progenitors for the different root tissues. Understanding the root development of diverse plant species is important to be able to control root growth in order to gain better performance of crop plants. In this study, we analyzed the root meristem of the fourth most abundant crop plant, barley (Hordeum vulgare). Cell division studies revealed that the barley stem cell niche comprises a Quiescent Center (QC) of around 30 cells with low mitotic activity. The surrounding stem cells contribute to root growth through the production of new cells that are displaced from the meristem, elongate and differentiate into specialized root tissues. The distal stem cells produce the root cap and lateral root cap cells, while cells lateral to the QC generate the epidermis, as is typical for monocots. Endodermis and inner cortex are derived from one common initial lateral to the QC, while the outer cortex cell layers are derived from a distinct stem cell. In rice and Arabidopsis, meristem homeostasis is achieved through feedback signaling from differentiated cells involving peptides of the CLE family. Application of the synthetic peptide of the barley CLE40 ortholog promotes meristem cell differentiation, similar to rice and Arabidopsis. However, in contrast to Arabidopsis, the columella stem cells do not respond to the CLE40 peptide, indicating that distinct mechanisms control columella cell fate in monocot and dicot plants. PMID:28785269
Visualization and Quality Control Web Tools for CERES Products
NASA Astrophysics Data System (ADS)
Mitrescu, C.; Doelling, D. R.; Rutan, D. A.
2016-12-01
The CERES project continues to provide the scientific community a wide variety of satellite-derived data products such as observed TOA broadband shortwave and longwave fluxes, computed TOA and surface fluxes, as well as cloud, aerosol, and other atmospheric parameters. They encompass a wide range of temporal and spatial resolutions, suited to specific applications. Now in its 16th year, CERES products are mostly used by climate modeling communities that focus on global mean energetics, meridional heat transport, and climate trend studies. In order to serve all our users, we developed a web-based Ordering and Visualization Tool (OVT). Using open-source software such as Eclipse, Java, JavaScript, OpenLayers, Flot, Google Maps, Python, and others, the OVT team developed a series of specialized functions to be used in the process of CERES data quality control (QC). We mention 1- and 2-D histograms, anomaly computation, deseasonalization, temporal and spatial averaging, side-by-side parameter comparison, and others that made the process of QC far easier and faster, but more importantly far more portable. We are now in the process of integrating ground-site observed surface fluxes to further facilitate the CERES project to QC the CERES computed surface fluxes. These features will give users the opportunity to perform their own comparisons of the CERES computed surface fluxes and observed ground-site fluxes. An overview of the CERES OVT basic functions using open-source software, as well as future steps in expanding its capabilities, will be presented at the meeting.
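Deseasonalization of the kind named above is simple to state: subtract each calendar month's climatological mean so that anomalies and trends stand out for QC inspection. A minimal sketch, with synthetic data standing in for a CERES flux series:

import numpy as np

def deseasonalize(monthly: np.ndarray) -> np.ndarray:
    """Remove the mean annual cycle from a monthly time series whose
    length is a multiple of 12: subtract each month's climatology."""
    clim = monthly.reshape(-1, 12).mean(axis=0)         # 12-month climatology
    return monthly - np.tile(clim, monthly.size // 12)  # anomalies

# Example: 10 years of synthetic monthly TOA flux with a seasonal cycle
t = np.arange(120)
flux = 240 + 5 * np.sin(2 * np.pi * t / 12) + np.random.normal(0, 0.5, 120)
anom = deseasonalize(flux)  # seasonal cycle removed; anomalies remain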
Genome measures used for quality control are dependent on gene function and ancestry.
Wang, Jing; Raskin, Leon; Samuels, David C; Shyr, Yu; Guo, Yan
2015-02-01
The transition/transversion (Ti/Tv) ratio and heterozygous/nonreference-homozygous (het/nonref-hom) ratio have been commonly computed in genetic studies as a quality control (QC) measurement. Additionally, these two ratios are helpful in our understanding of the patterns of DNA sequence evolution. To thoroughly understand these two genomic measures, we performed a study using 1000 Genomes Project (1000G) released genotype data (N=1092). An additional two datasets (N=581 and N=6) were used to validate our findings from the 1000G dataset. We compared the two ratios among continental ancestries, genome regions and gene functionality. We found that the Ti/Tv ratio can be used as a quality indicator for single nucleotide polymorphisms inferred from high-throughput sequencing data. The Ti/Tv ratio varies greatly by genome region and functionality, but not by ancestry. The het/nonref-hom ratio varies greatly by ancestry, but not by genome region and functionality. Furthermore, extreme guanine + cytosine content (either high or low) is negatively associated with the Ti/Tv ratio magnitude. Thus, when performing QC assessment using these two measures, care must be taken to apply the correct thresholds based on ancestry and genome region. Failure to take these considerations into account at the QC stage will bias any following analysis. Contact: yan.guo@vanderbilt.edu. Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
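The Ti/Tv measure itself is a one-pass count: transitions are purine-to-purine or pyrimidine-to-pyrimidine substitutions, everything else is a transversion. A minimal sketch of the computation (generic, not the authors' pipeline):

PURINES, PYRIMIDINES = {"A", "G"}, {"C", "T"}

def ti_tv_ratio(snvs):
    """snvs: iterable of (ref, alt) single-nucleotide pairs."""
    ti = tv = 0
    for ref, alt in snvs:
        if {ref, alt} <= PURINES or {ref, alt} <= PYRIMIDINES:
            ti += 1   # transition
        else:
            tv += 1   # transversion
    return ti / tv if tv else float("inf")

print(ti_tv_ratio([("A", "G"), ("C", "T"), ("A", "C")]))  # 2 Ti, 1 Tv -> 2.0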
FDA's Critical Path Initiative identifies pharmacogenomics and toxicogenomics as key opportunities in advancing medical product development and personalized medicine, and the Guidance for Industry: Pharmacogenomic Data Submissions has been released. Microarrays represent a co...
Implementation of GPS Machine Controlled Grading - Phase III (2008) and Technical Training
DOT National Transportation Integrated Search
2009-02-01
Beginning in 2006, WisDOT and the Construction Material and Support Center (CMSC) at UW-Madison worked together to develop the specifications and the QA/QC procedures for GPS machine guidance on highway grading projects. These specifications and proc...
Jordan, A; Chen, D; Yi, Q-L; Kanias, T; Gladwin, M T; Acker, J P
2016-07-01
Quality control (QC) data collected by blood services are used to monitor production and to ensure compliance with regulatory standards. We demonstrate how analysis of quality control data can be used to highlight the sources of variability within red cell concentrates (RCCs). We merged Canadian Blood Services QC data with manufacturing and donor records for 28,227 RCCs between June 2011 and October 2014. Units were categorized based on processing method, bag manufacturer, donor age and donor sex, then assessed based on product characteristics: haemolysis and haemoglobin levels, unit volume, leucocyte count and haematocrit. Buffy-coat method (top/bottom) processed units exhibited lower haemolysis than units processed using the whole-blood filtration method (top/top). Units from female donors exhibited lower haemolysis than male donations. Processing method influenced unit volume and the ratio of additive solution to residual plasma. Stored red blood cell characteristics are influenced by prestorage processing and donor factors. Understanding the relationship between processing, donors and RCC quality will help blood services to ensure the safety of transfused products. © 2016 International Society of Blood Transfusion.
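The analysis described is, at its core, a stratified summary of QC records. A sketch of that pattern follows; the column names and values are hypothetical, not Canadian Blood Services' schema.

import pandas as pd

# Toy stand-in for merged QC + manufacturing + donor records
qc = pd.DataFrame({
    "method":         ["buffy_coat", "buffy_coat", "whole_blood", "whole_blood"],
    "donor_sex":      ["F", "M", "F", "M"],
    "haemolysis_pct": [0.18, 0.22, 0.25, 0.31],
})

# Group by production stream and donor factor, then compare distributions
summary = qc.groupby(["method", "donor_sex"])["haemolysis_pct"].describe()
print(summary)  # systematic differences between streams stand out here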
Automated locomotor activity monitoring as a quality control assay for mass-reared tephritid flies.
Dominiak, Bernard C; Fanson, Benjamin G; Collins, Samuel R; Taylor, Phillip W
2014-02-01
The Sterile Insect Technique (SIT) requires vast numbers of consistently high-quality insects to be produced over long periods. Quality control (QC) procedures are critical to effective SIT, both providing quality assurance and warning of operational deficiencies. Here we present a potential new QC assay for mass rearing of Queensland fruit flies (Bactrocera tryoni Froggatt) for SIT: locomotor activity monitoring. We investigated whether automated locomotor activity monitors (LAMs) that simply detect how often a fly passes an infrared sensor in a glass tube might provide insights similar to those of established QC assays, but with much greater economy. Activity levels were generally lower for females than for males, and declined over five days in the monitor for both sexes. Female activity levels were not affected by irradiation, but males irradiated at 60 or 70 Gy had reduced activity levels compared with unirradiated controls. We also found some evidence that mild heat shock of pupae results in adults with reduced activity. LAM offers a convenient, effective and economical assay to probe such changes. © 2013 Society of Chemical Industry.
Lens Coupled Quantum Cascade Laser
NASA Technical Reports Server (NTRS)
Lee, Alan Wei Min (Inventor); Hu, Qing (Inventor)
2013-01-01
Terahertz quantum cascade (QC) devices are disclosed that can operate, e.g., in a range of about 1 THz to about 10 THz. In some embodiments, QC lasers are disclosed in which an optical element (e.g., a lens) is coupled to an output facet of the laser's active region to enhance coupling of the lasing radiation from the active region to an external environment. In other embodiments, terahertz amplifier and tunable terahertz QC lasers are disclosed.
NASA Astrophysics Data System (ADS)
Selima, Ehab S.; Seadawy, Aly R.; Yao, Xiaohua; Essa, F. A.
2018-02-01
This paper is devoted to the study of the (1+1)-dimensional coupled cubic-quintic complex Ginzburg-Landau equations (cc-qcGLEs) with complex coefficients. These equations can be used to describe the nonlinear evolution of slowly varying envelopes of periodic spatial-temporal patterns in a convective binary fluid. The dispersion relation and properties of the cc-qcGLEs are constructed. Painlevé analysis is used to check the integrability of the cc-qcGLEs and to establish the Bäcklund transformation form. New traveling wave solutions and a general form of multiple-soliton solutions of the cc-qcGLEs are obtained via the Bäcklund transformation and the simplest-equation method with the Bernoulli, Riccati and Burgers equations as simplest equations.
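For reference, a single (uncoupled) cubic-quintic complex Ginzburg-Landau equation in one spatial dimension takes the generic form below; the coupled system studied in the paper pairs two such envelope equations with cross-coupling terms. The coefficient names here are generic, not the paper's notation.

\partial_t A = \mu A + (b_1 + i c_1)\,\partial_x^2 A
             + (b_3 + i c_3)\,|A|^2 A + (b_5 + i c_5)\,|A|^4 A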
Watering the Tree of Science: Science Education, Local Knowledge, and Agency in Zambia's PSA Program
NASA Astrophysics Data System (ADS)
Lample, Emily
With increased public interest in protecting the environment, scientists and engineers aim to improve energy conversion efficiency. Thermoelectrics offer many advantages as a thermal management technology. When compared to vapor compression refrigeration above approximately 200 to 600 watts, cost in dollars per watt as well as COP are not advantageous for thermoelectrics. The goal of this work was to determine if optimized pulse supercooling operation could improve the cooling capacity or efficiency of a thermoelectric device. The basis of this research is a thermal-electrical analogy based modeling study using SPICE. Two models were developed: the first, a standalone thermocouple with no attached mass to be cooled; the second, a system that includes a module attached to a heat-generating mass. With the thermocouple study, a new approach of generating response surfaces with characteristic parameters was applied. The current pulse height and pulse on-time were identified for maximizing Net Transient Advantage, a newly defined metric. The corresponding pulse height and pulse on-time were utilized for the system model. Along with the traditional steady-state starting current of Imax, Iopt was employed. The pulse shape was an isosceles triangle. For the system model, metrics new to pulse cooling were Qc, power consumption and COP. The effects of optimized current pulses were studied by changing system variables. Further studies explored time spacing between pulses and temperature distribution in the thermoelement. It was found that net Qc over an entire pulse event can be improved over Imax steady operation but not over steady Iopt operation. Qc can be improved over Iopt operation but only during the early part of the pulse event. COP is reduced in transient pulse operation due to the different time constants of Qc and Pin. In some cases lower-performance interface materials allow more Qc and better COP during transient operation than higher-performance interface materials. Important future work might look at developing innovative ways of biasing Joule heat to Th.
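The steady-state baselines that pulse operation is compared against follow the standard textbook thermoelectric-couple relations, Qc = alpha*I*Tc - I^2*R/2 - K*(Th - Tc) and Pin = alpha*I*(Th - Tc) + I^2*R. A back-of-envelope sketch of those relations (not the paper's SPICE network; all parameter values are placeholders):

def te_cooling(I, Tc, Th, alpha, R, K):
    """Cooling capacity Qc and COP of a thermoelectric couple.
    Qc  = Peltier cooling - half the Joule heat - conduction backflow
    Pin = work against Seebeck back-emf + Joule heating"""
    qc = alpha * I * Tc - 0.5 * I**2 * R - K * (Th - Tc)
    p_in = alpha * I * (Th - Tc) + I**2 * R
    return qc, (qc / p_in if p_in > 0 else float("nan"))

qc, cop = te_cooling(I=3.0, Tc=285.0, Th=300.0,
                     alpha=0.05, R=1.5, K=0.25)  # placeholder values
print(f"Qc = {qc:.2f} W, COP = {cop:.2f}")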
Seismic Activity at tres Virgenes Volcanic and Geothermal Field
NASA Astrophysics Data System (ADS)
Antayhua, Y. T.; Lermo, J.; Quintanar, L.; Campos-Enriquez, J. O.
2013-05-01
The Tres Virgenes volcanic and geothermal field is in the NE portion of Baja California Sur State, Mexico, between longitudes -112°20' and -112°40' and latitudes 27°25' and 27°36'. In 2003 the Federal Electricity Commission and the Engineering Institute of the National Autonomous University of Mexico (UNAM) initiated a seismic monitoring program. The seismograph network installed inside and around the geothermal field consisted, at the beginning, of Kinemetrics K2 accelerometers; since 2009 the network is composed of Guralp CMG-6TD broadband seismometers. The seismic data used in this study cover the period September 2003 to November 2011. We relocated 118 earthquakes with epicenters in the study zone that were recorded at most of the seismic stations. The events analysed have shallow depths (≤10 km), coda magnitude Mc≤2.4, and epicentral and hypocentral location errors <2 km. These events are concentrated mainly below the Tres Virgenes volcanoes and the geothermal exploitation zone, where there is a system of NW-SE, N-S and W-E extensional faults. We also obtained focal mechanisms for 38 events using the Focmec, Hash and FPFIT methods. The results show normal mechanisms that correlate with the La Virgen, El Azufre, El Cimarron and Bonfil fault systems, whereas reverse and strike-slip solutions correlate with the Las Viboras fault. Additionally, the Qc value was obtained for the 118 events. This value was calculated using the single back-scattering model, taking coda-wave trains with window lengths of 5 s. Seismograms were filtered at 4 frequency bands centered at 2, 4, 8 and 16 Hz. The estimates of Qc vary from 62 at 2 Hz up to 220 at 16 Hz. The frequency-Qc relationship obtained is Qc = (40±2) f^(0.62±0.02), representing the average attenuation characteristics of seismic waves at the Tres Virgenes volcanic and geothermal field. These values correlate with those observed at other geothermal and volcanic fields.
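The quoted power law Qc = Q0 * f^n is recovered from the band-averaged Qc estimates by a linear least-squares fit in log-log space. A sketch (the 2 Hz and 16 Hz values are the ones reported above; the middle two are illustrative placeholders consistent with the fit):

import numpy as np

f = np.array([2.0, 4.0, 8.0, 16.0])        # band center frequencies (Hz)
qc = np.array([62.0, 94.0, 145.0, 220.0])  # Qc per band; middle two assumed

n, log_q0 = np.polyfit(np.log(f), np.log(qc), 1)  # slope = n, intercept = ln Q0
print(f"Qc = {np.exp(log_q0):.0f} * f^{n:.2f}")   # expect roughly 40 * f^0.62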
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xu, H; Yi, B; Prado, K
2015-06-15
Purpose: This work investigates the feasibility of a standardized monthly quality check (QC) of LINAC output determination in a multi-site, multi-LINAC institution. The QC was developed to determine individual LINAC output using the same optimized measurement setup and a constant calibration factor for all machines across the institution. Methods: QA data collected over 4 years from 7 Varian machines at four sites were analyzed. The monthly output constancy checks were performed using a fixed source-to-chamber distance (SCD), with no couch position adjustment throughout the measurement cycle, for all the photon energies (6 and 18 MV) and electron energies (6, 9, 12, 16 and 20 MeV). The constant monthly output calibration factor (Nconst) was determined by averaging the machines' output data, acquired with the same monthly ion chamber. If a different monthly ion chamber was used, Nconst was re-normalized to account for its different NDW,Co-60. Here, the possible changes of Nconst over 4 years have been tracked, and the precision of output results based on this standardized monthly QA program relative to the TG-51 calibration for each machine was calculated. Any outlier of the group was investigated. Results: The possible changes of Nconst were between 0 and 0.9% over 4 years. The normalization of absorbed-dose-to-water calibration factors corrects for up to 3.3% variation among different monthly QA chambers. The LINAC output precision based on this standardized monthly QC relative to the TG-51 output calibration is within 1% for the 6 MV photon energy and 2% for 18 MV and all the electron energies. A human error in one TG-51 report was found through close scrutiny of outlier data. Conclusion: This standardized QC allows for a reasonably simplified, precise and robust monthly LINAC output constancy check, with the increased sensitivity needed to detect possible human errors and machine problems.
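A sketch of the constancy check just described: a constant cross-machine factor Nconst converts the monthly chamber reading to output, rescaled by the ratio of absorbed-dose-to-water calibration factors when the QA chamber changes. All values and the 2% tolerance below are placeholders, not the paper's data.

def monthly_output(reading_nC, n_const, ndw_ref, ndw_chamber):
    """Output from a monthly reading: scale Nconst by the current chamber's
    N_D,w relative to the reference chamber's, as described above."""
    return reading_nC * n_const * (ndw_chamber / ndw_ref)

baseline = 1.000  # output at the machine's TG-51 calibration (placeholder)
out = monthly_output(19.85, 0.0504, ndw_ref=5.39e-2, ndw_chamber=5.43e-2)
deviation = (out / baseline - 1) * 100
print(f"deviation = {deviation:+.1f}% "
      f"({'OK' if abs(deviation) < 2 else 'INVESTIGATE'})")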
Suke, Sanvidhan G; Sherekar, Prasad; Kahale, Vivek; Patil, Shaktipal; Mundhada, Dharmendra; Nanoti, Vivek M
2018-04-18
The theme of the present work is to evaluate the protective effect of nanoencapsulated quercetin (NEQ) against chlorpyrifos (CPF)-induced hepatic damage and immune alterations in animals. Nanoparticle (NP) drug encapsulation was prepared. Forty male Wistar rats were divided into eight groups. Two groups served as control and CPF (13.5 mg/kg) treatment for 28 days. Three other groups were treated with free quercetin (QC), NP and NEQ at 3 mg/kg, respectively, for 15 days, whereas the remaining three groups received CPF plus QC, NP or NEQ, respectively, for 15 days. The results show significantly altered oxidative stress in liver tissue, liver enzyme parameters in blood, and immune responses in CPF-treated rats compared with controls. Administration of NEQ attenuated the biochemical and immunological parameters. Histopathological analysis of the liver confirmed pathological improvement. Hence, the use of NEQ appeared to be beneficial to a great extent in attenuating hepatic oxidative damage and immune alteration sustained by pesticide exposure. © 2018 Wiley Periodicals, Inc.
Sohrabi, Mehdi; Parsi, Masoumeh; Mianji, Fereidoun
2018-05-01
National diagnostic reference levels (NDRLs) of Iran were determined for the four most common CT examinations: head, sinus, chest and abdomen/pelvis. A new 'quality control (QC)-based dose survey method', as developed by us, was applied to 157 CT scanners in Iran (2014-15) with different slice classes, models and geographic spread across the country. The NDRLs for head, sinus, chest and abdomen/pelvis examinations are 58, 29, 12 and 14 mGy for CTDIvol and 750, 300, 300 and 650 mGy·cm for DLP, respectively. The 'QC-based dose survey method' was further shown to be a simple, accurate and practical method for time- and cost-effective NDRL determination. One effective approach to optimization of CT examination protocols at the national level is the provision of adequate standardized training of radiologists, technicians and medical physicists on patient radiation protection principles and implementation of the DRL concept in clinical practice.
Department of Defense In-House RDT&E Activities
1984-10-30
PRODUCTION. QC & NOT EQUIPMENT, ULTRASONICS, XRAY & NEUTRON RADIOGRAPHY , SPECTROSCOPY, HOLOGRAPHY, CHEMICAL ANALYSIS, METALLOGRAPTY & OPTICS. OTHER:U & BE...IMPORTANT PROGRAMS OTNl1O9A XM40 MASK OTNI033 RADAR WARNING RECEIVER AN/APR-39A 0TN966 AIRCREW SURVIVAL VEST 0TN876 SELF- PROPELLED ELEVATED MAINTENANCE...FACILITY FOR PROPELLANT FLAME ANALYSIS; COMPUTED TOMOGRAPHY FOR BALLISTIC EVENTS; PROPELLANT FRACTURE MECHANICS ANALYSIS FACILITY; INSTRUMENTED INDOOR
Stanhewicz, Anna E.; Proctor, David N.; Alexander, Lacy M.; Kenney, W. Larry
2015-01-01
During supine passive heating, increases in skin blood flow (SkBF) and cardiac output (Qc) are both blunted in older adults. The aim here was to determine the effect of acutely correcting the peripheral vasodilatory capacity of aged skin on the integrated cardiovascular responses to passive heating. A secondary aim was to examine the SkBF-Qc relation during hyperthermia in the presence (upright posture) and absence (dynamic exercise) of challenges to central venous pressure. We hypothesized that greater increases in SkBF would be accompanied by greater increases in Qc. Eleven healthy older adults (69 ± 3 yr) underwent supine passive heating (0.8°C rise in core temperature; water-perfused suit) after ingesting sapropterin (BH4, a nitric oxide synthase cofactor; 10 mg/kg) or placebo (randomized double-blind crossover design). Twelve young (24 ± 1 yr) subjects served as a comparison group. SkBF (laser-Doppler flowmetry) and Qc (open-circuit acetylene wash-in) were measured during supine heating, heating + upright posture, and heating + dynamic exercise. Throughout supine and upright heating, sapropterin fully restored the SkBF response of older adults to that of young adults but Qc remained blunted. During heat + upright posture, SkBF failed to decrease in untreated older subjects. There were no age- or treatment-related differences in SkBF-Qc during dynamic exercise. The principal finding of this study was that the blunted Qc response to passive heat stress is directly related to age as opposed to the blunted peripheral vasodilatory capacity of aged skin. Furthermore, peripheral impairments to SkBF in the aged may contribute to inapposite responses during challenges to central venous pressure during hyperthermia. PMID:26494450
NASA Astrophysics Data System (ADS)
Jiao, Xin; Liu, Yiqun; Yang, Wan; Zhou, Dingwu; Wang, Shuangshuang; Jin, Mengqi; Sun, Bin; Fan, Tingting
2018-01-01
The cycling of various isomorphs of authigenic silica minerals is a complex and long-term process. A special type of composite quartz (Qc) grain in tuffaceous shale of the Permian Lucaogou Formation in the sediment-starved, volcanically and hydrothermally active intracontinental lacustrine Santanghu rift basin (NW China) is studied in detail to demonstrate such processes. Samples from one well in the central basin were subject to petrographic, elemental chemical, and fluid inclusion analyses. About 200 Qc-bearing laminae, 0.1-2 mm (mainly about 1 mm) thick, are intercalated within tuffaceous shale laminae. The Qc grains occur as framework grains and are dispersed in an igneous feldspar-dominated matrix, suggesting episodic accumulation. The Qc grains are bedding-parallel, uniform in size (hundreds of µm), elongate, and radial in crystal pattern, suggesting a biogenic origin. Qc grains are composed of a core of anhedral microcrystalline quartz and an outer part of subhedral mega-quartz grains, whose edges are composed of small euhedral quartz crystals, indicating multiple episodic processes of recrystallization and overgrowth. The abundance of Al and Ti in the quartz crystals and temperatures estimated from fluid inclusions in Qc grains indicate that these processes are related to hydrothermal fluids. Finally, the Qc grains are interpreted as original silica precipitation in microorganism (algae?) cysts, which were reworked by bottom currents and altered by hydrothermal fluids to recrystallize and overgrow during penecontemporaneous shallow burial. It is postulated that episodic volcanic and hydrothermal activities changed lake-water chemistry, temperature, and nutrient supply, resulting in variations in microbial productivity and silica cycling. The transformation of authigenic silica from amorphous to well crystallized occurred in a short time span during shallow burial.
Formulation of Subgrid Variability and Boundary-Layer Cloud Cover in Large-Scale Models
1999-02-28
...related to burned and unburned landscapes, saline and non-saline soils, and irrigated and nonirrigated crops. ...evaporation fraction, and qc,sat is the canopy saturation specific humidity, a function of Tc. Using (21)-(22) we then determine qc...
NASA Astrophysics Data System (ADS)
Zhou, Fenfen; Wang, Hongqing; Liu, Pengying; Hu, Qinghua; Wang, Yuyuan; Liu, Can; Hu, Jiangke
2018-02-01
A reversible Schiff base fluorescence probe for Al3+, (3,5-dichloro-2-hydroxybenzylidene)quinoline-2-carbohydrazide (QC), based on a quinoline derivative has been designed, synthesized and evaluated. QC exhibited high sensitivity and selectivity toward Al3+ in EtOH-H2O (v/v = 1:9, pH = 6) by forming a 1:1 complex with Al3+, and the detection limit of QC for Al3+ was as low as 0.012 μM. Furthermore, the results showed that the binding of QC-Al3+ was broken by F-, so this system could be used to monitor F- in the future. The fluorescence enhancement of QC could be attributed to the inhibition of PET and ESIPT and the emergence of a CHEF process induced by Al3+. More importantly, QC was not only successfully used for the determination of trace Al3+ in tap water and human blood serum, but was also valid for fluorescence imaging of Al3+ in HeLa cells.
A novel QC-LDPC code based on the finite field multiplicative group for optical communications
NASA Astrophysics Data System (ADS)
Yuan, Jian-guo; Xu, Liang; Tong, Qing-zhen
2013-09-01
A novel construction method of quasi-cyclic low-density parity-check (QC-LDPC) codes is proposed based on the finite field multiplicative group, which offers easier construction, more flexible code-length and code-rate adjustment, and lower encoding/decoding complexity. Moreover, a regular QC-LDPC(5334,4962) code is constructed. The simulation results show that the constructed QC-LDPC(5334,4962) code achieves good error-correction performance over the additive white Gaussian noise (AWGN) channel with the iterative sum-product decoding algorithm (SPA). At a bit error rate (BER) of 10^-6, the net coding gain (NCG) of the constructed QC-LDPC(5334,4962) code is 1.8 dB, 0.9 dB and 0.2 dB more than that of the classic RS(255,239) code in ITU-T G.975, the LDPC(32640,30592) code in ITU-T G.975.1 and the SCG-LDPC(3969,3720) code constructed by the random method, respectively. It is therefore more suitable for optical communication systems.
Bao, Yong-Mei; Sun, Shu-Jing; Li, Meng; Li, Li; Cao, Wen-Lei; Luo, Jia; Tang, Hai-Juan; Huang, Ji; Wang, Zhou-Fei; Wang, Jian-Fei; Zhang, Hong-Sheng
2012-08-10
OsSYP71 is an oxidative stress and rice blast response gene that encodes a Qc-SNARE protein in rice. Qc-SNARE proteins belong to the superfamily of SNAREs (soluble N-ethylmaleimide-sensitive factor attachment protein receptors), which function as important components of the vesicle trafficking machinery in eukaryotic cells. In this paper, 12 Qc-SNARE genes were isolated from rice, and expression patterns of 9 genes were detected in various tissues and in seedlings challenged with oxidative stresses and inoculated with rice blast. The expression of OsSYP71 was clearly up-regulated under these stresses. Overexpression of OsSYP71 in rice showed more tolerance to oxidative stress and resistance to rice blast than wild-type plants. These results indicate that Qc-SNAREs play an important role in rice response to environmental stresses, and OsSYP71 is useful in engineering crop plants with enhanced tolerance to oxidative stress and resistance to rice blast. Copyright © 2012 Elsevier B.V. All rights reserved.
Tamiru, Afework; Boulanger, Lucy; Chang, Michelle A; Malone, Joseph L; Aidoo, Michael
2015-01-21
Rapid diagnostic tests (RDTs) are now widely used for laboratory confirmation of suspected malaria cases to comply with the World Health Organization recommendation for universal testing before treatment. However, many malaria programmes lack quality control (QC) processes to assess RDT use under field conditions. Prior research showed the feasibility of using the dried tube specimen (DTS) method for preserving Plasmodium falciparum parasites for use as QC samples for RDTs. This study focused on the use of DTS for RDT QC and proficiency testing under field conditions. DTS were prepared using cultured P. falciparum at densities of 500 and 1,000 parasites/μL; 50 μL aliquots of these, along with parasite-negative human blood controls (0 parasites/μL), were air-dried in specimen tubes and reactivity verified after rehydration. The DTS were used in a field study in the Oromia Region of Ethiopia. Replicate DTS samples containing 0, 500 and 1,000 parasites/μL were stored at 4°C at a reference laboratory and at ambient temperatures at two nearby health facilities. At weeks 0, 4, 8, 12, 16, 20, and 24, the DTS were rehydrated and tested on RDTs stored under manufacturer-recommended temperatures at the reference laboratory and on RDTs stored under site-specific conditions at the two health facilities. Reactivity of DTS stored at 4°C at the reference laboratory on RDTs stored at the reference laboratory was considered the gold standard for assessing DTS stability. A proficiency-testing panel consisting of one negative and three positive samples, monitored with a checklist, was administered at weeks 12 and 24. At all seven time points, DTS stored at both the reference laboratory and the health facilities were reactive on RDTs stored under the recommended temperature and under field conditions, and the DTS without malaria parasites were negative. At the reference laboratory and one health facility, a 500 parasites/μL DTS from the proficiency panel was falsely reported as negative at week 24 due to errors in interpreting faint test lines. The DTS method can be used under field conditions to supplement other RDT QC methods and to assess health worker proficiency in Ethiopia and possibly other malaria-endemic countries.
Keller, Sune H; Sibomana, Merence; Olesen, Oline V; Svarer, Claus; Holm, Søren; Andersen, Flemming L; Højgaard, Liselotte
2012-03-01
Many authors have reported the importance of motion correction (MC) for PET. Patient motion during scanning disturbs kinetic analysis and degrades resolution. In addition, using a misaligned transmission scan for attenuation and scatter correction may produce regional quantification bias in the reconstructed emission images. The purpose of this work was the development of quality control (QC) methods for MC procedures based on external motion tracking (EMT) for human scanning, using an optical motion tracking system. Two scans with minor motion and 5 with major motion (as reported by the optical motion tracking system) were selected from 18F-FDG scans acquired on a PET scanner. The motion was measured as the maximum displacement of the markers attached to the subject's head and was considered to be major if larger than 4 mm and minor if less than 2 mm. After allowing a 40- to 60-min uptake time after tracer injection, we acquired a 6-min transmission scan, followed by a 40-min emission list-mode scan. Each emission list-mode dataset was divided into 8 frames of 5 min. The reconstructed time-framed images were aligned to a selected reference frame using either EMT or the AIR (automated image registration) software. The following 3 QC methods were used to evaluate the EMT and AIR MC: a method using the ratio between 2 regions of interest with gray matter voxels (GM) and white matter voxels (WM), called GM/WM; mutual information; and cross-correlation. The results of the 3 QC methods were in agreement with one another and with a visual subjective inspection of the image data. Before MC, the QC measures varied significantly in scans with major motion and displayed limited variation in scans with minor motion. The variation was significantly reduced and the measures improved after MC with AIR, whereas EMT MC performed less well. The 3 presented QC methods produced similar results and are useful for evaluating tracer-independent external-tracking motion-correction methods for human brain scans.
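Two of the QC measures named above can be computed directly between a motion-corrected frame and the reference frame. A simple histogram-based mutual information and a Pearson cross-correlation are sketched below; this is a generic illustration, not the authors' code.

import numpy as np

def cross_correlation(a, b):
    """Pearson correlation of two image volumes (flattened)."""
    return np.corrcoef(a.ravel(), b.ravel())[0, 1]

def mutual_information(a, b, bins=64):
    """Histogram-based mutual information in nats."""
    h, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    p = h / h.sum()                                  # joint distribution
    px = p.sum(axis=1, keepdims=True)                # marginals
    py = p.sum(axis=0, keepdims=True)
    nz = p > 0
    return float((p[nz] * np.log(p[nz] / (px @ py)[nz])).sum())

ref = np.random.rand(32, 32, 16)                     # stand-in reference frame
moved = ref + 0.05 * np.random.rand(32, 32, 16)      # stand-in aligned frame
print(cross_correlation(ref, moved), mutual_information(ref, moved))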
Explicit Lower and Upper Bounds on the Entangled Value of Multiplayer XOR Games
NASA Astrophysics Data System (ADS)
Briët, Jop; Vidick, Thomas
2013-07-01
The study of quantum-mechanical violations of Bell inequalities is motivated by the investigation, and the eventual demonstration, of the nonlocal properties of entanglement. In recent years, Bell inequalities have found a fruitful re-formulation using the language of multiplayer games originating from computer science. This paper studies the nonlocal properties of entanglement in the context of the simplest such games, called XOR games. When there are two players, it is well known that the maximum bias (the advantage over random play) of players using entanglement can be at most a constant times greater than that of classical players. Recently, Pérez-García et al. (Commun. Math. Phys. 279:455, 2008) showed that no such bound holds when there are three or more players: the use of entanglement can provide an unbounded advantage, and scale with the number of questions in the game. Their proof relies on non-trivial results from operator space theory, and gives a non-explicit existence proof, leading to a game with a very large number of questions and only loose control over the local dimension of the players' shared entanglement. We give a new, simple and explicit (though still probabilistic) construction of a family of three-player XOR games which achieve a large quantum-classical gap (QC-gap). This QC-gap is exponentially larger than the one given by Pérez-García et al. in terms of the size of the game, achieving a QC-gap of order √N with N^2 questions per player. In terms of the dimension of the entangled state required, we achieve the same (optimal) QC-gap of √N for a state of local dimension N per player. Moreover, the optimal entangled strategy is very simple, involving observables defined by tensor products of the Pauli matrices. Additionally, we give the first upper bound on the maximal QC-gap in terms of the number of questions per player, showing that our construction is only quadratically off in that respect. Our results rely on probabilistic estimates on the norm of random matrices and higher-order tensors which may be of independent interest.
Effect of different solutions on color stability of acrylic resin-based dentures.
Goiato, Marcelo Coelho; Nóbrega, Adhara Smith; dos Santos, Daniela Micheline; Andreotti, Agda Marobo; Moreno, Amália
2014-01-01
The aim of this study was to evaluate the effect of thermocycling and immersion in mouthwash or beverage solutions on the color stability of four different acrylic resin-based dentures (Onda Cryl, OC; QC20, QC; Classico, CL; and Lucitone, LU). The factors evaluated were type of acrylic resin, immersion time, and solution (mouthwash or beverage). A total of 224 denture samples were fabricated. For each type of resin, eight samples were immersed in mouthwashes (Plax-Colgate, PC; Listerine, LI; and Oral-B, OB), beverages (coffee, CP; cola, C; and wine, W), and artificial saliva (AS; control). The color change (ΔE) was evaluated before (baseline) and after thermocycling (T1), and after immersion in solution for 1 h (T2), 3 h (T3), 24 h (T4), 48 h (T5), and 96 h (T6). The CIE Lab system was used to determine the color changes. The thermocycling test was performed for 5000 cycles. Data were submitted to three-way repeated-measures analysis of variance and Tukey's test (p<0.05). When the samples were immersed in each mouthwash, all assessed factors, associated or not, significantly influenced the color change values, except that there was no interaction between mouthwash and acrylic resin. Similarly, when the samples were immersed in each beverage, all studied factors influenced the color change values. In general, regardless of the solution, LU exhibited the greatest ΔE values in the period from T1 to T5, and QC presented the greatest ΔE values at T6. Thus, thermocycling and immersion in the various solutions influenced the color stability of acrylic resins, and QC showed the greatest color alteration.
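The ΔE reported in studies like this one is, in its simplest (CIE76) form, the Euclidean distance in CIELAB space between baseline and post-immersion readings. A minimal sketch with illustrative values:

import math

def delta_e_cie76(lab1, lab2):
    """CIE76 color difference between two (L*, a*, b*) triples."""
    return math.dist(lab1, lab2)

baseline = (68.2, 1.5, 12.3)  # illustrative L*, a*, b* readings
after    = (65.9, 2.1, 14.0)
print(f"dE = {delta_e_cie76(baseline, after):.2f}")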
Izumida, Fernanda Emiko; Ribeiro, Roberta Chuqui; Giampaolo, Eunice Teresinha; Machado, Ana Lucia; Pavarina, Ana Cláudia; Vergani, Carlos Eduardo
2011-12-01
This study investigated the effect of microwave disinfection on the roughness of three heat-polymerised acrylic resins after tooth brushing. Microwave disinfection has been recommended to reduce cross-contamination. However, this procedure may also influence the physical and mechanical properties of acrylic resins. Specimens (40 × 20 × 2 mm) of the resins Lucitone 550 (L), QC 20 (QC) and Acron MC (A) were prepared and divided into four groups (n = 10): control groups 1 (C1) and 2 (C2), stored in water for 48 h or 7 days; test groups 1 (MW2) and 2 (MW7), stored in water for 48 h and disinfected (650 W for 6 min) daily for 2 or 7 days, respectively. After treatments, the specimens were placed in a tooth brushing machine at a rate of 60 reciprocal strokes per minute. The specimens were brushed with 20,000 strokes, which represents approximately 2 years of denture cleansing. The surface roughness (Ra) was evaluated before and after the tooth brushing. Data were analysed by two-way ANOVA and Tukey Honestly Significant Difference (HSD) post hoc tests (α = 0.05). The data revealed significant changes between test groups for the A and L resins. Comparison among resins revealed that for MW7, the roughness of A was significantly lower than that of L. After the seven microwave cycles, the roughness values of QC were significantly lower than those of L. The roughness of QC after brushing was not significantly affected by microwave disinfection. For A and L, seven microwave cycles resulted in increased roughness. © 2011 The Gerodontology Society and John Wiley & Sons A/S.
FPGA implementation of high-performance QC-LDPC decoder for optical communications
NASA Astrophysics Data System (ADS)
Zou, Ding; Djordjevic, Ivan B.
2015-01-01
Forward error correction is one of the key technologies enabling the next generation of high-speed fiber-optic communications. Quasi-cyclic (QC) low-density parity-check (LDPC) codes have been considered one of the most promising candidates due to their large coding gain and low implementation complexity. In this paper, we present our designed QC-LDPC code with girth 10 and 25% overhead based on pairwise balanced design. By FPGA-based emulation, we demonstrate that the 5-bit soft-decision LDPC decoder can achieve 11.8 dB net coding gain with no error floor at a BER of 10^-15, without using any outer code or post-processing method. We believe that the proposed single QC-LDPC code is a promising solution for 400 Gb/s optical communication systems and beyond.
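At the heart of a hardware-friendly soft-decision LDPC decoder of this kind is the check-node update, usually implemented as normalized min-sum because it needs only comparisons and sign logic. The sketch below is a generic illustration of that update, not the paper's FPGA design; the normalization factor 0.75 is a typical placeholder.

import numpy as np

def check_node_update(llrs: np.ndarray, alpha: float = 0.75) -> np.ndarray:
    """For each edge, return the sign-product of the other edges' LLRs times
    the minimum of their magnitudes, scaled by the normalization factor."""
    signs = np.sign(llrs)
    total_sign = np.prod(signs)
    mags = np.abs(llrs)
    order = np.argsort(mags)
    min1, min2 = mags[order[0]], mags[order[1]]
    # each edge gets min1, except the minimum edge itself, which gets min2
    out = np.where(np.arange(llrs.size) == order[0], min2, min1)
    return alpha * total_sign * signs * out

print(check_node_update(np.array([+2.1, -0.4, +1.3])))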
7 CFR 275.2 - State agency responsibilities.
Code of Federal Regulations, 2014 CFR
2014-01-01
...: (i) Data collection through management evaluation (ME) reviews and quality control (QC) reviews; (ii... knowledge of either the household or the decision under review. Where there is prior knowledge, the reviewer must disqualify her/himself. Prior knowledge is defined as having: (1) Taken any part in the decision...
7 CFR 275.2 - State agency responsibilities.
Code of Federal Regulations, 2011 CFR
2011-01-01
...: (i) Data collection through management evaluation (ME) reviews and quality control (QC) reviews; (ii... knowledge of either the household or the decision under review. Where there is prior knowledge, the reviewer must disqualify her/himself. Prior knowledge is defined as having: (1) Taken any part in the decision...
7 CFR 275.2 - State agency responsibilities.
Code of Federal Regulations, 2013 CFR
2013-01-01
...: (i) Data collection through management evaluation (ME) reviews and quality control (QC) reviews; (ii... knowledge of either the household or the decision under review. Where there is prior knowledge, the reviewer must disqualify her/himself. Prior knowledge is defined as having: (1) Taken any part in the decision...
7 CFR 275.2 - State agency responsibilities.
Code of Federal Regulations, 2012 CFR
2012-01-01
...: (i) Data collection through management evaluation (ME) reviews and quality control (QC) reviews; (ii... knowledge of either the household or the decision under review. Where there is prior knowledge, the reviewer must disqualify her/himself. Prior knowledge is defined as having: (1) Taken any part in the decision...
7 CFR 275.2 - State agency responsibilities.
Code of Federal Regulations, 2010 CFR
2010-01-01
...: (i) Data collection through management evaluation (ME) reviews and quality control (QC) reviews; (ii... knowledge of either the household or the decision under review. Where there is prior knowledge, the reviewer must disqualify her/himself. Prior knowledge is defined as having: (1) Taken any part in the decision...
DOT National Transportation Integrated Search
2015-01-01
Acceptance of earthwork construction by the Florida Department of Transportation (FDOT) : requires in-place testing conducted with a nuclear density gauge (NDG) to determine : dry density, which must obtain a required percent compaction based upon a ...
Kahn, Maria; LaRue, Nicole; Zhu, Changcheng; Pal, Sampa; Mo, Jack S; Barrett, Lynn K; Hewitt, Steve N; Dumais, Mitchell; Hemmington, Sandra; Walker, Adrian; Joynson, Jeff; Leader, Brandon T; Van Voorhis, Wesley C; Domingo, Gonzalo J
2017-01-01
A large gap for the support of point-of-care testing is the availability of reagents to support quality control (QC) of diagnostic assays along the supply chain from the manufacturer to the end user. While reagents and systems exist to support QC of laboratory screening tests for glucose-6-phosphate dehydrogenase (G6PD) deficiency, they are not configured appropriately to support point-of-care testing. The feasibility of using lyophilized recombinant human G6PD as a QC reagent in novel point-of-care tests for G6PD deficiency is demonstrated. Human recombinant G6PD (r-G6PD) was expressed in Escherichia coli and purified. Aliquots were stored at -80°C. Prior to lyophilization, aliquots were thawed, and three concentrations of r-G6PD (representing normal, intermediate, and deficient clinical G6PD levels) were prepared and mixed with a protective formulation, which protects the enzyme activity against degradation from denaturation during the lyophilization process. Following lyophilization, individual single-use tubes of lyophilized r-G6PD were placed in individual packs with desiccants and stored at five temperatures for one year. An enzyme assay for G6PD activity was used to ascertain the stability of r-G6PD activity while stored at different temperatures. Lyophilized r-G6PD is stable and can be used as a control indicator. Results presented here show that G6PD activity is stable for at least 365 days when stored at -80°C, 4°C, 30°C, and 45°C. When stored at 55°C, enzyme activity was found to be stable only through day 28. Lyophilized r-G6PD enzyme is stable and can be used as a control for point-of-care tests for G6PD deficiency.
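The stability assessment implied above reduces to comparing measured enzyme activity at each timepoint and storage temperature against the day-0 baseline. A sketch of such an acceptance check; the 80% retained-activity threshold and all activity values are illustrative placeholders, not the study's criteria.

BASELINE = 12.0  # activity at day 0 (placeholder units)

def stability_flags(series, threshold=0.80):
    """series: {day: measured activity}; returns {day: 'PASS'/'FAIL'}."""
    return {d: ("PASS" if a / BASELINE >= threshold else "FAIL")
            for d, a in series.items()}

print(stability_flags({28: 11.8, 90: 11.1, 365: 10.9}))  # e.g. 4 C storage
print(stability_flags({28: 11.5, 90: 6.2}))              # e.g. 55 C storage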
40 CFR 98.252 - GHGs to report.
Code of Federal Regulations, 2011 CFR
2011-07-01
... follow the calculation methodologies from § 98.253(f) and the monitoring and QA/QC methods, missing data..., monitoring and QA/QC methods, missing data procedures, reporting requirements, and recordkeeping requirements...
40 CFR 98.252 - GHGs to report.
Code of Federal Regulations, 2013 CFR
2013-07-01
... follow the calculation methodologies from § 98.253(f) and the monitoring and QA/QC methods, missing data..., monitoring and QA/QC methods, missing data procedures, reporting requirements, and recordkeeping requirements...
40 CFR 98.252 - GHGs to report.
Code of Federal Regulations, 2012 CFR
2012-07-01
... follow the calculation methodologies from § 98.253(f) and the monitoring and QA/QC methods, missing data..., monitoring and QA/QC methods, missing data procedures, reporting requirements, and recordkeeping requirements...
A Novel, “Double-Clamp” Binding Mode for Human Heme Oxygenase-1 Inhibition
Rahman, Mona N.; Vlahakis, Jason Z.; Vukomanovic, Dragic; Lee, Wallace; Szarek, Walter A.; Nakatsu, Kanji; Jia, Zongchao
2012-01-01
The development of heme oxygenase (HO) inhibitors is critical in dissecting and understanding the HO system and for potential therapeutic applications. We have established a program to design and optimize HO inhibitors using structure-activity relationships in conjunction with X-ray crystallographic analyses. One of our previous complex crystal structures revealed a putative secondary hydrophobic binding pocket which could be exploited for a new design strategy by introducing a functional group that would fit into this potential site. To test this hypothesis and gain further insights into the structural basis of inhibitor binding, we have synthesized and characterized 1-(1H-imidazol-1-yl)-4,4-diphenyl-2-butanone (QC-308). Using a carbon monoxide (CO) formation assay on rat spleen microsomes, the compound was found to be ∼15 times more potent (IC50 = 0.27±0.07 µM) than its monophenyl analogue, which is already a potent compound in its own right (QC-65; IC50 = 4.0±1.8 µM). The crystal structure of hHO-1 with QC-308 revealed that the second phenyl group in the western region of the compound is indeed accommodated by a definitive secondary proximal hydrophobic pocket. Thus, the two phenyl moieties are each stabilized by distinct hydrophobic pockets. This “double-clamp” binding offers additional inhibitor stabilization and provides a new route for improvement of human heme oxygenase inhibitors. PMID:22276118
A novel, "double-clamp" binding mode for human heme oxygenase-1 inhibition.
Rahman, Mona N; Vlahakis, Jason Z; Vukomanovic, Dragic; Lee, Wallace; Szarek, Walter A; Nakatsu, Kanji; Jia, Zongchao
2012-01-01
The development of heme oxygenase (HO) inhibitors is critical in dissecting and understanding the HO system and for potential therapeutic applications. We have established a program to design and optimize HO inhibitors using structure-activity relationships in conjunction with X-ray crystallographic analyses. One of our previous complex crystal structures revealed a putative secondary hydrophobic binding pocket which could be exploited for a new design strategy by introducing a functional group that would fit into this potential site. To test this hypothesis and gain further insights into the structural basis of inhibitor binding, we have synthesized and characterized 1-(1H-imidazol-1-yl)-4,4-diphenyl-2-butanone (QC-308). Using a carbon monoxide (CO) formation assay on rat spleen microsomes, the compound was found to be ∼15 times more potent (IC(50) = 0.27±0.07 µM) than its monophenyl analogue, which is already a potent compound in its own right (QC-65; IC(50) = 4.0±1.8 µM). The crystal structure of hHO-1 with QC-308 revealed that the second phenyl group in the western region of the compound is indeed accommodated by a definitive secondary proximal hydrophobic pocket. Thus, the two phenyl moieties are each stabilized by distinct hydrophobic pockets. This "double-clamp" binding offers additional inhibitor stabilization and provides a new route for improvement of human heme oxygenase inhibitors.
NASA Technical Reports Server (NTRS)
Barbre, Robert, Jr.
2015-01-01
Assessment of space vehicle loads and trajectories during design requires a large sample of wind profiles at the altitudes where winds affect the vehicle. Traditionally, this altitude region extends from near 8-14 km to address maximum dynamic pressure upon ascent into space, but some applications require knowledge of measured wind profiles at lower altitudes. Such applications include crew capsule pad abort and plume damage analyses. Two Doppler Radar Wind Profiler (DRWP) systems exist at the United States Air Force (USAF) Eastern Range and at the National Aeronautics and Space Administration's Kennedy Space Center. The 50-MHz DRWP provides wind profiles every 3-5 minutes from roughly 2.5-18.5 km, and five 915-MHz DRWPs provide wind profiles every 15 minutes from approximately 0.2-3.0 km. Archived wind profiles from all systems underwent rigorous quality control (QC) processes, and concurrent measurements from the QC'ed 50- and 915-MHz DRWP archives were spliced into individual profiles that extend from about 0.2-18.5 km. The archive contains combined profiles from April 2000 to December 2009, and thousands of profiles during each month are available for use by the launch vehicle community. This paper presents the details of the QC and splice methodology, as well as some attributes of the archive.
Filtered Push: Annotating Distributed Data for Quality Control and Fitness for Use Analysis
NASA Astrophysics Data System (ADS)
Morris, P. J.; Kelly, M. A.; Lowery, D. B.; Macklin, J. A.; Morris, R. A.; Tremonte, D.; Wang, Z.
2009-12-01
The single greatest problem with the federation of scientific data is the assessment of the quality and validity of the aggregated data in the context of particular research problems, that is, its fitness for use. There are three critical data quality issues in networks of distributed natural science collections data, as in all scientific data: identifying and correcting errors, maintaining currency, and assessing fitness for use. To this end, we have designed and implemented a prototype network in the domain of natural science collections. This prototype is built over the open source Map-Reduce platform Hadoop with a network client in the open source collections management system Specify 6. We call this network "Filtered Push" as, at its core, annotations are pushed from the network edges to relevant authoritative repositories, where humans and software filter the annotations before accepting them as changes to the authoritative data. The Filtered Push software is a domain-neutral framework for originating, distributing, and analyzing record-level annotations. Network participants can subscribe to notifications arising from ontology-based analyses of new annotations or of purpose-built queries against the network's global history of annotations. Quality and fitness for use of distributed natural science collections data can be addressed with Filtered Push software by implementing a network that allows data providers and consumers to define potential errors in data, develop metrics for those errors, specify workflows to analyze distributed data to detect potential errors, and close the quality management cycle by providing a network architecture for pushing assertions about data quality, such as corrections, back to the curators of the participating data sets. Quality issues in distributed scientific data have several things in common: (1) Statements about data quality should be regarded as hypotheses about inconsistencies between perhaps several records, data sets, or practices of science. (2) Data quality problems often cannot be detected only from internal statistical correlations or logical analysis, but may need the application of defined workflows that signal illogical output. (3) Changes in scientific theory or practice over time can result in changes in what QC tests should be applied to legacy data. (4) The frequency of some classes of error in a data set may be identifiable without the ability to assert that a particular record is in error. Addressing these issues requires, as does science itself, framing QC hypotheses against data that may be anywhere and may arise at any time in the future. In short, QC for science data is a never-ending process. It must provide for notice to an agent (human or software) that a given dataset supports a hypothesis of inconsistency with a current scientific resource or model, or with potential generalizations of the concepts in a metadata ontology. Like quality control in general, quality control of distributed data is a repeated cyclical process. In implementing a Filtered Push network for quality control, we have a model in which the cost of QC forever is not substantially greater than QC once.
Validation Tests of a Non-Nuclear Combined Asphalt and Soil Density Gauge
2014-04-01
...limit, if applicable. This approach was considered as if the device were to be used on a construction project for quality control, where the material... military contingency construction activities, because they are not sufficiently accurate compared to the NDG for quality-control use in permanent... binder. Nominal asphalt content with water included was 5.2. Average results from the producer's Quality Control (QC) testing... The list of instruments...
Wu, Chunnuan; Liu, Yan; He, Zhonggui; Sun, Jin
2016-01-01
To assess in vivo behavior by an in vitro method, the dissolution test is most often used, both for quality control (QC) and for development purposes. In view of the fact that a dissolution test can hardly achieve two goals at the same time, the design of dissolution testing generally varies with the development stage of drug products, and therefore the selection of dissolution media may change with the goals of the dissolution test. To serve the QC purpose, a dissolution medium is designed to provide a sink condition; for development purposes, the dissolution medium is required to simulate the physiological conditions in the gastrointestinal tract as far as possible. In this review, we intend to provide an initial introduction to the various dissolution media applied for QC and formulation development purposes for poorly water-soluble drugs. We focus on methods applied to provide sink conditions, which are difficult to achieve with simple aqueous buffers for lipophilic drugs, such as the addition of cosolvents or surfactants and the use of biphasic media, and we introduce the development of physiologically relevant media for humans and for animals such as dog and rat with respect to the choice of buffers, bile salts, lipids and so on. In addition, we further discuss the influence of biorelevant dissolution media on the modification of drug Biopharmaceutical Classification System (BCS) classification, especially for BCS class II drugs with low solubility and high permeability, whose solubility is relatively sensitive to the presence of bile salts and lipids.
Quality Control in Clinical Laboratory Samples
2015-01-01
is able to find and correct flaws in the analytical processes of a lab before potentially incorrect patient results are released. According to...verifies that the results produced are accurate and precise. Clinical labs use management of documentation as well as incorporation of a continuous...improvement process to streamline the overall quality control process. QC samples are expected to be identical and tested identically to patient
Smith, D.B.; Woodruff, L.G.; O'Leary, R. M.; Cannon, W.F.; Garrett, R.G.; Kilburn, J.E.; Goldhaber, M.B.
2009-01-01
In 2004, the US Geological Survey (USGS) and the Geological Survey of Canada sampled and chemically analyzed soils along two transects across Canada and the USA in preparation for a planned soil geochemical survey of North America. This effort was a pilot study to test and refine sampling protocols, analytical methods, quality control protocols, and field logistics for the continental survey. A total of 220 sample sites were selected at approximately 40-km intervals along the two transects. The ideal sampling protocol at each site called for a sample from a depth of 0-5 cm and a composite of each of the O, A, and C horizons. The <2-mm fraction of each sample was analyzed for Al, Ca, Fe, K, Mg, Na, S, Ti, Ag, As, Ba, Be, Bi, Cd, Ce, Co, Cr, Cs, Cu, Ga, In, La, Li, Mn, Mo, Nb, Ni, P, Pb, Rb, Sb, Sc, Sn, Sr, Te, Th, Tl, U, V, W, Y, and Zn by inductively coupled plasma-mass spectrometry and inductively coupled plasma-atomic emission spectrometry following a near-total digestion in a mixture of HCl, HNO3, HClO4, and HF. Separate methods were used for Hg, Se, total C, and carbonate-C on this same size fraction. Only Ag, In, and Te had a large percentage of concentrations below the detection limit. Quality control (QC) of the analyses was monitored at three levels: the laboratory performing the analysis, the USGS QC officer, and the principal investigator for the study. This level of review resulted in an average of one QC sample for every 20 field samples, which proved to be minimally adequate for such a large-scale survey. Additional QC samples should be added to monitor within-batch quality to the extent that no more than 10 samples are analyzed between a QC sample. Only Cr (77%), Y (82%), and Sb (80%) fell outside the acceptable limits of accuracy (% recovery between 85 and 115%) because of likely residence in mineral phases resistant to the acid digestion. A separate sample of 0-5-cm material was collected at each site for determination of organic compounds. A subset of 73 of these samples was analyzed for a suite of 19 organochlorine pesticides by gas chromatography. Only three of these samples had detectable pesticide concentrations. A separate sample of A-horizon soil was collected for microbial characterization by phospholipid fatty acid analysis (PLFA), soil enzyme assays, and determination of selected human and agricultural pathogens. Collection, preservation and analysis of samples for both organic compounds and microbial characterization add a great degree of complication to the sampling and preservation protocols and a significant increase to the cost for a continental-scale survey. Both these issues must be considered carefully prior to adopting these parameters as part of the soil geochemical survey of North America.
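Two of the QC criteria in this abstract, the 85-115% recovery window and the recommended maximum of 10 field samples between QC samples, are easy to express as checks. The sketch below is an illustrative reconstruction under those stated limits; the function names and data are invented.

```python
# Sketch of the two QC checks described in the abstract: percent recovery
# against a reference value (acceptable between 85% and 115%) and QC-sample
# spacing (no more than 10 field samples between QC samples).

def percent_recovery(measured: float, certified: float) -> float:
    return 100.0 * measured / certified

def recovery_ok(measured: float, certified: float,
                low: float = 85.0, high: float = 115.0) -> bool:
    return low <= percent_recovery(measured, certified) <= high

# Cr, Y, and Sb failed this test in the pilot study because they reside
# partly in mineral phases resistant to the acid digestion.
print(recovery_ok(77.0, 100.0))  # False: 77% recovery, as for Cr

def qc_interval_ok(batch: list, max_between: int = 10) -> bool:
    """`batch` is a list of 'field'/'qc' labels in analysis order."""
    gap = 0
    for kind in batch:
        if kind == "qc":
            gap = 0
        else:
            gap += 1
            if gap > max_between:
                return False
    return True

print(qc_interval_ok(["field"] * 8 + ["qc"] + ["field"] * 8))  # True
print(qc_interval_ok(["field"] * 15 + ["qc"]))                 # False
```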
Patti Labbe | Frederick National Laboratory for Cancer Research
Looking back, Patti Labbe realizes that she’s always felt at home working in quality control. She got her start in the field soon after graduating college, and it’s been a part of her life ever since. “QC work was a great fit for me,” she says.
This Multi-Site QAPP presents the organization, data quality objectives (DQOs), a set of anticipated activities, sample analysis, data handling, and the specific Quality Assurance/Quality Control (QA/QC) procedures associated with studies done in EPA Region 5.
40 CFR 98.284 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2010 CFR
2010-07-01
... accounting purposes including direct measurement weighing the petroleum coke fed into your process (by belt... used to ensure the accuracy of monthly petroleum coke consumption measurements. (c) For CO2 process... quality assurance and quality control of the supplier data, you must conduct an annual measurement of the...
National Functional Guidelines for Inorganic Superfund Methods Data Review (ISM02.4)
This document is designed to assist the reviewer in evaluating (a) whether the analytical data meet the technical and Quality Control (QC) criteria specified in the SOW, and (b) the usability and extent of bias of any data that do not meet these criteria.
DOT National Transportation Integrated Search
2015-01-01
One of the objectives of this study was to evaluate soil testing equipment based on its capability of measuring in-place stiffness or modulus values. As design criteria transition from empirical to mechanistic-empirical, soil test methods and equip...
Oral Solid Dosage Form Disintegration Testing - The Forgotten Test.
Al-Gousous, Jozef; Langguth, Peter
2015-09-01
Since its inception in the 1930s, disintegration testing has become an important quality control (QC) test in the pharmaceutical industry, and disintegration test procedures for various dosage forms have been described by the different pharmacopoeias, although harmonization among them is still not quite complete. However, because complete disintegration does not necessarily imply complete dissolution, much more research has focused on dissolution than on disintegration testing. Nevertheless, owing to its simplicity, disintegration testing is in some cases an attractive replacement for dissolution testing, as recognized by the International Conference on Harmonization guidelines. Therefore, with proper research to overcome the associated challenges, the full potential of disintegration testing could be tapped, saving considerable effort allocated to QC testing and quality assurance. © 2014 Wiley Periodicals, Inc. and the American Pharmacists Association.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gillespie, B.M.; Stromatt, R.W.; Ross, G.A.
This data package contains the results obtained by Pacific Northwest Laboratory (PNL) staff in the characterization of samples for the 101-SY Hydrogen Safety Project. The samples were submitted for analysis by Westinghouse Hanford Company (WHC) under Technical Project Plan (TPP) 17667 and Quality Assurance Plan MCS-027. They came from a core taken during Window "C" after the May 1991 gas release event. The analytical procedures required for analysis were defined in the Test Instructions (TI) prepared by the PNL 101-SY Analytical Chemistry Laboratory (ACL) Project Management Office in accordance with the TPP and the QA Plan. The requested analysis for these samples was volatile organic analysis. The quality control (QC) requirements for each sample are defined in the Test Instructions for each sample. The QC requirements outlined in the procedures and requested in the WHC statement of work were followed.
Characterisation of Imperial College Reactor Centre legacy waste using gamma-ray spectrometry
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shuhaimi, Alif Imran Mohd
Waste characterisation is a principal component of a waste management strategy. Characterisation includes identification of the chemical, physical, and radiochemical parameters of radioactive waste. Failure to determine specific waste properties may result in sentencing waste packages that are not compliant with the regulations for long-term storage or disposal. This project involved measurement of the intensity and energy of gamma photons emitted by radioactive waste generated during decommissioning of the Imperial College Reactor Centre (ICRC). The measurements used a High Purity Germanium (HPGe) gamma-ray detector with ISOTOPIC-32 V4.1 as the analyser. To ensure that the measurements provide reliable results, two quality control (QC) measurements using different matrices were conducted. The results of the QC measurements were used to determine the accuracy of the ISOTOPIC software.
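The accuracy figure mentioned at the end of this abstract is typically expressed as the percent bias of the software-reported activity against the known activity of a reference source. The minimal sketch below assumes that convention; the numbers and the Cs-137 example are illustrative, not results from the project.

```python
# Percent bias of a reported activity against a certified reference value,
# a common way to express the accuracy of gamma-spectrometry software.
# The convention and the example values are assumptions, not project data.

def percent_bias(measured_bq: float, certified_bq: float) -> float:
    return 100.0 * (measured_bq - certified_bq) / certified_bq

# e.g., software reports 9.6 kBq for a 10.0 kBq Cs-137 check source
print(f"{percent_bias(9600, 10000):+.1f}%")  # -> -4.0%
```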
NASA Astrophysics Data System (ADS)
Mitsuoka, Shigenori; Tamura, Akira
2012-04-01
Assuming that an electron confined by double δ-function barriers is in a quasi-stationary state, we derived the eigenfunctions and eigenenergies of the electron. Applying this point of view to an electron confined in a rectangular quantum corral (QC), we obtained scanning tunneling microscopy (STM) images and scanning tunneling spectra (STS). Our results are consistent with experimental ones, which confirms the validity of the present model. Compared with the treatment in which the corral potential is chosen to be of square-barrier type, the present treatment has the advantage that the eigenvalue equations are simple and that, apart from the bottom of the potential well, only one parameter specifies the potential barrier. On the basis of a Dyson equation for the Green function, we calculated the STM images and STS of a QC having an adsorbed atom inside. These results are also consistent with experimental STM images and STS. In contrast to a previous viewpoint, in which the STS profile of the adsorbed QC is the reverse of that of the empty QC, we conclude that the STS peaks of the adsorbed QC are shifted downward from those of the empty QC.
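For context, the matching conditions for the δ-barrier model can be written down in a few lines. The block below is a generic textbook reconstruction, not the authors' actual eigenvalue equations: it shows the derivative-jump condition that makes the barrier strength α the single barrier parameter the abstract refers to, together with the opaque-barrier limit of the level spectrum.

```latex
% Textbook matching conditions for a double delta-function barrier,
% V(x) = \alpha\,[\delta(x) + \delta(x - L)]; a generic reconstruction,
% not the authors' derivation.
\[
  \psi \ \text{continuous at } x_0 \in \{0, L\}, \qquad
  \psi'(x_0^{+}) - \psi'(x_0^{-}) = \frac{2m\alpha}{\hbar^{2}}\,\psi(x_0).
\]
% The barrier strength \alpha is the single parameter. In the
% opaque-barrier limit \alpha \to \infty, the quasi-stationary levels
% approach those of a hard-walled box of width L:
\[
  k_n L \to n\pi, \qquad
  E_n \to \frac{\hbar^{2}\pi^{2}n^{2}}{2mL^{2}}, \qquad n = 1, 2, \dots
\]
```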
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vile, D; Zhang, L; Cuttino, L
2016-06-15
Purpose: To create a quality assurance program based upon a risk-based assessment of a newly implemented SirSpheres Y-90 procedure. Methods: A process map was created for a newly implemented SirSpheres procedure at a community hospital. The process map documented each step of this collaborative procedure, as well as the roles and responsibilities of each team member. From the process map, potential failure modes were identified, along with any controls currently in place. A full failure mode and effects analysis (FMEA) was then performed by grading each failure mode's likelihood of occurrence, likelihood of detection, and potential severity. These scores were multiplied to compute the risk priority number (RPN) for each potential failure mode, and failure modes were ranked by RPN. Additional controls were then added, with the failure modes carrying the highest RPNs taking priority. Results: The process map succinctly outlined each step of the SirSpheres procedure as currently implemented. From it, 72 potential failure modes were identified and ranked according to their associated RPNs. Quality assurance controls and safety barriers were then added, with the highest-risk failure modes addressed first. Conclusion: A quality assurance program was created from a risk-based assessment of the SirSpheres process. Process mapping and FMEA were effective in identifying potential high-risk failure modes for this new procedure, which were prioritized for new quality assurance controls. TG 100 recommends the fault tree analysis methodology for designing a comprehensive and effective QC/QM program, yet we found that simply introducing additional safety barriers to address high-RPN failure modes makes the whole process simpler and safer.
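The RPN ranking step described in the Methods is straightforward to reproduce. The sketch below assumes the conventional 1-10 grading scales and uses invented failure modes; it is not drawn from the study's list of 72.

```python
# Sketch of the FMEA risk-ranking step described above: the RPN is the
# product of occurrence, detection, and severity scores, and failure modes
# are addressed in descending RPN order. The entries are invented examples.
failure_modes = [
    # (description, occurrence, detection, severity), each scored 1-10
    ("wrong activity drawn into vial", 2, 7, 9),
    ("catheter position not verified", 3, 4, 10),
    ("dose calculation transcription error", 4, 3, 8),
]

ranked = sorted(
    ((desc, occ * det * sev) for desc, occ, det, sev in failure_modes),
    key=lambda pair: pair[1],
    reverse=True,
)
for desc, rpn in ranked:
    print(f"RPN {rpn:4d}  {desc}")
# Highest-RPN modes receive new QC controls / safety barriers first.
```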
40 CFR 98.74 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2013 CFR
2013-07-01
... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Ammonia Manufacturing § 98.74 Monitoring and QA/QC... (c)(8) of this section. (f) [Reserved] (g) If CO2 from ammonia production is used to produce urea at...
40 CFR 98.74 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2012 CFR
2012-07-01
... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Ammonia Manufacturing § 98.74 Monitoring and QA/QC... (c)(8) of this section. (f)[Reserved] (g) If CO2 from ammonia production is used to produce urea at...
40 CFR 98.74 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2011 CFR
2011-07-01
... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Ammonia Manufacturing § 98.74 Monitoring and QA/QC... (c)(8) of this section. (f)[Reserved] (g) If CO2 from ammonia production is used to produce urea at...
40 CFR 98.74 - Monitoring and QA/QC requirements.
Code of Federal Regulations, 2014 CFR
2014-07-01
... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Ammonia Manufacturing § 98.74 Monitoring and QA/QC... (c)(8) of this section. (f) [Reserved] (g) If CO2 from ammonia production is used to produce urea at...
7 CFR 283.15 - Procedure for hearing.
Code of Federal Regulations, 2010 CFR
2010-01-01
... evidence, the QC claim against the State agency for a QC error rate in excess of the tolerance level. The... admissible in evidence subject to such objections as to relevancy, materiality or competency of the testimony...