Sample records for quality control tools

  1. Quality control troubleshooting tools for the mill floor

    Treesearch

    John Dramm

    2000-01-01

    Statistical Process Control (SPC) provides effective tools for improving process quality in the forest products industry, resulting in reduced costs and improved productivity. Implementing SPC helps identify and locate problems that occur in wood products manufacturing. SPC tools achieve their real value when applied on the mill floor for monitoring and troubleshooting...

  2. GenePRIMP: A software quality control tool

    ScienceCinema

    Amrita Pati

    2017-12-09

    Amrita Pati of the DOE Joint Genome Institute's Genome Biology group describes the software tool GenePRIMP and how it fits into the quality control pipeline for microbial genomics. Further details regarding GenePRIMP appear in a paper published online May 2, 2010 in Nature Methods.

  3. Comparison of quality control software tools for diffusion tensor imaging.

    PubMed

    Liu, Bilan; Zhu, Tong; Zhong, Jianhui

    2015-04-01

    Image quality of diffusion tensor imaging (DTI) is critical for image interpretation, diagnostic accuracy and efficiency. However, DTI is susceptible to numerous detrimental artifacts that may impair the reliability and validity of the obtained data. Although many quality control (QC) software tools are being developed and are widely used and each has its different tradeoffs, there is still no general agreement on an image quality control routine for DTIs, and the practical impact of these tradeoffs is not well studied. An objective comparison that identifies the pros and cons of each of the QC tools will be helpful for the users to make the best choice among tools for specific DTI applications. This study aims to quantitatively compare the effectiveness of three popular QC tools including DTI studio (Johns Hopkins University), DTIprep (University of North Carolina at Chapel Hill, University of Iowa and University of Utah) and TORTOISE (National Institute of Health). Both synthetic and in vivo human brain data were used to quantify adverse effects of major DTI artifacts to tensor calculation as well as the effectiveness of different QC tools in identifying and correcting these artifacts. The technical basis of each tool was discussed, and the ways in which particular techniques affect the output of each of the tools were analyzed. The different functions and I/O formats that three QC tools provide for building a general DTI processing pipeline and integration with other popular image processing tools were also discussed. Copyright © 2015 Elsevier Inc. All rights reserved.

  4. Tools for quality control of fingerprint databases

    NASA Astrophysics Data System (ADS)

    Swann, B. Scott; Libert, John M.; Lepley, Margaret A.

    2010-04-01

    Integrity of fingerprint data is essential to biometric and forensic applications. Accordingly, the FBI's Criminal Justice Information Services (CJIS) Division has sponsored development of software tools to facilitate quality control functions relative to maintaining its fingerprint data assets inherent to the Integrated Automated Fingerprint Identification System (IAFIS) and Next Generation Identification (NGI). This paper provides an introduction to two such tools. The first FBI-sponsored tool, developed by the National Institute of Standards and Technology (NIST), examines and detects the spectral signature of the ridge-flow structure characteristic of friction ridge skin. The Spectral Image Validation/Verification (SIVV) utility differentiates fingerprints from non-fingerprints, including blank frames or segmentation failures erroneously included in data; provides a "first look" at image quality; and can identify anomalies in sample rates of scanned images. The SIVV utility might detect errors in individual 10-print fingerprints inaccurately segmented from the flat, multi-finger image acquired by one of the automated collection systems increasing in availability and usage. In such cases, the lost fingerprint can be recovered by re-segmentation from the now compressed multi-finger image record. The second FBI-sponsored tool, CropCoeff, was developed by MITRE and thoroughly tested by NIST. CropCoeff enables cropping of the replacement single print directly from the compressed data file, thus avoiding decompression and recompression of images that might degrade fingerprint features necessary for matching.

  5. Quality Control System using Simple Implementation of Seven Tools for Batik Textile Manufacturing

    NASA Astrophysics Data System (ADS)

    Ragil Suryoputro, Muhammad; Sugarindra, Muchamad; Erfaisalsyah, Hendy

    2017-06-01

    In order to produce better products and mitigate product defects, every company must implement a quality control system that is capable and reliable. One such method is the simple implementation of the seven quality control tools to address defects. The case studied in this research was the defect level of xyz grey fabric on shuttle loom 2 at a Batik manufacturing company. The seven tools comprise: flowchart, check sheet, histogram, scatter diagram, control chart, Pareto diagram and fishbone (cause-and-effect) diagram. The check sheet identified the defect types in the woven xyz grey fabric as warp, double warp, warp break, empty warp, tenuous warp, ugly edges, thick warp, and rust. The control chart analysis indicates that the process is out of control; this can be seen in the chart, where many data points still fall outside the limits. The scatter diagram shows a positive correlation between the percentage of defects and the production volume. Based on the Pareto diagram, repair priority goes to the dominant defect type, warp (44%), and double warp is also the highest on the histogram, with a value of 23635.11 m. In addition, the fishbone diagram analysis shows that double warp and the other defect types originate from materials, methods, machines, measurements, man and environment. The company can thus take preventive and corrective action to minimize defects and improve product quality.
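
    Illustrative sketch (not from the study above): how a check sheet feeds a Pareto analysis that ranks defect types by share and cumulative share; the counts are hypothetical, chosen only so that warp accounts for 44%.

      # Pareto analysis from a check sheet; defect counts are hypothetical.
      from collections import Counter

      check_sheet = Counter({
          "warp": 440, "double warp": 236, "warp break": 120, "empty warp": 80,
          "tenuous warp": 60, "ugly edges": 40, "thick warp": 16, "rust": 8,
      })

      total = sum(check_sheet.values())
      cumulative = 0.0
      for defect, count in check_sheet.most_common():
          share = 100.0 * count / total
          cumulative += share
          print(f"{defect:13s} {count:5d} {share:5.1f}%  cum {cumulative:5.1f}%")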

  6. QCScreen: a software tool for data quality control in LC-HRMS based metabolomics.

    PubMed

    Simader, Alexandra Maria; Kluger, Bernhard; Neumann, Nora Katharina Nicole; Bueschl, Christoph; Lemmens, Marc; Lirk, Gerald; Krska, Rudolf; Schuhmacher, Rainer

    2015-10-24

    Metabolomics experiments often comprise large numbers of biological samples resulting in huge amounts of data. This data needs to be inspected for plausibility before data evaluation to detect putative sources of error e.g. retention time or mass accuracy shifts. Especially in liquid chromatography-high resolution mass spectrometry (LC-HRMS) based metabolomics research, proper quality control checks (e.g. for precision, signal drifts or offsets) are crucial prerequisites to achieve reliable and comparable results within and across experimental measurement sequences. Software tools can support this process. The software tool QCScreen was developed to offer a quick and easy data quality check of LC-HRMS derived data. It allows a flexible investigation and comparison of basic quality-related parameters within user-defined target features and the possibility to automatically evaluate multiple sample types within or across different measurement sequences in a short time. It offers a user-friendly interface that allows an easy selection of processing steps and parameter settings. The generated results include a coloured overview plot of data quality across all analysed samples and targets and, in addition, detailed illustrations of the stability and precision of the chromatographic separation, the mass accuracy and the detector sensitivity. The use of QCScreen is demonstrated with experimental data from metabolomics experiments using selected standard compounds in pure solvent. The application of the software identified problematic features, samples and analytical parameters and suggested which data files or compounds required closer manual inspection. QCScreen is an open source software tool which provides a useful basis for assessing the suitability of LC-HRMS data prior to time consuming, detailed data processing and subsequent statistical analysis. It accepts the generic mzXML format and thus can be used with many different LC-HRMS platforms to process both multiple

  7. Colorimetry as Quality Control Tool for Individual Inkjet-Printed Pediatric Formulations.

    PubMed

    Wickström, Henrika; Nyman, Johan O; Indola, Mathias; Sundelin, Heidi; Kronberg, Leif; Preis, Maren; Rantanen, Jukka; Sandler, Niklas

    2017-02-01

    Printing technologies were recently introduced to the pharmaceutical field for manufacturing of drug delivery systems. Printing allows on demand manufacturing of flexible pharmaceutical doses in a personalized manner, which is critical for a successful and safe treatment of patient populations with specific needs, such as children and the elderly, and patients facing multimorbidity. Printing of pharmaceuticals as a technique generates new demands on the quality control procedures. For example, rapid quality control is needed as the printing can be done on demand and at the point of care. This study evaluated the potential use of a handheld colorimetry device for quality control of printed doses of vitamin Bs on edible rice and sugar substrates. The structural features of the substrates with and without ink were also compared. A multicomponent ink formulation with vitamin B1, B2, B3, and B6 was developed. Doses (4 cm²) were prepared by applying 1-10 layers of yellow ink onto the white substrates using thermal inkjet technology. The colorimetric method was seen to be viable in detecting doses up to the 5th and 6th printed layers, until color saturation of the yellow color parameter (b*) was observed on the substrates. Liquid chromatography mass spectrometry was used as a reference method for the colorimetry measurements plotted against the number of printed layers. It was concluded that colorimetry could be used as a quality control tool for detection of different doses. However, optimization of the color addition needs to be done to avoid color saturation within the planned dose interval.

  8. An Automated Treatment Plan Quality Control Tool for Intensity-Modulated Radiation Therapy Using a Voxel-Weighting Factor-Based Re-Optimization Algorithm.

    PubMed

    Song, Ting; Li, Nan; Zarepisheh, Masoud; Li, Yongbao; Gautier, Quentin; Zhou, Linghong; Mell, Loren; Jiang, Steve; Cerviño, Laura

    2016-01-01

    Intensity-modulated radiation therapy (IMRT) currently plays an important role in radiotherapy, but its treatment plan quality can vary significantly among institutions and planners. Treatment plan quality control (QC) is a necessary component for individual clinics to ensure that patients receive treatments with high therapeutic gain ratios. The voxel-weighting factor-based plan re-optimization mechanism has been proved able to explore a larger Pareto surface (solution domain) and therefore increase the possibility of finding an optimal treatment plan. In this study, we incorporated additional modules into an in-house developed voxel weighting factor-based re-optimization algorithm, which was enhanced as a highly automated and accurate IMRT plan QC tool (TPS-QC tool). After importing an under-assessment plan, the TPS-QC tool was able to generate a QC report within 2 minutes. This QC report contains the plan quality determination as well as information supporting the determination. Finally, the IMRT plan quality can be controlled by approving quality-passed plans and replacing quality-failed plans using the TPS-QC tool. The feasibility and accuracy of the proposed TPS-QC tool were evaluated using 25 clinically approved cervical cancer patient IMRT plans and 5 manually created poor-quality IMRT plans. The results showed high consistency between the QC report quality determinations and the actual plan quality. In the 25 clinically approved cases that the TPS-QC tool identified as passed, a greater difference could be observed for dosimetric endpoints for organs at risk (OAR) than for planning target volume (PTV), implying that better dose sparing could be achieved in OAR than in PTV. In addition, the dose-volume histogram (DVH) curves of the TPS-QC tool re-optimized plans satisfied the dosimetric criteria more frequently than did the under-assessment plans. In addition, the criteria for unsatisfied dosimetric endpoints in the 5 poor-quality plans could typically be

  9. An Automated Treatment Plan Quality Control Tool for Intensity-Modulated Radiation Therapy Using a Voxel-Weighting Factor-Based Re-Optimization Algorithm

    PubMed Central

    Song, Ting; Li, Nan; Zarepisheh, Masoud; Li, Yongbao; Gautier, Quentin; Zhou, Linghong; Mell, Loren; Jiang, Steve; Cerviño, Laura

    2016-01-01

    Intensity-modulated radiation therapy (IMRT) currently plays an important role in radiotherapy, but its treatment plan quality can vary significantly among institutions and planners. Treatment plan quality control (QC) is a necessary component for individual clinics to ensure that patients receive treatments with high therapeutic gain ratios. The voxel-weighting factor-based plan re-optimization mechanism has been proved able to explore a larger Pareto surface (solution domain) and therefore increase the possibility of finding an optimal treatment plan. In this study, we incorporated additional modules into an in-house developed voxel weighting factor-based re-optimization algorithm, which was enhanced as a highly automated and accurate IMRT plan QC tool (TPS-QC tool). After importing an under-assessment plan, the TPS-QC tool was able to generate a QC report within 2 minutes. This QC report contains the plan quality determination as well as information supporting the determination. Finally, the IMRT plan quality can be controlled by approving quality-passed plans and replacing quality-failed plans using the TPS-QC tool. The feasibility and accuracy of the proposed TPS-QC tool were evaluated using 25 clinically approved cervical cancer patient IMRT plans and 5 manually created poor-quality IMRT plans. The results showed high consistency between the QC report quality determinations and the actual plan quality. In the 25 clinically approved cases that the TPS-QC tool identified as passed, a greater difference could be observed for dosimetric endpoints for organs at risk (OAR) than for planning target volume (PTV), implying that better dose sparing could be achieved in OAR than in PTV. In addition, the dose-volume histogram (DVH) curves of the TPS-QC tool re-optimized plans satisfied the dosimetric criteria more frequently than did the under-assessment plans. In addition, the criteria for unsatisfied dosimetric endpoints in the 5 poor-quality plans could typically be

  10. Development and implementation of an audit tool for quality control of parenteral nutrition.

    PubMed

    García-Rodicio, Sonsoles; Abajo, Celia; Godoy, Mercedes; Catalá, Miguel Angel

    2009-01-01

    The aim of this article is to describe the development of a quality control methodology applied to patients receiving parenteral nutrition (PN) and to present the results obtained over the past 10 years. Development of the audit tool: In 1995, a total of 13 PN quality criteria and their standards were defined based on literature and past experiences. They were applied during 5 different 6-month audits carried out in subsequent years. According to the results of each audit, the criteria with lower validity were eliminated, while others were optimized and new criteria were introduced to complete the monitoring of other areas not previously examined. Currently, the quality control process includes 22 quality criteria and their standards that examine the following 4 different areas: (1) indication and duration of PN; (2) nutrition assessment, adequacy of the nutrition support, and monitoring; (3) metabolic and infectious complications; and (4) global efficacy of the nutrition support regimen. The authors describe the current definition of each criterion and present the results obtained in the 5 audits performed. In the past year, 9 of the 22 criteria reached the predefined standards. The areas detected for further improvements were: indication for PN, nutrition assessment, and management of catheter infections. The definition of quality criteria and their standards is an efficient method of providing a qualitative and quantitative analysis of the clinical care of patients receiving PN. It detects areas for improvement and assists in developing a methodology to work efficiently.

  11. The IEO Data Center Management System: Tools for quality control, analysis and access marine data

    NASA Astrophysics Data System (ADS)

    Casas, Antonia; Garcia, Maria Jesus; Nikouline, Andrei

    2010-05-01

    Since 1994, the Data Centre of the Spanish Oceanographic Institute (IEO) has been developing systems for archiving and quality control of oceanographic data. The work started within the framework of the European Marine Science & Technology Programme (MAST), when a consortium of several Mediterranean Data Centres began to work on the MEDATLAS project. Over the years, old software modules for MS DOS were rewritten, improved and migrated to the Windows environment. Oceanographic data quality control now includes not only vertical profiles (mainly CTD and bottle observations) but also time series of currents and sea level observations. New powerful routines for analysis and for graphic visualization were added. Data originally presented in ASCII format were recently organized in an open source MySQL database. Nowadays, the IEO, as part of the SeaDataNet Infrastructure, has designed and developed a new information system, consistent with the ISO 19115 and SeaDataNet standards, in order to manage the large and diverse marine data and information originated in Spain by different sources, and to interoperate with SeaDataNet. The system works with data stored in ASCII files (MEDATLAS, ODV) as well as data stored within the relational database. The components of the system are: 1. MEDATLAS Format and Quality Control - QCDAMAR: Quality Control of Marine Data, the main set of tools for working with data presented as text files. It includes extended quality control (searching for duplicated cruises and profiles; checking date, position, ship velocity, constant profiles, spikes, density inversion, sounding, acceptable data, impossible regional values,...) and input/output filters. - QCMareas: A set of procedures for the quality control of tide gauge data according to the standards of the international Sea Level Observing System. These procedures include checking for unexpected anomalies in the time series, interpolation, filtering, and computation of basic statistics and residuals. 2. DAMAR: A relational database (MySql) designed to
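
    Illustrative sketch (not the QCDAMAR code): two of the generic profile checks listed above, a neighbour-based spike test and a regional range test; the thresholds and temperature values are arbitrary placeholders, not MEDATLAS settings.

      # Illustrative profile QC checks; thresholds are arbitrary placeholders.
      def spike_flags(values, threshold=2.0):
          """Flag point i if it departs sharply from its two neighbours."""
          flags = [False] * len(values)
          for i in range(1, len(values) - 1):
              test = abs(values[i] - (values[i - 1] + values[i + 1]) / 2.0) \
                     - abs(values[i + 1] - values[i - 1]) / 2.0
              flags[i] = test > threshold
          return flags

      def range_flags(values, lo, hi):
          """Flag values outside a regionally acceptable range."""
          return [not (lo <= v <= hi) for v in values]

      temperature = [12.1, 12.0, 19.5, 11.8, 11.7, 11.6]   # degrees C, made up
      print(spike_flags(temperature))
      print(range_flags(temperature, lo=-2.0, hi=35.0))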

  12. Quality and Efficiency Improvement Tools for Every Radiologist.

    PubMed

    Kudla, Alexei U; Brook, Olga R

    2018-06-01

    In an era of value-based medicine, data-driven quality improvement is more important than ever to ensure safe and efficient imaging services. Familiarity with high-value tools enables all radiologists to successfully engage in quality and efficiency improvement. In this article, we review the model for improvement, strategies for measurement, and common practical tools with real-life examples that include Run chart, Control chart (Shewhart chart), Fishbone (Cause-and-Effect or Ishikawa) diagram, Pareto chart, 5 Whys, and Root Cause Analysis. Copyright © 2018 The Association of University Radiologists. Published by Elsevier Inc. All rights reserved.
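
    Illustrative sketch (not from the article above): Shewhart control limits for an individuals/moving-range (XmR) chart, one of the tools listed; the turnaround-time values are invented.

      # Individuals (XmR) control chart limits; limits use 2.66 * mean moving range.
      data = [31, 28, 35, 30, 29, 33, 27, 52, 32, 30, 31, 29]  # e.g. report turnaround, minutes

      mean = sum(data) / len(data)
      moving_ranges = [abs(b - a) for a, b in zip(data, data[1:])]
      mr_bar = sum(moving_ranges) / len(moving_ranges)

      ucl = mean + 2.66 * mr_bar   # upper control limit
      lcl = mean - 2.66 * mr_bar   # lower control limit

      for i, x in enumerate(data):
          note = " <-- out of control" if not (lcl <= x <= ucl) else ""
          print(f"point {i:2d}: {x:5.1f}{note}")
      print(f"centre {mean:.1f}, UCL {ucl:.1f}, LCL {lcl:.1f}")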

  13. Control by quality: proposition of a typology.

    PubMed

    Pujo, P; Pillet, M

    The application of Quality tools and methods in industrial management has always had a fundamental impact on the control of production. It influences the behavior of the actors concerned, while introducing the necessary notions and formalizations, especially for production systems with little or no automation, which constitute a large part of the industrial activity. Several quality approaches are applied in the workshop and are implemented at the level of the control. In this paper, the authors present a typology of the various approaches that have successively influenced control, such as statistical process control, quality assurance, and continuous improvement. First the authors present a parallel between production control and quality organizational structure. They note the duality between control, which is aimed at increasing productivity, and quality, which aims to satisfy the needs of the customer. They also note the hierarchical organizational structure of these two systems of management with, at each level, the notion of a feedback loop. This notion is fundamental to any kind of decision making. The paper is organized around the operational, tactical, and strategic levels, by describing for each level the main methods and tools for control by quality. The overview of these tools and methods starts at the operational level, with the Statistical Process Control, the Taguchi technique, and the "six sigma" approach. On the tactical level, we find a quality system approach, with a documented description of the procedures introduced in the firm. The management system can refer here to Quality Assurance, Total Productive Maintenance, or Management by Total Quality. The formalization through procedures of the rules of decision governing the process control enhances the validity of these rules. This leads to the enhancement of their reliability and to their consolidation. All this counterbalances the human, intrinsically fluctuating, behavior of the control

  14. Making Quality Sense: A Guide to Quality, Tools and Techniques, Awards and the Thinking Behind Them.

    ERIC Educational Resources Information Center

    Owen, Jane

    This document is intended to guide further education colleges and work-based learning providers through some of the commonly used tools, techniques, and theories of quality management. The following are among the topics discussed: (1) various ways of defining quality; methods used by organizations to achieve quality (quality control, quality…

  15. Assessing Educational Processes Using Total-Quality-Management Measurement Tools.

    ERIC Educational Resources Information Center

    Macchia, Peter, Jr.

    1993-01-01

    Discussion of the use of Total Quality Management (TQM) assessment tools in educational settings highlights and gives examples of fishbone diagrams, or cause and effect charts; Pareto diagrams; control charts; histograms and check sheets; scatter diagrams; and flowcharts. Variation and quality are discussed in terms of continuous process…

  16. Perceptual tools for quality-aware video networks

    NASA Astrophysics Data System (ADS)

    Bovik, A. C.

    2014-01-01

    Monitoring and controlling the quality of the viewing experience of videos transmitted over increasingly congested networks (especially wireless networks) is a pressing problem owing to rapid advances in video-centric mobile communication and display devices that are straining the capacity of the network infrastructure. New developments in automatic perceptual video quality models offer tools that have the potential to be used to perceptually optimize wireless video, leading to more efficient video data delivery and better received quality. In this talk I will review key perceptual principles that are, or could be used to create effective video quality prediction models, and leading quality prediction models that utilize these principles. The goal is to be able to monitor and perceptually optimize video networks by making them "quality-aware."

  17. Quality Dashboards: Technical and Architectural Considerations of an Actionable Reporting Tool for Population Management

    PubMed Central

    Olsha-Yehiav, Maya; Einbinder, Jonathan S.; Jung, Eunice; Linder, Jeffrey A.; Greim, Julie; Li, Qi; Schnipper, Jeffrey L.; Middleton, Blackford

    2006-01-01

    Quality Dashboards (QD) is a condition-specific, actionable web-based application for quality reporting and population management that is integrated into the Electronic Health Record (EHR). Using server-based graphic web controls in a .Net environment to construct Quality Dashboards allows customization of the reporting tool without the need to rely on a commercial business intelligence tool. Quality Dashboards will improve patient care and quality outcomes as clinicians utilize the reporting tool for population management. PMID:17238671

  18. Single-case synthesis tools I: Comparing tools to evaluate SCD quality and rigor.

    PubMed

    Zimmerman, Kathleen N; Ledford, Jennifer R; Severini, Katherine E; Pustejovsky, James E; Barton, Erin E; Lloyd, Blair P

    2018-03-03

    Tools for evaluating the quality and rigor of single case research designs (SCD) are often used when conducting SCD syntheses. Preferred components include evaluations of design features related to the internal validity of SCD to obtain quality and/or rigor ratings. Three tools for evaluating the quality and rigor of SCD (Council for Exceptional Children, What Works Clearinghouse, and Single-Case Analysis and Design Framework) were compared to determine if conclusions regarding the effectiveness of antecedent sensory-based interventions for young children changed based on choice of quality evaluation tool. Evaluation of SCD quality differed across tools, suggesting selection of quality evaluation tools impacts evaluation findings. Suggestions for selecting an appropriate quality and rigor assessment tool are provided and across-tool conclusions are drawn regarding the quality and rigor of studies. Finally, authors provide guidance for using quality evaluations in conjunction with outcome analyses when conducting syntheses of interventions evaluated in the context of SCD. Copyright © 2018 Elsevier Ltd. All rights reserved.

  19. Average of delta: a new quality control tool for clinical laboratories.

    PubMed

    Jones, Graham R D

    2016-01-01

    Average of normals is a tool used to control assay performance using the average of a series of results from patients' samples. Delta checking is a process of identifying errors in individual patient results by reviewing the difference from previous results of the same patient. This paper introduces a novel alternate approach, average of delta, which combines these concepts to use the average of a number of sequential delta values to identify changes in assay performance. Models for average of delta and average of normals were developed in a spreadsheet application. The model assessed the expected scatter of average of delta and average of normals functions and the effect of assay bias for different values of analytical imprecision and within- and between-subject biological variation and the number of samples included in the calculations. The final assessment was the number of patients' samples required to identify an added bias with 90% certainty. The model demonstrated that with larger numbers of delta values, the average of delta function was tighter (lower coefficient of variation). The optimal number of samples for bias detection with average of delta was likely to be between 5 and 20 for most settings and that average of delta outperformed average of normals when the within-subject biological variation was small relative to the between-subject variation. Average of delta provides a possible additional assay quality control tool which theoretical modelling predicts may be more valuable than average of normals for analytes where the group biological variation is wide compared with within-subject variation and where there is a high rate of repeat testing in the laboratory patient population. © The Author(s) 2015.
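
    Illustrative sketch (not the paper's spreadsheet model): the bookkeeping behind average of delta, keeping the last result per patient and averaging the most recent deltas; the three-delta window and the toy results are arbitrary (the paper suggests roughly 5 to 20 deltas for most settings).

      # Average of delta: average the most recent (current - previous) differences
      # across patients to watch for a shift in assay performance.
      from collections import deque

      previous_result = {}              # last result seen per patient
      recent_deltas = deque(maxlen=3)   # illustrative window size

      def add_result(patient_id, value):
          if patient_id in previous_result:
              recent_deltas.append(value - previous_result[patient_id])
          previous_result[patient_id] = value
          if len(recent_deltas) == recent_deltas.maxlen:
              avg_delta = sum(recent_deltas) / len(recent_deltas)
              print(f"average of delta = {avg_delta:+.2f}")  # sustained shift suggests bias

      # toy stream of (patient, result)
      for pid, value in [("a", 140), ("b", 138), ("a", 141), ("b", 139), ("a", 142)]:
          add_result(pid, value)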

  20. A mask quality control tool for the OSIRIS multi-object spectrograph

    NASA Astrophysics Data System (ADS)

    López-Ruiz, J. C.; Vaz Cedillo, Jacinto Javier; Ederoclite, Alessandro; Bongiovanni, Ángel; González Escalera, Víctor

    2012-09-01

    The OSIRIS multi-object spectrograph uses a set of user-customised masks, which are manufactured on demand. The manufacturing process consists of drilling the specified slits on the mask with the required accuracy. Ensuring that slits are in the right place when observing is of vital importance. We present a tool for checking the quality of the mask manufacturing process, based on analyzing instrument images obtained with the manufactured masks in place. The tool extracts the slit information from these images, relates specifications with the extracted slit information, and finally communicates to the operator whether the manufactured mask fulfills the expectations of the mask designer. The proposed tool has been built using scripting languages and standard libraries such as opencv, pyraf and scipy. The software architecture, advantages and limits of this tool in the lifecycle of a multi-object acquisition are presented.

  1. Statistical Process Control: A Quality Tool for a Venous Thromboembolic Disease Registry.

    PubMed

    Posadas-Martinez, Maria Lourdes; Rojas, Liliana Paloma; Vazquez, Fernando Javier; De Quiros, Fernan Bernaldo; Waisman, Gabriel Dario; Giunta, Diego Hernan

    2016-01-01

    We aim to describe Statistical Process Control as a quality tool for the Institutional Registry of Venous Thromboembolic Disease (IRTD), a registry developed in a community-care tertiary hospital in Buenos Aires, Argentina. The IRTD is a prospective cohort. The process of data acquisition began with the creation of a computerized alert generated whenever physicians requested an imaging or laboratory study to diagnose venous thromboembolism, which defined eligible patients. The process then followed a structured methodology for patient inclusion, evaluation, and subsequent data entry. To control this process, process performance indicators were designed to be measured monthly. These included the number of eligible patients, the number of included patients, median time to patient evaluation, and percentage of patients lost to evaluation. Control charts were graphed for each indicator. The registry was evaluated over 93 months, during which 25,757 patients were reported and 6,798 patients met inclusion criteria. The median time to evaluation was 20 hours (SD, 12) and 7.7% of the total was lost to evaluation. Each indicator presented trends over time, caused by structural changes and improvement cycles, and therefore the central limit suffered inflexions. Statistical process control through process performance indicators allowed us to control the performance of the registry over time to detect systematic problems. We postulate that this approach could be reproduced for other clinical registries.
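
    Illustrative sketch (the abstract does not specify the chart type): a p-chart is one conventional way to track a monthly proportion indicator such as "lost to evaluation"; the monthly counts below are invented.

      # p-chart sketch for a monthly proportion indicator; counts are invented.
      # Limits are p_bar +/- 3*sqrt(p_bar*(1-p_bar)/n) for each month's sample size n.
      import math

      lost = [6, 8, 5, 9, 7, 18, 6, 8]            # patients lost to evaluation per month
      eligible = [90, 95, 88, 92, 90, 94, 91, 89]  # eligible patients per month

      p_bar = sum(lost) / sum(eligible)
      for month, (x, n) in enumerate(zip(lost, eligible), start=1):
          sigma = math.sqrt(p_bar * (1 - p_bar) / n)
          ucl, lcl = p_bar + 3 * sigma, max(0.0, p_bar - 3 * sigma)
          p = x / n
          flag = " <-- investigate" if p > ucl or p < lcl else ""
          print(f"month {month}: p={p:.3f} (UCL {ucl:.3f}, LCL {lcl:.3f}){flag}")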

  2. Handling Qualities Evaluation of Pilot Tools for Spacecraft Docking in Earth Orbit

    NASA Technical Reports Server (NTRS)

    Bilimoria, Karl D.; Mueller, Eric; Frost, Chad

    2009-01-01

    A new generation of spacecraft is now under development by NASA to replace the Space Shuttle and return astronauts to the Moon. These spacecraft will have a manual control capability for several mission tasks, and the ease and precision with which pilots can execute these tasks will have an important effect on mission risk and training costs. This paper focuses on the handling qualities of a spacecraft based on dynamics similar to that of the Crew Exploration Vehicle, during the last segment of the docking task with a space station in low Earth orbit. A previous study established that handling qualities for this task degrade significantly as the level of translation-into-rotation coupling increases. The goal of this study is to evaluate the efficacy of various pilot aids designed to mitigate the handling qualities degradation caused by this coupling. Four pilot tools were evaluated: dead-band box/indicator, flight-path marker, translation guidance cues, and feed-forward control. Each of these pilot tools improved handling qualities, generally with greater improvements resulting from using these tools in combination. A key result of this study is that feed-forward control effectively counteracts coupling effects, providing solid Level 1 handling qualities for the spacecraft configuration evaluated.

  3. Visualization and Quality Control Web Tools for CERES Products

    NASA Astrophysics Data System (ADS)

    Mitrescu, C.; Doelling, D. R.

    2017-12-01

    The NASA CERES project continues to provide the scientific community with a wide variety of satellite-derived data products such as observed TOA broadband shortwave and longwave fluxes, computed TOA and Surface fluxes, as well as cloud, aerosol, and other atmospheric parameters. They encompass a wide range of temporal and spatial resolutions, suited to specific applications. CERES data is used mostly by climate modeling communities but also by a wide variety of educational institutions. To better serve our users, a web-based Ordering and Visualization Tool (OVT) was developed by using Open Source Software such as Eclipse, java, javascript, OpenLayer, Flot, Google Maps, python, and others. Due to increased demand by our own scientists, we also implemented a series of specialized functions to be used in the process of CERES Data Quality Control (QC) such as 1- and 2-D histograms, anomalies and differences, temporal and spatial averaging, side-by-side parameter comparison, and others that made the process of QC far easier and faster, but more importantly far more portable. With the integration of ground-site observed surface fluxes, we further help the CERES project QC the CERES computed surface fluxes. An overview of the CERES OVT basic functions using Open Source Software, as well as future steps in expanding its capabilities will be presented at the meeting.

  4. Patient-specific dosimetric endpoints based treatment plan quality control in radiotherapy.

    PubMed

    Song, Ting; Staub, David; Chen, Mingli; Lu, Weiguo; Tian, Zhen; Jia, Xun; Li, Yongbao; Zhou, Linghong; Jiang, Steve B; Gu, Xuejun

    2015-11-07

    In intensity modulated radiotherapy (IMRT), the optimal plan for each patient is specific due to unique patient anatomy. To achieve such a plan, patient-specific dosimetric goals reflecting each patient's unique anatomy should be defined and adopted in the treatment planning procedure for plan quality control. This study aims to develop such a personalized treatment plan quality control tool by predicting patient-specific dosimetric endpoints (DEs). The incorporation of patient-specific DEs is realized by a multi-OAR geometry-dosimetry model, capable of predicting optimal DEs based on the individual patient's geometry. The overall quality of a treatment plan is then judged with a numerical treatment plan quality indicator and characterized as optimal or suboptimal. Taking advantage of clinically available prostate volumetric modulated arc therapy (VMAT) treatment plans, we built and evaluated our proposed plan quality control tool. Using our developed tool, six of twenty evaluated plans were identified as suboptimal plans. After plan re-optimization, these suboptimal plans achieved better OAR dose sparing without sacrificing the PTV coverage, and the dosimetric endpoints of the re-optimized plans agreed well with the model-predicted values, which validates the predictability of the proposed tool. In conclusion, the developed tool is able to accurately predict optimally achievable DEs of multiple OARs, identify suboptimal plans, and guide plan optimization. It is a useful tool for achieving patient-specific treatment plan quality control.

  5. Innovative tools for quality assessment: integrated quality criteria for review of multiple study designs (ICROMS).

    PubMed

    Zingg, W; Castro-Sanchez, E; Secci, F V; Edwards, R; Drumright, L N; Sevdalis, N; Holmes, A H

    2016-04-01

    With the aim to facilitate a more comprehensive review process in public health including patient safety, we established a tool that we have termed ICROMS (Integrated quality Criteria for the Review Of Multiple Study designs), which unifies, integrates and refines current quality criteria for a large range of study designs including qualitative research. Review, pilot testing and expert consensus. The tool is the result of an iterative four phase process over two years: 1) gathering of established criteria for assessing controlled, non-controlled and qualitative study designs; 2) pilot testing of a first version in two systematic reviews on behavioural change in infection prevention and control and in antibiotic prescribing; 3) further refinement and adding of additional study designs in the context of the European Centre for Disease Prevention and Control funded project 'Systematic review and evidence-based guidance on organisation of hospital infection control programmes' (SIGHT); 4) scrutiny by the pan-European expert panel of the SIGHT project, which had the objective of ensuring robustness of the systematic review. ICROMS includes established quality criteria for randomised studies, controlled before-and-after studies and interrupted time series, and incorporates criteria for non-controlled before-and-after studies, cohort studies and qualitative studies. The tool consists of two parts: 1) a list of quality criteria specific for each study design, as well as criteria applicable across all study designs by using a scoring system; 2) a 'decision matrix', which specifies the robustness of the study by identifying minimum requirements according to the study type and the relevance of the study to the review question. The decision matrix directly determines inclusion or exclusion of a study in the review. ICROMS was applied to a series of systematic reviews to test its feasibility and usefulness in the appraisal of multiple study designs. The tool was applicable

  6. Statistical quality control through overall vibration analysis

    NASA Astrophysics Data System (ADS)

    Carnero, M.ª Carmen; González-Palma, Rafael; Almorza, David; Mayorga, Pedro; López-Escobar, Carlos

    2010-05-01

    The present study introduces the concept of statistical quality control in automotive wheel bearings manufacturing processes. Defects on products under analysis can have a direct influence on passengers' safety and comfort. At present, the use of vibration analysis on machine tools for quality control purposes is not very extensive in manufacturing facilities. Noise and vibration are common quality problems in bearings. These failure modes likely occur under certain operating conditions and do not require high vibration amplitudes but relate to certain vibration frequencies. The vibration frequencies are affected by the type of surface problems (chattering) of ball races that are generated through grinding processes. The purpose of this paper is to identify grinding process variables that affect the quality of bearings by using statistical principles in the field of machine tools. In addition, the quality results of the finished parts under different combinations of process variables are assessed. This paper intends to establish the foundations to predict the quality of the products through the analysis of self-induced vibrations during the contact between the grinding wheel and the parts. To achieve this goal, the overall self-induced vibration readings under different combinations of process variables are analysed using statistical tools. The analysis of data and design of experiments follows a classical approach, considering all potential interactions between variables. The analysis of data is conducted through analysis of variance (ANOVA) for data sets that meet normality and homoscedasticity criteria. This paper utilizes different statistical tools to support the conclusions, such as chi-squared, Shapiro-Wilk, symmetry, kurtosis, Cochran, Bartlett, Hartley, and Kruskal-Wallis tests. The analysis presented is the starting point to extend the use of predictive techniques (vibration analysis) for quality control. This paper demonstrates the existence
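
    Illustrative sketch (not the study's data or code): the decision flow implied above, checking normality and homoscedasticity before choosing between one-way ANOVA and a non-parametric alternative, using routines available in scipy.

      # Check normality and equal variances, then compare groups with ANOVA or
      # Kruskal-Wallis. The vibration readings are invented placeholders.
      from scipy import stats

      group_a = [0.41, 0.39, 0.45, 0.42, 0.40, 0.44]   # overall vibration, setting A
      group_b = [0.52, 0.55, 0.50, 0.53, 0.51, 0.54]   # setting B
      group_c = [0.47, 0.46, 0.49, 0.48, 0.45, 0.50]   # setting C
      groups = (group_a, group_b, group_c)

      normal = all(stats.shapiro(g)[1] > 0.05 for g in groups)   # Shapiro-Wilk
      equal_var = stats.bartlett(*groups)[1] > 0.05              # Bartlett

      if normal and equal_var:
          result = stats.f_oneway(*groups)    # one-way ANOVA
      else:
          result = stats.kruskal(*groups)     # non-parametric fallback
      print(result)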

  7. QC-ART: A tool for real-time quality control assessment of mass spectrometry-based proteomics data.

    PubMed

    Stanfill, Bryan A; Nakayasu, Ernesto S; Bramer, Lisa M; Thompson, Allison M; Ansong, Charles K; Clauss, Therese; Gritsenko, Marina A; Monroe, Matthew E; Moore, Ronald J; Orton, Daniel J; Piehowski, Paul D; Schepmoes, Athena A; Smith, Richard D; Webb-Robertson, Bobbie-Jo; Metz, Thomas O

    2018-04-17

    Liquid chromatography-mass spectrometry (LC-MS)-based proteomics studies of large sample cohorts can easily require from months to years to complete. Acquiring consistent, high-quality data in such large-scale studies is challenging because of normal variations in instrumentation performance over time, as well as artifacts introduced by the samples themselves, such as those due to collection, storage and processing. Existing quality control methods for proteomics data primarily focus on post-hoc analysis to remove low-quality data that would degrade downstream statistics; they are not designed to evaluate the data in near real-time, which would allow for interventions as soon as deviations in data quality are detected. In addition to flagging analyses that demonstrate outlier behavior, evaluating how the data structure changes over time can aid in understanding typical instrument performance or identify issues such as a degradation in data quality due to the need for instrument cleaning and/or re-calibration. To address this gap for proteomics, we developed Quality Control Analysis in Real-Time (QC-ART), a tool for evaluating data as they are acquired in order to dynamically flag potential issues with instrument performance or sample quality. QC-ART has accuracy similar to standard post-hoc analysis methods with the additional benefit of real-time analysis. We demonstrate the utility and performance of QC-ART in identifying deviations in data quality due to both instrument and sample issues in near real-time for LC-MS-based plasma proteomics analyses of a sample subset of The Environmental Determinants of Diabetes in the Young cohort. We also present a case where QC-ART facilitated the identification of oxidative modifications, which are often underappreciated in proteomic experiments. Published under license by The American Society for Biochemistry and Molecular Biology, Inc.

  8. A Statistical Project Control Tool for Engineering Managers

    NASA Technical Reports Server (NTRS)

    Bauch, Garland T.

    2001-01-01

    This slide presentation reviews the use of a Statistical Project Control Tool (SPCT) for managing engineering projects. A literature review pointed to a definition of project success (i.e., a project is successful when the cost, schedule, technical performance, and quality satisfy the customer). The literature review also pointed to project success factors, traditional project control tools, and performance measures that are detailed in the report. The essential problem is that, with resources becoming more limited and the number of projects increasing, project failures are increasing; existing methods have limitations and more systematic methods are required. The objective of the work is to provide a new statistical project control tool for project managers. Graphs produced with the SPCT method, plotting results of 3 successful projects and 3 failed projects, are reviewed, with success and failure being defined by the owner.

  9. A short term quality control tool for biodegradable microspheres.

    PubMed

    D'Souza, Susan; Faraj, Jabar A; Dorati, Rossella; DeLuca, Patrick P

    2014-06-01

    Accelerated in vitro release testing methodology has been developed as an indicator of product performance to be used as a discriminatory quality control (QC) technique for the release of clinical and commercial batches of biodegradable microspheres. While product performance of biodegradable microspheres can be verified by in vivo and/or in vitro experiments, such evaluation can be particularly challenging because of slow polymer degradation, resulting in extended study times, labor, and expense. Three batches of Leuprolide poly(lactic-co-glycolic acid) (PLGA) microspheres having varying morphology (process variants having different particle size and specific surface area) were manufactured by the solvent extraction/evaporation technique. Tests involving in vitro release, polymer degradation and hydration of the microspheres were performed on the three batches at 55°C. In vitro peptide release at 55°C was analyzed using a previously derived modification of the Weibull function termed the modified Weibull equation (MWE). Experimental observations and data analysis confirm excellent reproducibility studies within and between batches of the microsphere formulations demonstrating the predictability of the accelerated experiments at 55°C. The accelerated test method was also successfully able to distinguish the in vitro product performance between the three batches having varying morphology (process variants), indicating that it is a suitable QC tool to discriminate product or process variants in clinical or commercial batches of microspheres. Additionally, data analysis utilized the MWE to further quantify the differences obtained from the accelerated in vitro product performance test between process variants, thereby enhancing the discriminatory power of the accelerated methodology at 55°C.
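
    Illustrative sketch (the modified Weibull equation itself is not reproduced in the abstract): the standard Weibull dissolution model it builds on, with invented parameter values.

      # Standard Weibull dissolution model; a, b and t_lag are invented placeholders,
      # not the parameters of the modified Weibull equation (MWE) used in the paper.
      import math

      def weibull_release(t, a=5.0, b=0.8, t_lag=0.0):
          """Cumulative fraction released at time t."""
          if t <= t_lag:
              return 0.0
          return 1.0 - math.exp(-((t - t_lag) ** b) / a)

      for day in (1, 3, 7, 14, 28):
          print(day, round(weibull_release(day), 3))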

  10. Visualization and Quality Control Web Tools for CERES Products

    NASA Astrophysics Data System (ADS)

    Mitrescu, C.; Doelling, D. R.; Rutan, D. A.

    2016-12-01

    The CERES project continues to provide the scientific community with a wide variety of satellite-derived data products such as observed TOA broadband shortwave and longwave fluxes, computed TOA and Surface fluxes, as well as cloud, aerosol, and other atmospheric parameters. They encompass a wide range of temporal and spatial resolutions, suited to specific applications. Now in its 16th year, CERES products are mostly used by climate modeling communities that focus on global mean energetics, meridional heat transport, and climate trend studies. In order to serve all our users, we developed a web-based Ordering and Visualization Tool (OVT). Using Open Source Software such as Eclipse, java, javascript, OpenLayer, Flot, Google Maps, python, and others, the OVT Team developed a series of specialized functions to be used in the process of CERES Data Quality Control (QC). We mention 1- and 2-D histograms, anomalies, deseasonalization, temporal and spatial averaging, side-by-side parameter comparison, and others that made the process of QC far easier and faster, but more importantly far more portable. We are now in the process of integrating ground-site observed surface fluxes to further help the CERES project QC the CERES computed surface fluxes. These features will give users the opportunity to perform their own comparisons of the CERES computed surface fluxes and observed ground site fluxes. An overview of the CERES OVT basic functions using Open Source Software, as well as future steps in expanding its capabilities will be presented at the meeting.
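
    Illustrative sketch (not OVT code): deseasonalization of a monthly series by subtracting each calendar month's climatological mean, one of the QC functions listed above; the flux values are invented.

      # Deseasonalize a monthly series: subtract each calendar month's mean.
      def monthly_anomalies(values):
          """values: list of monthly means, starting in January, whole years."""
          by_month = {}
          for i, v in enumerate(values):
              by_month.setdefault(i % 12, []).append(v)
          climatology = {m: sum(vs) / len(vs) for m, vs in by_month.items()}
          return [v - climatology[i % 12] for i, v in enumerate(values)]

      flux = [240 + 10 * (i % 12 in (5, 6, 7)) + 0.1 * i for i in range(36)]  # toy W/m^2
      print([round(a, 2) for a in monthly_anomalies(flux)])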

  11. HPTLC Fingerprint Analysis: A Quality Control for Authentication of Herbal Phytochemicals

    NASA Astrophysics Data System (ADS)

    Ram, Mauji; Abdin, M. Z.; Khan, M. A.; Jha, Prabhakar

    Authentication and consistent quality are the basic requirements for Indian traditional medicine (TIM), Chinese traditional herbal medicine (TCHM), and their commercial products, regardless of the kind of research conducted to modernize TIM and TCHM. The complexities of TIM and TCHM challenge the current official quality control mode, in which only a few biochemical markers are selected for identification and quantitative assay. Given the many unknown factors in TIM and TCHM, it is impossible and unnecessary to pinpoint qualitatively and quantitatively every single component contained in the herbal drug. The chromatographic fingerprint is a rational option to meet the need for more effective and powerful quality assessment of TIM and TCHM. The optimized chromatographic fingerprint is not only an alternative analytical tool for authentication, but also an approach to express the various patterns of chemical ingredient distribution in the herbal drugs and to preserve such a "database" for further multifaceted sustainable studies. Analytical separation techniques, for example high-performance liquid chromatography (HPLC), gas chromatography (GC) and mass spectrometry (MS), are among the most popular methods of choice for quality control of raw materials and finished herbal products. The fingerprint analysis approach using high-performance thin-layer chromatography (HPTLC) has become a most potent tool for quality control of herbal medicines because of its simplicity and reliability. It can serve as a tool for identification, authentication, and quality control of herbal drugs. In this chapter, attempts are made to expand the use of HPTLC and at the same time create interest among prospective researchers in herbal analysis. The developed method can be used as a quality control tool for rapid authentication of a wide variety of herbal samples. Some examples demonstrate the role of fingerprinting in quality control and assessment.

  12. HPLC for quality control of polyimides

    NASA Technical Reports Server (NTRS)

    Young, P. R.; Sykes, G. F.

    1979-01-01

    The use of High Pressure Liquid Chromatography (HPLC) as a quality control tool for polyimide resins and prepregs is presented. A database to help establish accept/reject criteria for these materials was developed. This work is intended to supplement, not replace, standard quality control tests normally conducted on incoming resins and prepregs. To help achieve these objectives, the HPLC separation of LARC-160 polyimide precursor resin was characterized. Room temperature resin aging effects were studied. Graphite-reinforced composites made from fresh and aged resin were fabricated and tested to determine if changes observed by HPLC were significant.

  13. Artificial Intelligence Approach to Support Statistical Quality Control Teaching

    ERIC Educational Resources Information Center

    Reis, Marcelo Menezes; Paladini, Edson Pacheco; Khator, Suresh; Sommer, Willy Arno

    2006-01-01

    Statistical quality control--SQC (consisting of Statistical Process Control, Process Capability Studies, Acceptance Sampling and Design of Experiments) is a very important tool to obtain, maintain and improve the Quality level of goods and services produced by an organization. Despite its importance, and the fact that it is taught in technical and…

  14. Inconsistency in the items included in tools used in general health research and physical therapy to evaluate the methodological quality of randomized controlled trials: a descriptive analysis

    PubMed Central

    2013-01-01

    Background: Assessing the risk of bias of randomized controlled trials (RCTs) is crucial to understand how biases affect treatment effect estimates. A number of tools have been developed to evaluate risk of bias of RCTs; however, it is unknown how these tools compare to each other in the items included. The main objective of this study was to describe which individual items are included in RCT quality tools used in general health and physical therapy (PT) research, and how these items compare to those of the Cochrane Risk of Bias (RoB) tool. Methods: We used comprehensive literature searches and a systematic approach to identify tools that evaluated the methodological quality or risk of bias of RCTs in general health and PT research. We extracted individual items from all quality tools. We calculated the frequency of quality items used across tools and compared them to those in the RoB tool. Comparisons were made between general health and PT quality tools using Chi-squared tests. Results: In addition to the RoB tool, 26 quality tools were identified, with 19 being used in general health and seven in PT research. The total number of quality items included in general health research tools was 130, compared with 48 items across PT tools and seven items in the RoB tool. The most frequently included items in general health research tools (14/19, 74%) were inclusion and exclusion criteria, and appropriate statistical analysis. In contrast, the most frequent items included in PT tools (86%, 6/7) were: baseline comparability, blinding of investigator/assessor, and use of intention-to-treat analysis. Key items of the RoB tool (sequence generation and allocation concealment) were included in 71% (5/7) of PT tools, and 63% (12/19) and 37% (7/19) of general health research tools, respectively. Conclusions: There is extensive item variation across tools that evaluate the risk of bias of RCTs in health research. Results call for an in-depth analysis of items that should be used to

  15. Inconsistency in the items included in tools used in general health research and physical therapy to evaluate the methodological quality of randomized controlled trials: a descriptive analysis.

    PubMed

    Armijo-Olivo, Susan; Fuentes, Jorge; Ospina, Maria; Saltaji, Humam; Hartling, Lisa

    2013-09-17

    Assessing the risk of bias of randomized controlled trials (RCTs) is crucial to understand how biases affect treatment effect estimates. A number of tools have been developed to evaluate risk of bias of RCTs; however, it is unknown how these tools compare to each other in the items included. The main objective of this study was to describe which individual items are included in RCT quality tools used in general health and physical therapy (PT) research, and how these items compare to those of the Cochrane Risk of Bias (RoB) tool. We used comprehensive literature searches and a systematic approach to identify tools that evaluated the methodological quality or risk of bias of RCTs in general health and PT research. We extracted individual items from all quality tools. We calculated the frequency of quality items used across tools and compared them to those in the RoB tool. Comparisons were made between general health and PT quality tools using Chi-squared tests. In addition to the RoB tool, 26 quality tools were identified, with 19 being used in general health and seven in PT research. The total number of quality items included in general health research tools was 130, compared with 48 items across PT tools and seven items in the RoB tool. The most frequently included items in general health research tools (14/19, 74%) were inclusion and exclusion criteria, and appropriate statistical analysis. In contrast, the most frequent items included in PT tools (86%, 6/7) were: baseline comparability, blinding of investigator/assessor, and use of intention-to-treat analysis. Key items of the RoB tool (sequence generation and allocation concealment) were included in 71% (5/7) of PT tools, and 63% (12/19) and 37% (7/19) of general health research tools, respectively. There is extensive item variation across tools that evaluate the risk of bias of RCTs in health research. Results call for an in-depth analysis of items that should be used to assess risk of bias of RCTs. Further
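
    Illustrative sketch of the kind of chi-squared comparison described: the 2x2 counts for the sequence generation item are taken from the abstract (12 of 19 general health tools vs 5 of 7 PT tools), but the test call below is not the authors' exact analysis.

      # Chi-squared comparison of item inclusion between tool groups.
      # With cell counts this small, a Fisher exact test may be preferable.
      from scipy.stats import chi2_contingency

      table = [[12, 19 - 12],   # general health tools: item present / absent
               [5, 7 - 5]]      # physical therapy tools: item present / absent

      chi2, p, dof, expected = chi2_contingency(table)
      print(f"chi2={chi2:.2f}, p={p:.3f}, dof={dof}")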

  16. Enhancing Leadership Quality. TQ Source Tips & Tools: Emerging Strategies to Enhance Educator Quality

    ERIC Educational Resources Information Center

    National Comprehensive Center for Teacher Quality, 2008

    2008-01-01

    Teaching Quality (TQ) Source Tips & Tools: Emerging Strategies to Enhance Educator Quality is an online resource developed by the TQ Center. It is designed to help education practitioners tap into strategies and resources they can use to enhance educator quality. This publication is based on the TQ Source Tips & Tools topic area "Enhancing…

  17. Visualization and Quality Control Web Tools for CERES Products

    NASA Astrophysics Data System (ADS)

    Mitrescu, C.; Doelling, D.; Chu, C.; Mlynczak, P.

    2014-12-01

    The CERES project continues to provide the scientific community with a wide variety of satellite-derived data products. The flagship products are TOA broadband shortwave and longwave observed fluxes, computed TOA and Surface fluxes, as well as cloud, aerosol, and other atmospheric parameters. These datasets encompass a wide range of temporal and spatial resolutions, suited to specific applications. We thus offer time resolutions that range from instantaneous to monthly means, with spatial resolutions that range from 20-km footprint to global scales. The 14-year record is mostly used by climate modeling communities that focus on global mean energetics, meridional heat transport, and climate trend studies. CERES products are also used by the remote sensing community for their climatological studies. In recent years, however, CERES products have been used by an even broader audience, such as the green energy, health and environmental research communities, and others. Because of that, the CERES project has implemented a now well-established web-oriented Ordering and Visualization Tool (OVT), which is well into its fifth year of development. In order to help facilitate a comprehensive quality control of CERES products, the OVT Team began introducing a series of specialized functions. These include the 1- and 2-D histogram, anomaly, deseasonalization, temporal and spatial averaging, side-by-side parameter comparison, and other specialized scientific application capabilities. Over time, increasingly higher temporal- and spatial-resolution products are being made available to the public through the CERES OVT. These high-resolution products require accessing the existing long-term archive - thus reading many very large netCDF or HDF files, which poses a real challenge to the task of near-instantaneous visualization. An overview of the CERES OVT basic functions and QC capabilities as well as future steps in expanding its capabilities will be presented at the meeting.

  18. Quality control in diagnostic immunohistochemistry: integrated on-slide positive controls.

    PubMed

    Bragoni, A; Gambella, A; Pigozzi, S; Grigolini, M; Fiocca, R; Mastracci, L; Grillo, F

    2017-11-01

    Standardization in immunohistochemistry is a priority in modern pathology and requires strict quality control. Cost containment has also become fundamental, and auditing of all procedures must take both these principles into account. Positive controls must be routinely performed so that their positivity guarantees the appropriateness of the immunohistochemical procedure. The aim of this study is to develop a low-cost procedure, based on a punch biopsy (PB) tool, to construct positive controls that can be integrated into the patient's tissue slide. Sixteen frequently used control blocks were selected and multiple cylindrical samples were obtained using a 5-mm diameter punch biopsy tool, separately re-embedding them in single blocks. For each diagnostic immunoreaction requiring a positive control, an integrated PB-control section (cut from the appropriate PB-control block) was added to the top right corner of the diagnostic slide before immunostaining. This integrated control technique permitted a saving of 4.75% in total direct lab costs and proved to be technically feasible and reliable. Our proposal is easy to perform and within the reach of all pathology labs, requires easily available tools, costs less to apply than external paired controls, and ensures that a specific control for each slide is always available.

  19. Innovative Stormwater Quality Tools by SARA for Holistic Watershed Master Planning

    NASA Astrophysics Data System (ADS)

    Thomas, S. M.; Su, Y. C.; Hummel, P. R.

    2016-12-01

    Stormwater management strategies such as Best Management Practices (BMP) and Low-Impact Development (LID) have increasingly gained attention in urban runoff control, becoming vital to holistic watershed master plans. These strategies can help address existing water quality impairments and support regulatory compliance, as well as guide planning and management of future development when substantial population growth and urbanization are projected to occur. However, past efforts have been limited to qualitative planning due to the lack of suitable tools for quantitative assessment. The San Antonio River Authority (SARA), with the assistance of Lockwood, Andrews & Newnam, Inc. (LAN) and AQUA TERRA Consultants (a division of RESPEC), developed comprehensive hydrodynamic and water quality models using the Hydrological Simulation Program-FORTRAN (HSPF) for several urban watersheds in the San Antonio River Basin. These models enabled watershed managers to look at water quality issues on a more refined temporal and spatial scale than the limited monitoring data allow. They also provided a means to locate and quantify potential water quality impairments and evaluate the effects of mitigation measures. To support the models, a suite of software tools was developed, including: 1) the SARA Timeseries Utility Tool for managing and processing large model timeseries files, 2) the SARA Load Reduction Tool to determine the load reductions needed to achieve screening levels for each modeled constituent on a sub-basin basis, and 3) the SARA Enhanced BMP Tool to determine the optimal combination of BMP types and units needed to achieve the required load reductions. Using these SARA models and tools, water quality agencies and stormwater professionals can determine the optimal combinations of BMP/LID to accomplish their goals and save substantial stormwater infrastructure and management costs. The tools can also help regulators and permittees evaluate the feasibility of achieving compliance

  20. Assessing Lymphatic Filariasis Data Quality in Endemic Communities in Ghana, Using the Neglected Tropical Diseases Data Quality Assessment Tool for Preventive Chemotherapy.

    PubMed

    de Souza, Dziedzom K; Yirenkyi, Eric; Otchere, Joseph; Biritwum, Nana-Kwadwo; Ameme, Donne K; Sackey, Samuel; Ahorlu, Collins; Wilson, Michael D

    2016-03-01

    The activities of the Global Programme for the Elimination of Lymphatic Filariasis have been in operation since the year 2000, with Mass Drug Administration (MDA) undertaken yearly in disease-endemic communities. Information collected during MDA, such as population demographics, age, sex, drugs used and remaining, and therapeutic and geographic coverage, can be used to assess the quality of the data reported. To assist country programmes in evaluating the information reported, the WHO, in collaboration with NTD partners, including ENVISION/RTI, developed an NTD Data Quality Assessment (DQA) tool for use by programmes. This study was undertaken to evaluate the tool and assess the quality of data reported in some endemic communities in Ghana. A cross-sectional study, involving review of data registers and interviews with drug distributors, disease control officers, and health information officers using the NTD DQA tool, was carried out in selected communities in three LF-endemic districts in Ghana. Data registers for service delivery points were obtained from the District health office for assessment. The assessment verified reported results against recounted values for five indicators: number of tablets received, number of tablets used, number of tablets remaining, MDA coverage, and population treated. Furthermore, drug distributors, disease control officers, and health information officers (at the first data aggregation level) were interviewed, using the DQA tool, to determine the performance of the functional areas of the data management system. The results showed that over 60% of the data reported were inaccurate, and exposed the challenges and limitations of the data management system. The DQA tool is a very useful monitoring and evaluation (M&E) tool that can be used to elucidate and address data quality issues in various NTD control programmes.

  1. Control Strategy Tool (CoST)

    EPA Pesticide Factsheets

    The EPA Control Strategy Tool (CoST) is a software tool for projecting potential future control scenarios, their effects on emissions and estimated costs. This tool uses the NEI and the Control Measures Dataset as key inputs. CoST outputs are projections of future control scenarios.

  2. Sigma metrics as a tool for evaluating the performance of internal quality control in a clinical chemistry laboratory

    PubMed Central

    Kumar, B. Vinodh; Mohan, Thuthi

    2018-01-01

    OBJECTIVE: Six Sigma is one of the most popular quality management system tools employed for process improvement. The Six Sigma methods are usually applied when the outcome of the process can be measured. This study was done to assess the performance of individual biochemical parameters on a Sigma Scale by calculating the sigma metrics for individual parameters and to follow the Westgard guidelines for appropriate Westgard rules and levels of internal quality control (IQC) that needs to be processed to improve target analyte performance based on the sigma metrics. MATERIALS AND METHODS: This is a retrospective study, and data required for the study were extracted between July 2015 and June 2016 from a Secondary Care Government Hospital, Chennai. The data obtained for the study are IQC - coefficient of variation percentage and External Quality Assurance Scheme (EQAS) - Bias% for 16 biochemical parameters. RESULTS: For the level 1 IQC, four analytes (alkaline phosphatase, magnesium, triglyceride, and high-density lipoprotein-cholesterol) showed an ideal performance of ≥6 sigma level, five analytes (urea, total bilirubin, albumin, cholesterol, and potassium) showed an average performance of <3 sigma level and for level 2 IQCs, same four analytes of level 1 showed a performance of ≥6 sigma level, and four analytes (urea, albumin, cholesterol, and potassium) showed an average performance of <3 sigma level. For all analytes <6 sigma level, the quality goal index (QGI) was <0.8 indicating the area requiring improvement to be imprecision except cholesterol whose QGI >1.2 indicated inaccuracy. CONCLUSION: This study shows that sigma metrics is a good quality tool to assess the analytical performance of a clinical chemistry laboratory. Thus, sigma metric analysis provides a benchmark for the laboratory to design a protocol for IQC, address poor assay performance, and assess the efficiency of existing laboratory processes. PMID:29692587
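
    For readers who want to reproduce this kind of evaluation, the sketch below applies the sigma-metric and quality goal index (QGI) formulas as they are conventionally defined in the sigma-metrics QC literature; the abstract does not spell out its exact formulas, and the TEa, bias, and CV values used here are hypothetical.

```python
# Sigma-metric and quality goal index (QGI) calculation, as commonly defined
# in the sigma-metrics QC literature (a sketch; example values are hypothetical).

def sigma_metric(tea_pct: float, bias_pct: float, cv_pct: float) -> float:
    """Sigma = (TEa% - |Bias%|) / CV%, all expressed as percentages."""
    return (tea_pct - abs(bias_pct)) / cv_pct

def quality_goal_index(bias_pct: float, cv_pct: float) -> float:
    """QGI = |Bias%| / (1.5 * CV%); <0.8 suggests imprecision, >1.2 inaccuracy."""
    return abs(bias_pct) / (1.5 * cv_pct)

# Hypothetical level-1 IQC data for one analyte
tea, bias, cv = 10.0, 2.0, 3.0   # total allowable error, EQAS bias, IQC CV (%)
sigma = sigma_metric(tea, bias, cv)
qgi = quality_goal_index(bias, cv)
print(f"sigma = {sigma:.1f}, QGI = {qgi:.2f}")   # sigma 2.7 -> imprecision is the issue
```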

  3. Sigma metrics as a tool for evaluating the performance of internal quality control in a clinical chemistry laboratory.

    PubMed

    Kumar, B Vinodh; Mohan, Thuthi

    2018-01-01

    Six Sigma is one of the most popular quality management system tools employed for process improvement. The Six Sigma methods are usually applied when the outcome of the process can be measured. This study was done to assess the performance of individual biochemical parameters on a Sigma Scale by calculating the sigma metrics for individual parameters and to follow the Westgard guidelines for appropriate Westgard rules and levels of internal quality control (IQC) that needs to be processed to improve target analyte performance based on the sigma metrics. This is a retrospective study, and data required for the study were extracted between July 2015 and June 2016 from a Secondary Care Government Hospital, Chennai. The data obtained for the study are IQC - coefficient of variation percentage and External Quality Assurance Scheme (EQAS) - Bias% for 16 biochemical parameters. For the level 1 IQC, four analytes (alkaline phosphatase, magnesium, triglyceride, and high-density lipoprotein-cholesterol) showed an ideal performance of ≥6 sigma level, five analytes (urea, total bilirubin, albumin, cholesterol, and potassium) showed an average performance of <3 sigma level and for level 2 IQCs, same four analytes of level 1 showed a performance of ≥6 sigma level, and four analytes (urea, albumin, cholesterol, and potassium) showed an average performance of <3 sigma level. For all analytes <6 sigma level, the quality goal index (QGI) was <0.8 indicating the area requiring improvement to be imprecision except cholesterol whose QGI >1.2 indicated inaccuracy. This study shows that sigma metrics is a good quality tool to assess the analytical performance of a clinical chemistry laboratory. Thus, sigma metric analysis provides a benchmark for the laboratory to design a protocol for IQC, address poor assay performance, and assess the efficiency of existing laboratory processes.

  4. Use Cases for Combining Web Services with ArcPython Tools for Enabling Quality Control of Land Remote Sensing Data Products.

    NASA Astrophysics Data System (ADS)

    Krehbiel, C.; Maiersperger, T.; Friesz, A.; Harriman, L.; Quenzer, R.; Impecoven, K.

    2016-12-01

    Three major obstacles facing big Earth data users are data storage, management, and analysis. As the amount of satellite remote sensing data increases, so does the need for better data storage and management strategies to exploit the plethora of data now available. Standard GIS tools can help big Earth data users who interact with and analyze increasingly large and diverse datasets. In this presentation we highlight how NASA's Land Processes Distributed Active Archive Center (LP DAAC) is tackling these big Earth data challenges. We provide a real-life use case to describe three tools and services provided by the LP DAAC to more efficiently exploit big Earth data in a GIS environment. First, we describe the Open-source Project for a Network Data Access Protocol (OPeNDAP), which allows calls for specific subsets of data, minimizing the amount of data that a user downloads and improving the efficiency of data downloading and processing. Next, we cover the LP DAAC's Application for Extracting and Exploring Analysis Ready Samples (AppEEARS), a web application interface for extracting and analyzing land remote sensing data. From there, we review an ArcPython toolbox that was developed to provide quality control services for land remote sensing data products. Locating and extracting specific subsets of larger big Earth datasets improves data storage and management efficiency for the end user, and quality control services provide a straightforward interpretation of big Earth data. These tools and services are beneficial to the GIS user community in terms of standardizing workflows and improving data storage, management, and analysis tactics.
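
    As a rough illustration of the OPeNDAP workflow described above, the sketch below opens a remote dataset lazily and transfers only a spatial subset. The URL, variable name, and coordinate names are placeholders rather than actual LP DAAC endpoints, and the example assumes xarray with an OPeNDAP-capable backend (e.g., netCDF4) is installed.

```python
# Minimal sketch of server-side subsetting via OPeNDAP with xarray.
# The URL, variable, and coordinate names are hypothetical placeholders;
# the point is that only the requested slice is transferred, not the full file.
import xarray as xr

OPENDAP_URL = "https://example.gov/opendap/hypothetical/vegetation_index.nc"  # placeholder

ds = xr.open_dataset(OPENDAP_URL)              # lazy open; no bulk download yet
ndvi = ds["NDVI"].sel(
    lat=slice(44.0, 43.0),                     # slice order depends on coordinate direction
    lon=slice(-104.0, -103.0),
)
subset = ndvi.load()                           # only this spatial subset is fetched
print(subset.shape)
```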

  5. Developing a consumer evaluation tool of weight control strategy advertisements on the Internet.

    PubMed

    Luevorasirikul, Kanokrat; Gray, Nicola J; Anderson, Claire W

    2008-06-01

    To develop two evaluation tools for weight loss and weight gain advertisements on the Internet in order to help consumers to evaluate the quality of information within these advertisements. One hundred websites identified by Internet search engines for weight loss and weight gain strategies (50 websites each) were evaluated using two specific scoring instruments, developed by adapting questions from the 'DISCERN' tool and reviewing all related weight control guidelines and advertising regulations. The validity and reliability of the adapted tools were tested. Our evaluation tools rated the information from most websites as poor quality (70%). In the case of weight loss strategies, statements about rapid (18%) and permanent (28%) weight loss caused concern as well as lack of sensible advice about dieting and a lack of product warnings (84%). Safety concerns relating to weight gain products were the lack of warnings about side effects in products containing steroids and creatine (92%). The adapted tools exhibited acceptable validity and reliability. Quality of information within weight control advertisements on the Internet was generally poor. Problems of false claims, little advice on healthy ways to modify weight and few warnings on side effects have been highlighted in this study.

  6. A Model of Risk Analysis in Analytical Methodology for Biopharmaceutical Quality Control.

    PubMed

    Andrade, Cleyton Lage; Herrera, Miguel Angel De La O; Lemes, Elezer Monte Blanco

    2018-01-01

    One key quality control parameter for biopharmaceutical products is the analysis of residual cellular DNA. To determine the small amounts of DNA (around 100 pg) that may be present in a biologically derived drug substance, an analytical method should be sensitive, robust, reliable, and accurate. In principle, three techniques have the ability to measure residual cellular DNA: radioactive dot-blot, a type of hybridization; threshold analysis; and quantitative polymerase chain reaction. Quality risk management is a systematic process for evaluating, controlling, and reporting risks that may affect method capabilities, and it supports a scientific and practical approach to decision making. This paper evaluates, by quality risk management, an alternative approach to assessing the performance risks associated with quality control methods used with biopharmaceuticals, using the hazard analysis and critical control points (HACCP) tool. This tool makes it possible to find the steps in an analytical procedure with the highest impact on method performance. By applying these principles to DNA analysis methods, we conclude that the radioactive dot-blot assay has the largest number of critical control points, followed by quantitative polymerase chain reaction and threshold analysis. From the analysis of hazards (i.e., points of method failure) and the associated critical control points, we conclude that the analytical methodology with the lowest risk of performance failure for residual cellular DNA testing is quantitative polymerase chain reaction. LAY ABSTRACT: In order to mitigate the risk of adverse events caused by residual cellular DNA that is not completely cleared from downstream production processes, regulatory agencies have required the industry to guarantee a very low level of DNA in biologically derived pharmaceutical products. The technique historically used was radioactive blot hybridization. However, the technique is a challenging method to implement in a quality

  7. Information systems as a quality management tool in clinical laboratories

    NASA Astrophysics Data System (ADS)

    Schmitz, Vanessa; Rosecler Bez el Boukhari, Marta

    2007-11-01

    This article describes information systems as a quality management tool in clinical laboratories. The quality of laboratory analyses is of fundamental importance for health professionals in aiding appropriate diagnosis and treatment. Information systems allow the automation of internal quality management processes, using standard sample tests, Levey-Jennings charts and Westgard multirule analysis. This simplifies evaluation and interpretation of quality tests and reduces the possibility of human error. This study proposes the development of an information system with appropriate functions and costs for the automation of internal quality control in small and medium-sized clinical laboratories. To this end, it evaluates the functions and usability of two commercial software products designed for this purpose, identifying the positive features of each, so that these can be taken into account during the development of the proposed system.
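
    The kind of internal quality management the article describes can be expressed compactly in code. The sketch below applies two common Westgard rules (1_3s and 2_2s) to a series of daily control results, the same evaluation a Levey-Jennings chart supports visually; the baseline mean/SD and control values are hypothetical.

```python
# Sketch of a simple Westgard multirule check (1_3s and 2_2s) on daily IQC results.
# Baseline mean/SD and the control values below are hypothetical.

def z_scores(values, mean, sd):
    return [(v - mean) / sd for v in values]

def westgard_flags(values, mean, sd):
    z = z_scores(values, mean, sd)
    flags = []
    for i, zi in enumerate(z):
        if abs(zi) > 3:
            flags.append((i, "1_3s"))             # single point beyond +/-3 SD
        if i > 0 and z[i - 1] > 2 and zi > 2:
            flags.append((i, "2_2s"))             # two consecutive points beyond +2 SD
        if i > 0 and z[i - 1] < -2 and zi < -2:
            flags.append((i, "2_2s"))             # ... or beyond -2 SD
    return flags

mean, sd = 100.0, 2.0                             # assumed baseline statistics
daily_controls = [99.5, 101.2, 104.5, 104.8, 107.0, 98.9]   # hypothetical results
print(westgard_flags(daily_controls, mean, sd))   # flags the 2_2s and 1_3s violations
```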

  8. A case study: application of statistical process control tool for determining process capability and sigma level.

    PubMed

    Chopra, Vikram; Bairagi, Mukesh; Trivedi, P; Nagar, Mona

    2012-01-01

    Statistical process control is the application of statistical methods to the measurement and analysis of process variation. Various regulatory authorities, such as the Validation Guidance for Industry (2011), International Conference on Harmonisation ICH Q10 (2009), the Health Canada guidelines (2009), the Health Science Authority, Singapore: Guidance for Product Quality Review (2008), and International Organization for Standardization ISO-9000:2005, provide regulatory support for the application of statistical process control for better process control and understanding. In this study, risk assessment, normal probability distributions, control charts, and capability charts are employed for the selection of critical quality attributes and the determination of normal probability distribution, statistical stability, and capability of the production process, respectively. The objective of this study is to determine tablet production process quality in the form of sigma process capability. By interpreting data and graph trends, the forecasting of critical quality attributes, the sigma process capability, and the stability of the process were studied. The overall study contributes to an assessment of the process at the sigma level with respect to out-of-specification attributes produced. Finally, the study points to an area where the application of quality improvement and quality risk assessment principles can achieve six sigma-capable processes. Statistical process control is the most advantageous tool for determination of the quality of any production process. This tool is new for the pharmaceutical tablet production process. In the case of pharmaceutical tablet production processes, the quality control parameters act as quality assessment parameters. Application of risk assessment provides selection of critical quality attributes among quality control parameters. Sequential application of normality distributions, control charts, and capability analyses provides a valid statistical
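
    To make the capability terminology concrete, the sketch below computes the standard short-term capability indices Cp and Cpk and the corresponding sigma level for a normally distributed attribute. The specification limits and tablet-weight data are hypothetical and are not taken from the study.

```python
# Standard process capability indices and sigma level for a normally distributed
# quality attribute (a sketch; limits and data are hypothetical tablet weights in mg).
import statistics

def capability(values, lsl, usl):
    mu = statistics.mean(values)
    sigma = statistics.stdev(values)               # sample standard deviation
    cp = (usl - lsl) / (6 * sigma)                 # potential capability
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)    # actual capability (accounts for centering)
    z_min = 3 * cpk                                # distance to nearer limit, in SDs
    return cp, cpk, z_min

weights = [248.2, 251.1, 249.7, 250.4, 252.0, 247.9, 250.8, 249.3]
cp, cpk, sigma_level = capability(weights, lsl=240.0, usl=260.0)
print(f"Cp={cp:.2f}  Cpk={cpk:.2f}  sigma level ~ {sigma_level:.1f}")
```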

  9. A Quality Classroom: Quality Teaching Tools That Facilitate Student Success.

    ERIC Educational Resources Information Center

    Cooke, Brian

    This presentation described practical applications and quality tools for educators that are based on original classroom research and the theories of motivation, learning, profound knowledge, systems thinking, and service quality advanced by Karl Albrecht, William Glasser, and W. Edwards Deming. The presentation was conducted in a way that…

  10. Ten tools of continuous quality improvement: a review and case example of hospital discharge.

    PubMed

    Ziegenfuss, J T; McKenna, C K

    1995-01-01

    Concepts and methods of continuous quality improvement have been endorsed by quality specialists in American Health care, and their use has convinced CEOs that industrial methods can make a contribution to health and medical care. For all the quality improvement publications, there are still few that offer a clear, concise definition and an explanation of the primary tools for teaching purposes. This report reviews ten continuous quality improvement methods including: problem solving cycle, affinity diagrams, cause and effect diagrams, Pareto diagrams, histograms, bar charts, control charts, scatter diagrams, checklists, and a process decision program chart. These do not represent an exhaustive list, but a set of commonly used tools. They are applied to a case study of bed utilization in a university hospital.

  11. Standardisation of DNA quantitation by image analysis: quality control of instrumentation.

    PubMed

    Puech, M; Giroud, F

    1999-05-01

    DNA image analysis is frequently performed in clinical practice as a prognostic tool and to improve diagnosis. The precision of prognosis and diagnosis depends on the accuracy of analysis and particularly on the quality of image analysis systems. It has been reported that image analysis systems used for DNA quantification differ widely in their characteristics (Thunissen et al.: Cytometry 27: 21-25, 1997). This induces inter-laboratory variations when the same sample is analysed in different laboratories. In microscopic image analysis, the principal instrumentation errors arise from the optical and electronic parts of systems. They bring about problems of instability, non-linearity, and shading and glare phenomena. The aim of this study is to establish tools and standardised quality control procedures for microscopic image analysis systems. Specific reference standard slides have been developed to control instability, non-linearity, shading and glare phenomena, and segmentation efficiency. Several systems have been checked with these tools and quality control procedures. Interpretation criteria and accuracy limits for these quality control procedures are proposed according to the conclusions of a European project called the PRESS project (Prototype Reference Standard Slide). Beyond these limits, tested image analysis systems are not qualified to perform precise DNA analysis. The different procedures presented in this work determine whether an image analysis system is qualified to deliver sufficiently precise DNA measurements for cancer case analysis. If the controlled systems are beyond the defined limits, some recommendations are given to find a solution to the problem.

  12. Report Central: quality reporting tool in an electronic health record.

    PubMed

    Jung, Eunice; Li, Qi; Mangalampalli, Anil; Greim, Julie; Eskin, Michael S; Housman, Dan; Isikoff, Jeremy; Abend, Aaron H; Middleton, Blackford; Einbinder, Jonathan S

    2006-01-01

    Quality reporting tools, integrated with ambulatory electronic health records, can help clinicians and administrators understand performance, manage populations, and improve quality. Report Central is a secure web report delivery tool built on Crystal Reports XI™ and ASP.NET technologies. Pilot evaluation of Report Central indicates that clinicians prefer a quality reporting tool that is integrated with our home-grown EHR to support clinical workflow.

  13. Report Central: Quality Reporting Tool in an Electronic Health Record

    PubMed Central

    Jung, Eunice; Li, Qi; Mangalampalli, Anil; Greim, Julie; Eskin, Michael S.; Housman, Dan; Isikoff, Jeremy; Abend, Aaron H.; Middleton, Blackford; Einbinder, Jonathan S.

    2006-01-01

    Quality reporting tools, integrated with ambulatory electronic health records, can help clinicians and administrators understand performance, manage populations, and improve quality. Report Central is a secure web report delivery tool built on Crystal Reports XI™ and ASP.NET technologies. Pilot evaluation of Report Central indicates that clinicians prefer a quality reporting tool that is integrated with our home-grown EHR to support clinical workflow. PMID:17238590

  14. THE ATMOSPHERIC MODEL EVALUATION TOOL (AMET); AIR QUALITY MODULE

    EPA Science Inventory

    This presentation reviews the development of the Atmospheric Model Evaluation Tool (AMET) air quality module. The AMET tool is being developed to aid in the model evaluation. This presentation focuses on the air quality evaluation portion of AMET. Presented are examples of the...

  15. Quality assessment tools add value.

    PubMed

    Paul, L

    1996-10-01

    The rapid evolution of the health care marketplace can be expected to continue as we move closer to the 21st Century. Externally-imposed pressures for cost reduction will increasingly be accompanied by pressure within health care organizations as risk-sharing reimbursement arrangements become more commonplace. Competitive advantage will be available to those organizations that can demonstrate objective value as defined by the cost-quality equation. The tools an organization chooses to perform quality assessment will be an important factor in its ability to demonstrate such value. Traditional quality assurance will in all likelihood continue, but the extent to which quality improvement activities are adopted by the culture of an organization may determine its ability to provide objective evidence of better health status outcomes.

  16. Six Sigma Quality Management System and Design of Risk-based Statistical Quality Control.

    PubMed

    Westgard, James O; Westgard, Sten A

    2017-03-01

    Six sigma concepts provide a quality management system (QMS) with many useful tools for managing quality in medical laboratories. This Six Sigma QMS is driven by the quality required for the intended use of a test. The most useful form for this quality requirement is the allowable total error. Calculation of a sigma-metric provides the best predictor of risk for an analytical examination process, as well as a design parameter for selecting the statistical quality control (SQC) procedure necessary to detect medically important errors. Simple point estimates of sigma at medical decision concentrations are sufficient for laboratory applications. Copyright © 2016 Elsevier Inc. All rights reserved.

  17. Software Quality Control at Belle II

    NASA Astrophysics Data System (ADS)

    Ritter, M.; Kuhr, T.; Hauth, T.; Gebard, T.; Kristof, M.; Pulvermacher, C.; Belle Software Group, II

    2017-10-01

    Over the last seven years the software stack of the next generation B factory experiment Belle II has grown to over one million lines of C++ and Python code, counting only the part included in offline software releases. There are several thousand commits to the central repository by about 100 individual developers per year. Keeping a software stack coherent and of high quality, so that it can be sustained and used efficiently for data acquisition, simulation, reconstruction, and analysis over the lifetime of the Belle II experiment, is a challenge. A set of tools is employed to monitor the quality of the software and provide fast feedback to the developers. They are integrated in a machinery that is controlled by a buildbot master and automates the quality checks. The tools include different compilers, cppcheck, the clang static analyzer, valgrind memcheck, doxygen, a geometry overlap checker, a check for missing or extra library links, unit tests, steering file level tests, a sophisticated high-level validation suite, and an issue tracker. The technological development infrastructure is complemented by organizational means to coordinate the development.
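
    As a minimal illustration of automating one of the checks listed above, the sketch below wraps a cppcheck run and returns a non-zero exit code when findings are reported, the sort of step a buildbot job could execute. The source path and the fail-on-any-finding policy are assumptions, not the actual Belle II configuration.

```python
# Minimal sketch of automating one static-analysis step (cppcheck) and failing
# the build if warnings appear. The source path and policy are hypothetical;
# the Belle II machinery described above is far more extensive.
import subprocess
import sys

def run_cppcheck(source_dir: str) -> int:
    """Run cppcheck and return the number of reported findings."""
    result = subprocess.run(
        ["cppcheck", "--enable=warning,style", "--quiet", source_dir],
        capture_output=True,
        text=True,
    )
    # cppcheck writes its findings to stderr; count non-empty lines
    return sum(1 for line in result.stderr.splitlines() if line.strip())

if __name__ == "__main__":
    findings = run_cppcheck("src/")            # hypothetical source directory
    print(f"cppcheck reported {findings} finding(s)")
    sys.exit(1 if findings else 0)             # non-zero exit signals the CI job
```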

  18. Human Connectome Project Informatics: quality control, database services, and data visualization

    PubMed Central

    Marcus, Daniel S.; Harms, Michael P.; Snyder, Abraham Z.; Jenkinson, Mark; Wilson, J Anthony; Glasser, Matthew F.; Barch, Deanna M.; Archie, Kevin A.; Burgess, Gregory C.; Ramaratnam, Mohana; Hodge, Michael; Horton, William; Herrick, Rick; Olsen, Timothy; McKay, Michael; House, Matthew; Hileman, Michael; Reid, Erin; Harwell, John; Coalson, Timothy; Schindler, Jon; Elam, Jennifer S.; Curtiss, Sandra W.; Van Essen, David C.

    2013-01-01

    The Human Connectome Project (HCP) has developed protocols, standard operating and quality control procedures, and a suite of informatics tools to enable high throughput data collection, data sharing, automated data processing and analysis, and data mining and visualization. Quality control procedures include methods to maintain data collection consistency over time, to measure head motion, and to establish quantitative modality-specific overall quality assessments. Database services developed as customizations of the XNAT imaging informatics platform support both internal daily operations and open access data sharing. The Connectome Workbench visualization environment enables user interaction with HCP data and is increasingly integrated with the HCP's database services. Here we describe the current state of these procedures and tools and their application in the ongoing HCP study. PMID:23707591

  19. The development and validation of a meta-tool for quality appraisal of public health evidence: Meta Quality Appraisal Tool (MetaQAT).

    PubMed

    Rosella, L; Bowman, C; Pach, B; Morgan, S; Fitzpatrick, T; Goel, V

    2016-07-01

    Most quality appraisal tools were developed for clinical medicine and tend to be study-specific with a strong emphasis on risk of bias. In order to be more relevant to public health, an appropriate quality appraisal tool needs to be less reliant on the evidence hierarchy and consider practice applicability. Given the broad range of study designs used in public health, the objective of this study was to develop and validate a meta-tool that combines public health-focused principles of appraisal coupled with a set of design-specific companion tools. Several design methods were used to develop and validate the tool including literature review, synthesis, and validation with a reference standard. A search of critical appraisal tools relevant to public health was conducted; core concepts were collated. The resulting framework was piloted during three feedback sessions with public health practitioners. Following subsequent revisions, the final meta-tool, the Meta Quality Appraisal Tool (MetaQAT), was then validated through a content analysis of appraisals conducted by two groups of experienced public health researchers (MetaQAT vs generic appraisal form). The MetaQAT framework consists of four domains: relevancy, reliability, validity, and applicability. In addition, a companion tool was assembled from existing critical appraisal tools to provide study design-specific guidance on validity appraisal. Content analysis showed similar methodological and generalizability concerns were raised by both groups; however, the MetaQAT appraisers commented more extensively on applicability to public health practice. Critical appraisal tools designed for clinical medicine have limitations for use in the context of public health. The meta-tool structure of the MetaQAT allows for rigorous appraisal, while allowing users to simultaneously appraise the multitude of study designs relevant to public health research and assess non-standard domains, such as applicability. Copyright © 2015

  20. Internal quality control: planning and implementation strategies.

    PubMed

    Westgard, James O

    2003-11-01

    The first essential in setting up internal quality control (IQC) of a test procedure in the clinical laboratory is to select the proper IQC procedure to implement, i.e. choosing the statistical criteria or control rules, and the number of control measurements, according to the quality required for the test and the observed performance of the method. Then the right IQC procedure must be properly implemented. This review focuses on strategies for planning and implementing IQC procedures in order to improve the quality of the IQC. A quantitative planning process is described that can be implemented with graphical tools such as power function or critical-error graphs and charts of operating specifications. Finally, a total QC strategy is formulated to minimize cost and maximize quality. A general strategy for IQC implementation is recommended that employs a three-stage design in which the first stage provides high error detection, the second stage low false rejection and the third stage prescribes the length of the analytical run, making use of an algorithm involving the average of normal patients' data.
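
    The planning process described here typically starts from the critical systematic error, the shift that must be detected and that is read against power-function graphs when choosing control rules. The sketch below uses the standard formula with z = 1.65 (corresponding to a 5% maximum defect rate); the example numbers are hypothetical.

```python
# Critical systematic error, the quantity typically read against power-function
# graphs when planning IQC rules (a sketch; the example numbers are hypothetical).

def critical_systematic_error(tea_pct: float, bias_pct: float, cv_pct: float,
                              z: float = 1.65) -> float:
    """Delta SE_crit = (TEa% - |Bias%|) / CV% - z, in multiples of the method SD."""
    return (tea_pct - abs(bias_pct)) / cv_pct - z

# A method with TEa 10%, bias 1%, and CV 2% must detect a shift of ~2.85 SD
print(critical_systematic_error(10.0, 1.0, 2.0))   # -> 2.85
```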

  1. Data Quality Control Tools Applied to Seismo-Acoustic Arrays in Korea

    NASA Astrophysics Data System (ADS)

    Park, J.; Hayward, C.; Stump, B. W.

    2017-12-01

    We assess data quality (data gaps, seismometer orientation, timing errors, noise levels, and coherence between co-located sensors) for seismic and infrasound data in South Korea using six seismo-acoustic arrays, BRDAR, CHNAR, KSGAR, KMPAR, TJIAR, and YPDAR, cooperatively operated by Southern Methodist University and the Korea Institute for Geosciences and Mineral Resources. Timing errors associated with seismometers can be identified from estimated changes in instrument orientation, calculated from RMS errors between the reference array and each array seismometer using waveforms filtered from 0.1 to 0.35 Hz. Noise levels of seismic and infrasound data are analyzed to investigate local environmental effects and seasonal noise variation. In order to examine the spectral properties of the noise, the waveforms are analyzed using Welch's method (Welch, 1967), which produces a single power spectral estimate from an average of spectra taken at regular intervals over a specific time period. This analysis quantifies the range of noise conditions found at each of the arrays over the given time period. We take advantage of the fact that infrasound sensors are co-located or closely located to one another, which allows for a direct comparison of sensors, following the method of Ringler et al. (2010). The power level differences between two sensors at the same array in the frequency band of interest are used to monitor temporal changes in data quality and instrument conditions. A data quality factor is assigned to stations based on the average values of temporal changes estimated in the frequency and time domains. These monitoring tools enable us to automatically assess technical issues related to the instruments and data quality at each seismo-acoustic array, as well as to investigate local environmental effects and seasonal variations in both seismic and infrasound data.
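
    A minimal version of the co-located sensor comparison can be written with scipy: estimate each channel's power spectral density with Welch's method and track the dB difference in a band of interest. The signals below are synthetic, and the sample rate and band are illustrative assumptions.

```python
# Sketch of the co-located sensor comparison described above: estimate each
# channel's PSD with Welch's method and track their dB difference in a band
# of interest. The signals are synthetic and the band is illustrative.
import numpy as np
from scipy.signal import welch

fs = 20.0                                    # sample rate in Hz (assumed)
t = np.arange(0, 3600, 1 / fs)               # one hour of data
rng = np.random.default_rng(0)
common = rng.normal(size=t.size)             # shared ambient noise seen by both sensors
sensor_a = common + 0.05 * rng.normal(size=t.size)
sensor_b = 1.1 * common + 0.05 * rng.normal(size=t.size)   # slight gain difference

f, psd_a = welch(sensor_a, fs=fs, nperseg=4096)
_, psd_b = welch(sensor_b, fs=fs, nperseg=4096)

band = (f >= 0.1) & (f <= 1.0)               # frequency band of interest (illustrative)
diff_db = 10 * np.log10(psd_a[band] / psd_b[band])
print(f"mean PSD difference in band: {diff_db.mean():.2f} dB")
```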

  2. Water Quality Analysis Tool (WQAT)

    EPA Science Inventory

    The purpose of the Water Quality Analysis Tool (WQAT) software is to provide a means for analyzing and producing useful remotely sensed data products for an entire estuary, a particular point or area of interest (AOI or POI) in estuaries, or water bodies of interest where pre-pro...

  3. Using Quality Tools and Methodologies to Improve a Hospital's Quality Position.

    PubMed

    Branco, Daniel; Wicks, Angela M; Visich, John K

    2017-01-01

    The authors identify the quality tools and methodologies most frequently used by quality-positioned hospitals versus nonquality hospitals. Northeastern U.S. hospitals in both groups received a brief, 12-question survey. The authors found that 93.75% of the quality hospitals and 81.25% of the nonquality hospitals used some form of process improvement methodologies. However, there were significant differences between the groups regarding the impact of quality improvement initiatives on patients. The findings indicate that in quality hospitals the use of quality improvement initiatives had a significantly greater positive impact on patient satisfaction and patient outcomes when compared to nonquality hospitals.

  4. Continuous Quality Improvement Tools for Effective Teaching.

    ERIC Educational Resources Information Center

    Cornesky, Robert A.

    This manual presents 15 Continuous Quality Improvement (CQI) tools and techniques necessary for effective teaching. By using the tools and techniques of CQI, teachers will be able to help themselves and their students to focus on the classroom processes. This will permit the teacher and students to plan, organize, implement, and make decisions…

  5. Tools used for evaluation of Brazilian children's quality of life

    PubMed Central

    Souza, João Gabriel S.; Pamponet, Marcela Antunes; Souza, Tamirys Caroline S.; Pereira, Alessandra Ribeiro; Souza, Andrey George S.; Martins, Andréa Maria E. de B. L.

    2014-01-01

    OBJECTIVE: To review the available tools to evaluate children's quality of life validated for Brazilian language and culture. DATA SOURCES: Search of scientific articles in Medline, Lilacs and SciELO databases using the combination of descriptors "quality of life", "child" and "questionnaires" in Portuguese and English. DATA SYNTHESIS: Among the tools designed to assess children's quality of life validated for the Brazilian language and culture, the Auto questionnaire Qualité de Vie Enfant Imagé (AUQEI), the Child Health Questionnaire - Parent Form 50 (CHQ-PF50), the Pediatric Quality of Life Inventory (PedsQL(tm)) version 4.0 and the Kidscreen-52 are highlighted. Some tools do not include all range of ages and some lack domains that are currently considered relevant in the context of childhood, such as bullying. Moreover, due to the cultural diversity of Brazil, it may be necessary to adapt some instruments or to validate other tools. CONCLUSIONS: There are validated instruments to evaluate children's quality of life in Brazil. However, the validation or the adaptation of other international tools have to be considered in order to overcome current deficiencies. PMID:25119761

  6. Data quality control and tools in passive seismic experiments exemplified on the Czech broadband seismic pool MOBNET in the AlpArray collaborative project

    NASA Astrophysics Data System (ADS)

    Vecsey, Luděk; Plomerová, Jaroslava; Jedlička, Petr; Munzarová, Helena; Babuška, Vladislav; AlpArray Working Group

    2017-12-01

    This paper focuses on major issues related to the data reliability and network performance of 20 broadband (BB) stations of the Czech (CZ) MOBNET (MOBile NETwork) seismic pool within the AlpArray seismic experiments. Currently used high-resolution seismological applications require high-quality data recorded for a sufficiently long time interval at seismological observatories and during the entire time of operation of the temporary stations. In this paper we present new hardware and software tools we have been developing during the last two decades while analysing data from several international passive experiments. The new tools help to assure the high-quality standard of broadband seismic data and eliminate potential errors before supplying data to seismological centres. Special attention is paid to crucial issues like the detection of sensor misorientation, timing problems, interchange of record components and/or their polarity reversal, sensor mass centring, or anomalous channel amplitudes due to, for example, imperfect gain. Thorough data quality control should represent an integral constituent of seismic data recording, preprocessing, and archiving, especially for data from temporary stations in passive seismic experiments. Large international seismic experiments require enormous efforts from scientists from different countries and institutions to gather hundreds of stations to be deployed in the field during a limited time period. In this paper, we demonstrate the beneficial effects of the procedures we have developed for acquiring a reliable large set of high-quality data from each group participating in field experiments. The presented tools can be applied manually or automatically on data from any seismic network.

  7. Quality Assurance Project Plan Development Tool

    EPA Pesticide Factsheets

    This tool contains information designed to assist in developing a Quality Assurance (QA) Project Plan that meets EPA requirements for projects that involve surface or groundwater monitoring and/or the collection and analysis of water samples.

  8. Quality Controlling CMIP datasets at GFDL

    NASA Astrophysics Data System (ADS)

    Horowitz, L. W.; Radhakrishnan, A.; Balaji, V.; Adcroft, A.; Krasting, J. P.; Nikonov, S.; Mason, E. E.; Schweitzer, R.; Nadeau, D.

    2017-12-01

    As GFDL makes the switch from model development to production in light of the Coupled Model Intercomparison Project (CMIP), its efforts have shifted to testing and, more importantly, to establishing guidelines and protocols for quality controlling and semi-automated data publishing. Every CMIP cycle introduces key challenges, and the upcoming CMIP6 is no exception. The new CMIP experimental design comprises multiple MIPs facilitating research in different focus areas. This paradigm has implications not only for the groups that develop the models and conduct the runs, but also for the groups that monitor, analyze, and quality control the datasets before publishing, before their knowledge makes its way into reports like the IPCC (Intergovernmental Panel on Climate Change) Assessment Reports. In this talk, we discuss some of the paths taken at GFDL to quality control the CMIP-ready datasets, including: Jupyter notebooks; PrePARE; and a LAMP (Linux, Apache, MySQL, PHP/Python/Perl) technology-driven tracker system to monitor the status of experiments qualitatively and quantitatively and to provide additional metadata and analysis services, along with some built-in controlled-vocabulary validations in the workflow. In addition, we also discuss the integration of community-based model evaluation software (ESMValTool, PCMDI Metrics Package, and ILAMB) as part of our CMIP6 workflow.

  9. Advancement in modern approaches to mineral production quality control

    NASA Astrophysics Data System (ADS)

    Freidina, EV; Botvinnik, AA; Dvornikova, AN

    2017-02-01

    The natural resource potential of mineral deposits is represented by three categories: upside, attainable, and investment. A modern methodology for production quality control is proposed in this paper, and its tools, aimed at ensuring agreement between product quality and market requirements, are described. Definitions of the costs of product quality compliance and noncompliance with consumer requirements are introduced; the latter is proposed for use in evaluating the resource potential of mineral deposits at a given degree of probability.

  10. [Development of quality assurance/quality control web system in radiotherapy].

    PubMed

    Okamoto, Hiroyuki; Mochizuki, Toshihiko; Yokoyama, Kazutoshi; Wakita, Akihisa; Nakamura, Satoshi; Ueki, Heihachi; Shiozawa, Keiko; Sasaki, Koji; Fuse, Masashi; Abe, Yoshihisa; Itami, Jun

    2013-12-01

    Our purpose is to develop a QA/QC (quality assurance/quality control) web system using HTML (HyperText Markup Language) and the server-side scripting language PHP (Hypertext Preprocessor), which can be useful as a tool to share information about QA/QC in radiotherapy. The system proposed in this study can be easily built in one's own institute, because HTML can be easily handled. There are two desired functions in a QA/QC web system: (i) to review, through the system, the results of QA/QC for a radiotherapy machine and the manuals and reports necessary for routinely performing radiotherapy; by disclosing the results, transparency can be maintained; (ii) to present the protocol for QA/QC in one's own institute using pictures and movies relating to QA/QC for simplicity's sake, which can also serve as an educational tool for junior radiation technologists and medical physicists. By using this system, not only administrators but also all staff involved in radiotherapy can obtain information about the conditions and accuracy of treatment machines through the QA/QC web system.

  11. C++ software quality in the ATLAS experiment: tools and experience

    NASA Astrophysics Data System (ADS)

    Martin-Haugh, S.; Kluth, S.; Seuster, R.; Snyder, S.; Obreshkov, E.; Roe, S.; Sherwood, P.; Stewart, G. A.

    2017-10-01

    In this paper we explain how the C++ code quality is managed in ATLAS using a range of tools from compile-time through to run time testing and reflect on the substantial progress made in the last two years largely through the use of static analysis tools such as Coverity®, an industry-standard tool which enables quality comparison with general open source C++ code. Other available code analysis tools are also discussed, as is the role of unit testing with an example of how the GoogleTest framework can be applied to our codebase.

  12. Controlled dehydration improves the diffraction quality of two RNA crystals.

    PubMed

    Park, HaJeung; Tran, Tuan; Lee, Jun Hyuck; Park, Hyun; Disney, Matthew D

    2016-11-03

    Post-crystallization dehydration methods, applying either vapor diffusion or humidity control devices, have been widely used to improve the diffraction quality of protein crystals. Despite the fact that RNA crystals tend to diffract poorly, there is a dearth of reports on the application of dehydration methods to improve the diffraction quality of RNA crystals. We use dehydration techniques with a Free Mounting System (FMS, a humidity control device) to recover the poor diffraction quality of RNA crystals. These approaches were applied to RNA constructs that model various RNA-mediated repeat expansion disorders. The method we describe herein could serve as a general tool to improve diffraction quality of RNA crystals to facilitate structure determinations.

  13. Information quality-control model

    NASA Technical Reports Server (NTRS)

    Vincent, D. A.

    1971-01-01

    Model serves as graphic tool for estimating complete product objectives from limited input information, and is applied to cost estimations, product-quality evaluations, and effectiveness measurements for manpower resources allocation. Six product quality levels are defined.

  14. Quality Risk Management: Putting GMP Controls First.

    PubMed

    O'Donnell, Kevin; Greene, Anne; Zwitkovits, Michael; Calnan, Nuala

    2012-01-01

    This paper presents a practical way in which current approaches to quality risk management (QRM) may be improved, such that they better support qualification, validation programs, and change control proposals at manufacturing sites. The paper is focused on the treatment of good manufacturing practice (GMP) controls during QRM exercises. It specifically addresses why it is important to evaluate and classify such controls in terms of how they affect the severity, probability of occurrence, and detection ratings that may be assigned to potential failure modes or negative events. It also presents a QRM process that is designed to directly link the outputs of risk assessments and risk control activities with qualification and validation protocols in the GMP environment. This paper concerns the need for improvement in the use of risk-based principles and tools when working to ensure that the manufacturing processes used to produce medicines, and their related equipment, are appropriate. Manufacturing processes need to be validated (or proven) to demonstrate that they can produce a medicine of the required quality. The items of equipment used in such processes need to be qualified, in order to prove that they are fit for their intended use. Quality risk management (QRM) tools can be used to support such qualification and validation activities, but their use should be science-based and subject to as little subjectivity and uncertainty as possible. When changes are proposed to manufacturing processes, equipment, or related activities, they also need careful evaluation to ensure that any risks present are managed effectively. This paper presents a practical approach to how QRM may be improved so that it better supports qualification, validation programs, and change control proposals in a more scientific way. This improved approach is based on the treatment of what are called good manufacturing process (GMP) controls during those QRM exercises. A GMP control can be considered
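
    The severity, occurrence, and detection ratings discussed above are conventionally combined into a risk priority number (RPN = S x O x D) in FMEA-style assessments. The sketch below ranks a few hypothetical failure modes this way; the 1-10 scales and the entries are illustrative and are not taken from the paper.

```python
# Sketch of combining severity, occurrence, and detection ratings into a risk
# priority number (RPN = S x O x D), the conventional FMEA approach.
# The 1-10 scales and the failure modes listed are hypothetical examples.

failure_modes = [
    # (description, severity, occurrence, detection)  higher detection = harder to detect
    ("Autoclave cycle too short", 9, 3, 2),
    ("Balance out of calibration", 6, 4, 5),
    ("Wrong buffer lot used", 7, 2, 7),
]

ranked = sorted(
    ((desc, s * o * d) for desc, s, o, d in failure_modes),
    key=lambda item: item[1],
    reverse=True,
)
for desc, rpn in ranked:
    print(f"RPN {rpn:3d}  {desc}")
```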

  15. Improvement of Computer Software Quality through Software Automated Tools.

    DTIC Science & Technology

    1986-08-31

    requirement for increased emphasis on software quality assurance has led to the creation of various methods of verification and validation. Experience...result was a vast array of methods, systems, languages and automated tools to assist in the process. Given that the primary role of quality assurance is...Unfortunately, there is no single method, tool or technique that can ensure accurate, reliable and cost-effective software. Therefore, government and industry

  16. Implementing self sustained quality control procedures in a clinical laboratory.

    PubMed

    Khatri, Roshan; K C, Sanjay; Shrestha, Prabodh; Sinha, J N

    2013-01-01

    Quality control is an essential component of every clinical laboratory; it maintains the excellence of laboratory standards, supports proper disease diagnosis and patient care, and results in overall strengthening of the health care system. Numerous quality control schemes are available, with combinations of procedures, most of which are tedious, time-consuming, and can be "too technical," whereas commercially available quality control materials can be expensive, especially for laboratories in developing nations like Nepal. Here, we present a procedure performed at our centre with self-prepared control serum and the use of simple statistical tools for quality assurance. The pooled serum was prepared as per guidelines for the preparation of stabilized liquid quality control serum from human sera. Internal quality assessment was performed on this sample on a daily basis and included measurement of 12 routine biochemical parameters. The results were plotted on Levey-Jennings charts and analysed with quality control rules for a period of one month. The mean levels of biochemical analytes in the self-prepared control serum were within the normal physiological range. This serum was evaluated every day along with patients' samples. The results obtained were plotted on control charts and analysed using common quality control rules to identify possible systematic and random errors. Immediate mitigation measures were taken and the dispatch of erroneous reports was avoided. In this study we highlight a simple internal quality control procedure that can be performed by laboratories with minimal technology, expenditure, and expertise, improving the reliability and validity of test reports.

  17. PACS quality control and automatic problem notifier

    NASA Astrophysics Data System (ADS)

    Honeyman-Buck, Janice C.; Jones, Douglas; Frost, Meryll M.; Staab, Edward V.

    1997-05-01

    One side effect of installing a clinical PACS is that users become dependent upon the technology, and in some cases it can be very difficult to revert back to a film-based system if components fail. System failures range from slow deterioration of function, as seen in the loss of monitor luminance, to sudden catastrophic loss of the entire PACS network. This paper describes the quality control procedures in place at the University of Florida and the automatic notification system that alerts PACS personnel when a failure has happened or is anticipated. The goal is to recover from a failure with a minimum of downtime and no data loss. Routine quality control is practiced on all aspects of PACS, from acquisition, through network routing, through display, and including archiving. Whenever possible, the system components perform self-checks and cross-platform checks for active processes, file system status, errors in log files, and system uptime. When an error is detected or an exception occurs, an automatic page is sent to a pager with a diagnostic code. Documentation on each code, troubleshooting procedures, and repairs is kept on an intranet server accessible only to people involved in maintaining the PACS. In addition to the automatic paging system for error conditions, acquisition is assured by an automatic fax report sent on a daily basis to all technologists acquiring PACS images, to be used as a cross check that all studies are archived prior to being removed from the acquisition systems. Daily quality control is performed to assure that studies can be moved from each acquisition system and that contrast adjustment functions correctly. The results of selected quality control reports will be presented. The intranet documentation server will be described, along with the automatic pager system. Monitor quality control reports will be described and the cost of quality control will be quantified. As PACS is accepted as a clinical tool, the same standards of quality control must be established

  18. Enhanced methodology of focus control and monitoring on scanner tool

    NASA Astrophysics Data System (ADS)

    Chen, Yen-Jen; Kim, Young Ki; Hao, Xueli; Gomez, Juan-Manuel; Tian, Ye; Kamalizadeh, Ferhad; Hanson, Justin K.

    2017-03-01

    As the technology node shrinks from 14 nm to 7 nm, the reliability of the tool monitoring techniques that advanced semiconductor fabs rely on to achieve high yield and quality becomes more critical. Tool health monitoring methods involve periodic sampling of moderately processed test wafers to detect particles and defects and to verify tool stability in order to ensure proper tool health. For lithography TWINSCAN scanner tools, the requirements for overlay stability and focus control are very strict. Current scanner tool health monitoring methods include running BaseLiner on a periodic basis to ensure proper tool stability. The focus measurement on YIELDSTAR, by real-time or library-based reconstruction of critical dimensions (CD) and side wall angle (SWA), has been demonstrated as an accurate metrology input to the control loop. The high accuracy and repeatability of the YIELDSTAR focus measurement provide a common reference for scanner setup and user processes. In order to further improve the metrology and matching performance, Diffraction Based Focus (DBF) metrology, enabling accurate, fast, and non-destructive focus acquisition, has been successfully utilized for focus monitoring/control of TWINSCAN NXT immersion scanners. The optimal DBF target was determined to have minimized dose crosstalk, dynamic precision, set-get residual, and lens aberration sensitivity. By exploiting this new measurement target design, an 80% improvement in tool-to-tool matching, a >16% improvement in run-to-run mean focus stability, and a >32% improvement in focus uniformity have been demonstrated compared to the previous BaseLiner methodology. Matching of <2.4 nm across multiple NXT immersion scanners has been achieved with the new methodology of setting a baseline reference. This baseline technique, with either the conventional BaseLiner low numerical aperture (NA=1.20) mode or the advanced illumination high NA mode (NA=1.35), has also been evaluated to have consistent performance. This enhanced methodology of focus

  19. Improving the quality of EHR recording in primary care: a data quality feedback tool.

    PubMed

    van der Bij, Sjoukje; Khan, Nasra; Ten Veen, Petra; de Bakker, Dinny H; Verheij, Robert A

    2017-01-01

    Electronic health record (EHR) data are used to exchange information among health care providers. For this purpose, the quality of the data is essential. We developed a data quality feedback tool that evaluates differences in EHR data quality among practices and software packages as part of a larger intervention. The tool was applied in 92 practices in the Netherlands using different software packages. Practices received data quality feedback in 2010 and 2012. We observed large differences in the quality of recording. For example, the percentage of episodes of care that had a meaningful diagnostic code ranged from 30% to 100%. Differences were highly related to the software package. A year after the first measurement, the quality of recording had improved significantly and differences decreased, with 67% of the physicians indicating that they had actively changed their recording habits based on the results of the first measurement. About 80% found the feedback helpful in pinpointing recording problems. One of the software vendors made changes in functionality as a result of the feedback. Our EHR data quality feedback tool is capable of highlighting differences among practices and software packages. As such, it also stimulates improvements. As substantial variability in recording is related to the software package, our study strengthens the evidence that data quality can be improved substantially by standardizing the functionalities of EHR software packages. © The Author 2016. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
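
    The headline indicator quoted above (the percentage of episodes of care with a meaningful diagnostic code, per practice) is straightforward to compute from extracted EHR data. The sketch below does so with pandas; the column names and the definition of a "meaningful" code are assumptions for illustration, not the tool's actual implementation.

```python
# Sketch of the recording-quality indicator quoted above: the percentage of
# episodes of care with a meaningful diagnostic code, per practice.
# Column names and the notion of "meaningful" (non-empty, not an administrative
# rest code) are assumptions for illustration.
import pandas as pd

episodes = pd.DataFrame({
    "practice_id": ["A", "A", "A", "B", "B", "C"],
    "icpc_code":   ["K74", "", "A97", "R74", "R95", ""],
})

REST_CODES = {"A97"}   # hypothetical set of non-informative codes

def is_meaningful(code: str) -> bool:
    return bool(code) and code not in REST_CODES

episodes["meaningful"] = episodes["icpc_code"].map(is_meaningful)
quality = episodes.groupby("practice_id")["meaningful"].mean().mul(100).round(1)
print(quality)   # percentage of meaningfully coded episodes per practice
```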

  20. Selection and application of microbial source tracking tools for water-quality investigations

    USGS Publications Warehouse

    Stoeckel, Donald M.

    2005-01-01

    Microbial source tracking (MST) is a complex process that includes many decision-making steps. Once a contamination problem has been defined, the potential user of MST tools must thoroughly consider study objectives before deciding upon a source identifier, a detection method, and an analytical approach to apply to the problem. Regardless of which MST protocol is chosen, underlying assumptions can affect the results and interpretation. It is crucial to incorporate tests of those assumptions in the study quality-control plan to help validate results and facilitate interpretation. Detailed descriptions of MST objectives, protocols, and assumptions are provided in this report to assist in selection and application of MST tools for water-quality investigations. Several case studies illustrate real-world applications of MST protocols over a range of settings, spatial scales, and types of contamination. Technical details of many available source identifiers and detection methods are included as appendixes. By use of this information, researchers should be able to formulate realistic expectations for the information that MST tools can provide and, where possible, successfully execute investigations to characterize sources of fecal contamination to resource waters.

  1. Developing Cyberinfrastructure Tools and Services for Metadata Quality Evaluation

    NASA Astrophysics Data System (ADS)

    Mecum, B.; Gordon, S.; Habermann, T.; Jones, M. B.; Leinfelder, B.; Powers, L. A.; Slaughter, P.

    2016-12-01

    Metadata and data quality are at the core of reusable and reproducible science. While great progress has been made over the years, much of the metadata collected only addresses data discovery, covering concepts such as titles and keywords. Improving metadata beyond the discoverability plateau means documenting detailed concepts within the data such as sampling protocols, instrumentation used, and variables measured. Given that metadata commonly do not describe their data at this level, how might we improve the state of things? Giving scientists and data managers easy to use tools to evaluate metadata quality that utilize community-driven recommendations is the key to producing high-quality metadata. To achieve this goal, we created a set of cyberinfrastructure tools and services that integrate with existing metadata and data curation workflows which can be used to improve metadata and data quality across the sciences. These tools work across metadata dialects (e.g., ISO19115, FGDC, EML, etc.) and can be used to assess aspects of quality beyond what is internal to the metadata such as the congruence between the metadata and the data it describes. The system makes use of a user-friendly mechanism for expressing a suite of checks as code in popular data science programming languages such as Python and R. This reduces the burden on scientists and data managers to learn yet another language. We demonstrated these services and tools in three ways. First, we evaluated a large corpus of datasets in the DataONE federation of data repositories against a metadata recommendation modeled after existing recommendations such as the LTER best practices and the Attribute Convention for Dataset Discovery (ACDD). Second, we showed how this service can be used to display metadata and data quality information to data producers during the data submission and metadata creation process, and to data consumers through data catalog search and access tools. Third, we showed how the centrally
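
    A minimal sketch of what expressing "a suite of checks as code" might look like in Python is given below; the check names, thresholds and record structure are illustrative assumptions, not the actual DataONE recommendation or its implementation.

    ```python
    # Each check inspects a parsed metadata record (a plain dict here) and returns (check_name, passed, message).
    def check_title_length(record, min_words=7):
        words = record.get("title", "").split()
        return ("title_length", len(words) >= min_words,
                f"title has {len(words)} words (recommended >= {min_words})")

    def check_has_abstract(record):
        present = bool(record.get("abstract"))
        return ("has_abstract", present, "abstract present" if present else "abstract missing")

    def check_attributes_described(record):
        attrs = record.get("attributes", [])
        undocumented = [a["name"] for a in attrs if not a.get("definition")]
        return ("attributes_described", not undocumented,
                "all attributes defined" if not undocumented else f"missing definitions: {undocumented}")

    CHECKS = [check_title_length, check_has_abstract, check_attributes_described]

    record = {
        "title": "Volumetric soil moisture observations at site X, 2015-2016",
        "abstract": "",
        "attributes": [{"name": "sm", "definition": "volumetric soil moisture"}, {"name": "t", "definition": ""}],
    }

    for check in CHECKS:
        name, passed, message = check(record)
        print(f"{'PASS' if passed else 'FAIL'}  {name}: {message}")
    ```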

  2. Review of quality assessment tools for the evaluation of pharmacoepidemiological safety studies

    PubMed Central

    Neyarapally, George A; Hammad, Tarek A; Pinheiro, Simone P; Iyasu, Solomon

    2012-01-01

    Objectives Pharmacoepidemiological studies are an important hypothesis-testing tool in the evaluation of postmarketing drug safety. Despite the potential to produce robust value-added data, interpretation of findings can be hindered due to well-recognised methodological limitations of these studies. Therefore, assessment of their quality is essential to evaluating their credibility. The objective of this review was to evaluate the suitability and relevance of available tools for the assessment of pharmacoepidemiological safety studies. Design We created an a priori assessment framework consisting of reporting elements (REs) and quality assessment attributes (QAAs). A comprehensive literature search identified distinct assessment tools and the prespecified elements and attributes were evaluated. Primary and secondary outcome measures The primary outcome measure was the percentage representation of each domain, RE and QAA for the quality assessment tools. Results A total of 61 tools were reviewed. Most tools were not designed to evaluate pharmacoepidemiological safety studies. More than 50% of the reviewed tools considered REs under the research aims, analytical approach, outcome definition and ascertainment, study population and exposure definition and ascertainment domains. REs under the discussion and interpretation, results and study team domains were considered in less than 40% of the tools. Except for the data source domain, quality attributes were considered in less than 50% of the tools. Conclusions Many tools failed to include critical assessment elements relevant to observational pharmacoepidemiological safety studies and did not distinguish between REs and QAAs. Further, there is a lack of considerations on the relative weights of different domains and elements. The development of a quality assessment tool would facilitate consistent, objective and evidence-based assessments of pharmacoepidemiological safety studies. PMID:23015600

  3. qcML: An Exchange Format for Quality Control Metrics from Mass Spectrometry Experiments*

    PubMed Central

    Walzer, Mathias; Pernas, Lucia Espona; Nasso, Sara; Bittremieux, Wout; Nahnsen, Sven; Kelchtermans, Pieter; Pichler, Peter; van den Toorn, Henk W. P.; Staes, An; Vandenbussche, Jonathan; Mazanek, Michael; Taus, Thomas; Scheltema, Richard A.; Kelstrup, Christian D.; Gatto, Laurent; van Breukelen, Bas; Aiche, Stephan; Valkenborg, Dirk; Laukens, Kris; Lilley, Kathryn S.; Olsen, Jesper V.; Heck, Albert J. R.; Mechtler, Karl; Aebersold, Ruedi; Gevaert, Kris; Vizcaíno, Juan Antonio; Hermjakob, Henning; Kohlbacher, Oliver; Martens, Lennart

    2014-01-01

    Quality control is increasingly recognized as a crucial aspect of mass spectrometry based proteomics. Several recent papers discuss relevant parameters for quality control and present applications to extract these from the instrumental raw data. What has been missing, however, is a standard data exchange format for reporting these performance metrics. We therefore developed the qcML format, an XML-based standard that follows the design principles of the related mzML, mzIdentML, mzQuantML, and TraML standards from the HUPO-PSI (Proteomics Standards Initiative). In addition to the XML format, we also provide tools for the calculation of a wide range of quality metrics as well as a database format and interconversion tools, so that existing LIMS systems can easily add relational storage of the quality control data to their existing schema. We here describe the qcML specification, along with possible use cases and an illustrative example of the subsequent analysis possibilities. All information about qcML is available at http://code.google.com/p/qcml. PMID:24760958

  4. qcML: an exchange format for quality control metrics from mass spectrometry experiments.

    PubMed

    Walzer, Mathias; Pernas, Lucia Espona; Nasso, Sara; Bittremieux, Wout; Nahnsen, Sven; Kelchtermans, Pieter; Pichler, Peter; van den Toorn, Henk W P; Staes, An; Vandenbussche, Jonathan; Mazanek, Michael; Taus, Thomas; Scheltema, Richard A; Kelstrup, Christian D; Gatto, Laurent; van Breukelen, Bas; Aiche, Stephan; Valkenborg, Dirk; Laukens, Kris; Lilley, Kathryn S; Olsen, Jesper V; Heck, Albert J R; Mechtler, Karl; Aebersold, Ruedi; Gevaert, Kris; Vizcaíno, Juan Antonio; Hermjakob, Henning; Kohlbacher, Oliver; Martens, Lennart

    2014-08-01

    Quality control is increasingly recognized as a crucial aspect of mass spectrometry based proteomics. Several recent papers discuss relevant parameters for quality control and present applications to extract these from the instrumental raw data. What has been missing, however, is a standard data exchange format for reporting these performance metrics. We therefore developed the qcML format, an XML-based standard that follows the design principles of the related mzML, mzIdentML, mzQuantML, and TraML standards from the HUPO-PSI (Proteomics Standards Initiative). In addition to the XML format, we also provide tools for the calculation of a wide range of quality metrics as well as a database format and interconversion tools, so that existing LIMS systems can easily add relational storage of the quality control data to their existing schema. We here describe the qcML specification, along with possible use cases and an illustrative example of the subsequent analysis possibilities. All information about qcML is available at http://code.google.com/p/qcml. © 2014 by The American Society for Biochemistry and Molecular Biology, Inc.
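
    To make the idea of an XML-based metrics report concrete, the sketch below serializes a few invented run-level metrics with Python's standard library; the element and attribute names are simplified placeholders and do not reproduce the actual qcML schema.

    ```python
    import xml.etree.ElementTree as ET

    # Hypothetical run-level metrics for one LC-MS/MS run.
    metrics = {"MS2 spectra acquired": 48213, "Peptides identified": 16872, "Median mass accuracy (ppm)": 1.8}

    root = ET.Element("qcMLSketch", version="0.1")   # placeholder root, not the official qcML root element
    run = ET.SubElement(root, "runQuality", ID="run_2014_03_07_01")
    for name, value in metrics.items():
        ET.SubElement(run, "qualityMetric", name=name, value=str(value))

    ET.indent(root)                                  # pretty-print (Python 3.9+)
    print(ET.tostring(root, encoding="unicode"))
    ```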

  5. Satisfaction monitoring for quality control in campground management

    Treesearch

    Wilbur F. LaPage; Malcolm I. Bevins

    1981-01-01

    A 4-year study of camper satisfaction indicates that satisfaction monitoring is a useful tool for campground managers to assess their performance and achieve a high level of quality control in their service to the public. An indication of camper satisfaction with campground management is gained from a report card on which a small sample of visitors rates 14 elements of...

  6. Practical implementation science: developing and piloting the quality implementation tool.

    PubMed

    Meyers, Duncan C; Katz, Jason; Chien, Victoria; Wandersman, Abraham; Scaccia, Jonathan P; Wright, Annie

    2012-12-01

    According to the Interactive Systems Framework for Dissemination and Implementation, implementation is a major mechanism and concern in bridging research and practice. The growing number of implementation frameworks need to be synthesized and translated so that the science and practice of quality implementation can be furthered. In this article, we: (1) use the synthesis of frameworks developed by Meyers et al. (Am J Commun Psychol, 2012) and translate the results into a practical implementation science tool to use for improving quality of implementation (i.e., the Quality Implementation Tool; QIT), and (2) present some of the benefits and limitations of the tool by describing how the QIT was implemented in two different pilot projects. We discuss how the QIT can be used to guide collaborative planning, monitoring, and evaluation of how an innovation is implemented.

  7. A quality assessment tool for markup-based clinical guidelines.

    PubMed

    Shalom, Erez; Shahar, Yuval; Taieb-Maimon, Meirav; Lunenfeld, Eitan

    2008-11-06

    We introduce a tool for quality assessment of procedural and declarative knowledge. We developed this tool for evaluating the specification of mark-up-based clinical GLs. Using this graphical tool, the expert physician and the knowledge engineer collaborate to score, on a pre-defined scoring scale, each of the knowledge roles of the mark-ups against a gold standard. The tool enables different users at different locations to score the mark-ups simultaneously.

  8. Consequent use of IT tools as a driver for cost reduction and quality improvements

    NASA Astrophysics Data System (ADS)

    Hein, Stefan; Rapp, Roberto; Feustel, Andreas

    2013-10-01

    The semiconductor industry puts a great deal of effort into cost reduction and quality improvement. The consistent use of IT tools is one way to support these goals. With the extension of its 150 mm fab to 200 mm, Robert Bosch increased the systematic use of data analysis and Advanced Process Control (APC).

  9. Water Quality Analysis Tool (WQAT)

    EPA Pesticide Factsheets

    The purpose of the Water Quality Analysis Tool (WQAT) software is to provide a means for analyzing and producing useful remotely sensed data products for an entire estuary, a particular point or area of interest (AOI or POI) in estuaries, or water bodies of interest where pre-processed and geographically gridded remotely sensed images are available. A graphical user interface (GUI) was created to enable the user to select and display imagery from a variety of remote sensing data sources. The user can select a date (or date range) and location to extract pixels from the remotely sensed imagery. The GUI is used to obtain all available pixel values (i.e., pixel values from all available bands of all available satellites) for a given location on a given date and time. The resultant data set can be analyzed or saved to a file for future use. The WQAT software provides users with a way to establish algorithms between remote sensing reflectance (Rrs) and any available in situ parameters, as well as statistical and regression analysis. The combined data sets can be used to improve water quality research and studies. Satellites provide spatially synoptic data at high frequency (daily to weekly). These characteristics are desirable for supplementing existing water quality observations and for providing information for large aquatic ecosystems that are historically under-sampled by field programs. Thus, the Water Quality Assessment Tool (WQAT) software tool was developed to suppo
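
    A stripped-down sketch of the two core operations described above, extracting pixel values nearest a point of interest from a gridded scene and regressing remote-sensing reflectance (Rrs) against an in situ parameter, is shown below; the grid, station coordinates and chlorophyll values are invented and the code is not the WQAT implementation.

    ```python
    import numpy as np

    # Hypothetical gridded remote-sensing scene: reflectance on a regular lat/lon grid.
    lats = np.linspace(30.0, 30.5, 51)
    lons = np.linspace(-88.0, -87.5, 51)
    rrs_grid = np.random.default_rng(0).uniform(0.001, 0.02, size=(lats.size, lons.size))

    def extract_pixel(lat, lon):
        """Return the reflectance of the grid cell nearest to the requested point of interest."""
        i = np.abs(lats - lat).argmin()
        j = np.abs(lons - lon).argmin()
        return rrs_grid[i, j]

    # Hypothetical matchups: in situ chlorophyll sampled at station locations (lat, lon, chl).
    stations = [(30.12, -87.93, 4.1), (30.33, -87.71, 6.8), (30.45, -87.60, 9.2)]
    rrs = np.array([extract_pixel(lat, lon) for lat, lon, _ in stations])
    chl = np.array([c for _, _, c in stations])

    # Simple linear regression between Rrs and the in situ parameter.
    slope, intercept = np.polyfit(rrs, chl, 1)
    print(f"chl ~ {slope:.1f} * Rrs + {intercept:.2f}")
    ```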

  10. Development of a Multi-Domain Assessment Tool for Quality Improvement Projects.

    PubMed

    Rosenbluth, Glenn; Burman, Natalie J; Ranji, Sumant R; Boscardin, Christy K

    2017-08-01

    Improving the quality of health care and education has become a mandate at all levels within the medical profession. While several published quality improvement (QI) assessment tools exist, all have limitations in addressing the range of QI projects undertaken by learners in undergraduate medical education, graduate medical education, and continuing medical education. We developed and validated a tool to assess QI projects with learner engagement across the educational continuum. After reviewing existing tools, we interviewed local faculty who taught QI to understand how learners were engaged and what these faculty wanted in an ideal assessment tool. We then developed a list of competencies associated with QI, established items linked to these competencies, revised the items using an iterative process, and collected validity evidence for the tool. The resulting Multi-Domain Assessment of Quality Improvement Projects (MAQIP) rating tool contains 9 items, with criteria that may be completely fulfilled, partially fulfilled, or not fulfilled. Interrater reliability was 0.77. Untrained local faculty were able to use the tool with minimal guidance. The MAQIP is a 9-item, user-friendly tool that can be used to assess QI projects at various stages and to provide formative and summative feedback to learners at all levels.

  11. HVAC SYSTEMS AS A TOOL IN CONTROLLING INDOOR AIR QUALITY: A LITERATURE REVIEW

    EPA Science Inventory

    The report gives results of a review of literature on the use of heating, ventilating, and air-conditioning (HVAC) systems to control indoor air quality (IAQ). Although significant progress has been made in reducing the energy consumption of HVAC systems, their effect on indoor a...

  12. Atmospheric Model Evaluation Tool for meteorological and air quality simulations

    EPA Pesticide Factsheets

    The Atmospheric Model Evaluation Tool compares model predictions to observed data from various meteorological and air quality observation networks to help evaluate meteorological and air quality simulations.
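
    In spirit, comparing model predictions with network observations reduces to paired statistics such as mean bias and root-mean-square error; a minimal sketch with invented paired values follows (it is not the AMET code itself).

    ```python
    import numpy as np

    # Hypothetical paired values: modelled vs. observed ozone (ppb) at monitoring sites.
    model = np.array([41.2, 55.0, 60.3, 38.7, 47.5])
    obs = np.array([39.8, 58.1, 57.0, 40.2, 45.9])

    bias = np.mean(model - obs)                      # mean bias
    rmse = np.sqrt(np.mean((model - obs) ** 2))      # root-mean-square error
    nmb = 100.0 * np.sum(model - obs) / np.sum(obs)  # normalized mean bias (%)

    print(f"mean bias = {bias:.2f} ppb, RMSE = {rmse:.2f} ppb, NMB = {nmb:.1f}%")
    ```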

  13. Direct-To Tool for En Route Controllers

    NASA Technical Reports Server (NTRS)

    Erzberger, Heinz; McNally, David; Foster, Michelle

    2002-01-01

    This paper describes a new automation tool for en route air traffic controllers, called the Direct-To Tool. The Tool is designed to reduce the time of flight and fuel consumption for aircraft flying in en route airspace. It provides each controller with the identities of aircraft in his/her sector that can reduce their time en route by bypassing dog-legged route segments and flying "direct to" a waypoint closer to the destination airport. The Tool uses its built-in conflict probing capability to determine whether the improved route is free of conflicts with other aircraft. The Tool's graphical computer interface enables the controller to enter a direct-to clearance with a simple point-and-click action. Because of its low workload and convenience, this method is strongly favored by controllers. The Tool has been running since January with live radar data received at NASA from the Fort Worth Air Route Traffic Control Center. For aircraft operating in the Fort Worth Center, the Tool has the potential to save in excess of 500,000 in-flight minutes per year. A provisional patent application for this Tool has been filed. A field test is planned for the last quarter of this year.
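
    The benefit the Tool identifies can be approximated by comparing flight time along a dog-legged routing with flight time direct to a downstream waypoint; the sketch below uses great-circle distances with invented coordinates and ground speed, and is only a simplified stand-in for the Tool's trajectory computation.

    ```python
    from math import radians, sin, cos, asin, sqrt

    def gc_nm(p, q):
        """Great-circle distance between two (lat, lon) points in nautical miles (haversine)."""
        lat1, lon1, lat2, lon2 = map(radians, (*p, *q))
        a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
        return 2 * 3440.065 * asin(sqrt(a))

    # Hypothetical route: current position, an intermediate fix forming a dog-leg, and a downstream waypoint.
    position = (35.0, -101.0)
    dogleg_fix = (34.2, -99.5)
    waypoint = (33.5, -97.0)
    ground_speed_kt = 450.0

    filed = gc_nm(position, dogleg_fix) + gc_nm(dogleg_fix, waypoint)
    direct = gc_nm(position, waypoint)
    saving_min = 60.0 * (filed - direct) / ground_speed_kt
    print(f"direct-to saves about {saving_min:.1f} flight minutes")
    ```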

  14. INDOOR AIR QUALITY AND INHALATION EXPOSURE - SIMULATION TOOL KIT

    EPA Science Inventory

    A Microsoft Windows-based indoor air quality (IAQ) simulation software package is presented. Named Simulation Tool Kit for Indoor Air Quality and Inhalation Exposure, or IAQX for short, this package complements and supplements existing IAQ simulation programs and is desi...

  15. The quality of websites addressing fibromyalgia: an assessment of quality and readability using standardised tools

    PubMed Central

    MacDermid, Joy C; Wilkins, Seanne; Gibson, Jane; Shaw, Lynn

    2011-01-01

    Background Patients living with fibromyalgia strongly prefer to access health information on the web. However, the majority of subjects in previous studies strongly expressed their concerns about the quality of online information resources. Objectives The purpose of this study was to evaluate existing online fibromyalgia information resources for content, quality and readability by using standardised quality and readability tools. Methods The first 25 websites were identified using Google and the search keyword ‘fibromyalgia’. Pairs of raters independently evaluated website quality using two structured tools (DISCERN and a quality checklist). Readability was assessed using the Flesch Reading Ease score. Results Ranking of the websites' quality varied by the tool used, although there was general agreement about the top three websites (Fibromyalgia Information, Fibromyalgia Information Foundation and National Institute of Arthritis and Musculoskeletal and Skin Diseases). Content analysis indicated that 72% of websites provided information on treatment options, 68% on symptoms, 60% on diagnosis and 40% on coping and resources. DISCERN ratings classified 32% of websites as ‘very good’, 32% as ‘good’ and 36% as ‘marginal’. The mean overall DISCERN score was 36.88 (good). Only 16% of websites met the recommended literacy level grade of 6–8 (range 7–15). Conclusion Higher quality websites tended to be less readable. Online fibromyalgia information resources do not provide comprehensive information about fibromyalgia, and have low quality and poor readability. While information is very important for those living with fibromyalgia, current resources are unlikely to provide necessary or accurate information, and may not be usable for most people. PMID:22021777

  16. The quality of websites addressing fibromyalgia: an assessment of quality and readability using standardised tools.

    PubMed

    Daraz, Lubna; Macdermid, Joy C; Wilkins, Seanne; Gibson, Jane; Shaw, Lynn

    2011-07-31

    Background Patients living with fibromyalgia strongly prefer to access health information on the web. However, the majority of subjects in previous studies strongly expressed their concerns about the quality of online information resources. Objectives The purpose of this study was to evaluate existing online fibromyalgia information resources for content, quality and readability by using standardised quality and readability tools. Methods The first 25 websites were identified using Google and the search keyword 'fibromyalgia'. Pairs of raters independently evaluated website quality using two structured tools (DISCERN and a quality checklist). Readability was assessed using the Flesch Reading Ease score. Results Ranking of the websites' quality varied by the tool used, although there was general agreement about the top three websites (Fibromyalgia Information, Fibromyalgia Information Foundation and National Institute of Arthritis and Musculoskeletal and Skin Diseases). Content analysis indicated that 72% of websites provided information on treatment options, 68% on symptoms, 60% on diagnosis and 40% on coping and resources. DISCERN ratings classified 32% of websites as 'very good', 32% as 'good' and 36% as 'marginal'. The mean overall DISCERN score was 36.88 (good). Only 16% of websites met the recommended literacy level grade of 6-8 (range 7-15). Conclusion Higher quality websites tended to be less readable. Online fibromyalgia information resources do not provide comprehensive information about fibromyalgia, and have low quality and poor readability. While information is very important for those living with fibromyalgia, current resources are unlikely to provide necessary or accurate information, and may not be usable for most people.
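
    For reference, the Flesch Reading Ease score used in both of the assessments above is a simple function of average sentence length and average syllables per word; a minimal calculation with made-up counts (and syllable counting assumed to be done elsewhere) looks like this:

    ```python
    def flesch_reading_ease(total_words, total_sentences, total_syllables):
        """Flesch Reading Ease: higher scores indicate easier text (roughly 60-70 is plain English)."""
        words_per_sentence = total_words / total_sentences
        syllables_per_word = total_syllables / total_words
        return 206.835 - 1.015 * words_per_sentence - 84.6 * syllables_per_word

    # Hypothetical counts for one web page.
    print(round(flesch_reading_ease(total_words=1200, total_sentences=60, total_syllables=2100), 1))
    ```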

  17. Tool Time: Gender and Students' Use of Tools, Control, and Authority.

    ERIC Educational Resources Information Center

    Jones, M. Gail; Brader-Araje, Laura; Carboni, Lisa Wilson; Carter, Glenda; Rua, Melissa J.; Banilower, Eric; Hatch, Holly

    2000-01-01

    Observes 16 students from five elementary science classes to examine how students use tools when constructing new knowledge during science instruction, how control of tools is actualized from pedagogical perspectives, how language and tool accessibility intersect, how gender intersects with tool use, and how competition for resources impacts…

  18. The development of a quality appraisal tool for studies of diagnostic reliability (QAREL).

    PubMed

    Lucas, Nicholas P; Macaskill, Petra; Irwig, Les; Bogduk, Nikolai

    2010-08-01

    In systematic reviews of the reliability of diagnostic tests, no quality assessment tool has been used consistently. The aim of this study was to develop a specific quality appraisal tool for studies of diagnostic reliability. Key principles for the quality of studies of diagnostic reliability were identified with reference to epidemiologic principles, existing quality appraisal checklists, and the Standards for Reporting of Diagnostic Accuracy (STARD) and Quality Assessment of Diagnostic Accuracy Studies (QUADAS) resources. Specific items that encompassed each of the principles were developed. Experts in diagnostic research provided feedback on the items that were to form the appraisal tool. This process was iterative and continued until consensus among experts was reached. The Quality Appraisal of Reliability Studies (QAREL) checklist includes 11 items that explore seven principles. Items cover the spectrum of subjects, spectrum of examiners, examiner blinding, order effects of examination, suitability of the time interval among repeated measurements, appropriate test application and interpretation, and appropriate statistical analysis. QAREL has been developed as a specific quality appraisal tool for studies of diagnostic reliability. The reliability of this tool in different contexts needs to be evaluated. Copyright (c) 2010 Elsevier Inc. All rights reserved.

  19. Quality of life tools in head and neck oncology.

    PubMed

    Heutte, N; Plisson, L; Lange, M; Prevost, V; Babin, E

    2014-02-01

    Quality of life (QoL) is now as much an assessment criterion in clinical trials in head and neck oncology as are survival and response rate. It is therefore important to be able to choose an adapted tool from the wide range of QoL instruments available. The present study provides an inventory of QoL scales validated in their French-language version, to facilitate the selection of appropriate tools showing good psychometric properties. QoL scales cited in all 492 French and English language articles published between March 1st, 2006 and April 3rd, 2012, referenced on Medline and retrieved using the keywords "quality of life" AND "head and neck" AND "cancer", were inventoried and classified thematically in a search of the literature. Ninety QoL scales are presented by theme (ORL oncology, voice, swallowing and mastication, mucositis and xerostomia, etc.), specifying psychometric quality and citation level. The present report constitutes a guide to selecting QoL tools adapted to head and neck oncology studies. Copyright © 2013 Elsevier Masson SAS. All rights reserved.

  20. Practical quality control tools for curves and surfaces

    NASA Technical Reports Server (NTRS)

    Small, Scott G.

    1992-01-01

    Curves (geometry) and surfaces created by Computer Aided Geometric Design systems in the engineering environment must satisfy two basic quality criteria: the geometric shape must have the desired engineering properties; and the objects must be parameterized in a way which does not cause computational difficulty for geometric processing and engineering analysis. Interactive techniques are described which are in use at Boeing to evaluate the quality of aircraft geometry prior to Computational Fluid Dynamic analysis, including newly developed methods for examining surface parameterization and its effects.

  1. Control/structure interaction conceptual design tool

    NASA Technical Reports Server (NTRS)

    Briggs, Hugh C.

    1990-01-01

    The JPL Control/Structure Interaction Program is developing new analytical methods for designing micro-precision spacecraft with controlled structures. One of these, the Conceptual Design Tool, will illustrate innovative new approaches to the integration of multi-disciplinary analysis and design methods. The tool will be used to demonstrate homogeneity of presentation, uniform data representation across analytical methods, and integrated systems modeling. The tool differs from current 'integrated systems' that support design teams most notably in its support for the new CSI multi-disciplinary engineer. The design tool will utilize a three dimensional solid model of the spacecraft under design as the central data organization metaphor. Various analytical methods, such as finite element structural analysis, control system analysis, and mechanical configuration layout, will store and retrieve data from a hierarchical, object oriented data structure that supports assemblies of components with associated data and algorithms. In addition to managing numerical model data, the tool will assist the designer in organizing, stating, and tracking system requirements.
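
    A toy version of the hierarchical, object-oriented data organization described above, assemblies of components that each carry their own data and analysis hooks, might look like the following; the class names, attributes and mass roll-up are illustrative assumptions rather than the actual Conceptual Design Tool.

    ```python
    from dataclasses import dataclass, field

    @dataclass
    class Component:
        """A leaf element of the spacecraft model carrying data used by one or more analysis methods."""
        name: str
        mass_kg: float
        properties: dict = field(default_factory=dict)   # e.g. stiffness, actuator gains, layout coordinates

        def total_mass(self):
            return self.mass_kg

    @dataclass
    class Assembly:
        """A node that groups components and sub-assemblies, mirroring the solid-model hierarchy."""
        name: str
        children: list = field(default_factory=list)

        def total_mass(self):
            return sum(child.total_mass() for child in self.children)

    optics = Assembly("optical_bench", [Component("primary_mirror", 12.0), Component("metering_truss", 4.5)])
    bus = Assembly("spacecraft_bus", [Component("reaction_wheels", 9.0, {"controller": "PID"})])
    spacecraft = Assembly("spacecraft", [optics, bus])
    print(f"{spacecraft.name}: {spacecraft.total_mass():.1f} kg")
    ```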

  2. Assessments of the quality of randomized controlled trials published in International Journal of Urology from 1994 to 2011.

    PubMed

    Cho, Hee Ju; Chung, Jae Hoon; Jo, Jung Ki; Kang, Dong Hyuk; Cho, Jeong Man; Yoo, Tag Keun; Lee, Seung Wook

    2013-12-01

    Randomized controlled trials are one of the most reliable resources for assessing the effectiveness and safety of medical treatments. Low quality randomized controlled trials carry a large bias that can ultimately impair the reliability of their conclusions. The present study aimed to evaluate the quality of randomized controlled trials published in International Journal of Urology by using multiple quality assessment tools. Randomized controlled trials articles published in International Journal of Urology were found using the PubMed MEDLINE database, and qualitative analysis was carried out with three distinct assessment tools: the Jadad scale, the van Tulder scale and the Cochrane Collaboration Risk of Bias Tool. The quality of randomized controlled trials was analyzed by publication year, type of subjects, intervention, presence of funding and whether an institutional review board reviewed the study. A total of 68 randomized controlled trial articles were published among a total of 1399 original articles in International Journal of Urology. Among these randomized controlled trials, 10 (2.70%) were from 1994 to 1999, 23 (4.10%) were from 2000 to 2005 and 35 (4.00%) were from 2006 to 2011 (P = 0.494). On the assessment with the Jadad and van Tulder scale, the numbers and percentage of high quality randomized controlled trials increased over time. The studies that had institutional review board reviews, funding resources or that were carried out in multiple institutions had an increased percentage of high quality articles. The numbers and percentage of high-quality randomized controlled trials published in International Journal of Urology have increased over time. Furthermore, randomized controlled trials with funding resources, institutional review board reviews or carried out in multiple institutions have been found to be of higher quality compared with others not presenting these features. © 2013 The Japanese Urological Association.

  3. A New Tool for Quality: The Internal Audit.

    PubMed

    Haycock, Camille; Schandl, Annette

    As health care systems aspire to improve the quality and value for the consumers they serve, quality outcomes must be at the forefront of this value equation. As organizations implement evidence-based practices, electronic records to standardize processes, and quality improvement initiatives, many tactics are deployed to accelerate improvement and care outcomes. This article describes how one organization utilized a formal clinical audit process to identify gaps and/or barriers that may be contributing to underperforming measures and outcomes. This partnership between quality and audit can be a powerful tool and produce insights that can be scaled across a large health care system.

  4. Effectiveness of the Assessment of Burden of COPD (ABC) tool on health-related quality of life in patients with COPD: a cluster randomised controlled trial in primary and hospital care

    PubMed Central

    Slok, Annerika H M; Kotz, Daniel; van Breukelen, Gerard; Chavannes, Niels H; Rutten-van Mölken, Maureen P M H; Kerstjens, Huib A M; van der Molen, Thys; Asijee, Guus M; Dekhuijzen, P N Richard; Holverda, Sebastiaan; Salomé, Philippe L; Goossens, Lucas M A; Twellaar, Mascha; in ‘t Veen, Johannes C C M; van Schayck, Onno C P

    2016-01-01

    Objective Assessing the effectiveness of the Assessment of Burden of COPD (ABC) tool on disease-specific quality of life in patients with chronic obstructive pulmonary disease (COPD) measured with the St. George's Respiratory Questionnaire (SGRQ), compared with usual care. Methods A pragmatic cluster randomised controlled trial, in 39 Dutch primary care practices and 17 hospitals, with 357 patients with COPD (postbronchodilator FEV1/FVC ratio <0.7) aged ≥40 years, who could understand and read the Dutch language. Healthcare providers were randomly assigned to the intervention or control group. The intervention group applied the ABC tool, which consists of a short validated questionnaire assessing the experienced burden of COPD, objective COPD parameter (eg, lung function) and a treatment algorithm including a visual display and treatment advice. The control group provided usual care. Researchers were blinded to group allocation during analyses. Primary outcome was the number of patients with a clinically relevant improvement in SGRQ score between baseline and 18-month follow-up. Secondary outcomes were the COPD Assessment Test (CAT) and the Patient Assessment of Chronic Illness Care (PACIC; a measurement of perceived quality of care). Results At 18-month follow-up, 34% of the 146 patients from 27 healthcare providers in the intervention group showed a clinically relevant improvement in the SGRQ, compared with 22% of the 148 patients from 29 healthcare providers in the control group (OR 1.85, 95% CI 1.08 to 3.16). No difference was found on the CAT (−0.26 points (scores ranging from 0 to 40); 95% CI −1.52 to 0.99). The PACIC showed a higher improvement in the intervention group (0.32 points (scores ranging from 1 to 5); 95% CI 0.14 to 0.50). Conclusions This study showed that use of the ABC tool may increase quality of life and perceived quality of care. Trial registration number NTR3788; Results. PMID:27401361

  5. Proteomics as a Quality Control Tool of Pharmaceutical Probiotic Bacterial Lysate Products

    PubMed Central

    Klein, Günter; Schanstra, Joost P.; Hoffmann, Janosch; Mischak, Harald; Siwy, Justyna; Zimmermann, Kurt

    2013-01-01

    Probiotic bacteria have a wide range of applications in veterinary and human therapeutics. Inactivated probiotics are complex samples and quality control (QC) should measure as many molecular features as possible. Capillary electrophoresis coupled to mass spectrometry (CE/MS) has been used as a multidimensional and high throughput method for the identification and validation of biomarkers of disease in complex biological samples such as biofluids. In this study we evaluate the suitability of CE/MS to measure the consistency of different lots of the probiotic formulation Pro-Symbioflor which is a bacterial lysate of heat-inactivated Escherichia coli and Enterococcus faecalis. Over 5000 peptides were detected by CE/MS in 5 different lots of the bacterial lysate and in a sample of culture medium. 71 to 75% of the total peptide content was identical in all lots. This percentage increased to 87–89% when allowing the absence of a peptide in one of the 5 samples. These results, based on over 2000 peptides, suggest high similarity of the 5 different lots. Sequence analysis identified peptides of both E. coli and E. faecalis and peptides originating from the culture medium, thus confirming the presence of the strains in the formulation. Ontology analysis suggested that the majority of the peptides identified for E. coli originated from the cell membrane or the fimbrium, while peptides identified for E. faecalis were enriched for peptides originating from the cytoplasm. The bacterial lysate peptides as a whole are recognised as highly conserved molecular patterns by the innate immune system as microbe associated molecular pattern (MAMP). Sequence analysis also identified the presence of soybean, yeast and casein protein fragments that are part of the formulation of the culture medium. In conclusion CE/MS seems an appropriate QC tool to analyze complex biological products such as inactivated probiotic formulations and allows determining the similarity between lots. PMID
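
    The lot-to-lot similarity figures quoted above amount to set comparisons over the detected peptides; the sketch below computes both the strict criterion (a peptide present in every lot) and the relaxed one (absent from at most one lot) using toy peptide identifiers, not the real CE/MS output.

    ```python
    # Hypothetical peptide sets detected by CE/MS in each production lot.
    lots = {
        "lot1": {"p1", "p2", "p3", "p4", "p5"},
        "lot2": {"p1", "p2", "p3", "p4"},
        "lot3": {"p1", "p2", "p3", "p5"},
        "lot4": {"p1", "p2", "p3", "p4", "p5"},
        "lot5": {"p1", "p2", "p3", "p4", "p6"},
    }

    all_peptides = set().union(*lots.values())
    counts = {p: sum(p in s for s in lots.values()) for p in all_peptides}

    in_all = sum(c == len(lots) for c in counts.values())
    in_all_but_one = sum(c >= len(lots) - 1 for c in counts.values())

    print(f"identical in all lots: {100 * in_all / len(all_peptides):.0f}%")
    print(f"absent from at most one lot: {100 * in_all_but_one / len(all_peptides):.0f}%")
    ```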

  6. Modelling raw water quality: development of a drinking water management tool.

    PubMed

    Kübeck, Ch; van Berk, W; Bergmann, A

    2009-01-01

    Ensuring future drinking water supply requires rigorous management of groundwater resources. However, recent practice in economic resource control often neglects the hydrogeochemical and geohydraulic characteristics of the groundwater system. To analyse the available quantity and quality of future raw water, effective resource management requires a full understanding of the hydrogeochemical and geohydraulic processes within the aquifer. For example, knowing how raw water quality will develop over time helps in working out water treatment strategies as well as in planning financial resources. On the other hand, the effectiveness of planned measures to reduce the infiltration of harmful substances such as nitrate can be checked and optimized using hydrogeochemical modelling. Thus, within the framework of the InnoNet program funded by the Federal Ministry of Economics and Technology, a network of research institutes and water suppliers is working in close cooperation to develop a planning and management tool oriented particularly towards water management problems. The tool involves an innovative material flux model that calculates the hydrogeochemical processes while taking account of the dynamics of agricultural land use. The program's integrated graphical data evaluation is aligned with the needs of water suppliers.

  7. The adenosine triphosphate method as a quality control tool to assess 'cleanliness' of frequently touched hospital surfaces.

    PubMed

    Knape, L; Hambraeus, A; Lytsy, B

    2015-10-01

    The adenosine triphosphate (ATP) method is widely accepted as a quality control method to complement visual assessment, in the specifications of requirements, when purchasing cleaning contractors in Swedish hospitals. To examine whether the amount of biological load, as measured by ATP on frequently touched near-patient surfaces, had been reduced after an intervention; to evaluate the correlation between visual assessment and ATP levels on the same surfaces; to identify aspects of the performance of the ATP method as a tool in evaluating hospital cleanliness. A prospective intervention study in three phases was carried out in a medical ward and an intensive care unit (ICU) at a regional hospital in mid-Sweden between 2012 and 2013. Existing cleaning procedures were defined and baseline tests were sampled by visual inspection and ATP measurements of ten frequently touched surfaces in patients' rooms before and after intervention. The intervention consisted of educating nursing staff about the importance of hospital cleaning and direct feedback of ATP levels before and after cleaning. The mixed model showed a significant decrease in ATP levels after the intervention (P < 0.001). Relative light unit values were lower in the ICU. Cleanliness as judged by visual assessments improved. In the logistic regression analysis, there was a significant association between visual assessments and ATP levels. Direct feedback of ATP levels, together with education and introduction of written cleaning protocols, were effective tools to improve cleanliness. Visual assessment correlated with the level of ATP but the correlation was not absolute. The ATP method could serve as an educational tool for staff, but is not enough to assess hospital cleanliness in general as only a limited part of a large area is covered. Copyright © 2015 The Healthcare Infection Society. Published by Elsevier Ltd. All rights reserved.

  8. Quality Leadership and Quality Control

    PubMed Central

    Badrick, Tony

    2003-01-01

    Different quality control rules detect different analytical errors with varying levels of efficiency depending on the type of error present, its prevalence and the number of observations. The efficiency of a rule can be gauged by inspection of a power function graph. Control rules are only part of a process and not an end in themselves; just as important are the trouble-shooting systems employed when a failure occurs. 'Average of patient normals' may become a standard adjunct to conventional serum-based quality control programmes. Acceptable error can be based on various criteria; biological variation is probably the most sensible. Once determined, acceptable error can be used to set limits in quality control rule systems. A key aspect of an organisation is leadership, which links the various components of the quality system. Leadership is difficult to characterise, but its key aspects include trust, setting an example, developing staff and, critically, setting the vision for the organisation. Organisations also have internal characteristics such as the degree of formalisation, centralisation, and complexity. Medical organisations can have internal tensions because of the dichotomy between the bureaucratic and the shadow medical structures. PMID:18568046
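
    As a concrete illustration of a control rule of the kind discussed, the sketch below applies a common single-rule check, rejecting a run when any control observation falls more than three standard deviations from the target mean; the rule choice, target values and results are illustrative and not drawn from the article.

    ```python
    # Hypothetical QC serum results for one analyte, with an assigned target mean and SD.
    target_mean, target_sd = 5.0, 0.2
    qc_results = [5.1, 4.9, 5.0, 5.7, 4.8, 5.2]

    def rule_1_3s(results, mean, sd):
        """Flag any single control observation falling outside mean +/- 3 SD."""
        return [x for x in results if abs(x - mean) > 3 * sd]

    violations = rule_1_3s(qc_results, target_mean, target_sd)
    print("run rejected:", bool(violations), "violations:", violations)
    ```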

  9. Principles and Practices for Quality Assurance and Quality Control

    USGS Publications Warehouse

    Jones, Berwyn E.

    1999-01-01

    Quality assurance and quality control are vital parts of highway runoff water-quality monitoring projects. To be effective, project quality assurance must address all aspects of the project, including project management responsibilities and resources, data quality objectives, sampling and analysis plans, data-collection protocols, data quality-control plans, data-assessment procedures and requirements, and project outputs. Quality control ensures that the data quality objectives are achieved as planned. The historical development and current state of the art of quality assurance and quality control concepts described in this report can be applied to evaluation of data from prior projects.

  10. Quality control management and communication between radiologists and technologists.

    PubMed

    Nagy, Paul G; Pierce, Benjamin; Otto, Misty; Safdar, Nabile M

    2008-06-01

    The greatest barrier to quality control (QC) in the digital imaging environment is the lack of communication and documentation between those who interpret images and those who acquire them. Paper-based QC methods are insufficient in a digital image management system. Problem work flow must be incorporated into reengineering efforts when migrating to a digital practice. The authors implemented a Web-based QC feedback tool to document and facilitate the communication of issues identified by radiologists. The goal was to promote a responsive and constructive tool that contributes to a culture of quality. The hypothesis was that by making it easier for radiologists to submit quality issues, the number of QC issues submitted would increase. The authors integrated their Web-based quality tracking system with a clinical picture archiving and communication system so that radiologists could report quality issues without disrupting clinical work flow. Graphical dashboarding techniques aid supervisors in using this database to identify the root causes of different types of issues. Over the initial 12-month rollout period, starting in the general section, the authors recorded 20 times more QC issues submitted by radiologists, accompanied by a rise in technologists' responsiveness to QC issues. For technologists with high numbers of QC issues, the incorporation of data from this tracking system proved useful in performance appraisals and in driving individual improvement. This tool is an example of the types of information technology innovations that can be leveraged to support QC in the digital imaging environment. Initial data suggest that the result is not only an improvement in quality but higher levels of satisfaction for both radiologists and technologists.

  11. Quality control and in-service inspection technology for hybrid-composite girder bridges.

    DOT National Transportation Integrated Search

    2014-08-01

    This report describes efforts to develop quality control tools and in-service inspection technologies for the fabrication and construction of Hybrid Composite Beams (HCBs). HCBs are a new bridge technology currently being evaluated by the Missouri De...

  12. Current Quality-of-Life Tools Available for Use in Contact Dermatitis.

    PubMed

    Swietlik, Jacquelyn; Reeder, Margo

    2016-01-01

    Contact dermatitis is a common dermatologic condition that can cause significant impairment in patients' overall quality of life (QoL). This impact is separate and potentially more clinically relevant than one's disease "severity" in contact dermatitis and should be consistently addressed by dermatologists. Despite this, QoL tools specific to contact dermatitis are lacking, and there is little consistency in the literature regarding the tool used to evaluate clinical response to therapies. Measurements currently available to evaluate disease-related QoL in contact dermatitis fit into 1 of the following 3 general types: generic health-related QoL measures, dermatology-related QoL measures, or specific dermatologic disease-related QoL measures. This article reviews the strengths and weaknesses of existing QoL tools used in contact dermatitis including: Short Form Survey 36, Dermatology Life Quality Index, Skindex-29, Skindex-16, Dermatology-Specific Quality of Life, and Fragrance Quality of Life Index.

  13. Variables Control Charts: A Measurement Tool to Detect Process Problems within Housing

    ERIC Educational Resources Information Center

    Luna, Andrew

    1999-01-01

    The purpose of this study was to use quality improvement tools to determine if the current process of supplying hot water to a high-rise residence hall for women at a southeastern Doctoral I granting institution was in control. After a series of focus groups among the residents in the hall, it was determined that they were mostly concerned about…
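
    For a variables control chart of the kind used in the study, the centre line and control limits follow from the subgroup means and ranges; the X-bar chart sketch below uses invented hot-water temperature subgroups and the standard A2 factor for subgroups of size four.

    ```python
    import numpy as np

    # Hypothetical subgroups: hot-water temperature (°F) sampled four times per day over six days.
    subgroups = np.array([
        [118, 121, 119, 120],
        [117, 116, 118, 119],
        [122, 121, 123, 120],
        [115, 118, 117, 116],
        [119, 120, 118, 121],
        [116, 117, 119, 118],
    ])

    A2 = 0.729                               # X-bar chart factor for subgroups of size 4
    xbar = subgroups.mean(axis=1)            # subgroup means
    rbar = np.ptp(subgroups, axis=1).mean()  # average subgroup range
    center = xbar.mean()

    ucl, lcl = center + A2 * rbar, center - A2 * rbar
    out_of_control = np.where((xbar > ucl) | (xbar < lcl))[0]
    print(f"CL={center:.1f}, UCL={ucl:.1f}, LCL={lcl:.1f}, out-of-control subgroups: {out_of_control.tolist()}")
    ```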

  14. Tools for surveying and improving the quality of life: people with special needs in focus.

    PubMed

    Hoyningen-Süess, Ursula; Oberholzer, David; Stalder, René; Brügger, Urs

    2012-01-01

    This article seeks to describe online tools for surveying and improving quality of life for people with disabilities living in assisted living centers and special education service organizations. Ensuring a decent quality of life for disabled people is an important welfare state goal. Using well-accepted quality of life conceptions, online diagnostic and planning tools were developed during an Institute for Education, University of Zurich, research project. The diagnostic tools measure, evaluate and analyze disabled people's quality of life. The planning tools identify factors that can affect their quality of life and suggest improvements. Instrument validity and reliability are not tested according to the standard statistical procedures. This will be done at a more advanced stage of the project. Instead, the tool is developed, refined and adjusted in cooperation with practitioners who are constantly judging it according to best practice standards. The tools support staff in assisted living centers and special education service organizations. These tools offer comprehensive resources for surveying, quantifying, evaluating, describing and simulating quality of life elements.

  15. Evidence-based decision-making in infectious diseases epidemiology, prevention and control: matching research questions to study designs and quality appraisal tools.

    PubMed

    Harder, Thomas; Takla, Anja; Rehfuess, Eva; Sánchez-Vivar, Alex; Matysiak-Klose, Dorothea; Eckmanns, Tim; Krause, Gérard; de Carvalho Gomes, Helena; Jansen, Andreas; Ellis, Simon; Forland, Frode; James, Roberta; Meerpohl, Joerg J; Morgan, Antony; Schünemann, Holger; Zuiderent-Jerak, Teun; Wichmann, Ole

    2014-05-21

    The Project on a Framework for Rating Evidence in Public Health (PRECEPT) was initiated and is being funded by the European Centre for Disease Prevention and Control (ECDC) to define a methodology for evaluating and grading evidence and strength of recommendations in the field of public health, with emphasis on infectious disease epidemiology, prevention and control. One of the first steps was to review existing quality appraisal tools (QATs) for individual research studies of various designs relevant to this area, using a question-based approach. Through team discussions and expert consultations, we identified 20 relevant types of public health questions, which were grouped into six domains, i.e. characteristics of the pathogen, burden of disease, diagnosis, risk factors, intervention, and implementation of intervention. Previously published systematic reviews were used and supplemented by expert consultation to identify suitable QATs. Finally, a matrix was constructed for matching questions to study designs suitable to address them and respective QATs. Key features of each of the included QATs were then analyzed, in particular in respect to its intended use, types of questions and answers, presence/absence of a quality score, and if a validation was performed. In total we identified 21 QATs and 26 study designs, and matched them. Four QATs were suitable for experimental quantitative study designs, eleven for observational quantitative studies, two for qualitative studies, three for economic studies, one for diagnostic test accuracy studies, and one for animal studies. Included QATs consisted of six to 28 items. Six of the QATs had a summary quality score. Fourteen QATs had undergone at least one validation procedure. The results of this methodological study can be used as an inventory of potentially relevant questions, appropriate study designs and QATs for researchers and authorities engaged with evidence-based decision-making in infectious disease epidemiology

  16. Evidence-based decision-making in infectious diseases epidemiology, prevention and control: matching research questions to study designs and quality appraisal tools

    PubMed Central

    2014-01-01

    Background The Project on a Framework for Rating Evidence in Public Health (PRECEPT) was initiated and is being funded by the European Centre for Disease Prevention and Control (ECDC) to define a methodology for evaluating and grading evidence and strength of recommendations in the field of public health, with emphasis on infectious disease epidemiology, prevention and control. One of the first steps was to review existing quality appraisal tools (QATs) for individual research studies of various designs relevant to this area, using a question-based approach. Methods Through team discussions and expert consultations, we identified 20 relevant types of public health questions, which were grouped into six domains, i.e. characteristics of the pathogen, burden of disease, diagnosis, risk factors, intervention, and implementation of intervention. Previously published systematic reviews were used and supplemented by expert consultation to identify suitable QATs. Finally, a matrix was constructed for matching questions to study designs suitable to address them and respective QATs. Key features of each of the included QATs were then analyzed, in particular in respect to its intended use, types of questions and answers, presence/absence of a quality score, and if a validation was performed. Results In total we identified 21 QATs and 26 study designs, and matched them. Four QATs were suitable for experimental quantitative study designs, eleven for observational quantitative studies, two for qualitative studies, three for economic studies, one for diagnostic test accuracy studies, and one for animal studies. Included QATs consisted of six to 28 items. Six of the QATs had a summary quality score. Fourteen QATs had undergone at least one validation procedure. Conclusions The results of this methodological study can be used as an inventory of potentially relevant questions, appropriate study designs and QATs for researchers and authorities engaged with evidence-based decision

  17. Framing quality improvement tools and techniques in healthcare: the case of improvement leaders' guides.

    PubMed

    Millar, Ross

    2013-01-01

    The purpose of this paper is to present a study of how quality improvement tools and techniques are framed within healthcare settings. The paper employs an interpretive approach to understand how quality improvement tools and techniques are mobilised and legitimated. It does so using a case study of the NHS Modernisation Agency Improvement Leaders' Guides in England. Improvement Leaders' Guides were framed within a service improvement approach encouraging the use of quality improvement tools and techniques within healthcare settings. Their use formed part of enacting tools and techniques across different contexts. Whilst this enactment was believed to support the mobilisation of tools and techniques, the experience also illustrated the challenges in distributing such approaches. The paper provides an important contribution in furthering our understanding of framing the "social act" of quality improvement. Given the ongoing emphasis on quality improvement in health systems and the persistent challenges involved, it also provides important information for healthcare leaders globally in seeking to develop, implement or modify similar tools and distribute leadership within health and social care settings.

  18. Gironacel[R]: A Virtual Tool for Learning Quality Management

    ERIC Educational Resources Information Center

    Mendez, Empar; Casadesus, Marti; De Ciurana, Quim

    2006-01-01

    This article describes the Gironacel[R] project--a virtual learning environment produced by the University of Girona. The purpose of this tool is to make it easier for students studying quality management courses within engineering schools to understand what the "quality culture" is and how to implement the ISO 9001:2000 standard in a…

  19. Optical Method For Monitoring Tool Control For Green Burnishing With Using Of Algorithms With Adaptive Settings

    NASA Astrophysics Data System (ADS)

    Lukyanov, A. A.; Grigoriev, S. N.; Bobrovskij, I. N.; Melnikov, P. A.; Bobrovskij, N. M.

    2017-05-01

    As new technology becomes more complex and its reliability requirements increase, the labour involved in control operations within industrial quality control systems grows significantly. Quality management control is important because it promotes the correct use of production conditions so that the relevant requirements are met. Digital image processing makes it possible to reach a new technological level of production (a new technological paradigm). The most complicated step, automated interpretation of information, is the basis for decision-making in the management of production processes. In the case of surface analysis of tools used for processing with metalworking fluids (MWF), the task is even more complicated. The authors propose a new algorithm for optical inspection of the wear of a cylindrical burnishing tool used in surface plastic deformation without MWF. The main advantage of the proposed algorithm is that it automatically recognises images of the burnishing tool, extracts its boundaries, locates the working surface, and identifies defects and the wear area. The software implementing the algorithm was developed by the authors in the Matlab programming environment, but it can be implemented in other programming languages.
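
    At their simplest, the boundary-finding and wear-measurement steps of such an algorithm reduce to segmenting the tool image and measuring the fraction of tool pixels classified as worn; the sketch below uses a synthetic grayscale image and fixed thresholds (a real implementation would use adaptive settings, as the title indicates), and is not the authors' Matlab code.

    ```python
    import numpy as np

    # Synthetic 8-bit grayscale image of a burnishing tool: bright tool on dark background,
    # with a darker patch standing in for a worn region on the working surface.
    img = np.full((100, 100), 30, dtype=np.uint8)     # background
    img[20:80, 25:75] = 200                           # tool body
    img[45:60, 40:55] = 110                           # worn area (intermediate intensity)

    tool_mask = img > 70                              # segment tool from background (fixed threshold)
    wear_mask = tool_mask & (img < 150)               # within the tool, darker pixels count as wear

    wear_fraction = wear_mask.sum() / tool_mask.sum()
    print(f"wear covers {100 * wear_fraction:.1f}% of the tool surface in this frame")
    ```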

  20. mtDNAmanager: a Web-based tool for the management and quality analysis of mitochondrial DNA control-region sequences

    PubMed Central

    Lee, Hwan Young; Song, Injee; Ha, Eunho; Cho, Sung-Bae; Yang, Woo Ick; Shin, Kyoung-Jin

    2008-01-01

    Background For the past few years, scientific controversy has surrounded the large number of errors in forensic and literature mitochondrial DNA (mtDNA) data. However, recent research has shown that using mtDNA phylogeny and referring to known mtDNA haplotypes can be useful for checking the quality of sequence data. Results We developed a Web-based bioinformatics resource "mtDNAmanager" that offers a convenient interface supporting the management and quality analysis of mtDNA sequence data. The mtDNAmanager performs computations on mtDNA control-region sequences to estimate the most-probable mtDNA haplogroups and retrieves similar sequences from a selected database. By the phased designation of the most-probable haplogroups (both expected and estimated haplogroups), mtDNAmanager enables users to systematically detect errors whilst allowing for confirmation of the presence of clear key diagnostic mutations and accompanying mutations. The query tools of mtDNAmanager also facilitate database screening with two options of "match" and "include the queried nucleotide polymorphism". In addition, mtDNAmanager provides Web interfaces for users to manage and analyse their own data in batch mode. Conclusion The mtDNAmanager will provide systematic routines for mtDNA sequence data management and analysis via easily accessible Web interfaces, and thus should be very useful for population, medical and forensic studies that employ mtDNA analysis. mtDNAmanager can be accessed at . PMID:19014619

  1. A CIS (Clinical Information System) Quality Evaluation Tool for Nursing Care Services

    ERIC Educational Resources Information Center

    Lee, Seon Ah

    2010-01-01

    The purpose of this study was to develop a tool to evaluate the quality of a clinical information system (CIS) conceived by nurses and conduct a pilot test with the developed tool as an initial assessment. CIS quality is required for successful implementation in information technology (IT) environments. The study started with the realization that…

  2. ChiLin: a comprehensive ChIP-seq and DNase-seq quality control and analysis pipeline.

    PubMed

    Qin, Qian; Mei, Shenglin; Wu, Qiu; Sun, Hanfei; Li, Lewyn; Taing, Len; Chen, Sujun; Li, Fugen; Liu, Tao; Zang, Chongzhi; Xu, Han; Chen, Yiwen; Meyer, Clifford A; Zhang, Yong; Brown, Myles; Long, Henry W; Liu, X Shirley

    2016-10-03

    Transcription factor binding, histone modification, and chromatin accessibility studies are important approaches to understanding the biology of gene regulation. ChIP-seq and DNase-seq have become the standard techniques for studying protein-DNA interactions and chromatin accessibility respectively, and comprehensive quality control (QC) and analysis tools are critical to extracting the most value from these assay types. Although many analysis and QC tools have been reported, few combine ChIP-seq and DNase-seq data analysis and quality control in a unified framework with a comprehensive and unbiased reference of data quality metrics. ChiLin is a computational pipeline that automates the quality control and data analyses of ChIP-seq and DNase-seq data. It is developed using a flexible and modular software framework that can be easily extended and modified. ChiLin is ideal for batch processing of many datasets and is well suited for large collaborative projects involving ChIP-seq and DNase-seq from different designs. ChiLin generates comprehensive quality control reports that include comparisons with historical data derived from over 23,677 public ChIP-seq and DNase-seq samples (11,265 datasets) from eight literature-based classified categories. To the best of our knowledge, this atlas represents the most comprehensive ChIP-seq and DNase-seq related quality metric resource currently available. These historical metrics provide useful heuristic quality references for experiments across all commonly used assay types. Using representative datasets, we demonstrate the versatility of the pipeline by applying it to different assay types of ChIP-seq data. The pipeline software is available open source at https://github.com/cfce/chilin . ChiLin is a scalable and powerful tool to process large batches of ChIP-seq and DNase-seq datasets. The analysis output and quality metrics have been structured into user-friendly directories and reports. We have successfully compiled 23

  3. Quality Tools for Professional Higher Education Review and Improvement. PHExcel Report

    ERIC Educational Resources Information Center

    Jørgensen, Malene Dahl; Sparre Kristensen, Regitze; Wimpf, Alexandre; Delplace, Stefan

    2014-01-01

    The report is the project's first outcome, and provides an overview of quality tools, quality models and quality labels, currently in use in (professional) higher education. It is followed by a gap analysis as regards the Standards and Guidelines for quality assurance in the European Higher Education Area (ESG), and the identified characteristics…

  4. QUAST: quality assessment tool for genome assemblies.

    PubMed

    Gurevich, Alexey; Saveliev, Vladislav; Vyahhi, Nikolay; Tesler, Glenn

    2013-04-15

    Limitations of genome sequencing techniques have led to dozens of assembly algorithms, none of which is perfect. A number of methods for comparing assemblers have been developed, but none is yet a recognized benchmark. Further, most existing methods for comparing assemblies are only applicable to new assemblies of finished genomes; the problem of evaluating assemblies of previously unsequenced species has not been adequately considered. Here, we present QUAST-a quality assessment tool for evaluating and comparing genome assemblies. This tool improves on leading assembly comparison software with new ideas and quality metrics. QUAST can evaluate assemblies both with a reference genome, as well as without a reference. QUAST produces many reports, summary tables and plots to help scientists in their research and in their publications. In this study, we used QUAST to compare several genome assemblers on three datasets. QUAST tables and plots for all of them are available in the Supplementary Material, and interactive versions of these reports are on the QUAST website. http://bioinf.spbau.ru/quast . Supplementary data are available at Bioinformatics online.

  5. Internal quality control: best practice.

    PubMed

    Kinns, Helen; Pitkin, Sarah; Housley, David; Freedman, Danielle B

    2013-12-01

    There is a wide variation in laboratory practice with regard to implementation and review of internal quality control (IQC). A poor approach can lead to a spectrum of scenarios from validation of incorrect patient results to over investigation of falsely rejected analytical runs. This article will provide a practical approach for the routine clinical biochemistry laboratory to introduce an efficient quality control system that will optimise error detection and reduce the rate of false rejection. Each stage of the IQC system is considered, from selection of IQC material to selection of IQC rules, and finally the appropriate action to follow when a rejection signal has been obtained. The main objective of IQC is to ensure day-to-day consistency of an analytical process and thus help to determine whether patient results are reliable enough to be released. The required quality and assay performance varies between analytes as does the definition of a clinically significant error. Unfortunately many laboratories currently decide what is clinically significant at the troubleshooting stage. Assay-specific IQC systems will reduce the number of inappropriate sample-run rejections compared with the blanket use of one IQC rule. In practice, only three or four different IQC rules are required for the whole of the routine biochemistry repertoire as assays are assigned into groups based on performance. The tools to categorise performance and assign IQC rules based on that performance are presented. Although significant investment of time and education is required prior to implementation, laboratories have shown that such systems achieve considerable reductions in cost and labour.
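
    The article groups assays by performance and assigns a small number of IQC rules to each group; the specific rules are not listed in this record. As a hedged illustration of how such multirule IQC evaluation can be coded, the sketch below applies two common Westgard-style rules (1-3s and 2-2s), chosen here only as examples.

```python
# Sketch of multirule internal QC of the kind discussed above. The 1-3s and
# 2-2s rules used here are common Westgard rules chosen only for illustration;
# the article does not prescribe specific rules in this record.
import numpy as np

def iqc_check(values, target_mean, target_sd):
    z = (np.asarray(values, dtype=float) - target_mean) / target_sd
    violations = []
    if np.any(np.abs(z) > 3):                 # 1-3s: one point beyond 3 SD
        violations.append("1-3s")
    for a, b in zip(z[:-1], z[1:]):           # 2-2s: two consecutive points
        if (a > 2 and b > 2) or (a < -2 and b < -2):   # beyond 2 SD, same side
            violations.append("2-2s")
            break
    return ("reject", violations) if violations else ("accept", violations)

# Invented control results for one analyte with target mean 100 and SD 1.
print(iqc_check([102.5, 102.2, 99.8], target_mean=100.0, target_sd=1.0))
```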

  6. QUALITY CONTROLS FOR PCR

    EPA Science Inventory

    The purpose of this presentation is to present an overview of the quality control (QC) sections of a draft EPA document entitled, "Quality Assurance/Quality Control Guidance for Laboratories Performing PCR Analyses on Environmental Samples." This document has been prepared by th...

  7. Distributed Engine Control Empirical/Analytical Verification Tools

    NASA Technical Reports Server (NTRS)

    DeCastro, Jonathan; Hettler, Eric; Yedavalli, Rama; Mitra, Sayan

    2013-01-01

    NASA's vision for an intelligent engine will be realized with the development of a truly distributed control system featuring highly reliable, modular, and dependable components capable of both surviving the harsh engine operating environment and decentralized functionality. A set of control system verification tools was developed and applied to a C-MAPSS40K engine model, and metrics were established to assess the stability and performance of these control systems on the same platform. A software tool was developed that allows designers to assemble easily a distributed control system in software and immediately assess the overall impacts of the system on the target (simulated) platform, allowing control system designers to converge rapidly on acceptable architectures with consideration to all required hardware elements. The software developed in this program will be installed on a distributed hardware-in-the-loop (DHIL) simulation tool to assist NASA and the Distributed Engine Control Working Group (DECWG) in integrating DCS (distributed engine control systems) components onto existing and next-generation engines.The distributed engine control simulator blockset for MATLAB/Simulink and hardware simulator provides the capability to simulate virtual subcomponents, as well as swap actual subcomponents for hardware-in-the-loop (HIL) analysis. Subcomponents can be the communication network, smart sensor or actuator nodes, or a centralized control system. The distributed engine control blockset for MATLAB/Simulink is a software development tool. The software includes an engine simulation, a communication network simulation, control algorithms, and analysis algorithms set up in a modular environment for rapid simulation of different network architectures; the hardware consists of an embedded device running parts of the CMAPSS engine simulator and controlled through Simulink. The distributed engine control simulation, evaluation, and analysis technology provides unique

  8. Application of reiteration of Hankel singular value decomposition in quality control

    NASA Astrophysics Data System (ADS)

    Staniszewski, Michał; Skorupa, Agnieszka; Boguszewicz, Łukasz; Michalczuk, Agnieszka; Wereszczyński, Kamil; Wicher, Magdalena; Konopka, Marek; Sokół, Maria; Polański, Andrzej

    2017-07-01

    Medical centres are obliged to store past medical records, including the results of quality assurance (QA) tests of the medical equipment, which is especially useful in checking the reproducibility of medical devices and procedures. Analysis of multivariate time series is an important part of quality control of NMR data. In this work we propose an anomaly detection tool based on the Reiteration of Hankel Singular Value Decomposition method. The presented method was compared with external software, and the authors obtained comparable results.
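
    The record gives no implementation details of the Reiteration of Hankel SVD method; as a rough sketch of the general idea only (embed a QA time series in a Hankel matrix, keep the dominant singular components, and flag points with a large reconstruction error), one might write the following, which is not the authors' method.

```python
# Rough sketch of Hankel-SVD-based anomaly detection for a QA time series:
# build a Hankel (trajectory) matrix, keep the leading singular components,
# and flag samples whose low-rank reconstruction error is unusually large.
# Window size, rank and threshold are illustrative choices.
import numpy as np
from scipy.linalg import hankel

def hankel_svd_anomalies(series, window=8, rank=2, z_thresh=3.0):
    x = np.asarray(series, dtype=float)
    H = hankel(x[:window], x[window - 1:])            # window x (N-window+1)
    U, s, Vt = np.linalg.svd(H, full_matrices=False)
    H_low = (U[:, :rank] * s[:rank]) @ Vt[:rank]      # low-rank approximation
    # Average the anti-diagonals of H_low back into a smoothed series.
    n_rows, n_cols = H_low.shape
    smooth = np.array([np.fliplr(H_low).diagonal(k).mean()
                       for k in range(n_cols - 1, -n_rows, -1)])
    resid = x - smooth
    z = (resid - resid.mean()) / resid.std()
    return np.where(np.abs(z) > z_thresh)[0]          # indices of flagged points

rng = np.random.default_rng(0)
qa_series = np.sin(np.linspace(0, 6, 120)) + 0.05 * rng.standard_normal(120)
qa_series[60] += 1.0                                  # injected anomaly
print(hankel_svd_anomalies(qa_series))
```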

  9. Sensory and rapid instrumental methods as a combined tool for quality control of cooked ham.

    PubMed

    Barbieri, Sara; Soglia, Francesca; Palagano, Rosa; Tesini, Federica; Bendini, Alessandra; Petracci, Massimiliano; Cavani, Claudio; Gallina Toschi, Tullia

    2016-11-01

    In this preliminary investigation, different commercial categories of Italian cooked pork hams were characterized using an integrated approach based on both sensory and fast instrumental measurements. For these purposes, Italian products belonging to different categories (cooked ham, "selected" cooked ham and "high quality" cooked ham) were evaluated by sensory descriptive analysis and by the application of rapid tools such as image analysis by an "electronic eye" and a texture analyzer. The panel of trained assessors identified and evaluated 10 sensory descriptors able to define the quality of the products. Statistical analysis highlighted that sensory characteristics related to appearance and texture were the most significant in discriminating samples belonging to the highest (high quality cooked hams) and the lowest (cooked hams) quality categories, whereas the selected cooked hams showed intermediate characteristics. In particular, high quality samples were characterized, above all, by the highest pink intensity, typical appearance and cohesiveness, and, at the same time, by the lowest intensity of juiciness; standard cooked ham samples showed the lowest intensity of all visual attributes and the highest value of juiciness, whereas the intermediate category (selected cooked ham) was not discriminated from the others. The physical-rheological parameters measured by the electronic eye and texture analyzer were also effective in classifying samples. In particular, the PLS model built with data obtained from the electronic eye showed satisfactory performance in predicting the pink intensity and presence of fat attributes evaluated during the sensory visual phase. This study can be considered a first application of this combined approach, which could represent a suitable and fast method to verify whether a meat product purchased by the consumer matches its description in terms of compliance with the claimed quality.
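
    The PLS calibration mentioned above relates electronic-eye measurements to sensory attributes; the actual instrument variables and sensory scores are not available here, so the generic sketch below uses scikit-learn's PLSRegression with random placeholder data purely to show the shape of such a calibration.

```python
# Generic sketch of a PLS calibration of the type reported above: predict a
# sensory attribute (e.g. a pink-intensity score) from instrumental colour and
# texture variables. The data here are random placeholders, not study data.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(1)
X = rng.normal(size=(30, 12))                      # 30 samples x 12 instrument variables
y = 2 * X[:, 0] + rng.normal(scale=0.3, size=30)   # mock sensory score

pls = PLSRegression(n_components=3)
pls.fit(X, y)
pred = pls.predict(X).ravel()
ss_res = float(((y - pred) ** 2).sum())
ss_tot = float(((y - y.mean()) ** 2).sum())
print("calibration R^2:", round(1 - ss_res / ss_tot, 3))
```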

  10. [Methodological quality and reporting quality evaluation of randomized controlled trials published in China Journal of Chinese Materia Medica].

    PubMed

    Yu, Dan-Dan; Xie, Yan-Ming; Liao, Xing; Zhi, Ying-Jie; Jiang, Jun-Jie; Chen, Wei

    2018-02-01

    To evaluate the methodological quality and reporting quality of randomized controlled trials (RCTs) published in China Journal of Chinese Materia Medica, we searched CNKI and the China Journal of Chinese Materia Medica webpage to collect RCTs published since the establishment of the journal. The Cochrane risk of bias assessment tool was used to evaluate the methodological quality of the RCTs, and the CONSORT 2010 checklist was adopted as the reporting quality evaluation tool. Finally, 184 RCTs were included and evaluated methodologically, of which 97 RCTs were evaluated for reporting quality. For the methodological evaluation, 62 trials (33.70%) reported random sequence generation; 9 (4.89%) trials reported allocation concealment; 25 (13.59%) trials adopted a method of blinding; 30 (16.30%) trials reported the number of patients withdrawing, dropping out or lost to follow-up; 2 trials (1.09%) reported trial registration and none reported a trial protocol; only 8 (4.35%) trials reported the sample size estimation in detail. For the reporting quality appraisal, 3 of the 25 reporting items were evaluated as high quality, including: abstract, participant eligibility criteria, and statistical methods; 4 reporting items were of medium quality, including purpose, intervention, random sequence method, and data collection sites and locations; 9 items were of low quality, including title, background, random sequence type, allocation concealment, blinding, recruitment of subjects, baseline data, harms, and funding; the rest of the items were of extremely low quality (compliance rate of the reporting item <10%). On the whole, the methodological and reporting quality of RCTs published in the journal are generally low. Further improvement in both methodological and reporting quality for RCTs of traditional Chinese medicine is warranted. It is recommended that international standards and procedures for RCT design be strictly followed to conduct high-quality trials

  11. Console Log Keeping Made Easier - Tools and Techniques for Improving Quality of Flight Controller Activity Logs

    NASA Technical Reports Server (NTRS)

    Scott, David W.; Underwood, Debrah (Technical Monitor)

    2002-01-01

    At the Marshall Space Flight Center's (MSFC) Payload Operations Integration Center (POIC) for International Space Station (ISS), each flight controller maintains detailed logs of activities and communications at their console position. These logs are critical for accurately controlling flight in real-time as well as providing a historical record and troubleshooting tool. This paper describes logging methods and electronic formats used at the POIC and provides food for thought on their strengths and limitations, plus proposes some innovative extensions. It also describes an inexpensive PC-based scheme for capturing and/or transcribing audio clips from communications consoles. Flight control activity (e.g. interpreting computer displays, entering data/issuing electronic commands, and communicating with others) can become extremely intense. It's essential to document it well, but the effort to do so may conflict with actual activity. This can be more than just annoying, as what's in the logs (or just as importantly not in them) often feeds back directly into the quality of future operations, whether short-term or long-term. In earlier programs, such as Spacelab, log keeping was done on paper, often using position-specific shorthand, and the reader was at the mercy of the writer's penmanship. Today, user-friendly software solves the legibility problem and can automate date/time entry, but some content may take longer to finish due to individual typing speed and less use of symbols. File layout can be used to great advantage in making types of information easy to find, and creating searchable master logs for a given position is very easy and a real lifesaver in reconstructing events or researching a given topic. We'll examine log formats from several console positions, and the types of information that are included and (just as importantly) excluded. We'll also look at when a summary or synopsis is effective, and when extensive detail is needed.

  12. [A quality evaluation tableau for health institutions: an educational tool].

    PubMed

    Moll, Marie Christine; Decavel, Frédérique; Merlet, Christine

    2009-09-01

    For a few years, health institutions have had to comply with certification and with the need to establish the new governance. With accreditation version 2 (obtained in 2005), the elaboration of the hospital project (adopted in October 2006) and the organization into poles since 2006, quality-oriented management became a priority axis at the University Hospital of Angers. The strategic adaptation to quality requirements leads to the development of hospital management, especially at the level of the clinical, medico-technical and administrative poles. The elements of the hospital project, including the part concerning quality, risk and evaluation, are intended to be adapted by every pole according to the level of its project. This adaptation, which is required of each pole manager, calls for practical and educational support that makes it possible at the same time to establish a diagnosis of the progress of the quality approach, to measure the impact of its global uptake within the institution, and to compare poles. An eight-axis dashboard with criteria and a user guide were developed from ISO 9001 certification, the EFQM manual and the certification manual version 2 of the French Haute Autorité de Santé. The criteria are transcribed in a ready-to-use Excel grid. Being able to evaluate one's own quality system demonstrates the maturity of the quality approach. The results of this evaluation confirmed those of the certification. The dashboard is a structuring management tool at the service of the multidisciplinary team. Two considerations emerge from these results: first, for hospital top management, the axes to be improved emerge as priorities for determining and targeting the next annual action plans; the results also support the self-evaluation for the certification version 2010 planned for January of that year. It is a pragmatic tool which allows self-evaluation and comparison in order to assess pole performance. It is a strategic

  13. [On-site quality control of acupuncture randomized controlled trial: design of content and checklist of quality control based on PICOST].

    PubMed

    Li, Hong-Jiao; He, Li-Yun; Liu, Zhi-Shun; Sun, Ya-Nan; Yan, Shi-Yan; Liu, Jia; Zhao, Ye; Liu, Bao-Yan

    2014-02-01

    To effectively guarantee the quality of randomized controlled trials (RCTs) of acupuncture and to develop reasonable content and a checklist for on-site quality control, factors influencing the quality of acupuncture RCTs are analyzed, taking into overall consideration both the scientific soundness of the quality control content and the feasibility of on-site implementation. Based on the content and checklist of on-site quality control in the National 11th Five-Year Plan Project "Optimization of Comprehensive Treatment Plan for TCM in Prevention and Treatment of Serious Disease and Clinical Assessment on Generic Technology and Quality Control Research", it is proposed that on-site quality control of acupuncture RCTs should be conducted with PICOST (patient, intervention, comparison, outcome, site and time) as the core, with particular attention to quality control of the interveners' skills and the blinding of outcome assessment, and a checklist of on-site quality control is developed to provide a reference for the groups undertaking the project.

  14. Quality evaluation and control of end cap welds in PHWR fuel elements by ultrasonic examination

    NASA Astrophysics Data System (ADS)

    Choi, M. S.; Yang, M. S.

    1991-02-01

    The current quality control procedure for nuclear fuel end cap welds is mainly dependent on destructive metallographic examination. A nondestructive examination technique, i.e., ultrasonic examination, has been developed to identify and evaluate weld discontinuities. A few interesting results of weld quality evaluation obtained by applying the developed ultrasonic examination technique to PHWR fuel welds are presented. In addition, the feasibility of weld quality control by ultrasonic examination is discussed. This study shows that ultrasonic examination is an effective and reliable method for detecting abnormal weld contours and weld discontinuities such as micro-fissures, cracks, upset splits and expulsion, and can be used as a quality control tool for the end cap welding process.

  15. Importance of implementing an analytical quality control system in a core laboratory.

    PubMed

    Marques-Garcia, F; Garcia-Codesal, M F; Caro-Narros, M R; Contreras-SanFeliciano, T

    2015-01-01

    The aim of the clinical laboratory is to provide useful information for screening, diagnosis and monitoring of disease. The laboratory should ensure the quality of the extra-analytical and analytical processes, based on set criteria. To do this, it develops and implements a system of internal quality control designed to detect errors, and compares its data with those of other laboratories through external quality control. In this way it has a tool to check that its objectives are being met and, in case of errors, to take corrective actions and ensure the reliability of the results. This article sets out to describe the design and implementation of an internal quality control protocol, as well as its periodic assessment at six-month intervals to determine compliance with pre-determined specifications (Stockholm Consensus(1)). A total of 40 biochemical and 15 immunochemical methods were evaluated using three different control materials. Next, a standard operating procedure was planned to develop a system of internal quality control that included calculating the error of the analytical process, setting quality specifications, and verifying compliance. The quality control data were then statistically summarised as means, standard deviations, and coefficients of variation, as well as systematic, random, and total errors. The quality specifications were then fixed and the operational rules to apply in the analytical process were calculated. Finally, our data were compared with those of other laboratories through an external quality assurance program. The development of an analytical quality control system is a highly structured process. It should be designed to detect errors that compromise the stability of the analytical process. The laboratory should review its quality indicators, and its systematic, random and total error, at regular intervals in order to ensure that they meet pre-determined specifications and, if not, apply the appropriate corrective actions
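
    The protocol described computes means, standard deviations, coefficients of variation and systematic, random and total errors from control-material results. A minimal numeric sketch of those calculations follows; the total-error convention TE = |bias%| + 1.65 x CV% and the example data are assumptions for illustration, not values from the article.

```python
# Minimal sketch of the internal QC statistics mentioned above for one analyte,
# computed from repeated measurements of a control material. The total-error
# convention TE = |bias%| + 1.65 * CV% is a common choice, assumed here.
import numpy as np

def qc_statistics(results, target):
    x = np.asarray(results, dtype=float)
    mean, sd = x.mean(), x.std(ddof=1)
    cv = 100 * sd / mean                       # random error (imprecision), %
    bias = 100 * (mean - target) / target      # systematic error, %
    total_error = abs(bias) + 1.65 * cv        # estimated total error, %
    return {"mean": mean, "sd": sd, "cv%": cv, "bias%": bias, "TE%": total_error}

control_runs = [5.02, 5.10, 4.95, 5.05, 4.98, 5.07]   # e.g. mmol/L, invented data
print(qc_statistics(control_runs, target=5.00))
```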

  16. QUAST: quality assessment tool for genome assemblies

    PubMed Central

    Gurevich, Alexey; Saveliev, Vladislav; Vyahhi, Nikolay; Tesler, Glenn

    2013-01-01

    Summary: Limitations of genome sequencing techniques have led to dozens of assembly algorithms, none of which is perfect. A number of methods for comparing assemblers have been developed, but none is yet a recognized benchmark. Further, most existing methods for comparing assemblies are only applicable to new assemblies of finished genomes; the problem of evaluating assemblies of previously unsequenced species has not been adequately considered. Here, we present QUAST—a quality assessment tool for evaluating and comparing genome assemblies. This tool improves on leading assembly comparison software with new ideas and quality metrics. QUAST can evaluate assemblies both with a reference genome, as well as without a reference. QUAST produces many reports, summary tables and plots to help scientists in their research and in their publications. In this study, we used QUAST to compare several genome assemblers on three datasets. QUAST tables and plots for all of them are available in the Supplementary Material, and interactive versions of these reports are on the QUAST website. Availability: http://bioinf.spbau.ru/quast Contact: gurevich@bioinf.spbau.ru Supplementary information: Supplementary data are available at Bioinformatics online. PMID:23422339

  17. Some Inspection Methods for Quality Control and In-service Inspection of GLARE

    NASA Astrophysics Data System (ADS)

    Sinke, J.

    2003-07-01

    Quality control of materials and structures is an important issue, also for GLARE. During the manufacturing stage the processes and materials should be monitored and checked frequently in order to obtain a qualified product. During the operation of the aircraft, frequent monitoring and inspections are performed to maintain the quality at a prescribed level. Therefore, in-service inspection methods are applied, and when necessary repair activities are conducted. For the quality control of the GLARE panels and components during manufacturing, the C-scan method proves to be an effective tool. For in-service inspection the Eddy Current Method is one of the suitable options. In this paper a brief overview is presented of both methods and their application on GLARE products.

  18. [Quality control in anesthesiology].

    PubMed

    Muñoz-Ramón, J M

    1995-03-01

    The process of quality control and auditing of anesthesiology allows us to evaluate care given by a service and solve problems that are detected. Quality control is a basic element of care giving and is only secondarily an area of academic research; it is therefore a meaningless effort if the information does not serve to improve departmental procedures. Quality assurance procedures assume certain infrastructural requirements and an initial period of implementation and adjustment. The main objectives of quality control are the reduction of morbidity and mortality due to anesthesia, assurance of the availability and proper management of resources and, finally, the well-being and safety of the patient.

  19. Quality Evaluation Tool for Computer-and Web-Delivered Instruction

    DTIC Science & Technology

    2005-06-01

    The objective of this effort was to develop an Instructional Quality Evaluation Tool to help...developed for each rating point on all scales. This report includes these anchored Likert scales, which can serve as a "stand-alone" Tool.

  20. The methodological quality assessment tools for preclinical and clinical studies, systematic review and meta-analysis, and clinical practice guideline: a systematic review.

    PubMed

    Zeng, Xiantao; Zhang, Yonggang; Kwong, Joey S W; Zhang, Chao; Li, Sheng; Sun, Feng; Niu, Yuming; Du, Liang

    2015-02-01

    To systematically review the methodological assessment tools for pre-clinical and clinical studies, systematic review and meta-analysis, and clinical practice guideline. We searched PubMed, the Cochrane Handbook for Systematic Reviews of Interventions, Joanna Briggs Institute (JBI) Reviewers Manual, Centre for Reviews and Dissemination, Critical Appraisal Skills Programme (CASP), Scottish Intercollegiate Guidelines Network (SIGN), and the National Institute for Clinical Excellence (NICE) up to May 20th, 2014. Two authors selected studies and extracted data; quantitative analysis was performed to summarize the characteristics of included tools. We included a total of 21 assessment tools for analysis. A number of tools were developed by academic organizations, and some were developed by only a small group of researchers. The JBI developed the highest number of methodological assessment tools, with CASP coming second. Tools for assessing the methodological quality of randomized controlled studies were most abundant. The Cochrane Collaboration's tool for assessing risk of bias is the best available tool for assessing RCTs. For cohort and case-control studies, we recommend the use of the Newcastle-Ottawa Scale. The Methodological Index for Non-Randomized Studies (MINORS) is an excellent tool for assessing non-randomized interventional studies, and the Agency for Healthcare Research and Quality (AHRQ) methodology checklist is applicable for cross-sectional studies. For diagnostic accuracy test studies, the Quality Assessment of Diagnostic Accuracy Studies-2 (QUADAS-2) tool is recommended; the SYstematic Review Centre for Laboratory animal Experimentation (SYRCLE) risk of bias tool is available for assessing animal studies; Assessment of Multiple Systematic Reviews (AMSTAR) is a measurement tool for systematic reviews/meta-analyses; an 18-item tool has been developed for appraising case series studies, and the Appraisal of Guidelines, Research and Evaluation (AGREE

  1. Commercial jet fuel quality control

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Strauss, K.H.

    1995-05-01

    The paper discusses the purpose of jet fuel quality control between the refinery and the aircraft. It describes fixed equipment, including various types of filters, and the usefulness and limitations of this equipment. Test equipment is reviewed as are various surveillance procedures. These include the Air Transport Association specification ATA 103, the FAA Advisory Circular 150/5230-4, the International Air Transport Association Guidance Material for Fuel Quality Control and Fuelling Service and the Guidelines for Quality Control at Jointly Operated Fuel Systems. Some past and current quality control problems are briefly mentioned.

  2. FQC Dashboard: integrates FastQC results into a web-based, interactive, and extensible FASTQ quality control tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Joseph; Pirrung, Meg; McCue, Lee Ann

    FQC is software that facilitates large-scale quality control of FASTQ files by carrying out a QC protocol, parsing results, and aggregating quality metrics within and across experiments into an interactive dashboard. The dashboard utilizes human-readable configuration files to manipulate the pages and tabs, and is extensible with CSV data.

  3. TU-FG-201-05: Varian MPC as a Statistical Process Control Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carver, A; Rowbottom, C

    Purpose: Quality assurance in radiotherapy requires the measurement of various machine parameters to ensure they remain within permitted values over time. In Truebeam release 2.0 the Machine Performance Check (MPC) was released, allowing beam output and machine axis movements to be assessed in a single test. We aim to evaluate the Varian Machine Performance Check (MPC) as a tool for Statistical Process Control (SPC). Methods: Varian’s MPC tool was used on three Truebeam and one EDGE linac for a period of approximately one year. MPC was commissioned against independent systems. After this period the data were reviewed to determine whether or not the MPC was useful as a process control tool. Individual tests were analysed using Shewhart control plots, using Matlab for analysis. Principal component analysis was used to determine if a multivariate model was of any benefit in analysing the data. Results: Control charts were found to be useful to detect beam output changes, worn T-nuts and jaw calibration issues. Upper and lower control limits were defined at the 95% level. Multivariate SPC was performed using Principal Component Analysis. We found little evidence of clustering beyond that which might be naively expected, such as beam uniformity and beam output. Whilst this makes multivariate analysis of little use, it suggests that each test is giving independent information. Conclusion: The variety of independent parameters tested in MPC makes it a sensitive tool for routine machine QA. We have determined that using control charts in our QA programme would rapidly detect changes in machine performance. The use of control charts allows large quantities of tests to be performed on all linacs without visual inspection of all results. The use of control limits alerts users when data are inconsistent with previous measurements, before they become out of specification. A. Carver has received a speaker’s honorarium from Varian.
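
    As a hedged illustration of the Shewhart-chart approach described (control limits derived from baseline MPC measurements, with later daily results checked against them), the short sketch below sets limits at +/-1.96 SD to mirror the 95% level quoted; the baseline values and function names are invented.

```python
# Sketch of a Shewhart individuals chart of the kind described above: control
# limits derived from a baseline period of daily MPC output measurements, then
# used to flag later results. The +/-1.96 SD limits mirror the quoted 95% level.
import numpy as np

def shewhart_limits(baseline, z=1.96):
    b = np.asarray(baseline, dtype=float)
    centre, sd = b.mean(), b.std(ddof=1)
    return centre - z * sd, centre + z * sd

def out_of_control(values, lcl, ucl):
    v = np.asarray(values, dtype=float)
    return np.where((v < lcl) | (v > ucl))[0]

baseline_output = [100.1, 99.8, 100.0, 100.3, 99.9, 100.2, 100.0, 99.7]  # invented %
lcl, ucl = shewhart_limits(baseline_output)
new_output = [100.1, 100.2, 101.4, 99.9]
print("control limits:", round(lcl, 2), round(ucl, 2))
print("flagged indices:", out_of_control(new_output, lcl, ucl))
```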

  4. Application of ESE Data and Tools to Air Quality Management: Services for Helping the Air Quality Community use ESE Data (SHAirED)

    NASA Technical Reports Server (NTRS)

    Falke, Stefan; Husar, Rudolf

    2011-01-01

    The goal of this REASoN applications and technology project is to deliver and use Earth Science Enterprise (ESE) data and tools in support of air quality management. Its scope falls within the domain of air quality management and aims to develop a federated air quality information sharing network that includes data from NASA, EPA, US States and others. Project goals were achieved through access to satellite and ground observation data, web services information technology, interoperability standards, and air quality community collaboration. In contributing to a network of NASA ESE data in support of particulate air quality management, the project developed access to distributed data, built Web infrastructure, and created tools for data processing and analysis. The key technologies used in the project include emerging web services for developing self-describing and modular data access and processing tools, and a service oriented architecture for chaining web services together to assemble customized air quality management applications. The technology and tools required for this project were developed within DataFed.net, a shared infrastructure that supports collaborative atmospheric data sharing and processing web services. Much of the collaboration was facilitated through community interactions within the Federation of Earth Science Information Partners (ESIP) Air Quality Workgroup. The main activities during the project that successfully advanced DataFed, enabled air quality applications and established community-oriented infrastructures were to: develop access to distributed data (surface and satellite); build Web infrastructure to support data access, processing and analysis; create tools for data processing and analysis; and foster air quality community collaboration and interoperability.

  5. Quality improvement in neonatal digital radiography: implementing the basic quality improvement tools.

    PubMed

    Eslamy, Hedieh K; Newman, Beverley; Weinberger, Ed

    2014-12-01

    A quality improvement (QI) program may be implemented using the plan-do-study-act cycle (as a model for making improvements) and the basic QI tools (used to visually display and analyze variation in data). Managing radiation dose has come to the forefront as a safety goal for radiology departments. This is especially true in the pediatric population, which is more radiosensitive than the adult population. In this article, we use neonatal digital radiography to discuss developing a QI program with the principle goals of decreasing the radiation dose, decreasing variation in radiation dose, and optimizing image quality. Copyright © 2014 Elsevier Inc. All rights reserved.

  6. Spatially-controlled illumination with rescan confocal microscopy enhances image quality, resolution and reduces photodamage

    NASA Astrophysics Data System (ADS)

    Krishnaswami, Venkataraman; De Luca, Giulia M. R.; Breedijk, Ronald M. P.; Van Noorden, Cornelis J. F.; Manders, Erik M. M.; Hoebe, Ron A.

    2017-02-01

    Fluorescence microscopy is an important tool in biomedical imaging. An inherent trade-off lies between image quality and photodamage. Recently, we have introduced rescan confocal microscopy (RCM) that improves the lateral resolution of a confocal microscope down to 170 nm. Previously, we have demonstrated that with controlled-light exposure microscopy, spatial control of illumination reduces photodamage without compromising image quality. Here, we show that the combination of these two techniques leads to high resolution imaging with reduced photodamage without compromising image quality. Implementation of spatially-controlled illumination was carried out in RCM using a line scanning-based approach. Illumination is spatially-controlled for every line during imaging with the help of a prediction algorithm that estimates the spatial profile of the fluorescent specimen. The estimation is based on the information available from previously acquired line images. As a proof-of-principle, we show images of N1E-115 neuroblastoma cells, obtained by this new setup with reduced illumination dose, improved resolution and without compromising image quality.

  7. Training, Quality Assurance Factors, and Tools Investigation: a Work Report and Suggestions on Software Quality Assurance

    NASA Technical Reports Server (NTRS)

    Lee, Pen-Nan

    1991-01-01

    Previously, several research tasks have been conducted, some observations were obtained, and several possible suggestions have been contemplated involving software quality assurance engineering at NASA Johnson. These research tasks are briefly described. Also, a brief discussion is given on the role of software quality assurance in software engineering along with some observations and suggestions. A brief discussion on a training program for software quality assurance engineers is provided. A list of assurance factors as well as quality factors are also included. Finally, a process model which can be used for searching and collecting software quality assurance tools is presented.

  8. Quality control and quality assurance plan for bridge channel-stability assessments in Massachusetts

    USGS Publications Warehouse

    Parker, Gene W.; Pinson, Harlow

    1993-01-01

    A quality control and quality assurance plan has been implemented as part of the Massachusetts bridge scour and channel-stability assessment program. This program is being conducted by the U.S. Geological Survey, Massachusetts-Rhode Island District, in cooperation with the Massachusetts Highway Department. Project personnel training, data-integrity verification, and new data-management technologies are being utilized in the channel-stability assessment process to improve current data-collection and management techniques. An automated data-collection procedure has been implemented to standardize channel-stability assessments on a regular basis within the State. An object-oriented data structure and new image management tools are used to produce a data base enabling management of multiple data object classes. Data will be reviewed by assessors and data base managers before being merged into a master bridge-scour data base, which includes automated data-verification routines.

  9. Multisite Evaluation of a Data Quality Tool for Patient-Level Clinical Data Sets

    PubMed Central

    Huser, Vojtech; DeFalco, Frank J.; Schuemie, Martijn; Ryan, Patrick B.; Shang, Ning; Velez, Mark; Park, Rae Woong; Boyce, Richard D.; Duke, Jon; Khare, Ritu; Utidjian, Levon; Bailey, Charles

    2016-01-01

    Introduction: Data quality and fitness for analysis are crucial if outputs of analyses of electronic health record data or administrative claims data are to be trusted by the public and the research community. Methods: We describe a data quality analysis tool (called Achilles Heel) developed by the Observational Health Data Sciences and Informatics Collaborative (OHDSI) and compare outputs from this tool as it was applied to 24 large healthcare datasets across seven different organizations. Results: We highlight 12 data quality rules that identified issues in at least 10 of the 24 datasets and provide a full set of 71 rules identified in at least one dataset. Achilles Heel is freely available software that provides a useful starter set of data quality rules with the ability to add additional rules. We also present results of a structured email-based interview of all participating sites that collected qualitative comments about the value of Achilles Heel for data quality evaluation. Discussion: Our analysis represents the first comparison of outputs from a data quality tool that implements a fixed (but extensible) set of data quality rules. Thanks to a common data model, we were able to quickly compare multiple datasets originating from several countries in America, Europe and Asia. PMID:28154833
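
    The 71 rules themselves belong to the Achilles Heel package and are not reproduced in this record; purely to illustrate what a patient-level data quality rule can look like in practice, the small pandas sketch below applies two generic checks (birth years in the future and visits dated before birth) to invented rows. The rules are not taken from the OHDSI rule set.

```python
# Illustration of simple patient-level data quality rules of the general kind
# that tools such as Achilles Heel automate; the two rules and the rows below
# are invented for the example, not taken from the OHDSI rule set.
import pandas as pd

persons = pd.DataFrame({
    "person_id": [1, 2, 3],
    "year_of_birth": [1980, 2030, 1955],
})
visits = pd.DataFrame({
    "person_id": [1, 3, 3],
    "visit_year": [2015, 1940, 2012],
})

issues = []
if (persons["year_of_birth"] > pd.Timestamp.now().year).any():
    issues.append("persons with year_of_birth in the future")
merged = visits.merge(persons, on="person_id")
if (merged["visit_year"] < merged["year_of_birth"]).any():
    issues.append("visits dated before the person's birth year")

print(issues)
```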

  10. Design and Testing of a Tool for Evaluating the Quality of Diabetes Consumer-Information Web Sites

    PubMed Central

    Steinwachs, Donald; Rubin, Haya R

    2003-01-01

    Background Most existing tools for measuring the quality of Internet health information focus almost exclusively on structural criteria or other proxies for quality information rather than evaluating actual accuracy and comprehensiveness. Objective This research sought to develop a new performance-measurement tool for evaluating the quality of Internet health information, test the validity and reliability of the tool, and assess the variability in diabetes Web site quality. Methods An objective, systematic tool was developed to evaluate Internet diabetes information based on a quality-of-care measurement framework. The principal investigator developed an abstraction tool and trained an external reviewer on its use. The tool included 7 structural measures and 34 performance measures created by using evidence-based practice guidelines and experts' judgments of accuracy and comprehensiveness. Results Substantial variation existed in all categories, with overall scores following a normal distribution and ranging from 15% to 95% (mean was 50% and median was 51%). Lin's concordance correlation coefficient to assess agreement between raters produced a rho of 0.761 (Pearson's r of 0.769), suggesting moderate to high agreement. The average agreement between raters for the performance measures was 0.80. Conclusions Diabetes Web site quality varies widely. Alpha testing of this new tool suggests that it could become a reliable and valid method for evaluating the quality of Internet health sites. Such an instrument could help lay people distinguish between beneficial and misleading information. PMID:14713658
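
    The inter-rater agreement statistic reported above, Lin's concordance correlation coefficient, has a simple closed form; for readers who want to reproduce such a calculation, a small sketch with invented rater scores follows.

```python
# Lin's concordance correlation coefficient, the agreement statistic reported
# above, computed from two raters' overall scores. The scores here are invented.
import numpy as np

def lins_ccc(x, y):
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    sxy = np.mean((x - x.mean()) * (y - y.mean()))     # population covariance
    return 2 * sxy / (x.var() + y.var() + (x.mean() - y.mean()) ** 2)

rater_a = [15, 32, 48, 51, 60, 72, 88, 95]   # percent scores, invented
rater_b = [18, 30, 45, 55, 58, 70, 90, 91]
print(round(lins_ccc(rater_a, rater_b), 3))
```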

  11. SU-E-T-103: Development and Implementation of Web Based Quality Control Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Studinski, R; Taylor, R; Angers, C

    Purpose: Historically many radiation medicine programs have maintained their Quality Control (QC) test results in paper records or Microsoft Excel worksheets. Both these approaches represent significant logistical challenges, and are not predisposed to data review and approval. It has been our group's aim to develop and implement web based software designed not just to record and store QC data in a centralized database, but to provide scheduling and data review tools to help manage a radiation therapy clinic's equipment quality control program. Methods: The software was written in the Python programming language using the Django web framework. In order to promote collaboration and validation from other centres the code was made open source and is freely available to the public via an online source code repository. The code was written to provide a common user interface for data entry, formalize the review and approval process, and offer automated data trending and process control analysis of test results. Results: As of February 2014, our installation of QAtrack+ has 180 tests defined in its database and has collected ∼22 000 test results, all of which have been reviewed and approved by a physicist via QATrack+'s review tools. These results include records for quality control of Elekta accelerators, CT simulators, our brachytherapy programme, TomoTherapy and Cyberknife units. Currently at least 5 other centres are known to be running QAtrack+ clinically, forming the start of an international user community. Conclusion: QAtrack+ has proven to be an effective tool for collecting radiation therapy QC data, allowing for rapid review and trending of data for a wide variety of treatment units. As free and open source software, all source code, documentation and a bug tracker are available to the public at https://bitbucket.org/tohccmedphys/qatrackplus/.

  12. Made-in-USA Quality Circles Become People-Building Tool.

    ERIC Educational Resources Information Center

    Cohen, Larry

    1983-01-01

    Discusses the use of quality circles as a human resources development tool in Middlesex Community College's Career-Oriented Peer Services tutoring center. Delineates rules for circle participants and follows the activities of two circles comprised of business-oriented and engineering-oriented students. (DMM)

  13. Using quality assessment tools to critically appraise ageing research: a guide for clinicians.

    PubMed

    Harrison, Jennifer Kirsty; Reid, James; Quinn, Terry J; Shenkin, Susan Deborah

    2017-05-01

    Evidence based medicine tells us that we should not accept published research at face value. Even research from established teams published in the highest impact journals can have methodological flaws, biases and limited generalisability. The critical appraisal of research studies can seem daunting, but tools are available to make the process easier for the non-specialist. Understanding the language and process of quality assessment is essential when considering or conducting research, and is also valuable for all clinicians who use published research to inform their clinical practice. We present a review written specifically for the practising geriatrician. This considers how quality is defined in relation to the methodological conduct and reporting of research. Having established why quality assessment is important, we present and critique tools which are available to standardise quality assessment. We consider five study designs: RCTs, non-randomised studies, observational studies, systematic reviews and diagnostic test accuracy studies. Quality assessment for each of these study designs is illustrated with an example of published cognitive research. The practical applications of the tools are highlighted, with guidance on their strengths and limitations. We signpost educational resources and offer specific advice for use of these tools. We hope that all geriatricians become comfortable with critical appraisal of published research and that use of the tools described in this review - along with awareness of their strengths and limitations - become a part of teaching, journal clubs and practice. © The Author 2016. Published by Oxford University Press on behalf of the British Geriatrics Society.

  14. 40 CFR 75.21 - Quality assurance and quality control requirements.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    Title 40, Protection of Environment; Continuous Emission Monitoring, Operation and Maintenance Requirements, § 75.21 Quality assurance and quality control requirements. (a) Continuous emission monitoring systems. The owner or...

  15. Tool time: Gender and students' use of tools, control, and authority

    NASA Astrophysics Data System (ADS)

    Jones, M. Gail; Brader-Araje, Laura; Carboni, Lisa Wilson; Carter, Glenda; Rua, Melissa J.; Banilower, Eric; Hatch, Holly

    2000-10-01

    In this study, we examined how students used science equipment and tools in constructing knowledge during science instruction. Within a geographical metaphor, we focused on how students use tools when constructing new knowledge, how control of tools is actualized from pedagogical perspectives, how language and tool accessibility intersect, how gender intersects with tool use, and how competition for resources impacts access to tools. Sixteen targeted students from five elementary science classes were observed for 3 days of instruction. Results showed gender differences in students' use of exclusive language and commands, as well as in the ways students played and tinkered with tools. Girls tended to carefully follow the teacher's directions during the laboratory and did little playing or tinkering with science tools. Male students tended to use tools in inventive and exploratory ways. Results also showed that whether or not a student had access to his or her own materials became indicative of the type of verbal interactions that took place during the science investigation. Gender-related patterns in how tools are shared, how dyads relate to the materials and each other, and how materials are used to build knowledge are described.

  16. Active controls: A look at analytical methods and associated tools

    NASA Technical Reports Server (NTRS)

    Newsom, J. R.; Adams, W. M., Jr.; Mukhopadhyay, V.; Tiffany, S. H.; Abel, I.

    1984-01-01

    A review of analytical methods and associated tools for active controls analysis and design problems is presented. Approaches employed to develop mathematical models suitable for control system analysis and/or design are discussed. Significant efforts have been expended to develop tools to generate the models from the standpoint of control system designers' needs and develop the tools necessary to analyze and design active control systems. Representative examples of these tools are discussed. Examples where results from the methods and tools have been compared with experimental data are also presented. Finally, a perspective on future trends in analysis and design methods is presented.

  17. A Quality Assessment Tool for Non-Specialist Users of Regression Analysis

    ERIC Educational Resources Information Center

    Argyrous, George

    2015-01-01

    This paper illustrates the use of a quality assessment tool for regression analysis. It is designed for non-specialist "consumers" of evidence, such as policy makers. The tool provides a series of questions such consumers of evidence can ask to interrogate regression analysis, and is illustrated with reference to a recent study published…

  18. 40 CFR 51.359 - Quality control.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    Title 40, Protection of Environment, § 51.359 Quality control. Quality control measures shall insure that emission testing equipment is calibrated and maintained properly, and that inspection, calibration records, and control charts are...

  19. 40 CFR 51.359 - Quality control.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    Title 40, Protection of Environment, § 51.359 Quality control. Quality control measures shall insure that emission testing equipment is calibrated and maintained properly, and that inspection, calibration records, and control charts are...

  20. 40 CFR 51.359 - Quality control.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    Title 40, Protection of Environment, § 51.359 Quality control. Quality control measures shall insure that emission testing equipment is calibrated and maintained properly, and that inspection, calibration records, and control charts are...

  1. 40 CFR 51.359 - Quality control.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    Title 40, Protection of Environment, § 51.359 Quality control. Quality control measures shall insure that emission testing equipment is calibrated and maintained properly, and that inspection, calibration records, and control charts are...

  2. HEPS Inventory Tool: An Inventory Tool Including Quality Assessment of School Interventions on Healthy Eating and Physical Activity

    ERIC Educational Resources Information Center

    Dadaczynski, Kevin; Paulus, Peter; de Vries, Nanne; de Ruiter, Silvia; Buijs, Goof

    2010-01-01

    The HEPS Inventory Tool aims to support stakeholders working in school health promotion to promote high quality interventions on healthy eating and physical activity. As a tool it provides a step-by-step approach on how to develop a national or regional inventory of existing school based interventions on healthy eating and physical activity. It…

  3. Quality control in gastrointestinal surgery.

    PubMed

    Ramírez-Barba, Ector Jaime; Arenas-Moya, Diego; Vázquez-Guerrero, Arturo

    2011-01-01

    We analyzed the Mexican legal framework, identifying the vectors that characterize quality and control in gastrointestinal surgery. Quality is contemplated in the health protection rights determined according to the Mexican Constitution, established in the general health law and included as a specific goal in the actual National Development Plan and Health Sector Plan. Quality control implies planning, verification and application of corrective measures. Mexico has implemented several quality strategies such as certification of hospitals and regulatory agreements by the General Salubrity Council, creation of the National Health Quality Committee, generation of Clinical Practice Guidelines and the Certification of Medical Specialties, among others. Quality control in gastrointestinal surgery must begin at the time of medical education and continue during professional activities of surgeons, encouraging multidisciplinary teamwork, knowledge, abilities, attitudes, values and skills that promote homogeneous, safe and quality health services for the Mexican population.

  4. Evolution of Quality Assurance for Clinical Immunohistochemistry in the Era of Precision Medicine: Part 4: Tissue Tools for Quality Assurance in Immunohistochemistry.

    PubMed

    Cheung, Carol C; D'Arrigo, Corrado; Dietel, Manfred; Francis, Glenn D; Fulton, Regan; Gilks, C Blake; Hall, Jacqueline A; Hornick, Jason L; Ibrahim, Merdol; Marchetti, Antonio; Miller, Keith; van Krieken, J Han; Nielsen, Soren; Swanson, Paul E; Taylor, Clive R; Vyberg, Mogens; Zhou, Xiaoge; Torlakovic, Emina E

    2017-04-01

    The numbers of diagnostic, prognostic, and predictive immunohistochemistry (IHC) tests are increasing; the implementation and validation of new IHC tests, revalidation of existing tests, as well as the on-going need for daily quality assurance monitoring present significant challenges to clinical laboratories. There is a need for proper quality tools, specifically tissue tools that will enable laboratories to successfully carry out these processes. This paper clarifies, through the lens of laboratory tissue tools, how validation, verification, and revalidation of IHC tests can be performed in order to develop and maintain high quality "fit-for-purpose" IHC testing in the era of precision medicine. This is the final part of the 4-part series "Evolution of Quality Assurance for Clinical Immunohistochemistry in the Era of Precision Medicine."

  5. General aviation fuel quality control

    NASA Technical Reports Server (NTRS)

    Poitz, H.

    1983-01-01

    Quality control measures for aviation gasoline, and some of the differences between quality control on avgas and mogas are discussed. One thing to keep in mind is that with motor gasoline you can always pull off to the side of the road. It's not so easy to do in an airplane. Consequently, there are reasons for having the tight specifications and the tight quality control measures on avgas as compared to motor gasoline.

  6. Quality of clinical brain tumor MR spectra judged by humans and machine learning tools.

    PubMed

    Kyathanahally, Sreenath P; Mocioiu, Victor; Pedrosa de Barros, Nuno; Slotboom, Johannes; Wright, Alan J; Julià-Sapé, Margarida; Arús, Carles; Kreis, Roland

    2018-05-01

    To investigate and compare human judgment and machine learning tools for quality assessment of clinical MR spectra of brain tumors. A very large set of 2574 single voxel spectra with short and long echo time from the eTUMOUR and INTERPRET databases was used for this analysis. Original human quality ratings from these studies, as well as new human guidelines, were used to train different machine learning algorithms for automatic quality control (AQC) based on various feature extraction methods and classification tools. The performance was compared with the variance in human judgment. AQC built using the RUSBoost classifier, which combats imbalanced training data, performed best. When furnished with a large range of spectral and derived features, of which the most crucial ones had been selected by the TreeBagger algorithm, it showed better specificity (98%) in judging spectra from an independent test set than previously published methods. Optimal performance was reached with a virtual three-class ranking system. Our results suggest that the feature space should be relatively large for the case of MR tumor spectra and that three-class labels may be beneficial for AQC. The best AQC algorithm showed a performance in rejecting spectra that was comparable to that of a panel of human expert spectroscopists. Magn Reson Med 79:2500-2510, 2018. © 2017 International Society for Magnetic Resonance in Medicine.
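
    The classifier named above, RUSBoost, combines boosting with random undersampling of the majority class to cope with imbalanced "accept"/"reject" labels. The sketch below only approximates that idea using scikit-learn's AdaBoost on a manually undersampled training set and invented toy features; it is not the authors' pipeline or a true RUSBoost implementation.

```python
# Approximation of the RUSBoost idea referred to above: random undersampling of
# the majority ("good spectrum") class combined with boosting. This uses
# AdaBoost on an undersampled set rather than true RUSBoost, with toy features.
import numpy as np
from sklearn.ensemble import AdaBoostClassifier

rng = np.random.default_rng(42)
X_good = rng.normal(0.0, 1.0, size=(900, 20))     # features of acceptable spectra
X_bad = rng.normal(1.5, 1.0, size=(100, 20))      # features of poor-quality spectra
X = np.vstack([X_good, X_bad])
y = np.array([0] * 900 + [1] * 100)               # 0 = accept, 1 = reject

# Random undersampling: keep all minority samples, subsample the majority class.
maj_idx = rng.choice(np.where(y == 0)[0], size=(y == 1).sum(), replace=False)
keep = np.concatenate([maj_idx, np.where(y == 1)[0]])
clf = AdaBoostClassifier(n_estimators=100).fit(X[keep], y[keep])

print("accuracy on the full, imbalanced set:", clf.score(X, y))
```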

  7. Network-based production quality control

    NASA Astrophysics Data System (ADS)

    Kwon, Yongjin; Tseng, Bill; Chiou, Richard

    2007-09-01

    This study investigates the feasibility of remote quality control using a host of advanced automation equipment with Internet accessibility. The recent emphasis on product quality and waste reduction stems from a dynamic, globalized and customer-driven market, which brings both opportunities and threats to companies depending on their response speed and production strategies. Current trends in industry also include the widespread adoption of distributed manufacturing systems, in which design, production, and management facilities are geographically dispersed. This situation demands not only accessibility to remotely located production equipment for monitoring and control, but also efficient means of responding to a changing environment to counter process variations and diverse customer demands. To compete in such an environment, companies are striving to achieve 100%, sensor-based, automated inspection for zero-defect manufacturing. In this study, the Internet-based quality control scheme is referred to as "E-Quality for Manufacturing," or "EQM" for short. By definition, EQM refers to a holistic approach to designing and embedding efficient quality control functions in the context of network-integrated manufacturing systems. Such a system lets designers located far from the production facility monitor, control and adjust the quality inspection processes as the production design evolves.

  8. AfterQC: automatic filtering, trimming, error removing and quality control for fastq data.

    PubMed

    Chen, Shifu; Huang, Tanxiao; Zhou, Yanqing; Han, Yue; Xu, Mingyan; Gu, Jia

    2017-03-14

    Some applications, especially clinical applications requiring highly accurate sequencing data, must cope with the problems caused by unavoidable sequencing errors. Several tools have been proposed to profile sequencing quality, but few of them can quantify or correct sequencing errors. This unmet requirement motivated us to develop AfterQC, a tool that profiles sequencing errors and corrects most of them, plus highly automated quality control and data filtering features. Unlike most tools, AfterQC analyses the overlapping of paired sequences for pair-end sequencing data. Based on overlapping analysis, AfterQC can detect and cut adapters, and it further provides a novel function to correct wrong bases in the overlapping regions. Another new feature is the detection and visualisation of sequencing bubbles, which are commonly found on flowcell lanes and may cause sequencing errors. Besides normal per-cycle quality and base content plotting, AfterQC also provides features such as polyX (a long sub-sequence of the same base X) filtering, automatic trimming and k-mer-based strand bias profiling. For each single FastQ file or pair of FastQ files, AfterQC filters out bad reads, detects and eliminates the sequencer's bubble effects, trims reads at the front and tail, detects sequencing errors and corrects some of them, and finally outputs clean data and generates HTML reports with interactive figures. AfterQC can run in batch mode with multiprocess support; it can run with a single FastQ file, a single pair of FastQ files (for pair-end sequencing), or a folder, in which case all included FastQ files are processed automatically. Based on overlapping analysis, AfterQC can estimate the sequencing error rate and profile the error transform distribution. The results of our error profiling tests show that the error distribution is highly platform dependent. Much more than just another new quality control (QC) tool, AfterQC is able to perform quality control, data
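
    The overlap-based correction described above can be illustrated with a short, purely didactic Python sketch (not AfterQC's actual code): align the reverse complement of read 2 against read 1 and, where the overlapping bases disagree, keep the base with the higher Phred quality. The minimum overlap length and mismatch tolerance are arbitrary assumptions.

```python
# Didactic sketch (not AfterQC's code) of overlap-based correction for a
# read pair: align the reverse complement of read 2 against read 1 and,
# where overlapping bases disagree, keep the higher-quality base.
COMP = str.maketrans("ACGTN", "TGCAN")

def revcomp(seq):
    return seq.translate(COMP)[::-1]

def find_overlap(r1, r2rc, min_len=10, max_mismatch_frac=0.1):
    """Return the offset of r2rc within r1, or None if no overlap is found."""
    for off in range(len(r1) - min_len + 1):
        n = min(len(r1) - off, len(r2rc))
        mism = sum(a != b for a, b in zip(r1[off:off + n], r2rc[:n]))
        if mism <= n * max_mismatch_frac:
            return off
    return None

def correct_read1(r1, q1, r2, q2):
    r2rc, q2r = revcomp(r2), q2[::-1]
    off = find_overlap(r1, r2rc)
    if off is None:
        return r1
    bases = list(r1)
    for i in range(min(len(r1) - off, len(r2rc))):
        if bases[off + i] != r2rc[i] and ord(q2r[i]) > ord(q1[off + i]):
            bases[off + i] = r2rc[i]       # trust the higher Phred quality
    return "".join(bases)

# toy demonstration: read 2 covers the 3' end of read 1 exactly
r1, q1 = "ACGTACGTTTGCAGGA", "I" * 16
r2, q2 = revcomp("ACGTTTGCAGGA"), "I" * 12
print(correct_read1(r1, q1, r2, q2))
```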

  9. FQC Dashboard: integrates FastQC results into a web-based, interactive, and extensible FASTQ quality control tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Joseph; Pirrung, Meg; McCue, Lee Ann

    FQC is software that facilitates quality control of FASTQ files by carrying out a QC protocol using FastQC, parsing results, and aggregating quality metrics into an interactive dashboard designed to richly summarize individual sequencing runs. The dashboard groups samples in dropdowns for navigation among the data sets, utilizes human-readable configuration files to manipulate the pages and tabs, and is extensible with CSV data.

  10. FQC Dashboard: integrates FastQC results into a web-based, interactive, and extensible FASTQ quality control tool

    DOE PAGES

    Brown, Joseph; Pirrung, Meg; McCue, Lee Ann

    2017-06-09

    FQC is software that facilitates quality control of FASTQ files by carrying out a QC protocol using FastQC, parsing results, and aggregating quality metrics into an interactive dashboard designed to richly summarize individual sequencing runs. The dashboard groups samples in dropdowns for navigation among the data sets, utilizes human-readable configuration files to manipulate the pages and tabs, and is extensible with CSV data.
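
    A hedged sketch of the aggregation step such a dashboard performs: gather the per-module PASS/WARN/FAIL flags from each FastQC report's summary.txt (tab-separated status, module, filename) into a single CSV. The directory layout and output filename are assumptions for illustration; this is not the FQC code itself.

```python
# Collect FastQC per-module flags from each report's summary.txt into one CSV.
# "fastqc_reports/*/summary.txt" and "fastqc_summary.csv" are placeholder paths.
import csv, glob, os

rows, modules = {}, set()
for path in glob.glob("fastqc_reports/*/summary.txt"):
    sample = os.path.basename(os.path.dirname(path))
    rows[sample] = {}
    with open(path) as fh:
        for status, module, _ in csv.reader(fh, delimiter="\t"):
            rows[sample][module] = status
            modules.add(module)

modules = sorted(modules)
with open("fastqc_summary.csv", "w", newline="") as out:
    writer = csv.writer(out)
    writer.writerow(["sample"] + modules)
    for sample, flags in sorted(rows.items()):
        writer.writerow([sample] + [flags.get(m, "") for m in modules])
```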

  11. Intelligent control system based on ARM for lithography tool

    NASA Astrophysics Data System (ADS)

    Chen, Changlong; Tang, Xiaoping; Hu, Song; Wang, Nan

    2014-08-01

    The control system of a traditional lithography tool is based on a PC and an MCU: the PC handles complex algorithms and human-computer interaction and communicates with the MCU via a serial port, while the MCU controls motors, electromagnetic valves, etc. This architecture has shortcomings such as large volume, high power consumption, and wasted PC resources. In this paper, an embedded intelligent control system for a lithography tool, based on ARM, is presented. The control system uses the S5PV210 as its processor, taking over the functions of the PC in a traditional lithography tool, and provides good human-computer interaction through an LCD and a capacitive touch screen. Using Android 4.0.3 as the operating system, the equipment offers a clean and easy UI that makes control more user-friendly, and it implements remote control and debugging, pushing product video information over the network. As a result, it is convenient for the equipment vendor to provide technical support to users. Finally, compared with a traditional lithography tool, this design removes the PC, using hardware resources more efficiently and reducing cost and volume. Introducing an embedded OS and concepts from the Internet of Things into the design of lithography tools is a likely development trend.

  12. Quality controls for gamma cameras and PET cameras: development of a free open-source ImageJ program

    NASA Astrophysics Data System (ADS)

    Carlier, Thomas; Ferrer, Ludovic; Berruchon, Jean B.; Cuissard, Regis; Martineau, Adeline; Loonis, Pierre; Couturier, Olivier

    2005-04-01

    Data acquisition and processing for quality control of gamma cameras and Positron Emission Tomography (PET) cameras are commonly performed with dedicated program packages, which run only on the manufacturers' computers and differ from each other depending on the camera company and program version. The aim of this work was to develop a free open-source program (written in the Java language) to analyze quality control data for gamma cameras and PET cameras. The program is based on the free application software ImageJ and can easily be loaded on any computer operating system (OS), and thus on any type of computer in every nuclear medicine department. Based on standard quality control parameters, this program includes (1) for gamma cameras, a rotation center control (extracted from the American Association of Physicists in Medicine, AAPM, norms) and two uniformity controls (extracted from the Institute of Physics and Engineering in Medicine, IPEM, and National Electrical Manufacturers Association, NEMA, norms); and (2) for PET systems, three quality controls recently defined by the French Medical Physicist Society (SFPM), i.e. spatial resolution and uniformity in a reconstructed slice, and scatter fraction. The determination of spatial resolution (from the Point Spread Function, PSF, acquisition) allows computation of the Modulation Transfer Function (MTF) for both camera modalities. All the control functions are included in a toolbox provided as a free ImageJ plugin that will soon be downloadable from the Internet. The program also offers the possibility of saving the uniformity quality control results in HTML format, and a warning can be set to automatically inform users of abnormal results. The architecture of the program allows users to easily add any other specific quality control routine. Finally, this toolkit is an easy and robust tool to perform quality control on gamma cameras and PET cameras based on standard computation parameters, is free, and runs on
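
    One of the standard metrics mentioned above is compact enough to sketch directly; the following numpy snippet computes a NEMA-style integral uniformity, (max - min)/(max + min) x 100, over a simulated flood-field image. The smoothing and useful-field-of-view masking prescribed by the full norm are omitted, so this is an illustration rather than a compliant implementation.

```python
# Numpy sketch of a NEMA-style integral uniformity on a flood-field image:
# (max - min) / (max + min) * 100 over the (optionally masked) field of view.
import numpy as np

def integral_uniformity(flood, fov_mask=None):
    px = flood if fov_mask is None else flood[fov_mask]
    px = px[px > 0].astype(float)
    return (px.max() - px.min()) / (px.max() + px.min()) * 100.0

flood = np.random.poisson(lam=1000, size=(64, 64))  # simulated flood acquisition
print(f"Integral uniformity: {integral_uniformity(flood):.2f}%")
```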

  13. Evaluation of the Missoula-VITAS Quality of Life Index--revised: research tool or clinical tool?

    PubMed

    Schwartz, Carolyn E; Merriman, Melanie P; Reed, George; Byock, Ira

    2005-02-01

    Quality of life (QOL) is a central outcome measure in caring for seriously ill patients. The Missoula-VITAS Quality of Life Index (MVQOLI) is a 25-item patient-centered index that weights each of five QOL dimensions (symptoms, function, interpersonal, wellbeing, transcendence) by its importance to the respondent. The measure has been used to assess QOL for hospice patients, and has been found to be somewhat complex to use and analyze. This study aimed to simplify the measure, and to evaluate the reliability and validity of a revised version as either a research or clinical tool (i.e., "psychometric" versus "clinimetric"). Two data collection efforts are described. The psychometric study collected QOL data from 175 patients at baseline, 3-5 days, and 21 days later. The implementation study evaluated the feasibility and utility of the MVQOLI-R over six weeks of use. End-stage renal patients on dialysis, hospice patients, and long-term care patients participated in the psychometric study. The implementation study was done in hospice, home health, and palliative care settings. The measures were the MVQOLI-R and the Memorial Symptom Assessment Scale. The psychometric and implementation studies suggest that the MVQOLI-R performs well as a clinical tool but is not powerful as an outcome research instrument. The MVQOLI-R has the heterogeneous structure of clinimetric tools, and demonstrated both relevance and responsiveness. Additionally, in a clinical setting the MVQOLI-R was useful therapeutically for stimulating communication about the psychosocial and spiritual issues important to the tasks of life completion and life closure. The MVQOLI-R has clinical utility as a patient QOL assessment tool and may have therapeutic utility as a tool for fostering discussion among patients and their clinicians, as well as for helping patients identify sources of suffering and opportunities during this time in their lives.

  14. [Compatibility of different quality control systems].

    PubMed

    Invernizzi, Enrico

    2002-01-01

    Management of the good laboratory practice (GLP) quality system presupposes linking it to a basic recognized and approved quality system, from which it can draw management procedures common to all quality systems, such as the ISO 9000 set of norms. A quality system organized in this way can also be integrated with other dedicated quality systems, or parts of them, to obtain principles or management procedures for specific topics. The aim of this organization is to set up a reliable, recognized quality system compatible with the principles of GLP and other quality management systems, which provides users with a simplified set of easily accessible management tools and answers. The organization of this quality system is set out in the quality assurance programme, which is the document in which the test facility incorporates the GLP principles into its own quality organization.

  15. Development of AMSTAR: a measurement tool to assess the methodological quality of systematic reviews.

    PubMed

    Shea, Beverley J; Grimshaw, Jeremy M; Wells, George A; Boers, Maarten; Andersson, Neil; Hamel, Candyce; Porter, Ashley C; Tugwell, Peter; Moher, David; Bouter, Lex M

    2007-02-15

    Our objective was to develop an instrument to assess the methodological quality of systematic reviews, building upon previous tools, empirical evidence and expert consensus. A 37-item assessment tool was formed by combining 1) the enhanced Overview Quality Assessment Questionnaire (OQAQ), 2) a checklist created by Sacks, and 3) three additional items recently judged to be of methodological importance. This tool was applied to 99 paper-based and 52 electronic systematic reviews. Exploratory factor analysis was used to identify underlying components. The results were considered by methodological experts using a nominal group technique aimed at item reduction and design of an assessment tool with face and content validity. The factor analysis identified 11 components. From each component, one item was selected by the nominal group. The resulting instrument was judged to have face and content validity. A measurement tool for the 'assessment of multiple systematic reviews' (AMSTAR) was developed. The tool consists of 11 items and has good face and content validity for measuring the methodological quality of systematic reviews. Additional studies are needed with a focus on the reproducibility and construct validity of AMSTAR, before strong recommendations can be made on its use.

  16. 40 CFR 51.359 - Quality control.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 2 2014-07-01 2014-07-01 false Quality control. 51.359 Section 51.359 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS REQUIREMENTS FOR... to assure test accuracy. Computer control of quality assurance checks and quality control charts...

  17. Turning Continuous Quality Improvement into Institutional Practice: The Tools and Techniques.

    ERIC Educational Resources Information Center

    Cornesky, Robert A.

    This manual is intended to assist managers of support units at institutions of higher education in the implementation of Continuous Quality Improvement (CQI). The purpose is to describe a cooperative model for CQI which will permit managers to evaluate the quality of their units and institution, and by using the described tools and techniques, to…

  18. Lot quality assurance sampling to monitor supplemental immunization activity quality: an essential tool for improving performance in polio endemic countries.

    PubMed

    Brown, Alexandra E; Okayasu, Hiromasa; Nzioki, Michael M; Wadood, Mufti Z; Chabot-Couture, Guillaume; Quddus, Arshad; Walker, George; Sutter, Roland W

    2014-11-01

    Monitoring the quality of supplementary immunization activities (SIAs) is a key tool for polio eradication. Regular monitoring data, however, are often unreliable, showing high coverage levels in virtually all areas, including those with ongoing virus circulation. To address this challenge, lot quality assurance sampling (LQAS) was introduced in 2009 as an additional tool to monitor SIA quality. Now used in 8 countries, LQAS provides a number of programmatic benefits: identifying areas of weak coverage quality with statistical reliability, differentiating areas of varying coverage with greater precision, and allowing for trend analysis of campaign quality. LQAS also accommodates changes to survey format, interpretation thresholds, evaluations of sample size, and data collection through mobile phones to improve timeliness of reporting and allow for visualization of campaign quality. LQAS becomes increasingly important to address remaining gaps in SIA quality and help focus resources on high-risk areas to prevent the continued transmission of wild poliovirus. © Crown copyright 2014.
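
    The LQAS decision rule itself is simple enough to sketch: sample n individuals per lot, count the failures (e.g., unvaccinated children), and reject the lot if the count exceeds a decision value d. The sketch below, which assumes SciPy is available, prints the probability of accepting a lot as a function of true coverage; the n and d values are illustrative, not the programme's actual parameters.

```python
# Sketch of an LQAS decision rule and its operating characteristic.
from scipy.stats import binom

def prob_accept(coverage, n=60, d=3):
    """P(number unvaccinated in the sample <= d) given true coverage."""
    return binom.cdf(d, n, 1.0 - coverage)

for cov in (0.70, 0.80, 0.90, 0.95):
    print(f"true coverage {cov:.0%}: P(lot accepted) = {prob_accept(cov):.3f}")
```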

  19. RNA-SeQC: RNA-seq metrics for quality control and process optimization.

    PubMed

    DeLuca, David S; Levin, Joshua Z; Sivachenko, Andrey; Fennell, Timothy; Nazaire, Marc-Danie; Williams, Chris; Reich, Michael; Winckler, Wendy; Getz, Gad

    2012-06-01

    RNA-seq, the application of next-generation sequencing to RNA, provides transcriptome-wide characterization of cellular activity. Assessment of sequencing performance and library quality is critical to the interpretation of RNA-seq data, yet few tools exist to address this issue. We introduce RNA-SeQC, a program which provides key measures of data quality. These metrics include yield, alignment and duplication rates; GC bias, rRNA content, regions of alignment (exon, intron and intragenic), continuity of coverage, 3'/5' bias and count of detectable transcripts, among others. The software provides multi-sample evaluation of library construction protocols, input materials and other experimental parameters. The modularity of the software enables pipeline integration and the routine monitoring of key measures of data quality such as the number of alignable reads, duplication rates and rRNA contamination. RNA-SeQC allows investigators to make informed decisions about sample inclusion in downstream analysis. In summary, RNA-SeQC provides quality control measures critical to experiment design, process optimization and downstream computational analysis. See www.genepattern.org to run online, or www.broadinstitute.org/rna-seqc/ for a command line tool.
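
    A few of the alignment-level metrics listed above can be computed directly from a BAM file; the sketch below uses pysam as a stand-in illustration (it is not RNA-SeQC) and assumes an existing BAM file whose path is a placeholder.

```python
# Stand-in illustration (not RNA-SeQC) of basic alignment-level QC metrics.
import pysam

def basic_metrics(bam_path):
    total = mapped = dupes = 0
    with pysam.AlignmentFile(bam_path, "rb") as bam:
        for read in bam.fetch(until_eof=True):
            if read.is_secondary or read.is_supplementary:
                continue
            total += 1
            if not read.is_unmapped:
                mapped += 1
            if read.is_duplicate:
                dupes += 1
    return {
        "total_reads": total,
        "alignment_rate": mapped / total if total else 0.0,
        "duplication_rate": dupes / total if total else 0.0,
    }

print(basic_metrics("sample.bam"))  # "sample.bam" is a placeholder path
```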

  20. A dynamic quality assessment tool for laparoscopic hysterectomy to measure surgical outcomes.

    PubMed

    Driessen, Sara R C; Van Zwet, Erik W; Haazebroek, Pascal; Sandberg, Evelien M; Blikkendaal, Mathijs D; Twijnstra, Andries R H; Jansen, Frank Willem

    2016-12-01

    The current health care system has an urgent need for tools to measure quality. A wide range of quality indicators have been developed in an attempt to differentiate between high-quality and low-quality health care processes. However, one of the main issues of currently used indicators is the lack of case-mix correction and improvement possibilities. Case-mix is defined as specific (patient) characteristics that are known to potentially affect (surgical) outcome. If these characteristics are not taken into consideration, comparisons of outcome among health care providers may not be valid. The objective of the study was to develop and test a quality assessment tool for laparoscopic hysterectomy, which can serve as a new outcome quality indicator. This is a prospective, international, multicenter implementation study. A web-based application was developed with 3 main goals: (1) to measure the surgeon's performance using 3 primary outcomes (blood loss, operative time, and complications); (2) to provide immediate individual feedback using cumulative observed-minus-expected graphs; and (3) to detect consistently suboptimal performance after correcting for case-mix characteristics. All gynecologists who perform laparoscopic hysterectomies were requested to register their procedures in the application. A patient safety risk factor checklist was used by the surgeon for reflection. Thereafter a prospective implementation study was performed, and the application was tested using a survey that included the System Usability Scale. A total of 2066 laparoscopic hysterectomies were registered by 81 gynecologists. Mean operative time was 100 ± 39 minutes, blood loss 127 ± 163 mL, and the complication rate 6.1%. The overall survey response rate was 75%, and the mean System Usability Scale was 76.5 ± 13.6, which indicates that the application was good to excellent. The majority of surgeons reported that the application made them more aware of their performance, the outcomes, and

  1. 7 CFR 981.42 - Quality control.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 7 Agriculture 8 2012-01-01 2012-01-01 false Quality control. 981.42 Section 981.42 Agriculture Regulations of the Department of Agriculture (Continued) AGRICULTURAL MARKETING SERVICE (Marketing Agreements... Regulating Handling Quality Control § 981.42 Quality control. (a) Incoming. Except as provided in this...

  2. 7 CFR 981.42 - Quality control.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 8 2013-01-01 2013-01-01 false Quality control. 981.42 Section 981.42 Agriculture Regulations of the Department of Agriculture (Continued) AGRICULTURAL MARKETING SERVICE (MARKETING AGREEMENTS... Regulating Handling Quality Control § 981.42 Quality control. (a) Incoming. Except as provided in this...

  3. 7 CFR 981.42 - Quality control.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 8 2011-01-01 2011-01-01 false Quality control. 981.42 Section 981.42 Agriculture Regulations of the Department of Agriculture (Continued) AGRICULTURAL MARKETING SERVICE (Marketing Agreements... Regulating Handling Quality Control § 981.42 Quality control. (a) Incoming. Except as provided in this...

  4. 7 CFR 981.42 - Quality control.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 8 2010-01-01 2010-01-01 false Quality control. 981.42 Section 981.42 Agriculture Regulations of the Department of Agriculture (Continued) AGRICULTURAL MARKETING SERVICE (Marketing Agreements... Regulating Handling Quality Control § 981.42 Quality control. (a) Incoming. Except as provided in this...

  5. Computer implemented method, and apparatus for controlling a hand-held tool

    NASA Technical Reports Server (NTRS)

    Wagner, Kenneth William (Inventor); Taylor, James Clayton (Inventor)

    1999-01-01

    The invention described herein is a computer-implemented method and apparatus for controlling a hand-held tool. In particular, the control of the hand-held tool is for the purpose of controlling the speed of a fastener interface mechanism and the torque applied to fasteners by that mechanism, and of monitoring the operating parameters of the tool. The control is embodied in in-tool software embedded on a processor within the tool, which also communicates with remote software. An operator can run the tool or, through the interaction of both software components, operate the tool from a remote location, analyze data from a performance history recorded by the tool, and select various torque and speed parameters for each fastener.

  6. Improvement of Selected Logistics Processes Using Quality Engineering Tools

    NASA Astrophysics Data System (ADS)

    Zasadzień, Michał; Žarnovský, Jozef

    2018-03-01

    The increase in the number of orders, rising quality requirements and the need for speed in order preparation require the implementation of new solutions and the improvement of logistics processes. Any disruption that occurs during execution of an order often leads to customer dissatisfaction, as well as loss of the customer's confidence. The article presents a case study of the use of quality engineering methods and tools to improve an e-commerce logistics process. This made it possible to identify and prioritize key issues, identify their causes, and formulate improvement and prevention measures.

  7. 7 CFR 930.44 - Quality control.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 8 2011-01-01 2011-01-01 false Quality control. 930.44 Section 930.44 Agriculture Regulations of the Department of Agriculture (Continued) AGRICULTURAL MARKETING SERVICE (Marketing Agreements... Control § 930.44 Quality control. (a) Quality standards. The Board may establish, with the approval of the...

  8. 7 CFR 930.44 - Quality control.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 8 2010-01-01 2010-01-01 false Quality control. 930.44 Section 930.44 Agriculture Regulations of the Department of Agriculture (Continued) AGRICULTURAL MARKETING SERVICE (Marketing Agreements... Control § 930.44 Quality control. (a) Quality standards. The Board may establish, with the approval of the...

  9. 7 CFR 930.44 - Quality control.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 7 Agriculture 8 2012-01-01 2012-01-01 false Quality control. 930.44 Section 930.44 Agriculture Regulations of the Department of Agriculture (Continued) AGRICULTURAL MARKETING SERVICE (Marketing Agreements... Control § 930.44 Quality control. (a) Quality standards. The Board may establish, with the approval of the...

  10. 7 CFR 930.44 - Quality control.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 8 2013-01-01 2013-01-01 false Quality control. 930.44 Section 930.44 Agriculture Regulations of the Department of Agriculture (Continued) AGRICULTURAL MARKETING SERVICE (MARKETING AGREEMENTS... Control § 930.44 Quality control. (a) Quality standards. The Board may establish, with the approval of the...

  11. Controlling Pollutants and Sources: Indoor Air Quality Design Tools for Schools

    EPA Pesticide Factsheets

    To protect indoor environmental quality, the designer should understand indoor air quality problems and seek to eliminate potential sources of contamination originating both outdoors and indoors.

  12. Soft qualities in healthcare. Method and tools for soft qualities design in hospitals' built environments.

    PubMed

    Capolongo, S; Bellini, E; Nachiero, D; Rebecchi, A; Buffoli, M

    2014-01-01

    The design of hospital environments is determined by functional requirements and technical regulations, as well as numerous protocols, which define the structural and system characteristics that such environments need to achieve. In order to improve people's well-being and the quality of their experience within public hospitals, design elements (soft qualities) are added to those 'necessary' features. The aim of this research has been to experiment with a new design process and to create health care spaces with high environmental quality, capable of meeting users' emotional and perceptual needs. Such needs were investigated with the help of qualitative research tools, and the design criteria for one of these soft qualities, colour, were subsequently defined on the basis of the findings. The colour scheme design for the new San Paolo Hospital Emergency Department in Milan was used as a case study. Focus groups were fundamental in defining the project's goals and criteria. The issues raised suggest that the proper procedure is not the mere consultation of users in order to define the goals: users should rather be involved in the whole design process and become co-agents of the choices that determine the environment's characteristics, so as to meet the quality requirements identified by the users themselves. The case study has shown the possibility of developing a design methodology made up of three steps (or operational tools) in which user groups are involved in the choices, leading to environments in which compliance with expectations is already implied and verified by means of the process itself. Thus, the method leads to the creation of soft qualities in healthcare.

  13. New genetic tools to improve citrus fruit quality and drive consumer demand

    USDA-ARS?s Scientific Manuscript database

    Chemical and genomic dissection of important components underlying fruit quality has led toward the development of new tools to make the creation and selection of citrus cultivars improved in quality attributes more targeted and efficient. The use of SNP platforms and other technologies have resulte...

  14. Quality tools and resources to support organisational improvement integral to high-quality primary care: a systematic review of published and grey literature.

    PubMed

    Janamian, Tina; Upham, Susan J; Crossland, Lisa; Jackson, Claire L

    2016-04-18

    To conduct a systematic review of the literature to identify existing online primary care quality improvement tools and resources to support organisational improvement related to the seven elements in the Primary Care Practice Improvement Tool (PC-PIT), with the identified tools and resources to progress to a Delphi study for further assessment of relevance and utility. Systematic review of the international published and grey literature. CINAHL, Embase and PubMed databases were searched in March 2014 for articles published between January 2004 and December 2013. GreyNet International and other relevant websites and repositories were also searched in March-April 2014 for documents dated between 1992 and 2012. All citations were imported into a bibliographic database. Published and unpublished tools and resources were included in the review if they were in English, related to primary care quality improvement and addressed any of the seven PC-PIT elements of a high-performing practice. Tools and resources that met the eligibility criteria were then evaluated for their accessibility, relevance, utility and comprehensiveness using a four-criteria appraisal framework. We used a data extraction template to systematically extract information from eligible tools and resources. A content analysis approach was used to explore the tools and resources and collate relevant information: name of the tool or resource, year and country of development, author, name of the organisation that provided access and its URL, accessibility information or problems, overview of each tool or resource and the quality improvement element(s) it addresses. If available, a copy of the tool or resource was downloaded into the bibliographic database, along with supporting evidence (published or unpublished) on its use in primary care. This systematic review identified 53 tools and resources that can potentially be provided as part of a suite of tools and resources to support primary care practices in

  15. Coproducing Aboriginal patient journey mapping tools for improved quality and coordination of care.

    PubMed

    Kelly, Janet; Dwyer, Judith; Mackean, Tamara; O'Donnell, Kim; Willis, Eileen

    2016-12-08

    This paper describes the rationale and process for developing a set of Aboriginal patient journey mapping tools with Aboriginal patients, health professionals, support workers, educators and researchers in the Managing Two Worlds Together project between 2008 and 2015. Aboriginal patients and their families from rural and remote areas, and healthcare providers in urban, rural and remote settings, shared their perceptions of the barriers and enablers to quality care in interviews and focus groups, and individual patient journey case studies were documented. Data were thematically analysed. In the absence of suitable existing tools, a new analytical framework and mapping approach was developed. The utility of the tools in other settings was then tested with health professionals, and the tools were further modified for use in quality improvement in health and education settings in South Australia and the Northern Territory. A central set of patient journey mapping tools with flexible adaptations, a workbook, and five sets of case studies describing how staff adapted and used the tools at different sites are available for wider use.

  16. 14 CFR 21.139 - Quality control.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 14 Aeronautics and Space 1 2010-01-01 2010-01-01 false Quality control. 21.139 Section 21.139... PROCEDURES FOR PRODUCTS AND PARTS Production Certificates § 21.139 Quality control. The applicant must show that he has established and can maintain a quality control system for any product, for which he...

  17. 33 CFR 385.21 - Quality control.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 33 Navigation and Navigable Waters 3 2011-07-01 2011-07-01 false Quality control. 385.21 Section... Processes § 385.21 Quality control. (a) The Corps of Engineers and the non-Federal sponsor shall prepare a quality control plan, in accordance with applicable Corps of Engineers regulations, for each product that...

  18. 14 CFR 21.139 - Quality control.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 14 Aeronautics and Space 1 2011-01-01 2011-01-01 false Quality control. 21.139 Section 21.139... PROCEDURES FOR PRODUCTS AND PARTS Production Certificates § 21.139 Quality control. The applicant must show that he has established and can maintain a quality control system for any product, for which he...

  19. 33 CFR 385.21 - Quality control.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 33 Navigation and Navigable Waters 3 2013-07-01 2013-07-01 false Quality control. 385.21 Section... Processes § 385.21 Quality control. (a) The Corps of Engineers and the non-Federal sponsor shall prepare a quality control plan, in accordance with applicable Corps of Engineers regulations, for each product that...

  20. 33 CFR 385.21 - Quality control.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 33 Navigation and Navigable Waters 3 2010-07-01 2010-07-01 false Quality control. 385.21 Section... Processes § 385.21 Quality control. (a) The Corps of Engineers and the non-Federal sponsor shall prepare a quality control plan, in accordance with applicable Corps of Engineers regulations, for each product that...

  1. 40 CFR 136.7 - Quality assurance and quality control.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... quality control elements, where applicable, into the laboratory's documented standard operating procedure... quality control elements must be clearly documented in the written standard operating procedure for each... Methods contains QA/QC procedures in the Part 1000 section of the Standard Methods Compendium. The...

  2. 40 CFR 136.7 - Quality assurance and quality control.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... quality control elements, where applicable, into the laboratory's documented standard operating procedure... quality control elements must be clearly documented in the written standard operating procedure for each... Methods contains QA/QC procedures in the Part 1000 section of the Standard Methods Compendium. The...

  3. 40 CFR 136.7 - Quality assurance and quality control.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... quality control elements, where applicable, into the laboratory's documented standard operating procedure... quality control elements must be clearly documented in the written standard operating procedure for each... Methods contains QA/QC procedures in the Part 1000 section of the Standard Methods Compendium. The...

  4. A comprehensive quality control workflow for paired tumor-normal NGS experiments.

    PubMed

    Schroeder, Christopher M; Hilke, Franz J; Löffler, Markus W; Bitzer, Michael; Lenz, Florian; Sturm, Marc

    2017-06-01

    Quality control (QC) is an important part of all NGS data analysis stages. Many available tools calculate QC metrics from different analysis steps of single-sample experiments (raw reads, mapped reads and variant lists). Multi-sample experiments, such as the sequencing of tumor-normal pairs, require additional QC metrics to ensure the validity of results. These multi-sample QC metrics still lack standardization. We therefore suggest a new workflow for QC of DNA sequencing of tumor-normal pairs. With this workflow, well-known single-sample QC metrics and additional metrics specific to tumor-normal pairs can be calculated. The segmentation into different tools offers high flexibility and allows reuse for other purposes. All tools produce qcML, a generic XML format for QC of -omics experiments. qcML uses quality metrics defined in an ontology, which was adapted for NGS. All QC tools are implemented in C++ and run under both Linux and Windows. Plotting requires Python 2.7 and matplotlib. The software is available under the 'GNU General Public License version 2' as part of the ngs-bits project: https://github.com/imgag/ngs-bits. christopher.schroeder@med.uni-tuebingen.de. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
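
    As a rough illustration of emitting QC metrics in an XML container, the sketch below writes a simplified, qcML-like report for a tumor-normal pair. The element names, attributes and metric values are invented stand-ins, not the actual qcML schema or the ontology accessions used by ngs-bits.

```python
# Illustrative sketch of emitting paired tumor-normal QC metrics as XML.
import xml.etree.ElementTree as ET

metrics = {
    "tumor_sample": "T01",
    "normal_sample": "N01",
    "somatic_variant_count": 1342,      # placeholder value
    "tumor_normal_correlation": 0.94,   # placeholder sample-concordance check
}

root = ET.Element("qcReport", pair=f"{metrics['tumor_sample']}-{metrics['normal_sample']}")
for name, value in metrics.items():
    ET.SubElement(root, "qualityParameter", name=name, value=str(value))

ET.ElementTree(root).write("tumor_normal_qc.xml", xml_declaration=True, encoding="utf-8")
```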

  5. Interpolity exchange of basalt tools facilitated via elite control in Hawaiian archaic states

    PubMed Central

    Kirch, Patrick V.; Mills, Peter R.; Lundblad, Steven P.; Sinton, John; Kahn, Jennifer G.

    2012-01-01

    Ethnohistoric accounts of late precontact Hawaiian archaic states emphasize the independence of chiefly controlled territories (ahupua‘a) based on an agricultural, staple economy. However, elite control of unevenly distributed resources, such as high-quality volcanic rock for adze production, may have provided an alternative source of economic power. To test this hypothesis we used nondestructive energy-dispersive X-ray fluorescence (ED-XRF) analysis of 328 lithic artifacts from 36 archaeological features in the Kahikinui district, Maui Island, to geochemically characterize the source groups. This process was followed by a limited sampling using destructive wavelength-dispersive X-ray fluorescence (WD-XRF) analysis to more precisely characterize certain nonlocal source groups. Seventeen geochemical groups were defined, eight of which represent extra-Maui Island sources. Although the majority of stone tools were derived from Maui Island sources (71%), a significant quantity (27%) of tools derived from extraisland sources, including the large Mauna Kea quarry on Hawai‘i Island as well as quarries on O‘ahu, Moloka‘i, and Lāna‘i islands. Importantly, tools quarried from extralocal sources are found in the highest frequency in elite residential features and in ritual contexts. These results suggest a significant role for a wealth economy based on the control and distribution of nonagricultural goods and resources during the rise of the Hawaiian archaic states. PMID:22203984

  6. Application of miniaturized near-infrared spectroscopy for quality control of extemporaneous orodispersible films.

    PubMed

    Foo, Wen Chin; Widjaja, Effendi; Khong, Yuet Mei; Gokhale, Rajeev; Chan, Sui Yung

    2018-02-20

    Extemporaneous oral preparations are routinely compounded in the pharmacy due to a lack of suitable formulations for special populations. Such small-scale pharmacy preparations also present an avenue for individualized pharmacotherapy. Orodispersible films (ODF) have increasingly been evaluated as a suitable dosage form for extemporaneous oral preparations. Nevertheless, as with all other extemporaneous preparations, safety and quality remain a concern. Although the United States Pharmacopeia (USP) recommends analytical testing of compounded preparations for quality assurance, pharmaceutical assays are typically not performed routinely for such non-sterile pharmacy preparations, due to the complexity and high cost of conventional assay methods such as high performance liquid chromatography (HPLC). Spectroscopic methods including Raman, infrared and near-infrared spectroscopy have been successfully applied as quality control tools in industry. The state-of-the-art benchtop spectrometers used in those studies have the advantage of superior resolution and performance, but are not suitable for use in a small-scale pharmacy setting. In this study, we investigated the application of a miniaturized near-infrared (NIR) spectrometer as a quality control tool for identification and quantification of drug content in extemporaneous ODFs. Miniaturized NIR spectroscopy is suitable for small-scale pharmacy applications in view of its small size, portability, simple user interface, rapid measurement and real-time prediction results. Nevertheless, the challenge with miniaturized NIR spectroscopy is its lower resolution compared with state-of-the-art benchtop equipment. We have successfully developed NIR spectroscopy calibration models for identification of ODFs containing five different drugs, and for quantification of drug content in ODFs containing 2-10 mg ondansetron (OND). The qualitative model for drug identification produced 100% prediction accuracy. The quantitative
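
    The abstract does not state which calibration algorithm was used; partial least squares (PLS) regression is a common choice for NIR quantification, so the sketch below uses it purely as an illustration on synthetic spectra, with only the 2-10 mg drug-content range borrowed from the study.

```python
# Hedged sketch: PLS regression on synthetic "spectra" whose intensity
# scales with drug content; not the study's actual calibration model.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n_films, n_wavelengths = 60, 200
content = rng.uniform(2, 10, n_films)                          # mg per film
spectra = (content[:, None] * rng.normal(1.0, 0.02, n_wavelengths)
           + rng.normal(0, 0.1, (n_films, n_wavelengths)))     # toy NIR spectra

pls = PLSRegression(n_components=5)
r2 = cross_val_score(pls, spectra, content, cv=5, scoring="r2")
print(f"cross-validated R^2: {r2.mean():.3f}")
```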

  7. A web tool for STORET/WQX water quality data retrieval and Best Management Practice scenario suggestion.

    PubMed

    Park, Youn Shik; Engel, Bernie A; Kim, Jonggun; Theller, Larry; Chaubey, Indrajeet; Merwade, Venkatesh; Lim, Kyoung Jae

    2015-03-01

    Total Maximum Daily Load is a water quality standard to regulate water quality of streams, rivers and lakes. A wide range of approaches are used currently to develop TMDLs for impaired streams and rivers. Flow and load duration curves (FDC and LDC) have been used in many states to evaluate the relationship between flow and pollutant loading along with other models and approaches. A web-based LDC Tool was developed to facilitate development of FDC and LDC as well as to support other hydrologic analyses. In this study, the FDC and LDC tool was enhanced to allow collection of water quality data via the web and to assist in establishing cost-effective Best Management Practice (BMP) implementations. The enhanced web-based tool provides use of water quality data not only from the US Geological Survey but also from the Water Quality Portal for the U.S. via web access. Moreover, the web-based tool identifies required pollutant reductions to meet standard loads and suggests a BMP scenario based on ability of BMPs to reduce pollutant loads, BMP establishment and maintenance costs. In the study, flow and water quality data were collected via web access to develop LDC and to identify the required reduction. The suggested BMP scenario from the web-based tool was evaluated using the EPA Spreadsheet Tool for the Estimation of Pollutant Load model to attain the required pollutant reduction at least cost. Copyright © 2014 Elsevier Ltd. All rights reserved.
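
    The FDC/LDC construction the tool automates can be sketched in a few lines: rank the flows, convert rank to exceedance probability, and scale by a water quality criterion to obtain the allowable load. The flows below are synthetic, and the criterion value and unit-conversion factor are illustrative assumptions.

```python
# Sketch of flow and load duration curves (FDC/LDC) from daily flows.
import numpy as np

rng = np.random.default_rng(7)
flows_cfs = rng.lognormal(mean=3.0, sigma=1.0, size=3650)   # ~10 years of synthetic daily flow
criterion_mg_l = 0.3                                        # e.g. a nutrient standard
CF = 5.39                                                   # cfs * mg/L -> lb/day (approx.)

flows_sorted = np.sort(flows_cfs)[::-1]
exceedance = np.arange(1, len(flows_sorted) + 1) / (len(flows_sorted) + 1)
allowable_load = flows_sorted * criterion_mg_l * CF         # the load duration curve

for p in (0.1, 0.5, 0.9):
    i = np.searchsorted(exceedance, p)
    print(f"{p:.0%} exceedance: allowable load ~ {allowable_load[i]:.1f} lb/day")
```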

  8. Integrated Tools for Future Distributed Engine Control Technologies

    NASA Technical Reports Server (NTRS)

    Culley, Dennis; Thomas, Randy; Saus, Joseph

    2013-01-01

    Turbine engines are highly complex mechanical systems that are becoming increasingly dependent on control technologies to achieve system performance and safety metrics. However, the contribution of controls to these measurable system objectives is difficult to quantify due to a lack of tools capable of informing the decision makers. This shortcoming hinders technology insertion in the engine design process. NASA Glenn Research Center is developing a Hardware-in-the-Loop (HIL) platform and analysis tool set that will serve as a focal point for new control technologies, especially those related to the hardware development and integration of distributed engine control. The HIL platform is intended to enable rapid and detailed evaluation of new engine control applications, from conceptual design through hardware development, in order to quantify their impact on engine systems. This paper discusses the complex interactions of the control system, within the context of the larger engine system, and how new control technologies are changing that paradigm. The conceptual design of the new HIL platform is then described as a primary tool to address those interactions and how it will help feed the insertion of new technologies into future engine systems.

  9. Assessment of 100% Rapid Review as an Effective Tool for Internal Quality Control in Cytopathological Services.

    PubMed

    Queiroz Filho, José; de Oliveira Crispim Freitas, Janaina Cristiana; Caldas Pessoa, Daliana; Eleutério Júnior, José; Giraldo, Paulo César; Gonçalves, Ana Katherine

    2017-01-01

    The aim of this study was to evaluate the 100% rapid review (100%-RR) as an effective tool for internal quality control (IQC) in gynecological cytopathology services. A total of 8,677 swabs were analyzed; the negative results were submitted to 100%-RR. Divergent cases were discussed in a consensus meeting to reach a conclusion on the final diagnosis. The data were entered into SAS statistical software, and the agreement of the 100%-RR results with the final diagnosis was tested with the weighted kappa statistic. Of the 8,155 smears characterized as negative, 255 (3.13%) were abnormal smears, and 552 (6.77%) unsatisfactory smears were deemed negative. Regarding the results on the 8,155 smears subjected to 100%-RR when compared with the final diagnosis, there was agreement in 7,063 (86.60%) of them, and there were 1,092 (13.40%) discordant results (65.6%, unsatisfactory; 5.47%, atypical squamous cells of undetermined significance [ASC-US]). The κ index had an agreement of 0.867, with κ = 0.734 (p < 0.0001). Compared with the final diagnosis, the sensitivity of 100%-RR was 99.91% and its specificity was 99.4% for severe abnormalities. The sensitivity for high-grade squamous intraepithelial lesions was 88.2%, with a specificity of 100.00%. For abnormalities considered borderline, such as ASC-US, the sensitivity was 94.50% and the specificity was 99.5%. The 100%-RR was considered efficient when used as an IQC method. © 2017 S. Karger AG, Basel.
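
    For reference, the accuracy measures reported above reduce to the usual definitions; the counts in this small sketch are placeholders, not the study data.

```python
# Sensitivity and specificity from placeholder confusion-matrix counts.
def sensitivity_specificity(tp, fn, tn, fp):
    return tp / (tp + fn), tn / (tn + fp)

sens, spec = sensitivity_specificity(tp=90, fn=10, tn=950, fp=5)
print(f"sensitivity {sens:.1%}, specificity {spec:.1%}")
```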

  10. QUALITY CONTROL OF PHARMACEUTICALS.

    PubMed

    LEVI, L; WALKER, G C; PUGSLEY, L I

    1964-10-10

    Quality control is an essential operation of the pharmaceutical industry. Drugs must be marketed as safe and therapeutically active formulations whose performance is consistent and predictable. New and better medicinal agents are being produced at an accelerated rate. At the same time more exacting and sophisticated analytical methods are being developed for their evaluation. Requirements governing the quality control of pharmaceuticals in accordance with the Canadian Food and Drugs Act are cited and discussed.

  11. Online Error Reporting for Managing Quality Control Within Radiology.

    PubMed

    Golnari, Pedram; Forsberg, Daniel; Rosipko, Beverly; Sunshine, Jeffrey L

    2016-06-01

    Information technology systems within health care, such as the picture archiving and communication system (PACS) in radiology, can have a positive impact on production but can also risk compromising quality. The widespread use of PACS has removed the previous feedback loop between radiologists and technologists. Instead of direct communication of quality discrepancies found for an examination, the radiologist submitted a paper-based quality-control report. A web-based issue-reporting tool can help restore some of the feedback loop and also provide possibilities for more detailed analysis of submitted errors. The purpose of this study was to evaluate the hypothesis that data from use of online error reporting software for quality control can focus our efforts within our department. For the 372,258 radiologic examinations conducted during the 6-month study period, 930 errors (390 exam protocol, 390 exam validation, and 150 exam technique) were submitted, corresponding to an error rate of 0.25%. Within the exam protocol category, technologist documentation had the highest number of submitted errors in ultrasonography (77 errors [44%]), while imaging protocol errors were the largest subtype for the computed tomography modality (35 errors [18%]). Positioning and incorrect accession had the highest error counts in the exam technique and exam validation categories, respectively, for nearly all of the modalities. An error rate of less than 1% could signify a system of very high quality; however, a more likely explanation is that not all errors were detected or reported. Furthermore, staff reception of the error reporting system could also affect the reporting rate.
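
    The reported rates follow directly from the submitted counts quoted in the abstract, as in this tiny sketch.

```python
# Error rates per category from the counts quoted in the abstract.
errors = {"exam protocol": 390, "exam validation": 390, "exam technique": 150}
total_exams = 372_258

for category, n in errors.items():
    print(f"{category}: {n} ({n / total_exams:.3%} of examinations)")
print(f"overall error rate: {sum(errors.values()) / total_exams:.2%}")
```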

  12. 7 CFR 58.928 - Quality control tests.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 3 2011-01-01 2011-01-01 false Quality control tests. 58.928 Section 58.928... Procedures § 58.928 Quality control tests. All dairy products and other ingredients shall be subject to inspection for quality and condition throughout each processing operation. Quality control tests shall be...

  13. Expert database system for quality control

    NASA Astrophysics Data System (ADS)

    Wang, Anne J.; Li, Zhi-Cheng

    1993-09-01

    There are more competitors today, and markets are no longer homogeneous: they are fragmented into increasingly focused niches requiring greater flexibility in the product mix, shorter manufacturing production runs and, above all, higher quality. In this paper the authors identify a real-time expert system as a way to improve plantwide quality management. The quality control expert database system (QCEDS), by integrating the knowledge of experts in operations, quality management and computer systems, uses all information relevant to quality management, facts as well as rules, to determine whether a product meets quality standards. Keywords: expert system, quality control, database.

  14. Quality evaluation of JAMA Patient Pages on diabetes using the Ensuring Quality Information for Patient (EQIP) tool.

    PubMed

    Vaona, Alberto; Marcon, Alessandro; Rava, Marta; Buzzetti, Roberto; Sartori, Marco; Abbinante, Crescenza; Moser, Andrea; Seddaiu, Antonia; Prontera, Manuela; Quaglio, Alessandro; Pallazzoni, Piera; Sartori, Valentina; Rigon, Giulio

    2011-12-01

    Many medical journals provide patient information leaflets on the correct use of medicines and/or appropriate lifestyles. Only a few studies have assessed the quality of this patient-specific literature. The purpose of this study was to evaluate the quality of JAMA Patient Pages on diabetes using the Ensuring Quality Information for Patient (EQIP) tool. A multidisciplinary group of 10 medical doctors analyzed all diabetes-related Patient Pages published by JAMA from 1998 to 2010 using the EQIP tool. Inter-rater reliability was assessed using the percentage of observed total agreement (p(o)). A quality score between 0 and 1 (higher scores indicating higher quality) was calculated for each item on every page as a function of the raters' answers to the EQIP checklist. A mean score per item and a mean score per page were then calculated. We found 8 Patient Pages on diabetes on the JAMA web site. The overall quality score of the documents ranged between 0.55 (Managing Diabetes and Diabetes) and 0.67 (Weight and Diabetes). p(o) was at least moderate (>50%) for 15 of the 20 EQIP items. Despite generally favorable quality scores, some items received low scores. The worst scores were for the item assessing provision of an empty space to customize information for individual patients (score=0.01, p(o)=95%) and the item assessing patient involvement in document drafting (score=0.11, p(o)=79%). The Patient Pages on diabetes published by JAMA were found to present weak points that limit their overall quality and may jeopardize their efficacy. We therefore recommend that authors and publishers of written patient information comply with published quality criteria. Further research is needed to evaluate the quality and efficacy of existing written health care information. Copyright © 2011 Primary Care Diabetes Europe. Published by Elsevier Ltd. All rights reserved.

  15. Quality of Reporting Randomized Controlled Trials in Five Leading Neurology Journals in 2008 and 2013 Using the Modified "Risk of Bias" Tool.

    PubMed

    Zhai, Xiao; Cui, Jin; Wang, Yiran; Qu, Zhiquan; Mu, Qingchun; Li, Peiwen; Zhang, Chaochao; Yang, Mingyuan; Chen, Xiao; Chen, Ziqiang; Li, Ming

    2017-03-01

    To examine the risk of bias and methodological quality of reports of randomized clinical trials (RCTs) in major neurology journals before and after the 2011 update of the Cochrane risk of bias tool. RCTs in 5 leading neurology journals in 2008 and 2013 were searched systematically. Characteristics were extracted based on the list of the modified Cochrane Collaboration tool. Country, number of patients, type of intervention, and funding source were also examined for further analysis. A total of 138 RCTs were included in this study. The rates of following a trial plan were 61.6% for allocation generation, 52.9% for allocation concealment, 84.8% for blinding of participants or personnel, 34.8% for blinding of outcome assessment, 78.3% for incomplete outcome data, and 67.4% for selective reporting. A significant setback was found in selective reporting in 2013 compared with 2008. Trials performed by multiple centers and on a large scale had significantly more "low risk of bias" ratings. Not only was the number of surgical trials (5.8%) much smaller than that of drug trials (73.9%), but the reporting quality of surgical trials was also worse (P = 0.008). Finally, only 17.4% of trials met the criterion of "low risk of bias." The modified "risk of bias" tool is an improved version for assessment. The methodological quality of RCT reports in the 5 neurology journals is unsatisfactory, especially for surgical RCTs, and could be further improved. Copyright © 2017 Elsevier Inc. All rights reserved.

  16. 7 CFR 58.335 - Quality control tests.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 3 2011-01-01 2011-01-01 false Quality control tests. 58.335 Section 58.335... Procedures § 58.335 Quality control tests. All milk, cream and related products are subject to inspection for quality and condition throughout each processing operation. Quality control tests shall be made on flow...

  17. Using Quality Management Tools to Enhance Feedback from Student Evaluations

    ERIC Educational Resources Information Center

    Jensen, John B.; Artz, Nancy

    2005-01-01

    Statistical tools found in the service quality assessment literature, the T² statistic combined with factor analysis, can enhance the feedback instructors receive from student ratings. T² examines variability across multiple sets of ratings to isolate individual respondents with aberrant response…
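
    A minimal numpy sketch of the idea, not the article's procedure: compute a Hotelling-type T² (squared Mahalanobis distance) for each respondent's vector of item ratings and flag the largest values. The ratings here are synthetic.

```python
# Hotelling-type T^2 per respondent on synthetic rating vectors.
import numpy as np

rng = np.random.default_rng(2)
ratings = rng.normal(4.0, 0.5, size=(40, 6))   # 40 students x 6 survey items
ratings[3] = [1, 5, 1, 5, 1, 5]                # one aberrant responder

d = ratings - ratings.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(ratings, rowvar=False))
t2 = np.einsum("ij,jk,ik->i", d, cov_inv, d)   # T^2 for each respondent

print("most aberrant respondents:", np.argsort(t2)[::-1][:3])
```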

  18. Educational Tool for Optimal Controller Tuning Using Evolutionary Strategies

    ERIC Educational Resources Information Center

    Carmona Morales, D.; Jimenez-Hornero, J. E.; Vazquez, F.; Morilla, F.

    2012-01-01

    In this paper, an optimal tuning tool is presented for control structures based on multivariable proportional-integral-derivative (PID) control, using genetic algorithms as an alternative to traditional optimization algorithms. From an educational point of view, this tool provides students with the necessary means to consolidate their knowledge on…
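
    A compact educational sketch in the same spirit (not the article's tool): tune PID gains for a first-order plant under a unit-step setpoint using a simple evolutionary strategy that minimises the integral of absolute error. The plant model, population size and mutation scale are arbitrary assumptions.

```python
# Evolve PID gains for a toy first-order plant by minimising IAE.
import numpy as np

rng = np.random.default_rng(3)
DT, STEPS, TAU = 0.05, 400, 1.0

def iae(gains):
    kp, ki, kd = gains
    y = integ = prev_err = 0.0
    cost = 0.0
    for step in range(STEPS):
        err = 1.0 - y                           # unit step setpoint
        integ += err * DT
        deriv = (err - prev_err) / DT if step else 0.0
        u = kp * err + ki * integ + kd * deriv  # PID control law
        y += DT * (-y + u) / TAU                # first-order plant
        prev_err = err
        cost += abs(err) * DT
        if not np.isfinite(y) or abs(y) > 1e6:
            return 1e9                          # penalise unstable candidates
    return cost

pop = rng.uniform(0.0, 5.0, size=(30, 3))       # candidate (Kp, Ki, Kd) triples
for _ in range(40):                             # simple (mu + lambda) evolution
    scores = np.array([iae(g) for g in pop])
    parents = pop[np.argsort(scores)[:10]]
    children = parents[rng.integers(0, 10, size=20)] + rng.normal(0, 0.2, (20, 3))
    pop = np.vstack([parents, np.clip(children, 0.0, None)])

best = min(pop, key=iae)
print("best gains (Kp, Ki, Kd):", np.round(best, 2))
```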

  19. Scater: pre-processing, quality control, normalization and visualization of single-cell RNA-seq data in R.

    PubMed

    McCarthy, Davis J; Campbell, Kieran R; Lun, Aaron T L; Wills, Quin F

    2017-04-15

    Single-cell RNA sequencing (scRNA-seq) is increasingly used to study gene expression at the level of individual cells. However, preparing raw sequence data for further analysis is not a straightforward process. Biases, artifacts and other sources of unwanted variation are present in the data, requiring substantial time and effort to be spent on pre-processing, quality control (QC) and normalization. We have developed the R/Bioconductor package scater to facilitate rigorous pre-processing, quality control, normalization and visualization of scRNA-seq data. The package provides a convenient, flexible workflow to process raw sequencing reads into a high-quality expression dataset ready for downstream analysis. scater provides a rich suite of plotting tools for single-cell data and a flexible data structure that is compatible with existing tools and can be used as infrastructure for future software development. The open-source code, along with installation instructions, vignettes and case studies, is available through Bioconductor at http://bioconductor.org/packages/scater . davis@ebi.ac.uk. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press.
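
    scater itself is an R/Bioconductor package; as a language-neutral illustration of the per-cell QC metrics such workflows compute, the numpy sketch below derives total counts and genes detected from a toy count matrix and applies a simple percentile filter.

```python
# Per-cell QC metrics on a toy count matrix (generic illustration, not scater).
import numpy as np

rng = np.random.default_rng(4)
counts = rng.poisson(lam=0.3, size=(500, 2000))   # cells x genes

total_counts = counts.sum(axis=1)
genes_detected = (counts > 0).sum(axis=1)

keep = ((total_counts > np.percentile(total_counts, 5))
        & (genes_detected > np.percentile(genes_detected, 5)))
print(f"retained {keep.sum()} of {len(keep)} cells after QC filtering")
```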

  20. Quality Control of Pharmaceuticals

    PubMed Central

    Levi, Leo; Walker, George C.; Pugsley, L. I.

    1964-01-01

    Quality control is an essential operation of the pharmaceutical industry. Drugs must be marketed as safe and therapeutically active formulations whose performance is consistent and predictable. New and better medicinal agents are being produced at an accelerated rate. At the same time more exacting and sophisticated analytical methods are being developed for their evaluation. Requirements governing the quality control of pharmaceuticals in accordance with the Canadian Food and Drugs Act are cited and discussed. PMID:14199105

  1. Severe postpartum haemorrhage after vaginal delivery: a statistical process control chart to report seven years of continuous quality improvement.

    PubMed

    Dupont, Corinne; Occelli, Pauline; Deneux-Tharaux, Catherine; Touzet, Sandrine; Duclos, Antoine; Bouvier-Colle, Marie-Hélène; Rudigoz, René-Charles; Huissoud, Cyril

    2014-07-01

    To use statistical process control charts to describe trends in the prevalence of severe postpartum haemorrhage after vaginal delivery. This assessment was performed 7 years after we initiated a continuous quality improvement programme that began with regular criteria-based audits. Observational descriptive study in a French maternity unit in the Rhône-Alpes region. Quarterly clinical audit meetings were held to analyse all cases of severe postpartum haemorrhage after vaginal delivery and provide feedback on quality of care with statistical process control tools. The primary outcomes were the prevalence of severe PPH after vaginal delivery and its quarterly monitoring with a control chart. The secondary outcomes included the global quality of care for women with severe postpartum haemorrhage, including the performance rate of each recommended procedure. Differences in these variables between 2005 and 2012 were tested. From 2005 to 2012, the prevalence of severe postpartum haemorrhage declined significantly, from 1.2% to 0.6% of vaginal deliveries (p<0.001). Since 2010, the quarterly rate of severe PPH has not exceeded the upper control limit, that is, it has not been out of statistical control. The proportion of cases managed consistently with the guidelines increased for all of the main components of care. Implementation of continuous quality improvement efforts began seven years ago and used, among other tools, statistical process control charts. During this period, the prevalence of severe postpartum haemorrhage after vaginal delivery was reduced by 50%. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
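    As a hedged sketch of the control-chart arithmetic behind such monitoring, the snippet below computes a p-chart centre line and 3-sigma limits for a quarterly proportion of severe PPH per vaginal delivery; the counts are invented for illustration and are not the study's data.

        # p-chart centre line and 3-sigma limits for quarterly proportions (illustrative counts).
        import numpy as np

        cases      = np.array([12, 10, 9, 11, 8, 7, 6, 6])                    # severe PPH per quarter
        deliveries = np.array([1000, 980, 1020, 990, 1010, 1000, 995, 1005])  # vaginal deliveries per quarter

        p_bar = cases.sum() / deliveries.sum()                   # centre line
        sigma = np.sqrt(p_bar * (1 - p_bar) / deliveries)        # varies with the quarterly denominator
        ucl = p_bar + 3 * sigma
        lcl = np.clip(p_bar - 3 * sigma, 0, None)

        for q, (p, lo, hi) in enumerate(zip(cases / deliveries, lcl, ucl), start=1):
            status = "out of control" if (p > hi or p < lo) else "in control"
            print(f"Q{q}: p={p:.4f}  limits=({lo:.4f}, {hi:.4f})  {status}")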

  2. Dynamic principle for ensemble control tools.

    PubMed

    Samoletov, A; Vasiev, B

    2017-11-28

    Dynamical equations describing physical systems in contact with a thermal bath are commonly extended by mathematical tools called "thermostats." These tools are designed for sampling ensembles in statistical mechanics. Here we propose a dynamic principle underlying a range of thermostats which is derived using fundamental laws of statistical physics and ensures invariance of the canonical measure. The principle covers both stochastic and deterministic thermostat schemes. Our method has a clear advantage over a range of proposed and widely used thermostat schemes that are based on formal mathematical reasoning. Following the derivation of the proposed principle, we show its generality and illustrate its applications including design of temperature control tools that differ from the Nosé-Hoover-Langevin scheme.
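    The record does not give the authors' scheme in detail, so as a hedged, generic illustration of what a thermostat does, the sketch below integrates a one-dimensional harmonic oscillator with a standard Langevin thermostat (Euler-Maruyama discretisation) so that velocities approximately sample the canonical distribution; all parameters are arbitrary.

        # Standard Langevin thermostat on a 1-D harmonic oscillator (not the authors' scheme).
        import numpy as np

        rng = np.random.default_rng(2)
        kT, m, k, gamma, dt = 1.0, 1.0, 1.0, 0.5, 0.01
        x, v = 1.0, 0.0
        samples = []
        for step in range(200_000):
            f = -k * x                                   # conservative force
            # Euler-Maruyama step of the underdamped Langevin equation
            v += (f / m - gamma * v) * dt + np.sqrt(2 * gamma * kT / m * dt) * rng.standard_normal()
            x += v * dt
            if step > 50_000:                            # discard equilibration
                samples.append(v)

        print("<v^2> ~", np.mean(np.square(samples)), "(canonical target kT/m =", kT / m, ")")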

  3. Computer applications in scientific balloon quality control

    NASA Astrophysics Data System (ADS)

    Seely, Loren G.; Smith, Michael S.

    Seal defects and seal tensile strength are primary determinants of product quality in scientific balloon manufacturing; they therefore require a unit of quality measure. The availability of inexpensive and powerful data-processing tools can serve as the basis of a quality-trends-discerning analysis of products. The results of one such analysis are presently given in graphic form for use on the production floor. Software descriptions and their sample outputs are presented, together with a summary of overall and long-term effects of these methods on product quality.

  4. 30 CFR 28.31 - Quality control plans; contents.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Quality control plans; contents. 28.31 Section... PROTECTION FOR TRAILING CABLES IN COAL MINES Quality Control § 28.31 Quality control plans; contents. (a) Each quality control plan shall contain provisions for the management of quality, including: (1...

  5. [Highly quality-controlled radiation therapy].

    PubMed

    Shirato, Hiroki

    2005-04-01

    Advanced radiation therapy for intracranial disease has focused on set-up accuracy for the past 15 years. However, quality control in the prescribed dose is actually as important as the tumor set-up in radiation therapy. Because of the complexity of the three-dimensional radiation treatment planning system in recent years, the highly quality-controlled prescription of the dose has now been reappraised as the mainstream to improve the treatment outcome of radiation therapy for intracranial disease. The Japanese Committee for Quality Control of Radiation Therapy has developed fundamental requirements such as a QC committee in each hospital, a medical physicist, dosimetrists (QC members), and an external audit.

  6. rnaQUAST: a quality assessment tool for de novo transcriptome assemblies.

    PubMed

    Bushmanova, Elena; Antipov, Dmitry; Lapidus, Alla; Suvorov, Vladimir; Prjibelski, Andrey D

    2016-07-15

    The ability to generate large RNA-Seq datasets has created a demand for both de novo and reference-based transcriptome assemblers. However, while many transcriptome assemblers are now available, there is still no unified quality assessment tool for RNA-Seq assemblies. We present rnaQUAST, a tool for evaluating RNA-Seq assembly quality and benchmarking transcriptome assemblers using a reference genome and gene database. rnaQUAST calculates various metrics that demonstrate the completeness and correctness of the assembled transcripts, and outputs them in a user-friendly report. rnaQUAST is implemented in Python and is freely available at http://bioinf.spbau.ru/en/rnaquast. Contact: ap@bioinf.spbau.ru. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  7. The utilization of six sigma and statistical process control techniques in surgical quality improvement.

    PubMed

    Sedlack, Jeffrey D

    2010-01-01

    Surgeons have been slow to incorporate industrial reliability techniques. Process control methods were applied to surgeon waiting time between cases, and to length of stay (LOS) after colon surgery. Waiting times between surgeries were evaluated by auditing the operating room records of a single hospital over a 1-month period. The medical records of 628 patients undergoing colon surgery over a 5-year period were reviewed. The average surgeon wait time between cases was 53 min, and the busiest surgeon spent 29.5 hours in 1 month waiting between surgeries. Process control charting demonstrated poor overall control of the room turnover process. Average LOS after colon resection also demonstrated very poor control. Mean LOS was 10 days. Weibull's conditional analysis revealed a conditional LOS of 9.83 days. Serious process management problems were identified in both analyses. These process issues are expensive and adversely affect the quality of service offered by the institution. Process control mechanisms were suggested or implemented to improve these surgical processes. Industrial reliability and quality management tools can easily and effectively identify process control problems that occur on surgical services. © 2010 National Association for Healthcare Quality.

  8. A new generation of tools for search, recovery and quality evaluation of World Wide Web medical resources.

    PubMed

    Aguillo, I

    2000-01-01

    Although the Internet is already a valuable information resource in medicine, there are important challenges to be faced before physicians and general users will have extensive access to this information. As a result of a research effort to compile a health-related Internet directory, new tools and strategies have been developed to solve key problems derived from the explosive growth of medical information on the Net and the great concern over the quality of such critical information. The current Internet search engines lack some important capabilities. We suggest using second generation tools (client-side based) able to deal with large quantities of data and to increase the usability of the records recovered. We tested the capabilities of these programs to solve health-related information problems, recognising six groups according to the kind of topics addressed: Z39.50 clients, downloaders, multisearchers, tracing agents, indexers and mappers. The evaluation of the quality of health information available on the Internet could require a large amount of human effort. A possible solution may be to use quantitative indicators based on the hypertext visibility of the Web sites. The cybermetric measures are valid for quality evaluation if they are derived from indirect peer review by experts with Web pages citing the site. The hypertext links acting as citations need to be extracted from a controlled sample of quality super-sites.

  9. Innovative Tools for Water Quality/Quantity Management: New York City's Operations Support Tool

    NASA Astrophysics Data System (ADS)

    Wang, L.; Schaake, J. C.; Day, G. N.; Porter, J.; Sheer, D. P.; Pyke, G.

    2011-12-01

    The New York City Department of Environmental Protection (DEP) manages New York City's water supply, which is comprised of over 20 reservoirs and supplies more than 1 billion gallons of water per day to over 9 million customers. Recently, DEP has initiated design of an Operations Support Tool (OST), a state-of-the-art decision support system to provide computational and predictive support for water supply operations and planning. This presentation describes the technical structure of OST, including the underlying water supply and water quality models, data sources and database management, reservoir inflow forecasts, and the functionalities required to meet the needs of a diverse group of end users. OST is a major upgrade of DEP's current water supply - water quality model, developed to evaluate alternatives for controlling turbidity in NYC's Catskill reservoirs. While the current model relies on historical hydrologic and meteorological data, OST can be driven by forecasted future conditions. It will receive a variety of near-real-time data from a number of sources. OST will support two major types of simulations: long-term, for evaluating policy or infrastructure changes over an extended period of time; and short-term "position analysis" (PA) simulations, consisting of multiple short simulations, all starting from the same initial conditions. Typically, the starting conditions for a PA run will represent those for the current day and traces of forecasted hydrology will drive the model for the duration of the simulation period. The result of these simulations will be a distribution of future system states based on system operating rules and the range of input ensemble streamflow predictions. DEP managers will analyze the output distributions and make operation decisions using risk-based metrics such as probability of refill. Currently, in the developmental stages of OST, forecasts are based on antecedent hydrologic conditions and are statistical in nature. The

  10. 30 CFR 74.6 - Quality control.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 30 Mineral Resources 1 2011-07-01 2011-07-01 false Quality control. 74.6 Section 74.6 Mineral... of the CMDPSU will be maintained in production through adequate quality control procedures, MSHA and... DUST SAMPLING DEVICES Approval Requirements for Coal Mine Dust Personal Sampler Unit § 74.6 Quality...

  11. Total Quality Management (TQM), an Overview

    DTIC Science & Technology

    1991-09-01

    Quality Management (TQM). It discusses the reasons TQM is a current growth industry, what it is, and how one implements it. It describes the basic analytical tools, statistical process control, some advanced analytical tools, tools used by process improvement teams to enhance their own operations, and action plans for making improvements. The final sections discuss assessing quality efforts and measuring the quality to knowledge

  12. Tools for proactive collection and use of quality metadata in GEOSS

    NASA Astrophysics Data System (ADS)

    Bastin, L.; Thum, S.; Maso, J.; Yang, K. X.; Nüst, D.; Van den Broek, M.; Lush, V.; Papeschi, F.; Riverola, A.

    2012-12-01

    The GEOSS Common Infrastructure allows interactive evaluation and selection of Earth Observation datasets by the scientific community and decision makers, but the data quality information needed to assess fitness for use is often patchy and hard to visualise when comparing candidate datasets. In a number of studies over the past decade, users repeatedly identified the same types of gaps in quality metadata, specifying the need for enhancements such as peer and expert review, better traceability and provenance information, information on citations and usage of a dataset, warning about problems identified with a dataset and potential workarounds, and 'soft knowledge' from data producers (e.g. recommendations for use which are not easily encoded using the existing standards). Despite clear identification of these issues in a number of recommendations, the gaps persist in practice and are highlighted once more in our own, more recent, surveys. This continuing deficit may well be the result of a historic paucity of tools to support the easy documentation and continual review of dataset quality. However, more recent developments in tools and standards, as well as more general technological advances, present the opportunity for a community of scientific users to adopt a more proactive attitude by commenting on their uses of data, and for that feedback to be federated with more traditional and static forms of metadata, allowing a user to more accurately assess the suitability of a dataset for their own specific context and reliability thresholds. The EU FP7 GeoViQua project aims to develop this opportunity by adding data quality representations to the existing search and visualisation functionalities of the Geo Portal. Subsequently we will help to close the gap by providing tools to easily create quality information, and to permit user-friendly exploration of that information as the ultimate incentive for improved data quality documentation. Quality information is derived

  13. Online tools for uncovering data quality issues in satellite-based global precipitation products

    NASA Astrophysics Data System (ADS)

    Liu, Z.; Heo, G.

    2015-12-01

    Accurate and timely available global precipitation products are important to many applications such as flood forecasting, hydrological modeling, vector-borne disease research, crop yield estimates, etc. However, data quality issues such as biases and uncertainties are common in satellite-based precipitation products and it is important to understand these issues in applications. In recent years, algorithms using multi-satellites and multi-sensors for satellite-based precipitation estimates have become popular, such as the TRMM (Tropical Rainfall Measuring Mission) Multi-satellite Precipitation Analysis (TMPA) and the latest Integrated Multi-satellitE Retrievals for GPM (IMERG). Studies show that data quality issues for multi-satellite and multi-sensor products can vary with space and time and can be difficult to summarize. Online tools can provide customized results for a given area of interest, allowing customized investigation or comparison on several precipitation products. Because downloading data and software is not required, online tools can facilitate precipitation product evaluation and comparison. In this presentation, we will present online tools to uncover data quality issues in satellite-based global precipitation products. Examples will be presented as well.

  14. Measuring and improving the quality of postoperative epidural analgesia for major abdominal surgery using statistical process control charts.

    PubMed

    Duncan, Fiona; Haigh, Carol

    2013-10-01

    To explore and improve the quality of continuous epidural analgesia for pain relief using Statistical Process Control tools. Measuring the quality of pain management interventions is complex. Intermittent audits do not accurately capture the results of quality improvement initiatives. The failure rate for one intervention, epidural analgesia, is approximately 30% in everyday practice, so it is an important area for improvement. Continuous measurement and analysis are required to understand the multiple factors involved in providing effective pain relief. Process control and quality improvement. Routine, prospectively acquired data collection started in 2006. Patients were asked about their pain and side effects of treatment. Statistical Process Control methods were applied for continuous data analysis. A multidisciplinary group worked together to identify reasons for variation in the data and instigated ideas for improvement. The key measure for improvement was a reduction in the percentage of patients with an epidural in severe pain. The baseline control charts illustrated the recorded variation in the rate of several processes and outcomes for 293 surgical patients. The mean visual analogue pain score (VNRS) was four. There was no special cause variation when data were stratified by surgeons, clinical area or patients who had experienced pain before surgery. Fifty-seven per cent of patients were hypotensive on the first day after surgery. We were able to demonstrate a significant improvement in the failure rate of epidurals as the project continued with quality improvement interventions. Statistical Process Control is a useful tool for measuring and improving the quality of pain management. The applications of Statistical Process Control methods offer the potential to learn more about the process of change and outcomes in an Acute Pain Service both locally and nationally. We have been able to develop measures for improvement and benchmarking in routine care that

  15. Quality control and primo-diagnosis of transurethral bladder resections with full-field OCT

    NASA Astrophysics Data System (ADS)

    Montagne, P.; Ducesne, I.; Anract, J.; Yang, C.; Sibony, M.; Beuvon, F.; Delongchamps, N. B.; Dalimier, E.

    2017-02-01

    Transurethral resections are commonly used for bladder cancer diagnosis, treatment and follow-up. Cancer staging relies largely on the analysis of muscle in the resections; however, muscle presence is uncertain at the time of the resection. An extemporaneous quality control tool would be of great use to certify the presence of muscle in the resection, and potentially formulate a primo-diagnosis, in order to ensure optimum patient care. Full-field optical coherence tomography (FFOCT) offers a fast and non-destructive method of obtaining images of biological tissues at ultrahigh resolution (1μm in all 3 directions), approaching traditional histological sections. This study aimed to evaluate the potential of FFOCT for the quality control and the primo-diagnosis of transurethral bladder resections. Over 70 transurethral bladder resections were imaged with FFOCT within minutes, shortly after excision, and before histological preparation. Side-by-side comparison with histology allowed reading criteria to be established for the presence of muscle and cancer in particular. Images of 24 specimens were read blindly by three non-pathologist readers: two resident urologists and a junior biomedical engineer, who were asked to report the presence of muscle and tumor. Results showed that after appropriate training, 96% accuracy could be obtained on both tumor and muscle detection. FFOCT is a fast and nondestructive imaging technique that provides analysis results concordant with histology. Its implementation as a quality control and primo-diagnosis tool for transurethral bladder resections in the urology suite is feasible and promises high value for the patient.

  16. A Review and Analysis of Remote Sensing Capability for Air Quality Measurements as a Potential Decision Support Tool Conducted by the NASA DEVELOP Program

    NASA Technical Reports Server (NTRS)

    Ross, A.; Richards, A.; Keith, K.; Frew, C.; Boseck, J.; Sutton, S.; Watts, C.; Rickman, D.

    2007-01-01

    This project focused on a comprehensive utilization of air quality model products as decision support tools (DST) needed for public health applications. A review of past and future air quality measurement methods and their uncertainty, along with the relationship of air quality to national and global public health, is vital. This project described current and future NASA satellite remote sensing and ground sensing capabilities and the potential for using these sensors to enhance the prediction, prevention, and control of public health effects that result from poor air quality. The qualitative uncertainty of current satellite remotely sensed air quality, the ground-based remotely sensed air quality, the air quality/public health model, and the decision making process is evaluated in this study. Current peer-reviewed literature suggests that remotely sensed air quality parameters correlate well with ground-based sensor data. A combination of satellite and ground-based remote sensing data is needed to enhance the models and tools used by policy makers for the protection of national and global public health communities.

  17. [Quality control in herbal supplements].

    PubMed

    Oelker, Luisa

    2005-01-01

    The quality and safety of food and herbal supplements are the result of a set of different elements, such as good manufacturing practice and process control. The process control must be active and able to identify and correct all possible hazards. The main and most widely used instrument is the hazard analysis and critical control point (HACCP) system, the correct application of which can guarantee the safety of the product. Herbal supplements need, in addition to standard quality control, a set of checks to assure the harmlessness and safety of the plants used.

  18. Operational and Strategic Controlling Tools in Microenterprises - Case Study

    NASA Astrophysics Data System (ADS)

    Konsek-Ciechońska, Justyna

    2017-12-01

    Globalisation and the increasing demands of the business environment cause executives and supervisors to search for ever better solutions that allow them to streamline and improve the effectiveness of company operations. One such tool, used more and more often, is controlling, whose role has grown substantially in recent years. It is implemented not only in large companies with foreign capital, but also in increasingly smaller entities, which are beginning to notice the positive effects of applying the principles and tools of controlling, both operational and strategic. The purpose of the article is to demonstrate the practical side of controlling tools that can be used in the operations of microenterprises.

  19. Continuous integration and quality control for scientific software

    NASA Astrophysics Data System (ADS)

    Neidhardt, A.; Ettl, M.; Brisken, W.; Dassing, R.

    2013-08-01

    Modern software has to be stable, portable, fast and reliable. This is becoming more and more important for scientific software as well. But it requires a sophisticated way to inspect, check and evaluate the quality of source code with a suitable, automated infrastructure. A centralized server with a software repository and a version control system is one essential part, used to manage the code base and to control the different development versions. While each project can be compiled separately, the whole code base can also be compiled with one central “Makefile”. This is used to create automated, nightly builds. Additionally, all sources are inspected automatically with static code analysis and inspection tools, which check for well-known error situations, memory and resource leaks, performance issues, and style issues. In combination with an automatic documentation generator it is possible to create the developer documentation directly from the code and the inline comments. All reports and generated information are presented as HTML pages on a Web server. Because this environment has tremendously increased the stability and quality of the software of the Geodetic Observatory Wettzell, it is now also available for scientific communities. One regular customer is already the developer group of the DiFX software correlator project.

  20. Reporting Quality Assessment of Randomized Controlled Trials Published in Nephrology Urology Monthly Journal.

    PubMed

    Mehrazmay, Alireza; Karambakhsh, Alireza; Salesi, Mahmood

    2015-07-01

    Randomized controlled trials (RCTs) are important tools for evidence-based health care decisions. It is, therefore, important that they be conducted and reported with the highest possible standards. The aim of this study was to evaluate the reporting quality of the RCTs published in the Nephrology Urology Monthly journal and to examine whether there was a change over time in the reporting quality. The quality of each report was assessed using the Consolidated Standards of Reporting Trials (CONSORT) 2010 Statement checklist and a 5-point quality assessment instrument, i.e. the Jadad scale. Eighteen (14 Iranian and 4 non-Iranian) RCTs were published from 2012 to 2014 on topics including renal stone (16.6%), hemodialysis and transplantation (38.8%), and prostate conditions (11.1%). Interventions comprised surgery, drugs, and a teaching method in 7 (38%), 10 (55%), and 1 (5%) of them, respectively. According to the CONSORT checklist, the weakest reported items were registration number, identification as a randomized trial in the title, and settings and locations where the data were collected. The mean Jadad score of the reports was 2.72 ± 1.36 (54% of their maximum possible total score). According to the Jadad and CONSORT scales, there was an increase in the quality of reporting from 2012 to 2014. This assessment shows low reporting quality scores in reports. Training courses for researchers, using standard reporting tools (e.g. CONSORT 2010 Statement checklist), and consultation with methodologists can improve the quality of published RCTs.

  1. Distributed sensor architecture for intelligent control that supports quality of control and quality of service.

    PubMed

    Poza-Lujan, Jose-Luis; Posadas-Yagüe, Juan-Luis; Simó-Ten, José-Enrique; Simarro, Raúl; Benet, Ginés

    2015-02-25

    This paper is part of a study of intelligent architectures for distributed control and communications systems. The study focuses on optimizing control systems by evaluating the performance of middleware through quality of service (QoS) parameters and the optimization of control using Quality of Control (QoC) parameters. The main aim of this work is to study, design, develop, and evaluate a distributed control architecture based on the Data-Distribution Service for Real-Time Systems (DDS) communication standard as proposed by the Object Management Group (OMG). As a result of the study, an architecture called Frame-Sensor-Adapter to Control (FSACtrl) has been developed. FSACtrl provides a model to implement an intelligent distributed Event-Based Control (EBC) system with support to measure QoS and QoC parameters. The novelty consists of using, simultaneously, the measured QoS and QoC parameters to make decisions about the control action with a new method called Event Based Quality Integral Cycle. To validate the architecture, the first five Braitenberg vehicles have been implemented using the FSACtrl architecture. The experimental outcomes demonstrate the convenience of jointly using QoS and QoC parameters in distributed control systems.

  2. Distributed Sensor Architecture for Intelligent Control that Supports Quality of Control and Quality of Service

    PubMed Central

    Poza-Lujan, Jose-Luis; Posadas-Yagüe, Juan-Luis; Simó-Ten, José-Enrique; Simarro, Raúl; Benet, Ginés

    2015-01-01

    This paper is part of a study of intelligent architectures for distributed control and communications systems. The study focuses on optimizing control systems by evaluating the performance of middleware through quality of service (QoS) parameters and the optimization of control using Quality of Control (QoC) parameters. The main aim of this work is to study, design, develop, and evaluate a distributed control architecture based on the Data-Distribution Service for Real-Time Systems (DDS) communication standard as proposed by the Object Management Group (OMG). As a result of the study, an architecture called Frame-Sensor-Adapter to Control (FSACtrl) has been developed. FSACtrl provides a model to implement an intelligent distributed Event-Based Control (EBC) system with support to measure QoS and QoC parameters. The novelty consists of using, simultaneously, the measured QoS and QoC parameters to make decisions about the control action with a new method called Event Based Quality Integral Cycle. To validate the architecture, the first five Braitenberg vehicles have been implemented using the FSACtrl architecture. The experimental outcomes demonstrate the convenience of jointly using QoS and QoC parameters in distributed control systems. PMID:25723145

  3. Quality control education in the community college

    NASA Technical Reports Server (NTRS)

    Greene, J. Griffen; Wilson, Steve

    1966-01-01

    This paper describes the Quality Control Program at Daytona Beach Junior College, including course descriptions. The program in quality control required communication between the college and the American Society for Quality Control (ASQC). The college has machinery established for certification of the learning process, and the society has the source of teachers who are competent in the technical field and who are the employers of the educational products. The associate degree for quality control does not have a fixed program, which can serve all needs, any more than all engineering degrees have identical programs. The main ideas which would be common to all quality control programs are the concept of economic control of a repetitive process and the concept of developing individual potentialities into individuals who are needed and productive.

  4. Web-based decision support and visualization tools for water quality management in the Chesapeake Bay watershed

    USGS Publications Warehouse

    Mullinix, C.; Hearn, P.; Zhang, H.; Aguinaldo, J.

    2009-01-01

    Federal, State, and local water quality managers charged with restoring the Chesapeake Bay ecosystem require tools to maximize the impact of their limited resources. To address this need, the U.S. Geological Survey (USGS) and the Environmental Protection Agency's Chesapeake Bay Program (CBP) are developing a suite of Web-based tools called the Chesapeake Online Assessment Support Toolkit (COAST). The goal of COAST is to help CBP partners identify geographic areas where restoration activities would have the greatest effect, select the appropriate management strategies, and improve coordination and prioritization among partners. As part of the COAST suite of tools focused on environmental restoration, a water quality management visualization component called the Nutrient Yields Mapper (NYM) tool is being developed by USGS. The NYM tool is a web application that uses watershed yield estimates from the USGS SPAtially Referenced Regressions On Watershed attributes (SPARROW) model (Schwarz et al., 2006) [6] to allow water quality managers to identify important sources of nitrogen and phosphorus within the Chesapeake Bay watershed. The NYM tool utilizes new open source technologies that have become popular in geospatial web development, including components such as OpenLayers and GeoServer. This paper presents examples of water quality data analysis based on nutrient type, source, yield, and area of interest using the NYM tool for the Chesapeake Bay watershed. In addition, we describe examples of map-based techniques for identifying high and low nutrient yield areas; web map engines; and data visualization and data management techniques.

  5. The concentration-discharge slope as a tool for water quality management.

    PubMed

    Bieroza, M Z; Heathwaite, A L; Bechmann, M; Kyllmar, K; Jordan, P

    2018-07-15

    Recent technological breakthroughs of optical sensors and analysers have enabled matching the water quality measurement interval to the time scales of stream flow changes and led to an improved understanding of spatially and temporally heterogeneous sources and delivery pathways for many solutes and particulates. This new ability to match the chemograph with the hydrograph has promoted renewed interest in the concentration-discharge (c-q) relationship and its value in characterizing catchment storage, time lags and legacy effects for both weathering products and anthropogenic pollutants. In this paper we evaluated the stream c-q relationships for a number of water quality determinands (phosphorus, suspended sediments, nitrogen) in intensively managed agricultural catchments based on both high-frequency (sub-hourly) and long-term low-frequency (fortnightly-monthly) routine monitoring data. We used resampled high-frequency data to test the uncertainty in water quality parameters (e.g. mean, 95th percentile and load) derived from low-frequency sub-datasets. We showed that the uncertainty in water quality parameters increases with reduced sampling frequency as a function of the c-q slope. We also showed that different sources and delivery pathways control c-q relationship for different solutes and particulates. Secondly, we evaluated the variation in c-q slopes derived from the long-term low-frequency data for different determinands and catchments and showed strong chemostatic behaviour for phosphorus and nitrogen due to saturation and agricultural legacy effects. The c-q slope analysis can provide an effective tool to evaluate the current monitoring networks and the effectiveness of water management interventions. This research highlights how improved understanding of solute and particulate dynamics obtained with optical sensors and analysers can be used to understand patterns in long-term water quality time series, reduce the uncertainty in the monitoring data and to
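    As a minimal sketch of the c-q slope estimation the abstract refers to, the snippet below fits log10(c) = a + b*log10(q) to paired concentration-discharge observations; the synthetic data and the near-zero slope used to mimic chemostatic behaviour are assumptions for illustration only.

        # Estimate the concentration-discharge (c-q) slope b by log-log regression (synthetic data).
        import numpy as np

        rng = np.random.default_rng(3)
        q = 10 ** rng.uniform(-1, 2, 200)                        # discharge, arbitrary units
        true_b = -0.1                                            # near-chemostatic export regime
        c = 2.0 * q ** true_b * np.exp(rng.normal(0, 0.2, 200))  # concentration with log-normal noise

        b, a = np.polyfit(np.log10(q), np.log10(c), 1)           # slope first, intercept second
        print(f"estimated c-q slope b = {b:.3f}  (|b| close to 0 suggests chemostatic behaviour)")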

  6. Content, Quality, and Assessment Tools of Physician-Rating Websites in 12 Countries: Quantitative Analysis.

    PubMed

    Rothenfluh, Fabia; Schulz, Peter J

    2018-06-14

    Websites on which users can rate their physician are becoming increasingly popular, but little is known about the website quality, the information content, and the tools they offer users to assess physicians. This study assesses these aspects on physician-rating websites in German- and English-speaking countries. The objective of this study was to collect information on websites with a physician rating or review tool in 12 countries in terms of metadata, website quality (transparency, privacy and freedom of speech of physicians and patients, check mechanisms for appropriateness and accuracy of reviews, and ease of page navigation), professional information about the physician, rating scales and tools, as well as traffic rank. A systematic Web search based on a set of predefined keywords was conducted on Google, Bing, and Yahoo in August 2016. A final sample of 143 physician-rating websites was analyzed and coded for metadata, quality, information content, and the physician-rating tools. The majority of websites were registered in the United States (40/143) or Germany (25/143). The vast majority were commercially owned (120/143, 83.9%), and 69.9% (100/143) displayed some form of physician advertisement. Overall, information content (mean 9.95/25) as well as quality were low (mean 18.67/47). Websites registered in the United Kingdom obtained the highest quality scores (mean 26.50/47), followed by Australian websites (mean 21.50/47). In terms of rating tools, physician-rating websites were most frequently asking users to score overall performance, punctuality, or wait time in practice. This study evidences that websites that provide physician rating should improve and communicate their quality standards, especially in terms of physician and user protection, as well as transparency. In addition, given that quality standards on physician-rating websites are low overall, the development of transparent guidelines is required. Furthermore, attention should be paid to the

  7. 23 CFR 1340.8 - Quality control.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... OBSERVATIONAL SURVEYS OF SEAT BELT USE Survey Design Requirements § 1340.8 Quality control. (a) Quality control... control monitors involved in seat belt use surveys shall have received training in data collection...) Statistical review. Survey results shall be reviewed and approved by a survey statistician, i.e., a person...

  8. 23 CFR 1340.8 - Quality control.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... OBSERVATIONAL SURVEYS OF SEAT BELT USE Survey Design Requirements § 1340.8 Quality control. (a) Quality control... control monitors involved in seat belt use surveys shall have received training in data collection...) Statistical review. Survey results shall be reviewed and approved by a survey statistician, i.e., a person...

  9. 23 CFR 1340.8 - Quality control.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... OBSERVATIONAL SURVEYS OF SEAT BELT USE Survey Design Requirements § 1340.8 Quality control. (a) Quality control... control monitors involved in seat belt use surveys shall have received training in data collection...) Statistical review. Survey results shall be reviewed and approved by a survey statistician, i.e., a person...

  10. The swiss neonatal quality cycle, a monitor for clinical performance and tool for quality improvement

    PubMed Central

    2013-01-01

    Background: We describe the setup of a neonatal quality improvement tool and list which peer-reviewed requirements it fulfils and which it does not. We report on the so-far observed effects, how the units can identify quality improvement potential, and how they can measure the effect of changes made to improve quality. Methods: Application of a prospective longitudinal national cohort data collection that uses algorithms to ensure high data quality (i.e. checks for completeness, plausibility and reliability), and to perform data imaging (Plsek’s p-charts and standardized mortality or morbidity ratio (SMR) charts). The collected data allows monitoring a study collective of very low birth-weight infants born from 2009 to 2011 by applying a quality cycle following the steps 'guideline - perform - falsify - reform'. Results: 2025 VLBW live-births from 2009 to 2011, representing 96.1% of all VLBW live-births in Switzerland, display a similar mortality rate but better morbidity rates when compared to other networks. Data quality in general is high but subject to improvement in some units. Seven measurements display quality improvement potential in individual units. The methods used fulfil several international recommendations. Conclusions: The Quality Cycle of the Swiss Neonatal Network is a helpful instrument to monitor and gradually help improve the quality of care in a region with high quality standards and low statistical discrimination capacity. PMID:24074151
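    As a hedged sketch of the arithmetic behind an SMR chart, the snippet below computes a standardized mortality/morbidity ratio with an exact Poisson confidence interval; the observed and expected counts are invented and do not come from the network's data.

        # Standardized mortality/morbidity ratio (SMR) with an exact Poisson 95% CI (illustrative counts).
        from scipy.stats import chi2

        observed, expected = 14, 10.2                # events observed vs. expected from a reference model

        smr = observed / expected
        lower = chi2.ppf(0.025, 2 * observed) / (2 * expected)
        upper = chi2.ppf(0.975, 2 * (observed + 1)) / (2 * expected)
        print(f"SMR = {smr:.2f}  (95% CI {lower:.2f}-{upper:.2f})")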

  11. 42 CFR 84.41 - Quality control plans; contents.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 42 Public Health 1 2014-10-01 2014-10-01 false Quality control plans; contents. 84.41 Section 84... AND HEALTH RESEARCH AND RELATED ACTIVITIES APPROVAL OF RESPIRATORY PROTECTIVE DEVICES Quality Control § 84.41 Quality control plans; contents. (a) Each quality control plan shall contain provisions for the...

  12. 42 CFR 84.41 - Quality control plans; contents.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 42 Public Health 1 2012-10-01 2012-10-01 false Quality control plans; contents. 84.41 Section 84... AND HEALTH RESEARCH AND RELATED ACTIVITIES APPROVAL OF RESPIRATORY PROTECTIVE DEVICES Quality Control § 84.41 Quality control plans; contents. (a) Each quality control plan shall contain provisions for the...

  13. 42 CFR 84.41 - Quality control plans; contents.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 42 Public Health 1 2013-10-01 2013-10-01 false Quality control plans; contents. 84.41 Section 84... AND HEALTH RESEARCH AND RELATED ACTIVITIES APPROVAL OF RESPIRATORY PROTECTIVE DEVICES Quality Control § 84.41 Quality control plans; contents. (a) Each quality control plan shall contain provisions for the...

  14. 42 CFR 84.41 - Quality control plans; contents.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 1 2010-10-01 2010-10-01 false Quality control plans; contents. 84.41 Section 84... AND HEALTH RESEARCH AND RELATED ACTIVITIES APPROVAL OF RESPIRATORY PROTECTIVE DEVICES Quality Control § 84.41 Quality control plans; contents. (a) Each quality control plan shall contain provisions for the...

  15. 42 CFR 84.41 - Quality control plans; contents.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 42 Public Health 1 2011-10-01 2011-10-01 false Quality control plans; contents. 84.41 Section 84... AND HEALTH RESEARCH AND RELATED ACTIVITIES APPROVAL OF RESPIRATORY PROTECTIVE DEVICES Quality Control § 84.41 Quality control plans; contents. (a) Each quality control plan shall contain provisions for the...

  16. Instrument Quality Control.

    PubMed

    Jayakody, Chatura; Hull-Ryde, Emily A

    2016-01-01

    Well-defined quality control (QC) processes are used to determine whether a certain procedure or action conforms to a widely accepted standard and/or set of guidelines, and are important components of any laboratory quality assurance program (Popa-Burke et al., J Biomol Screen 14: 1017-1030, 2009). In this chapter, we describe QC procedures useful for monitoring the accuracy and precision of laboratory instrumentation, most notably automated liquid dispensers. Two techniques, gravimetric QC and photometric QC, are highlighted in this chapter. When used together, these simple techniques provide a robust process for evaluating liquid handler accuracy and precision, and critically underpin high-quality research programs.
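    As a hedged sketch of the gravimetric QC arithmetic described above, the snippet below converts weighed dispenses to volumes and reports accuracy (relative error against the programmed target) and precision (%CV); the target volume, density and masses are illustrative assumptions.

        # Gravimetric QC for an automated liquid dispenser: accuracy and precision (illustrative numbers).
        import numpy as np

        target_ul = 50.0
        density_g_per_ul = 0.000998                    # water at ~21 C, approximate
        masses_g = np.array([0.04978, 0.05012, 0.04991, 0.05005, 0.04969, 0.05020])

        volumes_ul = masses_g / density_g_per_ul
        accuracy_pct = 100 * (volumes_ul.mean() - target_ul) / target_ul
        cv_pct = 100 * volumes_ul.std(ddof=1) / volumes_ul.mean()
        print(f"mean volume = {volumes_ul.mean():.2f} uL, accuracy = {accuracy_pct:+.2f}%, CV = {cv_pct:.2f}%")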

  17. 14 CFR 145.211 - Quality control system.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 14 Aeronautics and Space 3 2011-01-01 2011-01-01 false Quality control system. 145.211 Section 145...) SCHOOLS AND OTHER CERTIFICATED AGENCIES REPAIR STATIONS Operating Rules § 145.211 Quality control system. (a) A certificated repair station must establish and maintain a quality control system acceptable to...

  18. 18 CFR 12.40 - Quality control programs.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 18 Conservation of Power and Water Resources 1 2011-04-01 2011-04-01 false Quality control... PROJECT WORKS Other Responsibilities of Applicant or Licensee § 12.40 Quality control programs. (a... meeting any requirements or standards set by the Regional Engineer. If a quality control program is...

  19. 18 CFR 12.40 - Quality control programs.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 18 Conservation of Power and Water Resources 1 2010-04-01 2010-04-01 false Quality control... PROJECT WORKS Other Responsibilities of Applicant or Licensee § 12.40 Quality control programs. (a... meeting any requirements or standards set by the Regional Engineer. If a quality control program is...

  20. 14 CFR 145.211 - Quality control system.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 14 Aeronautics and Space 3 2010-01-01 2010-01-01 false Quality control system. 145.211 Section 145...) SCHOOLS AND OTHER CERTIFICATED AGENCIES REPAIR STATIONS Operating Rules § 145.211 Quality control system. (a) A certificated repair station must establish and maintain a quality control system acceptable to...

  1. GCE Data Toolbox for MATLAB - a software framework for automating environmental data processing, quality control and documentation

    NASA Astrophysics Data System (ADS)

    Sheldon, W.; Chamblee, J.; Cary, R. H.

    2013-12-01

    Environmental scientists are under increasing pressure from funding agencies and journal publishers to release quality-controlled data in a timely manner, as well as to produce comprehensive metadata for submitting data to long-term archives (e.g. DataONE, Dryad and BCO-DMO). At the same time, the volume of digital data that researchers collect and manage is increasing rapidly due to advances in high frequency electronic data collection from flux towers, instrumented moorings and sensor networks. However, few pre-built software tools are available to meet these data management needs, and those tools that do exist typically focus on part of the data management lifecycle or one class of data. The GCE Data Toolbox has proven to be both a generalized and effective software solution for environmental data management in the Long Term Ecological Research Network (LTER). This open source MATLAB software library, developed by the Georgia Coastal Ecosystems LTER program, integrates metadata capture, creation and management with data processing, quality control and analysis to support the entire data lifecycle. Raw data can be imported directly from common data logger formats (e.g. SeaBird, Campbell Scientific, YSI, Hobo), as well as delimited text files, MATLAB files and relational database queries. Basic metadata are derived from the data source itself (e.g. parsed from file headers) and by value inspection, and then augmented using editable metadata templates containing boilerplate documentation, attribute descriptors, code definitions and quality control rules. Data and metadata content, quality control rules and qualifier flags are then managed together in a robust data structure that supports database functionality and ensures data validity throughout processing. A growing suite of metadata-aware editing, quality control, analysis and synthesis tools are provided with the software to support managing data using graphical forms and command-line functions, as well as
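    The toolbox itself is MATLAB software; as a language-neutral, hedged sketch of the rule-based flagging it describes, the Python snippet below applies a range check and a simple spike check to a sensor series and attaches qualifier flags. The series, thresholds and flag names are assumptions for illustration, not the toolbox's API.

        # Rule-based QC flagging of a sensor time series (illustrative rules).
        import numpy as np

        temp_c = np.array([18.2, 18.4, 18.3, 45.0, 18.5, 18.6, -7.0, 18.4])
        flags = np.full(temp_c.shape, "ok", dtype=object)

        flags[(temp_c < 0) | (temp_c > 40)] = "out_of_range"              # range check
        jump = np.abs(np.diff(temp_c, prepend=temp_c[0])) > 10            # simple spike check
        flags[jump & (flags == "ok")] = "suspect_spike"

        for value, flag in zip(temp_c, flags):
            print(f"{value:6.1f}  {flag}")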

  2. Modal control theory and application to aircraft lateral handling qualities design

    NASA Technical Reports Server (NTRS)

    Srinathkumar, S.

    1978-01-01

    A multivariable synthesis procedure based on eigenvalue/eigenvector assignment is reviewed and is employed to develop a systematic design procedure to meet the lateral handling qualities design objectives of a fighter aircraft over a wide range of flight conditions. The closed loop modal characterization developed provides significant insight into the design process and plays a pivotal role in the synthesis of robust feedback systems. The simplicity of the synthesis algorithm yields an efficient computer aided interactive design tool for flight control system synthesis.
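    As a hedged illustration of the eigenvalue-assignment step such a synthesis builds on, the sketch below places the closed-loop poles of a small state-space model using SciPy's pole-placement routine; the matrices are arbitrary placeholders, not aircraft lateral dynamics.

        # Pole (eigenvalue) placement for a toy state-space model x' = Ax + Bu.
        import numpy as np
        from scipy.signal import place_poles

        A = np.array([[0.0, 1.0, 0.0],
                      [-2.0, -0.5, 1.0],
                      [0.0, 0.2, -1.0]])
        B = np.array([[0.0, 0.0],
                      [1.0, 0.0],
                      [0.0, 1.0]])
        desired = np.array([-2.0, -1.5 + 1.0j, -1.5 - 1.0j])     # desired closed-loop eigenvalues

        K = place_poles(A, B, desired).gain_matrix               # state-feedback gain u = -Kx
        print("closed-loop eigenvalues:", np.linalg.eigvals(A - B @ K))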

  3. Experience-based quality control of clinical intensity-modulated radiotherapy planning.

    PubMed

    Moore, Kevin L; Brame, R Scott; Low, Daniel A; Mutic, Sasa

    2011-10-01

    To incorporate a quality control tool, according to previous planning experience and patient-specific anatomic information, into the intensity-modulated radiotherapy (IMRT) plan generation process and to determine whether the tool improved treatment plan quality. A retrospective study of 42 IMRT plans demonstrated a correlation between the fraction of organs at risk (OARs) overlapping the planning target volume and the mean dose. This yielded a model, predicted dose = prescription dose (0.2 + 0.8 [1 - exp(-3 overlapping planning target volume/volume of OAR)]), that predicted the achievable mean doses according to the planning target volume overlap/volume of OAR and the prescription dose. The model was incorporated into the planning process by way of a user-executable script that reported the predicted dose for any OAR. The script was introduced to clinicians engaged in IMRT planning and deployed thereafter. The script's effect was evaluated by tracking δ = (mean dose-predicted dose)/predicted dose, the fraction by which the mean dose exceeded the model. All OARs under investigation (rectum and bladder in prostate cancer; parotid glands, esophagus, and larynx in head-and-neck cancer) exhibited both smaller δ and reduced variability after script implementation. These effects were substantial for the parotid glands, for which the previous δ = 0.28 ± 0.24 was reduced to δ = 0.13 ± 0.10. The clinical relevance was most evident in the subset of cases in which the parotid glands were potentially salvageable (predicted dose <30 Gy). Before script implementation, an average of 30.1 Gy was delivered to the salvageable cases, with an average predicted dose of 20.3 Gy. After implementation, an average of 18.7 Gy was delivered to salvageable cases, with an average predicted dose of 17.2 Gy. In the prostate cases, the rectum model excess was reduced from δ = 0.28 ± 0.20 to δ = 0.07 ± 0.15. On surveying dosimetrists at the end of the study, most reported that the script
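    The sketch below is a direct transcription of the overlap-based mean-dose model quoted in the abstract, together with the delta metric used to track plans; the numerical inputs in the example are illustrative, not study data.

        # Overlap-based predicted mean dose and the delta metric from the abstract (illustrative inputs).
        import math

        def predicted_mean_dose(prescription_dose, overlap_volume, oar_volume):
            """Predicted achievable OAR mean dose from the fractional PTV overlap."""
            overlap_fraction = overlap_volume / oar_volume
            return prescription_dose * (0.2 + 0.8 * (1 - math.exp(-3 * overlap_fraction)))

        def delta(mean_dose, predicted_dose):
            """Fraction by which the achieved mean dose exceeds the model prediction."""
            return (mean_dose - predicted_dose) / predicted_dose

        pred = predicted_mean_dose(prescription_dose=70.0, overlap_volume=5.0, oar_volume=50.0)
        print(f"predicted mean dose = {pred:.1f} Gy, delta = {delta(25.0, pred):+.2f}")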

  4. A flexible tool for hydraulic and water quality performance analysis of green infrastructure

    NASA Astrophysics Data System (ADS)

    Massoudieh, A.; Alikhani, J.

    2017-12-01

    Models that allow for design considerations of green infrastructure (GI) practices to control stormwater runoff and associated contaminants have received considerable attention in recent years. To be used to evaluate the effect of design configurations on the long-term performance of GIs, models should be able to represent processes within GIs with good fidelity. In this presentation, a sophisticated yet flexible tool for hydraulic and water quality assessment of GIs will be introduced. The tool can be used by design engineers and researchers to capture and explore the effect of design factors and of the properties of the media employed on the performance of GI systems at a relatively small scale. We deemed it essential to have a flexible GI modeling tool that is capable of accurately simulating GI system components and the specific biogeochemical processes affecting contaminants, such as evapotranspiration, plant uptake, reactions, and particle-associated transport, while maintaining a high degree of flexibility to account for the myriad of GI alternatives. The mathematical framework for a stand-alone GI performance assessment tool has been developed and will be demonstrated. The process-based model framework developed here can be used to model a diverse range of GI practices such as stormwater ponds, green roofs, retention ponds, bioretention systems, infiltration trenches, permeable pavement and other custom-designed combinatory systems. An example of the application of the system to evaluate the performance of a rain-garden system will be demonstrated.

  5. Intelligent Monitoring? Assessing the ability of the Care Quality Commission's statistical surveillance tool to predict quality and prioritise NHS hospital inspections.

    PubMed

    Griffiths, Alex; Beaussier, Anne-Laure; Demeritt, David; Rothstein, Henry

    2017-02-01

    The Care Quality Commission (CQC) is responsible for ensuring the quality of the health and social care delivered by more than 30 000 registered providers in England. With only limited resources for conducting on-site inspections, the CQC has used statistical surveillance tools to help it identify which providers it should prioritise for inspection. In the face of planned funding cuts, the CQC plans to put more reliance on statistical surveillance tools to assess risks to quality and prioritise inspections accordingly. To evaluate the ability of the CQC's latest surveillance tool, Intelligent Monitoring (IM), to predict the quality of care provided by National Health Service (NHS) hospital trusts so that those at greatest risk of providing poor-quality care can be identified and targeted for inspection. The predictive ability of the IM tool is evaluated through regression analyses and χ2 testing of the relationship between the quantitative risk score generated by the IM tool and the subsequent quality rating awarded following detailed on-site inspection by large expert teams of inspectors. First, the continuous risk scores generated by the CQC's IM statistical surveillance tool cannot predict inspection-based quality ratings of NHS hospital trusts (OR 0.38 (0.14 to 1.05) for Outstanding/Good, OR 0.94 (0.80 to 1.10) for Good/Requires improvement, and OR 0.90 (0.76 to 1.07) for Requires improvement/Inadequate). Second, the risk scores cannot be used more simply to distinguish the trusts performing poorly (those subsequently rated either 'Requires improvement' or 'Inadequate') from the trusts performing well (those subsequently rated either 'Good' or 'Outstanding') (OR 1.07 (0.91 to 1.26)). Classifying CQC's risk bandings 1-3 as high risk and 4-6 as low risk, 11 of the high risk trusts were performing well and 43 of the low risk trusts were performing poorly, resulting in an overall accuracy rate of 47.6%. Third, the risk scores cannot be used even more simply to

  6. Tools for developing a quality management program: proactive tools (process mapping, value stream mapping, fault tree analysis, and failure mode and effects analysis).

    PubMed

    Rath, Frank

    2008-01-01

    This article examines the concepts of quality management (QM) and quality assurance (QA), as well as the current state of QM and QA practices in radiotherapy. A systematic approach incorporating a series of industrial engineering-based tools is proposed, which can be applied in health care organizations proactively to improve process outcomes, reduce risk and/or improve patient safety, improve through-put, and reduce cost. This tool set includes process mapping and process flowcharting, failure modes and effects analysis (FMEA), value stream mapping, and fault tree analysis (FTA). Many health care organizations do not have experience in applying these tools and therefore do not understand how and when to use them. As a result there are many misconceptions about how to use these tools, and they are often incorrectly applied. This article describes these industrial engineering-based tools and also how to use them, when they should be used (and not used), and the intended purposes for their use. In addition the strengths and weaknesses of each of these tools are described, and examples are given to demonstrate the application of these tools in health care settings.
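    As a hedged sketch of the basic FMEA arithmetic mentioned above, the snippet below ranks failure modes by risk priority number (RPN = severity x occurrence x detectability); the failure modes and scores are invented for illustration.

        # Rank failure modes by RPN = severity * occurrence * detectability (illustrative scores).
        failure_modes = [
            # (description, severity 1-10, occurrence 1-10, detectability 1-10)
            ("wrong patient chart pulled",  9, 2, 4),
            ("treatment field mislabeled",  8, 3, 3),
            ("dose transcription error",   10, 2, 6),
        ]

        ranked = sorted(failure_modes, key=lambda m: m[1] * m[2] * m[3], reverse=True)
        for desc, s, o, d in ranked:
            print(f"RPN={s * o * d:4d}  {desc}")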

  7. Development and validation of a tool to evaluate the quality of medical education websites in pathology.

    PubMed

    Alyusuf, Raja H; Prasad, Kameshwar; Abdel Satir, Ali M; Abalkhail, Ali A; Arora, Roopa K

    2013-01-01

    The exponential growth in the use of the internet as a learning resource, coupled with the varied quality of many websites, leads to a need to identify suitable websites for teaching purposes. The aim of this study is to develop and validate a tool that evaluates the quality of undergraduate medical education websites, and to apply it to the field of pathology. A tool was devised through several steps of item generation, reduction, weightage, pilot testing, post-pilot modification of the tool and validation of the tool. Tool validation included measurement of inter-observer reliability and generation of criterion-related, construct-related and content-related validity. The validated tool was subsequently tested by applying it to a population of pathology websites. Reliability testing showed high internal consistency reliability (Cronbach's alpha = 0.92), high inter-observer reliability (Pearson's correlation r = 0.88), intraclass correlation coefficient = 0.85 and κ = 0.75. It showed high criterion-related, construct-related and content-related validity. The tool showed moderately high concordance with the gold standard (κ = 0.61); 92.2% sensitivity, 67.8% specificity, 75.6% positive predictive value and 88.9% negative predictive value. The validated tool was applied to 278 websites; 29.9% were rated as recommended, 41.0% as recommended with caution and 29.1% as not recommended. A systematic tool was devised to evaluate the quality of websites for medical education purposes. The tool was shown to yield reliable and valid inferences through its application to pathology websites.
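    For readers unfamiliar with the 2x2 diagnostic quantities reported above, the sketch below shows how sensitivity, specificity, PPV and NPV follow from a confusion matrix; the counts are hypothetical and are not the study's data.

        # Sensitivity, specificity, PPV and NPV from a 2x2 confusion matrix (hypothetical counts).
        tp, fp, fn, tn = 83, 29, 7, 61

        sensitivity = tp / (tp + fn)
        specificity = tn / (tn + fp)
        ppv = tp / (tp + fp)
        npv = tn / (tn + fn)
        print(f"sensitivity={sensitivity:.1%} specificity={specificity:.1%} PPV={ppv:.1%} NPV={npv:.1%}")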

  8. Test Analysis Tools to Ensure Higher Quality of On-Board Real Time Software for Space Applications

    NASA Astrophysics Data System (ADS)

    Boudillet, O.; Mescam, J.-C.; Dalemagne, D.

    2008-08-01

    EADS Astrium Space Transportation, in its Les Mureaux premises, is responsible for the onboard SW of the French M51 nuclear deterrent missile. It has also developed more than one million lines of code, mostly in Ada, for the Automated Transfer Vehicle (ATV) onboard SW and for the flight control SW of the ARIANE5 launcher that put the ATV into orbit. As part of the ATV SW, ASTRIUM ST developed the first Category A SW ever qualified for a European space application. To ensure that all this embedded SW is developed to the highest level of quality and reliability, specific development tools have been designed to cover the steps of source code verification, automated validation testing and complete target instruction coverage verification. Three such dedicated tools are presented here.

  9. Standard Free Droplet Digital Polymerase Chain Reaction as a New Tool for the Quality Control of High-Capacity Adenoviral Vectors in Small-Scale Preparations

    PubMed Central

    Boehme, Philip; Stellberger, Thorsten; Solanki, Manish; Zhang, Wenli; Schulz, Eric; Bergmann, Thorsten; Liu, Jing; Doerner, Johannes; Baiker, Armin E.

    2015-01-01

    High-capacity adenoviral vectors (HCAdVs) are promising tools for gene therapy as well as for genetic engineering. However, one limitation of the HCAdV vector system is the complex, time-consuming, and labor-intensive production process and the subsequent quality control procedure. Since HCAdVs are deleted for all viral coding sequences, a helper virus (HV) is needed in the production process to provide the sequences for all viral proteins in trans. For the purification of HCAdV, cesium chloride density gradient centrifugation is usually performed, followed by buffer exchange using dialysis or comparable methods. However, performing these steps is technically difficult, potentially error-prone, and not scalable. Here, we establish a new protocol for small-scale production of HCAdV based on commercially available adenovirus purification systems and a standard method for the quality control of final HCAdV preparations. For titration of final vector preparations, we established a droplet digital polymerase chain reaction (ddPCR) that uses a standard-free end-point PCR in small droplets of defined volume. By using different probes, this method is capable of detecting and quantifying HCAdV and HV in one reaction independently of reference material, rendering this method attractive for accurately comparing viral titers between different laboratories. In summary, we demonstrate that it is possible to produce HCAdV at small scale in sufficient quality and quantity to perform experiments in cell culture, and we established a reliable protocol for vector titration based on ddPCR. Our method significantly reduces the time and equipment required to perform HCAdV production. In the future, the ddPCR technology could be advantageous for titration of other viral vectors commonly used in gene therapy. PMID:25640117
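    The reason ddPCR can be standard-free is that the target concentration follows directly from the fraction of positive droplets via Poisson statistics. A minimal sketch of that calculation is given below; the droplet volume is an assumed placeholder and should be replaced by the value for the actual instrument.

      # Poisson quantification behind droplet digital PCR: no calibration curve needed.
      import math

      def ddpcr_copies_per_ul(positive_droplets, total_droplets, droplet_volume_nl=0.85):
          p = positive_droplets / total_droplets
          lam = -math.log(1.0 - p)                    # mean target copies per droplet
          return (lam / droplet_volume_nl) * 1000.0   # copies per microlitre of reaction

      # HCAdV- and HV-specific probes counted in the same reaction (illustrative numbers):
      print(ddpcr_copies_per_ul(positive_droplets=4200, total_droplets=15000))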

  10. Rigidity controllable polishing tool based on magnetorheological effect

    NASA Astrophysics Data System (ADS)

    Wang, Jia; Wan, Yongjian; Shi, Chunyan

    2012-10-01

    A stable and predictable material removal function (MRF) plays a crucial role in computer controlled optical surfacing (CCOS). In the physical-contact polishing case, the stability of the MRF depends on intimate contact between the polishing interface and the workpiece. Rigid laps maintain this contact when polishing spherical surfaces, whose curvature does not vary with position on the surface. Such rigid laps provide a smoothing effect for mid-spatial-frequency errors, but cannot be used on aspherical surfaces because they would destroy the surface figure. Flexible tools such as magnetorheological fluid or air bonnets conform to the surface [1]; they lack rigidity and provide little natural smoothing effect. We present a rigidity-controllable polishing tool that uses a magnetorheological elastomer (MRE) medium [2]. It provides the ability both to conform to the aspheric surface and to maintain a natural smoothing effect. Moreover, its rigidity can be controlled by the applied magnetic field. This paper presents the design, analysis, and stiffness variation mechanism model of such a polishing tool [3].
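    As a hedged aside (not stated in the abstract), a common first-order model of material removal in contact polishing is Preston's law, dh/dt = k·P·V, so a stable removal function amounts to keeping contact pressure and relative velocity predictable over the dwell time. The sketch below evaluates that relation with an illustrative Preston coefficient.

      # Preston's-law sketch: removal depth for constant pressure and relative velocity.
      # The coefficient k and the operating values are illustrative placeholders.
      def preston_removal(pressure_pa, velocity_m_s, dwell_s, k=1e-13):
          """Removal depth in metres."""
          return k * pressure_pa * velocity_m_s * dwell_s

      # e.g. 10 kPa contact pressure, 0.5 m/s relative speed, 2 s dwell time:
      print(preston_removal(1e4, 0.5, 2.0))   # ~1e-9 m, i.e. about a nanometre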

  11. Quality Control in construction.

    DTIC Science & Technology

    1984-01-01

    behavioral scientists. In 1962, Dr. Kaoru Ishikawa gave shape to the form of training which featured intradepartmental groups of ten or so workers seated...and Japanese circles bears closer scrutiny. 4.3.1 Japanese Ingredients of Quality The founder of quality circles, Dr. Kaoru Ishikawa, gives six...around a table; hence the name Quality Control Circle. Dr. Ishikawa was an engineering professor at Tokyo University, and the circles were

  12. Quality Assessment of Comparative Diagnostic Accuracy Studies: Our Experience Using a Modified Version of the QUADAS-2 Tool

    ERIC Educational Resources Information Center

    Wade, Ros; Corbett, Mark; Eastwood, Alison

    2013-01-01

    Assessing the quality of included studies is a vital step in undertaking a systematic review. The recently revised Quality Assessment of Diagnostic Accuracy Studies (QUADAS) tool (QUADAS-2), which is the only validated quality assessment tool for diagnostic accuracy studies, does not include specific criteria for assessing comparative studies. As…

  13. Discussion of the quality control and performance testing of ultrasound diagnostic equipment

    NASA Astrophysics Data System (ADS)

    Jiang, Junjie

    2018-03-01

    In recent years, with the rapid development of ultrasonography and the application and popularization of new technologies in ultrasound equipment, the diagnostic information available to doctors has improved continuously, and ultrasound has become an indispensable diagnostic tool for medical institutions. Because the performance of the equipment is directly related to the doctor’s diagnosis and the patient’s health, it is very important to choose a sound method for quality control and performance testing.

  14. A tool to determine financial impact of adverse events in health care: healthcare quality calculator.

    PubMed

    Yarbrough, Wendell G; Sewell, Andrew; Tickle, Erin; Rhinehardt, Eric; Harkleroad, Rod; Bennett, Marc; Johnson, Deborah; Wen, Li; Pfeiffer, Matthew; Benegas, Manny; Morath, Julie

    2014-12-01

    Hospital leaders lack tools to determine the financial impact of poor patient outcomes and adverse events. To provide health-care leaders with decision support for investments to improve care, we created a tool, the Healthcare Quality Calculator (HQCal), which uses institution-specific financial data to calculate impact of poor patient outcomes or quality improvement on present and future margin. Excel and Web-based versions of the HQCal were based on a cohort study framework and created with modular components including major drivers of cost and reimbursement. The Healthcare Quality Calculator (HQCal) compares payment, cost, and profit/loss for patients with and without poor outcomes or quality issues. Cost and payment information for groups with and without quality issues are used by the HQCal to calculate profit or loss. Importantly, institution-specific payment and cost data are used to calculate financial impact and attributable cost associated with poor patient outcomes, adverse events, or quality issues. Because future cost and reimbursement changes can be forecast, the HQCal incorporates a forward-looking component. The flexibility of the HQCal was demonstrated using surgical site infections after abdominal surgery and postoperative surgical airway complications. The Healthcare Quality Calculator determines financial impact of poor patient outcomes and the benefit of initiatives to improve quality. The calculator can identify quality issues that would provide the largest financial benefit if improved; however, it cannot identify specific interventions. The calculator provides a tool to improve transparency regarding both short- and long-term financial consequences of funding, or failing to fund, initiatives to close gaps in quality or improve patient outcomes.
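    The core of the calculator is a cohort-style margin comparison: institution-specific payment and cost for cases with and without the quality issue give per-group margins, and the difference is the attributable financial impact. The sketch below shows that arithmetic with hypothetical figures; the field names and numbers are placeholders, not HQCal inputs.

      # Cohort-style margin comparison with hypothetical, institution-specific figures.
      def margin(payment, cost):
          return payment - cost

      def attributable_impact(with_issue, without_issue):
          m_with, m_without = margin(**with_issue), margin(**without_issue)
          return {"margin_with_issue": m_with,
                  "margin_without_issue": m_without,
                  "attributable_loss_per_case": m_without - m_with}

      print(attributable_impact(
          with_issue={"payment": 28000, "cost": 31000},      # e.g. case with surgical site infection
          without_issue={"payment": 26000, "cost": 20000},
      ))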

  15. 30 CFR 74.6 - Quality control.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 30 Mineral Resources 1 2013-07-01 2013-07-01 false Quality control. 74.6 Section 74.6 Mineral... control. The applicant shall describe the way in which each lot of components will be sampled and tested... of the CMDPSU will be maintained in production through adequate quality control procedures, MSHA and...

  16. 30 CFR 74.6 - Quality control.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 30 Mineral Resources 1 2014-07-01 2014-07-01 false Quality control. 74.6 Section 74.6 Mineral... control. The applicant shall describe the way in which each lot of components will be sampled and tested... of the CMDPSU will be maintained in production through adequate quality control procedures, MSHA and...

  17. 30 CFR 74.6 - Quality control.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 30 Mineral Resources 1 2012-07-01 2012-07-01 false Quality control. 74.6 Section 74.6 Mineral... control. The applicant shall describe the way in which each lot of components will be sampled and tested... of the CMDPSU will be maintained in production through adequate quality control procedures, MSHA and...

  18. Helicopter mathematical models and control law development for handling qualities research

    NASA Technical Reports Server (NTRS)

    Chen, Robert T. N.; Lebacqz, J. Victor; Aiken, Edwin W.; Tischler, Mark B.

    1988-01-01

    Progress made in joint NASA/Army research concerning rotorcraft flight-dynamics modeling, design methodologies for rotorcraft flight-control laws, and rotorcraft parameter identification is reviewed. Research into these interactive disciplines is needed to develop the analytical tools necessary to conduct flying qualities investigations using both the ground-based and in-flight simulators, and to permit an efficient means of performing flight test evaluation of rotorcraft flying qualities for specification compliance. The need for the research is particularly acute for rotorcraft because of their mathematical complexity, high order dynamic characteristics, and demanding mission requirements. The research in rotorcraft flight-dynamics modeling is pursued along two general directions: generic nonlinear models and nonlinear models for specific rotorcraft. In addition, linear models are generated that extend their utilization from 1-g flight to high-g maneuvers and expand their frequency range of validity for the design analysis of high-gain flight control systems. A variety of methods ranging from classical frequency-domain approaches to modern time-domain control methodology that are used in the design of rotorcraft flight control laws is reviewed. Also reviewed is a study conducted to investigate the design details associated with high-gain, digital flight control systems for combat rotorcraft. Parameter identification techniques developed for rotorcraft applications are reviewed.

  19. E-nursing documentation as a tool for quality assurance.

    PubMed

    Rajkovic, Vladislav; Sustersic, Olga; Rajkovic, Uros

    2006-01-01

    The article presents the results of a project in which we describe the reengineering of nursing documentation. Documentation in nursing is an efficient tool for ensuring quality health care and consequently quality patient treatment along the whole clinical path. We have taken into account the nursing process and patient treatment based on Henderson's theoretical model of nursing, which consists of 14 basic living activities. The model of new documentation enables tracing, transparency, selectivity, monitoring and analyses. All these factors lead to improvements of the health system as well as to improved safety of patients and members of nursing teams. The documentation was developed for three health care segments: the secondary and tertiary level, dispensaries and community health care. The new quality introduced to the documentation process by information and communication technology is presented through a database model and a software prototype for managing documentation.

  20. High-volume image quality assessment systems: tuning performance with an interactive data visualization tool

    NASA Astrophysics Data System (ADS)

    Bresnahan, Patricia A.; Pukinskis, Madeleine; Wiggins, Michael

    1999-03-01

    evaluation. Lower-level pass-fail conditions and decision rules were coded into the system. Higher-level image quality states were defined by allowing the users to interactively adjust the system's sensitivity to various image attributes by manipulating graphical controls. Results were presented in easily interpreted bar graphs. These graphs were mouse-sensitive, allowing the user to more fully explore the subsets of data indicated by various color blocks. In order to simplify the performance evaluation and tuning process, users could choose to view the results of (1) the existing system parameter state, (2) the results of any arbitrary parameter values they chose, or (3) the results of a quasi-optimum parameter state, derived by applying a decision rule to a large set of possible parameter states. Giving managers easy-to-use tools for defining the more subjective aspects of quality resulted in a system that responded to contextual cues that are difficult to hard-code. It had the additional advantage of allowing the definition of quality to evolve over time, as users became more knowledgeable as to the strengths and limitations of an automated quality inspection system.

  1. Quality Assurance and Quality Control, Part 2.

    PubMed

    Akers, Michael J

    2015-01-01

    The tragedy surrounding the New England Compounding Center and contaminated steroid syringe preparations clearly points out what can happen if quality-assurance and quality-control procedures are not strictly practiced in the compounding of sterile preparations. This article is part 2 of a two-part article on requirements to comply with United States Pharmacopeia general chapters <797> and <1163> with respect to quality assurance of compounded sterile preparations. Part 1 covered documentation requirements, inspection procedures, compounding accuracy checks, and part of a discussion on bacterial endotoxin testing. Part 2 covers sterility testing, the completion of the part 1 discussion of bacterial endotoxin testing, a brief discussion of United States Pharmacopeia <1163>, and advances in pharmaceutical quality systems.

  2. A tool for assessment of heart failure prescribing quality: A systematic review and meta-analysis.

    PubMed

    El Hadidi, Seif; Darweesh, Ebtissam; Byrne, Stephen; Bermingham, Margaret

    2018-04-16

    Heart failure (HF) guidelines aim to standardise patient care. Internationally, prescribing practice in HF may deviate from guidelines and so a standardised tool is required to assess prescribing quality. A systematic review and meta-analysis were performed to identify a quantitative tool for measuring adherence to HF guidelines and its clinical implications. Eleven electronic databases were searched to include studies reporting a comprehensive tool for measuring adherence to prescribing guidelines in HF patients aged ≥18 years. Qualitative studies or studies measuring prescription rates alone were excluded. Study quality was assessed using the Good ReseArch for Comparative Effectiveness Checklist. In total, 2455 studies were identified. Sixteen eligible full-text articles were included (n = 14 354 patients, mean age 69 ± 8 y). The Guideline Adherence Index (GAI), and its modified versions, was the most frequently cited tool (n = 13). Other tools identified were the Individualised Reconciled Evidence Recommendations, the Composite Heart Failure Performance, and the Heart Failure Scale. The meta-analysis included the GAI studies of good to high quality. The average GAI-3 was 62%. Compared to low GAI, high GAI patients had lower mortality rate (7.6% vs 33.9%) and lower rehospitalisation rates (23.5% vs 24.5%); both P ≤ .05. High GAI was associated with reduced risk of mortality (hazard ratio = 0.29, 95% confidence interval, 0.06-0.51) and rehospitalisation (hazard ratio = 0.64, 95% confidence interval, 0.41-1.00). No tool was used to improve prescribing quality. The GAI is the most frequently used tool to assess guideline adherence in HF. High GAI is associated with improved HF outcomes. Copyright © 2018 John Wiley & Sons, Ltd.
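    The Guideline Adherence Index is essentially the proportion of guideline-indicated drug classes that a patient is actually prescribed. The sketch below illustrates a GAI-3-style calculation; the three core heart failure drug classes listed are an assumption for illustration, so consult the underlying GAI papers for the exact definition.

      # GAI-style adherence score: prescribed / indicated among a small set of core classes.
      # CORE_CLASSES is an assumed illustrative set, not necessarily the published GAI-3 definition.
      CORE_CLASSES = {"ACEI_or_ARB", "beta_blocker", "MRA"}

      def gai(indicated, prescribed):
          indicated = set(indicated) & CORE_CLASSES
          if not indicated:
              return None                      # not assessable for this patient
          return len(set(prescribed) & indicated) / len(indicated)

      # All three classes indicated, two prescribed -> 0.67
      print(gai(indicated=CORE_CLASSES, prescribed={"ACEI_or_ARB", "beta_blocker"}))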

  3. Guidance for Efficient Small Animal Imaging Quality Control.

    PubMed

    Osborne, Dustin R; Kuntner, Claudia; Berr, Stuart; Stout, David

    2017-08-01

    Routine quality control is a critical aspect of properly maintaining high-performance small animal imaging instrumentation. A robust quality control program helps produce more reliable data both for academic purposes and as proof of system performance for contract imaging work. For preclinical imaging laboratories, the combination of costs and available resources often limits their ability to produce efficient and effective quality control programs. This work presents a series of simplified quality control procedures that are accessible to a wide range of preclinical imaging laboratories. Our intent is to provide minimum guidelines for routine quality control that can assist preclinical imaging specialists in setting up an appropriate quality control program for their facility.

  4. 42 CFR 84.256 - Quality control requirements.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 42 Public Health 1 2013-10-01 2013-10-01 false Quality control requirements. 84.256 Section 84.256... § 84.256 Quality control requirements. (a) In addition to the construction and performance requirements specified in §§ 84.251, 84.252, 84.253, 84.254, and 84.255, the quality control requirements in paragraphs...

  5. 42 CFR 84.256 - Quality control requirements.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 42 Public Health 1 2012-10-01 2012-10-01 false Quality control requirements. 84.256 Section 84.256... § 84.256 Quality control requirements. (a) In addition to the construction and performance requirements specified in §§ 84.251, 84.252, 84.253, 84.254, and 84.255, the quality control requirements in paragraphs...

  6. 42 CFR 84.256 - Quality control requirements.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 1 2010-10-01 2010-10-01 false Quality control requirements. 84.256 Section 84.256... § 84.256 Quality control requirements. (a) In addition to the construction and performance requirements specified in §§ 84.251, 84.252, 84.253, 84.254, and 84.255, the quality control requirements in paragraphs...

  7. 42 CFR 84.256 - Quality control requirements.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 42 Public Health 1 2011-10-01 2011-10-01 false Quality control requirements. 84.256 Section 84.256... § 84.256 Quality control requirements. (a) In addition to the construction and performance requirements specified in §§ 84.251, 84.252, 84.253, 84.254, and 84.255, the quality control requirements in paragraphs...

  8. 42 CFR 84.256 - Quality control requirements.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 42 Public Health 1 2014-10-01 2014-10-01 false Quality control requirements. 84.256 Section 84.256... § 84.256 Quality control requirements. (a) In addition to the construction and performance requirements specified in §§ 84.251, 84.252, 84.253, 84.254, and 84.255, the quality control requirements in paragraphs...

  9. Artificial intelligence-based computer modeling tools for controlling slag foaming in electric arc furnaces

    NASA Astrophysics Data System (ADS)

    Wilson, Eric Lee

    Due to increased competition in a world economy, steel companies are currently interested in developing techniques that will allow for the improvement of the steelmaking process, either by increasing output efficiency or by improving the quality of their product, or both. Slag foaming is one practice that has been shown to contribute to both these goals. However, slag foaming is highly dynamic and difficult to model or control. This dissertation describes an effort to use artificial intelligence-based tools (genetic algorithms, fuzzy logic, and neural networks) to both model and control the slag foaming process. Specifically, a neural network is trained and tested on slag foaming data provided by a steel plant. This neural network model is then controlled by a fuzzy logic controller, which in turn is optimized by a genetic algorithm. This tuned controller is then installed at a steel plant and shown to be a more efficient slag foaming controller than the one previously used by the plant.

  10. International Society of Human and Animal Mycology (ISHAM)-ITS reference DNA barcoding database--the quality controlled standard tool for routine identification of human and animal pathogenic fungi.

    PubMed

    Irinyi, Laszlo; Serena, Carolina; Garcia-Hermoso, Dea; Arabatzis, Michael; Desnos-Ollivier, Marie; Vu, Duong; Cardinali, Gianluigi; Arthur, Ian; Normand, Anne-Cécile; Giraldo, Alejandra; da Cunha, Keith Cassia; Sandoval-Denis, Marcelo; Hendrickx, Marijke; Nishikaku, Angela Satie; de Azevedo Melo, Analy Salles; Merseguel, Karina Bellinghausen; Khan, Aziza; Parente Rocha, Juliana Alves; Sampaio, Paula; da Silva Briones, Marcelo Ribeiro; e Ferreira, Renata Carmona; de Medeiros Muniz, Mauro; Castañón-Olivares, Laura Rosio; Estrada-Barcenas, Daniel; Cassagne, Carole; Mary, Charles; Duan, Shu Yao; Kong, Fanrong; Sun, Annie Ying; Zeng, Xianyu; Zhao, Zuotao; Gantois, Nausicaa; Botterel, Françoise; Robbertse, Barbara; Schoch, Conrad; Gams, Walter; Ellis, David; Halliday, Catriona; Chen, Sharon; Sorrell, Tania C; Piarroux, Renaud; Colombo, Arnaldo L; Pais, Célia; de Hoog, Sybren; Zancopé-Oliveira, Rosely Maria; Taylor, Maria Lucia; Toriello, Conchita; de Almeida Soares, Célia Maria; Delhaes, Laurence; Stubbe, Dirk; Dromer, Françoise; Ranque, Stéphane; Guarro, Josep; Cano-Lira, Jose F; Robert, Vincent; Velegraki, Aristea; Meyer, Wieland

    2015-05-01

    Human and animal fungal pathogens are a growing threat worldwide, leading to emerging infections and creating new risks for established ones. There is a growing need for rapid and accurate identification of pathogens to enable early diagnosis and targeted antifungal therapy. Morphological and biochemical identification methods are time-consuming and require trained experts. Alternatively, molecular methods such as DNA barcoding, a powerful and easy tool for rapid monophasic identification, offer a practical approach for species identification that is less demanding in terms of taxonomical expertise. However, its widespread use is still limited by a lack of quality-controlled reference databases and the evolving recognition and definition of new fungal species/complexes. An international consortium of medical mycology laboratories was formed with the aim of establishing a quality-controlled ITS database under the umbrella of the ISHAM working group on "DNA barcoding of human and animal pathogenic fungi." A new database, containing 2800 ITS sequences representing 421 fungal species and providing the medical community with a freely accessible tool at http://www.isham.org/ and http://its.mycologylab.org/ to rapidly and reliably identify most agents of mycoses, was established. The sequences included in the new database were used to evaluate the variation and overall utility of the ITS region for the identification of pathogenic fungi at the intra- and interspecies levels. The average intraspecies variation ranged from 0 to 2.25%. This highlighted selected pathogenic fungal species, such as the dermatophytes and emerging yeasts, for which additional molecular methods/genetic markers are required for reliable identification from clinical and veterinary specimens. © The Author 2015. Published by Oxford University Press on behalf of The International Society for Human and Animal Mycology. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  11. [Tools to enhance the quality and transparency of health research reports: reporting guidelines].

    PubMed

    Galvão, Taís Freire; Silva, Marcus Tolentino; Garcia, Leila Posenato

    2016-01-01

    Scientific writing is the cornerstone of publishing the results of research. Reporting guidelines are important tools for all those involved in the process of research production and report writing. These guidelines detail what is expected to be found in each section of a report for a given study design. The EQUATOR Network (Enhancing the QUAlity and Transparency Of health Research) is an international initiative that seeks to improve the reliability and the value of the health research literature by promoting transparent and accurate reporting and wider use of robust reporting guidelines. The use of reporting guidelines has contributed to improved reports as well as increased quality of research methods. Reporting guidelines need to be publicized in order to increase knowledge of these essential tools among health researchers. Encouraging their use by journals is key to enhancing the quality of scientific publications.

  12. Data Quality Control of the French Permanent Broadband Network in the RESIF Framework

    NASA Astrophysics Data System (ADS)

    Grunberg, Marc; Lambotte, Sophie; Engels, Fabien; Dretzen, Remi; Hernandez, Alain

    2014-05-01

    discrepancies. The results of quality control are visualized through a web interface. The latter gathers data from different information systems to provide a global view of recent events that could impact the data (such as interventions on site or seismic events). This work is still an ongoing project. We intend to add more sophisticated procedures to enhance our data quality control. Among them, we will deploy a seismic moment tensor inversion tool for amplitude, time and polarity control, and a noise correlation procedure for time-drift detection.

  13. Recommendations for the Use of E-Tools for Improvements around Assignment Marking Quality

    ERIC Educational Resources Information Center

    Heinrich, Eva; Milne, John; Ramsay, Annabel; Morrison, David

    2009-01-01

    This article reports on selected aspects of a larger study on the use of electronic tools in the context of the management and marking of assignments. The study comprised a literature review, interviews and a review of e-tools. The article briefly summarises the findings from the literature on what comprises quality in assignment marking. The…

  14. The coronary artery disease quality dashboard: a chronic care disease management tool in an electronic health record.

    PubMed

    Jung, Eunice; Schnipper, Jeffrey L; Li, Qi; Linder, Jeffrey A; Rose, Alan F; Li, Ruzhuo; Eskin, Michael S; Housman, Dan; Middleton, Blackford; Einbinder, Jonathan S

    2007-10-11

    Quality reporting tools, integrated with ambulatory electronic health records (EHRs), may help clinicians understand performance, manage populations, and improve quality. The Coronary Artery Disease Quality Dashboard (CAD QD) is a secure web report for performance measurement of a chronic care condition, delivered through a central data warehouse and a custom-built reporting tool. Pilot evaluation of the CAD Quality Dashboard indicates that clinicians prefer a quality report that not only combines structured data from EHRs but also facilitates actions to be taken on individual patients or on a population, i.e., for case management.

  15. Development and validation of a tool to evaluate the quality of medical education websites in pathology

    PubMed Central

    Alyusuf, Raja H.; Prasad, Kameshwar; Abdel Satir, Ali M.; Abalkhail, Ali A.; Arora, Roopa K.

    2013-01-01

    Background: The exponential use of the internet as a learning resource, coupled with the varied quality of many websites, leads to a need to identify suitable websites for teaching purposes. Aim: The aim of this study is to develop and validate a tool that evaluates the quality of undergraduate medical educational websites, and to apply it to the field of pathology. Methods: A tool was devised through several steps of item generation, reduction, weightage, pilot testing, post-pilot modification of the tool and validation of the tool. Tool validation included measurement of inter-observer reliability and generation of criterion-related, construct-related and content-related validity. The validated tool was subsequently tested by applying it to a population of pathology websites. Results and Discussion: Reliability testing showed high internal consistency reliability (Cronbach's alpha = 0.92), high inter-observer reliability (Pearson's correlation r = 0.88), intraclass correlation coefficient = 0.85 and κ = 0.75. It showed high criterion-related, construct-related and content-related validity. The tool showed moderately high concordance with the gold standard (κ = 0.61); 92.2% sensitivity, 67.8% specificity, 75.6% positive predictive value and 88.9% negative predictive value. The validated tool was applied to 278 websites; 29.9% were rated as recommended, 41.0% as recommended with caution and 29.1% as not recommended. Conclusion: A systematic tool was devised to evaluate the quality of websites for medical educational purposes. The tool was shown to yield reliable and valid inferences through its application to pathology websites. PMID:24392243

  16. Matlab as a robust control design tool

    NASA Technical Reports Server (NTRS)

    Gregory, Irene M.

    1994-01-01

    This presentation introduces Matlab as a tool used in flight control research. The example used to illustrate some of the capabilities of this software is a robust controller designed for a single-stage-to-orbit air-breathing vehicle's ascent to orbit. The global requirements of the controller are to stabilize the vehicle and follow a trajectory in the presence of atmospheric disturbances and strong dynamic coupling between the airframe and propulsion.

  17. FQC Dashboard: integrates FastQC results into a web-based, interactive, and extensible FASTQ quality control tool.

    PubMed

    Brown, Joseph; Pirrung, Meg; McCue, Lee Ann

    2017-06-09

    FQC is software that facilitates quality control of FASTQ files by carrying out a QC protocol using FastQC, parsing the results, and aggregating quality metrics into an interactive dashboard designed to richly summarize individual sequencing runs. The dashboard groups samples in dropdowns for navigation among the data sets, uses human-readable configuration files to manipulate the pages and tabs, and is extensible with CSV data. FQC is implemented in Python 3 and Javascript, and is maintained under an MIT license. Documentation and source code are available at: https://github.com/pnnl/fqc . joseph.brown@pnnl.gov. © The Author(s) 2017. Published by Oxford University Press.
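    To illustrate the aggregation step, the sketch below parses the tab-separated summary.txt files that FastQC writes for each sample (status, module, filename) and tabulates PASS/WARN/FAIL counts per module. This is a minimal stand-in for what a dashboard like FQC does, not FQC's own code; the file paths are placeholders.

      # Aggregate FastQC summary.txt files into per-module PASS/WARN/FAIL counts.
      from collections import Counter, defaultdict
      from pathlib import Path

      def aggregate_fastqc(summary_files):
          table = defaultdict(Counter)              # module -> Counter of statuses
          for path in summary_files:
              for line in Path(path).read_text().splitlines():
                  if not line:
                      continue
                  status, module, _filename = line.split("\t")
                  table[module][status] += 1
          return table

      for module, counts in aggregate_fastqc(["run1_fastqc/summary.txt",
                                              "run2_fastqc/summary.txt"]).items():
          print(module, dict(counts))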

  18. Indoor Air Quality Tools for Schools Program: Benefits of Improving Air Quality in the School Environment.

    ERIC Educational Resources Information Center

    Environmental Protection Agency, Washington, DC. Office of Radiation and Indoor Air.

    The U.S. Environmental Protection Agency (EPA) developed the Indoor Air Quality Tools for Schools (IAQ TfS) Program to help schools prevent, identify, and resolve their IAQ problems. This publication describes the program and its advantages, explaining that through simple, low-cost measures, schools can: reduce IAQ-related health risks and…

  19. Dust control effectiveness of drywall sanding tools.

    PubMed

    Young-Corbett, Deborah E; Nussbaum, Maury A

    2009-07-01

    In this laboratory study, four drywall sanding tools were evaluated in terms of dust generation rates in the respirable and thoracic size classes. In a repeated measures study design, 16 participants performed simulated drywall finishing tasks with each of four tools: (1) ventilated sander, (2) pole sander, (3) block sander, and (4) wet sponge. Dependent variables of interest were thoracic and respirable breathing zone dust concentrations. Analysis by Friedman's Test revealed that the ventilated drywall sanding tool produced significantly less dust, of both size classes, than did the other three tools. The pole and wet sanders produced significantly less dust of both size classes than did the block sander. The block sander, the most commonly used tool in drywall finishing operations, produced significantly more dust of both size classes than did the other three tools. When compared with the block sander, the other tools offer substantial dust reduction. The ventilated tool reduced respirable concentrations by 88% and thoracic concentrations by 85%. The pole sander reduced respirable concentrations by 58% and thoracic by 50%. The wet sander produced reductions of 60% and 47% in the respirable and thoracic classes, respectively. Wet sponge sanders and pole sanders are effective at reducing breathing-zone dust concentrations; however, based on its superior dust control effectiveness, the ventilated sander is the recommended tool for drywall finishing operations.

  20. Simulation Evaluation of Controller-Managed Spacing Tools under Realistic Operational Conditions

    NASA Technical Reports Server (NTRS)

    Callantine, Todd J.; Hunt, Sarah M.; Prevot, Thomas

    2014-01-01

    Controller-Managed Spacing (CMS) tools have been developed to aid air traffic controllers in managing high volumes of arriving aircraft according to a schedule while enabling them to fly efficient descent profiles. The CMS tools are undergoing refinement in preparation for field demonstration as part of NASA's Air Traffic Management (ATM) Technology Demonstration-1 (ATD-1). System-level ATD-1 simulations have been conducted to quantify expected efficiency and capacity gains under realistic operational conditions. This paper presents simulation results with a focus on CMS-tool human factors. The results suggest experienced controllers new to the tools find them acceptable and can use them effectively in ATD-1 operations.

  1. 7 CFR 58.928 - Quality control tests.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... quality test shall be made to determine product stability. ... 7 Agriculture 3 2013-01-01 2013-01-01 false Quality control tests. 58.928 Section 58.928... Procedures § 58.928 Quality control tests. All dairy products and other ingredients shall be subject to...

  2. 7 CFR 58.928 - Quality control tests.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... quality test shall be made to determine product stability. ... 7 Agriculture 3 2012-01-01 2012-01-01 false Quality control tests. 58.928 Section 58.928... Procedures § 58.928 Quality control tests. All dairy products and other ingredients shall be subject to...

  3. Design and evaluation of an air traffic control Final Approach Spacing Tool

    NASA Technical Reports Server (NTRS)

    Davis, Thomas J.; Erzberger, Heinz; Green, Steven M.; Nedell, William

    1991-01-01

    This paper describes the design and simulator evaluation of an automation tool for assisting terminal radar approach controllers in sequencing and spacing traffic onto the final approach course. The automation tool, referred to as the Final Approach Spacing Tool (FAST), displays speed and heading advisories for arriving aircraft as well as sequencing information on the controller's radar display. The main functional elements of FAST are a scheduler that schedules and sequences the traffic, a four-dimensional trajectory synthesizer that generates the advisories, and a graphical interface that displays the information to the controller. FAST has been implemented on a high-performance workstation. It can be operated as a stand-alone system in the terminal radar approach control facility or as an element of a system integrated with automation tools in the air route traffic control center. FAST was evaluated by experienced air traffic controllers in a real-time air traffic control simulation. Simulation results summarized in the paper show that the automation tools significantly reduced controller workload and demonstrated a potential for an increase in landing rate.

  4. User-friendly solutions for microarray quality control and pre-processing on ArrayAnalysis.org

    PubMed Central

    Eijssen, Lars M. T.; Jaillard, Magali; Adriaens, Michiel E.; Gaj, Stan; de Groot, Philip J.; Müller, Michael; Evelo, Chris T.

    2013-01-01

    Quality control (QC) is crucial for any scientific method producing data. Applying adequate QC introduces new challenges in the genomics field where large amounts of data are produced with complex technologies. For DNA microarrays, specific algorithms for QC and pre-processing including normalization have been developed by the scientific community, especially for expression chips of the Affymetrix platform. Many of these have been implemented in the statistical scripting language R and are available from the Bioconductor repository. However, application is hampered by lack of integrative tools that can be used by users of any experience level. To fill this gap, we developed a freely available tool for QC and pre-processing of Affymetrix gene expression results, extending, integrating and harmonizing functionality of Bioconductor packages. The tool can be easily accessed through a wizard-like web portal at http://www.arrayanalysis.org or downloaded for local use in R. The portal provides extensive documentation, including user guides, interpretation help with real output illustrations and detailed technical documentation. It assists newcomers to the field in performing state-of-the-art QC and pre-processing while offering data analysts an integral open-source package. Providing the scientific community with this easily accessible tool will allow improving data quality and reuse and adoption of standards. PMID:23620278

  5. A web-based screening tool for near-port air quality assessments

    PubMed Central

    Isakov, Vlad; Barzyk, Timothy M.; Smith, Elizabeth R.; Arunachalam, Saravanan; Naess, Brian; Venkatram, Akula

    2018-01-01

    The Community model for near-PORT applications (C-PORT) is a screening tool with an intended purpose of calculating differences in annual averaged concentration patterns and relative contributions of various source categories over the spatial domain within about 10 km of the port. C-PORT can inform decision-makers and concerned citizens about local air quality due to mobile source emissions related to commercial port activities. It allows users to visualize and evaluate different planning scenarios, helping them identify the best alternatives for making long-term decisions that protect community health and sustainability. The web-based, easy-to-use interface currently includes data from 21 seaports primarily in the Southeastern U.S., and has a map-based interface based on Google Maps. The tool was developed to visualize and assess changes in air quality due to changes in emissions and/or meteorology in order to analyze development scenarios, and is not intended to support or replace any regulatory models or programs. PMID:29681760

  6. Quality control quantification (QCQ): a tool to measure the value of quality control checks in radiation oncology.

    PubMed

    Ford, Eric C; Terezakis, Stephanie; Souranis, Annette; Harris, Kendra; Gay, Hiram; Mutic, Sasa

    2012-11-01

    To quantify the error-detection effectiveness of commonly used quality control (QC) measures. We analyzed incidents from 2007-2010 logged into a voluntary in-house, electronic incident learning systems at 2 academic radiation oncology clinics. None of the incidents resulted in patient harm. Each incident was graded for potential severity using the French Nuclear Safety Authority scoring scale; high potential severity incidents (score >3) were considered, along with a subset of 30 randomly chosen low severity incidents. Each report was evaluated to identify which of 15 common QC checks could have detected it. The effectiveness was calculated, defined as the percentage of incidents that each QC measure could detect, both for individual QC checks and for combinations of checks. In total, 4407 incidents were reported, 292 of which had high-potential severity. High- and low-severity incidents were detectable by 4.0 ± 2.3 (mean ± SD) and 2.6 ± 1.4 QC checks, respectively (P<.001). All individual checks were less than 50% sensitive with the exception of pretreatment plan review by a physicist (63%). An effectiveness of 97% was achieved with 7 checks used in combination and was not further improved with more checks. The combination of checks with the highest effectiveness includes physics plan review, physician plan review, Electronic Portal Imaging Device-based in vivo portal dosimetry, radiation therapist timeout, weekly physics chart check, the use of checklists, port films, and source-to-skin distance checks. Some commonly used QC checks such as pretreatment intensity modulated radiation therapy QA do not substantially add to the ability to detect errors in these data. The effectiveness of QC measures in radiation oncology depends sensitively on which checks are used and in which combinations. A small percentage of errors cannot be detected by any of the standard formal QC checks currently in broad use, suggesting that further improvements are needed. These data
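    The effectiveness figures above follow from a simple calculation: each incident is marked with the set of QC checks that could have detected it, a single check's effectiveness is the fraction of incidents it could detect, and a combination's effectiveness is the fraction detectable by at least one check in the set. The sketch below shows this with illustrative incident data, not the study's records.

      # Effectiveness of QC checks, alone and in combination (illustrative incident data).
      incidents = [
          {"physics_plan_review", "physician_plan_review"},
          {"therapist_timeout"},
          {"physics_plan_review", "weekly_chart_check"},
          set(),                                   # an incident none of the listed checks would catch
      ]

      def effectiveness(checks, incidents):
          caught = sum(1 for detectable_by in incidents if detectable_by & set(checks))
          return caught / len(incidents)

      print(effectiveness({"physics_plan_review"}, incidents))                       # 0.5
      print(effectiveness({"physics_plan_review", "therapist_timeout"}, incidents))  # 0.75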

  7. Development and Evaluation of a Pedagogical Tool to Improve Understanding of a Quality Checklist: A Randomised Controlled Trial

    PubMed Central

    Fourcade, Lola; Boutron, Isabelle; Moher, David; Ronceray, Lucie; Baron, Gabriel; Ravaud, Philippe

    2007-01-01

    Objective: The aim of this study was to develop and evaluate a pedagogical tool to enhance the understanding of a checklist that evaluates reports of nonpharmacological trials (CLEAR NPT). Design: Paired randomised controlled trial. Participants: Clinicians and systematic reviewers. Interventions: We developed an Internet-based computer learning system (ICLS). This pedagogical tool used many examples from published randomised controlled trials to demonstrate the main coding difficulties encountered when using this checklist. Randomised participants received either specific Web-based training with the ICLS (intervention group) or no specific training. Outcome measures: The primary outcome was the rate of correct answers compared to a criterion standard for coding a report of randomised controlled trials with the CLEAR NPT. Results: Between April and June 2006, 78 participants were randomly assigned to receive training with the ICLS (39) or no training (39). Participants trained by the ICLS did not differ from the control group in performance on the CLEAR NPT. The mean paired difference and corresponding 95% confidence interval was 0.5 (−5.1 to 6.1). The rate of correct answers did not differ between the two groups regardless of the CLEAR NPT item. Combining both groups, the rate of correct answers was high for items related to allocation sequence (79.5%), description of the intervention (82.0%), blinding of patients (79.5%), and follow-up schedule (83.3%). The rate of correct answers was low for items related to allocation concealment (46.1%), co-interventions (30.3%), blinding of outcome assessors (53.8%), specific measures to avoid ascertainment bias (28.6%), and intention-to-treat analysis (60.2%). Conclusions: Although we showed no difference in effect between the intervention and control groups, our results highlight the gap in knowledge and urgency for education on important aspects of trial conduct. PMID:17479163

  8. Toward an evidence-based system for innovation support for implementing innovations with quality: tools, training, technical assistance, and quality assurance/quality improvement.

    PubMed

    Wandersman, Abraham; Chien, Victoria H; Katz, Jason

    2012-12-01

    An individual or organization that sets out to implement an innovation (e.g., a new technology, program, or policy) generally requires support. In the Interactive Systems Framework for Dissemination and Implementation, a Support System should work with Delivery Systems (national, state and/or local entities such as health and human service organizations, community-based organizations, schools) to enhance their capacity for quality implementation of innovations. The literature on the Support System [corrected] has been under-researched and under-developed. This article begins to conceptualize theory, research, and action for an evidence-based system for innovation support (EBSIS). EBSIS describes key priorities for strengthening the science and practice of support. The major goal of EBSIS is to enhance the research and practice of support in order to build capacity in the Delivery System for implementing innovations with quality, and thereby help the Delivery System achieve outcomes. EBSIS is guided by a logic model that includes four key support components: tools, training, technical assistance, and quality assurance/quality improvement. EBSIS uses the Getting To Outcomes approach to accountability to aid the identification and synthesis of concepts, tools, and evidence for support. We conclude with some discussion of the current status of EBSIS and possible next steps, including the development of collaborative researcher-practitioner-funder-consumer partnerships to accelerate the accumulation of knowledge on the Support System.

  9. Proteomics Quality Control: Quality Control Software for MaxQuant Results.

    PubMed

    Bielow, Chris; Mastrobuoni, Guido; Kempa, Stefan

    2016-03-04

    Mass spectrometry-based proteomics coupled to liquid chromatography has matured into an automatized, high-throughput technology, producing data on the scale of multiple gigabytes per instrument per day. Consequently, an automated quality control (QC) and quality analysis (QA) capable of detecting measurement bias, verifying consistency, and avoiding propagation of error is paramount for instrument operators and scientists in charge of downstream analysis. We have developed an R-based QC pipeline called Proteomics Quality Control (PTXQC) for bottom-up LC-MS data generated by the MaxQuant software pipeline. PTXQC creates a QC report containing a comprehensive and powerful set of QC metrics, augmented with automated scoring functions. The automated scores are collated to create an overview heatmap at the beginning of the report, giving valuable guidance also to nonspecialists. Our software supports a wide range of experimental designs, including stable isotope labeling by amino acids in cell culture (SILAC), tandem mass tags (TMT), and label-free data. Furthermore, we introduce new metrics to score MaxQuant's Match-between-runs (MBR) functionality by which peptide identifications can be transferred across Raw files based on accurate retention time and m/z. Last but not least, PTXQC is easy to install and use and represents the first QC software capable of processing MaxQuant result tables. PTXQC is freely available at https://github.com/cbielow/PTXQC .
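    As an example of the kind of metric such a report aggregates, the sketch below computes the per-raw-file median precursor mass error from a MaxQuant evidence.txt table. This is not PTXQC code; the column names ("Raw file", "Mass error [ppm]") follow the usual MaxQuant headers but should be checked against the installed version.

      # One illustrative QC metric from a MaxQuant evidence.txt (tab-separated) table.
      import csv, statistics
      from collections import defaultdict

      def median_mass_error(evidence_path="evidence.txt"):
          errors = defaultdict(list)
          with open(evidence_path, newline="") as fh:
              for row in csv.DictReader(fh, delimiter="\t"):
                  value = row.get("Mass error [ppm]", "")
                  if value not in ("", "NaN"):
                      errors[row["Raw file"]].append(float(value))
          return {raw: statistics.median(vals) for raw, vals in errors.items()}

      print(median_mass_error())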

  10. 7 CFR 981.442 - Quality control.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 7 Agriculture 8 2012-01-01 2012-01-01 false Quality control. 981.442 Section 981.442 Agriculture Regulations of the Department of Agriculture (Continued) AGRICULTURAL MARKETING SERVICE (Marketing Agreements... Administrative Rules and Regulations § 981.442 Quality control. (a) Incoming. Pursuant to § 981.42(a), the...

  11. 7 CFR 981.442 - Quality control.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 8 2013-01-01 2013-01-01 false Quality control. 981.442 Section 981.442 Agriculture Regulations of the Department of Agriculture (Continued) AGRICULTURAL MARKETING SERVICE (MARKETING AGREEMENTS... Administrative Rules and Regulations § 981.442 Quality control. (a) Incoming. Pursuant to § 981.42(a), the...

  12. 7 CFR 981.442 - Quality control.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 8 2011-01-01 2011-01-01 false Quality control. 981.442 Section 981.442 Agriculture Regulations of the Department of Agriculture (Continued) AGRICULTURAL MARKETING SERVICE (Marketing Agreements... Administrative Rules and Regulations § 981.442 Quality control. (a) Incoming. Pursuant to § 981.42(a), the...

  13. GEOSPATIAL QUALITY COUNCIL

    EPA Science Inventory

    Geospatial Science is increasingly becoming an important tool in making Agency decisions. Quality Control and Quality Assurance are required to be integrated during the planning, implementation and assessment of geospatial databases, processes and products. In order to ensure Age...

  14. Easily configured real-time CPOE Pick Off Tool supporting focused clinical research and quality improvement.

    PubMed

    Rosenbaum, Benjamin P; Silkin, Nikolay; Miller, Randolph A

    2014-01-01

    Real-time alerting systems typically warn providers about abnormal laboratory results or medication interactions. For more complex tasks, institutions create site-wide 'data warehouses' to support quality audits and longitudinal research. Sophisticated systems like i2b2 or Stanford's STRIDE utilize data warehouses to identify cohorts for research and quality monitoring. However, substantial resources are required to install and maintain such systems. For more modest goals, an organization desiring merely to identify patients with 'isolation' orders, or to determine patients' eligibility for clinical trials, may adopt a simpler, limited approach based on processing the output of one clinical system, and not a data warehouse. We describe a limited, order-entry-based, real-time 'pick off' tool, utilizing public domain software (PHP, MySQL). Through a web interface the tool assists users in constructing complex order-related queries and auto-generates corresponding database queries that can be executed at recurring intervals. We describe successful application of the tool for research and quality monitoring.
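    The sketch below illustrates the 'pick off' idea: a simple user-specified filter over order-entry data is turned into a parameterised SQL query that can be run on a recurring schedule. The table and column names are hypothetical; the original tool was written in PHP against MySQL, but the pattern is language-independent.

      # Build a parameterised query for recent orders matching a user-chosen order name.
      # The schema (cpoe_orders, patient_id, order_name, order_time) is a hypothetical example.
      def build_order_query(order_name, since_hours=24):
          sql = (
              "SELECT patient_id, order_name, order_time "
              "FROM cpoe_orders "
              "WHERE order_name = %s "
              "AND order_time >= NOW() - INTERVAL %s HOUR"
          )
          return sql, (order_name, since_hours)

      sql, params = build_order_query("Contact isolation", since_hours=12)
      print(sql)
      print(params)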

  15. Assessment of quality of prescribing in patients of hypertension at primary and secondary health care facilities using the Prescription Quality Index (PQI) tool.

    PubMed

    Suthar, Jalpa Vashishth; Patel, Varsha J

    2014-01-01

    To determine the quality of prescribing in hypertension in primary and secondary health care settings using the Prescription Quality Index (PQI) tool, and to assess the reliability of this tool. An observational cross-sectional study was carried out for 6 months to assess the quality of prescribing of antihypertensive drugs using the Prescription Quality Index (PQI) at four primary (PHC) and two secondary (SHC) health care facilities. Patients attending these facilities for at least 3 months were included. Complete medical history and prescriptions received were noted. Total and criterion-wise PQI scores were derived for each prescription. Prescriptions were categorized as poor (score of ≤31), medium (score 32-33) and high quality (score 34-43) based on the PQI total score. Psychometric analysis using factor analysis was carried out to assess reliability and validity. A total of 73 hypertensive patients were included. Mean age was 61.2 ± 11 years, with 35 (48%) patients above 65 years of age. The total PQI score was 26 ± 11. There was a significant difference in PQI score between PHC and SHC (P < 0.05). Out of 73 prescriptions, 43 (59%) were of poor quality with a PQI score ≤31. The value of Cronbach's α for all 22 criteria of the PQI was 0.71, suggesting good reliability of the PQI tool in our setting. Based on PQI scores, the quality of prescribing in hypertensive patients was poor, and somewhat better in primary than in secondary health care facilities. The PQI is reliable for measuring prescribing quality in hypertension in the Indian setting.
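    The banding described above maps directly onto a simple classification of the total score, sketched below with illustrative scores.

      # PQI banding as described in the abstract: <=31 poor, 32-33 medium, 34-43 high.
      def pqi_category(total_score):
          if total_score <= 31:
              return "poor"
          if total_score <= 33:
              return "medium"
          return "high"

      for score in (26, 32, 40):
          print(score, pqi_category(score))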

  16. Contributions of CCLM to advances in quality control.

    PubMed

    Kazmierczak, Steven C

    2013-01-01

    The discipline of laboratory medicine is relatively young when considered in the context of the history of medicine itself. The history of quality control, within the context of laboratory medicine, also enjoys a relatively brief but rich history. Laboratory quality control continues to evolve along with advances in automation, measurement techniques and information technology. Clinical Chemistry and Laboratory Medicine (CCLM) has played a key role in helping disseminate information about the proper use and utility of quality control. Publication of important advances in quality control techniques and dissemination of guidelines concerned with laboratory quality control have undoubtedly helped readers of this journal keep up to date on the most recent developments in this field.

  17. Application of uncertainty and sensitivity analysis to the air quality SHERPA modelling tool

    NASA Astrophysics Data System (ADS)

    Pisoni, E.; Albrecht, D.; Mara, T. A.; Rosati, R.; Tarantola, S.; Thunis, P.

    2018-06-01

    Air quality has significantly improved in Europe over the past few decades. Nonetheless, high concentrations are still measured, mainly in specific regions or cities. This dimensional shift, from EU-wide to hot-spot exceedances, calls for a novel approach to regional air quality management (to complement existing EU-wide policies). The SHERPA (Screening for High Emission Reduction Potentials on Air quality) modelling tool was developed in this context. It provides an additional tool to be used in support of regional/local decision makers responsible for the design of air quality plans. It is therefore important to evaluate the quality of the SHERPA model and its behavior in the face of various kinds of uncertainty. Uncertainty and sensitivity analysis techniques can be used for this purpose. They both reveal the links between assumptions and forecasts, help in model simplification and may highlight unexpected relationships between inputs and outputs. Thus, a policy-steered SHERPA module - predicting the air quality improvement linked to emission reduction scenarios - was evaluated by means of (1) uncertainty analysis (UA), to quantify uncertainty in the model output, and (2) sensitivity analysis (SA), to identify the most influential input sources of this uncertainty. The results of this study provide relevant information about the key variables driving the SHERPA output uncertainty, and advise policy-makers and modellers where to place their efforts for an improved decision-making process.
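    A minimal illustration of the two techniques, using a toy surrogate rather than the SHERPA model: uncertain inputs are sampled and propagated by Monte Carlo to quantify output uncertainty (UA), and a crude correlation-based measure ranks the inputs driving that uncertainty (SA). All distributions below are illustrative assumptions.

      # Toy Monte Carlo uncertainty and sensitivity analysis on y = a*E1 + b*E2.
      import numpy as np

      rng = np.random.default_rng(0)
      n = 10_000
      a  = rng.normal(1.0, 0.2, n)      # uncertain model coefficient
      b  = rng.normal(0.5, 0.3, n)      # uncertain model coefficient
      E1 = rng.uniform(0.8, 1.2, n)     # emission-scenario inputs
      E2 = rng.uniform(0.8, 1.2, n)

      y = a * E1 + b * E2               # surrogate air-quality response

      print("output mean/std:", y.mean(), y.std())                      # uncertainty analysis
      for name, x in [("a", a), ("b", b), ("E1", E1), ("E2", E2)]:
          print(name, "corr with output:", np.corrcoef(x, y)[0, 1])     # crude sensitivity ranking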

  18. An iterative learning control method with application for CNC machine tools

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, D.I.; Kim, S.

    1996-01-01

    A proportional, integral, and derivative (PID) type iterative learning controller is proposed for precise tracking control of industrial robots and computer numerical controller (CNC) machine tools performing repetitive tasks. The convergence of the output error by the proposed learning controller is guaranteed under a certain condition even when the system parameters are not known exactly and unknown external disturbances exist. As the proposed learning controller is repeatedly applied to the industrial robot or the CNC machine tool with the path-dependent repetitive task, the distance difference between the desired path and the actual tracked or machined path, which is one of the most significant factors in the evaluation of control performance, is progressively reduced. The experimental results demonstrate that the proposed learning controller can improve machining accuracy when the CNC machine tool performs repetitive machining tasks.
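    The idea behind a PID-type iterative learning update is that the input trajectory stored from one repetition is corrected with proportional, integral, and derivative terms of that repetition's tracking error before the next run. The sketch below applies such an update to a toy first-order plant; the plant, reference, and gains are illustrative only (they are not the paper's), and published PID-type ILC laws often use a time-shifted error term and come with explicit convergence conditions.

      # PID-type iterative learning control on a toy discrete first-order plant.
      import numpy as np

      T = 40
      ref = np.sin(np.linspace(0.0, np.pi, T))        # desired output trajectory
      u = np.zeros(T)                                 # learned input, refined each repetition
      kp, ki, kd = 0.6, 0.01, 0.1                     # illustrative learning gains

      def run_plant(u, a=0.5, b=1.0):
          y = np.zeros(len(u))
          for t in range(1, len(u)):
              y[t] = a * y[t - 1] + b * u[t - 1]
          return y

      for trial in range(15):
          e = ref - run_plant(u)
          e_next = np.append(e[1:], 0.0)              # error one step ahead (input acts with one-step delay)
          u = u + kp * e_next + ki * np.cumsum(e_next) + kd * np.diff(e_next, prepend=e_next[0])
          print(f"repetition {trial:2d}: RMS tracking error = {np.sqrt(np.mean(e**2)):.4f}")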

  19. 7 CFR 58.733 - Quality control tests.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 3 2011-01-01 2011-01-01 false Quality control tests. 58.733 Section 58.733... Procedures § 58.733 Quality control tests. (a) Chemical analyses. The following chemical analyses shall be... pasteurization by means of the phosphatase test, as well as any other tests necessary to assure good quality...

  20. 7 CFR 58.733 - Quality control tests.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 3 2010-01-01 2010-01-01 false Quality control tests. 58.733 Section 58.733... Procedures § 58.733 Quality control tests. (a) Chemical analyses. The following chemical analyses shall be... pasteurization by means of the phosphatase test, as well as any other tests necessary to assure good quality...

  1. Simultaneous Independent Control of Tool Axial Force and Temperature in Friction Stir Processing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ross, Kenneth A.; Grant, Glenn J.; Darsell, Jens T.

    Maintaining consistent tool depth relative to the part surface is a critical requirement for many Friction stir processing (FSP) applications. Force control is often used with the goal of obtaining a constant weld depth. When force control is used, if weld temperature decreases, flow stress increases and the tool is pushed up. If weld temperature increases, flow stress decreases and the tool dives. These variations in tool depth and weld temperature cause various types of weld defects. Robust temperature control for FSP maintains a commanded temperature through control of the spindle axis only. Robust temperature control and force control are completely decoupled in control logic and machine motion. This results in stable temperature, force and tool depth despite the presence of geometric and thermal disturbances. Performance of this control method is presented for various weld paths and alloy systems.
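
    The decoupling idea above can be illustrated with two fully independent PI loops: one maps weld-temperature error to spindle speed, the other maps axial-force error to a plunge command. The sketch below is a minimal simulation under assumed first-order plant models and gains; it is not the published control implementation.

      dt, t_end = 0.01, 60.0
      T_set, F_set = 450.0, 8.0          # temperature (C) and axial-force (kN) setpoints (assumed)
      T = F = 0.0                        # plant states
      iT = iF = 0.0                      # PI integrator states

      for _ in range(int(t_end / dt)):
          eT, eF = T_set - T, F_set - F
          iT += eT * dt
          iF += eF * dt
          rpm = 2.0 * eT + 1.0 * iT      # temperature loop output: spindle-speed command
          z = 1.0 * eF + 0.5 * iF        # force loop output: plunge command
          # Decoupled first-order plant responses: temperature tracks rpm, force tracks z.
          T += dt * (1.0 * rpm - T) / 5.0
          F += dt * (2.0 * z - F) / 1.0

      print(f"steady state: T = {T:.1f} C, F = {F:.2f} kN")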

  2. Quality-control design for surface-water sampling in the National Water-Quality Network

    USGS Publications Warehouse

    Riskin, Melissa L.; Reutter, David C.; Martin, Jeffrey D.; Mueller, David K.

    2018-04-10

    The data-quality objectives for samples collected at surface-water sites in the National Water-Quality Network include estimating the extent to which contamination, matrix effects, and measurement variability affect interpretation of environmental conditions. Quality-control samples provide insight into how well the samples collected at surface-water sites represent the true environmental conditions. Quality-control samples used in this program include field blanks, replicates, and field matrix spikes. This report describes the design for collection of these quality-control samples and the data management needed to properly identify these samples in the U.S. Geological Survey’s national database.
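
    As a hedged illustration of how such quality-control samples are commonly interpreted (not the USGS procedure itself), the short Python sketch below checks a field blank against a reporting limit, computes the relative percent difference of a replicate pair, and computes matrix-spike recovery. All concentrations are invented.

      def relative_percent_difference(a, b):
          return abs(a - b) / ((a + b) / 2.0) * 100.0

      def spike_recovery(spiked_conc, unspiked_conc, spike_added):
          return (spiked_conc - unspiked_conc) / spike_added * 100.0

      reporting_limit = 0.05                 # ug/L (assumed)
      field_blank = 0.02                     # ug/L measured in the field blank
      rep1, rep2 = 1.10, 1.18                # ug/L in a replicate pair
      env, spiked, added = 1.12, 2.05, 1.00  # ug/L: environmental result, spiked result, spike amount

      print(f"field blank below reporting limit: {field_blank < reporting_limit}")
      print(f"replicate RPD: {relative_percent_difference(rep1, rep2):.1f} %")
      print(f"matrix-spike recovery: {spike_recovery(spiked, env, added):.0f} %")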

  3. Advanced strategies for quality control of Chinese medicines.

    PubMed

    Zhao, Jing; Ma, Shuang-Cheng; Li, Shao-Ping

    2018-01-05

    Quality control is always a critical issue for Chinese medicines (CMs), given their increasing worldwide use. Unlike western medicine, CMs are usually considered to owe their therapeutic effects to multiple constituents. Therefore, quality control of CMs is a challenge. In 2011, the strategies for quantification, related to the markers, reference compounds and approaches, in quality control of CMs were reviewed (Li, et al., J. Pharm. Biomed. Anal., 2011, 55, 802-809). Since then, some new strategies have been proposed in these fields. Therefore, the review on the strategies for quality control of CMs should be updated to improve the safety and efficacy of CMs. Herein, novel strategies related to quality marker discovery, reference compound development and advanced approaches (focused on glyco-analysis) for quality control, during 2011-2016, were summarized and discussed. Copyright © 2017 Elsevier B.V. All rights reserved.

  4. Results of a multicentre randomised controlled trial of statistical process control charts and structured diagnostic tools to reduce ward-acquired meticillin-resistant Staphylococcus aureus: the CHART Project.

    PubMed

    Curran, E; Harper, P; Loveday, H; Gilmour, H; Jones, S; Benneyan, J; Hood, J; Pratt, R

    2008-10-01

    Statistical process control (SPC) charts have previously been advocated for infection control quality improvement. To determine their effectiveness, a multicentre randomised controlled trial was undertaken to explore whether monthly SPC feedback from infection control nurses (ICNs) to healthcare workers of ward-acquired meticillin-resistant Staphylococcus aureus (WA-MRSA) colonisation or infection rates would produce any reductions in incidence. Seventy-five wards in 24 hospitals in the UK were randomised into three arms: (1) wards receiving SPC chart feedback; (2) wards receiving SPC chart feedback in conjunction with structured diagnostic tools; and (3) control wards receiving neither type of feedback. Twenty-five months of pre-intervention WA-MRSA data were compared with 24 months of post-intervention data. Statistically significant and sustained decreases in WA-MRSA rates were identified in all three arms (P<0.001; P=0.015; P<0.001). The mean percentage reduction was 32.3% for wards receiving SPC feedback, 19.6% for wards receiving SPC and diagnostic feedback, and 23.1% for control wards, but with no significant difference between the control and intervention arms (P=0.23). There were significantly more post-intervention 'out-of-control' episodes (P=0.021) in the control arm (averages of 0.60, 0.28, and 0.28 for Control, SPC and SPC+Tools wards, respectively). Participants identified SPC charts as an effective communication tool and valuable for disseminating WA-MRSA data.
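
    For readers unfamiliar with the chart type, a u-chart of monthly ward-acquired MRSA acquisitions per 1,000 patient-days can be computed as in the Python sketch below. The counts and patient-day denominators are invented example data, not the trial's data.

      import numpy as np

      counts = np.array([4, 2, 5, 3, 1, 6, 2, 0, 3, 4, 2, 1])            # monthly WA-MRSA cases
      patient_days = np.array([900, 850, 980, 940, 870, 1010,
                               930, 890, 960, 1000, 920, 880], float)

      n = patient_days / 1000.0                  # exposure in 1,000 patient-day units
      u = counts / n                             # monthly rates
      u_bar = counts.sum() / n.sum()             # centerline
      ucl = u_bar + 3 * np.sqrt(u_bar / n)       # limits vary with each month's exposure
      lcl = np.maximum(u_bar - 3 * np.sqrt(u_bar / n), 0)

      for month, (rate, lo, hi) in enumerate(zip(u, lcl, ucl), start=1):
          flag = "out of control" if not (lo <= rate <= hi) else "in control"
          print(f"month {month:2d}: rate {rate:5.2f}, limits [{lo:4.2f}, {hi:5.2f}] -> {flag}")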

  5. 30 CFR 28.30 - Quality control plans; filing requirements.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 30 Mineral Resources 1 2011-07-01 2011-07-01 false Quality control plans; filing requirements. 28... PROTECTION FOR TRAILING CABLES IN COAL MINES Quality Control § 28.30 Quality control plans; filing... part, each applicant shall file with MSHA a proposed quality control plan which shall be designed to...

  6. 30 CFR 28.30 - Quality control plans; filing requirements.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Quality control plans; filing requirements. 28... PROTECTION FOR TRAILING CABLES IN COAL MINES Quality Control § 28.30 Quality control plans; filing... part, each applicant shall file with MSHA a proposed quality control plan which shall be designed to...

  7. 30 CFR 28.30 - Quality control plans; filing requirements.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... PROTECTION FOR TRAILING CABLES IN COAL MINES Quality Control § 28.30 Quality control plans; filing... 30 Mineral Resources 1 2013-07-01 2013-07-01 false Quality control plans; filing requirements. 28... part, each applicant shall file with MSHA a proposed quality control plan which shall be designed to...

  8. 30 CFR 28.30 - Quality control plans; filing requirements.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... PROTECTION FOR TRAILING CABLES IN COAL MINES Quality Control § 28.30 Quality control plans; filing... 30 Mineral Resources 1 2012-07-01 2012-07-01 false Quality control plans; filing requirements. 28... part, each applicant shall file with MSHA a proposed quality control plan which shall be designed to...

  9. 30 CFR 28.30 - Quality control plans; filing requirements.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... PROTECTION FOR TRAILING CABLES IN COAL MINES Quality Control § 28.30 Quality control plans; filing... 30 Mineral Resources 1 2014-07-01 2014-07-01 false Quality control plans; filing requirements. 28... part, each applicant shall file with MSHA a proposed quality control plan which shall be designed to...

  10. 21 CFR 864.8625 - Hematology quality control mixture.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Hematology quality control mixture. 864.8625 Section 864.8625 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES... quality control mixture. (a) Identification. A hematology quality control mixture is a device used to...

  11. Measuring the quality of Hospital Food Services: Development and reliability of a Meal Quality Audit Tool.

    PubMed

    Banks, Merrilyn; Hannan-Jones, Mary; Ross, Lynda; Buckley, Ann; Ellick, Jennifer; Young, Adrienne

    2017-04-01

    To develop and test the reliability of a Meal Quality Audit Tool (MQAT) to audit the quality of hospital meals to assist food service managers and dietitians in identifying areas for improvement. The MQAT was developed using expert opinion and was modified over time with extensive use and feedback. A phased approach was used to assess content validity and test reliability: (i) trial with 60 dietetic students, (ii) trial with 12 food service dietitians in practice and (iii) interrater reliability study. Phases 1 and 2 confirmed content validity and informed minor revision of scoring, language and formatting of the MQAT. To assess reliability of the final MQAT, eight separate meal quality audits of five identical meals were conducted over several weeks in the hospital setting. Each audit comprised an 'expert' team and four 'test' teams (dietitians, food services and ward staff). Interrater reliability was determined using intra-class correlation analysis. There was statistically significant interrater reliability for dimensions of Temperature and Accuracy (P < 0.001) but not for Appearance or Sensory. Composition of the 'test' team appeared to influence results for Appearance and Sensory, with food service-led teams scoring higher on these dimensions. 'Test' teams reported that MQAT was clear and easy to use. MQAT was found to be reliable for Temperature and Accuracy domains, with further work required to improve the reliability of the Appearance and Sensory dimensions. The systematic use of the tool, used in conjunction with patient satisfaction, could provide pertinent and useful information regarding the quality of food services and areas for improvement. © 2017 Dietitians Association of Australia.
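
    The interrater-reliability statistic referred to above can be reproduced in outline with a two-way random-effects intraclass correlation, ICC(2,1). The Python sketch below uses an invented 5 meal x 4 team ratings matrix; it only illustrates the calculation, not the MQAT data.

      import numpy as np

      X = np.array([[7, 8, 7, 6],
                    [5, 5, 6, 5],
                    [9, 8, 9, 9],
                    [4, 5, 4, 4],
                    [6, 7, 6, 7]], float)        # rows = meals, columns = audit teams
      n, k = X.shape
      grand = X.mean()
      ss_total = np.sum((X - grand) ** 2)
      ss_rows = k * np.sum((X.mean(axis=1) - grand) ** 2)   # between meals
      ss_cols = n * np.sum((X.mean(axis=0) - grand) ** 2)   # between teams
      ms_rows = ss_rows / (n - 1)
      ms_cols = ss_cols / (k - 1)
      ms_error = (ss_total - ss_rows - ss_cols) / ((n - 1) * (k - 1))
      icc21 = (ms_rows - ms_error) / (ms_rows + (k - 1) * ms_error + k * (ms_cols - ms_error) / n)
      print(f"ICC(2,1) = {icc21:.3f}")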

  12. Spacecraft Guidance, Navigation, and Control Visualization Tool

    NASA Technical Reports Server (NTRS)

    Mandic, Milan; Acikmese, Behcet; Blackmore, Lars

    2011-01-01

    G-View is a 3D visualization tool for supporting spacecraft guidance, navigation, and control (GN&C) simulations relevant to small-body exploration and sampling (see figure). The tool is developed in MATLAB using Virtual Reality Toolbox and provides users with the ability to visualize the behavior of their simulations, regardless of which programming language (or machine) is used to generate simulation results. The only requirement is that multi-body simulation data is generated and placed in the proper format before applying G-View.

  13. Surgical process improvement tools: defining quality gaps and priority areas in gastrointestinal cancer surgery.

    PubMed

    Wei, A C; Devitt, K S; Wiebe, M; Bathe, O F; McLeod, R S; Urbach, D R

    2014-04-01

    Surgery is a cornerstone of cancer treatment, but significant differences in the quality of surgery have been reported. Surgical process improvement tools (spits) modify the processes of care as a means to quality improvement (qi). We were interested in developing spits in the area of gastrointestinal (gi) cancer surgery. We report the recommendations of an expert panel held to define quality gaps and establish priority areas that would benefit from spits. The present study used the knowledge-to-action cycle as a framework. Canadian experts in qi and in gi cancer surgery were assembled in a nominal group workshop. Participants evaluated the merits of spits, described gaps in current knowledge, and identified and ranked processes of care that would benefit from qi. A qualitative analysis of the workshop deliberations using modified grounded theory methods identified major themes. The expert panel consisted of 22 participants. Experts confirmed that spits were an important strategy for qi. The top-rated spits included clinical pathways, electronic information technology, and patient safety tools. The preferred settings for use of spits included preoperative and intraoperative settings and multidisciplinary contexts. Outcomes of interest were cancer-related outcomes, process, and the technical quality of surgery measures. Surgical process improvement tools were confirmed as an important strategy. Expert panel recommendations will be used to guide future research efforts for spits in gi cancer surgery.

  14. Cellular Strategies of Protein Quality Control

    PubMed Central

    Chen, Bryan; Retzlaff, Marco; Roos, Thomas; Frydman, Judith

    2011-01-01

    Eukaryotic cells must contend with a continuous stream of misfolded proteins that compromise the cellular protein homeostasis balance and jeopardize cell viability. An elaborate network of molecular chaperones and protein degradation factors continually monitor and maintain the integrity of the proteome. Cellular protein quality control relies on three distinct yet interconnected strategies whereby misfolded proteins can either be refolded, degraded, or delivered to distinct quality control compartments that sequester potentially harmful misfolded species. Molecular chaperones play a critical role in determining the fate of misfolded proteins in the cell. Here, we discuss the spatial and temporal organization of cellular quality control strategies and their implications for human diseases linked to protein misfolding and aggregation. PMID:21746797

  15. 21 CFR 211.22 - Responsibilities of quality control unit.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 4 2011-04-01 2011-04-01 false Responsibilities of quality control unit. 211.22... Personnel § 211.22 Responsibilities of quality control unit. (a) There shall be a quality control unit that... have been fully investigated. The quality control unit shall be responsible for approving or rejecting...

  16. 21 CFR 211.22 - Responsibilities of quality control unit.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 4 2010-04-01 2010-04-01 false Responsibilities of quality control unit. 211.22... Personnel § 211.22 Responsibilities of quality control unit. (a) There shall be a quality control unit that... have been fully investigated. The quality control unit shall be responsible for approving or rejecting...

  17. 40 CFR 81.112 - Charleston Intrastate Air Quality Control Region.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Quality Control Regions § 81.112 Charleston Intrastate Air Quality Control Region. The Charleston Intrastate Air Quality Control Region (South Carolina) consists of the territorial area encompassed by the... Quality Control Region: Region 1. 81.107Greenwood Intrastate Air Quality Control Region: Region 2. 81...

  18. 40 CFR 81.112 - Charleston Intrastate Air Quality Control Region.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Quality Control Regions § 81.112 Charleston Intrastate Air Quality Control Region. The Charleston Intrastate Air Quality Control Region (South Carolina) consists of the territorial area encompassed by the... Quality Control Region: Region 1. 81.107Greenwood Intrastate Air Quality Control Region: Region 2. 81...

  19. Millimeter-Wave Absorption as a Quality Control Tool for M-Type Hexaferrite Nanopowders

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCloy, John S.; Korolev, Konstantin A.; Crum, Jarrod V.

    2013-01-01

    Millimeter wave (MMW) absorption measurements have been conducted on commercial samples of large (micrometer-sized) and small (nanometer-sized) particles of BaFe12O19 and SrFe12O19 using a quasi-optical MMW spectrometer and a series of backwards wave oscillators encompassing the 30-120 GHz range. Effective anisotropy of the particles calculated from the resonant absorption frequency indicates lower overall anisotropy in the nano-particles. Due to their high magnetocrystalline anisotropy, both BaFe12O19 and SrFe12O19 are expected to have spin resonances in the 45-55 GHz range. Several of the sampled BaFe12O19 powders did not have MMW absorptions, so they were further investigated by DC magnetization and x-ray diffraction to assess magnetic behavior and structure. The samples with absent MMW absorption contained primarily iron oxides, suggesting that MMW absorption could be used for quality control in hexaferrite powder manufacture.
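
    The link between the absorption peak and the effective anisotropy can be made explicit with the standard zero-field ferromagnetic-resonance relation for a uniaxial hexaferrite, f_r ≈ (γ/2π)·H_a with γ/2π ≈ 2.8 GHz/kOe. The short sketch below inverts that relation for an assumed peak frequency; it is a generic textbook estimate, not the authors' analysis.

      # Invert f_r = (gamma/2*pi) * H_a to estimate the effective anisotropy field
      # from a measured millimeter-wave absorption peak. Values are illustrative.
      GAMMA_OVER_2PI = 2.8            # GHz per kOe
      f_resonance_ghz = 47.0          # example absorption peak within the 45-55 GHz window
      H_anisotropy_koe = f_resonance_ghz / GAMMA_OVER_2PI
      print(f"effective anisotropy field ~ {H_anisotropy_koe:.1f} kOe")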

  20. 7 CFR 275.21 - Quality control review reports.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 4 2011-01-01 2011-01-01 false Quality control review reports. 275.21 Section 275.21... Reporting on Program Performance § 275.21 Quality control review reports. (a) General. Each State agency shall submit reports on the performance of quality control reviews in accordance with the requirements...

  1. Implementation of the trauma registry as a tool for quality improvement in trauma care in a brazilian hospital: the first 12 months.

    PubMed

    Parreira, José Gustavo; de Campos, Tércio; Perlingeiro, Jacqueline A Gianinni; Soldá, Silvia C; Assef, José Cesar; Gonçalves, Augusto Canton; Zuffo, Bruno Malteze; Floriano, Caio Gomes; de Oliveira, Erik Haruk; de Oliveira, Renato Vieira Rodrigues; Oliveira, Amanda Lima; de Melo, Caio Gullo; Below, Cristiano; Miranda, Dino R Pérez; Santos, Gabriella Colasuonno; de Almeida, Gabriele Madeira; Brianti, Isabela Campos; Votto, Karina Baruel de Camargo; Schues, Patrick Alexander Sauer; dos Santos, Rafael Gomes; de Figueredo, Sérgio Mazzola Poli; de Araujo, Tatiani Gonçalves; Santos, Bruna do Nascimento; Ferreira, Laura Cardoso Manduca; Tanaka, Giuliana Olivi; Matos, Thiara; da Sousa, Maria Daiana; Augusto, Samara de Souza

    2015-01-01

    To analyze the implementation of a trauma registry in a university teaching hospital delivering care under the unified health system (SUS), and its ability to identify points for improvement in the quality of care provided. The data collection group comprised students from medicine and nursing courses who were holders of FAPESP scholarships (technical training 1) or otherwise, overseen by the coordinators of the project. The itreg (ECO Sistemas-RJ/SBAIT) software was used as the database tool. Several quality "filters" were proposed to select those cases for review in the quality control process. Data for 1344 trauma patients were input to the itreg database between March and November 2014. Around 87.0% of cases were blunt trauma patients, 59.6% had RTS>7.0 and 67% ISS<9. Full records were available for 292 cases, which were selected for review in the quality program. The auditing filters most frequently registered were laparotomy more than four hours after admission and drainage of acute subdural hematomas more than four hours after admission. Several points for improvement were flagged, such as control of overtriage of patients, the need to reduce the number of negative imaging exams, the development of protocols for achieving central venous access, and management of major TBI. The trauma registry provides a clear picture of the points to be improved in trauma patient care; however, there are specific peculiarities for implementing this tool in the Brazilian milieu.
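
    A hedged sketch of how such audit "filters" can be applied to a registry extract is shown below using pandas; the column names (rts, iss, laparotomy_delay_h, sdh_drainage_delay_h) and thresholds are hypothetical and do not reflect the itreg schema.

      import pandas as pd

      registry = pd.DataFrame({
          "rts":                  [7.84, 5.97, 7.84, 6.90],
          "iss":                  [4, 25, 9, 17],
          "laparotomy_delay_h":   [None, 6.0, None, 2.5],
          "sdh_drainage_delay_h": [None, None, 5.5, None],
      })

      flags = pd.DataFrame({
          "late_laparotomy":   registry["laparotomy_delay_h"] > 4,     # NaN compares as False
          "late_sdh_drainage": registry["sdh_drainage_delay_h"] > 4,
          "low_rts":           registry["rts"] <= 7.0,
      })
      registry["needs_review"] = flags.any(axis=1)
      print(registry[["rts", "iss", "needs_review"]])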

  2. 40 CFR 81.88 - Billings Intrastate Air Quality Control Region.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Quality Control Regions § 81.88 Billings Intrastate Air Quality Control Region. The Metropolitan Billings Intrastate Air Quality Control Region (Montana) has been renamed the Billings Intrastate Air Quality Control... to by Montana authorities as follows: Sec. 481.168Great Falls Intrastate Air Quality Control Region...

  3. 40 CFR 81.88 - Billings Intrastate Air Quality Control Region.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Quality Control Regions § 81.88 Billings Intrastate Air Quality Control Region. The Metropolitan Billings Intrastate Air Quality Control Region (Montana) has been renamed the Billings Intrastate Air Quality Control... to by Montana authorities as follows: Sec. 481.168Great Falls Intrastate Air Quality Control Region...

  4. Electronic nose for quality control of Colombian coffee through the detection of defects in "Cup Tests".

    PubMed

    Rodríguez, Juan; Durán, Cristhian; Reyes, Adriana

    2010-01-01

    Electronic noses (ENs) are used for many applications, but we must emphasize the importance of their application to foodstuffs like coffee. This paper presents a research study about the analysis of Colombian coffee samples for the detection and classification of defects (i.e., using "Cup Tests"), which was conducted at the Almacafé quality control laboratory in Cúcuta, Colombia. The results obtained show that the application of an electronic nose called "A-NOSE" may be used in the coffee industry for the cupping tests. The results show that e-nose technology can be a useful tool for quality control to evaluate the excellence of the Colombian coffee produced by the National Federation of Coffee Growers.
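
    A typical e-nose analysis pipeline reduces the multi-sensor response vector with PCA and then classifies it; the Python sketch below shows that generic pattern on synthetic sensor data. It is an assumption-laden illustration, not the A-NOSE implementation.

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.neighbors import KNeighborsClassifier
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler

      rng = np.random.default_rng(0)
      clean = rng.normal(loc=1.0, scale=0.1, size=(40, 8))     # 8-sensor responses, "clean cup"
      defect = rng.normal(loc=1.4, scale=0.1, size=(40, 8))    # shifted responses, "defect"
      X = np.vstack([clean, defect])
      y = np.array([0] * 40 + [1] * 40)

      model = make_pipeline(StandardScaler(), PCA(n_components=3), KNeighborsClassifier(5))
      model.fit(X, y)
      print("training accuracy:", model.score(X, y))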

  5. Data warehousing as a tool for quality management in oncology.

    PubMed

    Hölzer, S; Tafazzoli, A G; Altmann, U; Wächter, W; Dudeck, J

    1999-01-01

    At present, physicians are constrained by their limited skills to integrate and understand the growing amount of electronic medical information. To handle, extract, integrate, analyse and take advantage of the gathered information regarding the quality of patient care, the concept of a data warehouse seems especially interesting in medicine. Medical data warehousing allows physicians to take advantage of all the operational data they have been collecting over the years. Our purpose is to build a data warehouse in order to use all available information about cancer patients. We think that, with the sensible use of this tool, there are economic benefits for society and an improvement in the quality of medical care for patients.

  6. Design and implementation of a control structure for quality products in a crude oil atmospheric distillation column.

    PubMed

    Sotelo, David; Favela-Contreras, Antonio; Sotelo, Carlos; Jiménez, Guillermo; Gallegos-Canales, Luis

    2017-11-01

    In recent years, interest in petrochemical processes has been increasing, especially in the refining area. However, the high variability of the dynamic characteristics present in the atmospheric distillation column poses a challenge to obtaining quality products. To improve distillate quality in spite of changes in the input crude oil composition, this paper details a new design of a control strategy for a conventional crude oil distillation plant, defined using formal interaction analysis tools. The process dynamics and their control are simulated in the Aspen HYSYS ® dynamic environment under real operating conditions. The simulation results are compared against a typical control strategy commonly used in crude oil atmospheric distillation columns. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.

  7. Design of a final approach spacing tool for TRACON air traffic control

    NASA Technical Reports Server (NTRS)

    Davis, Thomas J.; Erzberger, Heinz; Bergeron, Hugh

    1989-01-01

    This paper describes an automation tool that assists air traffic controllers in the Terminal Radar Approach Control (TRACON) Facilities in providing safe and efficient sequencing and spacing of arrival traffic. The automation tool, referred to as the Final Approach Spacing Tool (FAST), allows the controller to interactively choose various levels of automation and advisory information ranging from predicted time errors to speed and heading advisories for controlling time error. FAST also uses a timeline to display current scheduling and sequencing information for all aircraft in the TRACON airspace. FAST combines accurate predictive algorithms and state-of-the-art mouse and graphical interface technology to present advisory information to the controller. Furthermore, FAST exchanges various types of traffic information and communicates with automation tools being developed for the Air Route Traffic Control Center. Thus it is part of an integrated traffic management system for arrival traffic at major terminal areas.

  8. Analysis and control on changeable wheel tool system of hybrid grinding and polishing machine tool for blade finishing

    NASA Astrophysics Data System (ADS)

    He, Qiuwei; Lv, Xingming; Wang, Xin; Qu, Xingtian; Zhao, Ji

    2017-01-01

    Blades are key components in energy and power equipment such as turbines and aircraft engines. Research on the processes and equipment for blade finishing has therefore become an important and difficult topic. To precisely control the tool system of the developed hybrid grinding and polishing machine tool for blade finishing, this paper analyzes the changeable-wheel tool system used for belt polishing. Firstly, the belt length and the wrap angle of each wheel are analyzed for different tension-wheel swing angles during the wheel-changing process. The reasonable belt length is calculated using MATLAB, and the relationships between the wrap angle of each wheel and the cylinder expansion of the contact wheel are obtained. Then, the control system for the changeable-wheel tool structure is developed. Lastly, the surface roughness achieved in blade finishing is verified by experiments. Theoretical analysis and experimental results show that a reasonable belt length and wheel wrap angles can be obtained by the proposed analysis method, that the changeable-wheel tool system can be controlled precisely, and that the surface roughness of the blade after grinding meets the design requirements.

  9. 7 CFR 58.141 - Alternate quality control program.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 3 2011-01-01 2011-01-01 false Alternate quality control program. 58.141 Section 58... Service 1 Quality Specifications for Raw Milk § 58.141 Alternate quality control program. When a plant has in operation an acceptable quality program, at the producer level, which is approved by the...

  10. 21 CFR 111.105 - What must quality control personnel do?

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 2 2011-04-01 2011-04-01 false What must quality control personnel do? 111.105... for Quality Control § 111.105 What must quality control personnel do? Quality control personnel must... manufacturing record. To do so, quality control personnel must perform operations that include: (a) Approving or...

  11. Modelling and control for laser based welding processes: modern methods of process control to improve quality of laser-based joining methods

    NASA Astrophysics Data System (ADS)

    Zäh, Ralf-Kilian; Mosbach, Benedikt; Hollwich, Jan; Faupel, Benedikt

    2017-02-01

    To ensure the competitiveness of manufacturing companies it is indispensable to optimize their manufacturing processes. Slight variations of process parameters and machine settings should have only marginal effects on the product quality; therefore, the largest possible processing window is required. Such parameters are, for example, the movement of the laser beam across the component in laser keyhole welding. That is why it is necessary to keep the formation of welding seams within specified limits. The quality of laser welding processes is therefore ensured by using post-process methods, like ultrasonic inspection, or special in-process methods. These in-process systems only achieve a simple evaluation which shows whether the weld seam is acceptable or not. Furthermore, in-process systems use no feedback for changing the control variables such as the speed of the laser or the adjustment of laser power. In this paper the research group presents current results in the research field of online monitoring, online control and model predictive control in laser welding processes to increase the product quality. To record the characteristics of the welding process, tested online methods are used during the process. Based on the measurement data, a state space model is ascertained, which includes all the control variables of the system. Using simulation tools, the model predictive controller (MPC) is designed for the model and integrated into an NI Real-Time system.
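
    The receding-horizon idea behind such a controller can be sketched with an unconstrained MPC on a toy discrete state-space model: predict the output over a horizon, solve a regularized least-squares problem for the input sequence, and apply only the first move. The model, horizon, and weights below are assumptions, not the identified welding model.

      import numpy as np

      A = np.array([[0.9, 0.1], [0.0, 0.8]])
      B = np.array([[0.1], [0.2]])
      C = np.array([[1.0, 0.0]])
      N, lam = 10, 0.1                          # prediction horizon and input weight (assumed)

      # Prediction matrices: Y = F x0 + Phi U over the horizon.
      F = np.vstack([C @ np.linalg.matrix_power(A, i + 1) for i in range(N)])
      Phi = np.zeros((N, N))
      for i in range(N):
          for j in range(i + 1):
              Phi[i, j] = (C @ np.linalg.matrix_power(A, i - j) @ B)[0, 0]

      x = np.zeros((2, 1))
      setpoint = 1.0
      for k in range(40):
          r = np.full((N, 1), setpoint)
          U = np.linalg.solve(Phi.T @ Phi + lam * np.eye(N), Phi.T @ (r - F @ x))
          u = U[0, 0]                           # receding horizon: apply the first move only
          x = A @ x + B * u
          if k % 10 == 0:
              print(f"step {k:2d}: output = {(C @ x)[0, 0]:.3f}")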

  12. Theoretical approach to society-wide environmental quality control

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ayano, K.

    1982-01-01

    The study outlines the basis for a theory of societal control of environmental quality in the US based on the concepts and philosophy of company-wide quality control which has developed in Japan as a cross-disciplinary approach to problem-solving in the industrial realm. The basic concepts are: 1) every member of society, as a producer of environmental products and services for future generations, in principle has the responsibility to control the quality of his output; 2) environment quality is the quality of life, or the fitness of use of environment for humans; and 3) societal control is any activity necessary for quality production of environmental products and services continuously or in the long run. A motivator-hygiene theory of environmental quality is identified, and a proposal is made that the policy provision must be formulated differently between those aimed at hygiene factors of environmental quality and those aimed at motivators, the former in a collectivistic manner, the latter as an individual problem. The concept of societal cost of environmental quality is introduced. Based on the motivator-hygiene theory of environmental quality, the collectivistic and individual approaches are differentiated and discussed.

  13. Effects of light quality on the accumulation of phytochemicals in vegetables produced in controlled environments: a review.

    PubMed

    Bian, Zhong Hua; Yang, Qi Chang; Liu, Wen Ke

    2015-03-30

    Phytochemicals in vegetables are important for human health, and their biosynthesis, metabolism and accumulation are affected by environmental factors. Light condition (light quality, light intensity and photoperiod) is one of the most important environmental variables in regulating vegetable growth, development and phytochemical accumulation, particularly for vegetables produced in controlled environments. With the development of light-emitting diode (LED) technology, the regulation of light environments has become increasingly feasible for the provision of ideal light quality, intensity and photoperiod for protected facilities. In this review, the effects of light quality regulation on phytochemical accumulation in vegetables produced in controlled environments are identified, highlighting the research progress and advantages of LED technology as a light environment regulation tool for modifying phytochemical accumulation in vegetables. © 2014 Society of Chemical Industry.

  14. Quality Metadata Management for Geospatial Scientific Workflows: from Retrieving to Assessing with Online Tools

    NASA Astrophysics Data System (ADS)

    Leibovici, D. G.; Pourabdollah, A.; Jackson, M.

    2011-12-01

    outputs of the workflow once run, is then provided using the meta-propagated qualities, obtained without running the workflow [6], together with the visualization pointing out the need to improve the workflow with better data or better processes on the workflow graph itself. [1] Leibovici, DG, Hobona, G Stock, K Jackson, M (2009) Qualifying geospatial workfow models for adaptive controlled validity and accuracy. In: IEEE 17th GeoInformatics, 1-5 [2] Leibovici, DG, Pourabdollah, A (2010a) Workflow Uncertainty using a Metamodel Framework and Metadata for Data and Processes. OGC TC/PC Meetings, September 2010, Toulouse, France [3] OGC (2011) www.opengeospatial.org [4] XPDL (2008) Workflow Process Definition Interface - XML Process Definition Language.Workflow Management Coalition, Document WfMC-TC-1025, 2008 [5] Leibovici, DG Pourabdollah, A Jackson, M (2011) Meta-propagation of Uncertainties for Scientific Workflow Management in Interoperable Spatial Data Infrastructures. In: Proceedings of the European Geosciences Union (EGU2011), April 2011, Austria [6] Pourabdollah, A Leibovici, DG Jackson, M (2011) MetaPunT: an Open Source tool for Meta-Propagation of uncerTainties in Geospatial Processing. In: Proceedings of OSGIS2011, June 2011, Nottingham, UK

  15. Rapid evaluation and quality control of next generation sequencing data with FaQCs.

    PubMed

    Lo, Chien-Chi; Chain, Patrick S G

    2014-11-19

    Next generation sequencing (NGS) technologies that parallelize the sequencing process and produce thousands to millions, or even hundreds of millions of sequences in a single sequencing run, have revolutionized genomic and genetic research. Because of the vagaries of any platform's sequencing chemistry, the experimental processing, machine failure, and so on, the quality of sequencing reads is never perfect, and often declines as the read is extended. These errors invariably affect downstream analysis/application and should therefore be identified early on to mitigate any unforeseen effects. Here we present a novel FastQ Quality Control Software (FaQCs) that can rapidly process large volumes of data, and which improves upon previous solutions to monitor the quality and remove poor quality data from sequencing runs. Both the speed of processing and the memory footprint of storing all required information have been optimized via algorithmic and parallel processing solutions. The trimmed output compared side-by-side with the original data is part of the automated PDF output. We show how this tool can help data analysis by providing a few examples, including an increased percentage of reads recruited to references, improved single nucleotide polymorphism identification as well as de novo sequence assembly metrics. FaQCs combines several features of currently available applications into a single, user-friendly process, and includes additional unique capabilities such as filtering the PhiX control sequences, conversion of FASTQ formats, and multi-threading. The original data and trimmed summaries are reported within a variety of graphics and reports, providing a simple way to do data quality control and assurance.
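
    The core of per-read quality control can be illustrated with a few lines of Python: decode Phred+33 quality characters, trim low-quality 3' ends, and drop reads that become too short. This is a generic sketch of the technique, not FaQCs' actual algorithm or thresholds.

      def phred_scores(qual_line, offset=33):
          # Convert a FASTQ quality string to integer Phred scores.
          return [ord(c) - offset for c in qual_line]

      def trim_read(seq, qual, q_min=20, len_min=30):
          scores = phred_scores(qual)
          end = len(seq)
          while end > 0 and scores[end - 1] < q_min:   # trim trailing low-quality bases
              end -= 1
          seq, qual = seq[:end], qual[:end]
          return (seq, qual) if len(seq) >= len_min else None

      # Synthetic example record: 36-base read with a low-quality tail.
      seq = "ACGTACGTACGTACGTACGTACGTACGTACGTACGT"
      qual = "IIIIIIIIIIIIIIIIIIIIIIIIIIIIII######"
      print(trim_read(seq, qual))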

  16. [Video-based self-control in surgical teaching. A new tool in a new concept].

    PubMed

    Dahmen, U; Sänger, C; Wurst, C; Arlt, J; Wei, W; Dondorf, F; Richter, B; Settmacher, U; Dirsch, O

    2013-10-01

    Image- and video-based result and process control are essential tools of a new teaching concept for conveying surgical skills. The new teaching concept integrates approved teaching principles and new media. Every performance of an exercise is videotaped and the result photographically recorded. The quality of the process and of the result thus becomes accessible for analysis by the teacher and the student/learner. The learner is instructed to perform a criteria-based self-analysis of the video and image material. The new learning concept has so far been successfully applied in seven rounds within the newly designed modular class "Intensivkurs Chirurgische Techniken" (intensive training in surgical techniques). Result documentation and analysis via digital images was completed by almost every student. The quality of the results was high. Interestingly, the result quality did not correlate with the time needed for the exercise. The training success had a lasting effect. The new and elaborate concept improves the quality of teaching. In the long run, resources for patient care should be saved when students are trained according to this concept prior to performing tasks in the operating theater. These resources should be allocated to further refining innovative teaching concepts.

  17. An automated testing tool for traffic signal controller functionalities.

    DOT National Transportation Integrated Search

    2010-03-01

    The purpose of this project was to develop an automated tool that facilitates testing of traffic controller functionality using controller interface device (CID) technology. Benefits of such automated testers to traffic engineers include reduced test...

  18. Quality Control in Higher Education.

    ERIC Educational Resources Information Center

    Hogarth, Charles P.

    The status of quality control in U.S. higher education is discussed with an overview of the functions and structure of public and private colleges and universities. The book is divided into seven chapters: (1) outside controls (accrediting groups, governmental groups and other groups); (2) structure (board of control, president, organization); (3)…

  19. Many roads may lead to Rome: Selected features of quality control within environmental assessment systems in the US, NL, CA, and UK

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Günther, Markus, E-mail: markus.guenther@tu-berlin.de; Geißler, Gesa; Köppel, Johann

    As there is no one-and-only concept on how to precisely define and establish quality control (QC) or quality assurance (QA) in the making of environmental assessments (EA), this paper presents selected features of international approaches that address quality in EA systems in the USA, the Netherlands, Canada, and the United Kingdom. Based on explanative case studies, we highlight the embedding of specific quality control features within the EA systems, the objectives and processes, and relevant transparency challenges. Such features of QC/QA approaches can be considered in cases where substantial quality control and assurance efforts are still missing. Yet further research needs to be conducted on the efficacy of these approaches, which remains beyond the scope of this study. - Highlights: • We present four tools for quality control and assurance from different EA systems. • Approaches vary in institutional setting, objectives, procedures, and transparency. • Highlighted features might provide guidance in cases where QC/QA is still lacking.

  20. Quality Control Technician.

    ERIC Educational Resources Information Center

    Ohio State Univ., Columbus. Center on Education and Training for Employment.

    This document contains 18 units to consider for use in a tech prep competency profile for the occupation of quality control technician. All the units listed will not necessarily apply to every situation or tech prep consortium, nor will all the competencies within each unit be appropriate. Several units appear within each specific occupation and…

  1. Surgical process improvement tools: defining quality gaps and priority areas in gastrointestinal cancer surgery

    PubMed Central

    Wei, A.C.; Devitt, K.S.; Wiebe, M.; Bathe, O.F.; McLeod, R.S.; Urbach, D.R.

    2014-01-01

    Background Surgery is a cornerstone of cancer treatment, but significant differences in the quality of surgery have been reported. Surgical process improvement tools (spits) modify the processes of care as a means to quality improvement (qi). We were interested in developing spits in the area of gastrointestinal (gi) cancer surgery. We report the recommendations of an expert panel held to define quality gaps and establish priority areas that would benefit from spits. Methods The present study used the knowledge-to-action cycle as a framework. Canadian experts in qi and in gi cancer surgery were assembled in a nominal group workshop. Participants evaluated the merits of spits, described gaps in current knowledge, and identified and ranked processes of care that would benefit from qi. A qualitative analysis of the workshop deliberations using modified grounded theory methods identified major themes. Results The expert panel consisted of 22 participants. Experts confirmed that spits were an important strategy for qi. The top-rated spits included clinical pathways, electronic information technology, and patient safety tools. The preferred settings for use of spits included preoperative and intraoperative settings and multidisciplinary contexts. Outcomes of interest were cancer-related outcomes, process, and the technical quality of surgery measures. Conclusions Surgical process improvement tools were confirmed as an important strategy. Expert panel recommendations will be used to guide future research efforts for spits in gi cancer surgery. PMID:24764704

  2. 40 CFR 81.77 - Puerto Rico Air Quality Control Region.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 17 2011-07-01 2011-07-01 false Puerto Rico Air Quality Control Region... PROGRAMS (CONTINUED) DESIGNATION OF AREAS FOR AIR QUALITY PLANNING PURPOSES Designation of Air Quality Control Regions § 81.77 Puerto Rico Air Quality Control Region. The Puerto Rico Air Quality Control Region...

  3. 40 CFR 81.77 - Puerto Rico Air Quality Control Region.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 17 2010-07-01 2010-07-01 false Puerto Rico Air Quality Control Region... PROGRAMS (CONTINUED) DESIGNATION OF AREAS FOR AIR QUALITY PLANNING PURPOSES Designation of Air Quality Control Regions § 81.77 Puerto Rico Air Quality Control Region. The Puerto Rico Air Quality Control Region...

  4. 40 CFR 81.77 - Puerto Rico Air Quality Control Region.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 18 2012-07-01 2012-07-01 false Puerto Rico Air Quality Control Region... PROGRAMS (CONTINUED) DESIGNATION OF AREAS FOR AIR QUALITY PLANNING PURPOSES Designation of Air Quality Control Regions § 81.77 Puerto Rico Air Quality Control Region. The Puerto Rico Air Quality Control Region...

  5. 40 CFR 81.77 - Puerto Rico Air Quality Control Region.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 18 2013-07-01 2013-07-01 false Puerto Rico Air Quality Control Region... PROGRAMS (CONTINUED) DESIGNATION OF AREAS FOR AIR QUALITY PLANNING PURPOSES Designation of Air Quality Control Regions § 81.77 Puerto Rico Air Quality Control Region. The Puerto Rico Air Quality Control Region...

  6. Improving Escalation of Care: Development and Validation of the Quality of Information Transfer Tool.

    PubMed

    Johnston, Maximilian J; Arora, Sonal; Pucher, Philip H; Reissis, Yannis; Hull, Louise; Huddy, Jeremy R; King, Dominic; Darzi, Ara

    2016-03-01

    To develop and provide validity and feasibility evidence for the QUality of Information Transfer (QUIT) tool. Prompt escalation of care in the setting of patient deterioration can prevent further harm. Escalation and information transfer skills are not currently measured in surgery. This study comprised 3 phases: the development (phase 1), validation (phase 2), and feasibility analysis (phase 3) of the QUIT tool. Phase 1 involved identification of core skills needed for successful escalation of care through literature review and 33 semistructured interviews with stakeholders. Phase 2 involved the generation of validity evidence for the tool using a simulated setting. Thirty surgeons assessed a deteriorating postoperative patient in a simulated ward and escalated their care to a senior colleague. The face and content validity were assessed using a survey. Construct and concurrent validity of the tool were determined by comparing performance scores using the QUIT tool with those measured using the Situation-Background-Assessment-Recommendation (SBAR) tool. Phase 3 was conducted using direct observation of escalation scenarios on surgical wards in 2 hospitals. A 7-category assessment tool was developed from phase 1 consisting of 24 items. Twenty-one of 24 items had excellent content validity (content validity index >0.8). All 7 categories and 18 of 24 (P < 0.05) items demonstrated construct validity. The correlation between the QUIT and SBAR tools used was strong indicating concurrent validity (r = 0.694, P < 0.001). Real-time scoring of escalation referrals was feasible and indicated that doctors currently have better information transfer skills than nurses when faced with a deteriorating patient. A validated tool to assess information transfer for deteriorating surgical patients was developed and tested using simulation and real-time clinical scenarios. It may improve the quality and safety of patient care on the surgical ward.

  7. 21 CFR 111.105 - What must quality control personnel do?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 2 2010-04-01 2010-04-01 false What must quality control personnel do? 111.105..., LABELING, OR HOLDING OPERATIONS FOR DIETARY SUPPLEMENTS Production and Process Control System: Requirements for Quality Control § 111.105 What must quality control personnel do? Quality control personnel must...

  8. Information security system quality assessment through the intelligent tools

    NASA Astrophysics Data System (ADS)

    Trapeznikov, E. V.

    2018-04-01

    The development of technology has shown the need for a comprehensive analysis of the information security of automated systems. Analysis of the subject area indicates the relevance of this study. The research objective is to develop a methodology for assessing the quality of information security systems based on intelligent tools. The basis of the methodology is a model that assesses the information security of an information system through a neural network. The paper presents the security assessment model and its algorithm. The results of the practical implementation of the methodology are presented in the form of a software flow diagram. The practical significance of the model being developed is noted in the conclusions.
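
    As a loosely analogous illustration only (the record does not publish its network), the sketch below trains a small neural-network classifier to map a vector of binary security-control indicators to an adequacy label, using synthetic data and scikit-learn.

      import numpy as np
      from sklearn.neural_network import MLPClassifier

      rng = np.random.default_rng(1)
      X = rng.integers(0, 2, size=(200, 10)).astype(float)    # 10 hypothetical control indicators
      y = (X.sum(axis=1) >= 6).astype(int)                    # toy "adequate security" label

      clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=1)
      clf.fit(X, y)
      print("assessment for an example system:", clf.predict([[1, 1, 1, 0, 1, 1, 0, 1, 1, 0]]))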

  9. DengueTools: innovative tools and strategies for the surveillance and control of dengue.

    PubMed

    Wilder-Smith, Annelies; Renhorn, Karl-Erik; Tissera, Hasitha; Abu Bakar, Sazaly; Alphey, Luke; Kittayapong, Pattamaporn; Lindsay, Steve; Logan, James; Hatz, Christoph; Reiter, Paul; Rocklöv, Joacim; Byass, Peter; Louis, Valérie R; Tozan, Yesim; Massad, Eduardo; Tenorio, Antonio; Lagneau, Christophe; L'Ambert, Grégory; Brooks, David; Wegerdt, Johannah; Gubler, Duane

    2012-01-01

    Dengue fever is a mosquito-borne viral disease estimated to cause about 230 million infections worldwide every year, of which 25,000 are fatal. Global incidence has risen rapidly in recent decades with some 3.6 billion people, over half of the world's population, now at risk, mainly in urban centres of the tropics and subtropics. Demographic and societal changes, in particular urbanization, globalization, and increased international travel, are major contributors to the rise in incidence and geographic expansion of dengue infections. Major research gaps continue to hamper the control of dengue. The European Commission launched a call under the 7th Framework Programme with the title of 'Comprehensive control of Dengue fever under changing climatic conditions'. Fourteen partners from several countries in Europe, Asia, and South America formed a consortium named 'DengueTools' to respond to the call to achieve better diagnosis, surveillance, prevention, and predictive models and improve our understanding of the spread of dengue to previously uninfected regions (including Europe) in the context of globalization and climate change. The consortium comprises 12 work packages to address a set of research questions in three areas: Research area 1: Develop a comprehensive early warning and surveillance system that has predictive capability for epidemic dengue and benefits from novel tools for laboratory diagnosis and vector monitoring. Research area 2: Develop novel strategies to prevent dengue in children. Research area 3: Understand and predict the risk of global spread of dengue, in particular the risk of introduction and establishment in Europe, within the context of parameters of vectorial capacity, global mobility, and climate change. In this paper, we report on the rationale and specific study objectives of 'DengueTools'. DengueTools is funded under the Health theme of the Seventh Framework Programme of the European Community, Grant Agreement Number: 282589 Dengue Tools.

  10. A Total Quality-Control Plan with Right-Sized Statistical Quality-Control.

    PubMed

    Westgard, James O

    2017-03-01

    A new Clinical Laboratory Improvement Amendments option for risk-based quality-control (QC) plans became effective in January, 2016. Called an Individualized QC Plan, this option requires the laboratory to perform a risk assessment, develop a QC plan, and implement a QC program to monitor ongoing performance of the QC plan. Difficulties in performing a risk assessment may limit validity of an Individualized QC Plan. A better alternative is to develop a Total QC Plan including a right-sized statistical QC procedure to detect medically important errors. Westgard Sigma Rules provides a simple way to select the right control rules and the right number of control measurements. Copyright © 2016 Elsevier Inc. All rights reserved.
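
    The "right-sizing" step rests on the Sigma metric, sigma = (TEa - |bias|) / CV, with all terms in percent. The sketch below computes it and maps it to a coarse rule-selection hint; the cutoffs are a paraphrase of common guidance and are illustrative, not the published Westgard Sigma Rules diagram.

      def sigma_metric(tea_pct, bias_pct, cv_pct):
          # Allowable total error minus bias, expressed in multiples of the imprecision.
          return (tea_pct - abs(bias_pct)) / cv_pct

      def suggested_qc(sigma):
          if sigma >= 6:
              return "single control rule (e.g., 1:3s) with few control measurements"
          if sigma >= 5:
              return "a small multirule with a modest number of controls"
          if sigma >= 4:
              return "full multirule with more controls per run"
          return "maximum QC plus effort to improve the method"

      s = sigma_metric(tea_pct=10.0, bias_pct=1.5, cv_pct=2.0)   # example assay figures
      print(f"sigma = {s:.1f} -> {suggested_qc(s)}")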

  11. The evolution of clinical audit as a tool for quality improvement.

    PubMed

    Berk, Michael; Callaly, Thomas; Hyland, Mary

    2003-05-01

    Clinical auditing practices are recognized universally as a useful tool in evaluating and improving the quality of care provided by a health service. External auditing is a regular activity for mental health services in Australia but internal auditing activities are conducted at the discretion of each service. This paper evaluates the effectiveness of 6 years of internal auditing activities in a mental health service. A review of the scope, audit tools, purpose, sampling and design of the internal audits and identification of the recommendations from six consecutive annual audit reports was completed. Audit recommendations were examined, as well as levels of implementation and reasons for success or failure. Fifty-seven recommendations were identified, with 35% without action, 28% implemented and 33.3% still pending or in progress. The recommendations were more likely to be implemented if they relied on activity, planning and action across a selection of service areas rather than being restricted to individual departments within a service, if they did not involve non-mental health service departments and if they were not reliant on attitudinal change. Tools used, scope and reporting formats have become more sophisticated as part of the evolutionary nature of the auditing process. Internal auditing in the Barwon Health Mental Health Service has been effective in producing change in the quality of care across the organization. A number of evolutionary changes in the audit process have improved the efficiency and effectiveness of the audit.

  12. Using a statistical process control chart during the quality assessment of cancer registry data.

    PubMed

    Myles, Zachary M; German, Robert R; Wilson, Reda J; Wu, Manxia

    2011-01-01

    Statistical process control (SPC) charts may be used to detect acute variations in the data while simultaneously evaluating unforeseen aberrations that may warrant further investigation by the data user. Using cancer stage data captured by the Summary Stage 2000 (SS2000) variable, we sought to present a brief report highlighting the utility of the SPC chart during the quality assessment of cancer registry data. Using a county-level caseload for the diagnosis period of 2001-2004 (n=25,648), we found that the overall variation of the SS2000 variable was in control during the 2001 and 2002 diagnosis years, exceeded the lower control limit (LCL) in 2003, and exceeded the upper control limit (UCL) in 2004; in situ/localized stages were in control throughout the diagnosis period, regional stage exceeded the UCL in 2004, and distant stage exceeded the LCL in 2001 and the UCL in 2004. Our application of the SPC chart with cancer registry data illustrates that the SPC chart may serve as a readily available and timely tool for identifying areas of concern during the data collection and quality assessment of central cancer registry data.
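
    A companion sketch for categorical stage data is a p-chart on the share of cases coded to one stage category per diagnosis year, as below. The counts are invented to mimic the pattern in the abstract (a 2004 excursion above the upper limit) and are not the study's data.

      import numpy as np

      cases_total = np.array([6300, 6450, 6400, 6498])     # cases per diagnosis year, 2001-2004
      cases_distant = np.array([1200, 1230, 1150, 1400])   # cases coded to distant stage

      p = cases_distant / cases_total
      p_bar = cases_distant.sum() / cases_total.sum()      # centerline
      se = np.sqrt(p_bar * (1 - p_bar) / cases_total)
      ucl, lcl = p_bar + 3 * se, p_bar - 3 * se

      for year, pi, lo, hi in zip(range(2001, 2005), p, lcl, ucl):
          status = "in control" if lo <= pi <= hi else "out of control"
          print(f"{year}: p = {pi:.3f}, limits [{lo:.3f}, {hi:.3f}] -> {status}")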

  13. Computer Controlled Optical Surfacing With Orbital Tool Motion

    NASA Astrophysics Data System (ADS)

    Jones, Robert A.

    1985-10-01

    Asymmetric aspheric optical surfaces are very difficult to fabricate using classical techniques and laps the same size as the workpiece. Opticians can produce such surfaces by grinding and polishing, using small laps with orbital tool motion. However, hand correction is a time consuming process unsuitable for large optical elements. Itek has developed Computer Controlled Optical Surfacing (CCOS) for fabricating such aspheric optics. Automated equipment moves a nonrotating orbiting tool slowly over the workpiece surface. The process corrects low frequency surface errors by figuring. The velocity of the tool assembly over the workpiece surface is purposely varied. Since the amount of material removal is proportional to the polishing or grinding time, accurate control over material removal is achieved. The removal of middle and high frequency surface errors is accomplished by pad smoothing. For a soft pad material, the pad will compress to fit the workpiece surface producing greater pressure and more removal at the surface high areas. A harder pad will ride on only the high regions resulting in removal only for those locations.
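
    The figuring step rests on a simple proportionality: material removal at a point scales with how long the tool dwells there, so the commanded feed is made inversely proportional to the required removal. The numbers below are illustrative, and the sketch ignores the convolution with the tool influence function that a real CCOS run would include.

      import numpy as np

      surface_error_nm = np.array([80.0, 120.0, 60.0, 150.0, 90.0])   # required removal per zone
      removal_rate_nm_per_s = 2.0                                     # assumed tool removal rate
      zone_width_mm = 10.0

      dwell_s = surface_error_nm / removal_rate_nm_per_s              # time needed in each zone
      feed_mm_per_s = zone_width_mm / dwell_s                         # slower feed where more removal is needed

      for i, (t, v) in enumerate(zip(dwell_s, feed_mm_per_s)):
          print(f"zone {i}: dwell {t:5.1f} s, feed {v:5.2f} mm/s")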

  14. Sequence quality analysis tool for HIV type 1 protease and reverse transcriptase.

    PubMed

    Delong, Allison K; Wu, Mingham; Bennett, Diane; Parkin, Neil; Wu, Zhijin; Hogan, Joseph W; Kantor, Rami

    2012-08-01

    Access to antiretroviral therapy is increasing globally and drug resistance evolution is anticipated. Currently, protease (PR) and reverse transcriptase (RT) sequence generation is increasing, including the use of in-house sequencing assays, and quality assessment prior to sequence analysis is essential. We created a computational HIV PR/RT Sequence Quality Analysis Tool (SQUAT) that runs in the R statistical environment. Sequence quality thresholds are calculated from a large dataset (46,802 PR and 44,432 RT sequences) from the published literature ( http://hivdb.Stanford.edu ). Nucleic acid sequences are read into SQUAT, identified, aligned, and translated. Nucleic acid sequences are flagged if they have >five 1-2-base insertions; >one 3-base insertion; >one deletion; >six PR or >18 RT ambiguous bases; >three consecutive PR or >four RT nucleic acid mutations; >zero stop codons; >three PR or >six RT ambiguous amino acids; >three consecutive PR or >four RT amino acid mutations; >zero unique amino acids; or <0.5% or >15% genetic distance from another submitted sequence. Thresholds are user modifiable. SQUAT output includes a summary report with detailed comments for troubleshooting of flagged sequences, histograms of pairwise genetic distances, neighbor joining phylogenetic trees, and aligned nucleic and amino acid sequences. SQUAT is a stand-alone, free, web-independent tool to ensure use of high-quality HIV PR/RT sequences in interpretation and reporting of drug resistance, while increasing awareness and expertise and facilitating troubleshooting of potentially problematic sequences.
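
    Threshold-based flagging of the kind listed above can be sketched in a few lines; the thresholds and the synthetic sequence below are examples in the spirit of the tool, not SQUAT's exact defaults or data.

      STOP_CODONS = {"TAA", "TAG", "TGA"}

      def flag_sequence(nt_seq, max_ambiguous=6):
          nt_seq = nt_seq.upper()
          ambiguous = sum(1 for b in nt_seq if b not in "ACGT")            # mixed-base calls
          codons = [nt_seq[i:i + 3] for i in range(0, len(nt_seq) - 2, 3)]
          stops = sum(1 for c in codons if c in STOP_CODONS)
          flags = []
          if ambiguous > max_ambiguous:
              flags.append(f"{ambiguous} ambiguous bases")
          if stops > 0:
              flags.append(f"{stops} in-frame stop codon(s)")
          return flags or ["passes these checks"]

      print(flag_sequence("ATGTAARRNNYYWWSSKKM"))   # synthetic sequence with obvious problems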

  15. A Real-Time Tool Positioning Sensor for Machine-Tools

    PubMed Central

    Ruiz, Antonio Ramon Jimenez; Rosas, Jorge Guevara; Granja, Fernando Seco; Honorato, Jose Carlos Prieto; Taboada, Jose Juan Esteve; Serrano, Vicente Mico; Jimenez, Teresa Molina

    2009-01-01

    In machining, natural oscillations and elastic, gravitational or temperature-induced deformations are still a problem for guaranteeing the quality of fabricated parts. In this paper we present an optical measurement system designed to track and localize in 3D a reference retro-reflector close to the machine-tool's drill. The complete system and its components are described in detail. Several tests, some static (including impacts and rotations) and others dynamic (executing linear and circular trajectories), were performed on two different machine tools. For the first time, a laser tracking system has been integrated into the position control loop of a machine tool. Results indicate that oscillations and deformations close to the tool can be estimated with micrometric resolution and a bandwidth from 0 to more than 100 Hz. This sensor therefore opens the possibility of on-line compensation of oscillations and deformations. PMID:22408472

  16. Measuring the Quality of Early Childhood Programs--Guidelines for Effective Evaluation Tools.

    ERIC Educational Resources Information Center

    Epstein, Ann S.

    2000-01-01

    Summarizes what High/Scope discovered to be the critical characteristics of a comprehensive and valid measure of early childhood program quality. Provides suggestions for how the tool can be used, and highlights with examples. Asserts that the guidelines effectively assess efforts of child development, staff development, and soundness of…

  17. Quality Appraisal of Single-Subject Experimental Designs: An Overview and Comparison of Different Appraisal Tools

    ERIC Educational Resources Information Center

    Wendt, Oliver; Miller, Bridget

    2012-01-01

    Critical appraisal of the research literature is an essential step in informing and implementing evidence-based practice. Quality appraisal tools that assess the methodological quality of experimental studies provide a means to identify the most rigorous research suitable for evidence-based decision-making. In single-subject experimental research,…

  18. Electronic Nose for Quality Control of Colombian Coffee through the Detection of Defects in “Cup Tests”

    PubMed Central

    Rodríguez, Juan; Durán, Cristhian; Reyes, Adriana

    2010-01-01

    Electronic noses (ENs) are used for many applications, but we must emphasize the importance of their application to foodstuffs like coffee. This paper presents a research study about the analysis of Colombian coffee samples for the detection and classification of defects (i.e., using “Cup Tests”), which was conducted at the Almacafé quality control laboratory in Cúcuta, Colombia. The results obtained show that the application of an electronic nose called “A-NOSE” may be used in the coffee industry for cupping tests. The results show that e-nose technology can be a useful tool for quality control to evaluate the excellence of the Colombian coffee produced by the National Federation of Coffee Growers. PMID:22315525

  19. 18 CFR 12.40 - Quality control programs.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... PROJECT WORKS Other Responsibilities of Applicant or Licensee § 12.40 Quality control programs. (a) General rule. During any construction, repair, or modification of project works, including any corrective... 18 Conservation of Power and Water Resources 1 2014-04-01 2014-04-01 false Quality control...

  20. 18 CFR 12.40 - Quality control programs.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... PROJECT WORKS Other Responsibilities of Applicant or Licensee § 12.40 Quality control programs. (a) General rule. During any construction, repair, or modification of project works, including any corrective... 18 Conservation of Power and Water Resources 1 2013-04-01 2013-04-01 false Quality control...

  1. 18 CFR 12.40 - Quality control programs.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... PROJECT WORKS Other Responsibilities of Applicant or Licensee § 12.40 Quality control programs. (a) General rule. During any construction, repair, or modification of project works, including any corrective... 18 Conservation of Power and Water Resources 1 2012-04-01 2012-04-01 false Quality control...

  2. 46 CFR 164.019-13 - Production quality control requirements.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 46 Shipping 6 2011-10-01 2011-10-01 false Production quality control requirements. 164.019-13....019-13 Production quality control requirements. (a) General. Each component manufacturer shall establish procedures for maintaining quality control of the materials used in production, manufacturing...

  3. 46 CFR 164.019-13 - Production quality control requirements.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 46 Shipping 6 2012-10-01 2012-10-01 false Production quality control requirements. 164.019-13....019-13 Production quality control requirements. (a) General. Each component manufacturer shall establish procedures for maintaining quality control of the materials used in production, manufacturing...

  4. 46 CFR 164.019-13 - Production quality control requirements.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 46 Shipping 6 2010-10-01 2010-10-01 false Production quality control requirements. 164.019-13....019-13 Production quality control requirements. (a) General. Each component manufacturer shall establish procedures for maintaining quality control of the materials used in production, manufacturing...

  5. 46 CFR 164.019-13 - Production quality control requirements.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 46 Shipping 6 2014-10-01 2014-10-01 false Production quality control requirements. 164.019-13....019-13 Production quality control requirements. (a) General. Each component manufacturer shall establish procedures for maintaining quality control of the materials used in production, manufacturing...

  6. 46 CFR 164.019-13 - Production quality control requirements.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 46 Shipping 6 2013-10-01 2013-10-01 false Production quality control requirements. 164.019-13....019-13 Production quality control requirements. (a) General. Each component manufacturer shall establish procedures for maintaining quality control of the materials used in production, manufacturing...

  7. Performance measurement: A tool for program control

    NASA Technical Reports Server (NTRS)

    Abell, Nancy

    1994-01-01

    Performance measurement is a management tool for planning, monitoring, and controlling all aspects of program and project management--cost, schedule, and technical requirements. It is a means (concept and approach) to a desired end (effective program planning and control). To reach the desired end, however, performance measurement must be applied and used appropriately, with full knowledge and recognition of its power and of its limitations--what it can and cannot do for the project manager. What is the potential of this management tool? What does performance measurement do that a traditional plan vs. actual technique cannot do? Performance measurement provides an improvement over the customary comparison of how much money was spent (actual cost) vs. how much was planned to be spent based on a schedule of activities (work planned). This commonly used plan vs. actual comparison does not allow one to know from the numerical data whether the actual cost incurred was for the work intended to be done.

  8. Comparative mass spectrometry & nuclear magnetic resonance metabolomic approaches for nutraceuticals quality control analysis: a brief review.

    PubMed

    Farag, Mohamed A

    2014-01-01

    The number of botanical dietary supplements in the market has recently increased primarily due to increased health awareness. Standardization and quality control of the constituents of these plant extracts is an important topic, particularly when such ingredients are used long term as dietary supplements, or in cases where higher doses are marketed as drugs. The development of fast, comprehensive, and effective untargeted analytical methods for plant extracts is of high interest. Nuclear magnetic resonance spectroscopy and mass spectrometry are the most informative tools, each of which enables high-throughput and global analysis of hundreds of metabolites in a single step. Although only one of the two techniques is utilized in the majority of plant metabolomics applications, there is a growing interest in combining the data from both platforms to effectively unravel the complexity of plant samples. The application of combined MS and NMR in the quality control of nutraceuticals forms the major part of this review. Finally I will look at the future developments and perspectives of these two technologies for the quality control of herbal materials.

  9. The development of a tool for assessing the quality of closed circuit camera footage for use in forensic gait analysis.

    PubMed

    Birch, Ivan; Vernon, Wesley; Walker, Jeremy; Saxelby, Jai

    2013-10-01

    Gait analysis from closed circuit camera footage is now commonly used as evidence in criminal trials. The biomechanical analysis of human gait is a well established science in both clinical and laboratory settings. However, closed circuit camera footage is rarely of the quality of that taken in the more controlled clinical and laboratory environments. The less than ideal quality of much of this footage for use in gait analysis is associated with a range of issues, the combination of which can often render the footage unsuitable for use in gait analysis. The aim of this piece of work was to develop a tool for assessing the suitability of closed circuit camera footage for the purpose of forensic gait analysis. A Delphi technique was employed with a small sample of expert forensic gait analysis practitioners, to identify key quality elements of CCTV footage used in legal proceedings. Five elements of the footage were identified and then subdivided into 15 contributing sub-elements, each of which was scored using a 5-point Likert scale. A Microsoft Excel worksheet was developed to calculate automatically an overall score from the fifteen sub-element scores. Five expert witnesses experienced in using CCTV footage for gait analysis then trialled the prototype tool on current case footage. A repeatability study was also undertaken using standardized CCTV footage. The results showed the tool to be a simple and repeatable means of assessing the suitability of closed circuit camera footage for use in forensic gait analysis. The inappropriate use of poor quality footage could lead to challenges to the practice of forensic gait analysis. All parties involved in criminal proceedings must therefore understand the fitness for purpose of any footage used. The development of this tool could offer a method of achieving this goal, and help to assure the continued role of forensic gait analysis as an aid to the identification process. Copyright © 2013 Elsevier Ltd and Faculty of
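    The Excel worksheet described above combines fifteen 5-point Likert sub-element scores into a single overall score. A minimal sketch of that kind of aggregation is shown below; the equal weighting, the 0-100 rescaling and the function name are assumptions for illustration, since the published tool's exact scoring rules are not reproduced here.

    ```python
    # Hedged sketch: aggregate 15 Likert-scale (1-5) sub-element scores into an
    # overall footage-quality score. The published tool is an Excel worksheet;
    # the equal weighting and the 0-100 rescaling used here are assumptions.

    def overall_footage_score(sub_scores):
        if len(sub_scores) != 15:
            raise ValueError("expected 15 sub-element scores")
        if any(s < 1 or s > 5 for s in sub_scores):
            raise ValueError("scores must be on a 1-5 Likert scale")
        # Rescale the mean score to 0-100 for easier comparison between cases.
        return round((sum(sub_scores) / len(sub_scores) - 1) / 4 * 100, 1)

    example = [4, 3, 5, 2, 4, 3, 3, 5, 4, 2, 3, 4, 4, 3, 5]
    print(overall_footage_score(example))  # 65.0
    ```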

  10. Quality by control: Towards model predictive control of mammalian cell culture bioprocesses.

    PubMed

    Sommeregger, Wolfgang; Sissolak, Bernhard; Kandra, Kulwant; von Stosch, Moritz; Mayer, Martin; Striedner, Gerald

    2017-07-01

    The industrial production of complex biopharmaceuticals using recombinant mammalian cell lines is still mainly built on a quality by testing approach, which is represented by fixed process conditions and extensive testing of the end-product. In 2004 the FDA launched the process analytical technology initiative, aiming to guide the industry towards advanced process monitoring and better understanding of how critical process parameters affect the critical quality attributes. Implementation of process analytical technology into the bio-production process enables moving from the quality by testing to a more flexible quality by design approach. The application of advanced sensor systems in combination with mathematical modelling techniques offers enhanced process understanding, allows on-line prediction of critical quality attributes and subsequently real-time product quality control. In this review opportunities and unsolved issues on the road to a successful quality by design and dynamic control implementation are discussed. A major focus is directed on the preconditions for the application of model predictive control for mammalian cell culture bioprocesses. Design of experiments providing information about the process dynamics upon parameter change, dynamic process models, on-line process state predictions and powerful software environments seem to be a prerequisite for quality by control realization. © 2017 The Authors. Biotechnology Journal published by WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
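    As a rough illustration of on-line prediction of a critical quality attribute (CQA) followed by a control decision, the sketch below uses a deliberately simple one-step-ahead model and picks the feed rate whose predicted CQA is closest to target. The linear model, all parameter values and the candidate feed rates are invented for illustration; they are not taken from the review.

    ```python
    # Minimal sketch of the "quality by control" idea: use a (here, deliberately
    # simple) process model to predict a critical quality attribute one step
    # ahead and pick the feed rate that keeps it closest to target. The linear
    # model and all parameter values are illustrative assumptions.

    def predict_cqa(current_cqa, feed_rate, gain=0.8, drift=-0.05):
        return current_cqa + gain * feed_rate + drift

    def choose_feed_rate(current_cqa, target_cqa, candidate_rates):
        # Enumerate candidate control moves and keep the one with the smallest
        # predicted deviation from target (a one-step-ahead, brute-force "MPC").
        return min(candidate_rates,
                   key=lambda u: abs(predict_cqa(current_cqa, u) - target_cqa))

    cqa, target = 1.0, 1.5
    rates = [0.0, 0.2, 0.4, 0.6, 0.8]
    best = choose_feed_rate(cqa, target, rates)
    print(best, predict_cqa(cqa, best))  # 0.6 and a predicted CQA of about 1.43
    ```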

  11. Analytical approaches to quality assurance and quality control in rangeland monitoring data

    USDA-ARS?s Scientific Manuscript database

    Producing quality data to support land management decisions is the goal of every rangeland monitoring program. However, the results of quality assurance (QA) and quality control (QC) efforts to improve data quality are rarely reported. The purpose of QA and QC is to prevent and describe non-sampling...

  12. Professional Development for Water Quality Control Personnel.

    ERIC Educational Resources Information Center

    Shepard, Clinton Lewis

    This study investigated the availability of professional development opportunities for water quality control personnel in the midwest. The major objective of the study was to establish a listing of educational opportunities for the professional development of water quality control personnel and to compare these with the opportunities technicians…

  13. SIMULATION TOOL KIT FOR INDOOR AIR QUALITY AND INHALATION EXPOSURE (IAQX) VERSION 1.0 USER'S GUIDE

    EPA Science Inventory

    The User's Guide describes a Microsoft Windows-based indoor air quality (IAQ) simulation software package named the Simulation Tool Kit for Indoor Air Quality and Inhalation Exposure, or IAQX for short. This software complements and supplements existing IAQ simulation programs and...

  14. Operations management tools to be applied for textile

    NASA Astrophysics Data System (ADS)

    Maralcan, A.; Ilhan, I.

    2017-10-01

    In this paper, basic concepts of process analysis such as flow time, inventory, bottleneck, labour cost and utilization are illustrated first. The effect of the bottleneck on the results of a business is especially emphasized. In the next section, tools for productivity measurement--the KPI (Key Performance Indicators) Tree, OEE (Overall Equipment Effectiveness) and Takt Time--are introduced and exemplified. A KPI tree is a diagram on which we can visualize all the variables of an operation that drive financial results through cost and profit. OEE is a tool to measure the potential extra capacity of a piece of equipment or an employee. Takt time is a tool to determine the process flow rate according to customer demand. The KPI tree is studied through the whole process, while OEE is exemplified for a stenter frame machine, which is the most important machine (and usually the bottleneck) and the most expensive investment in a finishing plant. Takt time is exemplified for the quality control department. Finally, quality tools--six sigma, control charts and jidoka--are introduced. Six sigma is a tool to measure process capability and thereby the probability of a defect. The control chart is a powerful tool to monitor the process. The idea of jidoka (detect, stop and alert) is about alerting people that there is a problem in the process.
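    The two productivity formulas exemplified in the paper are standard and easy to state in code. The sketch below computes OEE as the product of availability, performance and quality rates, and takt time as net available time divided by customer demand; the stenter-frame and demand numbers are invented for illustration.

    ```python
    # Hedged sketch of the two standard formulas the paper exemplifies. The
    # stenter-frame numbers below are invented for illustration.

    def oee(availability, performance, quality):
        """Overall Equipment Effectiveness = availability x performance x quality."""
        return availability * performance * quality

    def takt_time(available_time_min, customer_demand_units):
        """Takt time = net available production time / customer demand."""
        return available_time_min / customer_demand_units

    # Example: a stenter frame available 90% of the time, running at 95% of its
    # ideal speed, with 98% first-quality output.
    print(f"OEE = {oee(0.90, 0.95, 0.98):.2%}")             # 83.79%
    # Example: 420 min of net working time per shift to meet a demand of 60 lots.
    print(f"Takt time = {takt_time(420, 60):.1f} min/lot")  # 7.0 min/lot
    ```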

  15. A simple tool for neuroimaging data sharing

    PubMed Central

    Haselgrove, Christian; Poline, Jean-Baptiste; Kennedy, David N.

    2014-01-01

    Data sharing is becoming increasingly common, but despite encouragement and facilitation by funding agencies, journals, and some research efforts, most neuroimaging data acquired today is still not shared due to remaining political, financial, social, and technical barriers. In particular, technical solutions are few for researchers who are not part of larger efforts with dedicated sharing infrastructures, and social barriers such as the time commitment required to share can keep data from becoming publicly available. We present a system for sharing neuroimaging data, designed to be simple to use and to provide benefit to the data provider. The system consists of a server at the International Neuroinformatics Coordinating Facility (INCF) and user tools for uploading data to the server. The primary design principle for the user tools is ease of use: the user identifies a directory containing Digital Imaging and Communications in Medicine (DICOM) data, provides their INCF Portal authentication, and provides identifiers for the subject and imaging session. The user tool anonymizes the data and sends it to the server. The server then runs quality control routines on the data, and the data and the quality control reports are made public. The user retains control of the data and may change the sharing policy as they need. The result is that in a few minutes of the user's time, DICOM data can be anonymized and made publicly available, and an initial quality control assessment can be performed on the data. The system is currently functional, and user tools and access to the public image database are available at http://xnat.incf.org/. PMID:24904398
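    The client-side anonymization step described above (strip patient identifiers, relabel with subject and session identifiers) can be illustrated with a short pydicom sketch. This is not the INCF upload tool itself, and the helper name and the tags it clears are assumptions; a production tool would scrub a much longer list of identifying attributes.

    ```python
    # Hedged sketch of the client-side anonymization step described above. This
    # is not the INCF upload tool; it only illustrates stripping basic patient
    # identifiers from DICOM files with pydicom before data leave the site.
    from pathlib import Path
    import pydicom

    def anonymize_dicom_dir(src_dir, dst_dir, subject_id, session_id):
        dst = Path(dst_dir)
        dst.mkdir(parents=True, exist_ok=True)
        for path in sorted(Path(src_dir).glob("*.dcm")):
            ds = pydicom.dcmread(path)
            # Replace direct identifiers with the study-specific labels.
            ds.PatientName = f"{subject_id}_{session_id}"
            ds.PatientID = subject_id
            if "PatientBirthDate" in ds:
                ds.PatientBirthDate = ""
            ds.save_as(dst / path.name)

    # anonymize_dicom_dir("raw_dicom", "to_upload", "sub01", "ses01")
    ```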

  16. Assessment of tools for protection of quality of water: Uncontrollable discharges of pollutants.

    PubMed

    Dehghani Darmian, Mohsen; Hashemi Monfared, Seyed Arman; Azizyan, Gholamreza; Snyder, Shane A; Giesy, John P

    2018-06-06

    Selecting an appropriate crisis management plan during uncontrollable loading of pollution into water systems is crucial. In this research, the quality of water resources is protected against uncontrollable pollution by use of suitable tools. The case study chosen in this investigation was a river-reservoir system. Analytical and numerical solutions of the pollutant transport equation were considered as the simulation strategy to calculate the efficient tools to protect water quality. These practical instruments are dilution flow and a new tool called detention time, which is proposed and simulated for the first time in this study. For an uncontrollable pollution discharge that was approximately 130% of the river's assimilation capacity, as long as the duration of contact (Tc) was considered as a constraint, the unallowable pollution could be treated by releasing 30% of the base flow of the river from the upstream dilution reservoir. Moreover, when the affected distance (Xc) was selected as a constraint, the required detention time for which the rubber dam should detain the water to be treated was equal to 187% of the initial duration of contact. Copyright © 2018 Elsevier Inc. All rights reserved.
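    For readers unfamiliar with the underlying simulation strategy, the sketch below evaluates one widely used analytical solution of the one-dimensional advection-dispersion equation (the Ogata-Banks solution for a continuous upstream source). It is shown only to illustrate the kind of pollutant-transport calculation involved; the authors' exact formulation is not reproduced, and the parameter values are invented.

    ```python
    # Hedged sketch: the Ogata-Banks analytical solution of the 1-D
    # advection-dispersion equation for a continuous upstream source. Shown only
    # to illustrate the kind of pollutant-transport calculation the paper builds
    # on; the authors' exact formulation and parameters are not reproduced.
    import math

    def concentration(x, t, c0, velocity, dispersion):
        """Concentration at distance x (m) and time t (s) downstream of the source."""
        a = (x - velocity * t) / (2.0 * math.sqrt(dispersion * t))
        b = (x + velocity * t) / (2.0 * math.sqrt(dispersion * t))
        return 0.5 * c0 * (math.erfc(a) + math.exp(velocity * x / dispersion) * math.erfc(b))

    # Example: 10 mg/L source, 0.3 m/s velocity, 5 m^2/s dispersion, 1 km downstream.
    print(concentration(x=1000.0, t=3600.0, c0=10.0, velocity=0.3, dispersion=5.0))
    ```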

  17. Ride quality sensitivity to SAS control law and to handling quality variations

    NASA Technical Reports Server (NTRS)

    Roberts, P. A.; Schmidt, D. K.; Swaim, R. L.

    1976-01-01

    The RQ trends which large flexible aircraft exhibit under various parameterizations of control laws and handling qualities are discussed. A summary of the assumptions and solution technique, a control law parameterization review, a discussion of ride sensitivity to handling qualities, and the RQ effects generated by implementing relaxed static stability configurations are included.

  18. 14 CFR 21.147 - Changes in quality control system.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 14 Aeronautics and Space 1 2011-01-01 2011-01-01 false Changes in quality control system. 21.147 Section 21.147 Aeronautics and Space FEDERAL AVIATION ADMINISTRATION, DEPARTMENT OF TRANSPORTATION... quality control system. After the issue of a production certificate, each change to the quality control...

  19. 14 CFR 21.147 - Changes in quality control system.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 14 Aeronautics and Space 1 2010-01-01 2010-01-01 false Changes in quality control system. 21.147 Section 21.147 Aeronautics and Space FEDERAL AVIATION ADMINISTRATION, DEPARTMENT OF TRANSPORTATION... quality control system. After the issue of a production certificate, each change to the quality control...

  20. 21 CFR 111.117 - What quality control operations are required for equipment, instruments, and controls?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 2 2010-04-01 2010-04-01 false What quality control operations are required for equipment, instruments, and controls? 111.117 Section 111.117 Food and Drugs FOOD AND DRUG ADMINISTRATION... and Process Control System: Requirements for Quality Control § 111.117 What quality control operations...

  1. Comparison of tools for assessing the methodological quality of primary and secondary studies in health technology assessment reports in Germany.

    PubMed

    Dreier, Maren; Borutta, Birgit; Stahmeyer, Jona; Krauth, Christian; Walter, Ulla

    2010-06-14

    Health care policy background: Findings from scientific studies form the basis for evidence-based health policy decisions. Quality assessments to evaluate the credibility of study results are an essential part of health technology assessment reports and systematic reviews. Quality assessment tools (QAT) for assessing study quality examine to what extent study results are systematically distorted by confounding or bias (internal validity). The tools can be divided into checklists, scales and component ratings. What QAT are available to assess the quality of interventional studies or studies in the field of health economics, how do they differ from each other, and what conclusions can be drawn from these results for quality assessments? A systematic search of relevant databases from 1988 onwards is done, supplemented by screening of the references, of the HTA reports of the German Agency for Health Technology Assessment (DAHTA), and an internet search. The selection of relevant literature, the data extraction and the quality assessment are carried out by two independent reviewers. The substantive elements of the QAT are extracted using a modified criteria list consisting of items and domains specific to randomized trials, observational studies, diagnostic studies, systematic reviews and health economic studies. Based on the number of covered items and domains, more and less comprehensive QAT are distinguished. In order to exchange experiences regarding problems in the practical application of tools, a workshop is hosted. A total of eight systematic methodological reviews are identified, as well as 147 QAT: 15 for systematic reviews, 80 for randomized trials, 30 for observational studies, 17 for diagnostic studies and 22 for health economic studies. The tools vary considerably with regard to their content, performance and quality of operationalisation. Some tools include not only items of internal validity but also items of quality of reporting and

  2. Rapid evaluation and quality control of next generation sequencing data with FaQCs

    DOE PAGES

    Lo, Chien -Chi; Chain, Patrick S. G.

    2014-12-01

    Background: Next generation sequencing (NGS) technologies that parallelize the sequencing process and produce thousands to millions, or even hundreds of millions of sequences in a single sequencing run, have revolutionized genomic and genetic research. Because of the vagaries of any platform's sequencing chemistry, the experimental processing, machine failure, and so on, the quality of sequencing reads is never perfect, and often declines as the read is extended. These errors invariably affect downstream analysis/application and should therefore be identified early on to mitigate any unforeseen effects. Results: Here we present a novel FastQ Quality Control Software (FaQCs) that can rapidly process large volumes of data, and which improves upon previous solutions to monitor the quality and remove poor quality data from sequencing runs. Both the speed of processing and the memory footprint of storing all required information have been optimized via algorithmic and parallel processing solutions. The trimmed output compared side-by-side with the original data is part of the automated PDF output. We show how this tool can help data analysis by providing a few examples, including an increased percentage of reads recruited to references, improved single nucleotide polymorphism identification as well as de novo sequence assembly metrics. Conclusion: FaQCs combines several features of currently available applications into a single, user-friendly process, and includes additional unique capabilities such as filtering the PhiX control sequences, conversion of FASTQ formats, and multi-threading. The original data and trimmed summaries are reported within a variety of graphics and reports, providing a simple way to do data quality control and assurance.
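    One of the basic per-read operations such tools perform is 3'-end quality trimming by Phred score. The sketch below shows that operation in isolation; it is not the FaQCs algorithm, and the read, quality string and threshold are invented for illustration.

    ```python
    # Hedged sketch of one basic read-QC operation (3'-end quality trimming by
    # Phred score). This is not the FaQCs algorithm itself, just an illustration
    # of the kind of per-read processing such tools parallelize.

    def phred_scores(quality_string, offset=33):
        """Convert a FASTQ quality string (Sanger/Illumina 1.8+ encoding) to scores."""
        return [ord(ch) - offset for ch in quality_string]

    def trim_3prime(seq, qual, min_quality=20):
        """Trim low-quality bases from the 3' end until a base >= min_quality is found."""
        scores = phred_scores(qual)
        end = len(seq)
        while end > 0 and scores[end - 1] < min_quality:
            end -= 1
        return seq[:end], qual[:end]

    read = "ACGTACGTTG"
    qual = "IIIIIIII#!"   # last two bases have Phred scores 2 and 0
    print(trim_3prime(read, qual))  # ('ACGTACGT', 'IIIIIIII')
    ```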

  3. SOWFA Super-Controller: A High-Fidelity Tool for Evaluating Wind Plant Control Approaches

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fleming, P.; Gebraad, P.; van Wingerden, J. W.

    2013-01-01

    This paper presents a new tool for testing wind plant controllers in the Simulator for Offshore Wind Farm Applications (SOWFA). SOWFA is a high-fidelity simulator for the interaction between wind turbine dynamics and the fluid flow in a wind plant. The new super-controller testing environment in SOWFA allows for the implementation of the majority of the wind plant control strategies proposed in the literature.

  4. Using an Accountability Tool to Improve the Quality of Outcomes on Individual Family Service Plans

    ERIC Educational Resources Information Center

    Votava, Kristen M.; Johnson, Carol; Chiasson, Kari

    2011-01-01

    This study investigated using a state's Part C early intervention accountability tool to increase the number of outcomes meeting compliance within IFSPs. The Case Review Tool (CRT) was used to examine differences from year one to year three on three measures of quality outcomes. There was no evidence of change in two of the measures, but there was…

  5. Colorado Air Quality Control Regulations and Ambient Air Quality Standards.

    ERIC Educational Resources Information Center

    Colorado State Dept. of Health, Denver. Div. of Air Pollution Control.

    Regulations and standards relative to air quality control in Colorado are defined in this publication. Presented first are definitions of terms, a statement of intent, and general provisions applicable to all emission control regulations adopted by the Colorado Air Pollution Control Commission. Following this, three regulations are enumerated: (1)…

  6. 46 CFR 164.120-11 - Production quality control requirements.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 46 Shipping 6 2014-10-01 2014-10-01 false Production quality control requirements. 164.120-11... Rescue Boats § 164.120-11 Production quality control requirements. The resin manufacturer must institute a quality control procedure to ensure that all Coast Guard-accepted resin is produced to the same...

  7. 46 CFR 164.120-11 - Production quality control requirements.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 46 Shipping 6 2012-10-01 2012-10-01 false Production quality control requirements. 164.120-11... Rescue Boats § 164.120-11 Production quality control requirements. The resin manufacturer must institute a quality control procedure to ensure that all Coast Guard-accepted resin is produced to the same...

  8. 46 CFR 164.120-11 - Production quality control requirements.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 46 Shipping 6 2013-10-01 2013-10-01 false Production quality control requirements. 164.120-11... Rescue Boats § 164.120-11 Production quality control requirements. The resin manufacturer must institute a quality control procedure to ensure that all Coast Guard-accepted resin is produced to the same...

  9. Ecological Indication, Bioaccumulation, and Phytoremediation as Tools for Environmental Quality Management

    DTIC Science & Technology

    2004-12-01

    ELLY P. H. BEST, HENRY E. TATEM...subsequent transport to shoots, and degradation, or prevent contaminants from leaving the site in whatever form, such as leachate, runoff, trophic...transfer (phytoremediation). We use risk assessment to evaluate the toxicity and need for cleanup. Cleanup costs are expected to greatly exceed the cost

  10. SMOQ: a tool for predicting the absolute residue-specific quality of a single protein model with support vector machines

    PubMed Central

    2014-01-01

    Background It is important to predict the quality of a protein structural model before its native structure is known. The method that can predict the absolute local quality of individual residues in a single protein model is rare, yet particularly needed for using, ranking and refining protein models. Results We developed a machine learning tool (SMOQ) that can predict the distance deviation of each residue in a single protein model. SMOQ uses support vector machines (SVM) with protein sequence and structural features (i.e. basic feature set), including amino acid sequence, secondary structures, solvent accessibilities, and residue-residue contacts to make predictions. We also trained a SVM model with two new additional features (profiles and SOV scores) on 20 CASP8 targets and found that including them can only improve the performance when real deviations between native and model are higher than 5Å. The SMOQ tool finally released uses the basic feature set trained on 85 CASP8 targets. Moreover, SMOQ implemented a way to convert predicted local quality scores into a global quality score. SMOQ was tested on the 84 CASP9 single-domain targets. The average difference between the residue-specific distance deviation predicted by our method and the actual distance deviation on the test data is 2.637Å. The global quality prediction accuracy of the tool is comparable to other good tools on the same benchmark. Conclusion SMOQ is a useful tool for protein single model quality assessment. Its source code and executable are available at: http://sysbio.rnet.missouri.edu/multicom_toolbox/. PMID:24776231

  11. SMOQ: a tool for predicting the absolute residue-specific quality of a single protein model with support vector machines.

    PubMed

    Cao, Renzhi; Wang, Zheng; Wang, Yiheng; Cheng, Jianlin

    2014-04-28

    It is important to predict the quality of a protein structural model before its native structure is known. The method that can predict the absolute local quality of individual residues in a single protein model is rare, yet particularly needed for using, ranking and refining protein models. We developed a machine learning tool (SMOQ) that can predict the distance deviation of each residue in a single protein model. SMOQ uses support vector machines (SVM) with protein sequence and structural features (i.e. basic feature set), including amino acid sequence, secondary structures, solvent accessibilities, and residue-residue contacts to make predictions. We also trained a SVM model with two new additional features (profiles and SOV scores) on 20 CASP8 targets and found that including them can only improve the performance when real deviations between native and model are higher than 5Å. The SMOQ tool finally released uses the basic feature set trained on 85 CASP8 targets. Moreover, SMOQ implemented a way to convert predicted local quality scores into a global quality score. SMOQ was tested on the 84 CASP9 single-domain targets. The average difference between the residue-specific distance deviation predicted by our method and the actual distance deviation on the test data is 2.637Å. The global quality prediction accuracy of the tool is comparable to other good tools on the same benchmark. SMOQ is a useful tool for protein single model quality assessment. Its source code and executable are available at: http://sysbio.rnet.missouri.edu/multicom_toolbox/.
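    The core idea, regressing a per-residue distance deviation from numeric structural features with a support vector machine, can be sketched with scikit-learn as below. The features and training data are synthetic placeholders; SMOQ's real feature encoding (sequence profiles, secondary structure, solvent accessibility, contacts) and its trained models are not reproduced here.

    ```python
    # Hedged sketch of the underlying idea: regress per-residue distance
    # deviation from numeric structural features with a support vector machine.
    # The features and random training data below are placeholders only.
    import numpy as np
    from sklearn.svm import SVR

    rng = np.random.default_rng(0)
    n_residues, n_features = 200, 6          # e.g. windowed structural features
    X = rng.normal(size=(n_residues, n_features))
    # Synthetic "true" deviations (in angstroms) for the sake of a runnable example.
    y = 2.0 + X[:, 0] - 0.5 * X[:, 1] + rng.normal(scale=0.3, size=n_residues)

    model = SVR(kernel="rbf", C=10.0, epsilon=0.2).fit(X[:150], y[:150])
    pred = model.predict(X[150:])
    print("mean absolute error (A):", np.mean(np.abs(pred - y[150:])).round(2))
    ```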

  12. The use of UV-visible reflectance spectroscopy as an objective tool to evaluate pearl quality.

    PubMed

    Agatonovic-Kustrin, Snezana; Morton, David W

    2012-07-01

    Assessing the quality of pearls involves the use of various tools and methods, which are mainly visual and often quite subjective. Pearls are normally classified by origin and are then graded by luster, nacre thickness, surface quality, size, color and shape. The aim of this study was to investigate the capacity of Artificial Neural Networks (ANNs) to classify and estimate the quality of 27 different pearls from their UV-Visible spectra. Due to the opaque nature of pearls, spectroscopy measurements were performed using the Diffuse Reflectance UV-Visible spectroscopy technique. The spectra were acquired at two different locations on each pearl sample in order to assess surface homogeneity. The spectral data (inputs) were smoothed to reduce the noise, fed into ANNs and correlated to the pearl's quality/grading criteria (outputs). The developed ANNs were successful in predicting pearl type, mollusk growing species, possible luster and color enhancing, donor condition/type, recipient/host color, donor color, pearl luster, pearl color, origin. The results of this study shows that the developed UV-Vis spectroscopy-ANN method could be used as a more objective method of assessing pearl quality (grading) and may become a valuable tool for the pearl grading industry.
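    The general approach, feeding smoothed reflectance spectra to a neural network that predicts a grading attribute, can be sketched as below with a small multilayer perceptron. The synthetic spectra, the network size and the binary "luster" label are placeholders, not the study's data or ANN architecture.

    ```python
    # Hedged sketch of the general approach: feed smoothed reflectance spectra to
    # a neural network that predicts a grading attribute (here, a binary "luster"
    # class). The synthetic spectra and class labels are placeholders.
    import numpy as np
    from sklearn.neural_network import MLPClassifier

    rng = np.random.default_rng(1)
    n_pearls, n_wavelengths = 54, 120        # e.g. 27 pearls x 2 measurement spots
    X = rng.normal(size=(n_pearls, n_wavelengths)).cumsum(axis=1)  # smooth-ish curves
    y = (X[:, 60] > X[:, 60].mean()).astype(int)  # stand-in for a luster grade

    clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=1)
    clf.fit(X[:40], y[:40])
    print("held-out accuracy:", clf.score(X[40:], y[40:]))
    ```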

  13. Quality Control Technician Curriculum. An Elusive Butterfly.

    ERIC Educational Resources Information Center

    Holler, Michael

    Defining and developing a quality control technician curriculum for an associate degree program is a difficult and puzzling job. There are as many definitions of quality control and curriculum ideas as there are educators asked. However, one could start by dividing the field into its major areas--heavy manufacturing, maintenance, research, and…

  14. Validation of the ICU-DaMa tool for automatically extracting variables for minimum dataset and quality indicators: The importance of data quality assessment.

    PubMed

    Sirgo, Gonzalo; Esteban, Federico; Gómez, Josep; Moreno, Gerard; Rodríguez, Alejandro; Blanch, Lluis; Guardiola, Juan José; Gracia, Rafael; De Haro, Lluis; Bodí, María

    2018-04-01

    Big data analytics promise insights into healthcare processes and management, improving outcomes while reducing costs. However, data quality is a major challenge for reliable results. Business process discovery techniques and an associated data model were used to develop a data management tool, ICU-DaMa, for extracting variables essential for overseeing the quality of care in the intensive care unit (ICU). To determine the feasibility of using ICU-DaMa to automatically extract variables for the minimum dataset and ICU quality indicators from the clinical information system (CIS). The Wilcoxon signed-rank test and Fisher's exact test were used to compare the values extracted from the CIS with ICU-DaMa for 25 variables from all patients treated in a polyvalent ICU during a two-month period against the gold standard of values manually extracted by two trained physicians. Discrepancies with the gold standard were classified into plausibility, conformance, and completeness errors. Data from 149 patients were included. Although there were no significant differences between the automatic method and the manual method, we detected differences in values for five variables, including one plausibility error and two conformance and completeness errors. Plausibility: 1) Sex, ICU-DaMa incorrectly classified one male patient as female (error generated by the Hospital's Admissions Department). Conformance: 2) Reason for isolation, ICU-DaMa failed to detect a human error in which a professional misclassified a patient's isolation. 3) Brain death, ICU-DaMa failed to detect another human error in which a professional likely entered two mutually exclusive values related to the death of the patient (brain death and controlled donation after circulatory death). Completeness: 4) Destination at ICU discharge, ICU-DaMa incorrectly classified two patients due to a professional failing to fill out the patient discharge form when the patients died. 5) Length of continuous renal replacement

  15. A multiple objective optimization approach to quality control

    NASA Technical Reports Server (NTRS)

    Seaman, Christopher Michael

    1991-01-01

    The use of product quality as the performance criterion for manufacturing system control is explored. The goal in manufacturing, for economic reasons, is to optimize product quality. The problem is that since quality is a rather nebulous product characteristic, there is seldom an analytic function that can be used as a measure. Therefore, standard control approaches, such as optimal control, cannot readily be applied. A second problem with optimizing product quality is that it is typically measured along many dimensions: there are many aspects of quality which must be optimized simultaneously. Very often these different aspects are incommensurate and competing. The concept of optimality must now include accepting tradeoffs among the different quality characteristics. These problems are addressed using multiple objective optimization. It is shown that the quality control problem can be defined as a multiple objective optimization problem. A controller structure is defined using this as the basis. Then, an algorithm is presented which can be used by an operator to interactively find the best operating point. Essentially, the algorithm uses process data to provide the operator with two pieces of information: (1) if it is possible to simultaneously improve all quality criteria, then determine what changes to the process input or controller parameters should be made to do this; and (2) if it is not possible to improve all criteria, and the current operating point is not a desirable one, select a criterion in which a tradeoff should be made, and make input changes to improve all other criteria. The process is not operating at an optimal point in any sense if no tradeoff has to be made to move to a new operating point. This algorithm ensures that operating points are optimal in some sense and provides the operator with information about tradeoffs when seeking the best operating point. The multiobjective algorithm was implemented in two different injection molding scenarios
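    The decision at the heart of the algorithm, whether a proposed input change improves every quality criterion simultaneously or forces a tradeoff, can be sketched as a simple Pareto check. The criteria names and numbers below are invented for illustration.

    ```python
    # Hedged sketch of the decision at the heart of the algorithm described
    # above: given quality criteria measured before and after a proposed input
    # change, report whether the change improves every criterion simultaneously
    # or forces a tradeoff. The criteria names and numbers are illustrative only.

    def classify_move(before, after):
        """All criteria are 'smaller is better' (e.g. deviations from target)."""
        improved = [name for name in before if after[name] < before[name]]
        worsened = [name for name in before if after[name] > before[name]]
        if improved and not worsened:
            return "simultaneous improvement: keep moving in this direction"
        if worsened:
            return f"tradeoff required: {', '.join(worsened)} would get worse"
        return "no change: current point may already be (Pareto) optimal"

    before = {"warpage_mm": 0.42, "sink_depth_mm": 0.10, "cycle_time_s": 31.0}
    after  = {"warpage_mm": 0.38, "sink_depth_mm": 0.09, "cycle_time_s": 33.5}
    print(classify_move(before, after))  # tradeoff required: cycle_time_s would get worse
    ```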

  16. 7 CFR 58.523 - Laboratory and quality control tests.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 7 Agriculture 3 2014-01-01 2014-01-01 false Laboratory and quality control tests. 58.523 Section... Service 1 Operations and Operating Procedures § 58.523 Laboratory and quality control tests. (a) Quality control tests shall be made on samples as often as necessary to determine the shelf-life and stability of...

  17. 7 CFR 58.523 - Laboratory and quality control tests.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 3 2010-01-01 2010-01-01 false Laboratory and quality control tests. 58.523 Section... Service 1 Operations and Operating Procedures § 58.523 Laboratory and quality control tests. (a) Quality control tests shall be made on samples as often as necessary to determine the shelf-life and stability of...

  18. 7 CFR 58.523 - Laboratory and quality control tests.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 3 2013-01-01 2013-01-01 false Laboratory and quality control tests. 58.523 Section... Service 1 Operations and Operating Procedures § 58.523 Laboratory and quality control tests. (a) Quality control tests shall be made on samples as often as necessary to determine the shelf-life and stability of...

  19. 7 CFR 58.523 - Laboratory and quality control tests.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 3 2011-01-01 2011-01-01 false Laboratory and quality control tests. 58.523 Section... Service 1 Operations and Operating Procedures § 58.523 Laboratory and quality control tests. (a) Quality control tests shall be made on samples as often as necessary to determine the shelf-life and stability of...

  20. 7 CFR 58.523 - Laboratory and quality control tests.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 7 Agriculture 3 2012-01-01 2012-01-01 false Laboratory and quality control tests. 58.523 Section... Service 1 Operations and Operating Procedures § 58.523 Laboratory and quality control tests. (a) Quality control tests shall be made on samples as often as necessary to determine the shelf-life and stability of...

  1. 21 CFR 640.56 - Quality control test for potency.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... quality control test for potency may be performed by a clinical laboratory which meets the standards of... 21 Food and Drugs 7 2010-04-01 2010-04-01 false Quality control test for potency. 640.56 Section...) BIOLOGICS ADDITIONAL STANDARDS FOR HUMAN BLOOD AND BLOOD PRODUCTS Cryoprecipitate § 640.56 Quality control...

  2. Web quality control for lectures: Supercourse and Amazon.com.

    PubMed

    Linkov, Faina; LaPorte, Ronald; Lovalekar, Mita; Dodani, Sunita

    2005-12-01

    Peer review has been the cornerstone of quality control for biomedical journals over the past 300 years. With the emergence of the Internet, new models of quality control and peer review are emerging. However, such models are poorly investigated. We would argue that the popular system of quality control used by Amazon.com offers a way to ensure continuous quality improvement in the area of research communications on the Internet. Such a system provides an interesting alternative to the traditional peer review approaches used in biomedical journals and challenges the traditional paradigms of scientific publishing. This idea is being explored in the context of Supercourse, a library of 2,350 prevention lectures, shared for free by faculty members from over 150 countries. Supercourse is successfully utilizing quality control approaches that are similar to the Amazon.com model. Clearly, the existing approaches and emerging alternatives for quality control in scientific communications need to be assessed scientifically. The rapid explosion of Internet technologies could be leveraged to produce better, more cost-effective systems for quality control in biomedical publications and across all sciences.

  3. 40 CFR 81.107 - Greenwood Intrastate Air Quality Control Region.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Quality Control Regions § 81.107 Greenwood Intrastate Air Quality Control Region. The Greenwood Intrastate Air Quality Control Region (South Carolina) consists of the territorial area encompassed by the... 40 Protection of Environment 17 2010-07-01 2010-07-01 false Greenwood Intrastate Air Quality...

  4. 40 CFR 81.108 - Columbia Intrastate Air Quality Control Region.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Quality Control Regions § 81.108 Columbia Intrastate Air Quality Control Region. The Columbia Intrastate Air Quality Control Region (South Carolina) consists of the territorial area encompassed by the... 40 Protection of Environment 17 2011-07-01 2011-07-01 false Columbia Intrastate Air Quality...

  5. 40 CFR 81.108 - Columbia Intrastate Air Quality Control Region.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Quality Control Regions § 81.108 Columbia Intrastate Air Quality Control Region. The Columbia Intrastate Air Quality Control Region (South Carolina) consists of the territorial area encompassed by the... 40 Protection of Environment 17 2010-07-01 2010-07-01 false Columbia Intrastate Air Quality...

  6. 40 CFR 81.111 - Georgetown Intrastate Air Quality Control Region.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Quality Control Regions § 81.111 Georgetown Intrastate Air Quality Control Region. The Georgetown Intrastate Air Quality Control Region (South Carolina) consists of the territorial area encompassed by the... 40 Protection of Environment 17 2011-07-01 2011-07-01 false Georgetown Intrastate Air Quality...

  7. 40 CFR 81.109 - Florence Intrastate Air Quality Control Region.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Quality Control Regions § 81.109 Florence Intrastate Air Quality Control Region. The Florence Intrastate Air Quality Control Region (South Carolina) consists of the territorial area encompassed by the... 40 Protection of Environment 17 2010-07-01 2010-07-01 false Florence Intrastate Air Quality...

  8. 40 CFR 81.111 - Georgetown Intrastate Air Quality Control Region.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Quality Control Regions § 81.111 Georgetown Intrastate Air Quality Control Region. The Georgetown Intrastate Air Quality Control Region (South Carolina) consists of the territorial area encompassed by the... 40 Protection of Environment 17 2010-07-01 2010-07-01 false Georgetown Intrastate Air Quality...

  9. 40 CFR 81.109 - Florence Intrastate Air Quality Control Region.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Quality Control Regions § 81.109 Florence Intrastate Air Quality Control Region. The Florence Intrastate Air Quality Control Region (South Carolina) consists of the territorial area encompassed by the... 40 Protection of Environment 17 2011-07-01 2011-07-01 false Florence Intrastate Air Quality...

  10. 40 CFR 81.107 - Greenwood Intrastate Air Quality Control Region.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Quality Control Regions § 81.107 Greenwood Intrastate Air Quality Control Region. The Greenwood Intrastate Air Quality Control Region (South Carolina) consists of the territorial area encompassed by the... 40 Protection of Environment 17 2011-07-01 2011-07-01 false Greenwood Intrastate Air Quality...

  11. Machine tools and fixtures: A compilation

    NASA Technical Reports Server (NTRS)

    1971-01-01

    As part of NASA's Technology Utilization Program, a compilation was made of technological developments regarding machine tools, jigs, and fixtures that have been produced, modified, or adapted to meet requirements of the aerospace program. The compilation is divided into three sections that include: (1) a variety of machine tool applications that offer easier and more efficient production techniques; (2) methods, techniques, and hardware that aid in the setup, alignment, and control of machines and machine tools to further quality assurance in finished products; and (3) jigs, fixtures, and adapters that are ancillary to basic machine tools and aid in realizing their greatest potential.

  12. Dietary Adherence Monitoring Tool for Free-living, Controlled Feeding Studies

    USDA-ARS?s Scientific Manuscript database

    Objective: To devise a dietary adherence monitoring tool for use in controlled human feeding trials involving free-living study participants. Methods: A scoring tool was devised to measure and track dietary adherence for an 8-wk randomized trial evaluating the effects of two different dietary patter...

  13. Contractor Performed Quality Control on KyTC Projects.

    DOT National Transportation Integrated Search

    2002-08-01

    This report addresses issues related to transferring the responsibility for quality control from the Kentucky Transportation Cabinet (KyTC) to construction contractors. Several key topics related to Contractor Performed Quality Control (CPQC) are p...

  14. [Pharmaceutical product quality control and good manufacturing practices].

    PubMed

    Hiyama, Yukio

    2010-01-01

    This report describes the roles of Good Manufacturing Practices (GMP) in pharmaceutical product quality control. There are three keys to pharmaceutical product quality control. They are specifications, thorough product characterization during development, and adherence to GMP as the ICH Q6A guideline on specifications provides the most important principles in its background section. Impacts of the revised Pharmaceutical Affairs Law (rPAL) which became effective in 2005 on product quality control are discussed. Progress of ICH discussion for Pharmaceutical Development (Q8), Quality Risk Management (Q9) and Pharmaceutical Quality System (Q10) are reviewed. In order to reconstruct GMP guidelines and GMP inspection system in the regulatory agencies under the new paradigm by rPAL and the ICH, a series of Health Science studies were conducted. For GMP guidelines, product GMP guideline, technology transfer guideline, laboratory control guideline and change control system guideline were written. For the GMP inspection system, inspection check list, inspection memo and inspection scenario were proposed also by the Health Science study groups. Because pharmaceutical products and their raw materials are manufactured and distributed internationally, collaborations with other national authorities are highly desired. In order to enhance the international collaborations, consistent establishment of GMP inspection quality system throughout Japan will be essential.

  15. Clinical decision support tools for osteoporosis disease management: a systematic review of randomized controlled trials.

    PubMed

    Kastner, Monika; Straus, Sharon E

    2008-12-01

    Studies indicate a gap between evidence and clinical practice in osteoporosis management. Tools that facilitate clinical decision making at the point of care are promising strategies for closing these practice gaps. To systematically review the literature to identify and describe the effectiveness of tools that support clinical decision making in osteoporosis disease management. Medline, EMBASE, CINAHL, and EBM Reviews (CDSR, DARE, CCTR, and ACP J Club), and contact with experts in the field. Randomized controlled trials (RCTs) in any language from 1966 to July 2006 investigating disease management interventions in patients at risk for osteoporosis. Outcomes included fractures and bone mineral density (BMD) testing. Two investigators independently assessed articles for relevance and study quality, and extracted data using standardized forms. Of 1,246 citations that were screened for relevance, 13 RCTs met the inclusion criteria. Reported study quality was generally poor. Meta-analysis was not done because of methodological and clinical heterogeneity; 77% of studies included a reminder or education as a component of their intervention. Three studies of reminders plus education targeted to physicians and patients showed increased BMD testing (RR range 1.43 to 8.67) and osteoporosis medication use (RR range 1.60 to 8.67). A physician reminder plus a patient risk assessment strategy found reduced fractures [RR 0.58, 95% confidence interval (CI) 0.37 to 0.90] and increased osteoporosis therapy (RR 2.44, CI 1.43 to 4.17). Multi-component tools that are targeted to physicians and patients may be effective for supporting clinical decision making in osteoporosis disease management.
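    For readers who want to see how the relative risks quoted above are obtained, the sketch below computes a relative risk and its 95% confidence interval from a 2x2 table using the usual log-RR standard error. The event counts are invented; they are not data from any of the reviewed trials.

    ```python
    # Hedged sketch of how a relative risk (RR) and its 95% CI, like the ones
    # quoted above, are computed from a 2x2 table. The counts below are invented;
    # they are not data from any of the reviewed trials.
    import math

    def relative_risk(events_tx, n_tx, events_ctrl, n_ctrl):
        risk_tx = events_tx / n_tx
        risk_ctrl = events_ctrl / n_ctrl
        rr = risk_tx / risk_ctrl
        se_log_rr = math.sqrt(1 / events_tx - 1 / n_tx + 1 / events_ctrl - 1 / n_ctrl)
        lo = math.exp(math.log(rr) - 1.96 * se_log_rr)
        hi = math.exp(math.log(rr) + 1.96 * se_log_rr)
        return rr, (lo, hi)

    # Example: 60/200 patients had BMD testing with a reminder vs 25/200 without.
    rr, ci = relative_risk(60, 200, 25, 200)
    print(f"RR = {rr:.2f}, 95% CI {ci[0]:.2f} to {ci[1]:.2f}")  # RR = 2.40, CI 1.57 to 3.67
    ```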

  16. A study with ESI PAM-STAMP® on the influence of tool deformation on final part quality during a forming process

    NASA Astrophysics Data System (ADS)

    Vrolijk, Mark; Ogawa, Takayuki; Camanho, Arthur; Biasutti, Manfredi; Lorenz, David

    2018-05-01

    As a result of the ever-increasing demand to produce lighter vehicles, more and more advanced high-strength materials are used in the automotive industry. Focusing on sheet metal cold forming processes, these materials require high pressing forces and exhibit large springback after forming. Due to the high pressing forces, deformations occur in the tooling geometry, introducing dimensional inaccuracies in the blank and potentially impacting the final springback behavior. As a result, the tool deformations can have an impact on the final assembly or introduce cosmetic defects. Often several iterations are required in try-out to obtain the required tolerances, with costs going up to as much as 30% of the entire product development cost. To investigate sheet metal part feasibility and quality, CAE tools are widely used in the automotive industry. However, in current practice the influence of the tool deformations on the final part quality is generally neglected, and simulations are carried out with rigid tools to avoid drastically increased calculation times. If the tool deformation is analyzed through simulation, it is normally done at the end of the drawing process, when contact conditions are mapped onto the die structure and a static analysis is performed to check the deflections of the tool. But this method does not predict the influence of these deflections on the final quality of the part. In order to take tool deformations into account during drawing simulations, ESI has developed the ability to couple solvers efficiently in such a way that the tool deformations can be included in the drawing simulation in real time without a large increase in simulation time compared to simulations with rigid tools. In this paper a study is presented which demonstrates the effect of tool deformations on the final part quality.

  17. 21 CFR 862.1660 - Quality control material (assayed and unassayed).

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 8 2011-04-01 2011-04-01 false Quality control material (assayed and unassayed... Test Systems § 862.1660 Quality control material (assayed and unassayed). (a) Identification. A quality... that may arise from reagent or analytical instrument variation. A quality control material (assayed and...

  18. QUALITY ASSURANCE AND QUALITY CONTROL FOR WASTE CONTAINMENT FACILITIES. Project Summary

    EPA Science Inventory

    It is generally agreed that both quality assurance (QA) and quality control (QC) are essential to the proper installation and eventual performance of environmentally safe and secure waste containment systems. Even further, there are both manufacturing and construction aspects to...

  19. Most systematic reviews of high methodological quality on psoriasis interventions are classified as high risk of bias using ROBIS tool.

    PubMed

    Gómez-García, Francisco; Ruano, Juan; Gay-Mimbrera, Jesus; Aguilar-Luque, Macarena; Sanz-Cabanillas, Juan Luis; Alcalde-Mellado, Patricia; Maestre-López, Beatriz; Carmona-Fernández, Pedro Jesús; González-Padilla, Marcelino; García-Nieto, Antonio Vélez; Isla-Tejera, Beatriz

    2017-12-01

    No gold standard exists to assess methodological quality of systematic reviews (SRs). Although Assessing the Methodological Quality of Systematic Reviews (AMSTAR) is widely accepted for analyzing quality, the ROBIS instrument has recently been developed. This study aimed to compare the capacity of both instruments to capture the quality of SRs concerning psoriasis interventions. Systematic literature searches were undertaken on relevant databases. For each review, methodological quality and bias risk were evaluated using the AMSTAR and ROBIS tools. Descriptive and principal component analyses were conducted to describe similarities and discrepancies between both assessment tools. We classified 139 intervention SRs as displaying high/moderate/low methodological quality and as high/low risk of bias. A high risk of bias was detected for most SRs classified as displaying high or moderate methodological quality by AMSTAR. When comparing ROBIS result profiles, responses to domain 4 signaling questions showed the greatest differences between bias risk assessments, whereas domain 2 items showed the least. When considering SRs published about psoriasis, methodological quality remains suboptimal, and the risk of bias is elevated, even for SRs exhibiting high methodological quality. Furthermore, the AMSTAR and ROBIS tools may be considered as complementary when conducting quality assessment of SRs. Copyright © 2017 Elsevier Inc. All rights reserved.

  20. A Survey of Security Tools for the Industrial Control System Environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hurd, Carl M.; McCarty, Michael V.

    This report details the results of a survey conducted by Idaho National Laboratory (INL) to identify existing tools which could be used to prevent, detect, mitigate, or investigate a cyber-attack in an industrial control system (ICS) environment. This report compiles a list of potentially applicable tools and shows the coverage of the tools in an ICS architecture.

  1. An evaluation of the quality of obstetric morbidity coding using an objective assessment tool, the Performance Indicators For Coding Quality (PICQ).

    PubMed

    Lamb, Mary K; Innes, Kerry; Saad, Patricia; Rust, Julie; Dimitropoulos, Vera; Cumerlato, Megan

    The Performance Indicators for Coding Quality (PICQ) is a data quality assessment tool developed by Australia's National Centre for Classification in Health (NCCH). PICQ consists of a number of indicators covering all ICD-10-AM disease chapters, some procedure chapters from the Australian Classification of Health Intervention (ACHI) and some Australian Coding Standards (ACS). The indicators can be used to assess the coding quality of hospital morbidity data by monitoring compliance with coding conventions and ACS; this enables the identification of particular records that may be incorrectly coded, thus providing a measure of data quality. There are 31 obstetric indicators available for the ICD-10-AM Fourth Edition. Twenty of these 31 indicators were classified as Fatal, nine as Warning and two as Relative. These indicators were used to examine coding quality of obstetric records in the 2004-2005 financial year Australian national hospital morbidity dataset. Records with obstetric disease or procedure codes listed anywhere in the code string were extracted and exported from the SPSS source file. Data were then imported into a Microsoft Access database table as per PICQ instructions, and run against all Fatal, Warning and Relative (N=31) obstetric PICQ 2006 Fourth Edition Indicators v.5 for the ICD-10-AM Fourth Edition. There were 689,905 gynaecological and obstetric records in the 2004-2005 financial year, of which 1.14% were found to have triggered Fatal degree errors, 3.78% Warning degree errors and 8.35% Relative degree errors. The types of errors include completeness, redundancy, specificity and sequencing problems. It was found that PICQ is a useful initial screening tool for the assessment of ICD-10-AM/ACHI coding quality. The overall quality of codes assigned to obstetric records in the 2004-2005 Australian national morbidity dataset is fair.
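    The general mechanism, screening coded records against rule-based indicators and counting how many trigger each error degree, can be sketched with pandas as below. The single toy rule (an outcome-of-delivery code without any delivery code on the record) is illustrative only and is not an actual PICQ indicator.

    ```python
    # Hedged sketch of the general mechanism: screen coded morbidity records
    # against rule-based indicators and count how many trigger each error degree.
    # The single toy rule below is illustrative and is not an actual PICQ indicator.
    import pandas as pd

    records = pd.DataFrame({
        "record_id": [1, 2, 3],
        "codes": [["Z37.0", "O80"], ["Z37.0"], ["O26.9", "O80"]],
    })

    def fatal_indicator_missing_delivery(codes):
        has_outcome = any(code.startswith("Z37") for code in codes)
        has_delivery = any(code.startswith(("O80", "O81", "O82", "O83", "O84"))
                           for code in codes)
        return has_outcome and not has_delivery

    records["fatal_error"] = records["codes"].apply(fatal_indicator_missing_delivery)
    print(records[["record_id", "fatal_error"]])
    print(f"{records['fatal_error'].mean():.2%} of records triggered the indicator")
    ```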

  2. Proceedings of the Workshop on software tools for distributed intelligent control systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Herget, C.J.

    1990-09-01

    The Workshop on Software Tools for Distributed Intelligent Control Systems was organized by Lawrence Livermore National Laboratory for the United States Army Headquarters Training and Doctrine Command and the Defense Advanced Research Projects Agency. The goals of the workshop were to identify the current state of the art in tools which support control systems engineering design and implementation, identify research issues associated with writing software tools which would provide a design environment to assist engineers in multidisciplinary control design and implementation, formulate a potential investment strategy to resolve the research issues and develop public domain code which can form the core of more powerful engineering design tools, and recommend test cases to focus the software development process and test associated performance metrics. Recognizing that the development of software tools for distributed intelligent control systems will require a multidisciplinary effort, experts in systems engineering, control systems engineering, and computer science were invited to participate in the workshop. In particular, experts who could address the following topics were selected: operating systems, engineering data representation and manipulation, emerging standards for manufacturing data, mathematical foundations, coupling of symbolic and numerical computation, user interface, system identification, system representation at different levels of abstraction, system specification, system design, verification and validation, automatic code generation, and integration of modular, reusable code.

  3. [Post-marketing reevaluation for potential quality risk and quality control in clinical application of traditional Chinese medicines].

    PubMed

    Li, Hong-jiao; He, Li-yun; Liu, Bao-yan

    2015-06-01

    Effective quality control in clinical practice is an essential guarantee of the authenticity and scientific validity of study findings. The post-marketing reevaluation of traditional Chinese medicines (TCM) focuses on the efficacy, adverse reactions, combined medication and effective dose of marketed drugs through expanded clinical trials, and requires a larger sample size and a wider range of patients. This increases the difficulty of quality control in clinical practice. Drawing on experience with quality control in clinical practice during the post-marketing reevaluation of Kangbingdu oral for cold, the researchers in this study reviewed the study purpose, project, scheme design and clinical practice process from an overall point of view, analyzed the study characteristics of post-marketing reevaluation for TCMs and the associated quality control risks, designed the quality control contents around quality-impacting factors, defined key review contents and summarized precautions for clinical practice, with the aim of improving the efficiency of quality control in clinical practice. This study can provide a reference for clinical units and quality control personnel involved in the post-marketing reevaluation of TCMs.

  4. Tools for controlling protein interactions with light

    PubMed Central

    Tucker, Chandra L.; Vrana, Justin D.; Kennedy, Matthew J.

    2014-01-01

    Genetically-encoded actuators that allow control of protein-protein interactions with light, termed ‘optical dimerizers’, are emerging as new tools for experimental biology. In recent years, numerous new and versatile dimerizer systems have been developed. Here we discuss the design of optical dimerizer experiments, including choice of a dimerizer system, photoexcitation sources, and coordinate use of imaging reporters. We provide detailed protocols for experiments using two dimerization systems we previously developed, CRY2/CIB and UVR8/UVR8, for use controlling transcription, protein localization, and protein secretion with light. Additionally, we provide instructions and software for constructing a pulse-controlled LED light device for use in experiments requiring extended light treatments. PMID:25181301
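
    As a loose illustration of the pulse-controlled light delivery mentioned above (this is not the authors' device software), a timed loop can alternate light-on and light-off intervals over a long treatment; the set_led() driver function below is a hypothetical stand-in for real hardware control.

    ```python
    # Minimal sketch of a pulsed light schedule for extended light treatments.
    # set_led() is a hypothetical stand-in for whatever LED driver is actually used.
    import time

    def set_led(on: bool):
        print("LED", "ON" if on else "OFF")   # replace with a real GPIO/driver call

    def run_pulse_schedule(on_s: float, off_s: float, total_s: float):
        """Alternate on/off intervals until total_s seconds have elapsed."""
        t_end = time.monotonic() + total_s
        while time.monotonic() < t_end:
            set_led(True)
            time.sleep(min(on_s, max(0.0, t_end - time.monotonic())))
            set_led(False)
            time.sleep(min(off_s, max(0.0, t_end - time.monotonic())))

    # Example: 2 s light pulses every 60 s for one hour (values are illustrative).
    # run_pulse_schedule(on_s=2, off_s=58, total_s=3600)
    ```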

  5. Power quality improvement by using STATCOM control scheme in wind energy generation interface to grid

    NASA Astrophysics Data System (ADS)

    Kirmani, Sheeraz; Kumar, Brijesh

    2018-01-01

    “Electric Power Quality (EPQ) is a term that refers to maintaining the near-sinusoidal waveform of power distribution bus voltages and currents at rated magnitude and frequency”. Today customers are more aware of the seriousness of power quality problems, which prompts utilities to assure a good quality of power to their customers; power quality is essentially customer-centric. Increased focus by utilities on maintaining a reliable power supply through power quality improvement tools has considerably reduced power outages and blackouts. Good power quality is the characteristic of a reliable power supply. Low power factor, harmonic pollution, load imbalance and fast voltage variations are some common parameters used to define power quality. If power quality issues are not checked, i.e. if the parameters that define power quality do not fall within predefined standards, the result is high electricity bills, high running costs in industry, equipment malfunction and challenges in connecting renewables. Capacitor banks, FACTS devices, harmonic filters, SVCs (static VAR compensators) and STATCOMs (static synchronous compensators) are solutions for achieving good power quality. The performance of wind turbine generators is affected by poor-quality power, and at the same time these wind power generating plants affect power quality negatively. This paper presents a STATCOM-BESS (battery energy storage system) scheme and studies its impact on power quality in a system consisting of a wind turbine generator, a nonlinear load, a hysteresis controller for controlling the operation of the STATCOM, and the grid. The model is simulated in MATLAB/Simulink. The scheme mitigates power quality issues, improves the voltage profile and reduces harmonic distortion of the waveforms. The BESS levels out the imbalances in real power caused by the intermittent nature of the available wind power due to varying wind speeds.
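
    The hysteresis controller mentioned above can be pictured, at its simplest, as a band comparator on the current error: switch the converter leg one way when the measured current falls below the reference band and the other way when it rises above it. The sketch below is a generic illustration under assumed signals, not the paper's Simulink model.

    ```python
    # Generic single-phase hysteresis band current controller (illustrative only).
    def hysteresis_switch(i_ref, i_meas, band, last_state):
        """Return the switching state (+1 or -1) for one converter leg."""
        error = i_ref - i_meas
        if error > band:       # measured current too low -> switch to raise it
            return +1
        if error < -band:      # measured current too high -> switch to lower it
            return -1
        return last_state      # inside the band: keep the previous state

    # Tiny usage example with made-up reference and measured samples.
    state = +1
    for i_ref, i_meas in [(10.0, 9.0), (10.0, 10.2), (10.0, 11.5), (10.0, 9.9)]:
        state = hysteresis_switch(i_ref, i_meas, band=0.5, last_state=state)
        print(i_ref, i_meas, state)
    ```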

  6. Multi-Agent Architecture with Support to Quality of Service and Quality of Control

    NASA Astrophysics Data System (ADS)

    Poza-Luján, Jose-Luis; Posadas-Yagüe, Juan-Luis; Simó-Ten, Jose-Enrique

    Multi-Agent Systems (MAS) are one of the most suitable frameworks for the implementation of intelligent distributed control systems. Agents provide the flexibility needed to support the heterogeneity inherent in cyber-physical systems. Quality of Service (QoS) and Quality of Control (QoC) parameters are commonly used to evaluate the efficiency of the communications and of the control loop. Agents can use these quality measures to take a wide range of decisions, such as choosing a suitable placement on a control node or changing the workload to save energy. This article describes the architecture of a multi-agent system that supports QoS and QoC parameters to optimize the system. The architecture uses a Publish-Subscribe model, based on the Data Distribution Service (DDS), to send the control messages. Owing to the nature of the Publish-Subscribe model, the architecture is well suited to implementing event-based control (EBC) systems. The architecture has been called FSACtrl.
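
    A minimal in-process sketch of the publish-subscribe pattern used for control messages is shown below; it is a generic illustration, not the FSACtrl architecture or a DDS implementation, and the topic names, agent behavior and threshold are invented.

    ```python
    # Minimal in-process publish-subscribe sketch (not DDS, not FSACtrl).
    from collections import defaultdict
    from typing import Callable, Dict, List

    class Broker:
        def __init__(self):
            self._subs: Dict[str, List[Callable[[dict], None]]] = defaultdict(list)

        def subscribe(self, topic: str, handler: Callable[[dict], None]):
            self._subs[topic].append(handler)

        def publish(self, topic: str, message: dict):
            for handler in self._subs[topic]:
                handler(message)

    broker = Broker()

    # A hypothetical control agent reacts only when a published event crosses a
    # threshold, which is the essence of event-based control.
    def control_agent(msg):
        if abs(msg["error"]) > 0.1:
            print("control action for", msg)

    broker.subscribe("sensor/temperature", control_agent)
    broker.publish("sensor/temperature", {"error": 0.05})   # ignored (within tolerance)
    broker.publish("sensor/temperature", {"error": 0.30})   # triggers a control action
    ```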

  7. 40 CFR 81.87 - Metropolitan Boise Intrastate Air Quality Control Region.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Quality Control Region. 81.87 Section 81.87 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Air Quality Control Regions § 81.87 Metropolitan Boise Intrastate Air Quality Control Region. The Metropolitan Boise Intrastate Air Quality Control Region (Idaho) consists of the territorial area encompassed...

  8. 40 CFR 81.89 - Metropolitan Cheyenne Intrastate Air Quality Control Region.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Quality Control Region. 81.89 Section 81.89 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Air Quality Control Regions § 81.89 Metropolitan Cheyenne Intrastate Air Quality Control Region. The Metropolitan Cheyenne Intrastate Air Quality Control Region (Wyoming) consists of the territorial area...

  9. 40 CFR 81.101 - Metropolitan Dubuque Interstate Air Quality Control Region.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Quality Control Region. 81.101 Section 81.101 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Air Quality Control Regions § 81.101 Metropolitan Dubuque Interstate Air Quality Control Region. The Metropolitan Dubuque Interstate Air Quality Control Region (Illinois-Iowa-Wisconsin) consists of the...

  10. 40 CFR 81.104 - Central Pennsylvania Intrastate Air Quality Control Region.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Quality Control Region. 81.104 Section 81.104 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Air Quality Control Regions § 81.104 Central Pennsylvania Intrastate Air Quality Control Region. The Central Pennsylvania Intrastate Air Quality Control Region consists of the territorial area encompassed by...

  11. 40 CFR 81.106 - Greenville-Spartanburg Intrastate Air Quality Control Region.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Quality Control Region. 81.106 Section 81.106 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Air Quality Control Regions § 81.106 Greenville-Spartanburg Intrastate Air Quality Control Region. The Greenville-Spartanburg Intrastate Air Quality Control Region (South Carolina) consists of the territorial...

  12. 40 CFR 81.120 - Middle Tennessee Intrastate Air Quality Control Region.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Quality Control Region. 81.120 Section 81.120 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Air Quality Control Regions § 81.120 Middle Tennessee Intrastate Air Quality Control Region. The Middle Tennessee Intrastate Air Quality Control Region consists of the territorial area encompassed by...

  13. 40 CFR 81.75 - Metropolitan Charlotte Interstate Air Quality Control Region.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Quality Control Region. 81.75 Section 81.75 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Air Quality Control Regions § 81.75 Metropolitan Charlotte Interstate Air Quality Control Region. The Metropolitan Charlotte Interstate Air Quality Control Region (North Carolina-South Carolina) has been revised...

  14. 40 CFR 81.79 - Northeastern Oklahoma Intrastate Air Quality Control Region.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Quality Control Region. 81.79 Section 81.79 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Air Quality Control Regions § 81.79 Northeastern Oklahoma Intrastate Air Quality Control Region. The Metropolitan Tulsa Intrastate Air Quality Control Region has been renamed the Northeastern Oklahoma Intrastate...

  15. 40 CFR 81.119 - Western Tennessee Intrastate Air Quality Control Region.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Quality Control Region. 81.119 Section 81.119 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Air Quality Control Regions § 81.119 Western Tennessee Intrastate Air Quality Control Region. The Western Tennessee Intrastate Air Quality Control Region consists of the territorial area encompassed by...

  16. 40 CFR 81.104 - Central Pennsylvania Intrastate Air Quality Control Region.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Quality Control Region. 81.104 Section 81.104 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Air Quality Control Regions § 81.104 Central Pennsylvania Intrastate Air Quality Control Region. The Central Pennsylvania Intrastate Air Quality Control Region consists of the territorial area encompassed by...

  17. 40 CFR 81.87 - Metropolitan Boise Intrastate Air Quality Control Region.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Quality Control Region. 81.87 Section 81.87 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Air Quality Control Regions § 81.87 Metropolitan Boise Intrastate Air Quality Control Region. The Metropolitan Boise Intrastate Air Quality Control Region (Idaho) consists of the territorial area encompassed...

  18. 40 CFR 81.79 - Northeastern Oklahoma Intrastate Air Quality Control Region.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Quality Control Region. 81.79 Section 81.79 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Air Quality Control Regions § 81.79 Northeastern Oklahoma Intrastate Air Quality Control Region. The Metropolitan Tulsa Intrastate Air Quality Control Region has been renamed the Northeastern Oklahoma Intrastate...

  19. 40 CFR 81.101 - Metropolitan Dubuque Interstate Air Quality Control Region.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Quality Control Region. 81.101 Section 81.101 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Air Quality Control Regions § 81.101 Metropolitan Dubuque Interstate Air Quality Control Region. The Metropolitan Dubuque Interstate Air Quality Control Region (Illinois-Iowa-Wisconsin) consists of the...

  20. 40 CFR 81.106 - Greenville-Spartanburg Intrastate Air Quality Control Region.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Quality Control Region. 81.106 Section 81.106 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Air Quality Control Regions § 81.106 Greenville-Spartanburg Intrastate Air Quality Control Region. The Greenville-Spartanburg Intrastate Air Quality Control Region (South Carolina) consists of the territorial...

  1. 40 CFR 81.89 - Metropolitan Cheyenne Intrastate Air Quality Control Region.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Quality Control Region. 81.89 Section 81.89 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Air Quality Control Regions § 81.89 Metropolitan Cheyenne Intrastate Air Quality Control Region. The Metropolitan Cheyenne Intrastate Air Quality Control Region (Wyoming) consists of the territorial area...

  2. 40 CFR 81.62 - Northeast Mississippi Intrastate Air Quality Control Region.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Quality Control Region. 81.62 Section 81.62 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Air Quality Control Regions § 81.62 Northeast Mississippi Intrastate Air Quality Control Region. The Alabama-Mississippi-Tennessee Interstate Air Quality Control Region has been renamed the Northeast...

  3. 40 CFR 81.78 - Metropolitan Portland Intrastate Air Quality Control Region.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Quality Control Region. 81.78 Section 81.78 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Air Quality Control Regions § 81.78 Metropolitan Portland Intrastate Air Quality Control Region. The Metropolitan Portland Intrastate Air Quality Control Region (Maine) consists of the territorial area...

  4. 40 CFR 81.77 - Puerto Rico Air Quality Control Region.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 18 2014-07-01 2014-07-01 false Puerto Rico Air Quality Control Region... Control Regions § 81.77 Puerto Rico Air Quality Control Region. The Puerto Rico Air Quality Control Region... delimited): The entire Commonwealth of Puerto Rico: Puerto Rico and surrounding islands, Vieques and...

  5. The Use of UV-Visible Reflectance Spectroscopy as an Objective Tool to Evaluate Pearl Quality

    PubMed Central

    Agatonovic-Kustrin, Snezana; Morton, David W.

    2012-01-01

    Assessing the quality of pearls involves the use of various tools and methods, which are mainly visual and often quite subjective. Pearls are normally classified by origin and are then graded by luster, nacre thickness, surface quality, size, color and shape. The aim of this study was to investigate the capacity of Artificial Neural Networks (ANNs) to classify and estimate the quality of 27 different pearls from their UV-Visible spectra. Because of the opaque nature of pearls, spectroscopic measurements were performed using the diffuse reflectance UV-Visible technique. The spectra were acquired at two different locations on each pearl sample in order to assess surface homogeneity. The spectral data (inputs) were smoothed to reduce noise, fed into ANNs and correlated to the pearls' quality/grading criteria (outputs). The developed ANNs were successful in predicting pearl type, mollusk growing species, possible luster and color enhancement, donor condition/type, recipient/host color, donor color, pearl luster, pearl color and origin. The results of this study show that the developed UV-Vis spectroscopy-ANN method could be used as a more objective method of assessing pearl quality (grading) and may become a valuable tool for the pearl grading industry. PMID:22851919
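
    A rough sketch of the kind of spectra-to-grade classification described above might look like the following; the spectra, wavelength count and grade labels are synthetic stand-ins, and scikit-learn's MLPClassifier is used only as a generic feed-forward ANN, not as the study's actual model.

    ```python
    # Illustrative sketch: classify synthetic reflectance spectra with a small feed-forward ANN.
    # All data and labels are invented; this is not the published model or dataset.
    import numpy as np
    from sklearn.neural_network import MLPClassifier

    rng = np.random.default_rng(1)
    n_samples, n_wavelengths = 54, 200          # e.g., two spectra per pearl

    # Synthetic smoothed spectra and made-up quality grades.
    spectra = rng.normal(size=(n_samples, n_wavelengths)).cumsum(axis=1)
    grades = rng.choice(["high", "medium", "low"], size=n_samples)

    model = MLPClassifier(hidden_layer_sizes=(20,), max_iter=2000, random_state=0)
    model.fit(spectra, grades)

    print("training accuracy:", model.score(spectra, grades))
    ```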

  6. COMMUNITY MULTISCALE AIR QUALITY ( CMAQ ) MODEL - QUALITY ASSURANCE AND VERSION CONTROL

    EPA Science Inventory

    This presentation will be given to the EPA Exposure Modeling Workgroup on January 24, 2006. The quality assurance and version control procedures for the Community Multiscale Air Quality (CMAQ) Model are presented. A brief background of CMAQ is given, then issues related to qual...

  7. Nuclear Technology Series. Course 14: Introduction to Quality Assurance/Quality Control.

    ERIC Educational Resources Information Center

    Technical Education Research Center, Waco, TX.

    This technical specialty course is one of thirty-five courses designed for use by two-year postsecondary institutions in five nuclear technician curriculum areas: (1) radiation protection technician, (2) nuclear instrumentation and control technician, (3) nuclear materials processing technician, (4) nuclear quality-assurance/quality-control…

  8. Can social media be used as a hospital quality improvement tool?

    PubMed

    Lagu, Tara; Goff, Sarah L; Craft, Ben; Calcasola, Stephanie; Benjamin, Evan M; Priya, Aruna; Lindenauer, Peter K

    2016-01-01

    Many hospitals wish to improve their patients' experience of care. To learn whether social media could be used as a tool to engage patients and to identify opportunities for hospital quality improvement (QI), we solicited patients' narrative feedback on the Baystate Medical Center Facebook page during a 3-week period in 2014. Two investigators used directed qualitative content analysis to code comments and descriptive statistics to assess the frequency of selected codes and themes. We identified common themes, including: (1) comments about staff (17/37 respondents, 45.9%); (2) comments about specific departments (22/37, 59.5%); (3) comments on technical aspects of care, including perceived errors and inattention to pain control (9/37, 24.3%); and (4) comments describing the hospital physical plant, parking, and amenities (9/37, 24.3%). A small number (n = 3) of patients repeatedly responded, accounting for 30% (45/148) of narratives. Although patient feedback on social media could help to drive hospital QI efforts, any potential benefits must be weighed against the reputational risks, the lack of representativeness among respondents, and the volume of responses needed to identify areas of improvement. © 2015 Society of Hospital Medicine.

  9. Manufacturing Execution Systems: Examples of Performance Indicator and Operational Robustness Tools.

    PubMed

    Gendre, Yannick; Waridel, Gérard; Guyon, Myrtille; Demuth, Jean-François; Guelpa, Hervé; Humbert, Thierry

    Manufacturing Execution Systems (MES) are computerized systems used to measure production performance in terms of productivity, yield, and quality. The first part describes performance indicators, overall equipment effectiveness (OEE), process robustness tools and statistical process control. The second part details some tools that help operators maintain process robustness and control by preventing deviations from targets through control charts. The MES was developed by Syngenta together with CIMO for automation.
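
    Overall equipment effectiveness, one of the indicators named above, is conventionally the product of availability, performance and quality rates; the sketch below computes it from hypothetical shift figures (the formula is standard, the numbers are invented).

    ```python
    # OEE = availability x performance x quality (standard definition; figures below are hypothetical).
    def oee(planned_time, run_time, ideal_cycle_time, total_count, good_count):
        availability = run_time / planned_time
        performance = (ideal_cycle_time * total_count) / run_time
        quality = good_count / total_count
        return availability * performance * quality

    # Example shift: 480 min planned, 420 min running, 0.5 min ideal cycle,
    # 700 units produced of which 680 were good -> OEE ~ 0.708.
    print(round(oee(480, 420, 0.5, 700, 680), 3))
    ```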

  10. 40 CFR 81.118 - Southwest Missouri Intrastate Air Quality Control Region.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Quality Control Region. 81.118 Section 81.118 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Air Quality Control Regions § 81.118 Southwest Missouri Intrastate Air Quality Control Region. The Southwest Missouri Intrastate Air Quality Control Region consists of the territorial area encompassed by the...

  11. 40 CFR 81.116 - Northern Missouri Intrastate Air Quality Control Region.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Quality Control Region. 81.116 Section 81.116 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Air Quality Control Regions § 81.116 Northern Missouri Intrastate Air Quality Control Region. The Northern Missouri Intrastate Air Quality Control Region consists of the territorial area encompassed by the...

  12. 40 CFR 81.97 - Southwest Florida Intrastate Air Quality Control Region.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Quality Control Region. 81.97 Section 81.97 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Air Quality Control Regions § 81.97 Southwest Florida Intrastate Air Quality Control Region. The Southwest Florida Intrastate Air Quality Control Region consists of the territorial area encompassed by the...

  13. 40 CFR 81.117 - Southeast Missouri Intrastate Air Quality Control Region.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Quality Control Region. 81.117 Section 81.117 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Air Quality Control Regions § 81.117 Southeast Missouri Intrastate Air Quality Control Region. The Southeast Missouri Intrastate Air Quality Control Region consists of the territorial area encompassed by the...

  14. 40 CFR 81.98 - Burlington-Keokuk Interstate Air Quality Control Region.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Quality Control Region. 81.98 Section 81.98 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Air Quality Control Regions § 81.98 Burlington-Keokuk Interstate Air Quality Control Region. The Burlington-Keokuk Interstate Air Quality Control Region (Illinois-Iowa) is revised to consist of the...

  15. 40 CFR 81.118 - Southwest Missouri Intrastate Air Quality Control Region.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Quality Control Region. 81.118 Section 81.118 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Air Quality Control Regions § 81.118 Southwest Missouri Intrastate Air Quality Control Region. The Southwest Missouri Intrastate Air Quality Control Region consists of the territorial area encompassed by the...

  16. 40 CFR 81.115 - Northwest Nevada Intrastate Air Quality Control Region.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Quality Control Region. 81.115 Section 81.115 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Air Quality Control Regions § 81.115 Northwest Nevada Intrastate Air Quality Control Region. The Northwest Nevada Intrastate Air Quality Control Region consists of the territorial area encompassed by the...

  17. 40 CFR 81.116 - Northern Missouri Intrastate Air Quality Control Region.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Quality Control Region. 81.116 Section 81.116 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Air Quality Control Regions § 81.116 Northern Missouri Intrastate Air Quality Control Region. The Northern Missouri Intrastate Air Quality Control Region consists of the territorial area encompassed by the...

  18. 40 CFR 81.123 - Southeastern Oklahoma Intrastate Air Quality Control Region.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Quality Control Region. 81.123 Section 81.123 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Air Quality Control Regions § 81.123 Southeastern Oklahoma Intrastate Air Quality Control Region. The Southeastern Oklahoma Intrastate Air Quality Control Region consists of the territorial area encompassed by the...

  19. 40 CFR 81.67 - Lake Michigan Intrastate Air Quality Control Region.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Quality Control Regions § 81.67 Lake Michigan Intrastate Air Quality Control Region. The Menominee-Escanaba (Michigan)-Marinette (Wisconsin) Interstate Air Quality Control Region has been renamed the Lake Michigan Intrastate Air Quality Control Region (Wisconsin) and revised to consist of the territorial area...

  20. 40 CFR 81.115 - Northwest Nevada Intrastate Air Quality Control Region.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Quality Control Region. 81.115 Section 81.115 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Air Quality Control Regions § 81.115 Northwest Nevada Intrastate Air Quality Control Region. The Northwest Nevada Intrastate Air Quality Control Region consists of the territorial area encompassed by the...

  1. 40 CFR 81.97 - Southwest Florida Intrastate Air Quality Control Region.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Quality Control Region. 81.97 Section 81.97 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Air Quality Control Regions § 81.97 Southwest Florida Intrastate Air Quality Control Region. The Southwest Florida Intrastate Air Quality Control Region consists of the territorial area encompassed by the...

  2. 40 CFR 81.117 - Southeast Missouri Intrastate Air Quality Control Region.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Quality Control Region. 81.117 Section 81.117 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Air Quality Control Regions § 81.117 Southeast Missouri Intrastate Air Quality Control Region. The Southeast Missouri Intrastate Air Quality Control Region consists of the territorial area encompassed by the...

  3. 40 CFR 81.122 - Mississippi Delta Intrastate Air Quality Control Region.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Quality Control Region. 81.122 Section 81.122 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Air Quality Control Regions § 81.122 Mississippi Delta Intrastate Air Quality Control Region. The Mississippi Delta Intrastate Air Quality Control Region consists of the territorial area encompassed by the...

  4. 40 CFR 81.98 - Burlington-Keokuk Interstate Air Quality Control Region.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Quality Control Region. 81.98 Section 81.98 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Air Quality Control Regions § 81.98 Burlington-Keokuk Interstate Air Quality Control Region. The Burlington-Keokuk Interstate Air Quality Control Region (Illinois-Iowa) is revised to consist of the...

  5. Improving STEM Program Quality in Out-of-School-Time: Tool Development and Validation

    ERIC Educational Resources Information Center

    Shah, Ashima Mathur; Wylie, Caroline; Gitomer, Drew; Noam, Gil

    2018-01-01

    In and out-of-school time (OST) experiences are viewed as complementary in contributing to students' interest, engagement, and performance in science, technology, engineering, and mathematics (STEM). While tools exist to measure quality in general afterschool settings and others to measure structured science classroom experiences, there is a need…

  6. In-line quality control of moving objects by means of spectral-domain OCT

    NASA Astrophysics Data System (ADS)

    Markl, Daniel; Hannesschläger, Günther; Buchsbaum, Andreas; Sacher, Stephan; Khinast, Johannes G.; Leitner, Michael

    2014-08-01

    In-line quality control of intermediate and final products is essential in various industries. This may mean determining the thickness of a foil or evaluating the homogeneity of the coating applied to a pharmaceutical tablet. Such qualitative and quantitative monitoring in a depth-resolved manner can be accomplished using optical coherence tomography (OCT). In-line quality control based on OCT requires additional consideration of motion effects, both in the system design and in data interpretation. This study focuses on transverse motion effects that can arise in spectral-domain (SD-) OCT systems. The impact of transverse movement is analyzed for constant relative speed differences of up to 0.7 m/s between sample and sensor head. In particular, transverse motion affects OCT system properties such as the beam displacement (the distance between adjacent A-scans) and the transverse resolution. These properties were evaluated theoretically and experimentally for OCT images of a resolution target and of pharmaceutical film-coated tablets. Both theoretical and experimental analyses highlight the shift of the factor limiting transverse resolution from the optics to the beam displacement above a relative speed difference between sensor head and sample of 0.42 m/s (for the presented SD-OCT setup). Speeds above 0.4 m/s are often demanded when monitoring industrial processes, such as the coating process used to produce film-coated tablets. This emphasizes the importance of fast data acquisition when using OCT as an in-line quality control tool.
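
    The tradeoff described above, where the beam displacement between adjacent A-scans overtakes the optical spot size as the limit on transverse resolution, can be sketched as follows; the A-scan rate and spot size are assumed values for illustration, not the parameters of the presented setup.

    ```python
    # Beam displacement between adjacent A-scans vs. optical transverse resolution.
    # A-scan rate and spot size are assumed for illustration, not the paper's setup values.
    A_SCAN_RATE_HZ = 28_000       # assumed A-scan acquisition rate
    SPOT_SIZE_UM = 15.0           # assumed optical transverse resolution (spot size)

    def beam_displacement_um(speed_m_per_s: float) -> float:
        """Transverse distance the sample moves between two consecutive A-scans."""
        return speed_m_per_s / A_SCAN_RATE_HZ * 1e6

    for v in (0.1, 0.3, 0.42, 0.7):
        d = beam_displacement_um(v)
        limit = "beam displacement" if d > SPOT_SIZE_UM else "optics"
        print(f"{v:.2f} m/s -> {d:.1f} um displacement; resolution limited by {limit}")
    ```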

  7. 20 CFR 602.41 - Proper expenditure of Quality Control granted funds.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 20 Employees' Benefits 3 2011-04-01 2011-04-01 false Proper expenditure of Quality Control granted... LABOR QUALITY CONTROL IN THE FEDERAL-STATE UNEMPLOYMENT INSURANCE SYSTEM Quality Control Grants to States § 602.41 Proper expenditure of Quality Control granted funds. The Secretary may, after reasonable...

  8. 20 CFR 602.41 - Proper expenditure of Quality Control granted funds.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 20 Employees' Benefits 3 2010-04-01 2010-04-01 false Proper expenditure of Quality Control granted... LABOR QUALITY CONTROL IN THE FEDERAL-STATE UNEMPLOYMENT INSURANCE SYSTEM Quality Control Grants to States § 602.41 Proper expenditure of Quality Control granted funds. The Secretary may, after reasonable...

  9. Web-based Quality Control Tool used to validate CERES products on a cluster of Linux servers

    NASA Astrophysics Data System (ADS)

    Chu, C.; Sun-Mack, S.; Heckert, E.; Chen, Y.; Mlynczak, P.; Mitrescu, C.; Doelling, D.

    2014-12-01

    A few popular desktop tools have been used in the Earth Science community to validate science data. Because of the limited capacity of desktop hardware, such as disk space and CPUs, those tools are not able to display large amounts of data from files. This poster describes in-house developed web-based software built on a cluster of Linux servers, which allows users to take advantage of several Linux servers working in parallel to generate hundreds of images in a short period of time. The poster will demonstrate: (1) the hardware and software architecture used to provide high throughput of images; (2) the software structure, which can incorporate new products and new requirements quickly; and (3) the user interface, showing how users can manipulate the data and control how the images are displayed.
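
    The parallel image generation described above can be sketched generically with a process pool that farms plotting jobs out across cores (or, with minor changes, across servers); the product names, data and file paths are invented, and this is not the CERES team's code.

    ```python
    # Generic sketch: generate many validation images in parallel with a process pool.
    # Product names, data and paths are invented; this is not the CERES tool itself.
    import multiprocessing as mp

    import matplotlib
    matplotlib.use("Agg")                      # headless rendering on servers
    import matplotlib.pyplot as plt
    import numpy as np

    def render_image(job):
        name, seed = job
        data = np.random.default_rng(seed).normal(size=(100, 100))
        fig, ax = plt.subplots()
        ax.imshow(data)                        # placeholder for a real granule plot
        ax.set_title(name)
        out = f"{name}.png"
        fig.savefig(out)
        plt.close(fig)
        return out

    if __name__ == "__main__":
        jobs = [(f"hypothetical_product_{i:03d}", i) for i in range(8)]
        with mp.Pool(processes=4) as pool:
            for path in pool.imap_unordered(render_image, jobs):
                print("wrote", path)
    ```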

  10. Japanese Quality Control Circles.

    ERIC Educational Resources Information Center

    Nishiyama, Kazuo

    In recent years, United States scholars with an interest in international business and organizational communication have begun to notice the success of Japanese "quality control circles." These are small groups, usually composed of seven to ten workers, who are organized at the production levels within most large Japanese factories. A…

  11. 14 CFR 21.143 - Quality control data requirements; prime manufacturer.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 14 Aeronautics and Space 1 2011-01-01 2011-01-01 false Quality control data requirements; prime... describing assigned responsibilities and delegated authority of the quality control organization, together with a chart indicating the functional relationship of the quality control organization to management...

  12. 40 CFR 81.51 - Portland Interstate Air Quality Control Region.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Control Region. 81.51 Section 81.51 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) DESIGNATION OF AREAS FOR AIR QUALITY PLANNING PURPOSES Designation of Air Quality Control Regions § 81.51 Portland Interstate Air Quality Control Region. The Portland Interstate...

  13. 40 CFR 81.51 - Portland Interstate Air Quality Control Region.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... Control Region. 81.51 Section 81.51 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) DESIGNATION OF AREAS FOR AIR QUALITY PLANNING PURPOSES Designation of Air Quality Control Regions § 81.51 Portland Interstate Air Quality Control Region. The Portland Interstate...

  14. 40 CFR 81.51 - Portland Interstate Air Quality Control Region.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... Control Region. 81.51 Section 81.51 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) DESIGNATION OF AREAS FOR AIR QUALITY PLANNING PURPOSES Designation of Air Quality Control Regions § 81.51 Portland Interstate Air Quality Control Region. The Portland Interstate...

  15. Addendum to Air Quality: Decision Support Tools, Partner Plans, Working Groups, Committees

    NASA Technical Reports Server (NTRS)

    Holekamp, Kara; Frisbie, Troy; Estep, Lee

    2005-01-01

    In the original report dated February 11, 2005, the utility of NASA Earth science data in the air quality activities of other agencies and organizations was assessed by reviewing strategic and mission plans and by conducting personal interviews with agency experts to identify and investigate agencies with the potential for partnership with NASA. The overarching agency strategic plans were reviewed and commonalities such as the desire for partnerships and technology development were noted. This addendum to the original report contains such information about the Tennessee Valley Authority and will be inserted as Section 2.6 of "Air Quality: Decision Support Tools, Partner Plans, Working Groups, Committees."

  16. Addendum to Air Quality: Decision Support Tools, Partner Plans, Working Groups, Committees

    NASA Technical Reports Server (NTRS)

    Holekamp, Kara; Frisbie, Troy; Estep, Lee

    2005-01-01

    In the original report dated February 11, 2005, the utility of the NASA Earth science data in the air quality activities of other agencies and organizations was assessed by reviewing strategic and mission plans and by conducting personal interviews with agency experts to identify and investigate agencies with the potential for partnership with NASA. The overarching agency strategic plans were reviewed and commonalities such as the desire for partnerships and technology development were noted. The addendum to the original report contains such information about the Tennessee Valley Authority and will be inserted in Section 2.6 of "Air Quality Decision Support Tools, Partner Plans, Working Groups, Committees".

  17. 30 CFR 28.32 - Proposed quality control plans; approval by MSHA.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 30 Mineral Resources 1 2011-07-01 2011-07-01 false Proposed quality control plans; approval by...-CIRCUIT PROTECTION FOR TRAILING CABLES IN COAL MINES Quality Control § 28.32 Proposed quality control plans; approval by MSHA. (a) Each proposed quality control plan submitted in accordance with this...

  18. 30 CFR 28.32 - Proposed quality control plans; approval by MSHA.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ...-CIRCUIT PROTECTION FOR TRAILING CABLES IN COAL MINES Quality Control § 28.32 Proposed quality control plans; approval by MSHA. (a) Each proposed quality control plan submitted in accordance with this... 30 Mineral Resources 1 2012-07-01 2012-07-01 false Proposed quality control plans; approval by...

  19. 30 CFR 28.32 - Proposed quality control plans; approval by MSHA.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ...-CIRCUIT PROTECTION FOR TRAILING CABLES IN COAL MINES Quality Control § 28.32 Proposed quality control plans; approval by MSHA. (a) Each proposed quality control plan submitted in accordance with this... 30 Mineral Resources 1 2013-07-01 2013-07-01 false Proposed quality control plans; approval by...

  20. 30 CFR 28.32 - Proposed quality control plans; approval by MSHA.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ...-CIRCUIT PROTECTION FOR TRAILING CABLES IN COAL MINES Quality Control § 28.32 Proposed quality control plans; approval by MSHA. (a) Each proposed quality control plan submitted in accordance with this... 30 Mineral Resources 1 2014-07-01 2014-07-01 false Proposed quality control plans; approval by...