NHEXAS PHASE I ARIZONA STUDY--STANDARD OPERATING PROCEDURE FOR FORM QA/QC CHECKS (UA-C-2.0)
The purpose of this SOP is to outline the process of Field Quality Assurance and Quality Control checks. This procedure was followed to ensure consistent data retrieval during the Arizona NHEXAS project and the "Border" study. Keywords: custody; QA/QC; field checks.
40 CFR Appendix B to Part 75 - Quality Assurance and Quality Control Procedures
Code of Federal Regulations, 2012 CFR
2012-07-01
... Systems 1.2.1Calibration Error Test and Linearity Check Procedures Keep a written record of the procedures used for daily calibration error tests and linearity checks (e.g., how gases are to be injected..., and when calibration adjustments should be made). Identify any calibration error test and linearity...
40 CFR Appendix B to Part 75 - Quality Assurance and Quality Control Procedures
Code of Federal Regulations, 2013 CFR
2013-07-01
... Systems 1.2.1Calibration Error Test and Linearity Check Procedures Keep a written record of the procedures used for daily calibration error tests and linearity checks (e.g., how gases are to be injected..., and when calibration adjustments should be made). Identify any calibration error test and linearity...
7 CFR 58.243 - Checking quality.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 7 Agriculture 3 2010-01-01 2010-01-01 false Checking quality. 58.243 Section 58.243 Agriculture... Procedures § 58.243 Checking quality. All milk, milk products and dry milk products shall be subject to inspection and analysis by the dairy plant for quality and condition throughout each processing operation...
Code of Federal Regulations, 2010 CFR
2010-07-01
... until the leak check is passed. Post-test leak check ≤4% of average sampling rate After sampling ** See... the test site. The sorbent media must be obtained from a source that can demonstrate the quality...-traceable calibration gas standards and reagents shall be used for the tests and procedures required under...
Helical tomotherapy quality assurance with ArcCHECK.
Chapman, David; Barnett, Rob; Yartsev, Slav
2014-01-01
To design a quality assurance (QA) procedure for helical tomotherapy that measures multiple beam parameters with 1 delivery and uses a rotating gantry to simulate treatment conditions. The customized QA procedure was preprogrammed on the tomotherapy operator station. The dosimetry measurements were performed using an ArcCHECK diode array and an A1SL ion chamber inserted in the central holder. The ArcCHECK was positioned 10cm above the isocenter so that the 21-cm diameter detector array could measure the 40-cm wide tomotherapy beam. During the implementation of the new QA procedure, separate comparative measurements were made using ion chambers in both liquid and solid water, the tomotherapy onboard detector array, and a MapCHECK diode array for a period of 10 weeks. There was good agreement (within 1.3%) for the beam output and cone ratio obtained with the new procedure and the routine QA measurements. The measured beam energy was comparable (0.3%) to solid water measurement during the 10-week evaluation period, excluding 2 of the 10 measurements with unusually high background. The symmetry reading was similarly compromised for those 2 weeks, and on the other weeks, it deviated from the solid water reading by ~2.5%. The ArcCHECK phantom presents a suitable alternative for performing helical tomotherapy QA, provided the background is collected properly. The proposed weekly procedure using ArcCHECK and water phantom makes the QA process more efficient. Copyright © 2014 American Association of Medical Dosimetrists. Published by Elsevier Inc. All rights reserved.
Helical tomotherapy quality assurance with ArcCHECK
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chapman, David; Barnett, Rob; Yartsev, Slav, E-mail: slav.yartsev@lhsc.on.ca
2014-07-01
To design a quality assurance (QA) procedure for helical tomotherapy that measures multiple beam parameters with 1 delivery and uses a rotating gantry to simulate treatment conditions. The customized QA procedure was preprogrammed on the tomotherapy operator station. The dosimetry measurements were performed using an ArcCHECK diode array and an A1SL ion chamber inserted in the central holder. The ArcCHECK was positioned 10 cm above the isocenter so that the 21-cm diameter detector array could measure the 40-cm wide tomotherapy beam. During the implementation of the new QA procedure, separate comparative measurements were made using ion chambers in both liquid and solid water, the tomotherapy onboard detector array, and a MapCHECK diode array for a period of 10 weeks. There was good agreement (within 1.3%) for the beam output and cone ratio obtained with the new procedure and the routine QA measurements. The measured beam energy was comparable (0.3%) to solid water measurement during the 10-week evaluation period, excluding 2 of the 10 measurements with unusually high background. The symmetry reading was similarly compromised for those 2 weeks, and on the other weeks, it deviated from the solid water reading by ∼2.5%. The ArcCHECK phantom presents a suitable alternative for performing helical tomotherapy QA, provided the background is collected properly. The proposed weekly procedure using ArcCHECK and water phantom makes the QA process more efficient.
Standard Reference Specimens in Quality Control of Engineering Surfaces
Song, J. F.; Vorburger, T. V.
1991-01-01
In the quality control of engineering surfaces, we aim to understand and maintain a good relationship between the manufacturing process and surface function. This is achieved by controlling the surface texture. The control process involves: 1) learning the functional parameters and their control values through controlled experiments or through a long history of production and use; 2) maintaining high accuracy and reproducibility with measurements not only of roughness calibration specimens but also of real engineering parts. In this paper, the characteristics, utilizations, and limitations of different classes of precision roughness calibration specimens are described. A measuring procedure of engineering surfaces, based on the calibration procedure of roughness specimens at NIST, is proposed. This procedure involves utilization of check specimens with waveform, wavelength, and other roughness parameters similar to functioning engineering surfaces. These check specimens would be certified under standardized reference measuring conditions, or by a reference instrument, and could be used for overall checking of the measuring procedure and for maintaining accuracy and agreement in engineering surface measurement. The concept of “surface texture design” is also suggested, which involves designing the engineering surface texture, the manufacturing process, and the quality control procedure to meet the optimal functional needs. PMID:28184115
Operative blood transfusion quality improvement audit.
Al Sohaibani, Mazen; Al Malki, Assaf; Pogaku, Venumadhav; Al Dossary, Saad; Al Bernawi, Hanan
2014-01-01
To determine how the current anesthesia team handles the identification of the anaesthetized surgical patient (right patient) and the checking of the blood unit before collection and immediately before administration (right blood) in operating rooms where nurses have minimal duties and responsibility for handling blood for transfusion in anaesthetized patients; and to elicit the degree of anesthesia staff compliance with new policies and procedures for blood transfusion administration to anaesthetized surgical patients. The setting was a large tertiary care reference and teaching hospital; the design was a prospective quality improvement audit. The methods comprised elaboration of the steps for administering transfusions to anaesthetized patients from the policies and procedures, and analysis of the audit forms for conducted transfusions. An audit form was used to capture key performance indicators (KPIs) observed in all procedures involving blood transfusion; each item was ticked as met, partially met, not met or not applicable. Descriptive statistics (numbers and percentages) were produced in Microsoft Excel 2003, and the central quality improvement committee presented the results as numbers, percentages and graphs. The degree of compliance by anesthesia staff in performing the phases of blood transfusion reached a high percentage, giving assurance that the internal policy and procedures (IPP) are followed in the great majority of transfusions of red cells and other blood products, from the initial request for the blood or blood product to checking the patient in the immediate post-transfusion period. The specific problem area of giving blood transfusions to anaesthetized patients was audited through the KPIs covering the phases of blood transfusion, and the results assured the investigators of high-quality performance in transfusion procedures.
40 CFR Appendix B to Part 75 - Quality Assurance and Quality Control Procedures
Code of Federal Regulations, 2010 CFR
2010-07-01
... in section 2.3 of this appendix and the Hg emission tests described in §§ 75.81(c) and 75.81(d)(4). 1.2Specific Requirements for Continuous Emissions Monitoring Systems 1.2.1Calibration Error Test and Linearity Check Procedures Keep a written record of the procedures used for daily calibration error tests and...
Revisiting the Procedures for the Vector Data Quality Assurance in Practice
NASA Astrophysics Data System (ADS)
Erdoğan, M.; Torun, A.; Boyacı, D.
2012-07-01
Immense use of topographical data in spatial data visualization, business GIS (Geographic Information Systems) solutions and applications, and mobile and location-based services has forced topographic data providers to create standard, up-to-date and complete data sets in a sustainable frame. Data quality has been studied and researched for more than two decades. There are countless references on its semantics, its conceptual and logical representations, and many applications on spatial databases and GIS. However, there is a gap between research and practice in the sense of spatial data quality, which increases the costs and decreases the efficiency of data production. Spatial data quality is well known to academia and industry, but usually in different contexts. Research on spatial data quality has identified several issues of practical use, such as descriptive information, metadata, fulfillment of spatial relationships among data, integrity measures, geometric constraints, etc. Industry and data producers realize them in three stages: pre-, co- and post-data capturing. The pre-data capturing stage covers semantic modelling, data definition, cataloguing, modelling, data dictionary and schema creation processes. The co-data capturing stage covers general rules of spatial relationships, data- and model-specific rules such as topologic and model-building relationships, geometric thresholds, data extraction guidelines, and object-object, object-belonging class, object-non-belonging class and class-class relationships to be taken into account during data capturing. The post-data capturing stage covers specified QC (quality check) benchmarks and checking compliance with general and specific rules. The vector data quality criteria differ between the views of producers and users, but these criteria are generally driven by the needs, expectations and feedback of the users. This paper presents a practical method that closes the gap between theory and practice. Turning spatial data quality concepts into applications requires the conceptual, logical and, most importantly, physical existence of a data model, rules and knowledge of their realization in the form of geo-spatial data. The applicable metrics and thresholds are determined on this concrete base. This study discusses the application of geo-spatial data quality issues and QA (quality assurance) and QC procedures in topographic data production. Firstly, we introduce the MGCP (Multinational Geospatial Co-production Program) data profile of the NATO (North Atlantic Treaty Organization) DFDD (DGIWG Feature Data Dictionary), the requirements of the data owner, the view of data producers for both data capturing and QC, and finally QA to fulfil user needs. Then our practical and new approach, which divides the quality into three phases, is introduced. Finally, the implementation of our approach to accomplish the metrics, measures and thresholds of the quality definitions is discussed. In this paper, especially geometry and semantics quality and the quality control procedures that can be performed by the producers are discussed. Some applicable best practices that we experienced concerning techniques of quality control and regulations that define the objectives and data production procedures are given in the final remarks.
These quality control procedures should include visual checks of the source data, captured vector data and printouts, automatic checks that can be performed by software, and semi-automatic checks carried out in interaction with quality control personnel. Finally, these quality control procedures should ensure the geometric, semantic, attribution and metadata quality of vector data.
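The post-capture checks described in this abstract (geometric validity, thresholds, attribute completeness) are straightforward to automate. The following is a minimal sketch, assuming Shapely is available; the feature records, required attribute names and the area threshold are hypothetical placeholders, not the MGCP rules themselves.

```python
# Minimal sketch of automated post-capture vector QC checks.
# Assumes Shapely; the feature records, attribute list and threshold are hypothetical.
from shapely.geometry import shape

REQUIRED_ATTRIBUTES = {"FCODE", "NAM", "SRC_DATE"}   # hypothetical data-dictionary fields
MIN_AREA = 25.0                                      # hypothetical geometric threshold

def qc_feature(record):
    """Return a list of QC findings for one GeoJSON-like feature."""
    findings = []
    geom = shape(record["geometry"])
    if not geom.is_valid:
        findings.append("invalid geometry (self-intersection, bad ring order, ...)")
    if geom.geom_type == "Polygon" and geom.area < MIN_AREA:
        findings.append(f"area {geom.area:.1f} below threshold {MIN_AREA}")
    missing = REQUIRED_ATTRIBUTES - set(record.get("properties", {}))
    if missing:
        findings.append(f"missing attributes: {sorted(missing)}")
    return findings

# Usage: run over all captured features and report only those with findings.
features = [
    {"id": 1,
     "geometry": {"type": "Polygon",
                  "coordinates": [[(0, 0), (10, 0), (10, 10), (0, 10), (0, 0)]]},
     "properties": {"FCODE": "AL015"}},
]
report = {f["id"]: qc_feature(f) for f in features if qc_feature(f)}
print(report)
```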
A System Approach to Navy Medical Education and Training. Appendix 22. Otolaryngology Technician.
1974-08-31
... procedures to patient; 12 explain lumbar puncture procedures to patient; 13 measure/weigh patient or personnel; 14 check central venous pressure; 15 take blood pressure; 16 check radial (wrist) pulse; 17 check femoral pulse for presence and quality; 18 determine apical pulse rate/rhythm with stethoscope; 19 check patient's temperature; 20 check/count respirations; 21 perform circulation check, e.g. color, pulse, temperature of skin, capillary return; 22 ...
SU-E-T-649: Quality Assurances for Proton Therapy Delivery Equipment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Arjomandy, B; Kase, Y; Flanz, J
2015-06-15
Purpose: The number of proton therapy centers has increased dramatically over the past decade. Currently, there is no comprehensive set of guidelines that addresses quality assurance (QA) procedures for the different technologies used for proton therapy. The AAPM has charged task group 224 (TG-224) to provide recommendations for QA required for accurate and safe dose delivery, using existing and next generation proton therapy delivery equipment. Methods: A database comprised of QA procedures and tolerance limits was generated from many existing proton therapy centers in and outside of the US. These consist of proton therapy centers that possessed double scattering, uniform scanning, and pencil beams delivery systems. The diversity in beam delivery systems as well as the existing devices to perform QA checks for different beam parameters is the main subject of TG-224. Based on current practice at the clinically active proton centers participating in this task group, consensus QA recommendations were developed. The methodologies and requirements of the parameters that must be verified for consistency of the performance of the proton beam delivery systems are discussed. Results: TG-224 provides procedures and QA checks for mechanical, imaging, safety and dosimetry requirements for different proton equipment. These procedures are categorized based on their importance and their required frequencies in order to deliver a safe and consistent dose. The task group provides daily, weekly, monthly, and annual QA check procedures with their tolerance limits. Conclusions: The procedures outlined in this protocol provide sufficient information to qualified medical physicists to perform QA checks for any proton delivery system. Execution of these procedures should provide confidence that proton therapy equipment is functioning as commissioned for patient treatment and delivers dose safely and accurately within the established tolerance limits. The report will be published in late 2015.
Quality assurance of weather data for agricultural system model input
USDA-ARS?s Scientific Manuscript database
It is well known that crop production and hydrologic variation on watersheds are weather related. Rarely, however, are meteorological data quality checks reported for agricultural systems model research. We present quality assurance procedures for agricultural system model weather data input. Problems...
[Video recording system of endoscopic procedures for digital forensics].
Endo, Chiaki; Sakurada, A; Kondo, T
2009-07-01
Recently, endoscopic procedures including surgery, intervention, and examination have been widely performed. Medical practitioners are required to record the procedures precisely in order to check them retrospectively and to obtain a legally reliable record. The Medical Forensic System made by KS Olympus Japan records two kinds of video together with patient data, such as heart rate, blood pressure, and SpO2, simultaneously. We installed this system in the bronchoscopy room and have experienced its benefit. Under this system, we can obtain the bronchoscopic image, the bronchoscopy room view, and the patient's data simultaneously. We can check the quality of the bronchoscopic procedures retrospectively, which is useful for bronchoscopy staff training. The Medical Forensic System should be installed for any kind of endoscopic procedure.
Root, Patsy; Hunt, Margo; Fjeld, Karla; Kundrat, Laurie
2014-01-01
Quality assurance (QA) and quality control (QC) data are required in order to have confidence in the results from analytical tests and the equipment used to produce those results. Some AOAC water methods include specific QA/QC procedures, frequencies, and acceptance criteria, but these are considered to be the minimum controls needed to perform a microbiological method successfully. Some regulatory programs, such as those at Code of Federal Regulations (CFR), Title 40, Part 136.7 for chemistry methods, require additional QA/QC measures beyond those listed in the method, which can also apply to microbiological methods. Essential QA/QC measures include sterility checks, reagent specificity and sensitivity checks, assessment of each analyst's capabilities, analysis of blind check samples, and evaluation of the presence of laboratory contamination and instrument calibration and checks. The details of these procedures, their performance frequency, and expected results are set out in this report as they apply to microbiological methods. The specific regulatory requirements of CFR Title 40 Part 136.7 for the Clean Water Act, the laboratory certification requirements of CFR Title 40 Part 141 for the Safe Drinking Water Act, and the International Organization for Standardization 17025 accreditation requirements under The NELAC Institute are also discussed.
DOT National Transportation Integrated Search
2009-07-01
Current roadway quality control and quality acceptance (QC/QA) procedures for Louisiana include coring for thickness, density, and air void checks in hot mix asphalt (HMA) pavements and thickness and compressive strength for Portland cement con...
46 CFR 160.132-9 - Preapproval review.
Code of Federal Regulations, 2014 CFR
2014-10-01
... inventory of materials; (iii) The method for checking quality of fabrication and joints, including welding... §§ 160.132-19 and 160.132-21 of this subpart; (5) A description of the quality control procedures and... between the independent laboratory and Commandant under 46 CFR subpart 159.010. (d) Plan quality. All...
46 CFR 160.132-9 - Preapproval review.
Code of Federal Regulations, 2012 CFR
2012-10-01
... inventory of materials; (iii) The method for checking quality of fabrication and joints, including welding... §§ 160.132-19 and 160.132-21 of this subpart; (5) A description of the quality control procedures and... between the independent laboratory and Commandant under 46 CFR subpart 159.010. (d) Plan quality. All...
46 CFR 160.132-9 - Preapproval review.
Code of Federal Regulations, 2013 CFR
2013-10-01
... inventory of materials; (iii) The method for checking quality of fabrication and joints, including welding... §§ 160.132-19 and 160.132-21 of this subpart; (5) A description of the quality control procedures and... between the independent laboratory and Commandant under 46 CFR subpart 159.010. (d) Plan quality. All...
NASA Astrophysics Data System (ADS)
Manzella, G. M. R.; Scoccimarro, E.; Pinardi, N.; Tonani, M.
2003-01-01
A "ship of opportunity" program was launched as part of the Mediterranean Forecasting System Pilot Project. During the operational period (September 1999 to May 2000), six tracks covered the Mediterranean from the northern to southern boundaries approximately every 15 days, while a long eastwest track from Haifa to Gibraltar was covered approximately every month. XBT data were collected, sub-sampled at 15 inflection points and transmitted through a satellite communication system to a regional data centre. It was found that this data transmission system has limitations in terms of quality of the temperature profiles and quantity of data successfully transmitted. At the end of the MFSPP operational period, a new strategy for data transmission and management was developed. First of all, VOS-XBT data are transmitted with full resolution. Secondly, a new data management system, called Near Real Time Quality Control for XBT (NRT.QC.XBT), was defined to produce a parallel stream of high quality XBT data for further scientific analysis. The procedure includes: (1) Position control; (2) Elimination of spikes; (3) Re-sampling at a 1 metre vertical interval; (4) Filtering; (5) General malfunctioning check; (6) Comparison with climatology (and distance from this in terms of standard deviations); (7) Visual check; and (8) Data consistency check. The first six steps of the new procedure are completely automated; they are also performed using a new climatology developed as part of the project. The visual checks are finally done with a free-market software that allows NRT final data assessment.
46 CFR 160.115-9 - Preapproval review.
Code of Federal Regulations, 2012 CFR
2012-10-01
... inventory of materials; (iii) The method for checking quality of fabrication and joints, including welding... §§ 160.115-19 and 160.115-21 of this subpart; (5) A description of the quality control procedures and... between the independent laboratory and Commandant under 46 CFR part 159, subpart 159.010. (d) Plan quality...
46 CFR 160.115-9 - Preapproval review.
Code of Federal Regulations, 2014 CFR
2014-10-01
... inventory of materials; (iii) The method for checking quality of fabrication and joints, including welding... §§ 160.115-19 and 160.115-21 of this subpart; (5) A description of the quality control procedures and... between the independent laboratory and Commandant under 46 CFR part 159, subpart 159.010. (d) Plan quality...
46 CFR 160.115-9 - Preapproval review.
Code of Federal Regulations, 2013 CFR
2013-10-01
... inventory of materials; (iii) The method for checking quality of fabrication and joints, including welding... §§ 160.115-19 and 160.115-21 of this subpart; (5) A description of the quality control procedures and... between the independent laboratory and Commandant under 46 CFR part 159, subpart 159.010. (d) Plan quality...
40 CFR Appendix A to Part 58 - Quality Assurance Requirements for SLAMS, SPMs and PSD Air Monitoring
Code of Federal Regulations, 2014 CFR
2014-07-01
... monitor. 3.3.4.4Pb Performance Evaluation Program (PEP) Procedures. Each year, one performance evaluation... Information 2. Quality System Requirements 3. Measurement Quality Check Requirements 4. Calculations for Data... 10 of this appendix) and at a national level in references 1, 2, and 3 of this appendix. 1...
40 CFR Appendix A to Part 58 - Quality Assurance Requirements for SLAMS, SPMs and PSD Air Monitoring
Code of Federal Regulations, 2013 CFR
2013-07-01
... monitor. 3.3.4.4Pb Performance Evaluation Program (PEP) Procedures. Each year, one performance evaluation... Information 2. Quality System Requirements 3. Measurement Quality Check Requirements 4. Calculations for Data... 10 of this appendix) and at a national level in references 1, 2, and 3 of this appendix. 1...
46 CFR 160.133-9 - Preapproval review.
Code of Federal Regulations, 2012 CFR
2012-10-01
... manual as described in §§ 160.133-19 and 160.133-21 of this subpart; (7) A description of the quality... suppliers; (ii) The method for controlling the inventory of materials; (iii) The method for checking quality of fabrication and joints, including welding inspection procedures; and (iv) The inspection...
46 CFR 160.133-9 - Preapproval review.
Code of Federal Regulations, 2013 CFR
2013-10-01
... manual as described in §§ 160.133-19 and 160.133-21 of this subpart; (7) A description of the quality... suppliers; (ii) The method for controlling the inventory of materials; (iii) The method for checking quality of fabrication and joints, including welding inspection procedures; and (iv) The inspection...
46 CFR 160.170-9 - Preapproval review.
Code of Federal Regulations, 2014 CFR
2014-10-01
... manual as described in §§ 160.170-19 and 160.170-21 of this subpart; (7) A description of the quality... suppliers; (ii) The method for controlling the inventory of materials; (iii) The method for checking quality of fabrication and joints, including welding inspection procedures; and (iv) The inspection...
46 CFR 160.170-9 - Preapproval review.
Code of Federal Regulations, 2013 CFR
2013-10-01
... manual as described in §§ 160.170-19 and 160.170-21 of this subpart; (7) A description of the quality... suppliers; (ii) The method for controlling the inventory of materials; (iii) The method for checking quality of fabrication and joints, including welding inspection procedures; and (iv) The inspection...
46 CFR 160.133-9 - Preapproval review.
Code of Federal Regulations, 2014 CFR
2014-10-01
... manual as described in §§ 160.133-19 and 160.133-21 of this subpart; (7) A description of the quality... suppliers; (ii) The method for controlling the inventory of materials; (iii) The method for checking quality of fabrication and joints, including welding inspection procedures; and (iv) The inspection...
46 CFR 160.170-9 - Preapproval review.
Code of Federal Regulations, 2012 CFR
2012-10-01
... manual as described in §§ 160.170-19 and 160.170-21 of this subpart; (7) A description of the quality... suppliers; (ii) The method for controlling the inventory of materials; (iii) The method for checking quality of fabrication and joints, including welding inspection procedures; and (iv) The inspection...
Enhancement of the Automated Quality Control Procedures for the International Soil Moisture Network
NASA Astrophysics Data System (ADS)
Heer, Elsa; Xaver, Angelika; Dorigo, Wouter; Messner, Romina
2017-04-01
In-situ soil moisture observations are still trusted to be the most reliable data to validate remotely sensed soil moisture products. Thus, the quality of in-situ soil moisture observations is of high importance. The International Soil Moisture Network (ISMN; http://ismn.geo.tuwien.ac.at/) provides in-situ soil moisture data from all around the world. The data is collected from individual networks and data providers, measured by different sensors at various depths. The data sets, which are delivered in different units, time zones and data formats, are then transformed into homogeneous data sets. Erroneous behavior of soil moisture data is very difficult to detect, due to annual and daily changes and, most significantly, the high influence of precipitation and snow-melting processes. Only a few of the network providers have a quality assessment for their data sets. Therefore, advanced quality control procedures have been developed for the ISMN (Dorigo et al. 2013). Three categories of quality checks were introduced: exceeding boundary values, geophysical consistency checks and a spectrum-based approach. The spectrum-based quality control algorithms aim to detect erroneous measurements which occur within plausible geophysical ranges, e.g. a sudden drop in soil moisture caused by a sensor malfunction. By defining several conditions which have to be met by the original soil moisture time series and their first and second derivatives, such error types can be detected. Since the development of these sophisticated methods many more data providers have shared their data with the ISMN and new types of erroneous measurements were identified. Thus, an enhancement of the automated quality control procedures became necessary. In the present work, we introduce enhancements of the existing quality control algorithms. Additionally, six completely new quality checks have been developed, e.g. detection of suspicious values before or after NaN values, constant values, and values lying in a stretch where a large majority of the values before and after is flagged and a sensor malfunction is therefore certain. For the evaluation of the enhanced automated quality control system, many test data sets were chosen and manually validated for comparison with the existing quality control procedures and the new algorithms. Improvements will be shown that assure an appropriate assessment of the ISMN data sets, which are used for the validation of soil moisture data retrieved from satellites and are the foundation of many other scientific publications.
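The checks sketched in this abstract operate on the soil moisture series and its differences (sudden drops), on constant stretches, and on values adjacent to gaps. The following is a minimal NumPy illustration; the thresholds and the short synthetic series are hypothetical stand-ins for the ISMN flagging criteria, not the published ones.

```python
# Minimal sketch of derivative-based, constant-value and near-gap soil moisture checks.
# Thresholds and the example series are hypothetical stand-ins for the ISMN criteria.
import numpy as np

def flag_sudden_drop(sm, drop=0.1):
    """Flag a sudden decrease in soil moisture between consecutive samples."""
    d1 = np.diff(sm, prepend=sm[0])          # first difference per time step
    return d1 < -drop

def flag_constant(sm, min_len=10, tol=1e-4):
    """Flag runs of at least min_len nearly identical values (stuck sensor)."""
    flags = np.zeros(sm.size, dtype=bool)
    run_start = 0
    for i in range(1, sm.size + 1):
        if i == sm.size or abs(sm[i] - sm[run_start]) > tol:
            if i - run_start >= min_len:
                flags[run_start:i] = True
            run_start = i
    return flags

def flag_near_gaps(sm, width=1):
    """Flag values directly before or after missing (NaN) samples."""
    isnan = np.isnan(sm)
    near = np.zeros_like(isnan)
    for shift in range(1, width + 1):
        near |= np.roll(isnan, shift) | np.roll(isnan, -shift)
    return near & ~isnan

sm = np.array([0.30, 0.31, 0.30, 0.05, 0.05, np.nan, 0.29, 0.30])
flags = flag_sudden_drop(sm) | flag_constant(sm, min_len=2) | flag_near_gaps(sm)
print(flags)
```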
Data Quality Control of the French Permanent Broadband Network in the RESIF Framework
NASA Astrophysics Data System (ADS)
Grunberg, Marc; Lambotte, Sophie; Engels, Fabien; Dretzen, Remi; Hernandez, Alain
2014-05-01
In the framework of the RESIF (Réseau Sismologique et géodésique Français) project, a new information system is being set up, allowing the improvement of the management and distribution of high quality data from the different elements of RESIF and the associated networks. Within this information system, EOST (in Strasbourg) is in charge of collecting real-time permanent broadband seismic waveforms and performing Quality Control on these data. The real-time and validated data sets are pushed to the French National Distribution Center (Isterre/Grenoble) in order to make them publicly available. Furthermore, EOST hosts the BCSF-ReNaSS, in charge of the French metropolitan seismic bulletin. This allows us to benefit from some high-end quality control based on the national and world-wide seismicity. Here we present first the real-time seismic data flow from the stations of the French National Broad Band Network to EOST, and then the data Quality Control procedures that were recently installed, including some new developments. The data Quality Control consists in applying a variety of subprocesses to check the consistency of the whole system and process from the stations to the data center. This allows us to verify that instruments and data transmission are operating correctly. Moreover, analysis of the ambient noise helps to characterize the intrinsic seismic quality of the stations and to identify other kinds of disturbances. The deployed Quality Control consists of a pipeline that starts with low-level procedures: checking the real-time miniSEED data files (file naming convention, data integrity), checking for inconsistencies between waveforms and metadata (channel name, sample rate, etc.), and computing waveform statistics (data availability, gaps/overlaps, mean, rms, time quality, spikes). It is followed by high-level procedures such as power spectral density (PSD) computation, STA/LTA computation to be correlated with the seismicity, phase picking, and station magnitude discrepancies. The results of the quality control are visualized through a web interface. The latter gathers data from different information systems to provide a global view of recent events that could impact the data (such as interventions on site or seismic events). This work is still an ongoing project. We intend to add more sophisticated procedures to enhance our data Quality Control. Among them, we will deploy a seismic moment tensor inversion tool for amplitude, time and polarity control and a noise correlation procedure for time drift detection.
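The low-level stage of such a pipeline (per-trace statistics, crude spike counting) and the STA/LTA step can be illustrated compactly. The following is a minimal sketch with NumPy only; window lengths, the 10*rms spike criterion and the synthetic trace are hypothetical, and an operational pipeline would normally build on a seismological library rather than raw arrays.

```python
# Minimal sketch of low-level waveform statistics and a simple STA/LTA detector.
# Window lengths, thresholds and the synthetic trace are hypothetical.
import numpy as np

def waveform_stats(x, fs):
    """Basic per-trace statistics: mean, rms and a crude spike count."""
    mean = float(np.mean(x))
    rms = float(np.sqrt(np.mean((x - mean) ** 2)))
    spikes = int(np.sum(np.abs(x - mean) > 10.0 * rms))   # hypothetical 10*rms criterion
    return {"mean": mean, "rms": rms, "n_spikes": spikes, "duration_s": len(x) / fs}

def sta_lta(x, fs, sta_s=1.0, lta_s=30.0):
    """Classic STA/LTA ratio on the squared signal (simple moving averages)."""
    e = x.astype(float) ** 2
    def moving_avg(sig, n):
        c = np.cumsum(np.insert(sig, 0, 0.0))
        out = np.zeros_like(sig)
        out[n - 1:] = (c[n:] - c[:-n]) / n
        return out
    sta = moving_avg(e, int(sta_s * fs))
    lta = moving_avg(e, int(lta_s * fs))
    with np.errstate(divide="ignore", invalid="ignore"):
        return np.where(lta > 0, sta / lta, 0.0)

fs = 100.0
x = np.random.randn(int(120 * fs))          # placeholder for a real broadband trace
x[6000:6200] += 20.0                        # synthetic "event"
print(waveform_stats(x, fs), sta_lta(x, fs).max())
```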
Quality Assurance and Quality Control, Part 2.
Akers, Michael J
2015-01-01
The tragedy surrounding the New England Compounding Center and contaminated steroid syringe preparations clearly points out what can happen if quality-assurance and quality-control procedures are not strictly practiced in the compounding of sterile preparations. This article is part 2 of a two-part article on requirements to comply with United States Pharmacopeia general chapters <797> and <1163> with respect to quality assurance of compounded sterile preparations. Part 1 covered documentation requirements, inspection procedures, compounding accuracy checks, and part of a discussion on bacterial endotoxin testing. Part 2 covers sterility testing, the completion of the part 1 discussion on bacterial endotoxin testing, a brief discussion of United States Pharmacopeia <1163>, and advances in pharmaceutical quality systems.
Assessment of Petrological Microscopes.
ERIC Educational Resources Information Center
Mathison, Charter Innes
1990-01-01
Presented is a set of procedures designed to check the design, ergonomics, illumination, function, optics, accessory equipment, and image quality of a microscope being considered for purchase. Functions for use in a petrology or mineralogy laboratory are stressed. (CW)
DOT National Transportation Integrated Search
2010-06-01
This manual provides information and recommended procedures to be utilized by an agency's Weigh-in-Motion (WIM) Office Data Analyst to perform validation and quality control (QC) checks of WIM traffic data. This manual focuses on data generated by ...
Data quality in a DRG-based information system.
Colin, C; Ecochard, R; Delahaye, F; Landrivon, G; Messy, P; Morgon, E; Matillon, Y
1994-09-01
The aim of this study initiated in May 1990 was to evaluate the quality of the medical data collected from the main hospital of the "Hospices Civils de Lyon", Edouard Herriot Hospital. We studied a random sample of 593 discharge abstracts from 12 wards of the hospital. Quality control was performed by checking multi-hospitalized patients' personal data, checking that each discharge abstract was exhaustive, examining the quality of abstracting, studying diagnoses and medical procedures coding, and checking data entry. Assessment of personal data showed a 4.4% error rate. It was mainly accounted for by spelling mistakes in surnames and first names, and mistakes in dates of birth. The quality of a discharge abstract was estimated according to the two purposes of the medical information system: description of hospital morbidity per patient and Diagnosis Related Group's case mix. Error rates in discharge abstracts were expressed in two ways: an overall rate for errors of concordance between Discharge Abstracts and Medical Records, and a specific rate for errors modifying classification in Diagnosis Related Groups (DRG). For abstracting medical information, these error rates were 11.5% (SE +/- 2.2) and 7.5% (SE +/- 1.9) respectively. For coding diagnoses and procedures, they were 11.4% (SE +/- 1.5) and 1.3% (SE +/- 0.5) respectively. For data entry on the computerized data base, the error rate was 2% (SE +/- 0.5) and 0.2% (SE +/- 0.05). Quality control must be performed regularly because it demonstrates the degree of participation from health care teams and the coherence of the database.(ABSTRACT TRUNCATED AT 250 WORDS)
Data Quality Control of the French Permanent Broadband Network in the RESIF Framework.
NASA Astrophysics Data System (ADS)
Grunberg, M.; Lambotte, S.; Engels, F.
2014-12-01
In the framework of the RESIF (Réseau Sismologique et géodésique Français) project, a new information system is being set up, allowing the improvement of the management and distribution of high quality data from the different elements of RESIF. Within this information system, EOST (in Strasbourg) is in charge of collecting real-time permanent broadband seismic waveforms and performing Quality Control on these data. The real-time and validated data sets are pushed to the French National Distribution Center (Isterre/Grenoble) to make them publicly available. Furthermore, EOST hosts the BCSF-ReNaSS, in charge of the French metropolitan seismic bulletin. This allows us to benefit from some high-end quality control based on the national and world-wide seismicity. Here we present the real-time seismic data flow from the stations of the French National Broad Band Network to EOST, and then the data Quality Control procedures that were recently installed, including some new developments. The data Quality Control consists of applying a variety of processes to check the consistency of the whole system from the stations to the data center. This allows us to verify that instruments and data transmission are operating correctly. Moreover, time quality is critical for most scientific data applications. To face this challenge and check the consistency of polarities and amplitudes, we deployed several high-end processes, including a noise correlation procedure to check timing accuracy (instrumental time errors result in a time shift of the whole cross-correlation, clearly distinct from those due to changes in the medium's physical properties), and a systematic comparison of synthetic and real data for teleseismic earthquakes of magnitude larger than 6.5 to detect timing errors as well as polarity and amplitude problems.
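The timing check described here rests on the fact that an instrumental clock error shifts the whole noise cross-correlation function rigidly, whereas medium changes distort it. The shift between a daily correlation and a long-term reference can be estimated with a simple cross-correlation. The following is a minimal NumPy sketch; the sampling interval, the reference function and the simulated clock error are synthetic placeholders.

```python
# Minimal sketch of detecting a clock drift from noise cross-correlation functions:
# an instrumental time error shifts the daily correlation rigidly with respect to
# a long-term reference. All inputs below are synthetic placeholders.
import numpy as np

def clock_shift(reference, daily, dt):
    """Lag (in seconds) that best aligns the daily correlation with the reference."""
    n = len(reference)
    xcorr = np.correlate(daily - daily.mean(), reference - reference.mean(), mode="full")
    lag_samples = np.argmax(xcorr) - (n - 1)
    return lag_samples * dt

dt = 0.05                                        # 20 Hz correlation functions
lags = np.arange(-2000, 2001) * dt
reference = np.exp(-(np.abs(lags) - 30.0) ** 2)  # synthetic two-sided arrival at +/-30 s
daily = np.roll(reference, 8)                    # simulate a +0.4 s clock error
print(f"estimated clock error: {clock_shift(reference, daily, dt):+.2f} s")
```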
Valid internal standard technique for arson detection based on gas chromatography-mass spectrometry.
Salgueiro, Pedro A S; Borges, Carlos M F; Bettencourt da Silva, Ricardo J N
2012-09-28
The most popular procedures for the detection of residues of accelerants in fire debris are the ones published by the American Society for Testing and Materials (ASTM E1412-07 and E1618-10). The most critical stages of these tests are the conservation of fire debris from the sampling to the laboratory, the extraction of residues of accelerants from the debris to the activated charcoal strips (ACS) and from those to the final solvent, as well as the analysis of the sample extract by gas chromatography-mass spectrometry (GC-MS) and the interpretation of the instrumental signal. This work proposes a strategy for checking the quality of the sample conservation, the transfer of accelerant residues to the final solvent, and the GC-MS analysis, using internal standard additions. Internal standards are used, ranging from a highly volatile compound for checking debris conservation to a low-volatility compound for checking GC-MS repeatability. The developed quality control (QC) parameters are not affected by GC-MS sensitivity variation and, specifically, the GC-MS performance control is not affected by ACS adsorption saturation that may mask test performance deviations. The proposed QC procedure proved to be adequate to check GC-MS repeatability, ACS extraction and sample conservation since: (1) standard additions are affected by negligible uncertainty and (2) the observed dispersion of the QC parameters is fit for the intended use. Copyright © 2012 Elsevier B.V. All rights reserved.
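The key idea, that QC parameters built as ratios of internal-standard responses are insensitive to overall GC-MS sensitivity drift because a multiplicative change in instrument response cancels in the ratio, can be shown numerically. The following is a minimal sketch; the compound labels, peak areas and acceptance limits are hypothetical placeholders, not values from the ASTM methods or this paper.

```python
# Minimal sketch of internal-standard-ratio QC parameters for fire-debris analysis.
# Compound labels, peak areas and acceptance limits are hypothetical placeholders.

def qc_ratios(areas):
    """Ratios of each internal-standard area to the least volatile one.

    A uniform sensitivity change of the GC-MS scales every area by the same
    factor, so these ratios stay constant; losses during debris conservation
    or ACS extraction deplete the more volatile standards and lower the ratios.
    """
    low_volatility = areas["IS_low_volatility"]
    return {name: area / low_volatility for name, area in areas.items()
            if name != "IS_low_volatility"}

reference = {"IS_high_volatility": 9.8e5, "IS_mid_volatility": 1.1e6, "IS_low_volatility": 1.0e6}
sample    = {"IS_high_volatility": 4.5e5, "IS_mid_volatility": 1.0e6, "IS_low_volatility": 9.5e5}

ref_r, smp_r = qc_ratios(reference), qc_ratios(sample)
for name in ref_r:
    recovery = smp_r[name] / ref_r[name]
    status = "OK" if 0.7 <= recovery <= 1.3 else "CHECK conservation/extraction"
    print(f"{name}: recovery {recovery:.2f} -> {status}")
```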
Web-based video monitoring of CT and MRI procedures
NASA Astrophysics Data System (ADS)
Ratib, Osman M.; Dahlbom, Magdalena; Kho, Hwa T.; Valentino, Daniel J.; McCoy, J. Michael
2000-05-01
A web-based video transmission of images from CT and MRI consoles was implemented in an Intranet environment for real-time monitoring of ongoing procedures. Images captured from the consoles are compressed to video resolution and broadcast through a web server. When called upon, the attending radiologists can view these live images on any computer within the secured Intranet network. With adequate compression, these images can be displayed simultaneously in different locations at a rate of 2 to 5 images/sec through standard LAN. Although the quality of the images is insufficient for diagnostic purposes, our user survey showed that they are suitable for supervising a procedure, positioning the imaging slices, and routine quality checking before completion of a study. The system was implemented at UCLA to monitor 9 CTs and 6 MRIs distributed in 4 buildings. This system significantly improved the radiologists' productivity by saving precious time spent in trips between reading rooms and examination rooms. It also improved patient throughput by reducing the waiting time for the radiologists to come to check a study before moving the patient from the scanner.
Rural-Urban Differences in Medicare Quality Outcomes and the Impact of Risk Adjustment.
Henning-Smith, Carrie; Kozhimannil, Katy; Casey, Michelle; Prasad, Shailendra; Moscovice, Ira
2017-09-01
There has been considerable debate in recent years about whether, and how, to risk-adjust quality measures for sociodemographic characteristics. However, geographic location, especially rurality, has been largely absent from the discussion. To examine differences by rurality in quality outcomes, and the impact of adjustment for individual and community-level sociodemographic characteristics on quality outcomes. The 2012 Medicare Current Beneficiary Survey, Access to Care module, combined with the 2012 County Health Rankings. All data used were publicly available, secondary data. We merged the 2012 Medicare Current Beneficiary Survey data with the 2012 County Health Rankings data using county of residence. We compared 6 unadjusted quality of care measures for Medicare beneficiaries (satisfaction with care, blood pressure checked, cholesterol checked, flu shot receipt, change in health status, and all-cause annual readmission) by rurality (rural noncore, micropolitan, and metropolitan). We then ran nested multivariable logistic regression models to assess the impact of adjusting for community and individual-level sociodemographic characteristics to determine whether these mediate the rurality difference in quality of care. The relationship between rurality and change in health status was mediated by the inclusion of community-level characteristics; however, adjusting for community and individual-level characteristics caused differences by rurality to emerge in 2 of the measures: blood pressure checked and cholesterol checked. For all quality scores, model fit improved after adding community and individual characteristics. Quality is multifaceted and is impacted by individual and community-level socio-demographic characteristics, as well as by geographic location. Current debates about risk-adjustment procedures should take rurality into account.
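The analytic step described here, nested multivariable logistic regression with progressively richer adjustment sets, can be sketched compactly. The following is a minimal illustration with statsmodels; the variable names (got_flu_shot, rurality, income, county_uninsured_rate) and the generated data are hypothetical stand-ins for the survey and county-level measures.

```python
# Minimal sketch of nested logistic regression models for a binary quality measure.
# Variable names and data are hypothetical, synthetic stand-ins for the merged
# Medicare Current Beneficiary Survey / County Health Rankings variables.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 400
rurality = rng.choice(["metro", "micro", "rural"], size=n)
income = rng.normal(50, 12, n)                                  # individual-level covariate
uninsured = rng.normal(12, 4, n) + (rurality == "rural") * 4    # community-level covariate
logit_p = -0.5 + 0.02 * (income - 50) - 0.05 * (uninsured - 12) + 0.3 * (rurality == "metro")
df = pd.DataFrame({
    "got_flu_shot": rng.binomial(1, 1.0 / (1.0 + np.exp(-logit_p))),
    "rurality": rurality,
    "income": income,
    "county_uninsured_rate": uninsured,
})

# Nested models: unadjusted, plus community-level, plus individual-level adjustment.
models = {
    "unadjusted":   "got_flu_shot ~ C(rurality)",
    "+ community":  "got_flu_shot ~ C(rurality) + county_uninsured_rate",
    "+ individual": "got_flu_shot ~ C(rurality) + county_uninsured_rate + income",
}
for label, formula in models.items():
    fit = smf.logit(formula, data=df).fit(disp=False)
    print(label, fit.params.filter(like="rurality").round(3).to_dict())
```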
Basal Area Growth Estimators for Survivor Component: A Quality Control Application
Charles E. Thomas; Francis A. Roesch
1990-01-01
Several possible estimators are available for basal area growth of survivor trees, when horizontal prism (or point) plots (HPP) are remeasured. This study's comparison of three estimators not only provides a check for the estimate of basal area growth but suggests that they can provide a quality control indicator for yield procedures. An example is derived from...
NASA Astrophysics Data System (ADS)
Toussaint, F.; Hoeck, H.; Stockhause, M.; Lautenschlager, M.
2014-12-01
The classical goals of a quality assessment system in the data life cycle are (1) to encourage data creators to improve their quality assessment procedures to reach the next quality level and (2) to enable data consumers to decide whether a dataset has a quality that is sufficient for usage in the target application, i.e. to appraise the data usability for their own purpose. As the data volumes of projects and the interdisciplinarity of data usage grow, the need for homogeneous structure and standardised notation of data and metadata increases. This third aspect is especially valid for data repositories, as they manage data through machine agents. Checks for homogeneity and consistency in early parts of the workflow therefore become essential to cope with today's data volumes. Selected parts of the workflow in the model intercomparison project CMIP5, the archival of the data for the interdisciplinary user community of the IPCC-DDC AR5, and the associated quality checks are reviewed. We compare data and metadata checks and relate different types of checks to their positions in the data life cycle. The project's data citation approach is included in the discussion, with a focus on temporal aspects: the time necessary to comply with the project's requirements for formal data citations and the demand for the availability of such data citations. In order to make different quality assessments of projects comparable, WDCC developed a generic Quality Assessment System. Based on the self-assessment approach of a maturity matrix, an objective and uniform quality level system for all data at WDCC is derived, consisting of five maturity quality levels.
Class Model Development Using Business Rules
NASA Astrophysics Data System (ADS)
Skersys, Tomas; Gudas, Saulius
New developments in the area of computer-aided system engineering (CASE) greatly improve processes of the information systems development life cycle (ISDLC). Much effort is put into quality improvement issues, but IS development projects still suffer from the poor quality of models during the system analysis and design cycles. To some degree, the quality of models developed using CASE tools can be assured using various automated model comparison and syntax checking procedures. It is also reasonable to check these models against the business domain knowledge, but the domain knowledge stored in the repository of a CASE tool (enterprise model) is insufficient (Gudas et al. 2004). Involvement of business domain experts in these processes is complicated because non-IT people often find it difficult to understand models that were developed by IT professionals using some specific modeling language.
Austrian Daily Climate Data Rescue and Quality Control
NASA Astrophysics Data System (ADS)
Jurkovic, A.; Lipa, W.; Adler, S.; Albenberger, J.; Lechner, W.; Swietli, R.; Vossberg, I.; Zehetner, S.
2010-09-01
Checked climate datasets are a "conditio sine qua non" for all projects that are relevant for environment and climate. In the framework of climate change studies and analyses it is essential to work with quality-controlled and trustworthy data. Furthermore, these datasets are used as input for various simulation models. With regard to investigations of extreme events, like strong precipitation periods, drought periods and similar ones, we need climate data in high temporal resolution (at least daily resolution). Because of the historical background - during the Second World War the majority of our climate sheets were sent to Berlin, where the historical sheets were destroyed by a bomb attack, so important information was lost - only a few climate sheets, mostly duplicates, from before 1939 are available and stored in our climate data archive. In 1970 the Central Institute for Meteorology and Geodynamics in Vienna started a first attempt to digitize climate data by means of punch cards. With the introduction of a routine climate data quality control in 1984 we can speak of thoroughly checked daily data (finally checked data, quality flag 6). Our group has been working on the digitization and quality control of the historical data for the period 1872 to 1983 for 18 years. Since 2007 it has been possible to intensify this work in the framework of an internal project, namely Austrian Climate Data Rescue and Quality Control. The aim of this initiative was - and still is - to supply daily data in an outstandingly good and uniform quality. So this project is a kind of pre-project for all scientific projects which work with daily data. In addition to the routine quality checks (running since 1984) using the commercial Bull software, we are testing our data with additional open source software, namely ProClim.db. By the use of this spatial and statistical test procedure, the elements air temperature and precipitation - for several sites in Carinthia - could already be checked, flagged and corrected. Checking the output (the so-called error list) of ProClim is very time consuming and needs trained staff; however, it is ultimately necessary. Following the guideline "Your archive is your business card for quality", the sub-project NEW ARCHIVE was initialized and started at the end of 2009. Our paper archive contains historical, up to 150-year-old climate sheets that are valuable cultural assets. Unfortunately, the storage of these historical and current data treasures turned out to be more than suboptimal (insufficient protection against dust, dirt, humidity and light incidence). Because of this, a concept for a new storage system and archive database was generated and already partly realized. In a nutshell, this presentation shows on the one hand the importance of recovering historical climate sheets for climate change research - even if it is exhausting and time consuming - and gives on the other hand a general overview of the quality control procedures used at our institute.
[Process-oriented quality management in the hospital].
Wolters, H G
1998-03-01
Procedures and experiences concerning the implementation of quality management in a midsize hospital with 6 medical disciplines are described. The quality of the infrastructure was checked with lists, and the quality of medical performance was assessed by means of a standardized numerical audit with all professional groups. Weaknesses were identified by comparing the result for each quality indicator with target standards. As examples, causal relations and consequences of deficiencies in clinical care documentation, the scheme of preoperative diagnosis, the co-ordination of surgical procedures and the handling of complications are given in more detail. Obstacles were rated depending on frequency and risk potential, and sometimes cost effectiveness. Members of all professional groups and departments involved participated in trouble-solving teams, to which external expert assistance was provided. For example, interventions leading to improved co-ordination of surgical activities and their impacts are specified. Systematically improving the quality of clinical procedures is one gateway to establishing quality management in hospitals continuously and thoroughly, so that it becomes an integrated part of the corporate culture. Investment of resources is necessary but justified by midrange benefits.
Model Checking Satellite Operational Procedures
NASA Astrophysics Data System (ADS)
Cavaliere, Federico; Mari, Federico; Melatti, Igor; Minei, Giovanni; Salvo, Ivano; Tronci, Enrico; Verzino, Giovanni; Yushtein, Yuri
2011-08-01
We present a model checking approach for the automatic verification of satellite operational procedures (OPs). Building a model of a complex system such as a satellite is a hard task. We overcome this obstacle by using a suitable simulator (SIMSAT) for the satellite. Our approach aims at improving OP quality assurance by automatic exhaustive exploration of all possible simulation scenarios. Moreover, our solution decreases OP verification costs by using a model checker (CMurphi) to automatically drive the simulator. We model OPs as user-executed programs observing the simulator telemetries and sending telecommands to the simulator. In order to assess the feasibility of our approach, we present experimental results on a simple meaningful scenario. Our results show that we can save up to 90% of verification time.
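The core idea, driving a simulator exhaustively through all telecommand choices allowed by a procedure and checking telemetry invariants in every reached state, can be illustrated with a tiny explicit-state search. This is a generic sketch, not the CMurphi/SIMSAT integration; the toy procedure, telemetry model and safety invariant are hypothetical.

```python
# Minimal sketch of exhaustive exploration of an operational procedure against a
# toy simulator: breadth-first search over telecommand interleavings, with an
# invariant checked in every reachable state. Purely illustrative; not CMurphi/SIMSAT.
from collections import deque

INITIAL = {"heater": "off", "battery": 100, "mode": "safe"}

def telecommands(state):
    """Telecommands the (toy) procedure allows from a given telemetry state."""
    cmds = []
    if state["mode"] == "safe":
        cmds.append(("ENTER_NOMINAL", lambda s: {**s, "mode": "nominal"}))
    if state["heater"] == "off":
        cmds.append(("HEATER_ON", lambda s: {**s, "heater": "on", "battery": s["battery"] - 30}))
    if state["mode"] == "nominal":
        cmds.append(("PAYLOAD_ON", lambda s: {**s, "battery": s["battery"] - 50}))
    return cmds

def invariant(state):
    return state["battery"] >= 0          # toy safety property: never deplete the battery

def explore(initial, max_depth=6):
    seen, queue = set(), deque([(initial, [])])
    while queue:
        state, trace = queue.popleft()
        key = tuple(sorted(state.items()))
        if key in seen or len(trace) > max_depth:
            continue
        seen.add(key)
        if not invariant(state):
            return trace                  # counterexample: command sequence violating the invariant
        for name, apply_cmd in telecommands(state):
            queue.append((apply_cmd(state), trace + [name]))
    return None

print(explore(INITIAL))                   # prints a shortest violating telecommand sequence, if any
```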
A hydrochemical data base for the Hanford Site, Washington
DOE Office of Scientific and Technical Information (OSTI.GOV)
Early, T.O.; Mitchell, M.D.; Spice, G.D.
1986-05-01
This data package contains a revision of the Site Hydrochemical Data Base for water samples associated with the Basalt Waste Isolation Project (BWIP). In addition to the detailed chemical analyses, a summary description of the data base format, detailed descriptions of verification procedures used to check data entries, and detailed descriptions of validation procedures used to evaluate data quality are included. 32 refs., 21 figs., 3 tabs.
NASA Astrophysics Data System (ADS)
Chen, Min; Zhang, Yu
2017-04-01
A wind profiler network with a total of 65 profiling radars was operated by the MOC/CMA in China until July 2015. In this study, a quality control procedure is constructed to incorporate the profiler data from the wind-profiling network into the local data assimilation and forecasting system (BJRUC). The procedure applies a blacklisting check that removes stations with gross errors and an outlier check that rejects data with large deviations from the background. Instead of the bi-weighting method, which has been commonly implemented in outlier elimination for one-dimensional scalar observations, an outlier elimination method is developed based on the iterated reweighted minimum covariance determinant (IRMCD) for multi-variate observations such as wind profiler data. A quality control experiment is separately performed for subsets containing profiler data tagged in parallel with/without rain flags at every 00UTC/12UTC from 20 June to 30 Sep 2015. From the results, we find that with the quality control, the frequency distributions of the differences between the observations and model background become more Gaussian-like and meet the requirements of a Gaussian distribution for data assimilation. Further intensive assessment for each quality control step reveals that the stations rejected by blacklisting contain poor data quality, and the IRMCD rejects outliers in a robust and physically reasonable manner.
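An IRMCD-style rejection for multivariate innovations (observation minus background departures) can be sketched with scikit-learn's MinCovDet: fit a robust covariance on the currently accepted data, reject points whose robust Mahalanobis distance exceeds a chi-squared cutoff, and repeat until the accepted set stabilizes. The following is a minimal sketch; the cutoff, the two-component (u, v) innovation model and the synthetic data are hypothetical, not the paper's exact IRMCD formulation.

```python
# Minimal sketch of iterated MCD-based outlier rejection for multivariate
# wind-profiler innovations (u, v departures from the background).
# The chi-squared cutoff and synthetic data are hypothetical placeholders.
import numpy as np
from scipy.stats import chi2
from sklearn.covariance import MinCovDet

def irmcd_reject(innov, alpha=0.975, max_iter=10, random_state=0):
    """Return a boolean mask of accepted rows of the (n, p) innovation matrix."""
    accepted = np.ones(len(innov), dtype=bool)
    cutoff = chi2.ppf(alpha, df=innov.shape[1])
    for _ in range(max_iter):
        mcd = MinCovDet(random_state=random_state).fit(innov[accepted])
        d2 = mcd.mahalanobis(innov)               # squared robust Mahalanobis distances
        new_accepted = d2 <= cutoff
        if np.array_equal(new_accepted, accepted):
            break
        accepted = new_accepted
    return accepted

rng = np.random.default_rng(0)
innov = rng.normal(0.0, 1.5, size=(500, 2))        # well-behaved u/v departures (m/s)
innov[:20] += rng.normal(12.0, 2.0, size=(20, 2))  # gross errors to be rejected
mask = irmcd_reject(innov)
print(f"rejected {np.sum(~mask)} of {len(innov)} profiles")
```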
7 CFR 90.2 - General terms defined.
Code of Federal Regulations, 2011 CFR
2011-01-01
... agency, or other agency, organization or person that defines in the general terms the basis on which the... analytical data using proficiency check sample or analyte recovery techniques. In addition, the certainty.... Quality control. The system of close examination of the critical details of an analytical procedure in...
28 CFR 105.23 - Procedure for requesting criminal history record check.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 28 Judicial Administration 2 2013-07-01 2013-07-01 false Procedure for requesting criminal history... HISTORY BACKGROUND CHECKS Private Security Officer Employment § 105.23 Procedure for requesting criminal history record check. These procedures only apply to participating states. An authorized employer may...
28 CFR 105.23 - Procedure for requesting criminal history record check.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 28 Judicial Administration 2 2014-07-01 2014-07-01 false Procedure for requesting criminal history... HISTORY BACKGROUND CHECKS Private Security Officer Employment § 105.23 Procedure for requesting criminal history record check. These procedures only apply to participating states. An authorized employer may...
28 CFR 105.23 - Procedure for requesting criminal history record check.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 28 Judicial Administration 2 2011-07-01 2011-07-01 false Procedure for requesting criminal history... HISTORY BACKGROUND CHECKS Private Security Officer Employment § 105.23 Procedure for requesting criminal history record check. These procedures only apply to participating states. An authorized employer may...
28 CFR 105.23 - Procedure for requesting criminal history record check.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 28 Judicial Administration 2 2012-07-01 2012-07-01 false Procedure for requesting criminal history... HISTORY BACKGROUND CHECKS Private Security Officer Employment § 105.23 Procedure for requesting criminal history record check. These procedures only apply to participating states. An authorized employer may...
28 CFR 105.23 - Procedure for requesting criminal history record check.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 28 Judicial Administration 2 2010-07-01 2010-07-01 false Procedure for requesting criminal history... HISTORY BACKGROUND CHECKS Private Security Officer Employment § 105.23 Procedure for requesting criminal history record check. These procedures only apply to participating states. An authorized employer may...
Application of reiteration of Hankel singular value decomposition in quality control
NASA Astrophysics Data System (ADS)
Staniszewski, Michał; Skorupa, Agnieszka; Boguszewicz, Łukasz; Michalczuk, Agnieszka; Wereszczyński, Kamil; Wicher, Magdalena; Konopka, Marek; Sokół, Maria; Polański, Andrzej
2017-07-01
Medical centres are obliged to store past medical records, including the results of quality assurance (QA) tests of the medical equipment, which is especially useful for checking the reproducibility of medical devices and procedures. Analysis of multivariate time series is an important part of quality control of NMR data. In this work we propose an anomaly detection tool based on the Reiteration of Hankel Singular Value Decomposition method. The presented method was compared with external software, and the authors obtained comparable results.
Data services providing by the Ukrainian NODC (MHI NASU)
NASA Astrophysics Data System (ADS)
Eremeev, V.; Godin, E.; Khaliulin, A.; Ingerov, A.; Zhuk, E.
2009-04-01
At the modern stage of World Ocean study, information support of investigations based on advanced computer technologies becomes particularly important. These abstracts are devoted to the presentation of several data services developed in the Ukrainian NODC on the basis of the Marine Environmental and Information Technologies Department of MHI NASU. The Data Quality Control Service. Using experience of international collaboration in the field of data collection and quality checking, we have developed quality control (QC) software providing both preliminary (automatic) and expert (manual) data quality check procedures. The current version of the QC software works for the Mediterranean and Black seas and includes climatic arrays for hydrological and a few hydrochemical parameters based on such products as MEDAR/MEDATLAS II, Physical Oceanography of the Black Sea and the Climatic Atlas of Oxygen and Hydrogen Sulfide in the Black Sea. The data quality check procedure includes metadata control and hydrological and hydrochemical data control. Metadata control provides checking of duplicate cruises and profiles, date and chronology, ship velocity, station location, sea depth and observation depth. The data QC procedure includes climatic (or range, for parameters with a small number of observations) data QC, a density inversion check for hydrological data, and searching for spikes. The use of climatic fields and profiles prepared by regional oceanography experts leads to more reliable results of the data quality check procedure. The Data Access Services. The Ukrainian NODC provides two products for data access: on-line software and a data access module for the MHI NASU local net. This software allows selecting data by rectangular area, date, month, and cruise. The result of a query is metadata, presented in a table together with a visual presentation of stations on the map. It is possible to see both metadata and data; for this purpose it is necessary to select a station in the table of metadata or on the map. There is also an opportunity to export data in ODV format. The product is available at http://www.ocean.nodc.org.ua/DataAccess.php The local net version provides access to the oceanological database of the MHI NASU. The current version allows selecting data by spatial and temporal limits, depth, values of parameters and quality flags, and works for the Mediterranean and Black seas. It provides visualization of metadata and data, statistics of the data selection, and data export into several data formats. The Operational Data Management Services. The collaborators of the MHI Experimental Branch developed a system for obtaining information on water pressure and temperature, as well as on atmospheric pressure. Sea level observations are also conducted. The obtained data are transferred online. An interface for operational data access was developed. It allows selecting parameters (sea level, water temperature, atmospheric pressure, wind and water pressure) and a time interval to see parameter graphics. The product is available at http://www.ocean.nodc.org.ua/Katsively.php The Climatic Products. The current version of the Climatic Atlas includes maps of such parameters as temperature, salinity, density, heat storage, dynamic heights, the upper boundary of hydrogen sulfide and the lower boundary of oxygen for the Black Sea basin. Maps for temperature, salinity and density were calculated on 19 standard depths and averaged monthly for depths 0-300 m and annually for lower depth values.
The climatic maps of the upper boundary of hydrogen sulfide and the lower boundary of oxygen were averaged by decades, from the 1920s to the 1990s, and by seasons. Two versions of the climatic atlas viewer, on-line and desktop, were developed for presentation of the climatic maps. They provide similar functions for selecting and viewing maps by parameter, month and depth, and for saving maps in various formats. The on-line version of the atlas is available at http://www.ocean.nodc.org.ua/Main_Atlas.php
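A hedged sketch of three of the automatic checks named above for a single hydrological profile: climatic range, density inversion, and spike detection. The tolerance values and function names are illustrative assumptions, not those of the NODC software.

# Illustrative sketch only; thresholds are placeholders.
def climatic_range_check(values, clim_min, clim_max):
    # flag values outside the climatic envelope for this location and month
    return [clim_min <= v <= clim_max for v in values]

def density_inversion_check(density_profile, tolerance=0.03):
    # density (kg/m3) should not decrease with depth by more than the tolerance
    return [density_profile[i + 1] >= density_profile[i] - tolerance
            for i in range(len(density_profile) - 1)]

def spike_check(values, max_spike=2.0):
    # flag points that deviate sharply from the mean of their neighbours
    flags = [True] * len(values)
    for i in range(1, len(values) - 1):
        if abs(values[i] - 0.5 * (values[i - 1] + values[i + 1])) > max_spike:
            flags[i] = False
    return flags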
Reporting the accuracy of biochemical measurements for epidemiologic and nutrition studies.
McShane, L M; Clark, L C; Combs, G F; Turnbull, B W
1991-06-01
Procedures for reporting and monitoring the accuracy of biochemical measurements are presented. They are proposed as standard reporting procedures for laboratory assays for epidemiologic and clinical-nutrition studies. The recommended procedures require identification and estimation of all major sources of variability and explanations of the laboratory quality control procedures employed. Variance-components techniques are used to model the total variability and to calculate a maximum percent error that provides an easily understandable measure of laboratory precision accounting for all sources of variability. This avoids ambiguities encountered when reporting an SD that may take into account only a few of the potential sources of variability. Other proposed uses of the total-variability model include estimating the precision of laboratory methods for various replication schemes and developing effective quality-control checking schemes. These procedures are demonstrated with an example of the analysis of alpha-tocopherol in human plasma using high-performance liquid chromatography.
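A small sketch of the variance-components idea: independent components are summed to a total variance, and a percent-error figure is derived from the total SD. The paper's exact "maximum percent error" definition is not reproduced here; the two-sigma coefficient of variation below is an illustrative stand-in, and all numbers are invented.

# Hedged sketch, not the published formula.
import math

def total_sd(variance_components):
    # total variability from independent components (between-batch, within-batch, ...)
    return math.sqrt(sum(variance_components))

def percent_error(mean_value, variance_components, k=2.0):
    # a k-sigma relative error as an easily understood precision figure
    return 100.0 * k * total_sd(variance_components) / mean_value

components = [0.10, 0.05, 0.02]   # e.g. plasma alpha-tocopherol variance components
print(percent_error(mean_value=25.0, variance_components=components))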
Are patients in Western Turkey contented with healthcare services?: a quality assessment study.
Kuguoglu, Sema; Aslan, Fatma Eti; Icli, Gülnur
2006-01-01
The purpose of this study was to describe service quality as perceived by 1200 patients who had inpatient treatment at 3 hospitals in Istanbul: university, social security administration, and government. Patients were most satisfied with the helpful attitudes of personnel during check-in procedures, promptness and skill of nurses, overall service provided by physicians, speedy and skilled work of personnel in laboratories and X-ray rooms, and hospitals in general.
78 FR 29669 - Airworthiness Directives; DASSAULT AVIATION Airplanes
Federal Register 2010, 2011, 2012, 2013, 2014
2013-05-21
... Maintenance Procedure 26-20-2, ``Removal of Pyrotechnical Cartridge for Check/Replacement,'' dated October... Maintenance Procedure 26-20-2, ``Removal of Pyrotechnical Cartridge for Check/ Replacement,'' dated October... Maintenance Procedure 26-20-2, ``Removal of Pyrotechnical Cartridge for Check/Replacement,'' dated October...
Capturing, using, and managing quality assurance knowledge for shuttle post-MECO flight design
NASA Technical Reports Server (NTRS)
Peters, H. L.; Fussell, L. R.; Goodwin, M. A.; Schultz, Roger D.
1991-01-01
Ascent initialization values used by the Shuttle's onboard computer for nominal and abort mission scenarios are verified by a six-degrees-of-freedom computer simulation. The procedure that the Ascent Post Main Engine Cutoff (Post-MECO) group uses to perform quality assurance (QA) of the simulation is time consuming. Also, the QA data, checklists, and associated rationale, though known by the group members, are not sufficiently documented, hindering transfer of knowledge and problem resolution. A new QA procedure that retains the current high level of integrity while reducing the time required to perform QA is needed to support the increasing Shuttle flight rate. Documenting the knowledge is also needed to increase its availability for training and problem resolution. To meet these needs, a knowledge capture process, embedded in the group's activities, was initiated to verify the existing QA checks, define new ones, and document all rationale. The resulting checks were automated in a conventional software program to achieve the desired standardization, integrity, and time reduction. A prototype electronic knowledge base was developed with HyperCard on the Macintosh to serve as a knowledge capture tool and data storage.
Lake water quality mapping from LANDSAT
NASA Technical Reports Server (NTRS)
Scherz, J. P.
1977-01-01
The lakes in three LANDSAT scenes were mapped by the Bendix MDAS multispectral analysis system. Field checking of the maps by three separate individuals revealed approximately 90-95% correct classification for the lake categories selected. Variation between observers was about 5%. From the MDAS color-coded maps, the lake with the worst algae problem was easily located. This lake was closely checked, and a pollution source of 100 cows was found at the springs which fed the lake. The theory, lab work, and field work which made it possible for this demonstration project to become a practical lake classification procedure are presented.
NASA Astrophysics Data System (ADS)
Espinar, B.; Blanc, P.; Wald, L.; Hoyer-Klick, C.; Schroedter-Homscheidt, M.; Wanderer, T.
2012-04-01
Meteorological data measured by ground stations are often a key element in the development and validation of methods exploiting satellite images. These data are considered a reference against which satellite-derived estimates are compared. Long-term radiation and meteorological measurements are available from a large number of measuring stations. However, close examination of the data often reveals a lack of quality, frequently for extended periods of time. This lack of quality has, in many cases, been the reason for rejecting large amounts of available data. Data quality must be checked before use in order to guarantee the inputs for methods used in modelling, monitoring, forecasting, etc. To control their quality, data should be subjected to several conditions or tests. After this checking, data that are not flagged by any of the tests are released as plausible data. In this work, a bibliographical review of quality control tests has been performed for the common meteorological variables (ambient temperature, relative humidity and wind speed) and for the usual solar radiometric variables (horizontal global and diffuse components of solar radiation and the beam normal component). The different tests have been grouped according to the variable and the averaging period (sub-hourly, hourly, daily and monthly averages). The quality tests may be classified as follows: • Range checks: tests that verify that values are within a specific range. There are two types of range checks, those based on extrema and those based on rare observations. • Step checks: tests aimed at detecting unrealistic jumps or stagnation in the time series. • Consistency checks: tests that verify the relationship between two or more time series. The gathered quality tests are applicable to all latitudes, as they have not been optimized regionally or seasonally, with the aim of being generic. They have been applied to ground measurements at several geographic locations, which resulted in the detection of some control tests that are no longer adequate, for various reasons. After the modification of some tests, based on our experience, a set of quality control tests is now presented, updated according to technological advances and classified. The presented set of quality tests allows radiation and meteorological data to be screened in order to assess their plausibility for use as inputs in theoretical or empirical methods for scientific research. The research leading to these results has partly received funding from the European Union's Seventh Framework Programme (FP7/2007-2013) under Grant Agreement no. 262892 (ENDORSE project).
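An illustrative sketch of the three families of checks listed above, applied to an hourly global horizontal irradiance (GHI) series. The limits used here are placeholders, not the values recommended in the cited work.

# Hedged sketch only; thresholds and variable names are assumptions.
def range_check(ghi, extraterrestrial):
    # reject physically impossible or extremely rare values
    return [0.0 <= g <= 1.2 * e for g, e in zip(ghi, extraterrestrial)]

def step_check(ghi, max_step=800.0):
    # reject unrealistic jumps between consecutive hourly values (W/m2)
    return [True] + [abs(b - a) <= max_step for a, b in zip(ghi, ghi[1:])]

def consistency_check(ghi, diffuse, beam_horizontal, tol=0.08):
    # closure test: global should roughly equal diffuse plus horizontal beam
    return [abs(g - (d + b)) <= tol * max(g, 1.0)
            for g, d, b in zip(ghi, diffuse, beam_horizontal)]

A measurement is released as plausible only if it passes every enabled test, mirroring the flagging logic described in the abstract.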
[A re-evaluation of the program for diabetes mellitus type 2. A proposal for quality indices].
Espinàs, J; Salla, R M; Bellvehí, M; Reig, E; Iruela, T; Muñoz, E; Isern, R; Molas, M
1993-02-28
To find out how accurate our records are and the state of health of the patients with diabetes mellitus type II (DM) in our Base Health Area (BHA) in Osona county (Barcelona), both before and after introducing a new procedure. Quality control study based on the medical records (PCMR) of DM patients. The evaluation took place between 1.1.90 and 31.12.90; and the re-evaluation between 1.1.91 and 31.12.91, after the DM procedure had been put in place as a corrective measure. 198 patients: all of those suffering from DM type II. 110 women and 88 men, with an average age of 65.4 +/- 11.9, were under study. We observed from the records of attendance that 94.4% were or had been smokers, whereas the question of the eye fundus was only mentioned in 36.8%. The introduction of a procedure has improved the records in almost every parameter. In 1991, 36.8% of the patients had normal-weight criteria, 33.3% had good biochemical control and 15.6% fulfilled both these criteria. Those tests which could be performed with few instruments were carried out much better than those which needed more complex technology or specialist support. Arising from this study, the authors propose four indicators of quality control: 1) Weight normality. 2) Annual plasmatic fructosamine. 3) Annual eye fundus check. 4) Annual proteinuria check.
Code of Federal Regulations, 2014 CFR
2014-10-01
... National Service Criminal History Check for a covered position? 2540.205 Section 2540.205 Public Welfare... What procedures must I follow in conducting a National Service Criminal History Check for a covered... criminal history check, and for the appropriate sharing of the results of the checks within the program...
The Implications of Bank-Issued Check Surveys for Evaluators: A Case Study
ERIC Educational Resources Information Center
Blair, Jason; Taylor, Ted K.; Johnson-Shelton, Deborah
2007-01-01
This article describes an innovative data collection procedure. A subsample (n = 164) of a longitudinal research project was assessed using a bank-issued check survey procedure (a removable bank check on which response fields were printed). Using the new procedure, parents returned their surveys simply by depositing or cashing their incentive…
40 CFR Appendix F to Part 60 - Quality Assurance Procedures
Code of Federal Regulations, 2013 CFR
2013-07-01
... plus the 2.5 percent error confidence coefficient of a series of tests divided by the mean of the RM...-level) CD or the daily high-level CD exceeds two times the limits of the applicable PS's in appendix B... result exceeds four times the applicable drift specification in appendix B during any CD check, the CEMS...
40 CFR Appendix F to Part 60 - Quality Assurance Procedures
Code of Federal Regulations, 2010 CFR
2010-07-01
... plus the 2.5 percent error confidence coefficient of a series of tests divided by the mean of the RM...-level) CD or the daily high-level CD exceeds two times the limits of the applicable PS's in appendix B... result exceeds four times the applicable drift specification in appendix B during any CD check, the CEMS...
40 CFR Appendix F to Part 60 - Quality Assurance Procedures
Code of Federal Regulations, 2011 CFR
2011-07-01
... plus the 2.5 percent error confidence coefficient of a series of tests divided by the mean of the RM...-level) CD or the daily high-level CD exceeds two times the limits of the applicable PS's in appendix B... result exceeds four times the applicable drift specification in appendix B during any CD check, the CEMS...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tan, J; Yan, Y; Hager, F
Purpose: Radiation therapy has evolved to become not only more precise and potent, but also more complicated to monitor and deliver. More rigorous and comprehensive quality assurance is needed to safeguard ever-advancing radiation therapy. ICRU standards dictate that an ever-growing set of treatment parameters be manually checked weekly by medical physicists. This "weekly chart check" procedure is laborious and subject to human error and other factors. A computer-assisted chart checking process will enable more complete and accurate human review of critical parameters, reduce the risk of medical errors, and improve efficiency. Methods: We developed a web-based software system that enables thorough weekly quality assurance checks. In the backend, the software retrieves all machine parameters from the Treatment Management System (TMS) and compares them against the corresponding parameters from the treatment planning system. They are also checked for validity against preset rules. The results are displayed as a web page in the front end for physicists to review. A summary report is then generated and uploaded automatically to the TMS as a record of the weekly chart check. Results: The software system has been deployed on a web server in our department's intranet and has been tested thoroughly by our clinical physicists. A plan parameter is highlighted when it is outside the preset limit. The developed system has changed the way charts are checked, with significantly improved accuracy, efficiency, and completeness. It has been shown to be robust, fast, and easy to use. Conclusion: A computer-assisted system has been developed for efficient, accurate, and comprehensive weekly chart checking. The system has been extensively validated and is being implemented for routine clinical use.
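A hedged sketch of the comparison step described above: delivered machine parameters retrieved from the TMS are compared to planned values within preset tolerances. The field names and tolerance values are hypothetical, not those of the deployed system.

# Illustrative sketch only.
TOLERANCES = {"mu": 0.5, "gantry_angle": 0.2, "field_size_x": 0.1}

def check_parameter(name, delivered, planned):
    tol = TOLERANCES.get(name, 0.0)
    status = "PASS" if abs(delivered - planned) <= tol else "REVIEW"
    return {"parameter": name, "delivered": delivered,
            "planned": planned, "status": status}

def weekly_chart_check(delivered_record, plan_record):
    # compare every planned parameter that was actually recorded as delivered
    return [check_parameter(k, delivered_record[k], plan_record[k])
            for k in plan_record if k in delivered_record]

In the described workflow, rows with a "REVIEW" status would be highlighted on the web page for the physicist, and the full result set would be summarized into the report uploaded to the TMS.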
NASA Technical Reports Server (NTRS)
Fisher, Forest; Gladden, Roy; Khanampornpan, Teerapat
2008-01-01
The MRO Sequence Checking Tool program, mro_check, automates significant portions of the MRO (Mars Reconnaissance Orbiter) sequence checking procedure. Though MRO has checks similar to those of the ODY (Mars Odyssey) Mega Check tool, the checks needed for MRO are unique to the MRO spacecraft. The MRO sequence checking tool automates the majority of the sequence validation procedure and the checklists that are used to validate the sequences generated by the MRO MPST (mission planning and sequencing team). The tool performs more than 50 different checks on a sequence. The automation ranges from summarizing data about the sequence needed for visual verification, to performing automated checks on the sequence and providing a report for each step. To allow for the addition of new checks as needed, the tool is built in a modular fashion.
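A minimal sketch of the modular structure described: each check is an independent function registered in a list, so new checks can be added without touching the reporting logic. The check names and sequence representation below are invented, not those of mro_check.

# Illustrative sketch only.
CHECKS = []

def register(check):
    CHECKS.append(check)
    return check

@register
def check_command_count(sequence):
    return ("command count", len(sequence) > 0)

@register
def check_no_duplicate_ids(sequence):
    ids = [cmd["id"] for cmd in sequence]
    return ("duplicate command ids", len(ids) == len(set(ids)))

def run_all(sequence):
    # one report entry per registered check
    return [{"check": name, "passed": ok}
            for name, ok in (c(sequence) for c in CHECKS)]

print(run_all([{"id": 1}, {"id": 2}]))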
Quality control of FWC during assembly and commissioning in SST-1 Tokamak
NASA Astrophysics Data System (ADS)
Patel, Hitesh; Santra, Prosenjit; Parekh, Tejas; Biswas, Prabal; Jayswal, Snehal; Chauhan, Pradeep; Paravastu, Yuvakiran; George, Siju; Semwal, Pratibha; Thankey, Prashant; Ramesh, Gattu; Prakash, Arun; Dhanani, Kalpesh; Raval, D. C.; Khan, Ziauddin; Pradhan, Subrata
2017-04-01
First Wall Components (FWC) of the SST-1 tokamak, which are in the immediate vicinity of the plasma, comprise limiters, divertors, baffles, and passive stabilizers designed to operate long-duration (∼1000 s) discharges of elongated plasma. All FWC consist of copper alloy heat sink modules with SS cooling tubes brazed onto them, with graphite tiles acting as armour material facing the plasma, and are mounted to the vacuum vessel with suitable Inconel support structures at inter-connected ring and port locations. The FWC were recently assembled and commissioned successfully inside the vacuum vessel of SST-1, undergoing rigorous quality control and checks at every stage of the assembly process. This paper presents the quality control aspects and checks of the FWC from the commencement of the assembly procedure, namely material test reports, leak testing of high-temperature baked components, assembled dimensional tolerances, leak testing of all welded joints, graphite tile tightening torques, electrical continuity and electrical isolation of passive stabilizers from the vacuum vessel, and baking and cooling hydraulic connections inside the vacuum vessel.
Code of Federal Regulations, 2011 CFR
2011-10-01
... alternative procedures in conducting a State criminal registry check? (a) FBI fingerprint-based check. If you or your designee conduct and document a fingerprint-based criminal history check through the Federal...
Code of Federal Regulations, 2010 CFR
2010-10-01
... alternative procedures in conducting a State criminal registry check? (a) FBI fingerprint-based check. If you or your designee conduct and document a fingerprint-based criminal history check through the Federal...
Code of Federal Regulations, 2010 CFR
2010-10-01
... alternative procedures in conducting a State criminal registry check? (a) FBI fingerprint-based check. If you or your designee conduct and document a fingerprint-based criminal history check through the Federal...
Code of Federal Regulations, 2012 CFR
2012-10-01
... alternative procedures in conducting a State criminal registry check? (a) FBI fingerprint-based check. If you or your designee conduct and document a fingerprint-based criminal history check through the Federal...
Code of Federal Regulations, 2011 CFR
2011-10-01
... alternative procedures in conducting a State criminal registry check? (a) FBI fingerprint-based check. If you or your designee conduct and document a fingerprint-based criminal history check through the Federal...
Code of Federal Regulations, 2012 CFR
2012-10-01
... alternative procedures in conducting a State criminal registry check? (a) FBI fingerprint-based check. If you or your designee conduct and document a fingerprint-based criminal history check through the Federal...
Code of Federal Regulations, 2012 CFR
2012-07-01
... requirements? (a) If you elect to install a CEMS as specified in Table 5 of this subpart, you must install... periodic data quality checks in accordance with 40 CFR part 60, appendix F, procedure 1. (3) As specified... you are required to install a continuous parameter monitoring system (CPMS) as specified in Table 5 of...
NASA Astrophysics Data System (ADS)
Servilla, M. S.; O'Brien, M.; Costa, D.
2013-12-01
Considerable ecological research performed today occurs through the analysis of data downloaded from various repositories and archives, often resulting in derived or synthetic products generated by automated workflows. These data are only meaningful for research if they are well documented by metadata, lest semantic or data-type errors occur in interpretation or processing. The Long Term Ecological Research (LTER) Network now screens all data packages entering its long-term archive to ensure that each package contains metadata that is complete, of high quality, and accurately describes the structure of its associated data entity, and that the data are structurally congruent with the metadata. Screening occurs prior to the upload of a data package into the Provenance Aware Synthesis Tracking Architecture (PASTA) data management system through a series of quality checks, thus preventing ambiguously or incorrectly documented data packages from entering the system. The quality checks within PASTA are designed to work specifically with the Ecological Metadata Language (EML), the metadata standard adopted by the LTER Network to describe data generated by its 26 research sites. Each quality check is codified in Java as part of the ecological community-supported Data Manager Library, which is a resource of the EML specification and is used as a component of the PASTA software stack. Quality checks test for metadata quality, data integrity, or metadata-data congruence. Quality checks are further classified as either conditional or informational. Conditional checks issue a 'valid', 'warning', or 'error' response; only an 'error' response blocks the data package from upload into PASTA. Informational checks only provide descriptive content pertaining to a particular facet of the data package. Quality checks are designed by a group of LTER information managers and reviewed by the LTER community before being deployed into PASTA. A total of 32 quality checks have been deployed to date. Quality checks can be customized through a configurable template, which includes turning checks 'on' or 'off' and setting the severity of conditional checks. This feature is important to other potential users of the Data Manager Library who wish to configure its quality checks in accordance with the standards of their community. Executing the complete set of quality checks produces a report that describes the result of each check. The report is an XML document that is stored by PASTA for future reference.
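An illustrative sketch (not the Data Manager Library itself, which is Java) of how a configurable check of the kind described might be represented: each check has a type, an on/off switch, and a severity, and contributes a valid/warning/error entry to a report. All identifiers are assumptions.

# Hedged sketch only.
from dataclasses import dataclass
from typing import Callable

@dataclass
class QualityCheck:
    name: str
    kind: str                      # "conditional" or "informational"
    enabled: bool
    severity: str                  # "warning" or "error" for conditional checks
    run: Callable[[dict], bool]    # True = check passed

def run_checks(package, checks):
    report = []
    for check in checks:
        if not check.enabled:
            continue
        passed = check.run(package)
        if check.kind == "informational":
            status = "info"
        else:
            status = "valid" if passed else check.severity
        report.append({"check": check.name, "status": status})
    return report

# Example: an 'error'-severity congruence check would block upload when it fails
checks = [QualityCheck("declared header lines match data", "conditional", True, "error",
                       lambda pkg: pkg["header_lines_declared"] == pkg["header_lines_found"])]
print(run_checks({"header_lines_declared": 1, "header_lines_found": 2}, checks))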
López-Tarjuelo, Juan; Bouché-Babiloni, Ana; Santos-Serra, Agustín; Morillo-Macías, Virginia; Calvo, Felipe A; Kubyshin, Yuri; Ferrer-Albiach, Carlos
2014-11-01
Industrial companies use failure mode and effect analysis (FMEA) to improve quality. Our objective was to describe an FMEA and subsequent interventions for an automated intraoperative electron radiotherapy (IOERT) procedure with computed tomography simulation, pre-planning, and a fixed conventional linear accelerator. A process map, an FMEA, and a fault tree analysis are reported. The equipment considered was the radiance treatment planning system (TPS), the Elekta Precise linac, and TN-502RDM-H metal-oxide-semiconductor field-effect transistor in vivo dosimeters. Computerized order entry and treatment automation were also analyzed. Fifty-seven potential modes and effects were identified and classified into 'treatment cancellation' and 'delivering an unintended dose'. They were graded from 'inconvenience' or 'suboptimal treatment' to 'total cancellation' or a 'potentially wrong' or 'very wrong administered dose', although these latter effects were never experienced. Risk priority numbers (RPNs) ranged from 3 to 324 and totaled 4804. After interventions such as double checking, interlocking, automation, and structural changes, the final total RPN was reduced to 1320. FMEA is crucial for prioritizing risk-reduction interventions. In a semi-surgical procedure like IOERT, double checking has the potential to reduce risk and improve quality. Interlocks and automation should also be implemented to increase the safety of the procedure. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
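A short sketch of the FMEA bookkeeping underlying the numbers above: each failure mode is scored for severity, occurrence, and detection, the risk priority number (RPN) is their product, and interventions are prioritized by RPN. The failure modes and scores below are invented examples, not entries from the published analysis.

# Hedged sketch of standard FMEA scoring.
def rpn(severity, occurrence, detection):
    return severity * occurrence * detection

failure_modes = [
    {"mode": "unintended dose from wrong plan parameter", "s": 9, "o": 2, "d": 4},
    {"mode": "treatment cancellation from equipment fault", "s": 5, "o": 3, "d": 2},
]
for fm in sorted(failure_modes, key=lambda f: rpn(f["s"], f["o"], f["d"]), reverse=True):
    print(fm["mode"], rpn(fm["s"], fm["o"], fm["d"]))

Interventions such as double checks and interlocks lower the occurrence or detection scores, which is how the total RPN in the study fell from 4804 to 1320.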
Systematic monitoring and evaluation of M7 scanner performance and data quality
NASA Technical Reports Server (NTRS)
Stewart, S.; Christenson, D.; Larsen, L.
1974-01-01
An investigation was conducted to provide the information required to maintain data quality of the Michigan M7 multispectral scanner by systematic checks on specific system performance characteristics. Data processing techniques that use calibration data gathered routinely on every mission have been developed to assess current data quality. Significant changes from past data quality are thus identified and attempts made to discover their causes. Procedures for systematic monitoring of scanner data quality are discussed. In the solar reflective region, calculations of noise-equivalent change in radiance on a per-mission basis are compared to theoretical tape-recorder limits to provide an estimate of overall scanner performance. M7 signal/noise characteristics are examined.
14 CFR 91.1069 - Flight crew: Instrument proficiency check requirements.
Code of Federal Regulations, 2010 CFR
2010-01-01
... procedures. The instrument approach procedure or procedures must include at least one straight-in approach... conducted to published minimums for that procedure. (d) The instrument proficiency checks required by... emergencies, and standard instrument approaches involving navigational facilities which that pilot is to be...
14 CFR 91.1069 - Flight crew: Instrument proficiency check requirements.
Code of Federal Regulations, 2011 CFR
2011-01-01
... procedures. The instrument approach procedure or procedures must include at least one straight-in approach... conducted to published minimums for that procedure. (d) The instrument proficiency checks required by... emergencies, and standard instrument approaches involving navigational facilities which that pilot is to be...
5 CFR 178.102 - Procedures for submitting claims.
Code of Federal Regulations, 2010 CFR
2010-01-01
... Proceeds of Canceled Checks for Veterans' Benefits Payable to Deceased Beneficiaries § 178.102 Procedures...; and (6) Any other information that the agency believes OPM should consider. (d) Canceled checks for veterans' benefits. Claims for the proceeds of canceled checks for veterans' benefits payable to deceased...
49 CFR 1572.21 - Procedures for TWIC security threat assessment.
Code of Federal Regulations, 2012 CFR
2012-10-01
... conducts includes a fingerprint-based criminal history records check (CHRC), an intelligence-related check, and a final disposition. (b) Fingerprint-based check. The following procedures must be completed to conduct a fingerprint-based CHRC: (1) Consistent with the implementation schedule described in 49 CFR 1572...
49 CFR 1572.21 - Procedures for TWIC security threat assessment.
Code of Federal Regulations, 2013 CFR
2013-10-01
... conducts includes a fingerprint-based criminal history records check (CHRC), an intelligence-related check, and a final disposition. (b) Fingerprint-based check. The following procedures must be completed to conduct a fingerprint-based CHRC: (1) Consistent with the implementation schedule described in 49 CFR 1572...
49 CFR 1572.21 - Procedures for TWIC security threat assessment.
Code of Federal Regulations, 2010 CFR
2010-10-01
... conducts includes a fingerprint-based criminal history records check (CHRC), an intelligence-related check, and a final disposition. (b) Fingerprint-based check. The following procedures must be completed to conduct a fingerprint-based CHRC: (1) Consistent with the implementation schedule described in 49 CFR 1572...
49 CFR 1572.21 - Procedures for TWIC security threat assessment.
Code of Federal Regulations, 2011 CFR
2011-10-01
... conducts includes a fingerprint-based criminal history records check (CHRC), an intelligence-related check, and a final disposition. (b) Fingerprint-based check. The following procedures must be completed to conduct a fingerprint-based CHRC: (1) Consistent with the implementation schedule described in 49 CFR 1572...
49 CFR 1572.21 - Procedures for TWIC security threat assessment.
Code of Federal Regulations, 2014 CFR
2014-10-01
... conducts includes a fingerprint-based criminal history records check (CHRC), an intelligence-related check, and a final disposition. (b) Fingerprint-based check. The following procedures must be completed to conduct a fingerprint-based CHRC: (1) Consistent with the implementation schedule described in 49 CFR 1572...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sayler, E; Harrison, A; Eldredge-Hindy, H
Purpose: Valencia and Leipzig applicators (VLAs) are single-channel brachytherapy surface applicators used to treat skin lesions up to 2 cm in diameter. Source dwell times can be calculated and entered manually after clinical set-up or ultrasound. This procedure differs dramatically from CT-based planning; the novelty and unfamiliarity could lead to severe errors. To build layers of safety and ensure quality, a multidisciplinary team created a protocol and applied Failure Modes and Effects Analysis (FMEA) to the clinical procedure for HDR VLA skin treatments. Methods: A team including physicists, physicians, nurses, therapists, residents, and administration developed a clinical procedure for VLA treatment. The procedure was evaluated using FMEA. Failure modes were identified and scored by severity, occurrence, and detection. The clinical procedure was revised to address high-scoring process nodes. Results: Several key components were added to the clinical procedure to minimize risk priority numbers (RPNs): -Treatments are reviewed at weekly QA rounds, where physicians discuss diagnosis, prescription, applicator selection, and set-up. Peer review reduces the likelihood of an inappropriate treatment regime. -A template for HDR skin treatments was established in the clinical EMR system to standardize treatment instructions. This reduces the chance of miscommunication between the physician and the planning physicist and increases the detectability of an error during the physics second check. -A screen check was implemented during the second check to increase the detectability of an error. -To reduce error probability, the treatment plan worksheet was designed to display plan parameters in a format visually similar to the treatment console display. This facilitates data entry and verification. -VLAs are color-coded and labeled to match the EMR prescriptions, which simplifies in-room selection and verification. Conclusion: Multidisciplinary planning and FMEA increased detectability and reduced error probability during VLA HDR brachytherapy. This clinical model may be useful to institutions implementing similar procedures.
Relationship Between Operating Room Teamwork, Contextual Factors, and Safety Checklist Performance.
Singer, Sara J; Molina, George; Li, Zhonghe; Jiang, Wei; Nurudeen, Suliat; Kite, Julia G; Edmondson, Lizabeth; Foster, Richard; Haynes, Alex B; Berry, William R
2016-10-01
Studies show that using surgical safety checklists (SSCs) reduces complications. Many believe SSCs accomplish this by enhancing teamwork, but evidence is limited. Our study sought to relate teamwork to checklist performance, understand how they relate, and determine conditions that affect this relationship. Using 2 validated tools for observing and coaching operating room teams, we evaluated the association between checklist performance with surgeon buy-in and 4 domains of surgical teamwork: clinical leadership, communication, coordination, and respect. Hospital staff in 10 South Carolina hospitals observed 207 procedures between April 2011 and January 2013. We calculated levels of checklist performance, buy-in, and measures of teamwork, and evaluated their relationship, controlling for patient and case characteristics. Few teams completed most or all SSC items. Teams more often completed items considered procedural "checks" than conversation "prompts." Surgeon buy-in, clinical leadership, communication, a summary measure of teamwork overall, and observers' teamwork ratings positively related to overall checklist completion (multivariable model estimates from 0.04, p < 0.05 for communication to 0.17, p < 0.01 for surgeon buy-in). All measures of teamwork and surgeon buy-in related positively to completing more conversation prompts; none related significantly to procedural checks (estimates from 0.10, p < 0.01 for communication to 0.27, p < 0.001 for surgeon buy-in). Patient age was significantly associated with completing the checklist and prompts (p < 0.05); only case duration was positively associated with performing more checks (p < 0.10). Surgeon buy-in and surgical teamwork characterized by shared clinical leadership, open communication, active coordination, and mutual respect were critical in prompting case-related conversations, but not in completing procedural checks. Findings highlight the importance of surgeon engagement and high-quality, consistent teamwork for promoting checklist use and ensuring a safe surgical environment. Copyright © 2016 American College of Surgeons. Published by Elsevier Inc. All rights reserved.
Ultrasound use during cardiopulmonary resuscitation is associated with delays in chest compressions.
Huis In 't Veld, Maite A; Allison, Michael G; Bostick, David S; Fisher, Kiondra R; Goloubeva, Olga G; Witting, Michael D; Winters, Michael E
2017-10-01
High-quality chest compressions are a critical component of the resuscitation of patients in cardiopulmonary arrest. Point-of-care ultrasound (POCUS) is used frequently during emergency department (ED) resuscitations, but there has been limited research assessing its benefits and harms during the delivery of cardiopulmonary resuscitation (CPR). We hypothesized that use of POCUS during cardiac arrest resuscitation adversely affects high-quality CPR by lengthening the duration of pulse checks beyond the current cardiopulmonary resuscitation guidelines recommendation of 10s. We conducted a prospective cohort study of adults in cardiac arrest treated in an urban ED between August 2015 and September 2016. Resuscitations were recorded using video equipment in designated resuscitation rooms, and the use of POCUS was documented and timed. A linear mixed-effects model was used to estimate the effect of POCUS on pulse check duration. Twenty-three patients were enrolled in our study. The mean duration of pulse checks with POCUS was 21.0s (95% CI, 18-24) compared with 13.0s (95% CI, 12-15) for those without POCUS. POCUS increased the duration of pulse checks and CPR interruption by 8.4s (95% CI, 6.7-10.0 [p<0.0001]). Age, body mass index (BMI), and procedures did not significantly affect the duration of pulse checks. The use of POCUS during cardiac arrest resuscitation was associated with significantly increased duration of pulse checks, nearly doubling the 10-s maximum duration recommended in current guidelines. It is important for acute care providers to pay close attention to the duration of interruptions in the delivery of chest compressions when using POCUS during cardiac arrest resuscitation. Copyright © 2017 Elsevier B.V. All rights reserved.
Code of Federal Regulations, 2012 CFR
2012-10-01
... National Service Criminal History Check? 2551.29 Section 2551.29 Public Welfare Regulations Relating to... Service Criminal History Check? You are responsible for ensuring that the following procedures are... program is contingent upon the organization's review of the individual's criminal history, if any; (d...
Code of Federal Regulations, 2011 CFR
2011-10-01
... National Service Criminal History Check? 2551.29 Section 2551.29 Public Welfare Regulations Relating to... Service Criminal History Check? You are responsible for ensuring that the following procedures are... program is contingent upon the organization's review of the individual's criminal history, if any; (d...
Code of Federal Regulations, 2012 CFR
2012-10-01
... National Service Criminal History Check? 2552.29 Section 2552.29 Public Welfare Regulations Relating to... Service Criminal History Check? You are responsible for ensuring that the following procedures are... program is contingent upon the organization's review of the individual's criminal history, if any; (d...
Code of Federal Regulations, 2011 CFR
2011-10-01
... National Service Criminal History Check? 2552.29 Section 2552.29 Public Welfare Regulations Relating to... Service Criminal History Check? You are responsible for ensuring that the following procedures are... program is contingent upon the organization's review of the individual's criminal history, if any; (d...
Code of Federal Regulations, 2010 CFR
2010-10-01
... National Service Criminal History Check? 2552.29 Section 2552.29 Public Welfare Regulations Relating to... Service Criminal History Check? You are responsible for ensuring that the following procedures are... program is contingent upon the organization's review of the individual's criminal history, if any; (d...
Code of Federal Regulations, 2010 CFR
2010-10-01
... National Service Criminal History Check? 2551.29 Section 2551.29 Public Welfare Regulations Relating to... Service Criminal History Check? You are responsible for ensuring that the following procedures are... program is contingent upon the organization's review of the individual's criminal history, if any; (d...
Mixed Element Type Unstructured Grid Generation for Viscous Flow Applications
NASA Technical Reports Server (NTRS)
Marcum, David L.; Gaither, J. Adam
2000-01-01
A procedure is presented for efficient generation of high-quality unstructured grids suitable for CFD simulation of high Reynolds number viscous flow fields. Layers of anisotropic elements are generated by advancing along prescribed normals from solid boundaries. The points are generated such that either pentahedral or tetrahedral elements with an implied connectivity can be directly recovered. As points are generated they are temporarily attached to a volume triangulation of the boundary points. This triangulation allows efficient local search algorithms to be used when checking merging layers. The existing advancing-front/local-reconnection procedure is used to generate isotropic elements outside of the anisotropic region. Results are presented for a variety of applications. The results demonstrate that high-quality anisotropic unstructured grids can be efficiently and consistently generated for complex configurations.
Membrane oxygenator heat exchanger failure detected by unique blood gas findings.
Hawkins, Justin L
2014-03-01
Failure of components integrated into the cardiopulmonary bypass circuit, although rare, can bring about catastrophic results. One of these components is the heat exchanger of the membrane oxygenator. In this compartment, unsterile water from the heater-cooler device is separated from the sterile blood by stainless steel, aluminum, or polyurethane. These areas are glued or welded to keep the two compartments separate, maintaining sterility of the blood. Although quality control testing is performed by the manufacturer at the factory level, transport presents a real possibility of damage. Because of this, each manufacturer has included in the instructions for use a procedure for testing the integrity of the heat exchanger component. Water is circulated through the heat exchanger before priming and a visual check of the oxygenator bundle is made for leaks. If none are apparent, priming of the oxygenator is performed. In this particular case, this procedure was not useful in detecting communication between the water and blood chambers of the oxygenator.
40 CFR 89.408 - Post-test procedures.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Post-test procedures. 89.408 Section... Procedures § 89.408 Post-test procedures. (a) A hangup check is recommended at the completion of the last...) Record the post-test data specified in § 89.405(f). (e) For a valid test, the zero and span checks...
40 CFR 89.408 - Post-test procedures.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 40 Protection of Environment 21 2013-07-01 2013-07-01 false Post-test procedures. 89.408 Section... Procedures § 89.408 Post-test procedures. (a) A hangup check is recommended at the completion of the last...) Record the post-test data specified in § 89.405(f). (e) For a valid test, the zero and span checks...
40 CFR 89.408 - Post-test procedures.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 20 2011-07-01 2011-07-01 false Post-test procedures. 89.408 Section... Procedures § 89.408 Post-test procedures. (a) A hangup check is recommended at the completion of the last...) Record the post-test data specified in § 89.405(f). (e) For a valid test, the zero and span checks...
40 CFR 89.408 - Post-test procedures.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 40 Protection of Environment 21 2012-07-01 2012-07-01 false Post-test procedures. 89.408 Section... Procedures § 89.408 Post-test procedures. (a) A hangup check is recommended at the completion of the last...) Record the post-test data specified in § 89.405(f). (e) For a valid test, the zero and span checks...
40 CFR 89.408 - Post-test procedures.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 40 Protection of Environment 20 2014-07-01 2013-07-01 true Post-test procedures. 89.408 Section 89... Procedures § 89.408 Post-test procedures. (a) A hangup check is recommended at the completion of the last...) Record the post-test data specified in § 89.405(f). (e) For a valid test, the zero and span checks...
This SOP describes the methods and procedures for two types of QA procedures: spot checks of hand entered data, and QA procedures for co-located and split samples. The spot checks were used to determine whether the error rate goal for the input of hand entered data was being att...
Design of experiments enhanced statistical process control for wind tunnel check standard testing
NASA Astrophysics Data System (ADS)
Phillips, Ben D.
The current wind tunnel check standard testing program at NASA Langley Research Center is focused on increasing data quality, uncertainty quantification, and overall control and improvement of wind tunnel measurement processes. The statistical process control (SPC) methodology employed in the check standard testing program allows for the tracking of variations in measurements over time as well as an overall assessment of facility health. While the SPC approach can and does provide researchers with valuable information, it has certain limitations in the areas of process improvement and uncertainty quantification. It is thought that by utilizing design of experiments methodology in conjunction with current SPC practices, one can efficiently and more robustly characterize uncertainties and develop enhanced process improvement procedures. In this research, methodologies were developed to generate regression models for wind tunnel calibration coefficients, balance force coefficients, and wind tunnel flow angularities. The coefficients of these regression models were then tracked in statistical process control charts, giving a higher level of understanding of the processes. The methodology outlined is sufficiently generic that this research is applicable to any wind tunnel check standard testing program.
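A minimal sketch of the SPC tracking step described above: a fitted regression coefficient from each check-standard test is plotted on an individuals control chart whose limits come from the moving range. The chart formula is the standard Shewhart individuals chart; the coefficient values are illustrative only.

# Hedged sketch; data values are invented.
import statistics

def individuals_chart_limits(coefficients):
    mr = [abs(b - a) for a, b in zip(coefficients, coefficients[1:])]
    mr_bar = statistics.mean(mr)
    center = statistics.mean(coefficients)
    sigma_hat = mr_bar / 1.128          # d2 constant for moving ranges of size 2
    return center - 3 * sigma_hat, center, center + 3 * sigma_hat

coeffs = [0.1012, 0.1008, 0.1015, 0.1011, 0.1009]   # e.g. a balance force coefficient
lcl, center, ucl = individuals_chart_limits(coeffs)
out_of_control = [c for c in coeffs if not (lcl <= c <= ucl)]
print(lcl, center, ucl, out_of_control)

Tracking model coefficients rather than raw measurements is what lets the combined DOE/SPC approach attribute a shift to a specific term (calibration, balance, or flow angularity) instead of to the process as a whole.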
PACS quality control and automatic problem notifier
NASA Astrophysics Data System (ADS)
Honeyman-Buck, Janice C.; Jones, Douglas; Frost, Meryll M.; Staab, Edward V.
1997-05-01
One side effect of installing a clinical PACS is that users become dependent upon the technology, and in some cases it can be very difficult to revert back to a film-based system if components fail. The nature of system failures ranges from slow deterioration of function, as seen in the loss of monitor luminance, to sudden catastrophic loss of the entire PACS network. This paper describes the quality control procedures in place at the University of Florida and the automatic notification system that alerts PACS personnel when a failure has happened or is anticipated. The goal is to recover from a failure with a minimum of downtime and no data loss. Routine quality control is practiced on all aspects of PACS, from acquisition, through network routing, through display, and including archiving. Whenever possible, the system components perform self-checks and cross-platform checks for active processes, file system status, errors in log files, and system uptime. When an error is detected or an exception occurs, an automatic page is sent to a pager with a diagnostic code. Documentation on each code, troubleshooting procedures, and repairs is kept on an intranet server accessible only to people involved in maintaining the PACS. In addition to the automatic paging system for error conditions, acquisition is assured by an automatic fax report sent on a daily basis to all technologists acquiring PACS images, to be used as a cross-check that all studies are archived prior to being removed from the acquisition systems. Daily quality control is performed to assure that studies can be moved from each acquisition system and that contrast adjustment is functioning. The results of selected quality control reports will be presented. The intranet documentation server will be described along with the automatic pager system. Monitor quality control reports will be described and the cost of quality control will be quantified. As PACS is accepted as a clinical tool, the same standards of quality control must be established as are expected for other equipment used in the diagnostic process.
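A hedged sketch of the kind of self-check and automatic notification loop described above. The specific check, threshold, path, and pager call are placeholders; a real deployment would monitor the archive volume, processes, and logs, and page through the site's actual gateway.

# Illustrative sketch only.
import shutil

def archive_free_fraction(path="/"):
    # in practice, point at the PACS archive volume rather than "/"
    usage = shutil.disk_usage(path)
    return usage.free / usage.total

def send_page(code, message):
    # stand-in for the real paging gateway used by the PACS team
    print(f"PAGE {code}: {message}")

def monitor_once(min_free=0.10):
    free = archive_free_fraction()
    if free < min_free:
        send_page("ARCH01", f"archive free space at {free:.0%}, below {min_free:.0%}")

monitor_once()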
Code of Federal Regulations, 2012 CFR
2012-10-01
... National Service Criminal History Check for a covered position? 2540.204 Section 2540.204 Public Welfare... What procedures must I follow in conducting a National Service Criminal History Check for a covered...'s review of the individual's criminal history, if any; (d) Provide a reasonable opportunity for the...
Code of Federal Regulations, 2011 CFR
2011-10-01
... National Service Criminal History Check for a covered position? 2540.204 Section 2540.204 Public Welfare... What procedures must I follow in conducting a National Service Criminal History Check for a covered...'s review of the individual's criminal history, if any; (d) Provide a reasonable opportunity for the...
Code of Federal Regulations, 2010 CFR
2010-10-01
... National Service Criminal History Check for a covered position? 2540.204 Section 2540.204 Public Welfare... What procedures must I follow in conducting a National Service Criminal History Check for a covered...'s review of the individual's criminal history, if any; (d) Provide a reasonable opportunity for the...
Global harmonization of quality assurance naming conventions in radiation therapy clinical trials.
Melidis, Christos; Bosch, Walther R; Izewska, Joanna; Fidarova, Elena; Zubizarreta, Eduardo; Ulin, Kenneth; Ishikura, Satoshi; Followill, David; Galvin, James; Haworth, Annette; Besuijen, Deidre; Clark, Catharine H; Clark, Clark H; Miles, Elizabeth; Aird, Edwin; Weber, Damien C; Hurkmans, Coen W; Verellen, Dirk
2014-12-01
To review the various radiation therapy quality assurance (RTQA) procedures used by the Global Clinical Trials RTQA Harmonization Group (GHG) steering committee members and present the harmonized RTQA naming conventions by amalgamating procedures with similar objectives. A survey of the GHG steering committee members' RTQA procedures, their goals, and naming conventions was conducted. The RTQA procedures were classified as baseline, preaccrual, and prospective/retrospective data capture and analysis. After all the procedures were accumulated and described, extensive discussions took place to come to harmonized RTQA procedures and names. The RTQA procedures implemented within a trial by the GHG steering committee members vary in quantity, timing, name, and compliance criteria. The procedures of each member are based on perceived chances of noncompliance, so that the quality of radiation therapy planning and treatment does not negatively influence the trial measured outcomes. A comparison of these procedures demonstrated similarities among the goals of the various methods, but the naming given to each differed. After thorough discussions, the GHG steering committee members amalgamated the 27 RTQA procedures to 10 harmonized ones with corresponding names: facility questionnaire, beam output audit, benchmark case, dummy run, complex treatment dosimetry check, virtual phantom, individual case review, review of patients' treatment records, and protocol compliance and dosimetry site visit. Harmonized RTQA naming conventions, which can be used in all future clinical trials involving radiation therapy, have been established. Harmonized procedures will facilitate future intergroup trial collaboration and help to ensure comparable RTQA between international trials, which enables meta-analyses and reduces RTQA workload for intergroup studies. Copyright © 2014 Elsevier Inc. All rights reserved.
14 CFR 121.315 - Cockpit check procedure.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 14 Aeronautics and Space 3 2010-01-01 2010-01-01 false Cockpit check procedure. 121.315 Section 121.315 Aeronautics and Space FEDERAL AVIATION ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED... emergencies. The procedures must be designed so that a flight crewmember will not need to rely upon his memory...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hardin, M; Harrison, A; Lockamy, V
Purpose: A desire to improve efficiency and throughput inspired a review of our physics chart check procedures. Departmental policy mandates plan checks pre-treatment, after the first treatment, and weekly every 3–5 days. This study examined the effectiveness of the "after first" check with respect to improving patient safety and clinical efficiency. The type and frequency of variations discovered during this redundant secondary review were examined over seven months. Methods: A community spreadsheet was created to record variations in care discovered during chart review following the first fraction of treatment and before the second fraction (each plan having been reviewed prior to treatment). Entries were recorded from August 2014 through February 2015, amounting to 43 recorded variations out of 906 reviewed charts. The variations were divided into categories and frequencies were assessed month to month. Results: Analysis of recorded variations indicates an overall variation rate of 4.7%. The initial rate was 13.5%; months 2–7 averaged 3.7%. The majority of variations related to discrepancies in documentation at 46.5%, followed by prescription, plan deficiency, and dose tracking related variations at 25.5%, 12.8%, and 12.8%, respectively. Minor variations (negligible consequence on patient treatment) outweighed major variations 3 to 1. Conclusion: This work indicates that this redundant secondary check is effective. The first-month spike in rates could be due to the Hawthorne/observer effect, but the consistent 4% variation rate suggests the need for periodic re-training on variations noted as frequent, to improve awareness and the quality of the initial chart review process, which may lead to improved treatment quality, patient safety, and clinical efficiency. Utilizing these results, a continuous quality improvement process following Deming's Plan-Do-Study-Act (PDSA) methodology was generated. The first iteration of this PDSA was adding a specific dose tracking checklist item to the pre-treatment plan check assessment, the ramifications of which will be assessed with future data.
Rangé, G; Chassaing, S; Marcollet, P; Saint-Étienne, C; Dequenne, P; Goralski, M; Bardiére, P; Beverilli, F; Godillon, L; Sabine, B; Laure, C; Gautier, S; Hakim, R; Albert, F; Angoulvant, D; Grammatico-Guillon, L
2018-05-01
To assess the reliability and low cost of a computerized interventional cardiology (IC) registry designed to prospectively and systematically collect high-quality data for all consecutive coronary patients referred for coronary angiogram and/or coronary angioplasty. Rigorous clinical practice assessment is a key factor in improving prognosis in IC. A prospective and permanent registry could achieve this goal but, presumably, at high cost and with a low level of data quality. One multicentric IC registry (the CRAC registry), fully integrated into the usual coronary activity report software, started in the Centre-Val de Loire (CVL) French region in 2014. Quality assessment of the CRAC registry was conducted in five IC CathLabs of the CVL region, from January 1st to December 31st, 2014. The quality of the collected data was evaluated by measuring procedure exhaustivity (by comparison with data from the hospital information system), data completeness (quality controls) and data consistency (by checking complete medical charts as the gold standard). The cost per procedure (global registry operating cost/number of collected procedures) was also estimated. The CRAC model provided a high level of quality, with 98.2% procedure completeness, 99.6% data completeness and 89% data consistency. The operating cost per procedure was €14.70 ($16.51) for data collection and quality control, including ST-segment elevation myocardial infarction (STEMI) preadmission information and one-year follow-up after angioplasty. This integrated computerized IC registry led to the construction of an exhaustive, reliable and low-cost database, including all coronary patients entering the participating IC centers in the CVL region. This solution will be developed in other French regions, setting up a national IC database for coronary patients in 2020: France PCI. Copyright © 2018 Elsevier Masson SAS. All rights reserved.
Data consistency checks for Jefferson Lab Experiment E00-002
NASA Astrophysics Data System (ADS)
Telfeyan, John; Niculescu, Gabriel; Niculescu, Ioana
2006-10-01
Jefferson Lab experiment E00-002 aims to measure inclusive electron-proton and electron-deuteron scattering cross sections at low Q² and moderately low Bjorken x. Data in this kinematic region will further our understanding of the transition between the perturbative and non-perturbative regimes of Quantum Chromodynamics (QCD). As part of the data analysis effort underway at James Madison University (JMU), a comprehensive set of checks and tests was implemented. These tests ensure the quality and consistency of the experimental data, as well as providing, where appropriate, correction factors between the experimental apparatus as used and its idealized computer-simulated representation. This contribution outlines the testing procedure as implemented in the JMU analysis, highlighting the most important features and results.
Quality monitored distributed voting system
Skogmo, David
1997-01-01
A quality monitoring system can detect certain system faults and fraud attempts in a distributed voting system. The system uses decoy voters to cast predetermined check ballots. Absent check ballots can indicate system faults. Altered check ballots can indicate attempts at counterfeiting votes. The system can also cast check ballots at predetermined times to provide another check on the distributed voting system.
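A minimal sketch (Python) of the reconciliation step implied by the description above: predetermined check ballots cast by decoy voters are compared against what the tally received, with missing ballots flagged as possible faults and altered ballots as possible tampering. The ballot identifiers and contents are hypothetical; this is not the patented system's implementation.

expected_check_ballots = {  # ballot_id -> predetermined content
    "decoy-001": "candidate A",
    "decoy-002": "candidate B",
}
received_ballots = {"decoy-001": "candidate A"}  # assumed output of the tally system

for ballot_id, expected in expected_check_ballots.items():
    received = received_ballots.get(ballot_id)
    if received is None:
        print(f"{ballot_id}: MISSING -> possible system fault")
    elif received != expected:
        print(f"{ballot_id}: ALTERED ({received!r} != {expected!r}) -> possible counterfeiting")
    else:
        print(f"{ballot_id}: OK")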
A qualitative study of DRG coding practice in hospitals under the Thai Universal Coverage scheme.
Pongpirul, Krit; Walker, Damian G; Winch, Peter J; Robinson, Courtland
2011-04-08
In the Thai Universal Coverage health insurance scheme, hospital providers are paid for their inpatient care using Diagnosis Related Group-based retrospective payment, for which the quality of the diagnosis and procedure codes is crucial. However, there has been limited understanding of which health care professions are involved and of how the diagnosis and procedure coding is actually done within hospital settings. The objective of this study is to detail the hospital coding structure and process, and to describe the roles of key hospital staff and other related internal dynamics in Thai hospitals that affect the quality of data submitted for inpatient care reimbursement. The research involved qualitative semi-structured interviews with 43 participants at 10 hospitals chosen to represent a range of hospital sizes (small/medium/large), locations (urban/rural), and types (public/private). Hospital coding practice has structural and process components. While the structural component includes human resources, hospital committees, and information technology infrastructure, the process component comprises all activities from patient discharge to submission of the diagnosis and procedure codes. At least eight health care professional disciplines are involved in the coding process, which comprises seven major steps, each of which involves different hospital staff: 1) Discharge Summarization, 2) Completeness Checking, 3) Diagnosis and Procedure Coding, 4) Code Checking, 5) Relative Weight Challenging, 6) Coding Report, and 7) Internal Audit. The hospital coding practice can be affected by at least five main factors: 1) Internal Dynamics, 2) Management Context, 3) Financial Dependency, 4) Resource and Capacity, and 5) External Factors. Hospital coding practice comprises both structural and process components, involves many health care professional disciplines, and varies greatly across hospitals as a result of five main factors.
A qualitative study of DRG coding practice in hospitals under the Thai Universal Coverage Scheme
2011-01-01
Background In the Thai Universal Coverage health insurance scheme, hospital providers are paid for their inpatient care using Diagnosis Related Group-based retrospective payment, for which the quality of the diagnosis and procedure codes is crucial. However, there has been limited understanding of which health care professions are involved and of how the diagnosis and procedure coding is actually done within hospital settings. The objective of this study is to detail the hospital coding structure and process, and to describe the roles of key hospital staff and other related internal dynamics in Thai hospitals that affect the quality of data submitted for inpatient care reimbursement. Methods The research involved qualitative semi-structured interviews with 43 participants at 10 hospitals chosen to represent a range of hospital sizes (small/medium/large), locations (urban/rural), and types (public/private). Results Hospital coding practice has structural and process components. While the structural component includes human resources, hospital committees, and information technology infrastructure, the process component comprises all activities from patient discharge to submission of the diagnosis and procedure codes. At least eight health care professional disciplines are involved in the coding process, which comprises seven major steps, each of which involves different hospital staff: 1) Discharge Summarization, 2) Completeness Checking, 3) Diagnosis and Procedure Coding, 4) Code Checking, 5) Relative Weight Challenging, 6) Coding Report, and 7) Internal Audit. The hospital coding practice can be affected by at least five main factors: 1) Internal Dynamics, 2) Management Context, 3) Financial Dependency, 4) Resource and Capacity, and 5) External Factors. Conclusions Hospital coding practice comprises both structural and process components, involves many health care professional disciplines, and varies greatly across hospitals as a result of five main factors. PMID:21477310
Suyama, Yoshihisa; Matsuki, Yu
2015-01-01
Restriction-enzyme (RE)-based next-generation sequencing methods have revolutionized marker-assisted genetic studies; however, the use of REs has limited their widespread adoption, especially for field samples with low-quality DNA and/or small quantities of DNA. Here, we developed a PCR-based procedure to construct reduced representation libraries without RE digestion steps, enabling de novo single-nucleotide polymorphism discovery and genotyping using next-generation sequencing. Using multiplexed inter-simple sequence repeat (ISSR) primers, thousands of genome-wide regions were amplified effectively from a wide variety of genomes, without prior genetic information. We demonstrated: 1) Mendelian gametic segregation of the discovered variants; 2) reproducibility of genotyping, by checking its applicability for individual identification; and 3) applicability to a wide variety of species, by performing standard population genetic analyses. This approach, called multiplexed ISSR genotyping by sequencing, should be applicable to many marker-assisted genetic studies with a wide range of DNA qualities and quantities. PMID:26593239
Telecommunications Systems Career Ladder, AFSC 307XO.
1981-01-01
standard test tone levels perform impulse noise tests make in-service or out-of-service quality checks on composite signal transmission levels Even...service or out-of-service quality control (QC) reports maintain trouble and restoration record forms (DD Form 1443) direct circuit or system checks...include: perform fault isolation on analog circuits make in-service or out-of-service quality checks on voice frequency carrier telegraph (VFCT) terminals
NASA Technical Reports Server (NTRS)
1974-01-01
Operational and configuration checks for the Apollo-Soyuz Test Project are presented. The checks include: backup crew prelaunch, prime crew prelaunch, boost and insertion, G and C reference data, G and N reference modes, rendezvous, navigation, Apollo-Soyuz operations, abort procedures, and emergency procedures.
Quality monitored distributed voting system
Skogmo, D.
1997-03-18
A quality monitoring system can detect certain system faults and fraud attempts in a distributed voting system. The system uses decoy voters to cast predetermined check ballots. Absent check ballots can indicate system faults. Altered check ballots can indicate attempts at counterfeiting votes. The system can also cast check ballots at predetermined times to provide another check on the distributed voting system. 6 figs.
Comparison of de novo assembly statistics of Cucumis sativus L.
NASA Astrophysics Data System (ADS)
Wojcieszek, Michał; Kuśmirek, Wiktor; Pawełkowicz, Magdalena; Pląder, Wojciech; Nowak, Robert M.
2017-08-01
Genome sequencing is the core of genomic research. With the development of NGS and the lowering cost of the procedure, another bottleneck remains: genome assembly. Developing the proper tool for this task is essential, as the quality of the genome has an important impact on further research. Here we present a comparison of several de Bruijn assemblers tested on C. sativus genomic reads. The assessment shows that the newly developed software, dnaasm, provides better results in terms of quantity and quality. The number of generated sequences is lower by 5–33%, with up to two-fold higher N50. A quality check showed that reliable results were generated by dnaasm. This provides a very strong basis for future genomic analysis.
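A minimal sketch (Python) of the N50 statistic cited above, the contig length at which the cumulative length of the longest contigs reaches half the total assembly size. The contig lengths below are illustrative, not the C. sativus assembly.

def n50(lengths):
    """Return the N50 of a list of contig lengths."""
    total = sum(lengths)
    running = 0
    for length in sorted(lengths, reverse=True):
        running += length
        if running >= total / 2:
            return length

contigs = [12000, 9000, 7000, 4000, 2000, 1000]  # assumed contig lengths (bp)
print("number of sequences:", len(contigs))
print("N50:", n50(contigs))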
The purpose of this SOP is to define the procedure for conducting a data accuracy check on a randomly selected 10% sample of all electronic data. This procedure applies to the cleaned, working databases generated during the Arizona NHEXAS project and the "Border" study. Keyword...
Global Harmonization of Quality Assurance Naming Conventions in Radiation Therapy Clinical Trials
DOE Office of Scientific and Technical Information (OSTI.GOV)
Melidis, Christos, E-mail: christos.melidis@eortc.be; Bosch, Walther R.; Izewska, Joanna
2014-12-01
Purpose: To review the various radiation therapy quality assurance (RTQA) procedures used by the Global Clinical Trials RTQA Harmonization Group (GHG) steering committee members and present the harmonized RTQA naming conventions arrived at by amalgamating procedures with similar objectives. Methods and Materials: A survey of the GHG steering committee members' RTQA procedures, their goals, and naming conventions was conducted. The RTQA procedures were classified as baseline, preaccrual, and prospective/retrospective data capture and analysis. After all the procedures were accumulated and described, extensive discussions took place to arrive at harmonized RTQA procedures and names. Results: The RTQA procedures implemented within a trial by the GHG steering committee members vary in quantity, timing, name, and compliance criteria. The procedures of each member are based on perceived chances of noncompliance, so that the quality of radiation therapy planning and treatment does not negatively influence the trial's measured outcomes. A comparison of these procedures demonstrated similarities among the goals of the various methods, but the naming given to each differed. After thorough discussions, the GHG steering committee members amalgamated the 27 RTQA procedures into 10 harmonized ones with corresponding names: facility questionnaire, beam output audit, benchmark case, dummy run, complex treatment dosimetry check, virtual phantom, individual case review, review of patients' treatment records, and protocol compliance and dosimetry site visit. Conclusions: Harmonized RTQA naming conventions, which can be used in all future clinical trials involving radiation therapy, have been established. Harmonized procedures will facilitate future intergroup trial collaboration and help to ensure comparable RTQA between international trials, which enables meta-analyses and reduces the RTQA workload for intergroup studies.
Morphology, geology and water quality assessment of former tin mining catchment.
Ashraf, Muhammad Aqeel; Maah, Mohd Jamil; Yusoff, Ismail
2012-01-01
Bestari Jaya, a former tin mining catchment, covers an area of 2656.31 hectares comprising four hundred and forty-two different-sized lakes and ponds. The present study area comprises 92 hectares of the catchment and includes four large lakes. ArcGIS version 9.2 was used to develop a bathymetric map, a Global Positioning System (GPS) was used for the hydrographical survey, and a flow meter was utilized for water discharge analysis (flow routing) of the catchment. The water quality parameters (pH, temperature, electric conductivity, dissolved oxygen DO, total dissolved solids TDS, chlorides, ammonium, nitrates) were analyzed using a Hydrolab. Quality assurance (QA) and quality control (QC) procedures were strictly followed throughout the field work and data analysis. Different procedures were employed to evaluate the analytical data and to check for possible transcription or dilution errors, changes during analysis, or unusual or unlikely values. Comparison of the results with the interim national water quality standards for Malaysia indicates that the water quality of the area is highly degraded. It is concluded that the Bestari Jaya ex-mining catchment has a high pollution potential due to mining activities and that River Ayer Hitam, the recipient of the catchment water, is a highly polluted river.
Morphology, Geology and Water Quality Assessment of Former Tin Mining Catchment
Ashraf, Muhammad Aqeel; Maah, Mohd. Jamil; Yusoff, Ismail
2012-01-01
Bestari Jaya, a former tin mining catchment, covers an area of 2656.31 hectares comprising four hundred and forty-two different-sized lakes and ponds. The present study area comprises 92 hectares of the catchment and includes four large lakes. ArcGIS version 9.2 was used to develop a bathymetric map, a Global Positioning System (GPS) was used for the hydrographical survey, and a flow meter was utilized for water discharge analysis (flow routing) of the catchment. The water quality parameters (pH, temperature, electric conductivity, dissolved oxygen DO, total dissolved solids TDS, chlorides, ammonium, nitrates) were analyzed using a Hydrolab. Quality assurance (QA) and quality control (QC) procedures were strictly followed throughout the field work and data analysis. Different procedures were employed to evaluate the analytical data and to check for possible transcription or dilution errors, changes during analysis, or unusual or unlikely values. Comparison of the results with the interim national water quality standards for Malaysia indicates that the water quality of the area is highly degraded. It is concluded that the Bestari Jaya ex-mining catchment has a high pollution potential due to mining activities and that River Ayer Hitam, the recipient of the catchment water, is a highly polluted river. PMID:22761549
Focant, Jean-François; Eppe, Gauthier; Massart, Anne-Cécile; Scholl, Georges; Pirard, Catherine; De Pauw, Edwin
2006-10-13
We report on the use of a state-of-the-art method for the measurement of selected polychlorinated dibenzo-p-dioxins, polychlorinated dibenzofurans and polychlorinated biphenyls in human serum specimens. The sample preparation procedure is based on manual small-size solid-phase extraction (SPE) followed by automated clean-up and fractionation using multi-sorbent liquid chromatography columns. SPE cartridges and all clean-up columns are disposable. Samples are processed in batches of 20 units, including one blank control (BC) sample and one quality control (QC) sample. The analytical measurement is performed using gas chromatography coupled to isotope dilution high-resolution mass spectrometry. The sample throughput corresponds to one series of 20 samples per day, from sample reception to data quality cross-check and reporting, once the procedure has been started and series of samples continue to be produced. Four analysts are required to ensure proper performance of the procedure. The entire procedure has been validated under International Organization for Standardization (ISO) 17025 criteria and further tested on more than 1500 unknown samples during various epidemiological studies. The method is further discussed in terms of reproducibility, efficiency and long-term stability with regard to the 35 target analytes. Data related to quality control and limit of quantification (LOQ) calculations are also presented and discussed.
Data Quality Verification at STScI - Automated Assessment and Your Data
NASA Astrophysics Data System (ADS)
Dempsey, R.; Swade, D.; Scott, J.; Hamilton, F.; Holm, A.
1996-12-01
As satellite based observatories improve their ability to deliver wider varieties and more complex types of scientific data, so too does the process of analyzing and reducing these data. It becomes correspondingly imperative that Guest Observers or Archival Researchers have access to an accurate, consistent, and easily understandable summary of the quality of their data. Previously, at the STScI, an astronomer would display and examine the quality and scientific usefulness of every single observation obtained with HST. Recently, this process has undergone a major reorganization at the Institute. A major part of the new process is that the majority of data are assessed automatically with little or no human intervention. As part of routine processing in the OSS--PODPS Unified System (OPUS), the Observatory Monitoring System (OMS) observation logs, the science processing trailer file (also known as the TRL file), and the science data headers are inspected by an automated tool, AUTO_DQ. AUTO_DQ then determines whether any anomalous events occurred during the observation, or through processing and calibration of the data, that affect the procedural quality of the data. The results are placed directly into the Procedural Data Quality (PDQ) file as a string of predefined data quality keywords and comments. These in turn are used by the Contact Scientist (CS) to check the scientific usefulness of the observations. In this manner, the telemetry stream is checked for known problems such as losses of lock, re-centerings, or degraded guiding, for example, while missing data or calibration errors are also easily flagged. If the problem is serious, the data are then queued for manual inspection by an astronomer. The success of every target acquisition is verified manually. If serious failures are confirmed, the PI and the scheduling staff are notified so that options concerning rescheduling the observations can be explored.
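A minimal sketch (Python) of an automated pass over observation-log fields that flags known anomalies with predefined quality keywords, loosely in the spirit of the AUTO_DQ step described above. The rule definitions, field names, and keywords here are hypothetical placeholders, not the actual PDQ vocabulary or pipeline logic.

RULES = [
    (lambda rec: rec.get("lock_losses", 0) > 0, "GSFAIL"),      # guide-star lock lost
    (lambda rec: rec.get("recenterings", 0) > 2, "RECENTER"),   # excessive re-centerings
    (lambda rec: rec.get("missing_packets", 0) > 0, "DATALOSS"),# telemetry gaps
]

def assess(record):
    """Return the list of quality keywords triggered by a log record."""
    keywords = [keyword for check, keyword in RULES if check(record)]
    return keywords or ["OK"]

obs = {"obs_id": "X123", "lock_losses": 1, "recenterings": 0, "missing_packets": 0}
flags = assess(obs)
print(obs["obs_id"], "->", ",".join(flags))  # anything other than OK would be queued for manual review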
Check-off logs for routine equipment maintenance.
Brewster, M A; Carver, P H; Randolph, B
1995-12-01
The regulatory requirement for appropriate routine instrument maintenance documentation is approached by laboratories in numerous ways. Standard operating procedures (SOPs) may refer to maintenance listed in instrument manuals, may indicate "periodic" performance of an action, or may indicate specific tasks to be performed at certain frequencies. The Quality Assurance Unit (QAU) task of assuring the performance of these indicated maintenance tasks can be extremely laborious if these records are merged with other analysis records. Further, the lack of written maintenance schedules often leads to omission of infrequently performed tasks. We recommend creation of routine maintenance check-off logs for instruments with tasks grouped by frequency of expected performance. Usage of such logs should result in better laboratory compliance with SOPs and the compliance can be readily monitored by QAU or by regulatory agencies.
Implementing Model-Check for Employee and Management Satisfaction
NASA Technical Reports Server (NTRS)
Jones, Corey; LaPha, Steven
2013-01-01
This presentation will discuss methods by which ModelCheck can be implemented to not only improve model quality, but also satisfy both employees and management through different sets of quality checks. This approach allows a standard set of modeling practices to be upheld throughout a company, with minimal interaction required by the end user. The presenter will demonstrate how to create multiple ModelCheck standards, preventing users from evading the system, and how it can improve the quality of drawings and models.
The purpose of this SOP is to describe the procedures for checking the food collections with the food stated as consumed in the 24-Hour Food Diary. The sample is then packaged and shipped for further analysis. This procedure applies to the checking of food samples and shipment ...
The purpose of this SOP is to describe the procedures for checking the food collections with the food stated as consumed in the 24-Hour Food Diary. The sample is then packaged and shipped to the FDA for further analysis. This procedure applies to the checking of food samples an...
77 FR 12175 - Airworthiness Directives; DASSAULT AVIATION Airplanes
Federal Register 2010, 2011, 2012, 2013, 2014
2012-02-29
... specified products. The MCAI states: The Maintenance Procedure (MP) 57-607, related to non destructive check... Recommended Maintenance Schedules chapter of the Aircraft Maintenance Documentation. After the implementation... maintenance program to include ``Non-Destructive Check of Flap Tracks 2 and 5,'' Maintenance Procedure 57-607...
NASA Astrophysics Data System (ADS)
Daughtrey, E. Hunter; Adams, Jeffrey R.; Oliver, Karen D.; Kronmiller, Keith G.; McClenny, William A.
1998-09-01
A trailer-deployed automated gas chromatograph-mass spectrometer (autoGC-MS) system capable of making continuous hourly measurements was used to determine volatile organic compounds (VOCs) in ambient air at New Hendersonville, Tennessee, and Research Triangle Park, North Carolina, in 1995. The system configuration, including the autoGC-MS, trailer and transfer line, siting, and sampling plan and schedule, is described. The autoGC-MS system employs a pair of matched sorbent traps to allow simultaneous sampling and desorption. Desorption is followed by Stirling engine cryofocusing and subsequent GC separation and mass spectral identification and quantification. Quality control measurements described include evaluating precision and accuracy of replicate analyses of independently supplied audit and round-robin canisters and determining the completeness of the data sets taken in Tennessee. Data quality objectives for precision (±10%) and accuracy (±20%) of 10- to 20-ppbv audit canisters and a completeness of >75% data capture were met. Quality assurance measures used in reviewing the data set include retention time stability, calibration checks, frequency distribution checks, and checks of the mass spectra. Special procedures and tests were used to minimize sorbent trap artifacts, to verify the quality of a standard prepared in our laboratory, and to prove the integrity of the insulated, heated transfer line. A rigorous determination of total system blank concentration levels using humidified scientific air spiked with ozone allowed estimation of method detection limits, ranging from 0.01 to 1.0 ppb C, for most of the 100 target compounds, which were a composite list of the target compounds for the Photochemical Assessment Monitoring Station network, those for Environmental Protection Agency method TO-14, and selected oxygenated VOCs.
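A minimal sketch (Python) of checking replicate audit-canister results against data quality objectives of the kind quoted above (precision within 10%, accuracy within 20%, completeness above 75%). Here precision is taken as the relative standard deviation of replicates and accuracy as relative bias from the nominal value; the concentrations and hour counts are illustrative assumptions, not the campaign's data.

import statistics

nominal_ppbv = 15.0                      # assumed audit-canister nominal concentration
replicates = [14.2, 15.1, 14.8, 15.4]    # assumed replicate measurements (ppbv)
hours_reported, hours_scheduled = 640, 720  # assumed data-capture counts

mean = statistics.mean(replicates)
precision = 100.0 * statistics.pstdev(replicates) / mean   # relative standard deviation
accuracy = 100.0 * abs(mean - nominal_ppbv) / nominal_ppbv  # relative bias
completeness = 100.0 * hours_reported / hours_scheduled

print(f"precision    {precision:.1f}%  (DQO <= 10%)")
print(f"accuracy     {accuracy:.1f}%  (DQO <= 20%)")
print(f"completeness {completeness:.1f}%  (DQO >  75%)")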
Fabrication of a grazing incidence telescope by grinding and polishing techniques on aluminum
NASA Technical Reports Server (NTRS)
Gallagher, Dennis; Cash, Webster; Green, James
1991-01-01
The paper describes the fabrication processes, by grinding and polishing, used in making the mirrors for a f/2.8 Wolter type-I grazing incidence telescope at Boulder (Colorado), together with testing procedure used to determine the quality of the images. All grinding and polishing is done on specially designed machine that consists of a horizontal spindle to hold and rotate the mirror and a stroke arm machine to push the various tools back and forth along the mirrors length. The progress is checked by means of the ronchi test during all grinding and polishing stages. Current measurements of the telescope's image quality give a FWHM measurement of 44 arcsec, with the goal set at 5-10 arcsec quality.
Testing Intelligently Includes Double-Checking Wechsler IQ Scores
ERIC Educational Resources Information Center
Kuentzel, Jeffrey G.; Hetterscheidt, Lesley A.; Barnett, Douglas
2011-01-01
The rigors of standardized testing make for numerous opportunities for examiner error, including simple computational mistakes in scoring. Although experts recommend that test scoring be double-checked, the extent to which independent double-checking would reduce scoring errors is not known. A double-checking procedure was established at a…
Quality Assurance and Control Considerations in Environmental Measurements and Monitoring
NASA Astrophysics Data System (ADS)
Sedlet, Jacob
1982-06-01
Quality assurance and quality control have become accepted as essential parts of all environmental surveillance, measurements, and monitoring programs, both nuclear and non-nuclear. The same principles and details apply to each. It is primarily the final measurement technique that differs. As the desire and need to measure smaller amounts of pollutants with greater accuracy have increased, it has been recognized that quality assurance and control programs are cost-effective in achieving the expected results. Quality assurance (QA) consists of all the actions necessary to provide confidence in the results. Quality control (QC) is a part of QA, and consists of those actions and activities that permit the control of the individual steps in the environmental program. The distinction between the two terms is not always clearly defined, but a sharp division is not necessary. The essential principle of QA and QC is a commitment to high quality results. The essential components of a QA and QC program are a complete, written procedures manual for all parts of the environmental program, the use of standard or validated procedures, participation in applicable interlaboratory comparison or QA programs, replicate analysis and measurement, training of personnel, and a means of auditing or checking that the QA and QC programs are properly conducted. These components are discussed below in some detail.
Spokane Community College Library Serials Operation Handbook.
ERIC Educational Resources Information Center
Waesche, Betty; Cargill, Katie
Listed are general policies for handling periodicals and specific policies for their acquisition, renewal, and processing, as well as procedures for checking in and claiming periodicals and handling duplicate copies. Flowcharts accompany procedural statements for claiming and checking-in periodicals and for dealing with duplicates. Specific duties…
The Quality of Rare Disease Registries: Evaluation and Characterization.
Coi, Alessio; Santoro, Michele; Villaverde-Hueso, Ana; Lipucci Di Paola, Michele; Gainotti, Sabina; Taruscio, Domenica; Posada de la Paz, Manuel; Bianchi, Fabrizio
2016-01-01
The focus on the quality of the procedures for data collection, storing, and analysis in the definition and implementation of a rare disease registry (RDR) is the basis for developing a valid and long-term sustainable tool. The aim of this study was to provide useful information for characterizing a quality profile for RDRs using an analytical approach applied to RDRs participating in the European Platform for Rare Disease Registries 2011-2014 (EPIRARE) survey. An indicator of quality was defined by choosing a small set of quality-related variables derived from the survey. The random forest method was used to identify the variables best defining a quality profile for RDRs. Fisher's exact test was employed to assess the association with the indicator of quality, and the Cochran-Armitage test was used to check the presence of a linear trend along different levels of quality. The set of variables found to characterize high-quality RDRs focused on ethical and legal issues, governance, communication of activities and results, established procedures to regulate access to data and security, and established plans to ensure long-term sustainability. The quality of RDRs is usually associated with a good oversight and governance mechanism and with durable funding. The results suggest that RDRs would benefit from support in management, information technology, epidemiology, and statistics. © 2016 S. Karger AG, Basel.
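A minimal sketch (Python, scipy) of the kind of association test described above: Fisher's exact test on a 2x2 table relating one quality-related registry characteristic to a binary quality indicator. The characteristic and the counts are illustrative assumptions, not the EPIRARE survey data.

from scipy.stats import fisher_exact

# rows: registries with / without an established data-access and security procedure
# cols: high-quality indicator yes / no (illustrative counts)
table = [[30, 10],
         [12, 25]]

odds_ratio, p_value = fisher_exact(table)
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.4f}")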
The effect of column purification on cDNA indirect labelling for microarrays
Molas, M Lia; Kiss, John Z
2007-01-01
Background The reproducibility of microarray results is dependent upon the performance of standardized procedures. Since the introduction of microarray technology for the analysis of global gene expression, reproducibility of results among different laboratories has been a major problem. Two of the main contributors to this variability are the use of different microarray platforms and different laboratory practices. In this paper, we address the latter question in terms of how variation in one of the steps of a labelling procedure affects the cDNA product prior to microarray hybridization. Results We used a standard procedure to label cDNA for microarray hybridization and employed different types of column chromatography for cDNA purification. After purifying labelled cDNA, we used the Agilent 2100 Bioanalyzer and agarose gel electrophoresis to assess the quality of the labelled cDNA before its hybridization onto a microarray platform. There were major differences in the cDNA profile (i.e. cDNA fragment lengths and abundance) as a result of using four different columns for purification. In addition, different columns have different efficiencies in removing rRNA contamination. This study indicates that the appropriate column to use in this type of protocol has to be determined experimentally. Finally, we present new evidence establishing the importance of testing the method of purification used during an indirect labelling procedure. Our results confirm the importance of assessing the quality of the sample in the labelling procedure prior to hybridization onto a microarray platform. Conclusion Standardization of the column purification systems to be used in labelling procedures will improve the reproducibility of microarray results among different laboratories. In addition, implementation of a quality control check point for the labelled samples prior to microarray hybridization will prevent hybridizing a poor-quality sample to expensive microarrays. PMID:17597522
The effect of column purification on cDNA indirect labelling for microarrays.
Molas, M Lia; Kiss, John Z
2007-06-27
The reproducibility of microarray results is dependent upon the performance of standardized procedures. Since the introduction of microarray technology for the analysis of global gene expression, reproducibility of results among different laboratories has been a major problem. Two of the main contributors to this variability are the use of different microarray platforms and different laboratory practices. In this paper, we address the latter question in terms of how variation in one of the steps of a labelling procedure affects the cDNA product prior to microarray hybridization. We used a standard procedure to label cDNA for microarray hybridization and employed different types of column chromatography for cDNA purification. After purifying labelled cDNA, we used the Agilent 2100 Bioanalyzer and agarose gel electrophoresis to assess the quality of the labelled cDNA before its hybridization onto a microarray platform. There were major differences in the cDNA profile (i.e. cDNA fragment lengths and abundance) as a result of using four different columns for purification. In addition, different columns have different efficiencies in removing rRNA contamination. This study indicates that the appropriate column to use in this type of protocol has to be determined experimentally. Finally, we present new evidence establishing the importance of testing the method of purification used during an indirect labelling procedure. Our results confirm the importance of assessing the quality of the sample in the labelling procedure prior to hybridization onto a microarray platform. Standardization of the column purification systems to be used in labelling procedures will improve the reproducibility of microarray results among different laboratories. In addition, implementation of a quality control check point for the labelled samples prior to microarray hybridization will prevent hybridizing a poor-quality sample to expensive microarrays.
Yadlapati, Rena; Johnston, Elyse R; Gluskin, Adam B; Gregory, Dyanna L; Cyrus, Rachel; Werth, Lindsay; Ciolino, Jody D; Grande, David P; Keswani, Rajesh N
2017-07-19
Inpatient colonoscopy preparations are often inadequate, compromising patient safety and procedure quality while resulting in greater hospital costs. The aims of this study were to: (1) design and implement an electronic inpatient split-dose bowel preparation order set; and (2) assess the intervention's impact upon preparation adequacy, repeated colonoscopies, hospital days, and costs. We conducted a single-center prospective pragmatic quasi-experimental study of hospitalized adults undergoing colonoscopy. The experimental intervention was designed using DMAIC (define, measure, analyze, improve, and control) methodology. Prospective data collected over 12 months were compared with data from a historical preintervention cohort. The primary outcome was bowel preparation quality, and secondary outcomes included the number of repeated procedures, hospital days, and costs. On the basis of a Delphi method and the DMAIC process, we created an electronic inpatient bowel preparation order set inclusive of a split-dose bowel preparation algorithm, automated orders for rescue medications, and nursing bowel preparation checks. The analysis data set included 969 patients, 445 (46%) in the postintervention group. The adequacy of bowel preparation significantly increased following the intervention (86% vs. 43%; P<0.01) and the proportion of repeated procedures decreased (2.0% vs. 4.6%; P=0.03). Mean hospital days from bowel preparation initiation to discharge decreased from 8.0 to 6.9 days (P=0.02). The intervention resulted in an estimated 1-year cost savings of $46,076 based on a reduction in excess hospital days associated with repeated and delayed procedures. Our interdisciplinary initiative targeting inpatient colonoscopy preparations significantly improved quality and reduced repeat procedures and hospital days. Other institutions should consider utilizing this framework to improve inpatient colonoscopy value.
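A minimal sketch (Python, scipy) of comparing pre- and post-intervention bowel preparation adequacy with a chi-square test. The counts below are reconstructed only approximately from the percentages reported above and are illustrative, not the study's analysis.

from scipy.stats import chi2_contingency

pre_n, post_n = 524, 445                    # 969 patients total, 445 in the post-intervention group
pre_adequate = round(0.43 * pre_n)          # roughly 43% adequate before the intervention
post_adequate = round(0.86 * post_n)        # roughly 86% adequate after the intervention

table = [[post_adequate, post_n - post_adequate],
         [pre_adequate,  pre_n - pre_adequate]]
chi2, p, dof, _expected = chi2_contingency(table)
print(f"chi2 = {chi2:.1f}, dof = {dof}, p = {p:.2g}")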
Quality Assurance Specifications for Planetary Protection Assays
NASA Astrophysics Data System (ADS)
Baker, Amy
As the European Space Agency planetary protection (PP) activities move forward to support the ExoMars and other planetary missions, it will become necessary to increase staffing of laboratories that provide analyses for these programs. Standardization of procedures, a comprehensive quality assurance program, and unilateral training of personnel will be necessary to ensure that the planetary protection goals and schedules are met. The PP Quality Assurance/Quality Control (QAQC) program is designed to regulate and monitor procedures performed by laboratory personnel to ensure that all work meets data quality objectives through the assembly and launch process. Because personnel time is at a premium and sampling schedules are often dependent on engineering schedules, it is necessary to have flexible staffing to support all sampling requirements. The most productive approach to having a competent and flexible work force is to establish well defined laboratory procedures and training programs that clearly address the needs of the program and the work force. The quality assurance specification for planetary protection assays has to ensure that laboratories and associated personnel can demonstrate the competence to perform assays according to the applicable standard AD4. Detailed subjects included in the presentation are as follows: • field and laboratory control criteria • data reporting • personnel training requirements and certification • laboratory audit criteria. Based upon RD2 for primary and secondary validation and RD3 for data quality objectives, the QAQC will provide traceable quality assurance safeguards by providing structured laboratory requirements for guidelines and oversight including training and technical updates, standardized documentation, standardized QA/QC checks, data review and data archiving.
High-level neutron coincidence counter maintenance manual
DOE Office of Scientific and Technical Information (OSTI.GOV)
Swansen, J.; Collinsworth, P.
1983-05-01
High-level neutron coincidence counter operational (field) calibration and usage is well known. This manual makes explicit basic (shop) check-out, calibration, and testing of new units and is a guide for repair of failed in-service units. Operational criteria for the major electronic functions are detailed, as are adjustments and calibration procedures, and recurrent mechanical/electromechanical problems are addressed. Some system tests are included for quality assurance. Data on nonstandard large-scale integrated (circuit) components and a schematic set are also included.
Prevalidation in pharmaceutical analysis. Part I. Fundamentals and critical discussion.
Grdinić, Vladimir; Vuković, Jadranka
2004-05-28
A complete prevalidation, as a basic prevalidation strategy for quality control and standardization of analytical procedures, was inaugurated. A fast and simple prevalidation methodology based on mathematical/statistical evaluation of a reduced number of experiments (N < or = 24) was elaborated, and guidelines as well as algorithms are given in detail. This strategy was produced for pharmaceutical applications and is dedicated to the preliminary evaluation of analytical methods where a linear calibration model, which very often occurs in practice, could be the most appropriate to fit the experimental data. The requirements presented in this paper should therefore help the analyst to design and perform the minimum number of prevalidation experiments needed to obtain all the information required to evaluate and demonstrate the reliability of the analytical procedure. The complete prevalidation process included characterization of analytical groups, checking of two limiting groups, testing of data homogeneity, establishment of analytical functions, recognition of outliers, evaluation of limiting values and extraction of prevalidation parameters. Moreover, a system of diagnosis for each particular prevalidation step was suggested. As an illustrative example demonstrating the feasibility of the prevalidation methodology, from among a great number of analytical procedures, a Vis-spectrophotometric procedure for the determination of tannins with Folin-Ciocalteu's phenol reagent was selected. The favourable metrological characteristics of this analytical procedure, as prevalidation figures of merit, established the prevalidation approach as a valuable concept in the preliminary evaluation of the quality of analytical procedures.
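A minimal sketch (Python, numpy) of the core step assumed by the prevalidation strategy above: fitting a linear calibration model to a reduced set of experiments and reporting the basic figures of merit. The concentrations and signals are illustrative, not the tannin data.

import numpy as np

conc = np.array([2.0, 4.0, 6.0, 8.0, 10.0, 12.0])        # assumed standard concentrations
signal = np.array([0.11, 0.21, 0.30, 0.41, 0.50, 0.61])  # assumed measured absorbances

slope, intercept = np.polyfit(conc, signal, 1)            # least-squares linear calibration
predicted = slope * conc + intercept
ss_res = np.sum((signal - predicted) ** 2)
ss_tot = np.sum((signal - signal.mean()) ** 2)
r_squared = 1.0 - ss_res / ss_tot
print(f"signal = {slope:.4f} * conc + {intercept:.4f}, R^2 = {r_squared:.4f}")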
32 CFR Appendix A to Part 86 - Criminal History Background Check Procedures
Code of Federal Regulations, 2013 CFR
2013-07-01
... 32 National Defense 1 2013-07-01 2013-07-01 false Criminal History Background Check Procedures A Appendix A to Part 86 National Defense Department of Defense OFFICE OF THE SECRETARY OF DEFENSE PERSONNEL...-190, section 1094, the Department of Defense requires that military members (except healthcare...
32 CFR Appendix A to Part 86 - Criminal History Background Check Procedures
Code of Federal Regulations, 2012 CFR
2012-07-01
... 32 National Defense 1 2012-07-01 2012-07-01 false Criminal History Background Check Procedures A Appendix A to Part 86 National Defense Department of Defense OFFICE OF THE SECRETARY OF DEFENSE PERSONNEL...-190, section 1094, the Department of Defense requires that military members (except healthcare...
32 CFR Appendix A to Part 86 - Criminal History Background Check Procedures
Code of Federal Regulations, 2014 CFR
2014-07-01
... 32 National Defense 1 2014-07-01 2014-07-01 false Criminal History Background Check Procedures A Appendix A to Part 86 National Defense Department of Defense OFFICE OF THE SECRETARY OF DEFENSE PERSONNEL...-190, section 1094, the Department of Defense requires that military members (except healthcare...
45 CFR 205.32 - Procedures for issuance of replacement checks.
Code of Federal Regulations, 2012 CFR
2012-10-01
... 45 Public Welfare 2 2012-10-01 2012-10-01 false Procedures for issuance of replacement checks. 205.32 Section 205.32 Public Welfare Regulations Relating to Public Welfare OFFICE OF FAMILY ASSISTANCE (ASSISTANCE PROGRAMS), ADMINISTRATION FOR CHILDREN AND FAMILIES, DEPARTMENT OF HEALTH AND HUMAN SERVICES...
45 CFR 205.32 - Procedures for issuance of replacement checks.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 45 Public Welfare 2 2014-10-01 2012-10-01 true Procedures for issuance of replacement checks. 205.32 Section 205.32 Public Welfare Regulations Relating to Public Welfare OFFICE OF FAMILY ASSISTANCE (ASSISTANCE PROGRAMS), ADMINISTRATION FOR CHILDREN AND FAMILIES, DEPARTMENT OF HEALTH AND HUMAN SERVICES...
45 CFR 205.32 - Procedures for issuance of replacement checks.
Code of Federal Regulations, 2013 CFR
2013-10-01
... 45 Public Welfare 2 2013-10-01 2012-10-01 true Procedures for issuance of replacement checks. 205.32 Section 205.32 Public Welfare Regulations Relating to Public Welfare OFFICE OF FAMILY ASSISTANCE (ASSISTANCE PROGRAMS), ADMINISTRATION FOR CHILDREN AND FAMILIES, DEPARTMENT OF HEALTH AND HUMAN SERVICES...
45 CFR 205.32 - Procedures for issuance of replacement checks.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 45 Public Welfare 2 2010-10-01 2010-10-01 false Procedures for issuance of replacement checks. 205.32 Section 205.32 Public Welfare Regulations Relating to Public Welfare OFFICE OF FAMILY ASSISTANCE (ASSISTANCE PROGRAMS), ADMINISTRATION FOR CHILDREN AND FAMILIES, DEPARTMENT OF HEALTH AND HUMAN SERVICES...
45 CFR 205.32 - Procedures for issuance of replacement checks.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 45 Public Welfare 2 2011-10-01 2011-10-01 false Procedures for issuance of replacement checks. 205.32 Section 205.32 Public Welfare Regulations Relating to Public Welfare OFFICE OF FAMILY ASSISTANCE (ASSISTANCE PROGRAMS), ADMINISTRATION FOR CHILDREN AND FAMILIES, DEPARTMENT OF HEALTH AND HUMAN SERVICES...
Serials Management by Microcomputer: The Potential of DBMS.
ERIC Educational Resources Information Center
Vogel, J. Thomas; Burns, Lynn W.
1984-01-01
Describes serials management at Philadelphia College of Textiles and Science library via a microcomputer, a file manager called PFS, and a relational database management system called dBase II. Check-in procedures, programing with dBase II, "static" and "active" databases, and claim procedures are discussed. Check-in forms are…
Anani, Nadim; Mazya, Michael V; Chen, Rong; Prazeres Moreira, Tiago; Bill, Olivier; Ahmed, Niaz; Wahlgren, Nils; Koch, Sabine
2017-01-10
Interoperability standards intend to standardise health information, clinical practice guidelines intend to standardise care procedures, and patient data registries are vital for monitoring quality of care and for clinical research. This study combines all three: it uses interoperability specifications to model guideline knowledge and applies the result to registry data. We applied the openEHR Guideline Definition Language (GDL) to data from 18,400 European patients in the Safe Implementation of Treatments in Stroke (SITS) registry to retrospectively check their compliance with European recommendations for acute stroke treatment. Comparing compliance rates obtained with GDL to those obtained by conventional statistical data analysis yielded a complete match, suggesting that GDL technology is reliable for guideline compliance checking. The successful application of a standard guideline formalism to a large patient registry dataset is an important step toward widespread implementation of computer-interpretable guidelines in clinical practice and registry-based research. Application of the methodology gave important results on the evolution of stroke care in Europe, important both for quality of care monitoring and clinical research.
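A minimal sketch (Python) of a retrospective compliance check over registry records, conceptually similar to applying a computable guideline rule to each patient, though written directly in Python rather than in GDL. The rule, field names, and records are illustrative assumptions, not the SITS criteria or data.

def rule_door_to_needle(record):
    """Example rule: thrombolysis started within 60 minutes of hospital arrival."""
    return record["door_to_needle_min"] is not None and record["door_to_needle_min"] <= 60

registry = [
    {"patient": 1, "door_to_needle_min": 45},
    {"patient": 2, "door_to_needle_min": 75},
    {"patient": 3, "door_to_needle_min": None},  # missing or not treated
]

compliant = sum(1 for record in registry if rule_door_to_needle(record))
print(f"compliance: {compliant}/{len(registry)} = {100.0 * compliant / len(registry):.0f}%")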
Lukášová, Ivana; Muselík, Jan; Franc, Aleš; Goněc, Roman; Mika, Filip; Vetchý, David
2017-11-15
Warfarin is an intensively discussed drug with a narrow therapeutic range. There have been cases of bleeding attributed to varying content or altered quality of the active substance. Factor analysis is useful for finding suitable technological parameters leading to high content uniformity of tablets containing a low amount of active substance. The composition of the tabletting blend and the technological procedure were set with respect to factor analysis of previously published results. The correctness of the set parameters was checked by manufacturing and evaluating tablets containing 1-10mg of warfarin sodium. The robustness of the suggested technology was checked by using a "worst case scenario" and statistical evaluation of European Pharmacopoeia (EP) content uniformity limits with respect to the Bergum division and the process capability index (Cpk). To evaluate the quality of the active substance and tablets, a dissolution method was developed (water; EP apparatus II; 25rpm), allowing for statistical comparison of dissolution profiles. The results obtained prove the suitability of factor analysis for optimizing the composition with respect to previously manufactured batches, and thus the use of meta-analysis under industrial conditions is feasible. Copyright © 2017 Elsevier B.V. All rights reserved.
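A minimal sketch (Python) of the process capability index (Cpk) mentioned above, computed for assay results against lower and upper content limits using the usual definition Cpk = min(USL - mean, mean - LSL) / (3 * sigma). The assay values and specification limits are illustrative, not the published data.

import statistics

assays = [99.1, 100.4, 98.7, 101.2, 99.8, 100.9, 99.5]  # assumed assay results, % of label claim
lsl, usl = 90.0, 110.0                                   # assumed lower/upper specification limits

mean = statistics.mean(assays)
sigma = statistics.stdev(assays)
cpk = min(usl - mean, mean - lsl) / (3 * sigma)
print(f"mean = {mean:.1f}%, sigma = {sigma:.2f}, Cpk = {cpk:.2f}")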
The Effects of Check, Connect, and Expect on Behavioral and Academic Growth
ERIC Educational Resources Information Center
McDaniel, Sara C.; Houchins, David E.; Robinson, Cecil
2016-01-01
"Check, Connect, and Expect" (CCE) is a secondary tier behavioral intervention that provides students with levels of support including a dedicated "coach" for check-in and check-out procedures, and social skills instruction. Elementary students (n = 22) in an alternative education school setting received CCE for 13 weeks…
Sayler, Elaine; Eldredge-Hindy, Harriet; Dinome, Jessie; Lockamy, Virginia; Harrison, Amy S
2015-01-01
The planning procedure for Valencia and Leipzig surface applicators (VLSAs) (Nucletron, Veenendaal, The Netherlands) differs substantially from CT-based planning; the unfamiliarity could lead to significant errors. This study applies failure modes and effects analysis (FMEA) to high-dose-rate (HDR) skin brachytherapy using VLSAs to ensure safety and quality. A multidisciplinary team created a protocol for HDR VLSA skin treatments and applied FMEA. Failure modes were identified and scored by severity, occurrence, and detectability. The clinical procedure was then revised to address high-scoring process nodes. Several key components were added to the protocol to minimize risk probability numbers. (1) Diagnosis, prescription, applicator selection, and setup are reviewed at weekly quality assurance rounds. Peer review reduces the likelihood of an inappropriate treatment regime. (2) A template for HDR skin treatments was established in the clinic's electronic medical record system to standardize treatment instructions. This reduces the chances of miscommunication between the physician and planner as well as increases the detectability of an error. (3) A screen check was implemented during the second check to increase detectability of an error. (4) To reduce error probability, the treatment plan worksheet was designed to display plan parameters in a format visually similar to the treatment console display, facilitating data entry and verification. (5) VLSAs are color coded and labeled to match the electronic medical record prescriptions, simplifying in-room selection and verification. Multidisciplinary planning and FMEA increased detectability and reduced error probability during VLSA HDR brachytherapy. This clinical model may be useful to institutions implementing similar procedures. Copyright © 2015 American Brachytherapy Society. Published by Elsevier Inc. All rights reserved.
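A minimal sketch (Python) of scoring failure modes by severity (S), occurrence (O), and detectability (D) and ranking them by the product S*O*D, as is commonly done in FMEA. The failure modes and scores below are illustrative placeholders, not the study's scoring table.

failure_modes = [
    {"step": "prescription entry",   "S": 8, "O": 3, "D": 5},
    {"step": "applicator selection", "S": 9, "O": 2, "D": 4},
    {"step": "treatment-time entry", "S": 7, "O": 4, "D": 6},
]

# Compute the product score for each failure mode, then rank highest first.
for fm in failure_modes:
    fm["score"] = fm["S"] * fm["O"] * fm["D"]

for fm in sorted(failure_modes, key=lambda fm: fm["score"], reverse=True):
    print(f'{fm["step"]:>22}: S*O*D = {fm["score"]}')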
Young, Stacie T.M.; Jamison, Marcael T.J.
2007-01-01
Storm runoff water-quality samples were collected as part of the State of Hawaii Department of Transportation Stormwater Monitoring Program. This program is designed to assess the effects of highway runoff and urban runoff on Halawa Stream. For this program, rainfall data were collected at two stations, continuous streamflow data at three stations, and water-quality data at five stations, which include the two continuous streamflow stations. This report summarizes rainfall, streamflow, and water-quality data collected between July 1, 2006 and June 30, 2007. A total of 13 samples was collected over two storms during July 1, 2006 to June 30, 2007. The goal was to collect grab samples nearly simultaneously at all five stations and flow-weighted time-composite samples at the three stations equipped with automatic samplers. Samples were analyzed for total suspended solids, total dissolved solids, nutrients, chemical oxygen demand, and selected trace metals (cadmium, chromium, copper, lead, nickel, and zinc). Additionally, grab samples were analyzed for oil and grease, total petroleum hydrocarbons, fecal coliform, and biological oxygen demand. Quality-assurance/quality-control samples were also collected during storms and during routine maintenance to verify analytical procedures and check the effectiveness of equipment-cleaning procedures.
40 CFR 89.411 - Exhaust sample procedure-gaseous components.
Code of Federal Regulations, 2014 CFR
2014-07-01
... and the values recorded. The number of events that may occur between the pre- and post-analysis checks... drift nor the span drift between the pre-analysis and post-analysis checks on any range used may exceed... Emission Test Procedures § 89.411 Exhaust sample procedure—gaseous components. (a) Automatic data...
40 CFR 89.411 - Exhaust sample procedure-gaseous components.
Code of Federal Regulations, 2013 CFR
2013-07-01
... and the values recorded. The number of events that may occur between the pre- and post-analysis checks... drift nor the span drift between the pre-analysis and post-analysis checks on any range used may exceed... Emission Test Procedures § 89.411 Exhaust sample procedure—gaseous components. (a) Automatic data...
40 CFR 90.413 - Exhaust sample procedure-gaseous components.
Code of Federal Regulations, 2011 CFR
2011-07-01
... the values recorded. The number of events that may occur between the pre- and post-checks is not.... (9) Neither the zero drift nor the span drift between the pre-analysis and post-analysis checks on... Gaseous Exhaust Test Procedures § 90.413 Exhaust sample procedure—gaseous components. (a) Automatic data...
40 CFR 90.413 - Exhaust sample procedure-gaseous components.
Code of Federal Regulations, 2013 CFR
2013-07-01
... the values recorded. The number of events that may occur between the pre- and post-checks is not.... (9) Neither the zero drift nor the span drift between the pre-analysis and post-analysis checks on... Gaseous Exhaust Test Procedures § 90.413 Exhaust sample procedure—gaseous components. (a) Automatic data...
40 CFR 90.413 - Exhaust sample procedure-gaseous components.
Code of Federal Regulations, 2014 CFR
2014-07-01
... the values recorded. The number of events that may occur between the pre- and post-checks is not.... (9) Neither the zero drift nor the span drift between the pre-analysis and post-analysis checks on... Gaseous Exhaust Test Procedures § 90.413 Exhaust sample procedure—gaseous components. (a) Automatic data...
40 CFR 90.413 - Exhaust sample procedure-gaseous components.
Code of Federal Regulations, 2012 CFR
2012-07-01
... the values recorded. The number of events that may occur between the pre- and post-checks is not.... (9) Neither the zero drift nor the span drift between the pre-analysis and post-analysis checks on... Gaseous Exhaust Test Procedures § 90.413 Exhaust sample procedure—gaseous components. (a) Automatic data...
40 CFR 89.411 - Exhaust sample procedure-gaseous components.
Code of Federal Regulations, 2011 CFR
2011-07-01
... and the values recorded. The number of events that may occur between the pre- and post-analysis checks... drift nor the span drift between the pre-analysis and post-analysis checks on any range used may exceed... Emission Test Procedures § 89.411 Exhaust sample procedure—gaseous components. (a) Automatic data...
40 CFR 89.411 - Exhaust sample procedure-gaseous components.
Code of Federal Regulations, 2010 CFR
2010-07-01
... and the values recorded. The number of events that may occur between the pre- and post-analysis checks... drift nor the span drift between the pre-analysis and post-analysis checks on any range used may exceed... Emission Test Procedures § 89.411 Exhaust sample procedure—gaseous components. (a) Automatic data...
40 CFR 89.411 - Exhaust sample procedure-gaseous components.
Code of Federal Regulations, 2012 CFR
2012-07-01
... and the values recorded. The number of events that may occur between the pre- and post-analysis checks... drift nor the span drift between the pre-analysis and post-analysis checks on any range used may exceed... Emission Test Procedures § 89.411 Exhaust sample procedure—gaseous components. (a) Automatic data...
40 CFR 90.413 - Exhaust sample procedure-gaseous components.
Code of Federal Regulations, 2010 CFR
2010-07-01
... the values recorded. The number of events that may occur between the pre- and post-checks is not.... (9) Neither the zero drift nor the span drift between the pre-analysis and post-analysis checks on... Gaseous Exhaust Test Procedures § 90.413 Exhaust sample procedure—gaseous components. (a) Automatic data...
Code of Federal Regulations, 2012 CFR
2012-10-01
... criminal registry check for a covered position? (a) FBI fingerprint-based check. If you conduct and document a fingerprint-based criminal history check through the Federal Bureau of Investigation, you will...
Code of Federal Regulations, 2010 CFR
2010-10-01
... criminal registry check for a covered position? (a) FBI fingerprint-based check. If you conduct and document a fingerprint-based criminal history check through the Federal Bureau of Investigation, you will...
Code of Federal Regulations, 2011 CFR
2011-10-01
... criminal registry check for a covered position? (a) FBI fingerprint-based check. If you conduct and document a fingerprint-based criminal history check through the Federal Bureau of Investigation, you will...
Ensuring long-term reliability of the data storage on optical disc
NASA Astrophysics Data System (ADS)
Chen, Ken; Pan, Longfa; Xu, Bin; Liu, Wei
2008-12-01
"Quality requirements and handling regulation of archival optical disc for electronic records filing" is released by The State Archives Administration of the People's Republic of China (SAAC) on its network in March 2007. This document established a complete operative managing process for optical disc data storage in archives departments. The quality requirements of the optical disc used in archives departments are stipulated. Quality check of the recorded disc before filing is considered to be necessary and the threshold of the parameter of the qualified filing disc is set down. The handling regulations for the staffs in the archives departments are described. Recommended environment conditions of the disc preservation, recording, accessing and testing are presented. The block error rate of the disc is selected as main monitoring parameter of the lifetime of the filing disc and three classes pre-alarm lines are created for marking of different quality check intervals. The strategy of monitoring the variation of the error rate curve of the filing discs and moving the data to a new disc or a new media when the error rate of the disc reaches the third class pre-alarm line will effectively guarantee the data migration before permanent loss. Only when every step of the procedure is strictly implemented, it is believed that long-term reliability of the data storage on optical disc for archives departments can be effectively ensured.
Technical Report: TG-142 compliant and comprehensive quality assurance tests for respiratory gating
DOE Office of Scientific and Technical Information (OSTI.GOV)
Woods, Kyle; Rong, Yi, E-mail: yrong@ucdavis.edu
2015-11-15
Purpose: To develop and establish a comprehensive gating commissioning and quality assurance procedure in compliance with TG-142. Methods: Eight Varian TrueBeam Linacs were used for this study. Gating commissioning included an end-to-end test and baseline establishment. The end-to-end test was performed using a CIRS dynamic thoracic phantom with a moving cylinder inside the lung, which was used for carrying both optically stimulated luminescence detectors (OSLDs) and Gafchromic EBT2 films while the target is moving, for a point dose check and a 2D profile check. In addition, baselines were established for beam-on temporal delay and calibration of the surrogate, for both megavoltage (MV) and kilovoltage (kV) beams. A motion simulation device (MotionSim) was used to provide periodic motion on a platform, synchronized with a surrogate motion. The overall accuracy and uncertainties were analyzed and compared. Results: The OSLD readings were within 5% of the planned dose (within measurement uncertainty) for both phase- and amplitude-gated deliveries. Film results agreed within 3% with the predicted dose for a standard sinusoid motion. The gate-on temporal accuracy averaged 139 ± 10 ms for MV beams and 92 ± 11 ms for kV beams. The temporal delay of the surrogate motion depends on the motion speed and averaged 54.6 ± 3.1 ms for slow, 24.9 ± 2.9 ms for intermediate, and 23.0 ± 20.1 ms for fast speed. Conclusions: A comprehensive gating commissioning procedure was introduced for verifying the output accuracy and establishing the temporal accuracy baselines with respiratory gating. The baselines are needed for routine quality assurance tests, as suggested by TG-142.
[The quality of medication orders--can it be improved?].
Vaknin, Ofra; Wingart-Emerel, Efrat; Stern, Zvi
2003-07-01
Medication errors are a common cause of morbidity and mortality among patients. Medication administration in hospitals is a complicated procedure with the possibility of error at each step. Errors are most commonly found at the prescription and transcription stages, although it is known that most errors can easily be avoided through strict adherence to standardized procedure guidelines. In an examination of medication errors reported in the hospital in the year 2000, we found that 38% were reported to have resulted from transcription errors. In the year 2001, the hospital initiated a program designed to identify faulty order processes in an effort to improve the quality and effectiveness of the medication administration process. As part of this program, it was decided to check and evaluate the quality of the written doctors' orders and the transcription of those orders by the nursing cadre, in various hospital units. The study was conducted using a questionnaire which checked compliance with hospital standards with regard to the medication administration process, as applied to 6 units over the course of 8 weeks. Results of the survey showed poor compliance with guidelines on the part of doctors and nurses. Only 18% of doctors' orders in the study and 37% of the nurses' transcriptions were written according to standards. The Emergency Department showed even lower compliance, with only 3% of doctors' orders and 25% of nurses' transcriptions complying with standards. As a result of this study, it was decided to initiate an intensive in-service teaching course to refresh the staff's knowledge of medication administration guidelines. In the future it is recommended that hand-written orders be replaced by computerized orders in an effort to limit the chance of error.
Methods to achieve high interrater reliability in data collection from primary care medical records.
Liddy, Clare; Wiens, Miriam; Hogg, William
2011-01-01
We assessed interrater reliability (IRR) of chart abstractors within a randomized trial of cardiovascular care in primary care. We report our findings, and outline issues and provide recommendations related to determining sample size, frequency of verification, and minimum thresholds for 2 measures of IRR: the κ statistic and percent agreement. We designed a data quality monitoring procedure having 4 parts: use of standardized protocols and forms, extensive training, continuous monitoring of IRR, and a quality improvement feedback mechanism. Four abstractors checked a 5% sample of charts at 3 time points for a predefined set of indicators of the quality of care. We set our quality threshold for IRR at a κ of 0.75, a percent agreement of 95%, or both. Abstractors reabstracted a sample of charts in 16 of 27 primary care practices, checking a total of 132 charts with 38 indicators per chart. The overall κ across all items was 0.91 (95% confidence interval, 0.90-0.92) and the overall percent agreement was 94.3%, signifying excellent agreement between abstractors. We gave feedback to the abstractors to highlight items that had a κ of less than 0.70 or a percent agreement less than 95%. No practice had to have its charts abstracted again because of poor quality. A 5% sampling of charts for quality control using IRR analysis yielded κ and agreement levels that met or exceeded our quality thresholds. Using 3 time points during the chart audit phase allows for early quality control as well as ongoing quality monitoring. Our results can be used as a guide and benchmark for other medical chart review studies in primary care.
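As a reading aid, a minimal Python sketch (not the study's analysis code; the 10-chart data are hypothetical) of the two inter-rater reliability measures used here, percent agreement and Cohen's kappa, for one binary indicator abstracted from the same charts by two abstractors:

from collections import Counter

def percent_agreement(rater_a, rater_b):
    # Fraction of charts on which the two abstractors recorded the same value.
    return sum(a == b for a, b in zip(rater_a, rater_b)) / len(rater_a)

def cohens_kappa(rater_a, rater_b):
    # Chance-corrected agreement: kappa = (p_o - p_e) / (1 - p_e).
    n = len(rater_a)
    p_o = percent_agreement(rater_a, rater_b)
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    p_e = sum((counts_a[c] / n) * (counts_b[c] / n) for c in set(counts_a) | set(counts_b))
    return (p_o - p_e) / (1 - p_e)

a = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]   # abstractor 1, one indicator over 10 charts
b = [1, 1, 0, 1, 1, 1, 1, 0, 1, 1]   # abstractor 2
pa, kappa = percent_agreement(a, b), cohens_kappa(a, b)
print(f"percent agreement = {pa:.1%}, kappa = {kappa:.2f}")
flag_for_feedback = kappa < 0.70 or pa < 0.95   # the study's feedback thresholds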
Remote preenrollment checking of consent forms to reduce nonconformity.
Journot, Valérie; Pérusat-Villetorte, Sophie; Bouyssou, Caroline; Couffin-Cadiergues, Sandrine; Tall, Aminata; Chêne, Geneviève
2013-01-01
In biomedical research, the signed consent form must be checked for compliance with regulatory requirements. Checking usually is performed on site, most frequently after a participant's final enrollment. We piloted a procedure for remote preenrollment consent forms checking. We applied it in five trials and assessed its efficiency to reduce form nonconformity before participant enrollment. Our clinical trials unit (CTU) routinely uses a consent form with an additional copy that contains a pattern that partially masks the participant's name and signature. After completion and signatures by the participant and investigator, this masked copy is faxed to the CTU for checking. In case of detected nonconformity, the CTU suspends the participant's enrollment until the form is brought into compliance. We checked nonconformities of consent forms both remotely before enrollment and on site in five trials conducted in our CTU. We tabulated the number and nature of nonconformities by location of detection: at the CTU or on site. We used these data for a pseudo before-and-after analysis and estimated the efficiency of this remote checking procedure in terms of reduction of nonconformities before enrollment as compared to the standard on-site checking procedure. We searched for nonconformity determinants among characteristics of trials, consent forms, investigator sites, and participants through multivariate logistic regression so as to identify opportunities for improvement in our procedure. Five trials, starting sequentially but running concurrently, with remote preenrollment and on-site checking of consent forms from 415 participants screened in 2006-2009 led to 518 consent forms checked; 94 nonconformities were detected in 75 forms, 75 (80%) remotely and 19 more (20%) on site. Nonconformities infrequently concerned dates of signatures (7%) and information about participants (12%). Most nonconformities dealt with investigator information (76%), primarily contact information (54%). The procedure reduced nonconformities by 81% (95% confidence interval (CI): 73%-89%) before enrollment. Nonconforming consent forms dropped from 25% to 0% over the period, indicating a rapid learning effect between trials. Fewer nonconformities were observed for participants screened later in a trial (odds ratio (95% CI): 0.5 (0.3-0.8); p = 0.004), indicating a learning effect within trials. Nonconformities were more common for participants enrolled after screening (2.4 (1.1-5.3); p = 0.03), indicating a stricter scrutiny by form checkers. Although our study had a pseudo before-and-after design, no major bias was identified. Power and generalizability of our findings were sufficient to support implementation in future trials. This procedure substantially limited nonconformity of consent forms with regulatory requirements before enrollment, thus proving a key component of a risk-based monitoring strategy that has been recommended to optimize resources for clinical research.
Quality Assurance: Patient Chart Reviews
NASA Astrophysics Data System (ADS)
Oginni, B. M.; Odero, D. O.
2009-07-01
Recent developments in radiation therapy have immensely impacted the way the radiation dose is delivered to patients undergoing radiation treatments. However, the fundamental quality assurance (QA) issues underlying the radiation therapy still remain the accuracy of the radiation dose and the radiation safety. One of the major duties of clinical medical physicists in the radiation therapy departments still revolves around ensuring the accuracy of dose delivery to the planning target volume (PTV), the reduction of unintended radiation to normal organs and minimization of the radiation exposure to the medical personnel based on ALARA (as low as reasonably achievable) principle. Many of the errors in radiation therapy can be minimized through a comprehensive program of periodic checks. One of the QA procedures on the patient comes in the form of chart reviews which could be in either electronic or paper-based format. We present the quality assurance procedures that have to be performed on the patient records from the beginning and periodically to the end of the treatment, based on the guidelines from the American Association of Physicists in Medicine (AAPM) and American College of Physicians (ACP).
Healthy participants in phase I clinical trials: the quality of their decision to take part.
Rabin, Cheryl; Tabak, Nili
2006-08-01
This study set out to test the quality of the decision-making process of healthy volunteers in clinical trials. Researchers fear that the decision to volunteer for clinical trials is taken inadequately and that the signature on the consent forms, meant to affirm that consent was 'informed', is actually insubstantial. The study design was quasi-experimental, using a convenience quota sample. Over a period of a year, candidates were approached during their screening process for a proposed clinical trial, after concluding the required 'Informed Consent' procedure. In all, 100 participants in phase I trials filled out questionnaires, based ultimately on the Janis and Mann model of vigilant information processing, during their stay in the research centre. Only 35% of the participants reached a 'quality decision'. There is a definite correlation between information processing and quality decision-making. However, many of the healthy research volunteers (58%) neither seek out information nor check alternatives before making a decision. Full disclosure is essential to a valid informed consent procedure but not sufficient; emphasis must be put on having the information understood and assimilated. Research nurses play a central role in achieving this objective.
A rapid and efficient DNA extraction protocol from fresh and frozen human blood samples.
Guha, Pokhraj; Das, Avishek; Dutta, Somit; Chaudhuri, Tapas Kumar
2018-01-01
Different methods available for extraction of human genomic DNA suffer from one or more drawbacks, including low yield, compromised quality, cost, time consumption, use of toxic organic solvents, and many more. Herein, we aimed to develop a method to extract DNA from 500 μL of fresh or frozen human blood. Five hundred microliters of fresh and frozen human blood samples were used for standardization of the extraction procedure. Absorbance at 260 and 280 nm (A260/A280) was measured to check the quality and quantity of the extracted DNA sample. Qualitative assessment of the extracted DNA was performed by polymerase chain reaction and double digestion of the DNA sample. Our protocol resulted in an average yield of 22±2.97 μg and 20.5±3.97 μg from 500 μL of fresh and frozen blood, respectively, which are comparable to many reference protocols and kits. Besides yielding a bulk amount of DNA, our protocol is rapid, economical, and avoids toxic organic solvents such as phenol. Because the quality is unaffected, the DNA is suitable for downstream applications. The protocol may also be useful for pursuing basic molecular research in laboratories having limited funds. © 2017 Wiley Periodicals, Inc.
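As a worked illustration of the spectrophotometric check mentioned above (the readings, dilution factor, and final volume below are hypothetical, not the authors' data), DNA concentration follows from A260 and purity from the A260/A280 ratio:

def dna_concentration_ug_per_ml(a260, dilution_factor=1.0):
    # For double-stranded DNA, A260 = 1.0 over a 1 cm path corresponds to roughly 50 ug/mL.
    return a260 * 50.0 * dilution_factor

a260, a280 = 0.090, 0.050                                               # hypothetical readings of a 1:10 dilution
concentration = dna_concentration_ug_per_ml(a260, dilution_factor=10)  # ~45 ug/mL
purity = a260 / a280                                                    # ~1.8 suggests protein-free DNA
yield_ug = concentration * 0.5                                          # assuming a hypothetical 0.5 mL final volume
print(f"~{concentration:.0f} ug/mL, A260/A280 = {purity:.2f}, yield ~{yield_ug:.1f} ug")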
1987-08-25
Department of Computer Science, University of Utah, Salt Lake City, UT 84112, August 25, 1987. ...generated for each instance of an alternative operation. One procedure merits special attention. CheckAndCommit(AltListr, g): INTEGER is called by process P...in figure 2. CheckAndCommit uses a procedure CheckGuard(AltListr, g): INTEGER that scans the remote alternative list AltListr looking for a matching
Gaipa, Giuseppe; Tilenni, Manuela; Straino, Stefania; Burba, Ilaria; Zaccagnini, Germana; Belotti, Daniela; Biagi, Ettore; Valentini, Marco; Perseghin, Paolo; Parma, Matteo; Campli, Cristiana Di; Biondi, Andrea; Capogrossi, Maurizio C; Pompilio, Giulio; Pesce, Maurizio
2010-01-01
The aim of the present study was to develop and validate a good manufacturing practice (GMP) compliant procedure for the preparation of bone marrow (BM) derived CD133+ cells for cardiovascular repair. Starting from available laboratory protocols to purify CD133+ cells from human cord blood, we implemented these procedures in a GMP facility and applied quality control conditions defining purity, microbiological safety and vitality of CD133+ cells. Validation of the CD133+ cell isolation and release process was performed according to a two-step experimental program comprising release quality checking (step 1) as well as ‘proofs of principle’ of their phenotypic integrity and biological function (step 2). This testing program was accomplished using in vitro culture assays and in vivo testing in an immunosuppressed mouse model of hindlimb ischemia. These criteria and procedures were successfully applied to GMP production of CD133+ cells from the BM for an ongoing clinical trial of autologous stem cell administration into patients with ischemic cardiomyopathy. Our results show that GMP implementation of currently available protocols for CD133+ cell selection is feasible and reproducible, and enables the production of cells having a full biological potential according to the most recent quality requirements of European Regulatory Agencies. PMID:19627397
A method of setting limits for the purpose of quality assurance
NASA Astrophysics Data System (ADS)
Sanghangthum, Taweap; Suriyapee, Sivalee; Kim, Gwe-Ya; Pawlicki, Todd
2013-10-01
The result from any assurance measurement needs to be checked against some limits for acceptability. There are two types of limits: those that define clinical acceptability (action limits) and those that are meant to serve as a warning that the measurement is close to the action limits (tolerance limits). Currently, there is no standard procedure to set these limits. In this work, we propose an operational procedure to set tolerance limits and action limits. The approach to establish the limits is based on techniques of quality engineering using control charts and a process capability index. The method is different for tolerance limits and action limits, with action limits being categorized into those that are specified and unspecified. The procedure is to first ensure process control using the I-MR control charts. Then, the tolerance limits are set equal to the control chart limits on the I chart. Action limits are determined using the Cpm process capability index with the requirement that the process be in control. The limits from the proposed procedure are compared to an existing or conventional method. Four examples are investigated: two of volumetric modulated arc therapy (VMAT) point dose quality assurance (QA) and two of routine linear accelerator output QA. The tolerance limits range from about 6% larger to 9% smaller than conventional action limits for VMAT QA cases. For the linac output QA, tolerance limits are about 60% smaller than conventional action limits. The operational procedure described in this work is based on established quality management tools and will provide a systematic guide to set up tolerance and action limits for different equipment and processes.
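A minimal Python sketch of the two quality-engineering tools named above, assuming illustrative weekly linac output data and a specified ±2% action limit (neither taken from the paper): tolerance limits from an Individuals (I) control chart and the Cpm capability index.

import numpy as np

def i_chart_limits(x):
    # Individuals-chart control limits: mean +/- 2.66 * average moving range.
    x = np.asarray(x, dtype=float)
    mr_bar = np.mean(np.abs(np.diff(x)))        # average moving range of consecutive points
    center = x.mean()
    return center - 2.66 * mr_bar, center, center + 2.66 * mr_bar

def cpm(x, lower_spec, upper_spec, target):
    # Taguchi capability index: penalizes both spread and off-target centering.
    x = np.asarray(x, dtype=float)
    tau = np.sqrt(x.var(ddof=1) + (x.mean() - target) ** 2)
    return (upper_spec - lower_spec) / (6.0 * tau)

output = np.array([0.2, -0.1, 0.4, 0.0, 0.3, -0.2, 0.1, 0.5, 0.2, -0.3])  # % deviation from nominal
lcl, center, ucl = i_chart_limits(output)                                  # proposed tolerance limits
capability = cpm(output, lower_spec=-2.0, upper_spec=2.0, target=0.0)      # assumed +/-2% action limits
print(f"tolerance limits: [{lcl:.2f}%, {ucl:.2f}%], Cpm = {capability:.2f}")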
Encoded physics knowledge in checking codes for nuclear cross section libraries at Los Alamos
NASA Astrophysics Data System (ADS)
Parsons, D. Kent
2017-09-01
Checking procedures for processed nuclear data at Los Alamos are described. Both continuous energy and multi-group nuclear data are verified by locally developed checking codes which use basic physics knowledge and common-sense rules. A list of nuclear data problems which have been identified with help of these checking codes is also given.
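A minimal, hypothetical Python sketch of the kind of common-sense rules such checking codes apply; the tabular layout (a shared energy grid with total and partial cross sections) is assumed for illustration and is not the actual Los Alamos library format:

import numpy as np

def check_cross_sections(energy, total, partials, tol=1e-3):
    # Apply simple physics sanity rules and return a list of detected problems.
    problems = []
    if np.any(np.diff(energy) <= 0):
        problems.append("energy grid is not strictly increasing")
    if np.any(total < 0) or any(np.any(p < 0) for p in partials.values()):
        problems.append("negative cross section encountered")
    rel_err = np.abs(sum(partials.values()) - total) / np.maximum(total, 1e-30)
    if np.any(rel_err > tol):
        problems.append("partial cross sections do not sum to the total")
    return problems

energy = np.array([1e-5, 1.0, 1e3, 1e6])                       # eV, illustrative grid
partials = {"elastic": np.array([20.0, 4.0, 3.0, 2.5]),        # barns, illustrative values
            "capture": np.array([10.0, 0.5, 0.1, 0.01])}
total = partials["elastic"] + partials["capture"]
print(check_cross_sections(energy, total, partials) or "all checks passed")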
External audits of electron beams using mailed TLD dosimetry: preliminary results.
Gomola, I; Van Dam, J; Isern-Verdum, J; Verstraete, J; Reymen, R; Dutreix, A; Davis, B; Huyskens, D
2001-02-01
A feasibility study has been performed to investigate the possibility of using mailed thermoluminescence dosimetry (TLD) for external audits of clinical electron beams in Europe. In the framework of the EC Network Project for Quality Assurance in Radiotherapy, instruction sheets and mailing procedures have been defined for mailed TLD dosimetry using the dedicated holder developed by a panel of experts of the International Atomic Energy Agency (IAEA). Three hundred and thirty electron beam set-ups have been checked in the reference centres and some local centres of the EC Network Project and, in addition, through the centres participating in the EORTC Radiotherapy Group trial 22922. The mean ratio of measured to stated dose deviates from unity by 0.2%, and the standard deviation of the ratio is 3.2%. In seven beam set-ups, deviations greater than 10% were observed (max. 66%), showing the usefulness of these checks. The results of this feasibility study (instruction sheets, mailing procedures, holder) are presently endorsed by the EQUAL-ESTRO structure in order to offer all ESTRO members, in the future, the possibility of requesting external audits of clinical electron beams.
Control Chart on Semi Analytical Weighting
NASA Astrophysics Data System (ADS)
Miranda, G. S.; Oliveira, C. C.; Silva, T. B. S. C.; Stellato, T. B.; Monteiro, L. R.; Marques, J. R.; Faustino, M. G.; Soares, S. M. V.; Ulrich, J. C.; Pires, M. A. F.; Cotrim, M. E. B.
2018-03-01
Semi-analytical balance verification aims to assess balance performance using graphs that illustrate measurement dispersion through time, and to demonstrate that measurements were performed in a reliable manner. This study presents internal quality control of a semi-analytical balance (GEHAKA BG400) using control charts. From 2013 to 2016, two weight standards were monitored before any balance operation. This work evaluated whether any significant difference or bias was present in the weighing procedure over time, to check the reliability of the generated data. This work also exemplifies how control intervals are established.
Serologic test systems development. Progress report, July 1, 1976--September 30, 1977
DOE Office of Scientific and Technical Information (OSTI.GOV)
Saunders, G.C.; Clinard, E.H.; Bartlett, M.L.
1978-01-01
Work has continued on the development and application of the Enzyme-Labeled Antibody (ELA) test to the USDA needs. Results on trichinosis, brucellosis, and staphylococcal enterotoxin A detection are very encouraging. A field test for trichinosis detection is being worked out in cooperation with Food Safety and Quality Service personnel. Work is in progress with the Technicon Instrument Corporation to develop a modification of their equipment to automatically process samples by the ELA procedure. An automated ELA readout instrument for 96-well trays has been completed and is being checked out.
40 CFR Appendix M to Part 51 - Recommended Test Methods for State Implementation Plans
Code of Federal Regulations, 2014 CFR
2014-07-01
..., and after the run with the cyclone removed. The cyclone is removed before the post-test leak-check to.... 4.1.4.3 Post-Test Leak-Check. A leak-check is required at the conclusion of each sampling run... disturbing the collected sample and use the following procedure to conduct a post-test leak-check. 4.1.4.3.1...
40 CFR Appendix M to Part 51 - Recommended Test Methods for State Implementation Plans
Code of Federal Regulations, 2013 CFR
2013-07-01
..., and after the run with the cyclone removed. The cyclone is removed before the post-test leak-check to.... 4.1.4.3 Post-Test Leak-Check. A leak-check is required at the conclusion of each sampling run... disturbing the collected sample and use the following procedure to conduct a post-test leak-check. 4.1.4.3.1...
40 CFR Appendix M to Part 51 - Recommended Test Methods for State Implementation Plans
Code of Federal Regulations, 2011 CFR
2011-07-01
..., and after the run with the cyclone removed. The cyclone is removed before the post-test leak-check to.... 4.1.4.3 Post-Test Leak-Check. A leak-check is required at the conclusion of each sampling run... disturbing the collected sample and use the following procedure to conduct a post-test leak-check. 4.1.4.3.1...
40 CFR Appendix M to Part 51 - Recommended Test Methods for State Implementation Plans
Code of Federal Regulations, 2012 CFR
2012-07-01
..., and after the run with the cyclone removed. The cyclone is removed before the post-test leak-check to.... 4.1.4.3 Post-Test Leak-Check. A leak-check is required at the conclusion of each sampling run... disturbing the collected sample and use the following procedure to conduct a post-test leak-check. 4.1.4.3.1...
Ruffing, T; Huchzermeier, P; Muhm, M; Winkler, H
2014-05-01
Precise coding is an essential requirement for generating a valid DRG. The aim of our study was to evaluate the quality of the initial coding of surgical procedures, as well as to introduce our "hybrid model" of a surgical specialist supervising medical coding and a nonphysician for case auditing. The department's DRG-responsible physician, as a surgical specialist, has profound knowledge of both surgery and DRG coding. At a Level 1 hospital, 1000 coded cases of surgical procedures were checked. In our department, the model of a DRG-responsible physician who is both surgeon and coder has proven itself for many years. The initial surgical DRG coding had to be corrected by the DRG-responsible physician in 42.2% of cases. On average, one hour per working day was necessary. The implementation of a DRG-responsible physician is a simple, effective way to connect medical and business expertise without interface problems. Permanent feedback promotes both medical and economic sensitivity for the improvement of coding quality.
NASA Astrophysics Data System (ADS)
Pastorello, G.; Agarwal, D.; Poindexter, C.; Papale, D.; Trotta, C.; Ribeca, A.; Canfora, E.; Faybishenko, B.; Gunter, D.; Chu, H.
2015-12-01
The flux-measuring sites that are part of AmeriFlux are operated and maintained in a fairly independent fashion, both in terms of scientific goals and operational practices. This is also the case for most sites from other networks in FLUXNET. This independence leads to a degree of heterogeneity in the data sets collected at the sites, which is also reflected in data quality levels. The generation of derived data products and data synthesis efforts, two of the main goals of these networks, are directly affected by the heterogeneity in data quality. In a collaborative effort between AmeriFlux and ICOS, a series of quality checks are being conducted for the data sets before any network-level data processing and product generation take place. From these checks, a set of common data issues was identified, and these are being cataloged and classified into data quality patterns. These patterns are now being used as a basis for implementing automation for certain data quality checks, speeding up the process of applying the checks and evaluating the data. Currently, most data checks are performed individually in each data set, requiring visual inspection and inputs from a data curator. This manual process makes it difficult to scale the quality checks, creating a bottleneck for the data processing. One goal of the automated checks is to free up data curators' time so they can focus on new or less common issues. As new issues are identified, they can also be cataloged and classified, extending the coverage of existing patterns or potentially generating new patterns, helping both improve existing automated checks and create new ones. This approach is helping make data quality evaluation faster, more systematic, and reproducible. Furthermore, these patterns are also helping with documenting common causes and solutions for data problems. This can help tower teams with diagnosing problems in data collection and processing, and also in correcting historical data sets. In this presentation, using AmeriFlux fluxes and micrometeorological data, we discuss our approach to creating observational data patterns, and how we are using them to implement new automated checks. We also detail examples of these observational data patterns, illustrating how they are being used.
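A minimal Python sketch (the variable name, thresholds, and half-hourly cadence are illustrative, not the AmeriFlux/ICOS production checks) of automating two of the data-quality patterns described above, a plausible-range check and a timestamp-continuity check:

import pandas as pd

def range_check(series, lower, upper):
    # Flag values outside a physically plausible range.
    return (series < lower) | (series > upper)

def timestamp_gaps(index, freq="30min"):
    # Return timestamps missing from an expected regular half-hourly record.
    expected = pd.date_range(index.min(), index.max(), freq=freq)
    return expected.difference(index)

idx = pd.date_range("2015-07-01 00:00", periods=6, freq="30min").delete(3)  # one missing step
df = pd.DataFrame({"TA": [21.3, 20.9, 55.0, 19.8, 19.5]}, index=idx)        # air temperature, deg C

flags = range_check(df["TA"], lower=-50.0, upper=50.0)   # the 55.0 reading is flagged
gaps = timestamp_gaps(df.index)                          # the deleted half hour is reported
print(df[flags], gaps, sep="\n")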
49 CFR 1572.15 - Procedures for HME security threat assessment.
Code of Federal Regulations, 2012 CFR
2012-10-01
... completes includes a fingerprint-based criminal history records check (CHRC), an intelligence-related background check, and a final disposition. (b) Fingerprint-based check. In order to conduct a fingerprint... before the date of the expiration of the HME. (2) Where the State elects to collect fingerprints and...
49 CFR 1572.15 - Procedures for HME security threat assessment.
Code of Federal Regulations, 2013 CFR
2013-10-01
... completes includes a fingerprint-based criminal history records check (CHRC), an intelligence-related background check, and a final disposition. (b) Fingerprint-based check. In order to conduct a fingerprint... before the date of the expiration of the HME. (2) Where the State elects to collect fingerprints and...
49 CFR 1572.15 - Procedures for HME security threat assessment.
Code of Federal Regulations, 2011 CFR
2011-10-01
... completes includes a fingerprint-based criminal history records check (CHRC), an intelligence-related background check, and a final disposition. (b) Fingerprint-based check. In order to conduct a fingerprint... before the date of the expiration of the HME. (2) Where the State elects to collect fingerprints and...
49 CFR 1572.15 - Procedures for HME security threat assessment.
Code of Federal Regulations, 2010 CFR
2010-10-01
... completes includes a fingerprint-based criminal history records check (CHRC), an intelligence-related background check, and a final disposition. (b) Fingerprint-based check. In order to conduct a fingerprint... before the date of the expiration of the HME. (2) Where the State elects to collect fingerprints and...
49 CFR 1572.15 - Procedures for HME security threat assessment.
Code of Federal Regulations, 2014 CFR
2014-10-01
... completes includes a fingerprint-based criminal history records check (CHRC), an intelligence-related background check, and a final disposition. (b) Fingerprint-based check. In order to conduct a fingerprint... before the date of the expiration of the HME. (2) Where the State elects to collect fingerprints and...
Fatigue design procedure for the American SST prototype
NASA Technical Reports Server (NTRS)
Doty, R. J.
1972-01-01
For supersonic airline operations, significantly higher environmental temperature is the primary new factor affecting structural service life. Methods for incorporating the influence of temperature in detailed fatigue analyses are shown along with current test indications. Thermal effects investigated include real-time compared with short-time testing, long-time temperature exposure, and stress-temperature cycle phasing. A method is presented which allows designers and stress analyzers to check fatigue resistance of structural design details. A communicative rating system is presented which defines the relative fatigue quality of the detail so that the analyst can define cyclic-load capability of the design detail by entering constant-life charts for varying detail quality. If necessary then, this system allows the designer to determine ways to improve the fatigue quality for better life or to determine the operating stresses which will provide the required service life.
Young, Stacie T.M.; Ball, Marcael T.J.
2004-01-01
Storm runoff water-quality samples were collected as part of the State of Hawaii Department of Transportation Stormwater Monitoring Program. This program is designed to assess the effects of highway runoff and urban runoff on Halawa Stream. For this program, rainfall data were collected at two sites, continuous streamflow data at three sites, and water-quality data at five sites, which include the three streamflow sites. This report summarizes rainfall, streamflow, and water-quality data collected between July 1, 2003 and June 30, 2004. A total of 30 samples was collected over four storms during July 1, 2003 to June 30, 2004. In general, an attempt was made to collect grab samples nearly simultaneously at all five sites, and flow-weighted time-composite samples were collected at the three sites equipped with automatic samplers. However, all four storms were partially sampled because either not all stations were sampled or only grab samples were collected. Samples were analyzed for total suspended solids, total dissolved solids, nutrients, chemical oxygen demand, and selected trace metals (cadmium, copper, lead, and zinc). Grab samples were additionally analyzed for oil and grease, total petroleum hydrocarbons, fecal coliform, and biological oxygen demand. Quality-assurance/quality-control samples were also collected during storms and during routine maintenance to verify analytical procedures and check the effectiveness of equipment-cleaning procedures.
Efficiency of the strong satisfiability checking procedure for reactive system specifications
NASA Astrophysics Data System (ADS)
Shimakawa, Masaya; Hagihara, Shigeki; Yonezaki, Naoki
2018-04-01
Reactive systems are those that interact with their environment. To develop reactive systems without defects, it is important to describe behavior specifications in a formal language, such as linear temporal logic, and to verify the specification. Specifically, it is important to check whether specifications satisfy the property called realizability. In previous studies, we have proposed the concept of strong satisfiability as a necessary condition for realizability. Although this property of reactive system specifications is a necessary condition, many practical unrealizable specifications are also strongly unsatisfiable. Moreover, we have previously shown the theoretical complexity of the strong satisfiability problem. In the current study, we investigate the practical efficiency of the strong satisfiability checking procedure and demonstrate that strong satisfiability can be checked more efficiently than realizability.
Validating LES for Jet Aeroacoustics
NASA Technical Reports Server (NTRS)
Bridges, James; Wernet, Mark P.
2011-01-01
Engineers charged with making jet aircraft quieter have long dreamed of being able to see exactly how turbulent eddies produce sound and this dream is now coming true with the advent of large eddy simulation (LES). Two obvious challenges remain: validating the LES codes at the resolution required to see the fluid-acoustic coupling, and the interpretation of the massive datasets that are produced. This paper addresses the former, the use of advanced experimental techniques such as particle image velocimetry (PIV) and Raman and Rayleigh scattering, to validate the computer codes and procedures used to create LES solutions. This paper argues that the issue of accuracy of the experimental measurements be addressed by cross-facility and cross-disciplinary examination of modern datasets along with increased reporting of internal quality checks in PIV analysis. Further, it argues that the appropriate validation metrics for aeroacoustic applications are increasingly complicated statistics that have been shown in aeroacoustic theory to be critical to flow-generated sound, such as two-point space-time velocity correlations. A brief review of data sources available is presented along with examples illustrating cross-facility and internal quality checks required of the data before it should be accepted for validation of LES.
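Because two-point space-time velocity correlations are singled out as a key validation metric, a minimal Python sketch with synthetic probe signals (not PIV data) shows how such a correlation is formed and how its peak recovers a convection delay:

import numpy as np

def space_time_correlation(u1, u2, max_lag):
    # R(tau) = <u1'(t) u2'(t+tau)>, normalized by the two signals' standard deviations.
    u1 = u1 - u1.mean()
    u2 = u2 - u2.mean()
    norm = u1.std() * u2.std()
    lags = np.arange(-max_lag, max_lag + 1)
    corr = []
    for lag in lags:
        if lag >= 0:
            c = np.mean(u1[: len(u1) - lag] * u2[lag:])
        else:
            c = np.mean(u1[-lag:] * u2[: len(u2) + lag])
        corr.append(c / norm)
    return lags, np.array(corr)

rng = np.random.default_rng(0)
n, delay = 4000, 12                                  # the downstream probe lags by 12 samples
base = rng.standard_normal(n + delay)
u_upstream = base[delay:]
u_downstream = base[:n] + 0.3 * rng.standard_normal(n)
lags, corr = space_time_correlation(u_upstream, u_downstream, max_lag=40)
print("peak correlation at lag", lags[np.argmax(corr)], "samples (the imposed convection delay)")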
Behera, M D; Gupta, A K; Barik, S K; Das, P; Panda, R M
2018-06-15
With the availability of satellite data in the free data domain, remote sensing has increasingly become a handy tool for monitoring land and water resources development activities with minimal cost and time. Here, we verified construction of check dams and implementation of plantation activities in two districts of Tripura state using Landsat and Sentinel-2 images for the years 2008 and 2016-2017, respectively. We applied spectral reflectance curves and index-based proxies to quantify these activities for two time periods. A subset of the total check dams and plantation sites was chosen on the basis of site condition, nature of check dams, and planted species for identification on satellite images, and another subset was randomly chosen to validate the identification procedure. The normalized difference water index (NDWI) derived from Landsat and Sentinel-2 was used to quantify the water area that evolved, assess the water quality, and assess the influence of associated tree shadows. Three types of check dams were observed, i.e., full, partial, and fully soil exposed, on the basis of the presence of grass or scrub on the check dams. Based on the nature of the check dam and site characteristics, we classified the water bodies into 11 categories using six interpretation keys (size, shape, water depth, quality, shadow of associated trees, catchment area). The check dams constructed on existing narrow gullies totally covered by branches or associated plants were not identified without field verification. Further, use of EVI enabled us to confirm the plantation activities and judge the corresponding increase in vegetation vigor. The plantation activities were established based on the presence and absence of existing vegetation. Clearing of the plantation sites for planting shows a differential increase in EVI values during the initial years. The 403 plantation sites were categorized into 12 major groups on the basis of the presence of dominant species and site conditions. The dominant species were Areca catechu, Musa paradisiaca, Ananas comosus, Bambusa sp., and mixed plantation of A. catechu and M. paradisiaca. However, the highest maximum increase in average EVI was observed for the pineapple plantation sites (0.11), followed by Bambusa sp. (0.10). These sites were fully covered with plantation without any exposed soil. The present study successfully demonstrates a satellite-based survey, supplemented with ground information, evaluating the changes in vegetation profile due to plantation activities, locations of check dams, extent of water bodies, downstream irrigation, and catchment area of water bodies.
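A minimal Python sketch (synthetic reflectance values; the Sentinel-2 band mapping B2=blue, B3=green, B4=red, B8=NIR is assumed) of the two index-based proxies used above, NDWI for water behind check dams and EVI for plantation vigor:

import numpy as np

def ndwi(green, nir):
    # McFeeters NDWI: positive values generally indicate open water.
    return (green - nir) / (green + nir)

def evi(nir, red, blue):
    # Enhanced Vegetation Index with the standard coefficients.
    return 2.5 * (nir - red) / (nir + 6.0 * red - 7.5 * blue + 1.0)

# Toy 2x2 surface-reflectance tiles (0-1): the first pixel behaves like water, the rest like vegetation.
blue  = np.array([[0.02, 0.03], [0.05, 0.04]])
green = np.array([[0.05, 0.06], [0.08, 0.07]])
red   = np.array([[0.03, 0.05], [0.10, 0.06]])
nir   = np.array([[0.02, 0.30], [0.25, 0.35]])

water_mask = ndwi(green, nir) > 0.0        # candidate water surface behind a check dam
vigor = evi(nir, red, blue)                # higher values for denser plantation cover
print(water_mask, vigor.round(2), sep="\n")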
The Latest Information on Fort Detrick Gate Access Procedures | Poster
As of Jan. 5, all visitors to Fort Detrick are required to undergo a National Crime Information Center background check prior to entering base. The background checks are conducted at Old Farm Gate. The new access procedures may cause delays at all Fort Detrick gates, but especially at Old Farm Gate. Access requirements have not changed for employees and personnel with a
40 CFR 92.121 - Oxides of nitrogen analyzer calibration and check.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 40 Protection of Environment 21 2012-07-01 2012-07-01 false Oxides of nitrogen analyzer... Procedures § 92.121 Oxides of nitrogen analyzer calibration and check. (a) Quench checks; NO X analyzer. (1... performed in step in paragraph (a)(3)(i) this section. (b) Oxides of nitrogen analyzer calibration. (1...
40 CFR 92.121 - Oxides of nitrogen analyzer calibration and check.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 40 Protection of Environment 20 2014-07-01 2013-07-01 true Oxides of nitrogen analyzer calibration... Procedures § 92.121 Oxides of nitrogen analyzer calibration and check. (a) Quench checks; NO X analyzer. (1... performed in step in paragraph (a)(3)(i) this section. (b) Oxides of nitrogen analyzer calibration. (1...
40 CFR 92.121 - Oxides of nitrogen analyzer calibration and check.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 20 2011-07-01 2011-07-01 false Oxides of nitrogen analyzer... Procedures § 92.121 Oxides of nitrogen analyzer calibration and check. (a) Quench checks; NO X analyzer. (1... performed in step in paragraph (a)(3)(i) this section. (b) Oxides of nitrogen analyzer calibration. (1...
40 CFR 92.121 - Oxides of nitrogen analyzer calibration and check.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 40 Protection of Environment 21 2013-07-01 2013-07-01 false Oxides of nitrogen analyzer... Procedures § 92.121 Oxides of nitrogen analyzer calibration and check. (a) Quench checks; NO X analyzer. (1... performed in step in paragraph (a)(3)(i) this section. (b) Oxides of nitrogen analyzer calibration. (1...
40 CFR 51.363 - Quality assurance.
Code of Federal Regulations, 2011 CFR
2011-07-01
... test, the evaporative system tests, and emission control component checks (as applicable); (vi...) A check of the Constant Volume Sampler flow calibration; (5) A check for the optimization of the... selection, and power absorption; (9) A check of the system's ability to accurately detect background...
Data quality assessment for comparative effectiveness research in distributed data networks
Brown, Jeffrey; Kahn, Michael; Toh, Sengwee
2015-01-01
Background Electronic health information routinely collected during healthcare delivery and reimbursement can help address the need for evidence about the real-world effectiveness, safety, and quality of medical care. Often, distributed networks that combine information from multiple sources are needed to generate this real-world evidence. Objective We provide a set of field-tested best practices and a set of recommendations for data quality checking for comparative effectiveness research (CER) in distributed data networks. Methods Explore the requirements for data quality checking and describe data quality approaches undertaken by several existing multi-site networks. Results There are no established standards regarding how to evaluate the quality of electronic health data for CER within distributed networks. Data checks of increasing complexity are often employed, ranging from consistency with syntactic rules to evaluation of semantics and consistency within and across sites. Temporal trends within and across sites are widely used, as are checks of each data refresh or update. Rates of specific events and exposures by age group, sex, and month are also common. Discussion Secondary use of electronic health data for CER holds promise but is complex, especially in distributed data networks that incorporate periodic data refreshes. The viability of a learning health system is dependent on a robust understanding of the quality, validity, and optimal secondary uses of routinely collected electronic health data within distributed health data networks. Robust data quality checking can strengthen confidence in findings based on distributed data network. PMID:23793049
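As an illustration of one check described above (synthetic claims-like records; the column names and the 3-standard-deviation rule are assumptions, not the networks' specifications), event rates are tabulated by sex and month and months that depart from a site's own history are flagged:

import pandas as pd

records = pd.DataFrame({
    "month": pd.PeriodIndex(["2014-01", "2014-01", "2014-02", "2014-02",
                             "2014-03", "2014-03", "2014-03", "2014-04"], freq="M"),
    "sex": ["F", "M", "F", "M", "F", "M", "F", "M"],
    "event": [1, 0, 0, 1, 1, 1, 1, 0],          # e.g., exposure to a drug of interest
})

rates = records.groupby(["month", "sex"])["event"].mean().unstack("sex")   # rate by stratum and month

monthly = records.groupby("month")["event"].mean()                         # temporal trend at one site
z = (monthly - monthly.mean()) / monthly.std(ddof=1)
flagged_months = monthly[z.abs() > 3]                                      # candidates for review
print(rates, flagged_months, sep="\n")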
2018-01-01
Objectives To quality assure a Trusted Third Party linked data set to prepare it for analysis. Setting Birth registration and notification records from the Office for National Statistics for all births in England 2005–2014 linked to Maternity Hospital Episode Statistics (HES) delivery records by NHS Digital using mothers’ identifiers. Participants All 6 676 912 births that occurred in England from 1 January 2005 to 31 December 2014. Primary and secondary outcome measures Every link between a registered birth and an HES delivery record for the study period was categorised as either the same baby or a different baby to the same mother, or as a wrong link, by comparing common baby data items and valid values in key fields with stepwise deterministic rules. Rates of preserved and discarded links were calculated and which features were more common in each group were assessed. Results Ninety-eight per cent of births originally linked to HES were left with one preserved link. The majority of discarded links were due to duplicate HES delivery records. Of the 4854 discarded links categorised as wrong links, clerical checks found 85% were false-positives links, 13% were quality assurance false negatives and 2% were undeterminable. Births linked using a less reliable stage of the linkage algorithm, births at home and in the London region, and with birth weight or gestational age values missing in HES were more likely to have all links discarded. Conclusions Linkage error, data quality issues, and false negatives in the quality assurance procedure were uncovered. The procedure could be improved by allowing for transposition in date fields, and more discrimination between missing and differing values. The availability of identifiers in the datasets supported clerical checking. Other research using Trusted Third Party linkage should not assume the linked dataset is error-free or optimised for their analysis, and allow sufficient resources for this. PMID:29500200
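One refinement suggested above is allowing for day/month transposition when comparing dates across linked records; a minimal, hypothetical Python sketch of such a tolerant comparison:

from datetime import date

def compare_dob(d1: date, d2: date) -> str:
    # Exact match, probable day/month transposition, or disagreement.
    if d1 == d2:
        return "agree"
    if d1.day <= 12 and d1.replace(day=d1.month, month=d1.day) == d2:
        return "probable transposition"
    return "disagree"

print(compare_dob(date(2010, 3, 5), date(2010, 5, 3)))   # probable transposition
print(compare_dob(date(2010, 3, 5), date(2010, 3, 6)))   # disagree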
Developing an approach for teaching and learning about Lewis structures
NASA Astrophysics Data System (ADS)
Kaufmann, Ilana; Hamza, Karim M.; Rundgren, Carl-Johan; Eriksson, Lars
2017-08-01
This study explores first-year university students' reasoning as they learn to draw Lewis structures. We also present a theoretical account of the formal procedure commonly taught for drawing these structures. Students' discussions during problem-solving activities were video recorded and detailed analyses of the discussions were made through the use of practical epistemology analysis (PEA). Our results show that the formal procedure was central for drawing Lewis structures, but its use varied depending on situational aspects. Commonly, the use of individual steps of the formal procedure was contingent on experiences of chemical structures, and other information such as the characteristics of the problem given. The analysis revealed a number of patterns in how students constructed, checked and modified the structure in relation to the formal procedure and the situational aspects. We suggest that explicitly teaching the formal procedure as a process of constructing, checking and modifying might be helpful for students learning to draw Lewis structures. By doing so, the students may learn to check the accuracy of the generated structure not only in relation to the octet rule and formal charge, but also to other experiences that are not explicitly included in the formal procedure.
25 CFR 224.63 - What provisions must a TERA contain?
Code of Federal Regulations, 2010 CFR
2010-04-01
... RESOURCE AGREEMENTS UNDER THE INDIAN TRIBAL ENERGY DEVELOPMENT AND SELF DETERMINATION ACT Procedures for... cancelled checks; cash receipt vouchers; copies of money orders or cashiers checks; or verification of...
NASA Technical Reports Server (NTRS)
Hooker, Stanford B. (Editor); Firestone, Elaine R. (Editor); McClain, Charles R.; Darzi, Michael; Barnes, Robert A.; Eplee, Robert E.; Firestone, James K.; Patt, Frederick S.; Robinson, Wayne D.; Schieber, Brian D.;
1996-01-01
This document provides five brief reports that address several quality control procedures under the auspices of the Calibration and Validation Element (CVE) within the Sea-viewing Wide Field-of-view Sensor (SeaWiFS) Project. Chapter 1 describes analyses of the 32 sensor engineering telemetry streams. Anomalies in any of the values may impact sensor performance in direct or indirect ways. The analyses are primarily examinations of parameter time series combined with statistical methods such as auto- and cross-correlation functions. Chapter 2 describes how the various onboard (solar and lunar) and vicarious (in situ) calibration data will be analyzed to quantify sensor degradation, if present. The analyses also include methods for detecting the influence of charged particles on sensor performance such as might be expected in the South Atlantic Anomaly (SAA). Chapter 3 discusses the quality control of the ancillary environmental data that are routinely received from other agencies or projects which are used in the atmospheric correction algorithm (total ozone, surface wind velocity, and surface pressure; surface relative humidity is also obtained, but is not used in the initial operational algorithm). Chapter 4 explains the procedures for screening level-1, level-2, and level-3 products. These quality control operations incorporate both automated and interactive procedures which check for file format errors (all levels), navigation offsets (level-1), mask and flag performance (level-2), and product anomalies (all levels). Finally, Chapter 5 discusses the match-up data set development for comparing SeaWiFS level-2 derived products with in situ observations, as well as the subsequent outlier analyses that will be used for evaluating error sources.
Patrick, Hannah; Sims, Andrew; Burn, Julie; Bousfield, Derek; Colechin, Elaine; Reay, Christopher; Alderson, Neil; Goode, Stephen; Cunningham, David; Campbell, Bruce
2013-03-01
New devices and procedures are often introduced into health services when the evidence base for their efficacy and safety is limited. The authors sought to assess the availability and accuracy of routinely collected Hospital Episodes Statistics (HES) data in the UK and their potential contribution to the monitoring of new procedures. Four years of HES data (April 2006-March 2010) were analysed to identify episodes of hospital care involving a sample of 12 new interventional procedures. HES data were cross checked against other relevant sources including national or local registers and manufacturers' information. HES records were available for all 12 procedures during the entire study period. Comparative data sources were available from national (5), local (2) and manufacturer (2) registers. Factors found to affect comparisons were miscoding, alternative coding and inconsistent use of subsidiary codes. The analysis of provider coverage showed that HES is sensitive at detecting centres which carry out procedures, but specificity is poor in some cases. Routinely collected HES data have the potential to support quality improvements and evidence-based commissioning of devices and procedures in health services but achievement of this potential depends upon the accurate coding of procedures.
Take the Reins on Model Quality with ModelCHECK and Gatekeeper
NASA Technical Reports Server (NTRS)
Jones, Corey
2012-01-01
Model quality and consistency has been an issue for us due to the diverse experience level and imaginative modeling techniques of our users. Fortunately, setting up ModelCHECK and Gatekeeper to enforce our best practices has helped greatly, but it wasn't easy. There were many challenges associated with setting up ModelCHECK and Gatekeeper including: limited documentation, restrictions within ModelCHECK, and resistance from end users. However, we consider ours a success story. In this presentation we will describe how we overcame these obstacles and present some of the details of how we configured them to work for us.
STS-34 onboard view of iodine comparator assembly used to check water quality
NASA Technical Reports Server (NTRS)
1989-01-01
STS-34 closeup view taken onboard Atlantis, Orbiter Vehicle (OV) 104, is of the iodine comparator assembly. Potable water quality is checked by comparing the water color to the color chart on the surrounding board.
NASA Technical Reports Server (NTRS)
Brenton, James C.; Barbre, Robert E.; Orcutt, John M.; Decker, Ryan K.
2018-01-01
The National Aeronautics and Space Administration's (NASA) Marshall Space Flight Center (MSFC) Natural Environments Branch (EV44) has provided atmospheric databases and analysis in support of space vehicle design and day-of-launch operations for NASA and commercial launch vehicle programs launching from the NASA Kennedy Space Center (KSC), co-located on the United States Air Force's Eastern Range (ER) at the Cape Canaveral Air Force Station. The ER is one of the most heavily instrumented sites in the United States measuring various atmospheric parameters on a continuous basis. An inherent challenge with the large databases that EV44 receives from the ER consists of ensuring erroneous data are removed from the databases, and thus excluded from launch vehicle design analyses. EV44 has put forth great effort in developing quality control (QC) procedures for individual meteorological instruments; however, no standard QC procedures for all databases currently exist resulting in QC databases that have inconsistencies in variables, methodologies, and periods of record. The goal of this activity is to use the previous efforts by EV44 to develop a standardized set of QC procedures from which to build flags within the meteorological databases from KSC and the ER, while maintaining open communication with end users from the launch community to develop ways to improve, adapt and grow the QC database. Details of the QC checks are described. The flagged data points will be plotted in a graphical user interface (GUI) as part of a manual confirmation that the flagged data do indeed need to be removed from the archive. As the rate of launches increases with additional launch vehicle programs, more emphasis is being placed to continually update and check weather databases for data quality before use in launch vehicle design and certification analyses.
Development of a TLD mailed system for remote dosimetry audit for (192)Ir HDR and PDR sources.
Roué, Amélie; Venselaar, Jack L M; Ferreira, Ivaldo H; Bridier, André; Van Dam, Jan
2007-04-01
In the framework of an ESTRO ESQUIRE project, the BRAPHYQS Physics Network and the EQUAL-ESTRO laboratory have developed a procedure for checking the absorbed dose to water in the vicinity of HDR or PDR sources using a mailed TLD system. The methodology and the materials used in the procedure are based on the existing EQUAL-ESTRO external radiotherapy dose checks. A phantom for TLD postal dose assurance service, adapted to accept catheters from different HDR afterloaders, has been developed. The phantom consists of three PMMA tubes supporting catheters placed at 120 degrees around a central TLD holder. A study on the use of LiF powder type DTL 937 (Philitech) has been performed in order to establish the TLD calibration in dose-to-water at a given distance from (192)Ir source, as well as to determine all correction factors to convert the TLD reading into absorbed dose to water. The dosimetric audit is based on the comparison between the dose to water measured with the TL dosimeter and the dose calculated by the clinical TPS. Results of the audits are classified in four different levels depending on the ratio of the measured dose to the stated dose. The total uncertainty budget in the measurement of the absorbed dose to water using TLD near an (192)Ir HDR source, including TLD reading, correction factors and TLD calibration coefficient, is determined as 3.27% (1s). To validate the procedures, the external audit was first tested among the members of the BRAPHYQS Network. Since November 2004, the test has been made available for use by all European brachytherapy centres. To date, 11 centres have participated in the checks and the results obtained are very encouraging. Nevertheless, one error detected has shown the usefulness of this audit. A method of absorbed dose to water determination in the vicinity of an (192)Ir brachytherapy source was developed for the purpose of a mailed TL dosimetry system. The accuracy of the procedure was determined. This method allows a check of the whole dosimetry chain for this type of brachytherapy afterloading system and can easily be performed by mail to any institution in the European area and elsewhere. Such an external audit can be an efficient QC method complementary to internal quality control as it can reveal some errors which are not observable by other means.
Double checking medicines: defence against error or contributory factor?
Armitage, Gerry
2008-08-01
The double checking of medicines in health care is a contestable procedure. It occupies an obvious position in health care practice and is understood to be an effective defence against medication error but the process is variable and the outcomes have not been exposed to testing. This paper presents an appraisal of the process using data from part of a larger study on the contributory factors in medication errors and their reporting. Previous research studies are reviewed; data are analysed from a review of 991 drug error reports and a subsequent series of 40 in-depth interviews with health professionals in an acute hospital in northern England. The incident reports showed that errors occurred despite double checking but that action taken did not appear to investigate the checking process. Most interview participants (34) talked extensively about double checking but believed the process to be inconsistent. Four key categories were apparent: deference to authority, reduction of responsibility, automatic processing and lack of time. Solutions to the problems were also offered, which are discussed with several recommendations. Double checking medicines should be a selective and systematic procedure informed by key principles and encompassing certain behaviours. Psychological research may be instructive in reducing checking errors but the aviation industry may also have a part to play in increasing error wisdom and reducing risk.
[Investigation of Elekta linac characteristics for VMAT].
Luo, Guangwen; Zhang, Kunyi
2012-01-01
The aim of this study is to investigate the characteristics of the Elekta delivery system for volumetric modulated arc therapy (VMAT). Five VMAT plans were delivered in service mode, and the dose rates and the speeds of the gantry and MLC leaves were analyzed from log files. Results showed that the dose rate varied among six discrete dose-rate levels. Gantry and MLC leaf speeds varied dynamically during delivery. VMAT requires the linac to dynamically control more parameters, and these key dynamic variables during VMAT delivery can be checked from log files. A quality assurance procedure should be carried out for VMAT-related parameters.
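A minimal, hypothetical Python sketch of the log-file analysis described above: given per-snapshot records of time, gantry angle, and cumulative MU (the real Elekta log format is not reproduced here), dose rate and gantry speed between samples are recovered by differencing:

import numpy as np

# columns: time (s), gantry angle (deg), cumulative MU -- illustrative values only
log = np.array([
    [0.00, 180.0, 0.0],
    [0.25, 181.2, 1.1],
    [0.50, 182.6, 2.0],
    [0.75, 183.5, 3.4],
])

dt = np.diff(log[:, 0])
gantry_speed = np.diff(log[:, 1]) / dt           # deg/s
dose_rate = np.diff(log[:, 2]) / dt * 60.0       # MU/min
print("gantry speed (deg/s):", gantry_speed.round(2))
print("dose rate (MU/min):", dose_rate.round(0))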
ERIC Educational Resources Information Center
Miller, Leila Mullooly
2013-01-01
Though preliminary research indicates Check-In/Check-Out (CICO) is an effective intervention for improving problematic behavior in a variety of populations, the literature is limited in several ways. Several studies have relied on indirect measures of behavior, such as office discipline referrals (ODRs) and teacher ratings, to determine the…
Mielmann, P
2001-08-01
The EU conditions for the approval of border inspection posts are explained, as are the modalities of checking (documentary check, identity check, physical examination) and the information and communication systems (ANIMO, Rapid Alerts, EU Internet). The differences in procedure between import, transit, and certain special cases are outlined. An overview is given of the volume, kinds of products, origins, and the import figures in relation to the transit figures. The results of the checks are discussed.
Young, Stacie T.M.; Ball, Marcael T.J.
2005-01-01
Storm runoff water-quality samples were collected as part of the State of Hawaii Department of Transportation Stormwater Monitoring Program. This program is designed to assess the effects of highway runoff and urban runoff on Halawa Stream. For this program, rainfall data were collected at two stations, continuous streamflow data at two stations, and water-quality data at five stations, which include the two continuous streamflow stations. This report summarizes rainfall, streamflow, and water-quality data collected between July 1, 2004 and June 30, 2005. A total of 15 samples was collected over three storms during July 1, 2004 to June 30, 2005. In general, an attempt was made to collect grab samples nearly simultaneously at all five stations and flow-weighted time-composite samples at the three stations equipped with automatic samplers. However, all three storms were partially sampled because either not all stations were sampled or not all composite samples were collected. Samples were analyzed for total suspended solids, total dissolved solids, nutrients, chemical oxygen demand, and selected trace metals (cadmium, chromium, copper, lead, nickel, and zinc). Chromium and nickel were added to the analysis starting October 1, 2004. Grab samples were additionally analyzed for oil and grease, total petroleum hydrocarbons, fecal coliform, and biological oxygen demand. Quality-assurance/quality-control samples were also collected during storms and during routine maintenance to verify analytical procedures and check the effectiveness of equipment-cleaning procedures.
Quality Assurance in Breast Health Care and Requirement for Accreditation in Specialized Units
Güler, Sertaç Ata; Güllüoğlu, Bahadır M.
2014-01-01
Breast health is a subject of increasing importance. The rising incidence of breast cancer and the consequent increase in mortality heighten the importance of the quality of services provided for breast health. For these reasons, minimum standards and optimum quality metrics for breast care provided to the community have been defined. The quality parameters for a breast care service cover the outcomes, the structure and the operation of services. Within this framework, the outcomes of breast health services are assessed according to clinical results, patient satisfaction and financial performance. The structure of quality services should include interdisciplinary meetings, written standards for specific procedures and standardized reporting systems. Establishing breast centers that adopt integrated multidisciplinary working principles, and maintaining them cost-effectively, are important for the operation of breast health services. The importance of a “reviewing/auditing” procedure that checks whether all of these functions in the health system are carried out at the desired level, and of an “accreditation” system indicating that the operating breast units/centers provide minimum quality adequacy in all respects, is undeniable. Accreditation systems for breast centers have been in use in the European Union and the United States for the last 5–10 years. Such systems are thought to provide standardization in breast care services and are accepted as one of the important factors contributing to the reduction in breast cancer mortality. PMID:28331658
Training and quality assurance with the Structured Clinical Interview for DSM-IV (SCID-I/P).
Ventura, J; Liberman, R P; Green, M F; Shaner, A; Mintz, J
1998-06-15
Accuracy in psychiatric diagnosis is critical for evaluating the suitability of the subjects for entry into research protocols and for establishing comparability of findings across study sites. However, training programs in the use of diagnostic instruments for research projects are not well systematized. Furthermore, little information has been published on the maintenance of interrater reliability of diagnostic assessments. At the UCLA Research Center for Major Mental Illnesses, a Training and Quality Assurance Program for SCID interviewers was used to evaluate interrater reliability and diagnostic accuracy. Although clinically experienced interviewers achieved better interrater reliability and overall diagnostic accuracy than neophyte interviewers, both groups were able to achieve and maintain high levels of interrater reliability, diagnostic accuracy, and interviewer skill. At the first quality assurance check after training, there were no significant differences between experienced and neophyte interviewers in interrater reliability or diagnostic accuracy. Standardization of training and quality assurance procedures within and across research projects may make research findings from study sites more comparable.
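The abstract reports maintaining interrater reliability of SCID diagnoses but does not state which agreement statistic was used. Cohen's kappa is the standard chance-corrected statistic for categorical diagnoses; the sketch below is only an illustration with invented ratings, not the UCLA program's actual analysis.

```python
"""Sketch: chance-corrected agreement between two diagnostic raters (Cohen's kappa)."""
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    # Chance agreement: probability both raters independently pick the same category.
    expected = sum((freq_a[c] / n) * (freq_b[c] / n) for c in set(rater_a) | set(rater_b))
    return (observed - expected) / (1.0 - expected)

if __name__ == "__main__":
    a = ["schizophrenia", "bipolar", "schizophrenia", "depression", "bipolar", "depression"]
    b = ["schizophrenia", "bipolar", "bipolar",        "depression", "bipolar", "depression"]
    print(f"kappa = {cohens_kappa(a, b):.2f}")   # invented ratings, kappa = 0.75
```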
Leonardi, Michael J; McGory, Marcia L; Ko, Clifford Y
2007-09-01
To explore hospital comparison Web sites for general surgery based on: (1) a systematic Internet search, (2) Web site quality evaluation, and (3) exploration of possible areas of improvement. A systematic Internet search was performed to identify hospital quality comparison Web sites in September 2006. Publicly available Web sites were rated on accessibility, data/statistical transparency, appropriateness, and timeliness. A sample search was performed to determine ranking consistency. Six national hospital comparison Web sites were identified: 1 government (Hospital Compare [Centers for Medicare and Medicaid Services]), 2 nonprofit (Quality Check [Joint Commission on Accreditation of Healthcare Organizations] and Hospital Quality and Safety Survey Results [Leapfrog Group]), and 3 proprietary sites (names withheld). For accessibility and data transparency, the government and nonprofit Web sites were best. For appropriateness, the proprietary Web sites were best, comparing multiple surgical procedures using a combination of process, structure, and outcome measures. However, none of these sites explicitly defined terms such as complications. Two proprietary sites allowed patients to choose ranking criteria. Most data on these sites were 2 years old or older. A sample search of 3 surgical procedures at 4 hospitals demonstrated significant inconsistencies. Patients undergoing surgery are increasingly using the Internet to compare hospital quality. However, a review of available hospital comparison Web sites shows suboptimal measures of quality and inconsistent results. This may be partially because of a lack of complete and timely data. Surgeons should be involved with quality comparison Web sites to ensure appropriate methods and criteria.
Heudorf, U; Gasteyer, S; Samoiski, Y; Voigt, K
2012-08-01
Due to the Infectious Disease Prevention Act, public health services in Germany are obliged to check infection prevention in hospitals and other medical facilities as well as in nursing homes. In Frankfurt/Main, Germany, standardized control visits have been performed for many years. In 2011 the focus was on cleaning and disinfection of surfaces. All 41 nursing homes were checked according to a standardized checklist covering quality of structure (i.e. staffing, hygiene concept), quality of process (observation of the cleaning processes in the homes) and quality of output, which was monitored by checking whether fluorescent marks, applied some days earlier, had been removed by cleaning before the final check. In more than two thirds of the homes, cleaning personnel were salaried; in one third, external personnel were hired. Of the homes, 85% provided service clothing and all of them offered protective clothing. All homes had established hygiene and cleaning concepts; however, concepts for the handling of norovirus were missing in 15% of the homes and concepts for the handling of Clostridium difficile in 30%. Regarding process quality, only half of the processes observed, i.e. cleaning of hand-contact surfaces such as handrails, washing areas and bins, were correct. Only 44% of the cleaning controls were correct, with enormous differences between the homes (0-100%). The correlation between quality of process and quality of output was significant. There was good quality of structure in the homes, but regarding quality of process and outcome there was a great need for improvement. This was especially due to faults in communication and coordination between cleaning personnel and nursing personnel. Quality of outcome was associated neither with the number of resident places nor with staffing. Thus, not only quality of structure but also quality of process and outcome should be checked by the public health services.
Fast and global authenticity screening of honey using ¹H-NMR profiling.
Spiteri, Marc; Jamin, Eric; Thomas, Freddy; Rebours, Agathe; Lees, Michèle; Rogers, Karyne M; Rutledge, Douglas N
2015-12-15
An innovative analytical approach was developed to tackle the most common adulterations and quality deviations in honey. Using proton-NMR profiling coupled to suitable quantification procedures and statistical models, analytical criteria were defined to check the authenticity of both mono- and multi-floral honey. The reference data set used was a worldwide collection of more than 800 honeys, covering most of the economically significant botanical and geographical origins. Typical plant nectar markers can be used to check monofloral honey labeling. Spectral patterns and natural variability were established for multifloral honeys, and marker signals for sugar syrups were identified by statistical comparison with a commercial dataset of ca. 200 honeys. Although the results are qualitative, spiking experiments have confirmed the ability of the method to detect sugar addition down to 10% levels in favorable cases. Within the same NMR experiments, quantification of glucose, fructose, sucrose and 5-HMF (regulated parameters) was performed. Finally markers showing the onset of fermentation are described. Copyright © 2014 Elsevier Ltd. All rights reserved.
Quality assurance in melanoma surgery: The evolving experience at a large tertiary referral centre.
Read, R L; Pasquali, S; Haydu, L; Thompson, J F; Stretch, J R; Saw, R P M; Quinn, M J; Shannon, K; Spillane, A J
2015-07-01
The quality of melanoma surgery needs to be assessed by oncological outcome and complication rates. There is no published consensus on complication rates for common melanoma surgeries, namely wide excision (WE), sentinel node biopsy (SNB) and regional lymph node dissection (RLND). Consequently there are no agreed standards by which surgeons can audit their practices. Surgical standards were proposed in 2008 following review of the literature and from expert opinion. Melanoma Institute Australia (MIA) self-reported audit data from 2011 and 2012 were compared with these standards. To quality check the self-reported audit, RLND data were extracted from the MIA database. Six surgeons performed a mean of 568 surgeries each quarter, including a mean of 106 major procedures. Following WE with primary closure or flap repair, wound infection or dehiscence occurred in <1% of cases. When skin grafting was required, non-take of >20% of the grafted area was observed in 5.9% of cases. Following SNB, wound infection and significant seroma occurred in 1.8% of cases. RLND node counts were below the 90% standard in 4 of 409 procedures. In comparison, data extraction identified 405 RLNDs, with node counts below the 90% standard in eight procedures. Two of these patients had previously undergone surgery removing nodes from the field and two had gross coalescing disease with extensive extra-nodal spread. The quality standards proposed in 2008 have been validated long-term by high volume caseloads. The data presented provide standards by which melanoma surgeons can audit their surgical performance. Copyright © 2015 Elsevier Ltd. All rights reserved.
NASA Technical Reports Server (NTRS)
Watson, Robert A.
1991-01-01
Approximate solutions of static and dynamic beam problems by the p-version of the finite element method are investigated. Within a hierarchy of engineering beam idealizations, rigorous formulations of the strain and kinetic energies for straight and circular beam elements are presented. These formulations include rotating coordinate system effects and geometric nonlinearities to allow for the evaluation of vertical axis wind turbines, the motivating problem for this research. Hierarchic finite element spaces, based on extensions of the polynomial orders used to approximate the displacement variables, are constructed. The developed models are implemented into a general purpose computer program for evaluation. Quality control procedures are examined for a diverse set of sample problems. These procedures include estimating discretization errors in energy norm and natural frequencies, performing static and dynamic equilibrium checks, observing convergence for qualities of interest, and comparison with more exacting theories and experimental data. It is demonstrated that p-extensions produce exponential rates of convergence in the approximation of strain energy and natural frequencies for the class of problems investigated.
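The abstract describes estimating discretization error in the energy norm and observing convergence as the polynomial order p increases. A minimal sketch, using invented strain-energy values and the richest solution as a stand-in reference for the exact energy, which is one common way of monitoring p-extension convergence; it does not reproduce the thesis's actual quality-control calculations.

```python
"""Sketch: monitoring p-extension convergence via the energy norm.

Assumes strain energies U_p from successive polynomial orders p (invented values).
For an energy-minimizing formulation, the relative energy-norm error of order p can
be approximated as sqrt((U_ref - U_p) / U_ref), taking the richest solution as the
reference; exponential convergence appears as a steady drop of log(error) with p.
"""
import math

strain_energy = {1: 0.9120, 2: 0.9835, 3: 0.9968, 4: 0.9993, 5: 0.99985}  # invented

def energy_norm_errors(energies):
    p_max = max(energies)
    u_ref = energies[p_max]             # best available approximation of the exact energy
    return {p: math.sqrt(max(u_ref - u, 0.0) / u_ref)
            for p, u in sorted(energies.items()) if p != p_max}

if __name__ == "__main__":
    errs = energy_norm_errors(strain_energy)
    for p, e in errs.items():
        print(f"p={p}: estimated relative energy-norm error = {e:.4f}")
    drops = [math.log(errs[p] / errs[p + 1]) for p in sorted(errs) if p + 1 in errs]
    print("log-error drop per order:", [f"{d:.2f}" for d in drops])
```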
Design Considerations for Human Rating of Liquid Rocket Engines
NASA Technical Reports Server (NTRS)
Parkinson, Douglas
2010-01-01
I. Human-rating is specific to each engine:
  a. The context of the program/project must be understood.
  b. The engine cannot be discussed independently of the vehicle and mission.
II. Utilize a logical combination of design, manufacturing, and test approaches:
  a. Design: 1) It is crucial to know the potential ways a system can fail, and how a failure can propagate; 2) Fault avoidance, fault tolerance, DFMR, and caution and warning all have roles to play.
  b. Manufacturing and assembly: 1) As-built vs. as-designed; 2) Review procedures for assembly and maintenance periodically; 3) Keep personnel trained and certified.
  c. There is no substitute for test: 1) Analytical tools are constantly advancing, but test data are still needed for anchoring assumptions; 2) Demonstrate robustness and explore sensitivities; 3) Ideally, flight will be encompassed by ground test experience.
III. Consistency and repeatability are key in production:
  a. Maintain robust processes and procedures for inspection and quality control based upon development and qualification experience.
  b. Establish methods to "spot check" quality and consistency in parts: 1) Dedicated ground test engines; 2) Random components pulled from the line/lot to go through "enhanced" testing.
Rizvi, Zainab; Usmani, Rabia Arshed; Rizvi, Amna; Wazir, Salim; Zahra, Taskeen; Rasool, Hafza
2017-01-01
Quality of any service is the most important aspect for the manufacturer as well as the consumer. The primary objective of any nation's health system is to provide the highest quality health care services to its patients. The objective of this study was to assess the quality of the diagnostic fine needle aspiration cytology service in a tertiary care hospital. As patients' perspectives provide valuable information on the quality of the process, patient perception in terms of satisfaction with the service was measured. In this cross-sectional analytical study, 291 patients undergoing fine needle aspiration cytology in Mayo Hospital were selected by a systematic sampling technique. Information regarding patient satisfaction with four dimensions of the service quality process, namely "procedure, sterilization, conduct and competency of doctor", was collected through a questionnaire-based interview. The questionnaire was developed on the SERVQUAL model, a measurement tool for quality assessment of services provided to patients. All items were assessed on a 2-point Likert scale (0=dissatisfied, 1=satisfied). Frequencies and percentages of satisfied and dissatisfied patients were recorded for each item, and all items in each dimension were scored. If the sum of all item scores in a dimension was ≥60% of the maximum, the dimension was rated 'good quality'; otherwise (<60%) it was rated 'poor quality'. Data were analysed using Epi Info 3.5.1, and Fisher's exact test was applied to check statistical significance (p<0.05). Out of the 4 dimensions of the service quality process, procedure (48.8%), sterilization (51.5%) and practitioner conduct (50.9%) were perceived as 'poor' by the patients. Only practitioner competency (67.4%) was perceived as 'good'. Comparison of the dimension scores with the overall level of patient satisfaction revealed that all 4 dimensions were significantly related to patient dissatisfaction (p<.05). The study suggests that the service quality of therapeutic and diagnostic procedures in public hospitals should be routinely monitored from the patients' point of view, as most aspects of service quality in public hospitals of Pakistan require improvement. In this way, patient satisfaction with services in public hospitals can be improved.
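The abstract describes a simple scoring rule: each item is a 0/1 satisfaction response, and a dimension is rated 'good quality' when at least 60% of its item responses are satisfied. A minimal sketch of that arithmetic, with invented responses.

```python
"""Sketch: scoring service-quality dimensions from 2-point Likert items.

Each response is 1 (satisfied) or 0 (dissatisfied). A dimension is rated 'good quality'
when >= 60% of all its item responses are satisfied, as described in the abstract.
The responses below are invented for illustration.
"""

responses = {
    # dimension -> list of per-respondent item scores (0/1)
    "procedure":     [[1, 0, 1], [0, 0, 1], [1, 1, 0]],
    "sterilization": [[1, 1, 1], [1, 0, 1], [1, 1, 0]],
}

def score_dimension(item_scores):
    flat = [v for respondent in item_scores for v in respondent]
    pct = 100.0 * sum(flat) / len(flat)
    return pct, ("good quality" if pct >= 60.0 else "poor quality")

if __name__ == "__main__":
    for dim, scores in responses.items():
        pct, label = score_dimension(scores)
        print(f"{dim}: {pct:.1f}% satisfied -> {label}")
```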
A rigorous approach to self-checking programming
NASA Technical Reports Server (NTRS)
Hua, Kien A.; Abraham, Jacob A.
1986-01-01
Self-checking programming is shown to be an effective concurrent error detection technique. The reliability of a self-checking program, however, relies on the quality of its assertion statements. A self-checking program written without formal guidelines could provide poor coverage of errors. A constructive technique for self-checking programming is presented. A Structured Program Design Language (SPDL) suitable for self-checking software development is defined. A set of formal rules was also developed that allows the transformation of SPDL designs into self-checking designs to be done in a systematic manner.
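The abstract advocates assertion-based self-checking programs. The sketch below, written in Python rather than SPDL (whose rules are not reproduced here), only illustrates the general idea of executable assertions guarding a computation so that an error is detected concurrently with execution.

```python
"""Sketch: an assertion-based self-checking routine (illustration only)."""
from collections import Counter

def self_checking_sort(values):
    # Precondition: input is a list of numbers.
    assert all(isinstance(v, (int, float)) for v in values), "non-numeric input"
    original = list(values)

    result = sorted(values)

    # Postcondition 1: output is ordered.
    assert all(result[i] <= result[i + 1] for i in range(len(result) - 1)), "output not ordered"
    # Postcondition 2: output is a permutation of the input (no loss or duplication).
    assert Counter(result) == Counter(original), "output not a permutation of input"
    return result

if __name__ == "__main__":
    print(self_checking_sort([3, 1, 2, 2]))
```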
2009-09-22
test officer). At a minimum, the CIL will be conducted at the operator level (often referred to as “field strip and clean”). More detailed...is checked before each shot is fired. Use a boresight (optical or laser) as necessary to check alignment to the target aiming point if the barrel is...for this test should be representative of production. All components must be present, including projectile paint and markings, fuzes, any tool
Automobile driver on-road performance test. Volume 3, Examiner's manual
DOT National Transportation Integrated Search
1981-09-30
This report provides procedures for administering and scoring the Automobile Driver On-Road Performance Test (ADOPT). The ADOPT checks 21 separate driving performances. Performances are checked at pre-determined locations along a 10-minute route and ...
Automobile driver on-road performance test. Volume 2, Administrator's manual
DOT National Transportation Integrated Search
1981-09-30
This report provides procedures for setting up, administering, and scoring the Automobile Driver On-Road Performance Test (ADOPT). The ADOPT checks 21 separate driving performances. Performances are checked at pre-determined locations along a 10-minu...
31 CFR 240.6 - Provisional credit; first examination; declination; final payment.
Code of Federal Regulations, 2010 CFR
2010-07-01
... alteration without examining the original check or a better quality image of the check and Treasury is on... after the check is presented to a Federal Reserve Processing Center for payment, Treasury will be deemed...
31 CFR 240.6 - Provisional credit; first examination; declination; final payment.
Code of Federal Regulations, 2011 CFR
2011-07-01
... alteration without examining the original check or a better quality image of the check and Treasury is on... after the check is presented to a Federal Reserve Processing Center for payment, Treasury will be deemed...
31 CFR 240.6 - Provisional credit; first examination; declination; final payment.
Code of Federal Regulations, 2014 CFR
2014-07-01
... alteration without examining the original check or a better quality image of the check and Treasury is on... after the check is presented to a Federal Reserve Processing Center for payment, Treasury will be deemed...
31 CFR 240.6 - Provisional credit; first examination; declination; final payment.
Code of Federal Regulations, 2013 CFR
2013-07-01
... alteration without examining the original check or a better quality image of the check and Treasury is on... after the check is presented to a Federal Reserve Processing Center for payment, Treasury will be deemed...
31 CFR 240.6 - Provisional credit; first examination; declination; final payment.
Code of Federal Regulations, 2012 CFR
2012-07-01
... alteration without examining the original check or a better quality image of the check and Treasury is on... after the check is presented to a Federal Reserve Processing Center for payment, Treasury will be deemed...
Chen, Chun-Hung; Li, Cheng-Chang; Chou, Chuan-Yu; Chen, Shu-Hwa
2009-08-01
This project was designed to improve the low validity rate among nurses responsible for operating single-door autoclave sterilizers in the operating room. By investigating the current status, we found that the nursing staff's validity rate of knowledge of the autoclave sterilizer was 85%, and the validity rate for practical operating checks was only 80%. This was due to a lack of in-service education. Problems with operation included: 1. unsafe behaviors - not following the standard procedure, lacking relevant operating knowledge, and the absence of a check form; 2. an unsafe environment - the steam supply piping was typically not covered and lacked operating marks. Recommended improvement measures included: 1. holding in-service education; 2. generating an operation procedure flow chart; 3. implementing obstacle-eliminating procedures; 4. covering piping to prevent fire and burns; 5. performing regular checks to ensure all procedures are followed. Following the intervention, nursing staff knowledge rose from 85% to 100%, while the operating check validity rate rose from 80% to 100%. These changes ensure a safer operating room environment and help facilities move toward a zero accident rate in the healthcare environment.
Ford, Eric C; Terezakis, Stephanie; Souranis, Annette; Harris, Kendra; Gay, Hiram; Mutic, Sasa
2012-11-01
To quantify the error-detection effectiveness of commonly used quality control (QC) measures. We analyzed incidents from 2007-2010 logged into voluntary in-house electronic incident learning systems at 2 academic radiation oncology clinics. None of the incidents resulted in patient harm. Each incident was graded for potential severity using the French Nuclear Safety Authority scoring scale; high potential severity incidents (score >3) were considered, along with a subset of 30 randomly chosen low severity incidents. Each report was evaluated to identify which of 15 common QC checks could have detected it. The effectiveness was calculated, defined as the percentage of incidents that each QC measure could detect, both for individual QC checks and for combinations of checks. In total, 4407 incidents were reported, 292 of which had high potential severity. High- and low-severity incidents were detectable by 4.0 ± 2.3 (mean ± SD) and 2.6 ± 1.4 QC checks, respectively (P<.001). All individual checks were less than 50% sensitive with the exception of pretreatment plan review by a physicist (63%). An effectiveness of 97% was achieved with 7 checks used in combination and was not further improved with more checks. The combination of checks with the highest effectiveness includes physics plan review, physician plan review, Electronic Portal Imaging Device-based in vivo portal dosimetry, radiation therapist timeout, weekly physics chart check, the use of checklists, port films, and source-to-skin distance checks. Some commonly used QC checks such as pretreatment intensity modulated radiation therapy QA do not substantially add to the ability to detect errors in these data. The effectiveness of QC measures in radiation oncology depends sensitively on which checks are used and in which combinations. A small percentage of errors cannot be detected by any of the standard formal QC checks currently in broad use, suggesting that further improvements are needed. These data require confirmation with a broader incident-reporting database. Copyright © 2012 Elsevier Inc. All rights reserved.
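The study's central calculation is the fraction of incidents that each QC check, or each combination of checks, could have detected. A minimal sketch of that calculation over an invented incident-by-check detection table; it does not reproduce the study's data or its 15 checks.

```python
"""Sketch: effectiveness of QC checks from an incident-by-check detection table.

Each row is an incident, each column a QC check; True means that check could have
caught the incident. The table and check names are invented for illustration.
"""
import itertools

checks = ["physics plan review", "physician plan review", "in vivo dosimetry", "chart check"]
detectable = [
    [True,  False, True,  False],
    [True,  True,  False, False],
    [False, False, True,  True ],
    [False, False, False, False],   # an incident none of these checks catches
]

def effectiveness(columns):
    """Fraction of incidents caught by at least one of the given check columns."""
    caught = sum(any(row[c] for c in columns) for row in detectable)
    return caught / len(detectable)

if __name__ == "__main__":
    for i, name in enumerate(checks):
        print(f"{name}: {100 * effectiveness([i]):.0f}%")
    best_pair = max(itertools.combinations(range(len(checks)), 2), key=effectiveness)
    print("best pair:", [checks[i] for i in best_pair],
          f"-> {100 * effectiveness(best_pair):.0f}%")
```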
Charnock, P; Jones, R; Fazakerley, J; Wilde, R; Dunn, A F
2011-09-01
Data are currently being collected from hospital radiology information systems in the North West of the UK for the purposes of both clinical audit and patient dose audit. Could these data also be used to satisfy quality assurance (QA) requirements according to UK guidance? From 2008 to 2009, 731 653 records were submitted from 8 hospitals in North West England. For automatic exposure control (AEC) QA, the protocol from Institute of Physics and Engineering in Medicine (IPEM) report 91 recommends that the milliampere-seconds (mAs) be monitored for repeatability and reproducibility using a suitable phantom at 70-81 kV. Abdomen AP and chest PA examinations were analysed to find the most common kilovoltage used; these records were then used to plot the average monthly mAs over time. IPEM report 91 also recommends that a range of commonly used clinical settings be used to check output reproducibility and repeatability. For each tube, the dose area product values were plotted over time for the two most common exposure factor sets. Results show that it is possible to perform performance checks of AEC systems; however, more work is required to be able to monitor tube output performance. Procedurally, the management system requires work, and the benefits to the workflow would need to be demonstrated.
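The approach amounts to trending post-exposure mAs from RIS records at a fixed kV and flagging drift. A minimal sketch with invented monthly values; the ±10% action level is an illustrative choice, not an IPEM recommendation taken from the abstract.

```python
"""Sketch: monitoring AEC post-exposure mAs from RIS data for drift.

monthly_mas holds the mean recorded mAs for chest PA exposures at the most common kV,
one value per month (invented numbers).
"""

monthly_mas = {"2008-01": 2.10, "2008-02": 2.05, "2008-03": 2.12,
               "2008-04": 2.08, "2008-05": 2.45, "2008-06": 2.09}

def flag_drift(series, baseline_months=3, tolerance=0.10):
    months = sorted(series)
    baseline = sum(series[m] for m in months[:baseline_months]) / baseline_months
    flagged = []
    for m in months[baseline_months:]:
        deviation = (series[m] - baseline) / baseline
        if abs(deviation) > tolerance:
            flagged.append((m, deviation))
    return baseline, flagged

if __name__ == "__main__":
    baseline, flagged = flag_drift(monthly_mas)
    print(f"baseline mAs = {baseline:.2f}")
    for month, dev in flagged:
        print(f"{month}: {100 * dev:+.1f}% from baseline -> investigate AEC")
```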
The relation of mechanical properties of wood and nosebar pressure in the production of veneer
Charles W. McMillin
1958-01-01
Observations of checking frequency, depth of check penetration, veneer thickness, and surface quality were made at 20 machining conditions. An inverse relationship between depth of check and frequency of checking was established. The effect of cutting temperature was demonstrated, and strength in compression perpendicular to the grain, tension perpendicular to the...
[Coronary artery bypass surgery: methods of performance monitoring and quality control].
Albert, A; Sergeant, P; Ennker, J
2009-10-01
The strength of coronary bypass operations depends on the preservation of their benefits regarding freedom from symptoms, quality of life and survival over decades. Significant variability in the results of an operative intervention according to the hospital or the operating surgeon is considered a weakness of the procedure. External quality assurance aims to create a transparent market for services through comparable hospital rankings; widely available information and competition are expected to promote improvement of overall quality. The structured dialogue acts as a control instrument for the BQS (Federal Office for Quality Assurance). It is launched in case of deviations from the standard references or statistically significant differences between the results of operations in any hospital and the national average. Compared with external control, hospital-internal control is better able to reach a medically useful statement regarding treatment results and to correct mistakes in time. An online information portal based on a departmental database (data warehouse, data mart) is an attractive solution for the physician to be informed transparently and in a timely manner about variability in performance. The individual surgeon significantly influences the short- and long-term treatment results; accordingly, selection, targeted training and performance measurements are necessary. Strict risk management and failure analysis of individual cases are among the methods of internal quality control aiming to identify and correct inadequacies in the system and in the course of treatment. According to international as well as our own experience, at least 30% of deaths after bypass operations are avoidable. Functioning quality control is especially important in minimally invasive interventions because they are often technically more demanding than conventional procedures. In the field of OPCAB surgery, the special advantages of the procedure can be exploited to achieve nearly complete avoidance of postoperative stroke by combining the procedure with an aorta no-touch technique. The long-term success of the bypass operation depends on the type of bypass material in addition to many other factors; both internal mammary arteries are considered the most durable. Using an operation preparation check contributes to operative success.
A comprehensive and efficient daily quality assurance for PBS proton therapy
NASA Astrophysics Data System (ADS)
Actis, O.; Meer, D.; König, S.; Weber, D. C.; Mayor, A.
2017-03-01
There are several general recommendations for quality assurance (QA) measures, which have to be performed at proton therapy centres. However, almost each centre uses a different therapy system. In particular, there is no standard procedure for centres employing pencil beam scanning and each centre applies a specific QA program. Gantry 2 is an operating therapy system which was developed at PSI and relies on the most advanced technological innovations. We developed a comprehensive daily QA program in order to verify the main beam characteristics to assure the functionality of the therapy delivery system and the patient safety system. The daily QA program entails new hardware and software solutions for a highly efficient clinical operation. In this paper, we describe a dosimetric phantom used for verifying the most critical beam parameters and the software architecture developed for a fully automated QA procedure. The connection between our QA software and the database allows us to store the data collected on a daily basis and use it for trend analysis over longer periods of time. All the data presented here have been collected during a time span of over two years, since the beginning of the Gantry 2 clinical operation in 2013. Our procedure operates in a stable way and delivers the expected beam quality. The daily QA program takes only 20 min. At the same time, the comprehensive approach allows us to avoid most of the weekly and monthly QA checks and increases the clinical beam availability.
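The paper describes an automated daily QA program that evaluates measured beam parameters against baselines and stores the results for long-term trend analysis. A minimal sketch of that pattern under stated assumptions: the parameter names, baselines and tolerances below are invented placeholders and do not reproduce the Gantry 2 QA specification or its software architecture.

```python
"""Sketch: automated daily QA evaluation with simple trend storage."""
import csv, datetime, io

BASELINES = {"range_mm": 150.0, "spot_size_mm": 3.5, "output_cGy_per_MU": 1.000}
TOLERANCES = {"range_mm": 1.0, "spot_size_mm": 0.3, "output_cGy_per_MU": 0.02}

def evaluate(measurements):
    """Compare one day's measurements against baselines and tolerances."""
    report = {}
    for name, value in measurements.items():
        deviation = value - BASELINES[name]
        report[name] = (value, deviation, abs(deviation) <= TOLERANCES[name])
    return report

def append_trend(report, stream):
    """Append today's results to a CSV stream standing in for the trend database."""
    writer = csv.writer(stream)
    today = datetime.date.today().isoformat()
    for name, (value, dev, ok) in report.items():
        writer.writerow([today, name, f"{value:.3f}", f"{dev:+.3f}", "PASS" if ok else "FAIL"])

if __name__ == "__main__":
    todays = {"range_mm": 150.4, "spot_size_mm": 3.9, "output_cGy_per_MU": 1.008}
    buffer = io.StringIO()
    append_trend(evaluate(todays), buffer)
    print(buffer.getvalue())
```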
Levels at streamflow gaging stations
Kennedy, E.J.
1990-01-01
This manual establishes the surveying procedures for (1) setting gages at a streamflow gaging station to datum and (2) checking the gages periodically for errors caused by vertical movement of the structures that support them. Surveying terms and concepts are explained, and procedures for testing, adjusting, and operating the instruments are described in detail. Notekeeping, adjusting level circuits, checking gages, summarizing results, locating the nearest National Geodetic Vertical Datum of 1929 bench mark, and relating the gage datum to the national datum are also described.
Integrating automated support for a software management cycle into the TAME system
NASA Technical Reports Server (NTRS)
Sunazuka, Toshihiko; Basili, Victor R.
1989-01-01
Software managers are interested in the quantitative management of software quality, cost and progress. An integrated software management methodology, which can be applied throughout the software life cycle for any number of purposes, is required. The TAME (Tailoring A Measurement Environment) methodology is based on the improvement paradigm and the goal/question/metric (GQM) paradigm. This methodology helps generate a software engineering process and measurement environment based on the project characteristics. The SQMAR (software quality measurement and assurance technology) is a software quality metric system and methodology applied to the development processes. It is based on the feed-forward control principle. Quality target setting is carried out before the plan-do-check-action activities are performed. These methodologies are integrated to realize goal-oriented measurement, process control and visual management. A metric setting procedure based on the GQM paradigm, a management system called the software management cycle (SMC), and its application to a case study based on NASA/SEL data are discussed. The expected effects of SMC are quality improvement, managerial cost reduction, accumulation and reuse of experience, and a highly visual management reporting system.
NASA Technical Reports Server (NTRS)
1973-01-01
Information required to calibrate, functionally check, and operate the Instrumentation Branch equipment on the NASA-6 aircraft is provided. All procedures required for preflight checks and in-flight operation of the NASA-6 atmospheric measuring station are given. The calibration section is intended for only that portion of the system maintained and calibrated by IN-MSD-12 Systems Operation contractor personnel. Maintenance is not included.
Arts, Daniëlle; de Keizer, Nicolette; Scheffer, Gert-Jan; de Jonge, Evert
2002-05-01
To analyse the quality of data used to measure severity of illness in the Dutch National Intensive Care Evaluation (NICE) registry, after implementation of quality improving procedures. Data were re-abstracted from the paper records of patients or the Patient Data Management System and compared to the data contained in the registry. The re-abstracted data were considered to be the gold standard. ICUs of nine Dutch hospitals that had been collecting data for the NICE registry for at least 1 year. The mean percentages of inaccurate and incomplete data, per hospital, over all variables, were 6.1%+/-4.4 (SD) and 2.7%+/-4.4 (SD), respectively. The mean difference in severity of illness scores between registry data and re-abstracted data was 0.2 points for APACHE II and 0.4 points for SAPS II. The mean difference in predicted mortality according to APACHE II and SAPS II between registry data and re-abstracted data was 0.4% and 0.02%, respectively. The current data quality of the NICE registry is good and justifies evaluative research. These positive results might be explained by the implementation of several quality assurance procedures in the NICE registry, such as training and automatic data checks. Electronic supplementary material to this paper can be obtained by using the Springer LINK server located at http://dx.doi.org/10.1007/s00134-002-1272-z
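The study compares registry values against re-abstracted gold-standard records and reports percentages of inaccurate and incomplete data plus mean score differences. A minimal sketch of those metrics over invented records; the variable names and counting conventions are assumptions, not the NICE registry's actual definitions.

```python
"""Sketch: data-quality metrics for a registry against re-abstracted records.

A missing registry value (None) counts as incomplete; a present value differing from
the re-abstracted gold standard counts as inaccurate. Records below are invented.
"""

registry      = [{"age": 64, "apache2": 18}, {"age": 71, "apache2": None}, {"age": 55, "apache2": 25}]
gold_standard = [{"age": 64, "apache2": 19}, {"age": 71, "apache2": 22},   {"age": 55, "apache2": 25}]

def quality_metrics(reg, gold, variables):
    incomplete = inaccurate = total = 0
    for r, g in zip(reg, gold):
        for var in variables:
            total += 1
            if r[var] is None:
                incomplete += 1
            elif r[var] != g[var]:
                inaccurate += 1
    return 100.0 * incomplete / total, 100.0 * inaccurate / total

if __name__ == "__main__":
    pct_incomplete, pct_inaccurate = quality_metrics(registry, gold_standard, ["age", "apache2"])
    print(f"incomplete: {pct_incomplete:.1f}%  inaccurate: {pct_inaccurate:.1f}%")
    diffs = [r["apache2"] - g["apache2"] for r, g in zip(registry, gold_standard)
             if r["apache2"] is not None]
    print(f"mean APACHE II difference: {sum(diffs) / len(diffs):+.2f} points")
```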
Sasaki, Kei; Sasaki, Hiroto; Takahashi, Atsuki; Kang, Siu; Yuasa, Tetsuya; Kato, Ryuji
2016-02-01
In recent years, cell and tissue therapy in regenerative medicine have advanced rapidly towards commercialization. However, conventional invasive cell quality assessment is incompatible with direct evaluation of the cells produced for such therapies, especially in the case of regenerative medicine products. Our group has demonstrated the potential of quantitative assessment of cell quality, using information obtained from cell images, for non-invasive real-time evaluation of regenerative medicine products. However, images of cells in the confluent state are often difficult to evaluate, because accurate recognition of cells is technically difficult and the morphological features of confluent cells are non-characteristic. To overcome these challenges, we developed a new image-processing algorithm, heterogeneity of orientation (H-Orient) processing, to describe the heterogeneous density of cells in the confluent state. In this algorithm, we introduced a Hessian calculation that converts pixel intensity data to orientation data and a statistical profiling calculation that evaluates the heterogeneity of orientations within an image, generating novel parameters that yield a quantitative profile of an image. Using such parameters, we tested the algorithm's performance in discriminating different qualities of cellular images with three types of clinically important cell quality check (QC) models: remaining lifespan check (QC1), manipulation error check (QC2), and differentiation potential check (QC3). Our results show that our orientation analysis algorithm could predict with high accuracy the outcomes of all types of cellular quality checks (>84% average accuracy with cross-validation). Copyright © 2015 The Society for Biotechnology, Japan. Published by Elsevier B.V. All rights reserved.
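The algorithm converts pixel intensities to local orientations via a Hessian calculation and then profiles the heterogeneity of those orientations. The rough numpy sketch below is a simplification under stated assumptions: local orientation is taken from the Hessian eigenvector with the largest-magnitude eigenvalue, and heterogeneity is summarized as the circular variance of the doubled orientation angles; the paper's actual parameterization and statistical profile are not reproduced.

```python
"""Rough sketch of an orientation-heterogeneity feature for a 2-D grayscale image."""
import numpy as np

def orientation_heterogeneity(image):
    iy, ix = np.gradient(image.astype(float))
    ixy, ixx = np.gradient(ix)                  # second derivatives d2I/dxdy, d2I/dx2
    iyy, _ = np.gradient(iy)                    # d2I/dy2
    hess = np.stack([np.stack([ixx, ixy], axis=-1),
                     np.stack([ixy, iyy], axis=-1)], axis=-2)   # per-pixel 2x2 Hessian
    w, v = np.linalg.eigh(hess)                 # eigenvalues ascending, eigenvectors in columns
    angles = np.arctan2(v[..., 1, :], v[..., 0, :])             # angle of each eigenvector
    dominant_is_last = np.abs(w[..., 1]) >= np.abs(w[..., 0])
    theta = np.where(dominant_is_last, angles[..., 1], angles[..., 0])
    # Orientations are pi-periodic, so evaluate circular spread on doubled angles:
    # 0 -> all orientations aligned, values near 1 -> orientations fully heterogeneous.
    return 1.0 - np.abs(np.mean(np.exp(2j * theta)))

if __name__ == "__main__":
    y, x = np.mgrid[0:64, 0:64]
    striped = np.sin(x / 3.0)                                   # strongly aligned texture
    noisy = np.random.default_rng(0).normal(size=(64, 64))      # no preferred orientation
    print(f"striped image heterogeneity: {orientation_heterogeneity(striped):.2f}")
    print(f"noise image heterogeneity:   {orientation_heterogeneity(noisy):.2f}")
```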
Financial Record Checking in Surveys: Do Prompts Improve Data Quality?
ERIC Educational Resources Information Center
Murphy, Joe; Rosen, Jeffrey; Richards, Ashley; Riley, Sarah; Peytchev, Andy; Lindblad, Mark
2016-01-01
Self-reports of financial information in surveys, such as wealth, income, and assets, are particularly prone to inaccuracy. We sought to improve the quality of financial information captured in a survey conducted by phone and in person by encouraging respondents to check records when reporting on income and assets. We investigated whether…
Statistical analysis of global horizontal solar irradiation GHI in Fez city, Morocco
NASA Astrophysics Data System (ADS)
Bounoua, Z.; Mechaqrane, A.
2018-05-01
Accurate knowledge of the solar energy reaching the ground is necessary for sizing solar installations and optimizing their performance. This paper describes a statistical analysis of the global horizontal solar irradiation (GHI) at Fez city, Morocco. For better reliability, we first applied a set of check procedures to test the quality of the hourly GHI measurements and then eliminated the erroneous values, which are generally due to measurement or cosine-effect errors. The statistical analysis shows that the annual mean daily GHI is approximately 5 kWh/m²/day. Monthly mean daily values and other parameters are also calculated.
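The paper applies quality-check procedures to hourly GHI before analysis. A minimal sketch of the kind of plausibility limits commonly used for such data; the limits below are illustrative (loosely BSRN-style) assumptions, not the tests actually used in the study.

```python
"""Sketch: simple plausibility checks for hourly GHI measurements.

Each record supplies the measured GHI (W/m2) and the cosine of the solar zenith angle
for that hour; the top-of-atmosphere horizontal irradiance is approximated with a fixed
solar constant. Limit coefficients are illustrative only.
"""

SOLAR_CONSTANT = 1361.0  # W/m2, annual mean

def qc_flags(ghi, cos_zenith):
    toa_horizontal = SOLAR_CONSTANT * max(cos_zenith, 0.0)
    flags = []
    if ghi < -4.0:
        flags.append("below physically possible minimum")
    if ghi > 1.2 * toa_horizontal + 50.0:
        flags.append("above physically possible maximum")
    if cos_zenith <= 0.0 and ghi > 10.0:
        flags.append("significant irradiance reported at night")
    return flags

if __name__ == "__main__":
    records = [
        {"hour": "12:00", "ghi": 950.0, "cos_zenith": 0.90},   # plausible
        {"hour": "07:00", "ghi": 600.0, "cos_zenith": 0.15},   # cosine-effect suspect
        {"hour": "23:00", "ghi": 35.0,  "cos_zenith": -0.30},  # night-time error
    ]
    for rec in records:
        print(rec["hour"], "->", qc_flags(rec["ghi"], rec["cos_zenith"]) or ["OK"])
```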
Code of Federal Regulations, 2010 CFR
2010-01-01
... principles of the teaching-learning process; (ii) Teaching methods and procedures; and (iii) The instructor... certificate holder's policies and procedures. (3) The applicable methods, procedures, and techniques for... approved methods, procedures, and limitations for performing the required normal, abnormal, and emergency...
Code of Federal Regulations, 2014 CFR
2014-01-01
...-learning process; (ii) Teaching methods and procedures; and (iii) The instructor-student relationship. (d... procedures. (3) The appropriate methods, procedures, and techniques for conducting flight instruction. (4... corrective action in the case of unsatisfactory training progress. (6) The approved methods, procedures, and...
Code of Federal Regulations, 2014 CFR
2014-01-01
... principles of the teaching-learning process; (ii) Teaching methods and procedures; and (iii) The instructor... certificate holder's policies and procedures. (3) The applicable methods, procedures, and techniques for... approved methods, procedures, and limitations for performing the required normal, abnormal, and emergency...
Code of Federal Regulations, 2012 CFR
2012-01-01
... principles of the teaching-learning process; (ii) Teaching methods and procedures; and (iii) The instructor... certificate holder's policies and procedures. (3) The applicable methods, procedures, and techniques for... approved methods, procedures, and limitations for performing the required normal, abnormal, and emergency...
Code of Federal Regulations, 2013 CFR
2013-01-01
... principles of the teaching-learning process; (ii) Teaching methods and procedures; and (iii) The instructor... certificate holder's policies and procedures. (3) The applicable methods, procedures, and techniques for... approved methods, procedures, and limitations for performing the required normal, abnormal, and emergency...
Code of Federal Regulations, 2011 CFR
2011-01-01
... principles of the teaching-learning process; (ii) Teaching methods and procedures; and (iii) The instructor... certificate holder's policies and procedures. (3) The applicable methods, procedures, and techniques for... approved methods, procedures, and limitations for performing the required normal, abnormal, and emergency...
Indirect check of the stability of the reference ion chamber used for accelerator output calibration
NASA Astrophysics Data System (ADS)
Kang, Sei-Kwon; Yoon, Jai-Woong; Park, Soah; Hwang, Taejin; Cheong, Kwang-Ho; Han, Tae Jin; Kim, Haeyoung; Lee, Me-Yeon; Kim, Kyoung Ju; Bae, Hoonsik
2014-11-01
A linear accelerator's output is periodically checked by using a reference ion chamber which is also periodically calibrated at the accredited standard dosimetry laboratories. We suggest a simple procedure for checking the chamber's stability between calibrations by comparison with another ion chamber. To identify the long-term stability of chambers, we collected and assessed the dose-to-water conversion factors provided by standard laboratories for three chambers during a period of four years. To develop the chamber constancy check program, we used one Farmer-type reference ion chamber FC65-G, two ion chambers (CC13a and CC13b) and one CC01 ion chamber (IBA). Under the accelerator, each chamber was placed inside the solid phantom and irradiated; the experimental configurations were identical. To check the variation in charge collection of the reference chamber, we monitored the ratios of the FC65-G values over each chamber reading. Based on the error propagation of the two chamber ratios, we estimated the uncertainty of the output calibration from the chamber variation. The calibration factors provided for the three chambers showed 0.04-0.12% standard deviations during four years. For procedure development, the reading ratios of FC65-G over CCxx showed very good stability; the ratios of FC65-G over CC13a, CC13b and CC01 varied less than 0.059, 0.087 and 0.248%, respectively, over five measurements. By ascribing possible uncertainties of the ratio to the reference chamber alone, we could conservatively check the stability of the reference chamber for treatment safety. An extension of the chamber calibration period was also evaluated. In conclusion, we designed a stability check procedure for the reference chamber based on a reading ratio of two chambers. This could help the user assess the chamber stability between periodic chamber calibration, and the associated patient treatment could be carried out with enhanced safety.
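The procedure reduces to monitoring the ratio of the reference chamber reading to a second chamber's reading under identical setups and conservatively attributing the ratio's spread to the reference chamber alone. A minimal sketch with invented readings; the 0.3% action level is an illustrative choice, not a value from the paper.

```python
"""Sketch: constancy check of a reference chamber via reading ratios.

Invented charge readings (nC) for a Farmer-type reference chamber and a comparison
chamber irradiated in an identical setup on five occasions. The relative standard
deviation of the ratio is ascribed to the reference chamber alone, as in the abstract.
"""
import statistics

reference  = [12.345, 12.351, 12.340, 12.348, 12.343]   # nC, invented
comparison = [10.112, 10.118, 10.109, 10.116, 10.111]   # nC, invented

def ratio_stability(ref, comp, action_level=0.003):
    ratios = [a / b for a, b in zip(ref, comp)]
    mean_ratio = statistics.mean(ratios)
    rel_sd = statistics.stdev(ratios) / mean_ratio
    return mean_ratio, rel_sd, rel_sd <= action_level

if __name__ == "__main__":
    mean_ratio, rel_sd, ok = ratio_stability(reference, comparison)
    print(f"mean ratio = {mean_ratio:.5f}, relative SD = {100 * rel_sd:.3f}%")
    print("reference chamber stable" if ok else "investigate reference chamber")
```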
Dosimetry audits and intercomparisons in radiotherapy: A Malaysian profile
NASA Astrophysics Data System (ADS)
M. Noor, Noramaliza; Nisbet, A.; Hussein, M.; Chu S, Sarene; Kadni, T.; Abdullah, N.; Bradley, D. A.
2017-11-01
Quality audits and intercomparisons are important in ensuring control of processes in any system of endeavour. Present interest is in control of dosimetry in teletherapy, there being a need to assess the extent to which there is consistent radiation dose delivery to the patient. In this study we review significant factors that impact upon radiotherapy dosimetry, focusing upon the example situation of radiotherapy delivery in Malaysia, examining existing literature in support of such efforts. A number of recommendations are made to provide for increased quality assurance and control. In addition, the first level of intercomparison audit, i.e. measuring beam output under reference conditions at eight selected Malaysian radiotherapy centres, is carried out, using 9 μm core diameter Ge-doped silica fibres (Ge-9 μm). The results of Malaysian Secondary Standard Dosimetry Laboratory (SSDL) participation in the IAEA/WHO TLD postal dose audit services during the period between 2011 and 2015 are also discussed. In conclusion, following review of the development of dosimetry audits and the conduct of one such exercise in Malaysia, it is apparent that regular periodic radiotherapy audits and intercomparison programmes should be strongly supported and implemented worldwide. The programmes to date demonstrate these to be a good indicator of errors and of consistency between centres. A total of eight beams have been checked in eight Malaysian radiotherapy centres. One out of the eight beams checked produced an unacceptable deviation; this was found to be due to unfamiliarity with the irradiation procedures. Prior to a repeat measurement, the mean ratio of measured to quoted dose was found to be 0.99 with a standard deviation of 3%. Subsequent to the repeat measurement, the mean distribution was 1.00, and the standard deviation was 1.3%.
Quality assurance methodology for Varian RapidArc treatment plans
Cirino, Eileen T.; Xiong, Li; Mower, Herbert W.
2010-01-01
With the commercial introduction of the Varian RapidArc, a new modality for treatment planning and delivery, the need has arisen for consistent and efficient techniques for performing patient‐specific quality assurance (QA) tests. In this paper we present our methodology for a RapidArc treatment plan QA procedure. For our measurements we used a 2D diode array (MapCHECK) embedded at 5 cm water equivalent depth in MapPHAN 5 phantom and an Exradin A16 ion chamber placed in six different positions in a cylindrical homogeneous phantom (QUASAR). We also checked the MUs for the RapidArc plans by using independent software (RadCalc). The agreement between Eclipse calculations and MapCHECK/MapPHAN 5 measurements was evaluated using both absolute distance‐to‐agreement (DTA) and gamma index with 10% dose threshold (TH), 3% dose difference (DD), and 3 mm DTA. The average agreement was 94.4% for the DTA approach and 96.3% for the gamma index approach. In high‐dose areas, the discrepancy between calculations and ion chamber measurements using the QUASAR phantom was within 4.5% for prostate cases. For the RadCalc calculations, we used the average SSD along the arc; however, for some patients the agreement for the MUs obtained with RadCalc versus Eclipse was inadequate (discrepancy>5%). In these cases, the plan was divided into partial arc plans so that RadCalc could perform a better estimation of the MUs. The discrepancy was further reduced to within ~4% using this approach. Regardless of the variation in prescribed dose and location of the treated areas, we obtained very good results for all patients studied in this paper. PACS number: 87.55.Qr
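The plan comparison described above uses a gamma index with a 10% low-dose threshold, 3% dose difference and 3 mm distance-to-agreement. The sketch below is a simplified, brute-force global 2-D gamma for two dose distributions on the same regular grid; it is not the MapCHECK vendor analysis, and interpolation of the evaluated dose is omitted for brevity.

```python
"""Simplified global 2-D gamma index (3%/3 mm, 10% low-dose threshold)."""
import numpy as np

def gamma_pass_rate(reference, evaluated, spacing_mm=1.0,
                    dose_diff=0.03, dta_mm=3.0, threshold=0.10):
    ref_max = reference.max()
    search = int(np.ceil(3 * dta_mm / spacing_mm))           # search radius in pixels
    ny, nx = reference.shape
    passed = total = 0
    for i in range(ny):
        for j in range(nx):
            d_ref = reference[i, j]
            if d_ref < threshold * ref_max:                   # skip low-dose region
                continue
            total += 1
            best = np.inf
            for di in range(-search, search + 1):
                for dj in range(-search, search + 1):
                    ii, jj = i + di, j + dj
                    if not (0 <= ii < ny and 0 <= jj < nx):
                        continue
                    dist2 = (di ** 2 + dj ** 2) * spacing_mm ** 2
                    diff = evaluated[ii, jj] - d_ref
                    gamma2 = dist2 / dta_mm ** 2 + diff ** 2 / (dose_diff * ref_max) ** 2
                    best = min(best, gamma2)
            if best <= 1.0:
                passed += 1
    return 100.0 * passed / total if total else float("nan")

if __name__ == "__main__":
    y, x = np.mgrid[0:40, 0:40]
    reference = np.exp(-((x - 20) ** 2 + (y - 20) ** 2) / 200.0)          # synthetic dose
    evaluated = 1.02 * np.exp(-((x - 20.5) ** 2 + (y - 20) ** 2) / 200.0) # shifted, scaled
    print(f"gamma pass rate: {gamma_pass_rate(reference, evaluated):.1f}%")
```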
Code of Federal Regulations, 2013 CFR
2013-10-01
...) Obtain prior, written authorization from the individual for the State registry check, for the FBI... by the applicant; and (g) Ensure that an individual, for whom the results of a required state or FBI...
49 CFR 1546.203 - Acceptance and screening of checked baggage.
Code of Federal Regulations, 2010 CFR
2010-10-01
... foreign air carrier must refuse to transport any individual's checked baggage or property if the...) TRANSPORTATION SECURITY ADMINISTRATION, DEPARTMENT OF HOMELAND SECURITY CIVIL AVIATION SECURITY FOREIGN AIR... deterring the carriage of any explosive or incendiary. Each foreign air carrier must use the procedures...
NASA Technical Reports Server (NTRS)
Brenton, James C.; Barbre. Robert E., Jr.; Decker, Ryan K.; Orcutt, John M.
2018-01-01
The National Aeronautics and Space Administration's (NASA) Marshall Space Flight Center (MSFC) Natural Environments Branch (EV44) has provided atmospheric databases and analysis in support of space vehicle design and day-of-launch operations for NASA and commercial launch vehicle programs launching from the NASA Kennedy Space Center (KSC), co-located on the United States Air Force's Eastern Range (ER) at the Cape Canaveral Air Force Station. The ER complex is one of the most heavily instrumented sites in the United States with over 31 towers measuring various atmospheric parameters on a continuous basis. An inherent challenge with large data sets is ensuring that erroneous data are removed from the databases and thus excluded from launch vehicle design analyses. EV44 has put forth great effort in developing quality control (QC) procedures for individual meteorological instruments; however, no standard QC procedure covering all databases currently exists, resulting in QC databases that have inconsistencies in variables, methodologies, and periods of record. The goal of this activity is to use the previous efforts by EV44 to develop a standardized set of QC procedures from which to build meteorological databases from KSC and the ER, while maintaining open communication with end users from the launch community to develop ways to improve, adapt and grow the QC database. Details of the QC procedures will be described. As the rate of launches increases with additional launch vehicle programs, it is becoming more important that weather databases are continually updated and checked for data quality before use in launch vehicle design and certification analyses.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vonach, H.; Tagesen, S.
Starting with a discussion of the requirements and goals for high-quality general-purpose evaluations, the paper will describe the procedures chosen in our evaluation work for JEFF for producing new general evaluations with complete covariance information for all cross sections (file 3 data). Key problems essential for the goal of making the best possible use of the existing theoretical and experimental knowledge on neutron interactions with the respective nuclide will be addressed, especially the problem of assigning covariances to calculated cross sections, necessary checking procedures for all experimental data and various possibilities to amend the experimental database beyond the obvious use of EXFOR data for the respective cross sections. In this respect, both the use of elemental cross sections in isotopic evaluations and the use of implicit cross-section data (that is, data which can be converted into cross sections by simple methods) will be discussed in some detail.
Presley, Todd K.; Jamison, Marcael T.J.; Young-Smith, Stacie T. M.
2006-01-01
Storm runoff water-quality samples were collected as part of the State of Hawaii Department of Transportation Stormwater Monitoring Program. This program is designed to assess the effects of highway runoff and urban runoff on Halawa Stream. For this program, rainfall data were collected at two stations, continuous discharge data at one station, continuous streamflow data at two stations, and water-quality data at five stations, which include the continuous discharge and streamflow stations. This report summarizes rainfall, discharge, streamflow, and water-quality data collected between July 1, 2005 and June 30, 2006. A total of 23 samples was collected over five storms during July 1, 2005 to June 30, 2006. The goal was to collect grab samples nearly simultaneously at all five stations, and flow-weighted time-composite samples at the three stations equipped with automatic samplers; however, all five storms were partially sampled owing to lack of flow at the time of sampling at some sites, or because some samples collected by the automatic sampler did not represent water from the storm. Samples were analyzed for total suspended solids, total dissolved solids, nutrients, chemical oxygen demand, and selected trace metals (cadmium, chromium, copper, lead, nickel, and zinc). Additionally, grab samples were analyzed for oil and grease, total petroleum hydrocarbons, fecal coliform, and biological oxygen demand. Quality-assurance/quality-control samples were also collected during storms and during routine maintenance to verify analytical procedures and check the effectiveness of equipment-cleaning procedures.
[Yes, we should keep ABO agglutination test within bedside transfusion checks].
Daurat, G
2008-11-01
ABO incompatible transfusions are still a frequent cause of serious adverse transfusion reactions. The bedside check is intended to detect patient identification errors and prevent ABO mismatch. France is one of the few countries that include an ABO agglutination test for red blood cells in bedside checks. Evaluation of this ABO agglutination test, performed with a special card, shows that, in the field, despite frequent mishandling by users, it can detect up to 93% of ABO incompatibilities. This is not enough to rely on this test alone for bedside checks. However, linking it with another check, which verifies that the right blood is given to the right patient, raises the sensitivity of the whole bedside procedure to an estimated 99.65% for the detection of ABO incompatibilities. This linkage was introduced into French regulation in 2003. Since then, the incidence of ABO incompatible transfusions has decreased dramatically and faster than in any other country, so France now probably has the lowest rate of ABO incompatible transfusions. Investigation of the few ABO accidents that still occur shows that professionals had always bypassed this linkage. On the other hand, introducing bedside recipient and blood product barcode or radio-chip checks in all 1500 French hospitals, though technically possible, would provide very little enhancement and lead to major difficulties and expenses. Linkage of the ABO agglutination test to patient and blood checks within the bedside procedure has proved to be efficient and should be kept.
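The quoted combined sensitivity follows from simple arithmetic if the two bedside checks are assumed to fail independently. The 93% figure for the agglutination card comes from the abstract; the 95% sensitivity assumed for the identity check is a hypothetical value chosen so that the combination reproduces the quoted 99.65%, and the independence assumption is itself a simplification.

```python
"""Sketch: combined sensitivity of two independent bedside checks."""

def combined_sensitivity(*sensitivities):
    miss = 1.0
    for s in sensitivities:
        miss *= (1.0 - s)          # probability that every check misses the error
    return 1.0 - miss

if __name__ == "__main__":
    # 0.93 from the abstract; 0.95 is a hypothetical second-check sensitivity.
    print(f"{100 * combined_sensitivity(0.93, 0.95):.2f}%")   # -> 99.65%
```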
40 CFR 63.9816 - What records must I keep?
Code of Federal Regulations, 2010 CFR
2010-07-01
... with a catalytic oxidizer, records of annual checks of catalyst activity levels and subsequent... approved alternative monitoring method(s) or test procedure(s). (8) Records of maintenance activities and...
40 CFR 63.9816 - What records must I keep?
Code of Federal Regulations, 2011 CFR
2011-07-01
... with a catalytic oxidizer, records of annual checks of catalyst activity levels and subsequent... approved alternative monitoring method(s) or test procedure(s). (8) Records of maintenance activities and...
1983-12-01
[Garbled OCR fragment of an SH-3H pre-flight/functional check procedure; recoverable items include rotor head checks, drop tank pumps and indicators, watching for plane captain signals, cyclic control response checks, observing 1/8 revolution of the rotary wing, and checking that the rotor head area is clear.]
Machine vision method for online surface inspection of easy open can ends
NASA Astrophysics Data System (ADS)
Mariño, Perfecto; Pastoriza, Vicente; Santamaría, Miguel
2006-10-01
Easy open can end manufacturing process in the food canning sector currently makes use of a manual, non-destructive testing procedure to guarantee can end repair coating quality. This surface inspection is based on a visual inspection made by human inspectors. Due to the high production rate (100 to 500 ends per minute) only a small part of each lot is verified (statistical sampling), then an automatic, online, inspection system, based on machine vision, has been developed to improve this quality control. The inspection system uses a fuzzy model to make the acceptance/rejection decision for each can end from the information obtained by the vision sensor. In this work, the inspection method is presented. This surface inspection system checks the total production, classifies the ends in agreement with an expert human inspector, supplies interpretability to the operators in order to find out the failure causes and reduce mean time to repair during failures, and allows to modify the minimum can end repair coating quality.
Collapse Mechanisms Of Masonry Structures
NASA Astrophysics Data System (ADS)
Zuccaro, G.; Rauci, M.
2008-07-01
The paper outlines a possible approach to typology recognition, safety check analyses and/or damage measuring taking advantage by a multimedia tool (MEDEA), tracing a guided procedure useful for seismic safety check evaluation and post event macroseismic assessment. A list of the possible collapse mechanisms observed in the post event surveys on masonry structures and a complete abacus of the damages are provided in MEDEA. In this tool a possible combination between a set of damage typologies and each collapse mechanism is supplied in order to improve the homogeneity of the damages interpretation. On the other hand recent researches of one of the author have selected a number of possible typological vulnerability factors of masonry buildings, these are listed in the paper and combined with potential collapse mechanisms to be activated under seismic excitation. The procedure takes place from simple structural behavior models, derived from the Umbria-Marche earthquake observations, and tested after the San Giuliano di Puglia event; it provides the basis either for safety check analyses of the existing buildings or for post-event structural safety assessment and economic damage evaluation. In the paper taking advantage of MEDEA mechanisms analysis, mainly developed for the post event safety check surveyors training, a simple logic path is traced in order to approach the evaluation of the masonry building safety check. The procedure starts from the identification of the typological vulnerability factors to derive the potential collapse mechanisms and their collapse multipliers and finally addresses the simplest and cheapest strengthening techniques to reduce the original vulnerability. The procedure has been introduced in the Guide Lines of the Regione Campania for the professionals in charge of the safety check analyses and the buildings strengthening in application of the national mitigation campaign introduced by the Ordinance of the Central Government n. 3362/03. The main cases of out of plane mechanisms are analyzed and a possible innovative theory for masonry building vulnerability assessment, based on limit state analyses, is outlined. The paper report the first step of a research granted by the Department of the Civil Protection to Reluis within the research program of Line 10.
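The procedure combines typological vulnerability factors with potential out-of-plane collapse mechanisms and their collapse multipliers from limit analysis. The sketch below covers only the most elementary such mechanism, a rigid free-standing wall overturning about its base edge, where moment equilibrium gives a kinematic multiplier of thickness over height for a horizontal load applied at the centroid; MEDEA's actual mechanism catalogue and vulnerability factors are not reproduced.

```python
"""Sketch: collapse multiplier for simple rigid-block overturning (limit analysis).

For a monolithic free-standing wall of thickness t and height h, weight W at the
centroid and horizontal force lambda*W also at the centroid, moment equilibrium about
the toe (W*t/2 = lambda*W*h/2) gives lambda_0 = t / h.
"""

def overturning_multiplier(thickness_m, height_m):
    return thickness_m / height_m

if __name__ == "__main__":
    for t, h in [(0.40, 3.0), (0.60, 3.0), (0.40, 6.0)]:
        print(f"t={t:.2f} m, h={h:.2f} m -> lambda_0 = {overturning_multiplier(t, h):.3f}")
```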
Collapse Mechanisms Of Masonry Structures
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zuccaro, G.; Rauci, M.
2008-07-08
The paper outlines a possible approach to typology recognition, safety-check analyses and damage measurement that takes advantage of a multimedia tool (MEDEA), tracing a guided procedure useful for seismic safety-check evaluation and post-event macroseismic assessment. MEDEA provides a list of the collapse mechanisms observed in post-event surveys of masonry structures and a complete abacus of damages. In this tool, a possible combination between a set of damage typologies and each collapse mechanism is supplied in order to improve the homogeneity of damage interpretation. In parallel, recent research by one of the authors has identified a number of possible typological vulnerability factors for masonry buildings; these are listed in the paper and combined with the potential collapse mechanisms that can be activated under seismic excitation. The procedure starts from simple structural behavior models, derived from observations after the Umbria-Marche earthquake and tested after the San Giuliano di Puglia event; it provides the basis either for safety-check analyses of existing buildings or for post-event structural safety assessment and economic damage evaluation. Taking advantage of the MEDEA mechanism analysis, developed mainly for training post-event safety-check surveyors, the paper traces a simple logical path for evaluating the safety of masonry buildings. The procedure starts from the identification of the typological vulnerability factors, derives the potential collapse mechanisms and their collapse multipliers, and finally addresses the simplest and cheapest strengthening techniques to reduce the original vulnerability. The procedure has been introduced in the Guidelines of the Regione Campania for the professionals in charge of safety-check analyses and building strengthening under the national mitigation campaign introduced by Ordinance n. 3362/03 of the central government. The main cases of out-of-plane mechanisms are analyzed, and a possible innovative theory for masonry building vulnerability assessment, based on limit-state analyses, is outlined. The paper reports the first step of research granted by the Department of Civil Protection to Reluis within the research program of Line 10.
40 CFR 63.1046 - Test methods and procedures.
Code of Federal Regulations, 2013 CFR
2013-07-01
... Method 21 of 40 CFR part 60, appendix A. Each potential leak interface (i.e., a location where organic vapor leakage could occur) on the cover and associated closure devices shall be checked. Potential leak... 60, appendix A. (7) Each potential leak interface shall be checked by traversing the instrument probe...
40 CFR 63.1046 - Test methods and procedures.
Code of Federal Regulations, 2011 CFR
2011-07-01
... Method 21 of 40 CFR part 60, appendix A. Each potential leak interface (i.e., a location where organic vapor leakage could occur) on the cover and associated closure devices shall be checked. Potential leak... 60, appendix A. (7) Each potential leak interface shall be checked by traversing the instrument probe...
40 CFR 63.1046 - Test methods and procedures.
Code of Federal Regulations, 2012 CFR
2012-07-01
... Method 21 of 40 CFR part 60, appendix A. Each potential leak interface (i.e., a location where organic vapor leakage could occur) on the cover and associated closure devices shall be checked. Potential leak... 60, appendix A. (7) Each potential leak interface shall be checked by traversing the instrument probe...
AUTOMOTIVE DIESEL MAINTENANCE 2. UNIT XXV, MICHIGAN/CLARK TRANSMISSION--TROUBLESHOOTING.
ERIC Educational Resources Information Center
Minnesota State Dept. of Education, St. Paul. Div. of Vocational and Technical Education.
THIS MODULE OF A 25-MODULE COURSE IS DESIGNED TO DEVELOP AN UNDERSTANDING OF TROUBLESHOOTING PROCEDURES FOR A SPECIFIC TRANSMISSION USED ON DIESEL POWERED EQUIPMENT. TOPICS ARE (1) PRELIMINARY CHECKS, (2) PRESSURE AND OIL FLOW CHECKS, (3) TROUBLESHOOTING TABLES, (4) TROUBLESHOOTING VEHICLES UNDER FIELD CONDITIONS, AND (5) ANALYZING UNACCEPTABLE…
40 CFR 86.327-79 - Quench checks; NOX analyzer.
Code of Federal Regulations, 2012 CFR
2012-07-01
... (CONTINUED) CONTROL OF EMISSIONS FROM NEW AND IN-USE HIGHWAY VEHICLES AND ENGINES Emission Regulations for New Gasoline-Fueled and Diesel-Fueled Heavy-Duty Engines; Gaseous Exhaust Test Procedures § 86.327-79..., recalibrate and repeat the quench check. (4) Prior to testing, the difference between the calculated NOX...
40 CFR 86.327-79 - Quench checks; NOX analyzer.
Code of Federal Regulations, 2013 CFR
2013-07-01
... (CONTINUED) CONTROL OF EMISSIONS FROM NEW AND IN-USE HIGHWAY VEHICLES AND ENGINES Emission Regulations for New Gasoline-Fueled and Diesel-Fueled Heavy-Duty Engines; Gaseous Exhaust Test Procedures § 86.327-79..., recalibrate and repeat the quench check. (4) Prior to testing, the difference between the calculated NOX...
40 CFR 86.540-90 - Exhaust sample analysis.
Code of Federal Regulations, 2012 CFR
2012-07-01
..., if appropriate, NOX: (1) Zero the analyzers and obtain a stable zero reading. Recheck after tests. (2... actual concentrations on chart. (3) Check zeros; repeat the procedure in paragraphs (a) (1) and (2) of... appropriate, NOX. concentrations of samples. (6) Check zero and span points. If difference is greater than 2...
40 CFR 91.413 - Exhaust sample procedure-gaseous components.
Code of Federal Regulations, 2011 CFR
2011-07-01
... that may occur between the pre and post checks is not specified. However, the difference between pre... drift nor the span drift between the pre-analysis and post-analysis checks on any range used may exceed... PROGRAMS (CONTINUED) CONTROL OF EMISSIONS FROM MARINE SPARK-IGNITION ENGINES Gaseous Exhaust Test...
40 CFR 91.413 - Exhaust sample procedure-gaseous components.
Code of Federal Regulations, 2014 CFR
2014-07-01
... that may occur between the pre and post checks is not specified. However, the difference between pre... drift nor the span drift between the pre-analysis and post-analysis checks on any range used may exceed... PROGRAMS (CONTINUED) CONTROL OF EMISSIONS FROM MARINE SPARK-IGNITION ENGINES Gaseous Exhaust Test...
40 CFR 91.413 - Exhaust sample procedure-gaseous components.
Code of Federal Regulations, 2010 CFR
2010-07-01
... that may occur between the pre and post checks is not specified. However, the difference between pre... drift nor the span drift between the pre-analysis and post-analysis checks on any range used may exceed... PROGRAMS (CONTINUED) CONTROL OF EMISSIONS FROM MARINE SPARK-IGNITION ENGINES Gaseous Exhaust Test...
40 CFR 91.413 - Exhaust sample procedure-gaseous components.
Code of Federal Regulations, 2012 CFR
2012-07-01
... that may occur between the pre and post checks is not specified. However, the difference between pre... drift nor the span drift between the pre-analysis and post-analysis checks on any range used may exceed... PROGRAMS (CONTINUED) CONTROL OF EMISSIONS FROM MARINE SPARK-IGNITION ENGINES Gaseous Exhaust Test...
40 CFR 91.413 - Exhaust sample procedure-gaseous components.
Code of Federal Regulations, 2013 CFR
2013-07-01
... that may occur between the pre and post checks is not specified. However, the difference between pre... drift nor the span drift between the pre-analysis and post-analysis checks on any range used may exceed... PROGRAMS (CONTINUED) CONTROL OF EMISSIONS FROM MARINE SPARK-IGNITION ENGINES Gaseous Exhaust Test...
[Quality of information in the process of informed consent for anesthesia].
Guillén-Perales, José; Luna-Maldonado, Aurelio; Fernández-Prada, María; Guillén-Solvas, José Francisco; Bueno-Cavanillas, Aurora
2013-11-01
To assess the quality of the information that patients receive in the informed consent document signed prior to surgery. Cross-sectional study of a sample of cancer patients admitted for surgery at the University Hospital San Cecilio of Granada in 2011. After checking the inclusion criteria and obtaining their consent, demographic and procedure data were collected, and an ad hoc questionnaire was administered to assess the quality and comprehensiveness of the information provided in the informed consent. 150 patients were studied. The majority (109 of 150) said they had received sufficient information, in appropriate language, and had had the opportunity to ask questions, but only 44.7% correctly answered three or more questions related to anesthesia. A university education level, knowledge of the intervention, information about anesthesia-related problems, and appropriate language were associated with better comprehension. Although informed consent was obtained systematically, half of the patients did not comprehend the anesthesia risks. The quality of the responses was associated mainly with variables related to the information received, not with patient characteristics. Copyright © 2013 AEC. Published by Elsevier Espana. All rights reserved.
Double checking: a second look
Chreim, Samia; Forster, Alan
2015-01-01
Abstract Rationale, aims and objectives Double checking is a standard practice in many areas of health care, notwithstanding the lack of evidence supporting its efficacy. We ask in this study: 'How do front-line practitioners conceptualize double checking? What are the weaknesses of double checking? What alternate views of double checking could render it a more robust process?' Method This is part of a larger qualitative study based on 85 semi-structured interviews of health care practitioners in general internal medicine and in obstetrics and neonatology; thematic analysis of the transcribed interviews was undertaken. Inductive and deductive themes are reported. Results Weaknesses in the double checking process include inconsistent conceptualization of double checking, double (or more) checking as a costly and time-consuming procedure, double checking trusted as an accepted and stand-alone process, and double checking as preventing the reporting of near misses. Alternate views of double checking that would render it a more robust process include recognizing that double checking requires training and a dedicated environment, introducing automated double checking, and expanding double checking beyond error detection. These results are linked with the concepts of the collective efficiency-thoroughness trade-off (ETTO), an in-family approach, and resilience. Conclusion(s) Double checking deserves more questioning, as there are limitations to the process. Practitioners could view double checking through alternate lenses, and thus help strengthen this ubiquitous practice that is rarely challenged. PMID:26568537
Parks, Donovan H.; Imelfort, Michael; Skennerton, Connor T.; Hugenholtz, Philip; Tyson, Gene W.
2015-01-01
Large-scale recovery of genomes from isolates, single cells, and metagenomic data has been made possible by advances in computational methods and substantial reductions in sequencing costs. Although this increasing breadth of draft genomes is providing key information regarding the evolutionary and functional diversity of microbial life, it has become impractical to finish all available reference genomes. Making robust biological inferences from draft genomes requires accurate estimates of their completeness and contamination. Current methods for assessing genome quality are ad hoc and generally make use of a limited number of “marker” genes conserved across all bacterial or archaeal genomes. Here we introduce CheckM, an automated method for assessing the quality of a genome using a broader set of marker genes specific to the position of a genome within a reference genome tree and information about the collocation of these genes. We demonstrate the effectiveness of CheckM using synthetic data and a wide range of isolate-, single-cell-, and metagenome-derived genomes. CheckM is shown to provide accurate estimates of genome completeness and contamination and to outperform existing approaches. Using CheckM, we identify a diverse range of errors currently impacting publicly available isolate genomes and demonstrate that genomes obtained from single cells and metagenomic data vary substantially in quality. In order to facilitate the use of draft genomes, we propose an objective measure of genome quality that can be used to select genomes suitable for specific gene- and genome-centric analyses of microbial communities. PMID:25977477
Parks, Donovan H; Imelfort, Michael; Skennerton, Connor T; Hugenholtz, Philip; Tyson, Gene W
2015-07-01
Large-scale recovery of genomes from isolates, single cells, and metagenomic data has been made possible by advances in computational methods and substantial reductions in sequencing costs. Although this increasing breadth of draft genomes is providing key information regarding the evolutionary and functional diversity of microbial life, it has become impractical to finish all available reference genomes. Making robust biological inferences from draft genomes requires accurate estimates of their completeness and contamination. Current methods for assessing genome quality are ad hoc and generally make use of a limited number of "marker" genes conserved across all bacterial or archaeal genomes. Here we introduce CheckM, an automated method for assessing the quality of a genome using a broader set of marker genes specific to the position of a genome within a reference genome tree and information about the collocation of these genes. We demonstrate the effectiveness of CheckM using synthetic data and a wide range of isolate-, single-cell-, and metagenome-derived genomes. CheckM is shown to provide accurate estimates of genome completeness and contamination and to outperform existing approaches. Using CheckM, we identify a diverse range of errors currently impacting publicly available isolate genomes and demonstrate that genomes obtained from single cells and metagenomic data vary substantially in quality. In order to facilitate the use of draft genomes, we propose an objective measure of genome quality that can be used to select genomes suitable for specific gene- and genome-centric analyses of microbial communities. © 2015 Parks et al.; Published by Cold Spring Harbor Laboratory Press.
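The sketch below illustrates, very roughly, how completeness and contamination can be estimated from collocated marker-gene sets in the spirit of this approach; the marker sets and copy counts are invented, and this is not the CheckM code.

```python
# Minimal sketch of marker-set based completeness/contamination estimates,
# roughly in the spirit of CheckM. Marker sets and observed counts are invented.

marker_sets = [{"rpsC", "rpsE", "rplB"}, {"gyrA", "gyrB"}, {"recA"}]
observed_copies = {"rpsC": 1, "rplB": 2, "gyrB": 1, "recA": 1}  # gene -> copies in the bin

def completeness(marker_sets, observed):
    # mean fraction of markers in each collocated set that is seen at least once
    fracs = [sum(1 for g in s if observed.get(g, 0) >= 1) / len(s) for s in marker_sets]
    return 100.0 * sum(fracs) / len(fracs)

def contamination(marker_sets, observed):
    # extra copies beyond the first, normalized per marker set
    extras = [sum(max(observed.get(g, 0) - 1, 0) for g in s) / len(s) for s in marker_sets]
    return 100.0 * sum(extras) / len(extras)

print(f"completeness ~ {completeness(marker_sets, observed_copies):.1f}%")
print(f"contamination ~ {contamination(marker_sets, observed_copies):.1f}%")
```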
Check & Connect: The Importance of Relationships for Promoting Engagement with School
ERIC Educational Resources Information Center
Anderson, Amy R.; Christenson, Sandra L.; Sinclair, Mary F.; Lehr, Camilla A.
2004-01-01
The purpose of this study was to examine whether the closeness and quality of relationships between intervention staff and students involved in the Check & Connect program were associated with improved student engagement in school. Participants included 80 elementary and middle school students referred to the Check & Connect program for poor…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harpool, K; De La Fuente Herman, T; Ahmad, S
Purpose: To evaluate the performance of a two-dimensional (2D) array diode detector for geometric and dosimetric quality assurance (QA) tests of high-dose-rate (HDR) brachytherapy with an Ir-192 source. Methods: A phantom setup was designed that encapsulated a 2D array diode detector (MapCheck2) and a catheter for the HDR brachytherapy Ir-192 source. This setup was used to perform both geometric and dosimetric quality assurance for the HDR Ir-192 source. The geometric tests included (a) measurement of the position of the source and (b) spacing between different dwell positions. The dosimetric tests included (a) linearity of output with time, (b) end effect, and (c) relative dose verification. The 2D dose distribution measured with MapCheck2 was used to perform these tests. The results from MapCheck2 were compared with the corresponding quality assurance tests performed with Gafchromic film and a well ionization chamber. Results: The position of the source and the spacing between different dwell positions were reproducible within 1 mm by measuring the position of maximal dose with MapCheck2, in contrast to the film, which showed a blurred image of the dwell positions due to its limited sensitivity to irradiation. The linearity of the dose with dwell time measured with MapCheck2 was superior to the linearity measured with the ionization chamber because of the higher signal-to-noise ratio of the diode readings. MapCheck2 provided a more accurate measurement of the end effect, with an uncertainty < 1.5%, compared with the ionization chamber uncertainty of 3%. Although MapCheck2 does not provide absolute calibration dosimetry for the activity of the source, it provides an accurate tool for relative dose verification in HDR brachytherapy. Conclusion: The 2D array diode detector provides a practical, compact, and accurate tool for performing quality assurance for HDR brachytherapy with an Ir-192 source. The diodes in MapCheck2 have high radiation sensitivity and linearity, superior to the Gafchromic film and ionization chamber used for geometric and dosimetric QA in HDR brachytherapy, respectively.
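One way the output linearity and end effect can be extracted is a straight-line fit of detector reading versus programmed dwell time, with the end effect taken as intercept over slope; the sketch below uses invented readings and is not the authors' analysis code.

```python
# Minimal sketch: estimating output linearity and the end (transit) effect from
# readings at several dwell times via a fit reading = k * t + c, so the
# effective extra time is c / k. The readings below are invented.
import numpy as np

dwell_times_s = np.array([5.0, 10.0, 20.0, 40.0, 80.0])
readings_nC   = np.array([1.06, 2.05, 4.04, 8.03, 16.02])   # hypothetical diode readings

k, c = np.polyfit(dwell_times_s, readings_nC, 1)   # slope (nC/s), intercept (nC)
end_effect_s = c / k                                # dwell-time offset due to source transit
residual = readings_nC - (k * dwell_times_s + c)    # deviation from perfect linearity
print(f"end effect ~ {end_effect_s:.2f} s, max linearity residual {abs(residual).max():.3f} nC")
```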
49 CFR 40.235 - What are the requirements for proper use and care of ASDs?
Code of Federal Regulations, 2014 CFR
2014-10-01
... ASD on the CPL. Your QAP must specify the methods used for quality control checks, temperatures at which the ASD must be stored and used, the shelf life of the device, and environmental conditions (e.g... the specified quality control checks or that has passed its expiration date. (e) As an employer, with...
49 CFR 40.235 - What are the requirements for proper use and care of ASDs?
Code of Federal Regulations, 2012 CFR
2012-10-01
... ASD on the CPL. Your QAP must specify the methods used for quality control checks, temperatures at which the ASD must be stored and used, the shelf life of the device, and environmental conditions (e.g... the specified quality control checks or that has passed its expiration date. (e) As an employer, with...
49 CFR 40.235 - What are the requirements for proper use and care of ASDs?
Code of Federal Regulations, 2011 CFR
2011-10-01
... ASD on the CPL. Your QAP must specify the methods used for quality control checks, temperatures at which the ASD must be stored and used, the shelf life of the device, and environmental conditions (e.g... the specified quality control checks or that has passed its expiration date. (e) As an employer, with...
49 CFR 40.235 - What are the requirements for proper use and care of ASDs?
Code of Federal Regulations, 2010 CFR
2010-10-01
... ASD on the CPL. Your QAP must specify the methods used for quality control checks, temperatures at which the ASD must be stored and used, the shelf life of the device, and environmental conditions (e.g... the specified quality control checks or that has passed its expiration date. (e) As an employer, with...
49 CFR 40.235 - What are the requirements for proper use and care of ASDs?
Code of Federal Regulations, 2013 CFR
2013-10-01
... ASD on the CPL. Your QAP must specify the methods used for quality control checks, temperatures at which the ASD must be stored and used, the shelf life of the device, and environmental conditions (e.g... the specified quality control checks or that has passed its expiration date. (e) As an employer, with...
ERIC Educational Resources Information Center
GRITTNER, FRANK; PAVLAT, RUSSELL
IN ORDER TO ASSIST NON-TECHNICAL PEOPLE IN SCHOOLS TO CONDUCT A FIELD CHECK OF LANGUAGE LABORATORY EQUIPMENT BEFORE THEY MAKE FINAL PAYMENTS, THIS MANUAL OFFERS CRITERIA, TESTS, AND METHODS OF SCORING THE QUALITY OF THE EQUIPMENT. CHECKLISTS ARE PROVIDED FOR EVALUATING CONSOLE FUNCTIONS, TAPE RECORDERS, AMPLIFIERS, SOUND QUALITY (INCLUDING…
SU-E-T-420: Failure Effects Mode Analysis for Trigeminal Neuralgia Frameless Radiosurgery
DOE Office of Scientific and Technical Information (OSTI.GOV)
Howe, J
2015-06-15
Purpose: Functional radiosurgery has been used successfully in the treatment of trigeminal neuralgia (TN) but presents significant challenges to ensuring the high prescription dose is delivered accurately. A review of existing practice should help direct the focus of quality improvement for this treatment regimen. Method: Failure modes and effects analysis was used to identify the processes involved in preparing radiosurgery treatment for TN. The process map was developed by a multidisciplinary team including a neurosurgeon, radiation oncologist, physicist, and therapist. Potential failure modes were identified for each step in the process map, along with potential causes and end effects. A risk priority number was assigned to each cause. Results: The process map identified 66 individual steps (see attached supporting document). Corrective actions were developed for areas with high risk priority numbers. Wrong-site treatment is at higher risk in trigeminal neuralgia treatment because of the lack of site-specific pathologic imaging on MR and CT; additional site-specific checks were implemented to minimize the risk of wrong-site treatment. Failed collision checks resulted from an insufficient collision model in the treatment planning system, and a plan template was developed to address this problem. Conclusion: Failure modes and effects analysis is an effective tool for developing quality improvement in high-risk radiotherapy procedures such as functional radiosurgery.
Code of Federal Regulations, 2012 CFR
2012-01-01
... teaching-learning process; (ii) Teaching methods and procedures; and (iii) The instructor-student... policies and procedures. (3) The appropriate methods, procedures, and techniques for conducting flight...) The corrective action in the case of unsatisfactory training progress. (6) The approved methods...
Code of Federal Regulations, 2011 CFR
2011-01-01
... teaching-learning process; (ii) Teaching methods and procedures; and (iii) The instructor-student... policies and procedures. (3) The appropriate methods, procedures, and techniques for conducting flight...) The corrective action in the case of unsatisfactory training progress. (6) The approved methods...
Code of Federal Regulations, 2013 CFR
2013-01-01
... teaching-learning process; (ii) Teaching methods and procedures; and (iii) The instructor-student... policies and procedures. (3) The appropriate methods, procedures, and techniques for conducting flight...) The corrective action in the case of unsatisfactory training progress. (6) The approved methods...
Code of Federal Regulations, 2010 CFR
2010-01-01
... teaching-learning process; (ii) Teaching methods and procedures; and (iii) The instructor-student... policies and procedures. (3) The appropriate methods, procedures, and techniques for conducting flight...) The corrective action in the case of unsatisfactory training progress. (6) The approved methods...
Design and performance of daily quality assurance system for carbon ion therapy at NIRS
NASA Astrophysics Data System (ADS)
Saotome, N.; Furukawa, T.; Hara, Y.; Mizushima, K.; Tansho, R.; Saraya, Y.; Shirai, T.; Noda, K.
2017-09-01
At the National Institute of Radiological Sciences (NIRS), we have been commissioning a rotating-gantry system for carbon-ion radiotherapy. The rotating gantry can transport heavy ions at 430 MeV/u to the isocenter over irradiation angles of ±180° around the patient, so that the tumor can be irradiated from any direction. A three-dimensional pencil-beam scanning irradiation system equipped with the rotating gantry enables the optimal use of the physical characteristics of carbon ions to provide accurate treatment. To ensure treatment quality with such a complex system, the calibration of the primary dose monitor, the output check, range check, dose rate check, machine safety check, and some mechanical tests must be performed efficiently. For this purpose, we have developed a measurement system dedicated to quality assurance (QA) of this gantry system: the Daily QA system. The system consists of an ionization chamber system and a scintillator system. The ionization chamber system is used for the calibration of the primary dose monitor, the output check, and the dose rate check, and the scintillator system is used for the range, isocenter, and gantry-angle checks. The performance of the Daily QA system was verified by a beam test. The stability of the output was within 0.5%, and the range was within 0.5 mm. The coincidence of the coordinates between the patient-positioning system and the irradiation system was verified using the Daily QA system. Our findings verify that the new Daily QA system for a rotating gantry is capable of verifying the irradiation system with sufficient accuracy.
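A minimal sketch of a daily constancy check against baseline values, using the tolerances quoted above, could look like the following; the baseline and measured numbers are illustrative, and this is not the NIRS Daily QA software.

```python
# Minimal sketch of a daily constancy check against baseline values, using the
# tolerances quoted above (0.5% for output, 0.5 mm for range). Baseline and
# measured values are illustrative assumptions.

BASELINE = {"output": 1.000, "range_mm": 270.0}     # hypothetical reference values
TOLERANCE = {"output_pct": 0.5, "range_mm": 0.5}

def daily_qa(measured_output, measured_range_mm):
    out_dev_pct = 100.0 * (measured_output - BASELINE["output"]) / BASELINE["output"]
    range_dev_mm = measured_range_mm - BASELINE["range_mm"]
    return {
        "output (%)": (out_dev_pct, abs(out_dev_pct) <= TOLERANCE["output_pct"]),
        "range (mm)": (range_dev_mm, abs(range_dev_mm) <= TOLERANCE["range_mm"]),
    }

for name, (dev, ok) in daily_qa(1.003, 270.3).items():
    print(f"{name}: deviation {dev:+.2f} -> {'PASS' if ok else 'FAIL'}")
```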
Operations of a spaceflight experiment to investigate plant tropisms
NASA Astrophysics Data System (ADS)
Kiss, John Z.; Kumar, Prem; Millar, Katherine D. L.; Edelmann, Richard E.; Correll, Melanie J.
2009-10-01
Plants will be an important component in bioregenerative systems for long-term missions to the Moon and Mars. Since gravity is reduced on both the Moon and Mars, studies that identify the basic mechanisms of plant growth and development in altered gravity are required to ensure successful plant production on these space colonization missions. To address these issues, we developed a project on the International Space Station (ISS) to study the interaction between gravitropism and phototropism in Arabidopsis thaliana. These experiments were termed TROPI (for tropisms) and were performed on the European Modular Cultivation System (EMCS) in 2006. In this paper, we provide an operational summary of TROPI and preliminary results from studies of tropistic curvature of seedlings grown in space. Seed germination in TROPI was lower than in previous space experiments, likely due to extended storage in the hardware for up to 8 months. Video downlinks provided an important quality check on the automated experimental timeline, which was also monitored by telemetry. Good-quality images of seedlings were obtained, but the use of analog video tapes caused delays in the image processing and analysis procedures. Seedlings that germinated exhibited robust phototropic curvature. Frozen plant samples were returned on three space shuttle missions, and improvements in cold stowage and handling procedures on the second and third missions yielded quality RNA from the seedlings, which was used in subsequent microarray analyses. While the TROPI experiment had technical and logistical difficulties, most of the procedures worked well thanks to refinement during the project.
Five years' experience of classical swine fever polymerase chain reaction ring trials in France.
Po, F; Le Dimna, M; Le Potier, M F
2011-12-01
Since 2004, the French National Reference Laboratory for classical swine fever (CSF) has conducted an annual proficiency test (PT) to evaluate the ability of local veterinary laboratories to perform real-time polymerase chain reaction (PCR) for CSF virus. The results of five years of testing (2004-2008) are described here. The PT was conducted under blind conditions on 20 samples, and the same batch of samples was used for all five years. The number of laboratories that analysed the samples increased from four in 2004 to 13 in 2008. The results of the PT showed the following: cross-contamination between samples and deficiencies in RNA preparation can occur even in experienced laboratories; sample homogeneity should be checked carefully before selection; samples stored at -80 °C for several years remain stable; and poor shipment conditions do not compromise detection of the CSF virus genome in the samples. These results will enable redesign of the panel to improve the overall quality of the PT, which will encourage laboratories to check and improve their PCR procedures and expertise. This is an excellent way to determine laboratory performance.
The Boeing 747 fatigue integrity program
NASA Technical Reports Server (NTRS)
Spencer, M. M.
1972-01-01
The fatigue integrity program, which was established to ensure economical operations and to provide foundation data for inspection and maintenance, is discussed. Significant features of the 747 fatigue integrity program are: (1) fatigue analyses that are continually updated to reflect design changes, fatigue test results, and static and flight load survey measurements; (2) material selection and detail design using initial fatigue analyses, service experience, and testing; and (3) fatigue testing to check detail design quality and to verify the analyses, culminating in the test of a structurally complete airframe. Fatigue stress analyses were performed with the aid of experimental as well as analytical procedures. Extensive use was made of the stress severity factor, developed at Boeing, for evaluating peak stresses in complex joints. A frame of reference was established by families of structural fatigue performance curves (S-N curves) encompassing the range of materials and fatigue qualities anticipated for the 747 airplane design.
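As an illustration of how an S-N relation maps a local peak stress to allowable cycles, the sketch below uses a generic Basquin-type curve with an invented stress concentration factor; it is not the Boeing stress-severity-factor method itself.

```python
# Minimal sketch relating a local peak stress to allowable cycles via a
# Basquin-type S-N relation, S = A * N**b. The coefficients, severity factor,
# and stress level are invented for illustration only.

A_MPA = 900.0     # hypothetical S-N coefficient
B_EXP = -0.12     # hypothetical S-N exponent

def cycles_to_failure(nominal_stress_mpa, severity_factor):
    peak = severity_factor * nominal_stress_mpa   # local peak stress at the joint
    return (peak / A_MPA) ** (1.0 / B_EXP)        # invert S = A * N**b for N

N = cycles_to_failure(nominal_stress_mpa=90.0, severity_factor=3.0)
print(f"allowable cycles ~ {N:.2e}")
```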
40 CFR 90.411 - Post-test analyzer procedures.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 40 Protection of Environment 21 2013-07-01 2013-07-01 false Post-test analyzer procedures. 90.411... Test Procedures § 90.411 Post-test analyzer procedures. (a) Perform a HC hang-up check within 60...), the test is void. (d) Read and record the post-test data specified in § 90.405(e). (e) For a valid...
40 CFR 90.411 - Post-test analyzer procedures.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 20 2011-07-01 2011-07-01 false Post-test analyzer procedures. 90.411... Test Procedures § 90.411 Post-test analyzer procedures. (a) Perform a HC hang-up check within 60...), the test is void. (d) Read and record the post-test data specified in § 90.405(e). (e) For a valid...
40 CFR 90.411 - Post-test analyzer procedures.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 40 Protection of Environment 21 2012-07-01 2012-07-01 false Post-test analyzer procedures. 90.411... Test Procedures § 90.411 Post-test analyzer procedures. (a) Perform a HC hang-up check within 60...), the test is void. (d) Read and record the post-test data specified in § 90.405(e). (e) For a valid...
40 CFR 90.411 - Post-test analyzer procedures.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Post-test analyzer procedures. 90.411... Test Procedures § 90.411 Post-test analyzer procedures. (a) Perform a HC hang-up check within 60...), the test is void. (d) Read and record the post-test data specified in § 90.405(e). (e) For a valid...
40 CFR 90.411 - Post-test analyzer procedures.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 40 Protection of Environment 20 2014-07-01 2013-07-01 true Post-test analyzer procedures. 90.411... Test Procedures § 90.411 Post-test analyzer procedures. (a) Perform a HC hang-up check within 60...), the test is void. (d) Read and record the post-test data specified in § 90.405(e). (e) For a valid...
Byrum, Russell; Keith, Lauren; Bartos, Christopher; St Claire, Marisa; Lackemeyer, Matthew G; Holbrook, Michael R; Janosko, Krisztina; Barr, Jason; Pusl, Daniela; Bollinger, Laura; Wada, Jiro; Coe, Linda; Hensley, Lisa E; Jahrling, Peter B; Kuhn, Jens H; Lentz, Margaret R
2016-10-03
Medical imaging using animal models for human diseases has been utilized for decades; however, until recently, medical imaging of diseases induced by high-consequence pathogens has not been possible. In 2014, the National Institutes of Health, National Institute of Allergy and Infectious Diseases, Integrated Research Facility at Fort Detrick opened an Animal Biosafety Level 4 (ABSL-4) facility to assess the clinical course and pathology of infectious diseases in experimentally infected animals. Multiple imaging modalities including computed tomography (CT), magnetic resonance imaging, positron emission tomography, and single photon emission computed tomography are available to researchers for these evaluations. The focus of this article is to describe the workflow for safely obtaining a CT image of a live guinea pig in an ABSL-4 facility. These procedures include animal handling, anesthesia, and preparing and monitoring the animal until recovery from sedation. We will also discuss preparing the imaging equipment, performing quality checks, communication methods from "hot side" (containing pathogens) to "cold side," and moving the animal from the holding room to the imaging suite.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 32 National Defense 1 2013-07-01 2013-07-01 false Procedures. 86.6 Section 86.6 National Defense Department of Defense OFFICE OF THE SECRETARY OF DEFENSE PERSONNEL, MILITARY AND CIVILIAN CRIMINAL HISTORY BACKGROUND CHECKS ON INDIVIDUALS IN CHILD CARE SERVICES § 86.6 Procedures. The records of all existing...
Code of Federal Regulations, 2010 CFR
2010-07-01
... 32 National Defense 1 2010-07-01 2010-07-01 false Procedures. 86.6 Section 86.6 National Defense Department of Defense OFFICE OF THE SECRETARY OF DEFENSE PERSONNEL, MILITARY AND CIVILIAN CRIMINAL HISTORY BACKGROUND CHECKS ON INDIVIDUALS IN CHILD CARE SERVICES § 86.6 Procedures. The records of all existing...
Code of Federal Regulations, 2011 CFR
2011-07-01
... 32 National Defense 1 2011-07-01 2011-07-01 false Procedures. 86.6 Section 86.6 National Defense Department of Defense OFFICE OF THE SECRETARY OF DEFENSE PERSONNEL, MILITARY AND CIVILIAN CRIMINAL HISTORY BACKGROUND CHECKS ON INDIVIDUALS IN CHILD CARE SERVICES § 86.6 Procedures. The records of all existing...
Code of Federal Regulations, 2014 CFR
2014-07-01
... 32 National Defense 1 2014-07-01 2014-07-01 false Procedures. 86.6 Section 86.6 National Defense Department of Defense OFFICE OF THE SECRETARY OF DEFENSE PERSONNEL, MILITARY AND CIVILIAN CRIMINAL HISTORY BACKGROUND CHECKS ON INDIVIDUALS IN CHILD CARE SERVICES § 86.6 Procedures. The records of all existing...
Code of Federal Regulations, 2012 CFR
2012-07-01
... 32 National Defense 1 2012-07-01 2012-07-01 false Procedures. 86.6 Section 86.6 National Defense Department of Defense OFFICE OF THE SECRETARY OF DEFENSE PERSONNEL, MILITARY AND CIVILIAN CRIMINAL HISTORY BACKGROUND CHECKS ON INDIVIDUALS IN CHILD CARE SERVICES § 86.6 Procedures. The records of all existing...
40 CFR 86.329-79 - System response time; check procedure.
Code of Federal Regulations, 2010 CFR
2010-07-01
... PROGRAMS (CONTINUED) CONTROL OF EMISSIONS FROM NEW AND IN-USE HIGHWAY VEHICLES AND ENGINES Emission Regulations for New Gasoline-Fueled and Diesel-Fueled Heavy-Duty Engines; Gaseous Exhaust Test Procedures § 86... in step (i). (2) Capillary flow analyzers. This procedure is applicable only to analyzers that have...
Astronaut Joseph Tanner checks gloves during during launch/entry training
NASA Technical Reports Server (NTRS)
1994-01-01
Astronaut Joseph R. Tanner, mission specialist, checks his gloves during a rehearsal for the launch and entry phases of the scheduled November 1994 flight of STS-66. This rehearsal, held in the crew compartment trainer (CCT) of JSC's Shuttle mockup and integration laboratory, was followed by a training session on emergency egress procedures.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-03
... Sikorsky Model S-64E helicopters. The AD requires repetitive checks of the Blade Inspection Method (BIM... and check procedures for BIM blades installed on the Model S-64F helicopters. Several blade spars with a crack emanating from corrosion pits and other damage have been found because of BIM pressure...
47 CFR 25.274 - Procedures to be followed in the event of harmful interference.
Code of Federal Regulations, 2013 CFR
2013-10-01
... in the event of harmful interference. (a) The earth station operator whose transmission is suffering harmful interference shall first check the earth station equipment to ensure that the equipment is functioning properly. (b) The earth station operator shall then check all other earth stations in the licensee...
47 CFR 25.274 - Procedures to be followed in the event of harmful interference.
Code of Federal Regulations, 2010 CFR
2010-10-01
... in the event of harmful interference. (a) The earth station operator whose transmission is suffering harmful interference shall first check the earth station equipment to ensure that the equipment is functioning properly. (b) The earth station operator shall then check all other earth stations in the licensee...
47 CFR 25.274 - Procedures to be followed in the event of harmful interference.
Code of Federal Regulations, 2012 CFR
2012-10-01
... in the event of harmful interference. (a) The earth station operator whose transmission is suffering harmful interference shall first check the earth station equipment to ensure that the equipment is functioning properly. (b) The earth station operator shall then check all other earth stations in the licensee...
47 CFR 25.274 - Procedures to be followed in the event of harmful interference.
Code of Federal Regulations, 2014 CFR
2014-10-01
... in the event of harmful interference. (a) The earth station operator whose transmission is suffering harmful interference shall first check the earth station equipment to ensure that the equipment is functioning properly. (b) The earth station operator shall then check all other earth stations in the licensee...
47 CFR 25.274 - Procedures to be followed in the event of harmful interference.
Code of Federal Regulations, 2011 CFR
2011-10-01
... in the event of harmful interference. (a) The earth station operator whose transmission is suffering harmful interference shall first check the earth station equipment to ensure that the equipment is functioning properly. (b) The earth station operator shall then check all other earth stations in the licensee...
32 CFR Appendix A to Part 86 - Criminal History Background Check Procedures
Code of Federal Regulations, 2010 CFR
2010-07-01
... responsibility in ensuring a safe and secure environment for children within DoD activities or private... installation level. An IRC will be completed on individuals with a DoD affiliation such as living or working on... checks through the SCHR to personnel offices working with law enforcement or investigative agencies. They...
32 CFR Appendix A to Part 86 - Criminal History Background Check Procedures
Code of Federal Regulations, 2011 CFR
2011-07-01
... residences in an employment or security application. It is deemed unnecessary to conduct checks before 18... information exists regarding residence by the individual in the United States for 1 year or more since age 18... video equipment is acceptable provided it is monitored by an individual who has successfully completed a...
40 CFR 86.1340-90 - Exhaust sample analysis.
Code of Federal Regulations, 2010 CFR
2010-07-01
... may occur between the pre and post checks is not specified. However, the difference between pre... pre-analysis and post-analysis checks on any range used may exceed 3 percent for HC, or 2 percent for... Regulations for New Otto-Cycle and Diesel Heavy-Duty Engines; Gaseous and Particulate Exhaust Test Procedures...
40 CFR 86.1340-90 - Exhaust sample analysis.
Code of Federal Regulations, 2013 CFR
2013-07-01
... may occur between the pre and post checks is not specified. However, the difference between pre... pre-analysis and post-analysis checks on any range used may exceed 3 percent for HC, or 2 percent for... Regulations for New Otto-Cycle and Diesel Heavy-Duty Engines; Gaseous and Particulate Exhaust Test Procedures...
40 CFR 86.1340-90 - Exhaust sample analysis.
Code of Federal Regulations, 2012 CFR
2012-07-01
... may occur between the pre and post checks is not specified. However, the difference between pre... pre-analysis and post-analysis checks on any range used may exceed 3 percent for HC, or 2 percent for... Regulations for New Otto-Cycle and Diesel Heavy-Duty Engines; Gaseous and Particulate Exhaust Test Procedures...
40 CFR 86.1340-90 - Exhaust sample analysis.
Code of Federal Regulations, 2011 CFR
2011-07-01
... may occur between the pre and post checks is not specified. However, the difference between pre... pre-analysis and post-analysis checks on any range used may exceed 3 percent for HC, or 2 percent for... Regulations for New Otto-Cycle and Diesel Heavy-Duty Engines; Gaseous and Particulate Exhaust Test Procedures...
1985-05-31
These proposed regulations require a State agency to refund to the Federal government the Federal share of Medicaid checks issued by the State or its fiscal agent that remain uncashed 180 days after the date of issuance. In addition, we would require that the Federal share of cancelled (voided) Medicaid checks be refunded quarterly, since there has been no expenditure by the State. This proposal is intended to implement in part a 1981 General Accounting Office recommendation that procedures be established for States to credit the Federal government for its portion of uncashed Medicaid checks issued by the State or its fiscal agent.
1986-10-09
These final regulations require that a State agency refund to the Federal Government the Federal share of Medicaid checks issued by the State or its fiscal agent that remain uncashed 180 days after the date of issuance. In addition, we are requiring that the Federal share of cancelled (voided) Medicaid checks be refunded quarterly since there has been no expenditure by the State. These regulations implement, in part, a 1981 General Accounting Office recommendation that procedures be established for States to credit the Federal Government for the Federal portion of uncashed Medicaid checks issued by the State or its fiscal agent.
Modelling road accidents: An approach using structural time series
NASA Astrophysics Data System (ADS)
Junus, Noor Wahida Md; Ismail, Mohd Tahir
2014-09-01
In this paper, the trend of road accidents in Malaysia from 2001 to 2012 was modelled using a structural time series approach. The structural time series model was identified using a stepwise method, and the residuals of each model were tested. The best-fitted model was chosen based on the smallest Akaike Information Criterion (AIC) and prediction error variance. To check the quality of the model, a data validation procedure was performed by predicting the monthly number of road accidents for the year 2012. Results indicate that the best specification of the structural time series model for representing road accidents is a local level model with a seasonal component.
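A minimal sketch of this modelling approach, fitting a local level model with a monthly seasonal component and validating it on a held-out year, is shown below using statsmodels; the data are synthetic, and this is not the authors' dataset or code.

```python
# Minimal sketch of a local-level + monthly-seasonal structural time series
# model with an out-of-sample validation check. The data are synthetic.
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.structural import UnobservedComponents

rng = np.random.default_rng(0)
months = pd.date_range("2001-01", periods=144, freq="MS")
y = pd.Series(400 + np.linspace(0, 60, 144)
              + 30 * np.sin(2 * np.pi * np.arange(144) / 12)
              + rng.normal(0, 10, 144), index=months)

train, test = y[:-12], y[-12:]                       # hold out the last year for validation
model = UnobservedComponents(train, level="local level", seasonal=12)
res = model.fit(disp=False)
forecast = res.forecast(steps=12)

print(f"AIC = {res.aic:.1f}")
print(f"validation RMSE = {np.sqrt(((forecast - test) ** 2).mean()):.1f}")
```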
Schrock, Linda E
2008-07-01
This article reviews the literature to date and reports on a new study that documented the frequency of manual code-requiring blood glucose (BG) meters that were miscoded at the time of the patient's initial appointment in a hospital-based outpatient diabetes education program. Between January 1 and May 31, 2007, the type of BG meter and the accuracy of the patient's meter code (if required) and procedure for checking BG were checked during the initial appointment with the outpatient diabetes educator. If indicated, reeducation regarding the procedure for the BG meter code entry and/or BG test was provided. Of the 65 patients who brought their meter requiring manual entry of a code number or code chip to the initial appointment, 16 (25%) were miscoded at the time of the appointment. Two additional problems, one of dead batteries and one of improperly stored test strips, were identified and corrected at the first appointment. These findings underscore the importance of checking the patient's BG meter code (if required) and procedure for testing BG at each encounter with a health care professional or providing the patient with a meter that does not require manual entry of a code number or chip to match the container of test strips (i.e., an autocode meter).
ERIC Educational Resources Information Center
Mani, Bonnie G.
1995-01-01
In an Internal Revenue Service office using total quality management (TQM), the management development program uses Myers Briggs Type Indicator and Adjective Check List for manager self-assessment. Because management commitment is essential to TQM, the process is a way of enhancing leadership skills and demonstrating appreciation of diversity. (SK)
The Automation of Nowcast Model Assessment Processes
2016-09-01
that will automate real-time WRE-N model simulations, collect and quality-control check weather observations for assimilation and verification, and...domains centered near White Sands Missile Range, New Mexico, where the Meteorological Sensor Array (MSA) will be located. The MSA will provide...observations and performing quality-control checks for the pre-forecast data assimilation period. 2. Run the WRE-N model to generate model forecast data
A procedure for seismic risk reduction in Campania Region
NASA Astrophysics Data System (ADS)
Zuccaro, G.; Palmieri, M.; Maggiò, F.; Cicalese, S.; Grassi, V.; Rauci, M.
2008-07-01
The Campania Region has established and carried out a distinctive procedure in the field of seismic risk reduction. Great attention has been paid to strategic public buildings such as town halls, civil protection buildings, and schools. Ordinance 3274, promulgated in 2004 by the Italian central government, obliged the owners of strategic buildings to perform seismic analyses by 2008 in order to check the safety of the structures and their adequacy for use. Under this procedure the Campania Region, instead of the local authorities, ensures the complete drafting of the seismic checks through financial resources of the Italian Government. A regional scientific-technical committee has been constituted, composed of scientific experts and academics in seismic engineering. The committee has drawn up guidelines for carrying out the seismic analyses. At the same time, the Region has issued a public competition to select technical seismic engineering experts to be appointed for the seismic analyses in accordance with the guidelines. The scientific committee has the option of requiring additional documents and studies before approving the safety checks submitted. The committee is supported by a technical and administrative secretariat composed of a group of experts in seismic engineering. At the moment several seismic safety checks have been completed, and the results are presented in this paper. Moreover, the policy set by the Campania Region to mitigate seismic risk was to spend most of the available financial resources on structural strengthening of strategic public buildings rather than on safety checks. A first set of buildings, whose response under seismic action was already known from data and vulnerability studies carried out previously, was selected for immediate retrofitting designs. Subsequently, a second set of buildings was identified for structural strengthening. These were selected using the criteria specified in the Guidelines prepared by the scientific committee and based on data obtained from the first set of safety checks. The strengthening philosophy adopted in the projects is described in the paper.
User's manual for computer program BASEPLOT
Sanders, Curtis L.
2002-01-01
The checking and reviewing of daily records of streamflow within the U.S. Geological Survey is traditionally accomplished by hand-plotting and mentally collating tables of data. The process is time consuming, difficult to standardize, and subject to errors in computation, data entry, and logic. In addition, the presentation of flow data on the internet requires more timely and accurate computation of daily flow records. BASEPLOT was developed for checking and review of primary streamflow records within the U.S. Geological Survey. Use of BASEPLOT enables users to (1) provide efficiencies during the record checking and review process, (2) improve quality control, (3) achieve uniformity of checking and review techniques of simple stage-discharge relations, and (4) provide a tool for teaching streamflow computation techniques. The BASEPLOT program produces tables of quality control checks and produces plots of rating curves and discharge measurements; variable shift (V-shift) diagrams; and V-shifts converted to stage-discharge plots, using data stored in the U.S. Geological Survey Automatic Data Processing System database. In addition, the program plots unit-value hydrographs that show unit-value stages, shifts, and datum corrections; input shifts, datum corrections, and effective dates; discharge measurements; effective dates for rating tables; and numeric quality control checks. Checklist/tutorial forms are provided for reviewers to ensure completeness of review and standardize the review process. The program was written for the U.S. Geological Survey SUN computer using the Statistical Analysis System (SAS) software produced by SAS Institute, Incorporated.
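A minimal sketch of the kind of check BASEPLOT automates, interpolating a shifted stage-discharge rating at a measured gage height and reporting the percent difference from the measured discharge, is shown below; the rating points, shift, and measurement are invented, and BASEPLOT itself is written in SAS.

```python
# Minimal sketch of a rating-curve check: interpolate a (shifted) stage-discharge
# rating at a measured gage height and report the percent difference from the
# measured discharge. Rating points, shift, and measurement are invented.
import numpy as np

rating_stage_ft = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
rating_q_cfs    = np.array([10.0, 60.0, 180.0, 400.0, 750.0])

def rated_discharge(gage_height_ft, shift_ft=0.0):
    """Discharge from the rating after applying a V-shift to the stage."""
    return np.interp(gage_height_ft + shift_ft, rating_stage_ft, rating_q_cfs)

measured_gh, measured_q = 3.2, 195.0            # hypothetical discharge measurement
q_rated = rated_discharge(measured_gh, shift_ft=-0.05)
pct_diff = 100.0 * (measured_q - q_rated) / q_rated
print(f"rated {q_rated:.0f} cfs, measured {measured_q:.0f} cfs, diff {pct_diff:+.1f}%")
```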
2011-01-01
When applying echo-Doppler imaging for either clinical or research purposes, it is very important to select the most adequate modality/technology and to choose the most reliable and reproducible measurements. Quality control is a mainstay for reducing variability among institutions and operators and must be achieved by using appropriate procedures for the acquisition, storage and interpretation of echo-Doppler data. This goal can be achieved by employing an echo core laboratory (ECL), with responsibility for standardizing image acquisition processes (performed at the peripheral echo-labs) and analysis (by monitoring and optimizing the internal intra- and inter-reader variability of measurements). Accordingly, the Working Group of Echocardiography of the Italian Society of Cardiology decided to design standardized procedures for imaging acquisition in peripheral laboratories and for reading procedures, and to propose a methodological approach to assess the reproducibility of echo-Doppler parameters of cardiac structure and function using both standard and advanced technologies. A number of cardiologists experienced in cardiac ultrasound were involved in setting up an ECL available for future studies involving complex imaging or including echo-Doppler measures as primary or secondary efficacy or safety end-points. The present manuscript describes the methodology of the procedures (imaging acquisition and measurement reading) and documents the work done so far to test the reproducibility of the different echo-Doppler modalities (standard and advanced). These procedures can also be suggested for use in non-referral echocardiographic laboratories as an "inside" quality check, with the aim of optimizing the clinical consistency of echo-Doppler data. PMID:21943283
[Quality of and Attendance at Healthy Child Clinics in Germany].
Weithase, Alexandra; Vogel, Mandy; Kiep, Henriette; Schwarz, Sarah; Meißner, Laura; Herrmann, Janine; Rieger, Kristin; Koch, Christiane; Schuster, Volker; Kiess, Wieland
2017-04-01
Background For several years the German healthy child clinics program has been a highly appreciated preventive measure and is subject to constant development. However, attendance depends on the families' sociodemographic situation. Findings are documented in a medical checkup booklet (the so-called Gelbes Heft). Currently, there is no procedure for using the data collected for epidemiological purposes or for evaluating the pediatric prevention measures in Germany. Methods Between 2011 and 2016, we recruited 3480 study participants for our population-based cohort study LIFE Child in Leipzig. 90.6 % submitted their check-up booklets, which were subsequently scanned; the data were digitized and transferred to an electronic form. Furthermore, data on social status (the so-called Winkler index) were collected for each family using a structured questionnaire. The study population consisted of each family's oldest child for whom both data sets were available. Results The transfer of data from the check-up booklets was time-consuming and cost-intensive because of the large datasets, uncoded diagnoses, and the need for trained employees to transcribe often illegible handwriting. Early diagnostic tests for children enjoy a high level of acceptance among all social classes. With increasing age, the attendance rate decreases gradually; only 83 % of the population with a lower social status attend the U9 test. The documentation of diagnoses in the check-up booklets was implausible, because the frequency fluctuated heavily between the different check-up time points. The documentation of psychosocial difficulties in a child, at less than 2 %, was particularly surprising. Conclusion It is not possible to draw conclusions regarding the prevalence of target diseases from the frequency of documented findings in the check-up booklets. In order to make the data both comparable and evaluable, documentation must be digitized in the future. © Georg Thieme Verlag KG Stuttgart · New York.
Cuffney, T.F.; Gurtz, M.E.; Meador, M.R.
1993-01-01
Benthic invertebrate samples are collected as part of the U.S. Geological Survey's National Water-Quality Assessment Program. This is a perennial, multidisciplinary program that integrates biological, physical, and chemical indicators of water quality to evaluate status and trends and to develop an understanding of the factors controlling observed water quality. The Program examines water quality in 60 study units (coupled ground- and surface-water systems) that encompass most of the conterminous United States and parts of Alaska and Hawaii. Study-unit teams collect and process qualitative and semi-quantitative invertebrate samples according to standardized procedures. These samples are processed (elutriated and subsampled) in the field to produce as many as four sample components: large-rare, main-body, elutriate, and split. Each sample component is preserved in 10-percent formalin, and two components, large-rare and main-body, are sent to contract laboratories for further processing. The large-rare component is composed of large invertebrates that are removed from the sample matrix during field processing and placed in one or more containers. The main-body sample component consists of the remaining sample materials (sediment, detritus, and invertebrates) and is subsampled in the field to achieve a volume of 750 milliliters or less. The remaining two sample components, elutriate and split, are used for quality-assurance and quality-control purposes. Contract laboratories are used to identify and quantify invertebrates from the large-rare and main-body sample components according to the procedures and guidelines specified within this document. These guidelines allow the use of subsampling techniques to reduce the volume of sample material processed and to facilitate identifications. These processing procedures and techniques may be modified if the modifications provide equal or greater levels of accuracy and precision. The intent of sample processing is to determine the quantity of each taxon present in the semi-quantitative samples or to list the taxa present in qualitative samples. The processing guidelines provide standardized laboratory forms, sample labels, detailed sample processing flow charts, standardized format for electronic data, quality-assurance procedures and checks, sample tracking standards, and target levels for taxonomic determinations. The contract laboratory (1) is responsible for identifications and quantifications, (2) constructs reference collections, (3) provides data in hard copy and electronic forms, (4) follows specified quality-assurance and quality-control procedures, and (5) returns all processed and unprocessed portions of the samples. The U.S. Geological Survey's Quality Management Group maintains a Biological Quality-Assurance Unit, located at the National Water-Quality Laboratory, Arvada, Colorado, to oversee the use of contract laboratories and ensure the quality of data obtained from these laboratories according to the guidelines established in this document. This unit establishes contract specifications, reviews contractor performance (timeliness, accuracy, and consistency), enters data into the National Water Information System-II data base, maintains in-house reference collections, deposits voucher specimens in outside museums, and interacts with taxonomic experts within and outside the U.S. Geological Survey. 
This unit also modifies the existing sample processing and quality-assurance guidelines, establishes criteria and testing procedures for qualifying potential contract laboratories, identifies qualified taxonomic experts, and establishes voucher collections.
NASA Technical Reports Server (NTRS)
Brenton, J. C.; Barbre, R. E.; Decker, R. K.; Orcutt, J. M.
2018-01-01
The National Aeronautics and Space Administration's (NASA) Marshall Space Flight Center (MSFC) Natural Environments Branch (EV44) provides atmospheric databases and analysis in support of space vehicle design and day-of-launch operations for NASA and commercial launch vehicle programs launching from the NASA Kennedy Space Center (KSC), co-located on the United States Air Force's Eastern Range (ER) at the Cape Canaveral Air Force Station. The ER complex is one of the most heavily instrumented sites in the United States, with over 31 towers measuring various atmospheric parameters on a continuous basis. An inherent challenge with large datasets is ensuring that erroneous data are removed from databases, and thus excluded from launch vehicle design analyses. EV44 has put forth great effort in developing quality control (QC) procedures for individual meteorological instruments; however, no standard QC procedures for all databases currently exist, resulting in QC databases with inconsistencies in variables, development methodologies, and periods of record. The goal of this activity is to build on the previous efforts to develop a standardized set of QC procedures from which to build meteorological databases from KSC and the ER, while maintaining open communication with end users from the launch community to develop ways to improve, adapt, and grow the QC database. Details of the QC procedures will be described. As the rate of launches increases with additional launch vehicle programs, it is becoming more important that weather databases be continually updated and checked for data quality before use in launch vehicle design and certification analyses.
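Two elementary QC screens that such standardized procedures commonly include, a climatological range check and a step (spike) check between consecutive samples, are sketched below; the limits are illustrative assumptions, not EV44's actual thresholds.

```python
# Minimal sketch of two elementary QC screens for tower observations: a
# climatological range check and a step (spike) check between consecutive
# samples. The limits are illustrative assumptions.
import numpy as np

temps_degC = np.array([24.1, 24.3, 24.2, 39.8, 24.4, 24.5, -70.0, 24.6])

RANGE_LIMITS = (-20.0, 45.0)   # plausible climatological bounds for the site
MAX_STEP = 5.0                 # max allowed change between consecutive samples

range_flag = (temps_degC < RANGE_LIMITS[0]) | (temps_degC > RANGE_LIMITS[1])
step_flag = np.zeros_like(temps_degC, dtype=bool)
step_flag[1:] = np.abs(np.diff(temps_degC)) > MAX_STEP

suspect = range_flag | step_flag
print("suspect samples:", np.flatnonzero(suspect))   # indices flagged for manual review
```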
ERIC Educational Resources Information Center
Chauhan, U.; Kontopantelis, E.; Campbell, S.; Jarrett, H.; Lester, H.
2010-01-01
Background: Routine health checks have gained prominence as a way of detecting unmet need in primary care for adults with intellectual disabilities (ID) and general practitioners are being incentivised in the UK to carry out health checks for many conditions through an incentivisation scheme known as the Quality and Outcomes Framework (QOF).…
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-22
... to require recurring checks of the Blade Inspection Method (BIM) indicator on each blade to determine whether the BIM indicator is signifying that the blade pressure may have been compromised by a blade crack... check procedures for BIM blades installed on the Model S-64E and S-64F helicopters. Several blade spars...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-10
... Docket No. R-1473, the Board is also proposing necessary related changes to the Board's Regulation J... receive an electronic file and create substitute checks from check images in the file to present to paying... Regulation J. Elsewhere in the Federal Register, the Board is proposing necessary related changes to this and...
Comparison of two methods of standard setting: the performance of the three-level Angoff method.
Jalili, Mohammad; Hejri, Sara M; Norcini, John J
2011-12-01
Cut-scores, reliability and validity vary among standard-setting methods. The modified Angoff method (MA) is a well-known standard-setting procedure, but the three-level Angoff approach (TLA), a recent modification, has not been extensively evaluated. This study aimed to compare standards and pass rates in an objective structured clinical examination (OSCE) obtained using two methods of standard setting with discussion and reality checking, and to assess the reliability and validity of each method. A sample of 105 medical students participated in a 14-station OSCE. Fourteen and 10 faculty members took part in the MA and TLA procedures, respectively. In the MA, judges estimated the probability that a borderline student would pass each station. In the TLA, judges estimated whether a borderline examinee would perform the task correctly or not. Having given individual ratings, judges discussed their decisions. One week after the examination, the procedure was repeated using normative data. The mean score for the total test was 54.11% (standard deviation: 8.80%). The MA cut-scores for the total test were 49.66% and 51.52% after discussion and reality checking, respectively (the consequent percentages of passing students were 65.7% and 58.1%, respectively). The TLA yielded mean pass scores of 53.92% and 63.09% after discussion and reality checking, respectively (rates of passing candidates were 44.8% and 12.4%, respectively). Compared with the TLA, the MA showed higher agreement between judges (0.94 versus 0.81) and a narrower 95% confidence interval in standards (3.22 versus 11.29). The MA seems a more credible and reliable procedure with which to set standards for an OSCE than does the TLA, especially when a reality check is applied. © Blackwell Publishing Ltd 2011.
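For readers unfamiliar with the arithmetic behind an Angoff cut-score, the short sketch below averages hypothetical judges' borderline-probability ratings over stations; the ratings are invented and are not data from this study.

```python
import numpy as np

# Hypothetical ratings: judges x stations, each entry the judged probability
# that a borderline examinee passes that station (modified Angoff style).
ratings = np.array([
    [0.55, 0.40, 0.62, 0.48],
    [0.50, 0.45, 0.58, 0.52],
    [0.60, 0.38, 0.65, 0.47],
])

station_cut = ratings.mean(axis=0)        # per-station standard
exam_cut = station_cut.mean() * 100.0     # overall cut-score in percent
print(f"cut-score = {exam_cut:.1f}%")
```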
Development of an expert planning system for OSSA
NASA Technical Reports Server (NTRS)
Groundwater, B.; Lembeck, M. F.; Sarsfield, L.; Diaz, Alphonso
1988-01-01
This paper presents concepts related to preliminary work for the development of an expert planning system for NASA's Office of Space Science and Applications (OSSA). The expert system will function as a planner's decision aid in preparing mission plans encompassing sets of proposed OSSA space science initiatives. These plans in turn will be checked against budgetary and technical constraints and tested for constraint violations. Appropriate advice will be generated by the system for making modifications to the plans to bring them in line with the constraints. The OSSA Planning Expert System (OPES) has been designed to function as an integral part of the OSSA mission planning process. It will be able to suggest a best plan, accept and check a user-suggested strawman plan, and provide a quick response to user requests and actions. OPES will be written in the C programming language and will have a transparent user interface running under Windows 386 on a Compaq 386/20 machine. The system's stored knowledge and inference procedures will model the expertise of human planners familiar with the OSSA planning domain. Given mission priorities and budget guidelines, the system first sets the launch dates for each mission. It will check to make sure that planetary launch windows and precursor mission relationships are not violated. Additional levels of constraints will then be considered, checking such things as the availability of a suitable launch vehicle, total mission launch mass required vs. the identified launch mass capability, and the total power required by the payload at its destination vs. the actual power available. System output will be in the form of Gantt charts, spreadsheet hardcopy, and other presentation-quality materials detailing the resulting OSSA mission plan.
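A minimal sketch of the kind of constraint screening the abstract describes is shown below, written in Python purely for illustration (OPES itself is described as a C program); the mission fields and limits are assumptions.

```python
# Illustrative constraint screening; fields and limits are invented, not OSSA data.
def check_mission(mission, launcher_capability_kg, available_power_w):
    violations = []
    if not (mission["window_open"] <= mission["launch_date"] <= mission["window_close"]):
        violations.append("launch date outside planetary window")
    if mission["launch_mass_kg"] > launcher_capability_kg:
        violations.append("launch mass exceeds launcher capability")
    if mission["payload_power_w"] > available_power_w:
        violations.append("payload power exceeds power available at destination")
    return violations

mission = {"launch_date": 1995.6, "window_open": 1995.4, "window_close": 1995.8,
           "launch_mass_kg": 2600.0, "payload_power_w": 450.0}
print(check_mission(mission, launcher_capability_kg=2400.0, available_power_w=500.0))
```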
The IEO Data Center Management System: Tools for quality control, analysis and access marine data
NASA Astrophysics Data System (ADS)
Casas, Antonia; Garcia, Maria Jesus; Nikouline, Andrei
2010-05-01
Since 1994, the Data Centre of the Spanish Oceanographic Institute (IEO) has been developing systems for archiving and quality control of oceanographic data. The work started in the framework of the European Marine Science & Technology Programme (MAST), when a consortium of several Mediterranean Data Centres began to work on the MEDATLAS project. Over the years, old software modules for MS DOS were rewritten, improved and migrated to the Windows environment. Oceanographic data quality control now includes not only vertical profiles (mainly CTD and bottle observations) but also time series of currents and sea level observations. Powerful new routines for analysis and graphic visualization were added. Data originally presented in ASCII format were recently organized in an open-source MySQL database. Nowadays, the IEO, as part of the SeaDataNet Infrastructure, has designed and developed a new information system, consistent with the ISO 19115 and SeaDataNet standards, in order to manage the large and diverse marine data and information originating in Spain from different sources, and to interoperate with SeaDataNet. The system works with data stored in ASCII files (MEDATLAS, ODV) as well as data stored within the relational database. The components of the system are: 1. MEDATLAS Format and Quality Control - QCDAMAR: Quality Control of Marine Data. Main set of tools for working with data presented as text files. Includes extended quality control (searching for duplicated cruises and profiles; checking date, position, ship velocity, constant profiles, spikes, density inversion, sounding, acceptable data, impossible regional values, ...) and input/output filters. - QCMareas: A set of procedures for the quality control of tide gauge data according to the standards of the international Sea Level Observing System. These procedures include checking for unexpected anomalies in the time series, interpolation, filtering, and computation of basic statistics and residuals. 2. DAMAR: A relational database (MySQL) designed to manage the wide variety of marine information, such as common vocabularies, catalogues (CSR & EDIOS), data and metadata. 3. Other tools for analysis and data management - Import_DB: Script to import data and metadata from the MEDATLAS ASCII files into the database. - SelDamar/Selavi: interface with the database for local and web access. Allows selective retrievals applying the criteria introduced by the user, such as geographical bounds, data responsible, cruises, platform, time periods, etc. Also includes calculation of statistical reference values and plotting of original and mean profiles together with vertical interpolation. - ExtractDAMAR: Script to extract data archived in ASCII files that meet the criteria of a user request through the SelDamar interface and export them in ODV format, also performing a unit conversion.
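As an example of one of the profile checks listed above (spike detection), a minimal sketch follows; the neighbour-difference test and the threshold are generic illustrations, not the exact QCDAMAR algorithm.

```python
import numpy as np

def spike_flags(values, threshold):
    """Flag interior points whose deviation from the mean of their two
    neighbours exceeds the threshold (a common profile spike test)."""
    v = np.asarray(values, dtype=float)
    flags = np.zeros(v.size, dtype=bool)
    for i in range(1, v.size - 1):
        test = abs(v[i] - (v[i - 1] + v[i + 1]) / 2.0) - abs(v[i - 1] - v[i + 1]) / 2.0
        flags[i] = test > threshold
    return flags

temperature = [14.2, 14.1, 13.9, 19.8, 13.6, 13.5]   # one obvious spike
print(spike_flags(temperature, threshold=2.0))        # flags index 3
```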
Anderer, Peter; Gruber, Georg; Parapatics, Silvia; Woertz, Michael; Miazhynskaia, Tatiana; Klosch, Gerhard; Saletu, Bernd; Zeitlhofer, Josef; Barbanoj, Manuel J; Danker-Hopfe, Heidi; Himanen, Sari-Leena; Kemp, Bob; Penzel, Thomas; Grozinger, Michael; Kunz, Dieter; Rappelsberger, Peter; Schlogl, Alois; Dorffner, Georg
2005-01-01
To date, the only standard for the classification of sleep-EEG recordings that has found worldwide acceptance is the set of rules published in 1968 by Rechtschaffen and Kales. Even though several attempts have been made to automate the classification process, so far no method has been published that has proven its validity in a study including a sufficiently large number of controls and patients of all adult age ranges. The present paper describes the development and optimization of an automatic classification system that is based on one central EEG channel, two EOG channels and one chin EMG channel. It adheres to the decision rules for visual scoring as closely as possible and includes a structured quality control procedure by a human expert. The final system (Somnolyzer 24 x 7) consists of a raw data quality check, a feature extraction algorithm (density and intensity of sleep/wake-related patterns such as sleep spindles, delta waves, SEMs and REMs), a feature matrix plausibility check, a classifier designed as an expert system, a rule-based smoothing procedure for the start and the end of stage REM, and finally a statistical comparison to age- and sex-matched normal healthy controls (Siesta Spot Report). The expert system considers different prior probabilities of stage changes depending on the preceding sleep stage, the occurrence of a movement arousal and the position of the epoch within the NREM/REM sleep cycles. Moreover, results obtained with and without using the chin EMG signal are combined. The Siesta polysomnographic database (590 recordings in both normal healthy subjects aged 20-95 years and patients suffering from organic or nonorganic sleep disorders) was split into two halves, which were randomly assigned to a training and a validation set, respectively. The final validation revealed an overall epoch-by-epoch agreement of 80% (Cohen's kappa: 0.72) between the Somnolyzer 24 x 7 and the human expert scoring, as compared with an inter-rater reliability of 77% (Cohen's kappa: 0.68) between two human experts scoring the same dataset. Two Somnolyzer 24 x 7 analyses (including a structured quality control by two human experts) revealed an inter-rater reliability close to 1 (Cohen's kappa: 0.991), which confirmed that the variability induced by the quality control procedure, whereby approximately 1% of the epochs (in 9.5% of the recordings) are changed, can definitely be neglected. Thus, the validation study proved the high reliability and validity of the Somnolyzer 24 x 7 and demonstrated its applicability in clinical routine and sleep studies.
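Since the validation rests on epoch-by-epoch agreement expressed as Cohen's kappa, a small worked sketch of that statistic may help; the stage sequences below are invented and serve only to show the calculation.

```python
import numpy as np

def cohen_kappa(rater_a, rater_b):
    """Epoch-by-epoch agreement corrected for chance agreement (Cohen's kappa)."""
    a = np.asarray(rater_a)
    b = np.asarray(rater_b)
    labels = np.unique(np.concatenate([a, b]))
    p_observed = np.mean(a == b)
    # Chance agreement from the two raters' marginal stage frequencies.
    p_chance = sum(np.mean(a == s) * np.mean(b == s) for s in labels)
    return (p_observed - p_chance) / (1.0 - p_chance)

expert    = ["W", "S2", "S2", "S4", "REM", "REM", "S2", "W"]
automatic = ["W", "S2", "S4", "S4", "REM", "S2",  "S2", "W"]
print(round(cohen_kappa(expert, automatic), 3))
```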
[Implementation of a rational standard of hygiene for preparation of operating rooms].
Bauer, M; Scheithauer, S; Moerer, O; Pütz, H; Sliwa, B; Schmidt, C E; Russo, S G; Waeschle, R M
2015-10-01
The assurance of high standards of care is a major requirement in German hospitals, while cost reduction and efficient use of resources are mandatory. These requirements are particularly evident in the high-risk and cost-intensive operating theatre field with its multiple process steps. The cleaning of operating rooms (OR) between surgical procedures is of major relevance for patient safety and requires time and human resources. The hygiene procedure plan for OR cleaning between operations at the university hospital in Göttingen was revised and optimized according to the plan-do-check-act principle because responsibilities and the use of resources were not clearly specified, process times were prolonged and staff involvement was high. The current status was evaluated in 2012 as part of the first step "plan". The subsequent step "do" included an expert symposium with external consultants, interdisciplinary consensus conferences with an update of the former hygiene procedure plan, and the implementation process. All staff members involved were integrated into this change management process. The penetration rate of the training and information measures as well as the acceptance of and compliance with the new hygiene procedure plan were reviewed within step "check". The rates of positive swabs and air sampling as well as of postoperative wound infections were analyzed for quality control, and no evidence for a reduced effectiveness of the new hygiene plan was found. After the successful implementation of these measures, the next improvement cycle ("act") was performed in 2014, which led to a simplification of the hygiene plan by reducing the number of defined cleaning and disinfection programs for preparation of the OR. The reorganization measures described led to comprehensive commitment to the hygiene procedure plan through distinct specifications of responsibilities, of the course of action and of the use of resources. Furthermore, a simplification of the plan, a rational staff assignment and reduced process times were accomplished. Finally, potential conflicts due to insufficient evidence-based knowledge among personnel were reduced. The present project description can be used by other hospitals as a guideline for similar changes in management processes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Covington, E; Younge, K; Chen, X
Purpose: To evaluate the effectiveness of an automated plan check tool to improve first-time plan quality as well as standardize and document performance of physics plan checks. Methods: The Plan Checker Tool (PCT) uses the Eclipse Scripting API to check and compare data from the treatment planning system (TPS) and treatment management system (TMS). PCT was created to improve first-time plan quality, reduce patient delays, increase efficiency of our electronic workflow, and to standardize and partially automate plan checks in the TPS. A framework was developed which can be configured with different reference values and types of checks. One example is the prescribed dose check where PCT flags the user when the planned dose and the prescribed dose disagree. PCT includes a comprehensive checklist of automated and manual checks that are documented when performed by the user. A PDF report is created and automatically uploaded into the TMS. Prior to and during PCT development, errors caught during plan checks and also patient delays were tracked in order to prioritize which checks should be automated. The most common and significant errors were determined. Results: Nineteen of 33 checklist items were automated with data extracted with the PCT. These include checks for prescription, reference point and machine scheduling errors which are three of the top six causes of patient delays related to physics and dosimetry. Since the clinical roll-out, no delays have been due to errors that are automatically flagged by the PCT. Development continues to automate the remaining checks. Conclusion: With PCT, 57% of the physics plan checklist has been partially or fully automated. Treatment delays have declined since release of the PCT for clinical use. By tracking delays and errors, we have been able to measure the effectiveness of automating checks and are using this information to prioritize future development. This project was supported in part by P01CA059827.
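A minimal sketch of the prescription-dose consistency check described above, written in plain Python rather than the Eclipse Scripting API; the record fields and tolerance are assumptions for illustration only.

```python
# Illustrative prescription-dose check; record structures and tolerance are assumed.
def check_prescribed_dose(tps_plan, rv_course, tolerance_cgy=0.1):
    """Flag a mismatch between the planned total dose and the prescription
    recorded in the record-and-verify (R&V) system."""
    planned = tps_plan["dose_per_fraction_cgy"] * tps_plan["fractions"]
    prescribed = rv_course["prescribed_total_dose_cgy"]
    if abs(planned - prescribed) > tolerance_cgy:
        return f"FLAG: planned {planned} cGy != prescribed {prescribed} cGy"
    return "OK"

tps_plan = {"dose_per_fraction_cgy": 200.0, "fractions": 30}
rv_course = {"prescribed_total_dose_cgy": 6000.0}
print(check_prescribed_dose(tps_plan, rv_course))
```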
NASA Astrophysics Data System (ADS)
Kawka, O. E.; Nelson, J. S.; Manalang, D.; Kelley, D. S.
2016-02-01
The Cabled Array component of the NSF-funded Ocean Observatories Initiative (OOI) provides access to real-time physical, chemical, geological, and biological data from water column and seafloor platforms/instruments at sites spanning the southern half of the Juan de Fuca Plate. The Quality Assurance (QA) program for OOI data is designed to ensure that data products meet OOI science requirements. This overall data QA plan establishes the guidelines for assuring OOI data quality and summarizes Quality Control (QC) protocols and procedures, based on best practices, which can be utilized to ensure the highest quality data across the OOI program. This presentation will highlight, specifically, the QA/QC approach being utilized for the OOI Cabled Array infrastructure and data and will include a summary of both shipboard and shore-based protocols currently in use. Aspects addressed will be pre-deployment instrument testing and calibration checks, post-deployment and pre-recovery field verification of data, and post-recovery "as-found" testing of instruments. Examples of QA/QC data will be presented and specific cases of cabled data will be discussed in the context of quality assessments and adjustment/correction of OOI datasets overall for inherent sensor drift and/or instrument fouling.
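One routine adjustment mentioned above is correcting a dataset for sensor drift identified by a post-recovery "as-found" calibration check; a simplified linear-drift sketch follows, with all numbers invented for illustration.

```python
import numpy as np

def correct_linear_drift(times, values, t_deploy, t_recover, offset_at_recovery):
    """Remove a linear sensor drift inferred from an as-found post-recovery
    calibration check, assuming zero offset at deployment."""
    times = np.asarray(times, dtype=float)
    drift = offset_at_recovery * (times - t_deploy) / (t_recover - t_deploy)
    return np.asarray(values, dtype=float) - drift

t = np.array([0.0, 30.0, 60.0, 90.0])        # days since deployment
raw = np.array([7.95, 8.02, 8.10, 8.18])      # illustrative series drifting upward
print(correct_linear_drift(t, raw, t_deploy=0.0, t_recover=90.0, offset_at_recovery=0.20))
```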
21 CFR 226.58 - Laboratory controls.
Code of Federal Regulations, 2010 CFR
2010-04-01
... Laboratory controls. Laboratory controls shall include the establishment of adequate specifications and test... establishment of master records containing appropriate specifications and a description of the test procedures... necessary laboratory test procedures to check such specifications. (c) Assays which shall be made of...
A shared computer-based problem-oriented patient record for the primary care team.
Linnarsson, R; Nordgren, K
1995-01-01
1. INTRODUCTION. A computer-based patient record (CPR) system, Swedestar, has been developed for use in primary health care. The principal aim of the system is to support continuous quality improvement through improved information handling, improved decision-making, and improved procedures for quality assurance. The Swedestar system has evolved during a ten-year period beginning in 1984. 2. SYSTEM DESIGN. The design philosophy is based on the following key factors: a shared, problem-oriented patient record; structured data entry based on an extensive controlled vocabulary; advanced search and query functions, where the query language has the most important role; integrated decision support for drug prescribing and care protocols and guidelines; integrated procedures for quality assurance. 3. A SHARED PROBLEM-ORIENTED PATIENT RECORD. The core of the CPR system is the problem-oriented patient record. All problems of one patient, recorded by different members of the care team, are displayed on the problem list. Starting from this list, a problem follow-up can be made, one problem at a time or for several problems simultaneously. Thus, it is possible to get an integrated view, across provider categories, of those problems of one patient that belong together. This shared problem-oriented patient record provides an important basis for the primary care team work. 4. INTEGRATED DECISION SUPPORT. The decision support of the system includes a drug prescribing module and a care protocol module. The drug prescribing module is integrated with the patient records and includes an on-line check of the patient's medication list for potential interactions and data-driven reminders concerning major drug problems. Care protocols have been developed for the most common chronic diseases, such as asthma, diabetes, and hypertension. The patient records can be automatically checked according to the care protocols. 5. PRACTICAL EXPERIENCE. The Swedestar system has been implemented in a primary care area with 30,000 inhabitants. It is being used by all the primary care team members: 15 general practitioners, 25 district nurses, and 10 physiotherapists. Several years of practical experience of the CPR system shows that it has a positive impact on quality of care on four levels: 1) improved clinical follow-up of individual patients; 2) facilitated follow-up of aggregated data such as practice activity analysis, annual reports, and clinical indicators; 3) automated medical audit; and 4) concurrent audit. Within that primary care area, quality of care has improved substantially in several aspects due to the use of the CPR system [1].
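A toy sketch of an on-line interaction check of the kind described for the drug prescribing module; the interaction table and drug names are invented for illustration and are not Swedestar content.

```python
# Illustrative interaction check against a patient's medication list.
INTERACTIONS = {
    frozenset({"warfarin", "aspirin"}): "increased bleeding risk",
    frozenset({"simvastatin", "clarithromycin"}): "myopathy risk",
}

def check_new_prescription(new_drug, medication_list):
    warnings = []
    for current in medication_list:
        note = INTERACTIONS.get(frozenset({new_drug.lower(), current.lower()}))
        if note:
            warnings.append(f"{new_drug} + {current}: {note}")
    return warnings

print(check_new_prescription("Aspirin", ["Warfarin", "Metformin"]))
```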
2013-01-01
Background Limited information has been published regarding standard quality assurance (QA) procedures for stroke registries. We share our experience regarding the establishment of enhanced QA procedures for the University of Texas Houston Stroke Registry (UTHSR) and evaluate whether these QA procedures have improved data quality in UTHSR. Methods All 5093 patient records that were abstracted and entered in UTHSR, between January 1, 2008 and December 31, 2011, were considered in this study. We conducted reliability and validity studies. For reliability and validity of data captured by abstractors, a random subset of 30 records was used for re-abstraction of select key variables by two abstractors. These 30 records were re-abstracted by a team of experts that included a vascular neurologist clinician as the “gold standard”. We assessed inter-rater reliability (IRR) between the two abstractors as well as validity of each abstractor with the “gold standard”. Depending on the scale of variables, IRR was assessed with Kappa or intra-class correlations (ICC) using a 2-way, random effects ANOVA. For assessment of validity of data in UTHSR we re-abstracted another set of 85 patient records for which all discrepant entries were adjudicated by a vascular neurology fellow clinician and added to the set of our “gold standard”. We assessed level of agreement between the registry data and the “gold standard” as well as sensitivity and specificity. We used logistic regression to compare error rates for different years to assess whether a significant improvement in data quality has been achieved during 2008–2011. Results The error rate dropped significantly, from 4.8% in 2008 to 2.2% in 2011 (P < 0.001). The two abstractors had an excellent IRR (Kappa or ICC ≥ 0.75) on almost all key variables checked. Agreement between data in UTHSR and the “gold standard” was excellent for almost all categorical and continuous variables. Conclusions Establishment of a rigorous data quality assurance for our UTHSR has helped to improve the validity of data. We observed an excellent IRR between the two abstractors. We recommend training of chart abstractors and systematic assessment of IRR between abstractors and validity of the abstracted data in stroke registries. PMID:23767957
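For readers who want the agreement arithmetic spelled out, the sketch below computes sensitivity and specificity of registry entries against an adjudicated gold standard; the data are invented and do not come from UTHSR.

```python
# Illustrative sensitivity/specificity of registry flags against a gold standard.
def sensitivity_specificity(registry, gold):
    tp = sum(r and g for r, g in zip(registry, gold))
    tn = sum((not r) and (not g) for r, g in zip(registry, gold))
    fp = sum(r and (not g) for r, g in zip(registry, gold))
    fn = sum((not r) and g for r, g in zip(registry, gold))
    return tp / (tp + fn), tn / (tn + fp)

registry_flag = [1, 1, 0, 0, 1, 0, 1, 0, 0, 1]   # e.g. a binary item per record
gold_standard = [1, 1, 0, 0, 1, 0, 0, 0, 1, 1]
sens, spec = sensitivity_specificity(registry_flag, gold_standard)
print(f"sensitivity={sens:.2f}, specificity={spec:.2f}")
```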
NASA Astrophysics Data System (ADS)
Raghavan, Ajay; Saha, Bhaskar
2013-03-01
Photo enforcement devices for traffic rules such as red lights, toll, stops, and speed limits are increasingly being deployed in cities and counties around the world to ensure smooth traffic flow and public safety. These are typically unattended fielded systems, and so it is important to periodically check them for potential image/video quality problems that might interfere with their intended functionality. There is interest in automating such checks to reduce the operational overhead and human error involved in manually checking large camera device fleets. Examples of problems affecting such camera devices include exposure issues, focus drifts, obstructions, misalignment, download errors, and motion blur. Furthermore, in some cases, in addition to the sub-algorithms for individual problems, one also has to carefully design the overall algorithm and logic to check for and accurately classify these individual problems. Some of these issues can occur in tandem or have the potential to be confused for each other by automated algorithms. Examples include camera misalignment that can cause some scene elements to go out of focus for wide-area scenes or download errors that can be misinterpreted as an obstruction. Therefore, the sequence in which the sub-algorithms are utilized is also important. This paper presents an overview of these problems along with no-reference and reduced-reference image and video quality solutions to detect and classify such faults.
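As one example of a no-reference check for focus problems, the sketch below uses the variance of a discrete Laplacian as a sharpness proxy; this is a generic illustration, not the algorithm used by the authors.

```python
import numpy as np

def focus_metric(gray_image):
    """Variance of a 4-neighbour Laplacian; low values suggest a blurred or
    out-of-focus frame (a common no-reference sharpness proxy)."""
    g = np.asarray(gray_image, dtype=float)
    lap = (g[:-2, 1:-1] + g[2:, 1:-1] + g[1:-1, :-2] + g[1:-1, 2:]
           - 4.0 * g[1:-1, 1:-1])
    return lap.var()

rng = np.random.default_rng(0)
sharp = rng.integers(0, 255, size=(120, 160)).astype(float)
# Crude smoothing to mimic a defocused frame.
blurred = 0.25 * (sharp[:-1, :-1] + sharp[1:, :-1] + sharp[:-1, 1:] + sharp[1:, 1:])
print(focus_metric(sharp) > focus_metric(blurred))   # True: sharp frame scores higher
```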
RF Conditioning and Testing of Fundamental Power Couplers for SNS Superconducting Cavity Production
DOE Office of Scientific and Technical Information (OSTI.GOV)
M. Stirbet; G.K. Davis; M. A. Drury
The Spallation Neutron Source (SNS) makes use of 33 medium beta (0.61) and 48 high beta (0.81) superconducting cavities. Each cavity is equipped with a fundamental power coupler, which should withstand the full klystron power of 550 kW in full reflection for the duration of an RF pulse of 1.3 msec at 60 Hz repetition rate. Before assembly to a superconducting cavity, the vacuum components of the coupler are submitted to acceptance procedures consisting of preliminary quality assessments, cleaning and clean room assembly, vacuum leak checks and baking under vacuum, followed by conditioning and RF high power testing. Similar acceptance procedures (except clean room assembly and baking) were applied for the airside components of the coupler. All 81 fundamental power couplers for SNS superconducting cavity production have been RF power tested at JLAB Newport News and, beginning in April 2004, at SNS Oak Ridge. This paper gives details of coupler processing and RF high power-assessed performances.
[New context for the Individual Healthcare Professions Act (BIG law)].
Sijmons, Jaap G; Winter, Heinrich B; Hubben, Joep H
2014-01-01
In 2013 the Dutch Individual Healthcare Professions Act (known as the BIG law) was evaluated for the second time. The research showed that patients have limited awareness of the registration of healthcare professionals and that the system of reserved procedures is almost unknown. On the other hand, healthcare institutions (especially hospitals) frequently check the register, as do healthcare insurance companies when contracting institutions. Knowledge of the reserved procedures system is moderate amongst professionals too, while the organisation of care is to a great extent based on this system. Since the change of system in 2006 quality assurance in professional practice has been much more rooted in the internal structure of care; in this way, the BIG law did not go the way the legislator intended. According to the researchers, this has not prevented the BIG law from still playing an essential function. Indeed, the BIG law has not reached its final destination, but it may reach its goal via another route.
Morisse Pradier, H; Sénéchal, A; Philit, F; Tronc, F; Maury, J-M; Grima, R; Flamens, C; Paulus, S; Neidecker, J; Mornex, J-F
2016-02-01
Lung transplantation (LT) is now considered as an excellent treatment option for selected patients with end-stage pulmonary diseases, such as COPD, cystic fibrosis, idiopathic pulmonary fibrosis, and pulmonary arterial hypertension. The 2 goals of LT are to provide a survival benefit and to improve quality of life. The 3-step decision process leading to LT is discussed in this review. The first step is the selection of candidates, which requires a careful examination in order to check absolute and relative contraindications. The second step is the timing of listing for LT; it requires the knowledge of disease-specific prognostic factors available in international guidelines, and discussed in this paper. The third step is the choice of procedure: indications of heart-lung, single-lung, and bilateral-lung transplantation are described. In conclusion, this document provides guidelines to help pulmonologists in the referral and selection processes of candidates for transplantation in order to optimize the outcome of LT. Copyright © 2014 Elsevier Masson SAS. All rights reserved.
Signal processing and calibration procedures for in situ diode-laser absorption spectroscopy.
Werle, P W; Mazzinghi, P; D'Amato, F; De Rosa, M; Maurer, K; Slemr, F
2004-07-01
Gas analyzers based on tunable diode-laser spectroscopy (TDLS) provide high sensitivity, fast response and highly specific in situ measurements of several atmospheric trace gases simultaneously. Under optimum conditions, even shot-noise-limited performance can be obtained. For field applications outside the laboratory, practical limitations become important. At ambient mixing ratios below a few parts per billion, spectrometers become more and more sensitive to noise, interference, drift effects and background changes associated with low-level signals. It is the purpose of this review to address some of the problems which are encountered at these low levels and to describe a signal processing strategy for trace gas monitoring and a concept for in situ system calibration applicable to tunable diode-laser spectroscopy. To meet the requirements of quality assurance for field measurements and monitoring applications, procedures to check linearity according to International Organization for Standardization (ISO) regulations are described, and some measurements of calibration functions are presented and discussed.
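A minimal sketch of a linearity check of the kind described: fit the response measured at several reference mixing ratios and verify that the residuals stay within a tolerance. The reference values and tolerance below are illustrative, not ISO-prescribed numbers.

```python
import numpy as np

# Illustrative linearity check on analyzer response vs. reference mixing ratios.
reference_ppb = np.array([0.0, 2.0, 5.0, 10.0, 20.0])
measured_ppb  = np.array([0.1, 2.1, 4.9, 10.2, 19.8])

slope, intercept = np.polyfit(reference_ppb, measured_ppb, 1)
residuals = measured_ppb - (slope * reference_ppb + intercept)
print(f"slope={slope:.3f}, intercept={intercept:.3f}")
print("linearity OK" if np.all(np.abs(residuals) < 0.3) else "linearity check failed")
```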
40 CFR 86.342-79 - Post-test procedures.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 40 Protection of Environment 19 2013-07-01 2013-07-01 false Post-test procedures. 86.342-79... Post-test procedures. (a) Begin a hang-up check within 30 seconds of the completion of the last mode in... does not meet the requirements of § 86.328 the test is void. (d) Read and record the post-test data...
40 CFR 86.342-79 - Post-test procedures.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 40 Protection of Environment 19 2012-07-01 2012-07-01 false Post-test procedures. 86.342-79... Post-test procedures. (a) Begin a hang-up check within 30 seconds of the completion of the last mode in... does not meet the requirements of § 86.328 the test is void. (d) Read and record the post-test data...
40 CFR 91.411 - Post-test analyzer procedures.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Post-test analyzer procedures. 91.411... Post-test analyzer procedures. (a) Perform a hang-up check within 60 seconds of the completion of the... and record the post-test data specified in § 91.405(e). (e) For a valid test, the analyzer drift...
40 CFR 91.411 - Post-test analyzer procedures.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 20 2011-07-01 2011-07-01 false Post-test analyzer procedures. 91.411... Post-test analyzer procedures. (a) Perform a hang-up check within 60 seconds of the completion of the... and record the post-test data specified in § 91.405(e). (e) For a valid test, the analyzer drift...
40 CFR 91.411 - Post-test analyzer procedures.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 40 Protection of Environment 20 2014-07-01 2013-07-01 true Post-test analyzer procedures. 91.411... Post-test analyzer procedures. (a) Perform a hang-up check within 60 seconds of the completion of the... and record the post-test data specified in § 91.405(e). (e) For a valid test, the analyzer drift...
40 CFR 91.411 - Post-test analyzer procedures.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 40 Protection of Environment 21 2012-07-01 2012-07-01 false Post-test analyzer procedures. 91.411... Post-test analyzer procedures. (a) Perform a hang-up check within 60 seconds of the completion of the... and record the post-test data specified in § 91.405(e). (e) For a valid test, the analyzer drift...
40 CFR 86.342-79 - Post-test procedures.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 18 2010-07-01 2010-07-01 false Post-test procedures. 86.342-79... Post-test procedures. (a) Begin a hang-up check within 30 seconds of the completion of the last mode in... does not meet the requirements of § 86.328 the test is void. (d) Read and record the post-test data...
40 CFR 86.342-79 - Post-test procedures.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 18 2011-07-01 2011-07-01 false Post-test procedures. 86.342-79... Post-test procedures. (a) Begin a hang-up check within 30 seconds of the completion of the last mode in... does not meet the requirements of § 86.328 the test is void. (d) Read and record the post-test data...
40 CFR 91.411 - Post-test analyzer procedures.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 40 Protection of Environment 21 2013-07-01 2013-07-01 false Post-test analyzer procedures. 91.411... Post-test analyzer procedures. (a) Perform a hang-up check within 60 seconds of the completion of the... and record the post-test data specified in § 91.405(e). (e) For a valid test, the analyzer drift...
49 CFR 40.65 - What does the collector check for when the employee presents a specimen?
Code of Federal Regulations, 2013 CFR
2013-10-01
... PROCEDURES FOR TRANSPORTATION WORKPLACE DRUG AND ALCOHOL TESTING PROGRAMS Urine Specimen Collections § 40.65.... You must check to ensure that the specimen contains at least 45 mL of urine. (1) If it does not, you... of tampering) also exists. (3) You are never permitted to combine urine collected from separate voids...
49 CFR 40.65 - What does the collector check for when the employee presents a specimen?
Code of Federal Regulations, 2011 CFR
2011-10-01
... PROCEDURES FOR TRANSPORTATION WORKPLACE DRUG AND ALCOHOL TESTING PROGRAMS Urine Specimen Collections § 40.65.... You must check to ensure that the specimen contains at least 45 mL of urine. (1) If it does not, you... of tampering) also exists. (3) You are never permitted to combine urine collected from separate voids...
49 CFR 40.65 - What does the collector check for when the employee presents a specimen?
Code of Federal Regulations, 2010 CFR
2010-10-01
... PROCEDURES FOR TRANSPORTATION WORKPLACE DRUG AND ALCOHOL TESTING PROGRAMS Urine Specimen Collections § 40.65.... You must check to ensure that the specimen contains at least 45 mL of urine. (1) If it does not, you... of tampering) also exists. (3) You are never permitted to combine urine collected from separate voids...
49 CFR 40.65 - What does the collector check for when the employee presents a specimen?
Code of Federal Regulations, 2014 CFR
2014-10-01
... PROCEDURES FOR TRANSPORTATION WORKPLACE DRUG AND ALCOHOL TESTING PROGRAMS Urine Specimen Collections § 40.65.... You must check to ensure that the specimen contains at least 45 mL of urine. (1) If it does not, you... of tampering) also exists. (3) You are never permitted to combine urine collected from separate voids...
49 CFR 40.65 - What does the collector check for when the employee presents a specimen?
Code of Federal Regulations, 2012 CFR
2012-10-01
... PROCEDURES FOR TRANSPORTATION WORKPLACE DRUG AND ALCOHOL TESTING PROGRAMS Urine Specimen Collections § 40.65.... You must check to ensure that the specimen contains at least 45 mL of urine. (1) If it does not, you... of tampering) also exists. (3) You are never permitted to combine urine collected from separate voids...
Normality Tests for Statistical Analysis: A Guide for Non-Statisticians
Ghasemi, Asghar; Zahediasl, Saleh
2012-01-01
Statistical errors are common in scientific literature and about 50% of the published articles have at least one error. The assumption of normality needs to be checked for many statistical procedures, namely parametric tests, because their validity depends on it. The aim of this commentary is to overview checking for normality in statistical analysis using SPSS. PMID:23843808
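The commentary itself works in SPSS; as a rough analogue only, the same Shapiro-Wilk test can be run in Python with SciPy, as sketched below (simulated data, conventional 0.05 cut-off assumed).

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
sample = rng.normal(loc=5.0, scale=1.2, size=40)   # simulated, roughly normal data

stat, p_value = stats.shapiro(sample)               # Shapiro-Wilk normality test
print(f"W={stat:.3f}, p={p_value:.3f}")
print("no evidence against normality" if p_value > 0.05 else "normality rejected")
```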
Sex Differences in Health Care Requirements Aboard U.S. Navy Ships
1990-03-20
nervous system symptoms (almost entirely headache), and then psychological symptoms (tension, nervousness). After that point, genitourinary problems...variable in accordance with procedures described by Lilienfeld and Lilienfeld .6 In those occupational specialties in which the confidence intervals do not...services such as inoculation, physical examination (e.g., check in, check out, reenlistment), pregnancy test , birth control prescription, Pap test
Bayesian model checking: A comparison of tests
NASA Astrophysics Data System (ADS)
Lucy, L. B.
2018-06-01
Two procedures for checking Bayesian models are compared using a simple test problem based on the local Hubble expansion. Over four orders of magnitude, p-values derived from a global goodness-of-fit criterion for posterior probability density functions agree closely with posterior predictive p-values. The former can therefore serve as an effective proxy for the difficult-to-calculate posterior predictive p-values.
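For reference, one standard definition of the posterior predictive p-value is written out below; the notation is assumed here, and the paper's particular discrepancy measure is not reproduced.

```latex
% Posterior predictive p-value for a discrepancy measure T (assumed notation):
p_{\mathrm{post}} \;=\; \Pr\!\left( T(y^{\mathrm{rep}}, \theta) \ge T(y, \theta) \,\middle|\, y \right)
\;=\; \int \Pr\!\left( T(y^{\mathrm{rep}}, \theta) \ge T(y, \theta) \,\middle|\, \theta \right) p(\theta \mid y)\, d\theta .
```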
40 CFR 243.202-3 - Recommended procedures: Operations.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 40 Protection of Environment 25 2014-07-01 2014-07-01 false Recommended procedures: Operations. 243.202-3 Section 243.202-3 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED... receive periodic vehicle safety checks, including, but not limited to, inspection of brakes, windshield...
40 CFR 243.202-3 - Recommended procedures: Operations.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 24 2010-07-01 2010-07-01 false Recommended procedures: Operations. 243.202-3 Section 243.202-3 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED... receive periodic vehicle safety checks, including, but not limited to, inspection of brakes, windshield...
40 CFR 243.202-3 - Recommended procedures: Operations.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 40 Protection of Environment 26 2013-07-01 2013-07-01 false Recommended procedures: Operations. 243.202-3 Section 243.202-3 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED... receive periodic vehicle safety checks, including, but not limited to, inspection of brakes, windshield...
40 CFR 243.202-3 - Recommended procedures: Operations.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 25 2011-07-01 2011-07-01 false Recommended procedures: Operations. 243.202-3 Section 243.202-3 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED... receive periodic vehicle safety checks, including, but not limited to, inspection of brakes, windshield...
32 CFR 242.5 - Admission procedures.
Code of Federal Regulations, 2012 CFR
2012-07-01
... HEALTH SCIENCES § 242.5 Admission procedures. (a) Application—(1) Civilians. Civilians seeking admission..., physical examinations, and National Agency Checks, as required, consistent with § 242.4(a)(5)) to determine whether or not the selected candidates are acceptable for commissioning. (Physical examinations for...
32 CFR 242.5 - Admission procedures.
Code of Federal Regulations, 2011 CFR
2011-07-01
... HEALTH SCIENCES § 242.5 Admission procedures. (a) Application—(1) Civilians. Civilians seeking admission..., physical examinations, and National Agency Checks, as required, consistent with § 242.4(a)(5)) to determine whether or not the selected candidates are acceptable for commissioning. (Physical examinations for...
32 CFR 242.5 - Admission procedures.
Code of Federal Regulations, 2014 CFR
2014-07-01
... HEALTH SCIENCES § 242.5 Admission procedures. (a) Application—(1) Civilians. Civilians seeking admission..., physical examinations, and National Agency Checks, as required, consistent with § 242.4(a)(5)) to determine whether or not the selected candidates are acceptable for commissioning. (Physical examinations for...
32 CFR 242.5 - Admission procedures.
Code of Federal Regulations, 2013 CFR
2013-07-01
... HEALTH SCIENCES § 242.5 Admission procedures. (a) Application—(1) Civilians. Civilians seeking admission..., physical examinations, and National Agency Checks, as required, consistent with § 242.4(a)(5)) to determine whether or not the selected candidates are acceptable for commissioning. (Physical examinations for...
Crew procedures development techniques
NASA Technical Reports Server (NTRS)
Arbet, J. D.; Benbow, R. L.; Hawk, M. L.; Mangiaracina, A. A.; Mcgavern, J. L.; Spangler, M. C.
1975-01-01
The study developed requirements, designed, developed, checked out and demonstrated the Procedures Generation Program (PGP). The PGP is a digital computer program which provides a computerized means of developing flight crew procedures based on crew action in the shuttle procedures simulator. In addition, it provides a real time display of procedures, difference procedures, performance data and performance evaluation data. Reconstruction of displays is possible post-run. Data may be copied, stored on magnetic tape and transferred to the document processor for editing and documentation distribution.
Bel-Peña, N; Mérida-de la Torre, F J
2015-01-01
To check whether an intervention based on direct observation and complementary information to nurses helps reduce haemolysis when drawing blood specimens. Random sampling study in primary care centres in the Serranía de Málaga health management area, using a cross-sectional, longitudinal pre- and post-intervention design. The study period was from August 2012 to January 2015. The level of free haemoglobin was measured by direct spectrophotometry in the specimens extracted. It was then checked whether the intervention influenced the level of haemolysis, and whether this was maintained over time. The mean haemolysis measured pre-intervention was 17%, and after the intervention it was 6.1%. A year later, and under the same conditions, the frequency of haemolysis was measured again in the samples analysed, and the percentage was 9%. These results are low when compared to the level obtained pre-intervention, but are higher when compared to the levels obtained immediately after the intervention. The transport and analysis conditions were the same. An intervention based on direct and informative observation of the process of collecting blood samples contributes significantly to reducing the level of haemolysis. This effect is maintained over time. The intervention needs to be repeated to maintain its effectiveness. Audits and continuing education programmes are useful for quality assurance procedures and maintain the level of care needed for good quality of care. Copyright © 2015 SECA. Published by Elsevier Espana. All rights reserved.
Building Single-Cell Models of Planktonic Metabolism Using PSAMM
NASA Astrophysics Data System (ADS)
Dufault-Thompson, K.; Zhang, Y.; Steffensen, J. L.
2016-02-01
Genome-scale models (GEMs) of metabolic networks simulate the metabolic activities of individual cells by integrating omics data with biochemical and physiological measurements. GEMs have been applied to the simulation of various photo-, chemo-, and heterotrophic organisms and provide significant insights into the function and evolution of planktonic cells. Despite the quick accumulation of GEMs, challenges remain in assembling the individual cell-based models into community-level models. Among various problems, the lack of consistency in model representation and model quality checking has hindered the integration of individual GEMs and can lead to erroneous conclusions in the development of new modeling algorithms. Here, we present a Portable System for the Analysis of Metabolic Models (PSAMM). Along with the software, a novel format of model representation was developed to enhance the readability of model files and permit the inclusion of heterogeneous, model-specific annotation information. A number of quality checking procedures were also implemented in PSAMM to ensure stoichiometric balance and to identify unused reactions. Using a case study of Shewanella piezotolerans WP3, we demonstrated the application of PSAMM in simulating the coupling of carbon utilization and energy production pathways under low-temperature and high-pressure stress. Applying PSAMM, we have also analyzed over 50 GEMs in the current literature and released an updated collection of the models with corrections for a number of common inconsistencies. Overall, PSAMM opens up new opportunities for integrating individual GEMs for the construction and mathematical simulation of community-level models in the scope of entire ecosystems.
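A minimal sketch of a stoichiometric-balance check of the sort such QA procedures perform: every element consumed must equal the amount produced. The formulas and reaction below are textbook examples, not entries from a specific GEM.

```python
# Element counts per metabolite (textbook values, for illustration only).
ELEMENTS = {
    "glc": {"C": 6, "H": 12, "O": 6},   # glucose
    "o2":  {"O": 2},
    "co2": {"C": 1, "O": 2},
    "h2o": {"H": 2, "O": 1},
}

def is_balanced(reaction):
    """reaction: dict of metabolite -> stoichiometric coefficient
    (negative = consumed, positive = produced)."""
    totals = {}
    for met, coeff in reaction.items():
        for element, count in ELEMENTS[met].items():
            totals[element] = totals.get(element, 0) + coeff * count
    return all(abs(total) < 1e-9 for total in totals.values())

respiration = {"glc": -1, "o2": -6, "co2": 6, "h2o": 6}
print(is_balanced(respiration))   # True: C, H and O all balance
```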
NASA Astrophysics Data System (ADS)
Chiong, W. L.; Omar, A. F.
2017-07-01
A non-destructive technique based on visible (VIS) spectroscopy, using a light-emitting diode (LED) as the light source, was used to evaluate the internal quality of mango fruit. The objective of this study was to investigate the feasibility of white LEDs as lighting in spectroscopic instrumentation to predict the acidity and soluble solids content of intact Sala mango. The reflectance spectra of the mango samples were obtained and measured in the visible range (400-700 nm) using VIS spectroscopy illuminated under different white LEDs and a tungsten-halogen lamp (pro lamp). Regression models were developed by multiple linear regression to establish the relationship between the spectra and internal quality. A direct calibration transfer procedure was then applied between the master and slave lighting to check the acidity prediction results after transfer. Determination of mango acidity under white LED lighting was successfully performed through VIS spectroscopy using multiple linear regression, but not for soluble solids content. Satisfactory results were obtained for calibration transfer between LEDs with different correlated colour temperatures, indicating that this technique can be used in spectroscopic measurements between two similar light sources for predicting the internal quality of mango.
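A simple slope-and-bias calibration transfer between a master and a slave light source can be sketched as follows; the prediction values are invented, and this is not the exact transfer procedure used in the study.

```python
import numpy as np

# Map predictions made under the slave LED onto the master scale using a few
# transfer samples measured under both light sources (all numbers invented).
acid_master = np.array([0.21, 0.28, 0.35, 0.42, 0.50])   # predictions under master lighting
acid_slave  = np.array([0.25, 0.31, 0.40, 0.46, 0.56])   # same samples under slave lighting

slope, bias = np.polyfit(acid_slave, acid_master, 1)

def transfer(slave_prediction):
    """Correct a slave-lighting prediction to the master scale."""
    return slope * slave_prediction + bias

print(round(transfer(0.38), 3))
```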
Educational quality of YouTube videos on knee arthrocentesis.
Fischer, Jonas; Geurts, Jeroen; Valderrabano, Victor; Hügle, Thomas
2013-10-01
Knee arthrocentesis is a commonly performed diagnostic and therapeutic procedure in rheumatology and orthopedic surgery. Classic teaching of arthrocentesis skills relies on hands-on practice under supervision. Video-based online teaching is an increasingly utilized educational tool in higher and clinical education. YouTube is a popular video-sharing Web site that can be accessed as a teaching source. The objective of this study was to assess the educational value of YouTube videos on knee arthrocentesis posted by health professionals and institutions during the period from 2008 to 2012. The YouTube video database was systematically searched using 5 search terms related to knee arthrocentesis. Two independent clinical reviewers assessed videos for procedural technique and educational value using a 5-point global score, ranging from 1 = poor quality to 5 = excellent educational quality. As validated international guidelines are lacking, we used the guidelines of the Swiss Society of Rheumatology as the criterion standard for the procedure. Of more than a thousand findings, 13 videos met the inclusion criteria. Of those, 2 contained additional animated video material: one was purely animated, and one was a check list. The average length was 3.31 ± 2.28 minutes. The most popular video had 1388 hits per month. Our mean global score for educational value was 3.1 ± 1.0. Eight videos (62%) were considered useful for teaching purposes. Use of a "no-touch" procedure, meaning that once disinfected the skin remains untouched before needle penetration, was present in all videos. Six videos (46%) demonstrated full sterile conditions. There was no clear preference of a medial (n = 8) versus lateral (n = 5) approach. A limited number of YouTube videos on knee arthrocentesis appeared to be suitable for application in a Web-based format for medical students, fellows, and residents. The low average mean global score for overall educational value suggests that improvement of future video-based instructional materials on YouTube would be necessary before regular use for teaching could be recommended.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mestrovic, Ante; Chitsazzadeh, Shadi; Wells, Derek
2016-08-15
Purpose: To develop a highly sensitive patient specific QA procedure for gated VMAT stereotactic ablative radiotherapy (SABR) treatments. Methods: A platform was constructed to attach the translational stage of a Quasar respiratory motion phantom to a pinpoint ion chamber insert and move the ion chamber inside the ArcCheck. The Quasar phantom controller uses a patient-specific breathing pattern to translate the ion chamber in a superior-inferior direction inside the ArcCheck. With this system the ion chamber is used to QA the correct phase of the gated delivery and the ArcCheck diodes are used to QA the overall dose distribution. This novel approach requires a single plan delivery for a complete QA of a gated plan. The sensitivity of the gating QA procedure was investigated with respect to the following parameters: PTV size, exhale duration, baseline drift, gating window size. Results: The difference between the measured dose to a point in the penumbra and the Eclipse calculated dose was under 2% for small residual motions. The QA procedure was independent of PTV size and duration of exhale. Baseline drift and gating window size, however, significantly affected the penumbral dose measurement, with differences of up to 30% compared to Eclipse. Conclusion: This study described a highly sensitive QA procedure for gated VMAT SABR treatments. The QA outcome was dependent on the gating window size and baseline drift. Analysis of additional patient breathing patterns is currently under way to determine a clinically relevant gating window size and an appropriate tolerance level for this procedure.
Impact of dose calibrators quality control programme in Argentina
NASA Astrophysics Data System (ADS)
Furnari, J. C.; de Cabrejas, M. L.; del C. Rotta, M.; Iglicki, F. A.; Milá, M. I.; Magnavacca, C.; Dima, J. C.; Rodríguez Pasqués, R. H.
1992-02-01
The national Quality Control (QC) programme for radionuclide calibrators started 12 years ago. Accuracy and the implementation of a QC programme were evaluated over all these years at 95 nuclear medicine laboratories where dose calibrators were in use. During all that time, the Metrology Group of CNEA has distributed 137Cs sealed sources to check stability and has been performing periodic "checking rounds" and postal surveys using unknown samples (external quality control). An account of the results of both methods is presented. At present, more than 65% of the dose calibrators measure activities with an error less than 10%.
Dalley, Jessica S; McMurtry, C Meghan
2016-01-01
Background. Pediatric medical information provision literature focuses on hospitalization and surgical procedures, but children would also benefit from an educational program regarding more commonly experienced medical procedures (e.g., needles, general check-up). Objective. To determine whether an evidence-based educational program reduced children's ratings of fear of and expected pain from medical stimuli and increased their knowledge of procedural coping strategies. Methods. An educational, interactive, developmentally appropriate Teddy Bear Clinic Tour was developed and delivered at a veterinary clinic. During this tour, 71 5-10-year-old children (mean age = 6.62 years, SD = 1.19) were taught about medical equipment, procedures, and coping strategies through modelling and rehearsal. In a single-group, pretest-posttest design, participants reported their fear of and expected pain from medical and nonmedical stimuli. Children were also asked to report strategies they would use to cope with procedural fear. Results. Children's ratings for expected pain during a needle procedure were reduced following the intervention. No significant change occurred in children's fear of needles. Children reported more intervention-taught coping strategies at Time 2. Conclusions. The results of this study suggest that an evidence-based, interactive educational program can reduce young children's expectations of needle pain and may help teach them procedural coping strategies.
Anti Rohumaa; Christopher G. Hunt; Mark Hughes; Charles R. Frihart; Janne Logren
2013-01-01
During the rotary peeling of veneer for plywood or laminated veneer lumber manufacture, checks are formed in the veneer that are as deep as 70-80% of the veneer thickness. The results of this study show that, during adhesive bond testing, deep lathe checks in birch (Betula pendula Roth.) veneer significantly reduce the shear strength and the...
Quality Control of Meteorological Observations
NASA Technical Reports Server (NTRS)
Collins, William; Dee, Dick; Rukhovets, Leonid
1999-01-01
For the first time, a problem of meteorological observation quality control (QC) was formulated by L.S. Gandin at the Main Geophysical Observatory in the 1970s. Later, in 1988, L.S. Gandin began adapting his ideas on complex quality control (CQC) to the operational environment at the National Centers for Environmental Prediction. The CQC was first applied by L.S. Gandin and his colleagues to the detection and correction of errors in rawinsonde heights and temperatures using a complex of hydrostatic residuals. Later, a full complex of residuals, vertical and horizontal optimal interpolations, and baseline checks were added for the checking and correction of a wide range of meteorological variables. Some of Gandin's other ideas were applied and substantially developed at other meteorological centers. A new statistical QC was recently implemented in the Goddard Data Assimilation System. The central component of any quality control is a buddy check, which is a test of individual suspect observations against available nearby non-suspect observations. A novel feature of this test is that the error variances which are used for QC decisions are re-estimated on-line. As a result, the allowed tolerances for suspect observations can depend on local atmospheric conditions. The system is then better able to accept extreme values observed in deep cyclones, jet streams and so on. The basic statements of this adaptive buddy check are described. Some results of the on-line QC, including moisture QC, are presented.
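A much simplified, illustrative version of a buddy check is sketched below: an observation is flagged when it departs from the mean of nearby non-suspect observations by more than k times a locally estimated spread. The floor value and k are assumptions, not the operational settings.

```python
import numpy as np

def buddy_check(obs_values, neighbour_values, k=3.0):
    """Flag observations that differ from the mean of nearby non-suspect
    observations by more than k times a locally estimated spread."""
    flags = []
    for value, buddies in zip(obs_values, neighbour_values):
        buddies = np.asarray(buddies, dtype=float)
        local_mean = buddies.mean()
        local_std = max(buddies.std(ddof=1), 0.5)   # floor keeps the test usable
        flags.append(abs(value - local_mean) > k * local_std)
    return flags

obs = [15.2, 22.9]                                   # two suspect temperatures
neighbours = [[14.8, 15.5, 15.1, 14.9],              # consistent with its buddies
              [15.0, 15.4, 14.7, 15.2]]              # far from its buddies -> flagged
print(buddy_check(obs, neighbours))                   # [False, True]
```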
40 CFR 243.202-3 - Recommended procedures: Operations.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 40 Protection of Environment 26 2012-07-01 2011-07-01 true Recommended procedures: Operations. 243.202-3 Section 243.202-3 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) SOLID... receive periodic vehicle safety checks, including, but not limited to, inspection of brakes, windshield...
DOT National Transportation Integrated Search
2005-09-01
This document describes a procedure for verifying a dynamic testing system (closed-loop servohydraulic). The procedure is divided into three general phases: (1) electronic system performance verification, (2) calibration check and overall system perf...
8 CFR 273.3 - Screening procedures.
Code of Federal Regulations, 2010 CFR
2010-01-01
... AT FOREIGN PORTS OF EMBARKATION; REDUCING, REFUNDING, OR WAIVING FINES UNDER SECTION 273 OF THE ACT... United States. (b) Procedures at ports of embarkation. At each port of embarkation carriers shall take... secondary information. (ii) Conducting a second check of passenger documents, when necessary at high-risk...
Pascucci, Simone; Bassani, Cristiana; Palombo, Angelo; Poscolieri, Maurizio; Cavalli, Rosa
2008-02-22
This paper describes a fast procedure for evaluating asphalt pavement surface defects using airborne emissivity data. To develop this procedure, we used airborne multispectral emissivity data covering an urban test area close to Venice (Italy). For this study, we first identified and selected the roads' asphalt pavements on Multispectral Infrared Visible Imaging Spectrometer (MIVIS) imagery using a segmentation procedure. Next, since surface defects in asphalt pavements are strictly related to the decrease of oily components, which causes an increase in the abundance of surfacing limestone, the diagnostic emissivity absorption feature of limestone at 11.2 μm was used to retrieve from the MIVIS emissivity data the areas exhibiting defects on the asphalt pavement surface. The results showed that MIVIS emissivity allows a threshold to be established that identifies the asphalt road sites on which a check for maintenance intervention is required. Therefore, this technique can supply local government authorities with an efficient, rapid and repeatable road mapping procedure providing the location of the asphalt pavements to be checked.
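As a toy illustration of the thresholding step, the sketch below flags road pixels whose 11.2 μm emissivity departs from an assumed fresh-asphalt reference by more than an assumed tolerance; none of the numbers reproduce the MIVIS analysis.

```python
import numpy as np

# Invented emissivity values and road mask, for illustration only.
emissivity_11_2um = np.array([[0.955, 0.948, 0.968],
                              [0.950, 0.972, 0.975]])
road_mask = np.array([[True, True, False],
                      [True, True, True]])           # from a prior road segmentation

ASPHALT_REFERENCE = 0.952                              # assumed reference emissivity
TOLERANCE = 0.015                                      # assumed allowable departure
needs_check = road_mask & (np.abs(emissivity_11_2um - ASPHALT_REFERENCE) > TOLERANCE)
print(needs_check)
```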
Adherence to balance tolerance limits at the Upper Mississippi Science Center, La Crosse, Wisconsin.
Myers, C.T.; Kennedy, D.M.
1998-01-01
Verification of balance accuracy entails applying a series of standard masses to a balance prior to use and recording the measured values. The recorded values for each standard should have lower and upper weight limits or tolerances that are accepted as verification of balance accuracy under normal operating conditions. Balance logbooks for seven analytical balances at the Upper Mississippi Science Center were checked over a 3.5-year period to determine if the recorded weights were within the established tolerance limits. A total of 9435 measurements were checked. There were 14 instances in which the balance malfunctioned and operators recorded a rationale in the balance logbook. Sixty-three recording errors were found. Twenty-eight operators were responsible for two types of recording errors: Measurements of weights were recorded outside of the tolerance limit but not acknowledged as an error by the operator (n = 40); and measurements were recorded with the wrong number of decimal places (n = 23). The adherence rate for following tolerance limits was 99.3%. To ensure the continued adherence to tolerance limits, the quality-assurance unit revised standard operating procedures to require more frequent review of balance logbooks.
Ertefaie, Ashkan; Shortreed, Susan; Chakraborty, Bibhas
2016-06-15
Q-learning is a regression-based approach that uses longitudinal data to construct dynamic treatment regimes, which are sequences of decision rules that use patient information to inform future treatment decisions. An optimal dynamic treatment regime is composed of a sequence of decision rules that indicate how to optimally individualize treatment using the patients' baseline and time-varying characteristics to optimize the final outcome. Constructing optimal dynamic regimes using Q-learning depends heavily on the assumption that regression models at each decision point are correctly specified; yet model checking in the context of Q-learning has been largely overlooked in the current literature. In this article, we show that residual plots obtained from standard Q-learning models may fail to adequately check the quality of the model fit. We present a modified Q-learning procedure that accommodates residual analyses using standard tools. We present simulation studies showing the advantage of the proposed modification over standard Q-learning. We illustrate this new Q-learning approach using data collected from a sequential multiple assignment randomized trial of patients with schizophrenia. Copyright © 2016 John Wiley & Sons, Ltd.
Helping You Choose Quality Ambulatory Care
Helping you choose: Quality ambulatory care When you need ambulatory care, you should find out some information to help you choose the best ... the center follows rules for patient safety and quality. Go to Quality Check® at www.qualitycheck.org ...
Helping You Choose Quality Hospice Care
Helping you choose: Quality hospice care When you need hospice care, you should find out some information to help you choose the best ... the service follows rules for patient safety and quality. Go to Quality Check® at www.qualitycheck.org ...
Chen, Guang-Pei; Ahunbay, Ergun; Li, X Allen
2016-04-01
To develop an integrated quality assurance (QA) software tool for online replanning capable of efficiently and automatically checking radiation treatment (RT) planning parameters and gross plan quality, verifying treatment plan data transfer from the treatment planning system (TPS) to the record and verify (R&V) system, performing a secondary monitor unit (MU) calculation with or without the presence of a magnetic field from an MR-Linac, and validating the delivery record consistency with the plan. The software tool, named ArtQA, was developed to obtain and compare plan and treatment parameters from both the TPS and the R&V system database. The TPS data are accessed via direct file reading and the R&V data are retrieved via open database connectivity and structured query language. Plan quality is evaluated with both the logical consistency of planning parameters and the achieved dose-volume histograms. Beams in the TPS and R&V system are matched based on geometry configurations. To consider the effect of a 1.5 T transverse magnetic field from the MR-Linac in the secondary MU calculation, a method based on a modified Clarkson integration algorithm was developed and tested for a series of clinical situations. ArtQA has been used in the authors' clinic and can quickly detect inconsistencies and deviations in the entire RT planning process. With the use of the ArtQA tool, the efficiency of plan checks, including plan quality, data transfer, and delivery checks, can be improved by at least 60%. The newly developed independent MU calculation tool for the MR-Linac reduces the difference between the planned and calculated MUs by 10%. The software tool ArtQA can be used to perform a comprehensive QA check from planning to delivery with a conventional Linac or MR-Linac and is an essential tool for online replanning, where the QA check needs to be performed rapidly.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Guang-Pei, E-mail: gpchen@mcw.edu; Ahunbay, Ergun; Li, X. Allen
Purpose: To develop an integrated quality assurance (QA) software tool for online replanning capable of efficiently and automatically checking radiation treatment (RT) planning parameters and gross plan quality, verifying treatment plan data transfer from the treatment planning system (TPS) to the record and verify (R&V) system, performing a secondary monitor unit (MU) calculation with or without the presence of a magnetic field from an MR-Linac, and validating the delivery record consistency with the plan. Methods: The software tool, named ArtQA, was developed to obtain and compare plan and treatment parameters from both the TPS and the R&V system database. The TPS data are accessed via direct file reading and the R&V data are retrieved via open database connectivity and structured query language. Plan quality is evaluated with both the logical consistency of planning parameters and the achieved dose-volume histograms. Beams in the TPS and R&V system are matched based on geometry configurations. To consider the effect of a 1.5 T transverse magnetic field from the MR-Linac in the secondary MU calculation, a method based on a modified Clarkson integration algorithm was developed and tested for a series of clinical situations. Results: ArtQA has been used in the authors' clinic and can quickly detect inconsistencies and deviations in the entire RT planning process. With the use of the ArtQA tool, the efficiency of plan checks, including plan quality, data transfer, and delivery checks, can be improved by at least 60%. The newly developed independent MU calculation tool for the MR-Linac reduces the difference between the planned and calculated MUs by 10%. Conclusions: The software tool ArtQA can be used to perform a comprehensive QA check from planning to delivery with a conventional Linac or MR-Linac and is an essential tool for online replanning, where the QA check needs to be performed rapidly.
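The core of such a consistency check is a field-by-field comparison of beam parameters retrieved from the two systems. The following sketch shows the comparison step only, with hypothetical field names and an illustrative MU tolerance; it is not ArtQA's actual schema or its database access code (which uses direct file reading and ODBC/SQL).

```python
def compare_beam_parameters(tps_beams, rv_beams, mu_tolerance=0.5):
    """Compare beams exported from a TPS with those stored in an R&V system.

    Beams are matched by name; monitor units are compared against a
    tolerance and the other fields must match exactly. The field names and
    tolerance are illustrative assumptions, not ArtQA's real data model.
    """
    discrepancies = []
    for name, tps in tps_beams.items():
        rv = rv_beams.get(name)
        if rv is None:
            discrepancies.append(f"{name}: missing in R&V system")
            continue
        for key in ("gantry_angle", "collimator_angle", "energy"):
            if tps[key] != rv[key]:
                discrepancies.append(f"{name}.{key}: TPS={tps[key]} R&V={rv[key]}")
        if abs(tps["mu"] - rv["mu"]) > mu_tolerance:
            discrepancies.append(f"{name}.mu: TPS={tps['mu']} R&V={rv['mu']}")
    return discrepancies

tps = {"Beam1": {"gantry_angle": 180.0, "collimator_angle": 0.0, "energy": 6, "mu": 125.3}}
rv  = {"Beam1": {"gantry_angle": 180.0, "collimator_angle": 5.0, "energy": 6, "mu": 125.3}}
print(compare_beam_parameters(tps, rv))   # reports the collimator angle mismatch
```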
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kim, H; Lee, H; Choi, K
Purpose: The mechanical quality assurance (QA) of medical accelerators consists of a time-consuming series of procedures. Since most of the procedures are done manually (e.g., checking the gantry rotation angle with the naked eye using a level attached to the gantry), the process has a high potential for human error. To remove the possibility of human error and reduce the procedure duration, we developed a smartphone application for automated mechanical QA. Methods: The preparation for the automated process was done by attaching a smartphone to the gantry facing upward. For the assessments of gantry and collimator angle indications, motion sensors (gyroscope, accelerometer, and magnetic field sensor) embedded in the smartphone were used. For the assessments of the jaw position indicator, cross-hair centering, and optical distance indicator (ODI), an optical image-processing module using a picture taken by the high-resolution camera embedded in the smartphone was implemented. The application was developed with the Android software development kit (SDK) and the OpenCV library. Results: The system accuracies in terms of angle detection error and length detection error were < 0.1° and < 1 mm, respectively. The mean absolute errors for the gantry and collimator rotation angles were 0.03° and 0.041°, respectively. The mean absolute error for the measured light field size was 0.067 cm. Conclusion: The automated system we developed can be used for the mechanical QA of medical accelerators with proven accuracy. For more convenient use of this application, a wireless communication module is under development. This system has strong potential for the automation of other QA procedures such as light/radiation field coincidence and couch translations/rotations.
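The gantry-angle part of such a check can be illustrated with a simple gravity-vector calculation: with the phone rigidly attached to the gantry, the accelerometer's gravity components rotate with the gantry. The axis convention and the pure-accelerometer approach below are assumptions of this sketch; the published application also fuses gyroscope and magnetometer data.

```python
import math

def estimated_gantry_angle(ax, az):
    """Estimate a rotation angle (degrees) from two accelerometer components.

    With the phone rigidly attached to the gantry, the direction of the
    gravity vector in the phone's frame rotates with the gantry, so the
    angle can be recovered with atan2. Which two axes to use is an
    assumption of this sketch.
    """
    return math.degrees(math.atan2(ax, az))

def angle_indication_error(nominal_deg, ax, az):
    """Signed difference between the nominal and sensed angles, wrapped to +/-180 degrees."""
    measured = estimated_gantry_angle(ax, az)
    return ((measured - nominal_deg + 180.0) % 360.0) - 180.0

# Gantry nominally at 0 degrees; the reading is dominated by gravity along z.
print(round(angle_indication_error(0.0, ax=0.01, az=9.81), 3))
```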
Variations in Daily Sleep Quality and Type 1 Diabetes Management in Late Adolescents
Queen, Tara L.; Butner, Jonathan; Wiebe, Deborah; Berg, Cynthia A.
2016-01-01
Objective To determine how between- and within-person variability in perceived sleep quality were associated with adolescent diabetes management. Methods A total of 236 older adolescents with type 1 diabetes reported daily for 2 weeks on sleep quality, self-regulatory failures, frequency of blood glucose (BG) checks, and BG values. Average, inconsistent, and daily deviations in sleep quality were examined. Results Hierarchical linear models indicated that poorer average and worse daily perceived sleep quality (compared with one’s average) were each associated with more self-regulatory failures. Sleep quality was not associated with frequency of BG checking. Poorer average sleep quality was related to greater risk of high BG. Furthermore, inconsistent and daily deviations in sleep quality interacted to predict higher BG, with more consistent sleepers benefitting more from a night of high-quality sleep. Conclusions Good, consistent sleep quality during late adolescence may benefit diabetes management by reducing self-regulatory failures and risk of high BG. PMID:26994852
Comparing the Correlation Length of Grain Markets in China and France
NASA Astrophysics Data System (ADS)
Roehner, Bertrand M.; Shiue, Carol H.
In economics, comparative analysis plays the same role as experimental research in physics. In this paper, we closely examine several methodological problems related to comparative analysis by investigating the specific example of grain markets in China and France. This enables us to answer a question in economic history which has so far remained open, namely whether or not market integration progressed in the 18th century. In economics, as in physics, before any new result is accepted it has to be checked and re-checked by different researchers. This is what we call the replication and comparison procedures. We show how these procedures should (and can) be implemented.
Code of Federal Regulations, 2013 CFR
2013-01-01
... methods, procedures, and techniques for conducting flight instruction. (4) Proper evaluation of student... unsatisfactory training progress. (6) The approved methods, procedures, and limitations for performing the... instructor certificate— (i) The fundamental principles of the teaching-learning process; (ii) Teaching...
Code of Federal Regulations, 2014 CFR
2014-01-01
... methods, procedures, and techniques for conducting flight instruction. (4) Proper evaluation of student... unsatisfactory training progress. (6) The approved methods, procedures, and limitations for performing the... instructor certificate— (i) The fundamental principles of the teaching-learning process; (ii) Teaching...
Code of Federal Regulations, 2011 CFR
2011-01-01
... methods, procedures, and techniques for conducting flight instruction. (4) Proper evaluation of student... unsatisfactory training progress. (6) The approved methods, procedures, and limitations for performing the... instructor certificate— (i) The fundamental principles of the teaching-learning process; (ii) Teaching...
Code of Federal Regulations, 2012 CFR
2012-01-01
... methods, procedures, and techniques for conducting flight instruction. (4) Proper evaluation of student... unsatisfactory training progress. (6) The approved methods, procedures, and limitations for performing the... instructor certificate— (i) The fundamental principles of the teaching-learning process; (ii) Teaching...
40 CFR 53.52 - Leak check test.
Code of Federal Regulations, 2014 CFR
2014-07-01
... MONITORING REFERENCE AND EQUIVALENT METHODS Procedures for Testing Physical (Design) and Performance Characteristics of Reference Methods and Class I and Class II Equivalent Methods for PM 2.5 or PM 10-2.5 § 53.52... to include the facility, including components, instruments, operator controls, a written procedure...
Guan, Zhonghui; Baker, Keith; Sandberg, Warren S
2009-11-01
We report a small case series in which misaligned disposable pulse oximeter sensors gave falsely low saturation readings. In each instance, the sensor performed well during preinduction oxygen administration and the early part of the case, most notably by producing a plethysmographic trace rated as high quality by the oximeter software. The reported pulse oximeter oxygen saturation eventually decreased to concerning levels in each instance, but the anesthesiologists, relying on the reported high-quality signal, initially sought other causes for apparent hypoxia. They undertook maneuvers and diagnostic procedures later deemed unnecessary. When the malpositioned sensors were discovered and repositioned, the apparent hypoxia was quickly relieved in each case. We then undertook a survey of disposable oximeter sensors as patients entered the recovery room, and discovered malposition of more than 1 cm in approximately 20% of all sensors, without apparent consequence. We conclude that the technology is quite robust, but that the diagnosis of apparent hypoxia should include a quick check of oximeter position early on.
Bien, Elizabeth Ann; Gillespie, Gordon Lee; Betcher, Cynthia Ann; Thrasher, Terri L; Mingerink, Donna R
2016-12-01
International travel and infectious respiratory illnesses worldwide place health care workers (HCWs) at increasing risk of respiratory exposures. To ensure the highest quality safety initiatives, one health care system used a quality improvement model of Plan-Do-Study-Act and guidance from Occupational Safety and Health Administration's (OSHA) May 2015 Hospital Respiratory Protection Program (RPP) Toolkit to assess a current program. The toolkit aided in identification of opportunities for improvement within their well-designed RPP. One opportunity was requiring respirator use during aerosol-generating procedures for specific infectious illnesses. Observation data demonstrated opportunities to mitigate controllable risks including strap placement, user seal check, and reuse of disposable N95 filtering facepiece respirators. Subsequent interdisciplinary collaboration resulted in other ideas to decrease risks and increase protection from potentially infectious respiratory illnesses. The toolkit's comprehensive document to evaluate the program showed that while the OSHA standards have not changed, the addition of the toolkit can better protect HCWs. © 2016 The Author(s).
Selvakumar, N; Murthy, B N; Prabhakaran, E; Sivagamasundari, S; Vasanthan, Samuel; Perumal, M; Govindaraju, R; Chauhan, L S; Wares, Fraser; Santha, T; Narayanan, P R
2005-02-01
Assessment of 12 microscopy centers in a tuberculosis unit by blinded checking of eight sputum smears selected by using a lot quality assurance sampling (LQAS) method and by unblinded checking of all positive and five negative slides, among the slides examined in a month in a microscopy centre, revealed that the LQAS method can be implemented in the field to monitor the performance of acid-fast bacillus microscopy centers in national tuberculosis control programs.
Selvakumar, N.; Murthy, B. N.; Prabhakaran, E.; Sivagamasundari, S.; Vasanthan, Samuel; Perumal, M.; Govindaraju, R.; Chauhan, L. S.; Wares, Fraser; Santha, T.; Narayanan, P. R.
2005-01-01
Assessment of 12 microscopy centers in a tuberculosis unit by blinded checking of eight sputum smears selected by using a lot quality assurance sampling (LQAS) method and by unblinded checking of all positive and five negative slides, among the slides examined in a month in a microscopy centre, revealed that the LQAS method can be implemented in the field to monitor the performance of acid-fast bacillus microscopy centers in national tuberculosis control programs. PMID:15695704
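The LQAS decision rule itself is simple: re-read a fixed-size sample of slides blind and flag the centre if the number of discordant readings exceeds an acceptance number. The sketch below assumes an acceptance number of zero for the eight-slide sample, which is an illustrative choice rather than the programme's actual threshold.

```python
def lqas_assessment(rechecked_results, acceptance_number=0):
    """Apply a lot quality assurance sampling decision rule to rechecked smears.

    `rechecked_results` is a list of booleans, one per sampled slide, that
    is True when the controller's blinded reading disagrees with the
    centre's original reading. If the number of discordant slides exceeds
    the acceptance number, the lot (the centre's month of work) is flagged
    for supervision. The sample size of 8 and the acceptance number are
    illustrative assumptions.
    """
    errors = sum(bool(r) for r in rechecked_results)
    return "flag for supervision" if errors > acceptance_number else "acceptable"

# Eight re-read slides, one discordant reading.
sample = [False, False, True, False, False, False, False, False]
print(lqas_assessment(sample, acceptance_number=0))  # -> flag for supervision
```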
Lake water quality mapping from Landsat
NASA Technical Reports Server (NTRS)
Scherz, J. P.
1977-01-01
In the project described remote sensing was used to check the quality of lake waters. The lakes of three Landsat scenes were mapped with the Bendix MDAS multispectral analysis system. From the MDAS color coded maps, the lake with the worst algae problem was easily located. The lake was closely checked, and the presence of 100 cows in the springs which fed the lake could be identified as the pollution source. The laboratory and field work involved in the lake classification project is described.
Member Checking: A Tool to Enhance Trustworthiness or Merely a Nod to Validation?
Birt, Linda; Scott, Suzanne; Cavers, Debbie; Campbell, Christine; Walter, Fiona
2016-06-22
The trustworthiness of results is the bedrock of high quality qualitative research. Member checking, also known as participant or respondent validation, is a technique for exploring the credibility of results. Data or results are returned to participants to check for accuracy and resonance with their experiences. Member checking is often mentioned as one in a list of validation techniques. This simplistic reporting might not acknowledge the value of using the method, nor its juxtaposition with the interpretative stance of qualitative research. In this commentary, we critique how member checking has been used in published research, before describing and evaluating an innovative in-depth member checking technique, Synthesized Member Checking. The method was used in a study with patients diagnosed with melanoma. Synthesized Member Checking addresses the co-constructed nature of knowledge by providing participants with the opportunity to engage with, and add to, interview and interpreted data, several months after their semi-structured interview. © The Author(s) 2016.
42 CFR 493.1254 - Standard: Maintenance and function checks.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 42 Public Health 5 2010-10-01 2010-10-01 false Standard: Maintenance and function checks. 493.1254 Section 493.1254 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) STANDARDS AND CERTIFICATION LABORATORY REQUIREMENTS Quality System for Nonwaived...
An Adaptive Buddy Check for Observational Quality Control
NASA Technical Reports Server (NTRS)
Dee, Dick P.; Rukhovets, Leonid; Todling, Ricardo; DaSilva, Arlindo M.; Larson, Jay W.; Einaudi, Franco (Technical Monitor)
2000-01-01
An adaptive buddy check algorithm is presented that adjusts tolerances for outlier observations based on the variability of surrounding data. The algorithm derives from a statistical hypothesis test combined with maximum-likelihood covariance estimation. Its stability is shown to depend on the initial identification of outliers by a simple background check. The adaptive feature ensures that the final quality control decisions are not very sensitive to prescribed statistics of first-guess and observation errors, nor on other approximations introduced into the algorithm. The implementation of the algorithm in a global atmospheric data assimilation is described. Its performance is contrasted with that of a non-adaptive buddy check, for the surface analysis of an extreme storm that took place in Europe on 27 December 1999. The adaptive algorithm allowed the inclusion of many important observations that differed greatly from the first guess and that would have been excluded on the basis of prescribed statistics. The analysis of the storm development was much improved as a result of these additional observations.
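A minimal sketch of the idea, assuming each suspect observation has a list of nearby "buddy" innovations, is given below. The real algorithm derives its test from a statistical hypothesis test with maximum-likelihood covariance estimation; here the tolerance is simply scaled by the spread of the buddies to convey the adaptive behaviour.

```python
import statistics

def adaptive_buddy_check(candidates, buddies_of, base_tolerance=3.0):
    """A minimal sketch of a buddy check with adaptive tolerances.

    `candidates` maps an observation id to its innovation (observation
    minus background); `buddies_of` maps the same id to the innovations of
    nearby observations. An outlier flagged by the background check is
    retained if it is consistent with its buddies, with the tolerance
    widened when the surrounding data are themselves highly variable.
    """
    decisions = {}
    for obs_id, innovation in candidates.items():
        buddies = buddies_of[obs_id]
        spread = statistics.pstdev(buddies) if len(buddies) > 1 else 0.0
        center = statistics.mean(buddies)
        tolerance = base_tolerance * max(spread, 1e-6)
        decisions[obs_id] = abs(innovation - center) <= tolerance
    return decisions

candidates = {"stn7": -8.5}                      # large negative innovation
buddies = {"stn7": [-7.9, -9.2, -8.1, -6.5]}     # neighbours agree: keep it
print(adaptive_buddy_check(candidates, buddies))
```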
Ohshiro, Takafumi; Sasaki, Katsumi; Takenouchi, Kiyofumi; Kozuma, Mituaki; Ohshiro, Naoyuki; Kageyama, Yuichi
2013-01-01
Background and aims: There are many Q-switched lasers. The Q-switched ruby laser is the one most popularly used in dermatology, aesthetic surgery and plastic surgery to remove pigmented lesions or tattoos. Correct and regular calibration of such a system is essential. However, some clinics fail to perform this with the excuse of having no measuring instrument (MI) in their offices or treatment rooms in some of their hospitals or clinics, or even in the case of well-known medical universities in Japan. The present article explains the precise calibration procedure and beam pattern checking for the Q-switched ruby systems in the first author's clinic. Rationale: In the case of treatment with a medical laser, the calibration and the irradiated pattern (IP) check of the laser being used for treatment are the most important factors for treatment efficacy and safety. If these factors change, the treatment result could differ from that expected. Such data are not acceptable as scientific information for a presentation or published paper. With such unreliable results and an incorrect beam pattern, replicating such a study would be impossible. Regular calibration check: In our clinic, we have 2 Q-switched ruby laser systems. On a daily basis, the beam patterns, both the optical axis of the beam and its treatment footprint, are checked on dedicated printed sheets and footprint paper, respectively, at the beginning of the day and after the last procedure. Every two weeks we calibrate our systems in-house using a precise MI. Every six months we calibrate the systems in-house with the MI, and then we send the systems back to the manufacturers for calibration. Once every year, we have our MI calibrated by an accredited facility in Japan. In this way, we are not only ensuring accurate and safe treatment for our patients, but we are also producing accurate system and treatment data which can be replicated by others, the basis of evidence-based medicine. PMID:24204090
40 CFR 63.945 - Test methods and procedures.
Code of Federal Regulations, 2011 CFR
2011-07-01
..., appendix A. Each potential leak interface (i.e., a location where organic vapor leakage could occur) on the cover and associated closure devices shall be checked. Potential leak interfaces that are associated... determined according to the procedures in Method 21 of 40 CFR part 60, appendix A. (7) Each potential leak...
40 CFR 63.945 - Test methods and procedures.
Code of Federal Regulations, 2013 CFR
2013-07-01
..., appendix A. Each potential leak interface (i.e., a location where organic vapor leakage could occur) on the cover and associated closure devices shall be checked. Potential leak interfaces that are associated... determined according to the procedures in Method 21 of 40 CFR part 60, appendix A. (7) Each potential leak...
40 CFR 63.925 - Test methods and procedures.
Code of Federal Regulations, 2013 CFR
2013-07-01
.... Each potential leak interface (i.e., a location where organic vapor leakage could occur) on the cover and associated closure devices shall be checked. Potential leak interfaces that are associated with... according to the procedures in Method 21 of 40 CFR part 60, appendix A. (7) Each potential leak interface...
40 CFR 63.925 - Test methods and procedures.
Code of Federal Regulations, 2012 CFR
2012-07-01
.... Each potential leak interface (i.e., a location where organic vapor leakage could occur) on the cover and associated closure devices shall be checked. Potential leak interfaces that are associated with... according to the procedures in Method 21 of 40 CFR part 60, appendix A. (7) Each potential leak interface...
40 CFR 63.905 - Test methods and procedures.
Code of Federal Regulations, 2012 CFR
2012-07-01
.... Each potential leak interface (i.e., a location where organic vapor leakage could occur) on the cover and associated closure devices shall be checked. Potential leak interfaces that are associated with... according to the procedures in Method 21 of 40 CFR part 60, appendix A. (7) Each potential leak interface...
40 CFR 63.905 - Test methods and procedures.
Code of Federal Regulations, 2011 CFR
2011-07-01
.... Each potential leak interface (i.e., a location where organic vapor leakage could occur) on the cover and associated closure devices shall be checked. Potential leak interfaces that are associated with... according to the procedures in Method 21 of 40 CFR part 60, appendix A. (7) Each potential leak interface...
40 CFR 63.905 - Test methods and procedures.
Code of Federal Regulations, 2013 CFR
2013-07-01
.... Each potential leak interface (i.e., a location where organic vapor leakage could occur) on the cover and associated closure devices shall be checked. Potential leak interfaces that are associated with... according to the procedures in Method 21 of 40 CFR part 60, appendix A. (7) Each potential leak interface...
40 CFR 63.925 - Test methods and procedures.
Code of Federal Regulations, 2011 CFR
2011-07-01
.... Each potential leak interface (i.e., a location where organic vapor leakage could occur) on the cover and associated closure devices shall be checked. Potential leak interfaces that are associated with... according to the procedures in Method 21 of 40 CFR part 60, appendix A. (7) Each potential leak interface...
40 CFR 63.945 - Test methods and procedures.
Code of Federal Regulations, 2012 CFR
2012-07-01
..., appendix A. Each potential leak interface (i.e., a location where organic vapor leakage could occur) on the cover and associated closure devices shall be checked. Potential leak interfaces that are associated... determined according to the procedures in Method 21 of 40 CFR part 60, appendix A. (7) Each potential leak...
32 CFR Appendix B to Part 154 - Request Procedures
Code of Federal Regulations, 2012 CFR
2012-07-01
... 32 National Defense 1 2012-07-01 2012-07-01 false Request Procedures B Appendix B to Part 154 National Defense Department of Defense OFFICE OF THE SECRETARY OF DEFENSE SECURITY DEPARTMENT OF DEFENSE... Center, Defense Investigative Service, P.O. Box 1083, Baltimore, Maryland 21203. C. National Agency Check...
32 CFR Appendix B to Part 154 - Request Procedures
Code of Federal Regulations, 2013 CFR
2013-07-01
... 32 National Defense 1 2013-07-01 2013-07-01 false Request Procedures B Appendix B to Part 154 National Defense Department of Defense OFFICE OF THE SECRETARY OF DEFENSE SECURITY DEPARTMENT OF DEFENSE... Center, Defense Investigative Service, P.O. Box 1083, Baltimore, Maryland 21203. C. National Agency Check...
14 CFR 21.127 - Tests: aircraft.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 14 Aeronautics and Space 1 2010-01-01 2010-01-01 false Tests: aircraft. 21.127 Section 21.127... PROCEDURES FOR PRODUCTS AND PARTS Production Under Type Certificate Only § 21.127 Tests: aircraft. (a) Each... test procedure and flight check-off form, and in accordance with that form, flight test each aircraft...
M. Thompson Conkle
1986-01-01
Check the laboratory reports after your next physical. You'll find information on a number of biochemical processes. Procedures like those used in the medical sciences are yielding valuable information about genetic differences among trees and tree pests. New procedures that provide ways to isolate and move genes are advancing progress in tree improvement. These...
40 CFR 205.54-1 - Low speed sound emission test procedures.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 25 2011-07-01 2011-07-01 false Low speed sound emission test....54-1 Low speed sound emission test procedures. (a) Instrumentation. The following instrumentation... checked annually to verify that its output has not changed. (3) An engine-speed tachometer which is...
40 CFR 53.52 - Leak check test.
Code of Federal Regulations, 2011 CFR
2011-07-01
... MONITORING REFERENCE AND EQUIVALENT METHODS Procedures for Testing Physical (Design) and Performance Characteristics of Reference Methods and Class I and Class II Equivalent Methods for PM2.5 or PM10-2.5 § 53.52... to include the facility, including components, instruments, operator controls, a written procedure...
40 CFR 53.52 - Leak check test.
Code of Federal Regulations, 2012 CFR
2012-07-01
... MONITORING REFERENCE AND EQUIVALENT METHODS Procedures for Testing Physical (Design) and Performance Characteristics of Reference Methods and Class I and Class II Equivalent Methods for PM2.5 or PM10â2.5 § 53.52... to include the facility, including components, instruments, operator controls, a written procedure...
40 CFR 53.52 - Leak check test.
Code of Federal Regulations, 2010 CFR
2010-07-01
... MONITORING REFERENCE AND EQUIVALENT METHODS Procedures for Testing Physical (Design) and Performance Characteristics of Reference Methods and Class I and Class II Equivalent Methods for PM2.5 or PM10â2.5 § 53.52... to include the facility, including components, instruments, operator controls, a written procedure...
Diagnostic Procedures for Detecting Nonlinear Relationships between Latent Variables
ERIC Educational Resources Information Center
Bauer, Daniel J.; Baldasaro, Ruth E.; Gottfredson, Nisha C.
2012-01-01
Structural equation models are commonly used to estimate relationships between latent variables. Almost universally, the fitted models specify that these relationships are linear in form. This assumption is rarely checked empirically, largely for lack of appropriate diagnostic techniques. This article presents and evaluates two procedures that can…
40 CFR 51.359 - Quality control.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 40 Protection of Environment 2 2014-07-01 2014-07-01 false Quality control. 51.359 Section 51.359 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS REQUIREMENTS FOR... to assure test accuracy. Computer control of quality assurance checks and quality control charts...
Safe and effective nursing shift handover with NURSEPASS: An interrupted time series.
Smeulers, Marian; Dolman, Christine D; Atema, Danielle; van Dieren, Susan; Maaskant, Jolanda M; Vermeulen, Hester
2016-11-01
Implementation of a locally developed evidence-based nursing shift handover blueprint with a bedside-safety-check to determine the effect size on quality of handover. A mixed methods design with: (1) an interrupted time series analysis to determine the effect on handover quality in six domains; (2) descriptive statistics to analyze the discrepancies intercepted by the bedside-safety-check; (3) evaluation sessions to gather experiences with the new handover process. We observed a continued trend of improvement in handover quality and a significant improvement in two domains of handover: organization/efficiency and contents. The bedside-safety-check successfully identified discrepancies in drains, intravenous medications, bandages or general condition and was highly appreciated. Use of the nursing shift handover blueprint showed promising results on effectiveness as well as on feasibility and acceptability. However, to enable long-term measurement of effectiveness, evaluation with large-scale interrupted time series or statistical process control is needed. Copyright © 2016 Elsevier Inc. All rights reserved.
77 FR 67344 - Proposed Information Collection; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-09
... Criminal History Checks. DATES: Written comments must be submitted to the individual and office listed in... methodology and assumptions used; Enhance the quality, utility, and clarity of the information to be collected... Criminal History Check. CNCS and its grantees must ensure that national service beneficiaries are protected...
Antonelli, Giorgia; Padoan, Andrea; Aita, Ada; Sciacovelli, Laura; Plebani, Mario
2017-08-28
Background The International Standard ISO 15189 is recognized as a valuable guide in ensuring high-quality clinical laboratory services and promoting the harmonization of accreditation programmes in laboratory medicine. Examination procedures must be verified in order to guarantee that their performance characteristics are congruent with the intended scope of the test. The aim of the present study was to propose a practice model for implementing procedures employed for the verification of validated examination procedures already used for at least 2 years in our laboratory, in agreement with the ISO 15189 requirement of Section 5.5.1.2. Methods In order to identify the operative procedure to be used, approved documents were identified, together with the definition of the performance characteristics to be evaluated for the different methods; the examination procedures used in the laboratory were analyzed and checked against the performance specifications reported by manufacturers. Then, operative flow charts were identified to compare the laboratory performance characteristics with those declared by manufacturers. Results The choice of performance characteristics for verification was based on approved documents used as guidance and on the specific purpose of the tests undertaken, with consideration given to: imprecision and trueness for quantitative methods; diagnostic accuracy for qualitative methods; and imprecision together with diagnostic accuracy for semi-quantitative methods. Conclusions The described approach, which balances technological possibilities, risks and costs while assuring the fundamental component of result accuracy, appears promising as an easily applicable and flexible procedure helping laboratories to comply with the ISO 15189 requirements.
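For a quantitative method, this verification typically reduces to computing imprecision (CV%) and bias from replicate measurements of a control material and comparing them with the manufacturer's claims. The sketch below uses made-up acceptance limits and control values; it is not the laboratory's actual protocol.

```python
import statistics

def verify_quantitative_method(replicates, reference_value,
                               max_cv_percent, max_bias_percent):
    """Check imprecision (CV%) and trueness (bias%) against claimed limits.

    `replicates` are repeated measurements of a control material with an
    assigned `reference_value`. The acceptance limits would normally come
    from the manufacturer's performance specifications; the figures in the
    example are invented for illustration.
    """
    mean = statistics.mean(replicates)
    cv = 100.0 * statistics.stdev(replicates) / mean
    bias = 100.0 * (mean - reference_value) / reference_value
    return {
        "cv_percent": round(cv, 2),
        "bias_percent": round(bias, 2),
        "imprecision_ok": cv <= max_cv_percent,
        "trueness_ok": abs(bias) <= max_bias_percent,
    }

glucose = [5.02, 4.98, 5.10, 5.05, 4.95, 5.03]   # mmol/L, control material
print(verify_quantitative_method(glucose, reference_value=5.00,
                                 max_cv_percent=3.0, max_bias_percent=2.0))
```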
Sindhwani, Geetika; Gupta, Monica; Arora, Sweta; Mishra, Arpita; Bhatt, Jayesh; Arora, Manali; Gehani, Anisha
2017-01-01
Introduction An organization’s transformation from implementation of small, distinct Quality Improvement (QI) efforts to complete incorporation of a Quality Improvement Program (QIP) into its culture occurs through a process of churning the foundational elements over time. Aim To develop a quality culture across the employees, identify measurable indicators and various tools to impart effective quality care, and develop a learning culture for continuous quality improvement in the field of imaging services. Materials and Methods To establish a QIP, the bare minimum requirement started with forming a quality committee. The committee identified the areas of improvement and established the core principles of the Quality Management System (QMS) by having a Quality Manual, Standard Operating Procedures (SOPs), work instructions, identification and monitoring of quality indicators, and a training calendar. Appropriate tools, such as formatted daily registers, periodic checklists and run charts, were developed to collect the data, followed by multiple PDSA cycles (Plan, Do, Study and Act), which helped identify the process bottlenecks, implement solutions and reanalyze. Results A total of 17 measurable key performance indicators were identified from the four major quality tasks, namely Safety, Process Improvement, Professional Outcome and Satisfaction, to assess the performance measures and targets of the QIP. Conclusion Diagnostic services should evaluate how to choose the most appropriate method and develop a comprehensive QIP to meet the needs of the staff and the end users, thus creating a working environment where people constitute the intrinsic value in attaining the ultimate quality and safety. PMID:28969238
Langley Wind Tunnel Data Quality Assurance-Check Standard Results
NASA Technical Reports Server (NTRS)
Hemsch, Michael J.; Grubb, John P.; Krieger, William B.; Cler, Daniel L.
2000-01-01
A framework for statistical evaluation, control and improvement of wind tunnel measurement processes is presented. The methodology is adapted from elements of the Measurement Assurance Plans developed by the National Bureau of Standards (now the National Institute of Standards and Technology) for standards and calibration laboratories. The present methodology is based on the notions of statistical quality control (SQC) together with check standard testing and a small number of customer repeat-run sets. The results of check standard and customer repeat-run sets are analyzed using the statistical control chart methods of Walter A. Shewhart, long familiar to the SQC community. Control chart results are presented for various measurement processes in five facilities at Langley Research Center. The processes include test section calibration, force and moment measurements with a balance, and instrument calibration.
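Check standard results of this kind are typically plotted on an individuals control chart with limits at the mean plus or minus three standard deviations. The sketch below computes such limits from a history of check standard values; the data are invented, and the Langley procedure may use moving-range based limits instead.

```python
import statistics

def shewhart_limits(check_standard_values, sigma_multiplier=3.0):
    """Compute individuals-chart control limits for a check standard quantity.

    `check_standard_values` are repeated measurements of the same check
    standard quantity (e.g. a force coefficient at a fixed condition)
    collected over many tunnel entries. Points outside the limits signal
    that the measurement process is out of statistical control.
    """
    center = statistics.mean(check_standard_values)
    sigma = statistics.stdev(check_standard_values)
    return center - sigma_multiplier * sigma, center, center + sigma_multiplier * sigma

history = [0.3121, 0.3118, 0.3125, 0.3120, 0.3117, 0.3123, 0.3119]
lcl, center, ucl = shewhart_limits(history)
new_point = 0.3140
print(f"center={center:.4f}  limits=({lcl:.4f}, {ucl:.4f})  "
      f"in control: {lcl <= new_point <= ucl}")
```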
Gowrisankar, G; Jagadeshan, G; Elango, L
2017-04-01
In many regions around the globe, including India, degradation in the quality of groundwater is of great concern. The objective of this investigation is to determine the effect of recharge from a check dam on quality of groundwater in a region of Krishnagiri District of Tamil Nadu State, India. For this study, water samples from 15 wells were periodically obtained and analysed for major ions and fluoride concentrations. The amount of major ions present in groundwater was compared with the drinking water guideline values of the Bureau of Indian Standards. With respect to the sodium and fluoride concentrations, 38% of groundwater samples collected was not suitable for direct use as drinking water. Suitability of water for agricultural use was determined considering the electrical conductivity, sodium adsorption ratio, sodium percentage, permeability index, Wilcox and United States Salinity Laboratory diagrams. The influence of freshwater recharge from the dam is evident as the groundwater in wells nearer to the check dam was suitable for both irrigation and domestic purposes. However, the groundwater away from the dam had a high ionic composition. This study demonstrated that in other fluoride-affected areas, the concentration can be reduced by dilution with the construction of check dams as a measure of managed aquifer recharge.
Dalley, Jessica S.; McMurtry, C. Meghan
2016-01-01
Background. Pediatric medical information provision literature focuses on hospitalization and surgical procedures, but children would also benefit from an educational program regarding more commonly experienced medical procedures (e.g., needles, general check-up). Objective. To determine whether an evidence-based educational program reduced children's ratings of fear of and expected pain from medical stimuli and increased their knowledge of procedural coping strategies. Methods. An educational, interactive, developmentally appropriate Teddy Bear Clinic Tour was developed and delivered at a veterinary clinic. During this tour, 71 5–10-year-old children (Mage = 6.62 years, SD = 1.19) were taught about medical equipment, procedures, and coping strategies through modelling and rehearsal. In a single-group, pretest posttest design, participants reported their fear of and expected pain from medical and nonmedical stimuli. Children were also asked to report strategies they would use to cope with procedural fear. Results. Children's ratings for expected pain during a needle procedure were reduced following the intervention. No significant change occurred in children's fear of needles. Children reported more intervention-taught coping strategies at Time 2. Conclusions. The results of this study suggest that an evidence-based, interactive educational program can reduce young children's expectations of needle pain and may help teach them procedural coping strategies. PMID:27445612
Reliability of electromagnetic induction data in near surface application
NASA Astrophysics Data System (ADS)
Nüsch, A.; Werban, U.; Sauer, U.; Dietrich, P.
2012-12-01
Use of the electromagnetic induction (EMI) method for measuring electrical conductivities is widespread in applied geosciences, since the method is easy to perform and its signal is influenced by soil parameters. The wide range of EMI applications, at different spatial resolutions and for the derivation of different soil parameters, necessitates a unified handling of EMI data. The requirements on the method have thus shifted from providing a qualitative overview to supporting quantitative use of the data. Quantitative treatment of the data, however, is limited by the available instruments, which were designed only for qualitative use. Nevertheless, these limitations can be pushed back by observing a few conditions. In this study, we introduce possibilities for enhancing the quality of EMI data with regard to large-scale investigations. In a set of systematic investigations, we show which aspects have to be taken into account when using a commercially available instrument with respect to long-term stability, comparability and repeatability. In-depth knowledge of the instruments used, concerning aspects such as their calibration procedure, long-term stability, battery life and thermal behaviour, is an essential prerequisite before starting the measurement process. A further aspect highlighted is quality control during measurements and, if necessary, subsequent data correction, which is a prerequisite for quantitative analysis of the data. Quality control during the measurement process is crucial. Before a measurement starts, it is recommended that a short-term test be carried out on site to check environmental noise. The signal-to-noise ratio is a decisive factor in whether or not the method is applicable at the chosen field site. A measurement needs to be monitored for possible drifts. This can be achieved with different levels of accuracy, ranging from a quality check with the help of reference lines up to quantitative control with reference points. In addition, global reference lines are necessary if measurements take place at the landscape scale. In some cases, it is possible to eliminate drifts that occur by using a data correction based on binding lines. The suggested procedure can raise the explanatory power of the data considerably, and artefacts caused by drifts or inadequate handling are minimized. This work was supported by iSOIL - Interactions between soil related sciences - Linking geophysics, soil science and digital soil mapping, which is a Collaborative Project (Grant Agreement number 211386) co-funded by the Research DG of the European Commission within the RTD activities of the FP7 Thematic Priority Environment; iSOIL is one member of the SOIL TECHNOLOGY CLUSTER of Research Projects funded by the EC.
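One simple form of the drift correction mentioned above is to re-occupy a reference point during the survey, fit a line to the change in its reading over time, and subtract that trend from the survey data. The sketch below assumes exactly that linear model and uses invented readings; it is not the iSOIL processing chain.

```python
import numpy as np

def remove_linear_drift(times, conductivities, ref_times, ref_values, ref_true):
    """Remove a linear instrument drift estimated from a repeated reference point.

    `ref_times`/`ref_values` are readings taken at the same reference
    location at different times during the survey, and `ref_true` is the
    value adopted for that location (e.g. its first reading). A straight
    line is fitted to the reference residuals and subtracted from the
    survey data. This is a minimal sketch of one possible correction.
    """
    times = np.asarray(times, dtype=float)
    residuals = np.asarray(ref_values, dtype=float) - ref_true
    slope, intercept = np.polyfit(np.asarray(ref_times, dtype=float), residuals, deg=1)
    drift = slope * times + intercept
    return np.asarray(conductivities, dtype=float) - drift

# Reference point re-measured at t = 0, 30 and 60 minutes; reading creeps upward.
ref_t, ref_v = [0.0, 30.0, 60.0], [20.0, 20.6, 21.2]
survey_t = [5.0, 25.0, 45.0]
survey_ec = [18.3, 22.1, 19.8]            # mS/m, uncorrected
print(remove_linear_drift(survey_t, survey_ec, ref_t, ref_v, ref_true=20.0))
```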
Lubbe, R
1993-05-01
In Germany two professional groups may apply medical science to human beings: physicians and "Heilpraktiker" (naturopaths). However, no regulations exist regarding the training, examination and continuing education of "Heilpraktiker". They only have to be checked on the basis of the Heilpraktiker law, which is intended to exclude a danger to the health of the public. In North Rhine-Westphalia each of the 54 public health offices carries out this checking on its own responsibility. The present analysis shows that in the Detmold administration district the checking by the individual public health offices differs greatly from one office to another. There are offices where most applicants fail and others where nearly all of the applicants pass. In addition, the passing rate shows considerable regional and temporal differences. The data also show that public health offices with a high passing rate not only handle many checking cases but also receive applicants residing outside the area of responsibility of the checking public health office ("checking tourism"). Rapid implementation of the "Federal Guide on the Checking of Heilpraktiker Applicants" is recommended for the Land of North Rhine-Westphalia, as this procedure would be fairer for the applicants and less expensive for the citizens.
Pad Safety Personnel Launch Support For STS-200
NASA Technical Reports Server (NTRS)
Guarino, Jennifer
2007-01-01
The launch of a space shuttle is a complex and lengthy procedure. There are many places and components to look at and prepare. The components are the orbiter, solid rocket boosters, external tank, and ground equipment. Some of the places are the launch pad, fuel locations, and surrounding structures. Preparations for a launch include equipment checks, system checks, sniff checks for hazardous commodities, and countless walkdowns. Throughout these preparations, pad safety personnel must always be on call. This requires three shifts of multiple people to be ready when needed. Also, the pad safety personnel must be available for the non-launch tasks that are always present for both launch pads.
Quality Work, Quality Control in Technical Services.
ERIC Educational Resources Information Center
Horny, Karen L.
1985-01-01
Quality in library technical services is explored in light of changes produced by automation. Highlights include a definition of quality; new opportunities and shifting priorities; cataloging (fullness of records, heading consistency, accountability, local standards, automated checking); need for new skills (management, staff); and boons of…
Direct to consumer advertising via the Internet, a study of hip resurfacing.
Ogunwale, B; Clarke, J; Young, D; Mohammed, A; Patil, S; Meek, R M D
2009-02-01
With increased use of the internet for health information and direct-to-consumer advertising from medical companies, there is concern about the quality of information available to patients. The aim of this study was to examine the quality of health information on the internet for hip resurfacing. An assessment tool was designed to measure quality of information. Websites were measured on credibility of source; usability; currentness of the information; content relevance; content accuracy/completeness; and disclosure/bias. Each website assessed was given a total score based on the scores achieved in the above categories. Websites were further analysed by author, geographical origin and possession of an independent credibility check. There was a positive correlation between the overall score for a website and its score in each assessment category. Websites by implant companies, doctors and hospitals scored poorly. Websites with an independent credibility check, such as Health on the Net (HoN), scored twice the total scores of websites without one. Like other internet health websites, the quality of information on hip resurfacing websites is variable. This study highlights methods by which to assess the quality of health information on the internet and advocates that patients should look for a statement of an "independent credibility check" when searching for information on hip resurfacing.
Sensor control of robot arc welding
NASA Technical Reports Server (NTRS)
Sias, F. R., Jr.
1985-01-01
A basic problem in the application of robots for welding, namely how to guide a torch along a weld seam using sensory information, was studied. The aim was to improve the quality and consistency of certain Gas Tungsten Arc welds on the Space Shuttle Main Engine (SSME) that are too complex geometrically for conventional automation and are therefore done by hand. The particular problems associated with SSME manufacturing and weld-seam tracking were analyzed, with an emphasis on computer vision methods. Special interface software for the MINC computer was developed which allows it to be used both as a test system to check out the robot interface software and later as a development tool for further investigation of sensory systems to be incorporated in welding procedures.
Siersma, Volkert; Kousgaard, Marius Brostrøm; Reventlow, Susanne; Ertmann, Ruth; Felding, Peter; Waldorff, Frans Boch
2015-02-01
This study aimed to evaluate the relative effectiveness of electronic and postal reminders for increasing adherence to the quality assurance programme for the international normalized ratio (INR) point-of-care testing (POCT) device in primary care. All 213 family practices that use the Elective Laboratory of the Capital Region, Denmark, and regularly conduct INR POCT were randomly allocated into two similarly sized groups. During the 4-month intervention, these practices were sent either computer reminders (ComRem) or computer-generated postal reminders (Postal) if they did not perform a split test to check the quality of their INR POCT for each calendar month. The adherence of the practices was tracked during the subsequent 8 months subdivided into two 4-month periods both without intervention. Outcomes were measures of split test procedure adherence. Both interventions were associated with an increase in adherence to the split test procedure - a factor 6.00 [95% confidence interval (CI) 4.46-7.72] and 8.22 [95% CI 5.87-11.52] for ComRem and Postal, respectively - but there is no evidence that one of the interventions was more effective than the other. In the ComRem group, the expected number of split tests (out of four) was 2.54 (95% CI 2.33-2.76) versus 2.44 (95% CI 2.24-2.65) in the Postal group, P = 0.14. There was a slight decrease in adherence over the two follow-ups, but neither intervention was better than the other in achieving a lasting improvement in adherence. Computer reminders are as efficient as postal reminders in increasing adherence to a quality assurance programme for the INR POCT device in primary care. © 2014 John Wiley & Sons, Ltd.
NASA Astrophysics Data System (ADS)
Stockhause, M.; Höck, H.; Toussaint, F.; Weigel, T.; Lautenschlager, M.
2012-12-01
We present the publication process for the CMIP5 (Coupled Model Intercomparison Project Phase 5) data with special emphasis on the current role of identifiers and the potential future role of PIDs in such distributed technical infrastructures. The DataCite data publication with DOI assignment finalizes the three-level quality control procedure for CMIP5 data (Stockhause et al., 2012). WDCC utilizes the Assistant System Atarrabi to support the publication process. Atarrabi is a web-based workflow system for metadata reviews by data creators and Publication Agents (PAs). Within the level 3 quality checks, all available information in the different infrastructure components is cross-checked for consistency by the DataCite PA. This information includes: metadata on data, metadata in the long-term archive of the Publication Agency, quality information, and external metadata on model and simulation (CIM). For these consistency checks, metadata related to the data publication has to be identified. The Data Reference Syntax (DRS) convention functions as a global identifier for the data. Since the DRS structures the data hierarchically, it can be used to identify data collections like DataCite publication units, i.e. all data belonging to a CMIP5 simulation. Every technical component of the infrastructure uses the DRS or maps to it, but there is no central repository storing DRS_ids, so they occasionally have to be mapped. Additional local identifiers are used within the different technical infrastructure components. Identification of related pieces of information in their repositories is cumbersome and tricky for the PA. How could PIDs improve the situation? To establish a reliable distributed data and metadata infrastructure, PIDs for all objects are needed, as well as relations between them. An ideal data publication scenario for federated community projects within Earth System Sciences, e.g. CMIP, would be: 1. Data creators at the modeling centers define their simulation, related metadata, and software, which are assigned PIDs. 2. During ESGF data publication the data entities are assigned PIDs with references to the PIDs from step 1. Since we deal with different hierarchical levels, the definition of collections on these levels is advantageous. A possible implementation concept using Handles is described by Weigel et al. (2012). 3. Quality results are assigned PID(s) and a reference to the data. A quality PID is added as a reference to the data collection PID. 4. The PA accesses the PID on the data collection to get the data and all related information for cross-checking. The presented example of the technical infrastructure for the CMIP5 data distribution shows the importance of PIDs, especially as the data are distributed over multiple repositories world-wide and additional separate pieces of data-related information are collected independently of the data. References: Stockhause, M., Höck, H., Toussaint, F., Lautenschlager, M. (2012): 'Quality assessment concept of the World Data Center for Climate and its application to CMIP5 data', Geosci. Model Dev. Discuss., 5, 781-802, doi:10.5194/gmdd-5-781-2012. Weigel, T., et al. (2012): 'Structural Elements in a Persistent Identifier Infrastructure and Resulting Benefits for the Earth Science Community', submitted to AGU 2012 Session IN009.
A new dataset validation system for the Planetary Science Archive
NASA Astrophysics Data System (ADS)
Manaud, N.; Zender, J.; Heather, D.; Martinez, S.
2007-08-01
The Planetary Science Archive (PSA) is the official archive for the Mars Express mission. It received its first data by the end of 2004. These data are delivered by the PI teams to the PSA team as datasets, which are formatted in conformance with the Planetary Data System (PDS). The PI teams are responsible for analyzing and calibrating the instrument data as well as for the production of reduced and calibrated data. They are also responsible for the scientific validation of these data. ESA is responsible for the long-term data archiving and distribution to the scientific community and must ensure, in this regard, that all archived products meet quality standards. To do so, an archive peer review is used to control the quality of the Mars Express science data archiving process. However, a full validation of the archive's content is missing. An independent review board recently recommended that the completeness of the archive as well as the consistency of the delivered data be validated following well-defined procedures. A new validation software tool is being developed to complete the overall data quality control system functionality. This new tool aims to improve the quality of data and services provided to the scientific community through the PSA, and shall allow anomalies to be tracked and the completeness of datasets to be controlled. It shall ensure that PSA end-users: (1) can rely on the results of their queries, (2) will get data products that are suitable for scientific analysis, and (3) can find all science data acquired during a mission. We define dataset validation as the verification and assessment process that checks the dataset content against pre-defined top-level criteria, which represent the general characteristics of good quality datasets. The dataset content that is checked includes the data and all types of information that are essential in the process of deriving scientific results, as well as those interfacing with the PSA database. The validation software tool is a multi-mission tool that has been designed to provide the user with the flexibility of defining and implementing various types of validation criteria, to iteratively and incrementally validate datasets, and to generate validation reports.
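Conceptually, such a tool evaluates each dataset against a set of named top-level criteria and reports pass or fail per criterion. The sketch below shows that pattern with invented criteria and fields; the PSA tool's real criteria cover PDS label syntax, completeness and consistency with the archive database.

```python
def validate_dataset(dataset, criteria):
    """Check a dataset description against pre-defined top-level criteria.

    `dataset` is a dictionary describing one archived dataset and
    `criteria` maps a criterion name to a predicate over that dictionary.
    The criteria and field names below are illustrative placeholders.
    """
    report = {}
    for name, predicate in criteria.items():
        try:
            report[name] = bool(predicate(dataset))
        except (KeyError, TypeError):
            report[name] = False      # a missing or malformed field counts as a failure
    return report

criteria = {
    "has_mission_id": lambda d: d["mission"] == "MEX",
    "products_present": lambda d: d["product_count"] > 0,
    "time_coverage_complete": lambda d: d["stop_time"] > d["start_time"],
}
dataset = {"mission": "MEX", "product_count": 1248,
           "start_time": "2004-01-10T00:00:00", "stop_time": "2004-06-30T23:59:59"}
print(validate_dataset(dataset, criteria))
```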
NASA Astrophysics Data System (ADS)
Song, Yunquan; Lin, Lu; Jian, Ling
2016-07-01
Single-index varying-coefficient model is an important mathematical modeling method to model nonlinear phenomena in science and engineering. In this paper, we develop a variable selection method for high-dimensional single-index varying-coefficient models using a shrinkage idea. The proposed procedure can simultaneously select significant nonparametric components and parametric components. Under defined regularity conditions, with appropriate selection of tuning parameters, the consistency of the variable selection procedure and the oracle property of the estimators are established. Moreover, due to the robustness of the check loss function to outliers in the finite samples, our proposed variable selection method is more robust than the ones based on the least squares criterion. Finally, the method is illustrated with numerical simulations.
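The robustness mentioned above comes from the check (quantile) loss, which penalizes residuals asymmetrically and, at tau = 0.5, reduces to the absolute-deviation loss. The sketch below evaluates a check-loss objective with an L1 shrinkage penalty for a plain linear model; the single-index and varying-coefficient structure of the paper's estimator is omitted, so this is only an illustration of the criterion, not the proposed procedure.

```python
import numpy as np

def check_loss(residuals, tau=0.5):
    """Koenker-Bassett check loss; tau = 0.5 gives the absolute-deviation loss."""
    residuals = np.asarray(residuals, dtype=float)
    return np.sum(residuals * (tau - (residuals < 0)))

def penalized_objective(beta, X, y, tau=0.5, lam=0.1):
    """Check loss plus an L1 (lasso-type) shrinkage penalty on the coefficients.

    This is a sketch of the kind of criterion such shrinkage methods
    minimize, written for an ordinary linear model rather than the paper's
    single-index varying-coefficient model.
    """
    residuals = y - X @ beta
    return check_loss(residuals, tau) + lam * np.sum(np.abs(beta))

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
y = X @ np.array([1.5, 0.0, -2.0]) + rng.normal(scale=0.3, size=50)
print(penalized_objective(np.array([1.5, 0.0, -2.0]), X, y))
print(penalized_objective(np.zeros(3), X, y))   # the zero vector fits much worse
```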
Evaluation of platinum resistance thermometers
NASA Technical Reports Server (NTRS)
Daryabeigi, Kamran; Dillon-Townes, Lawrence A.
1988-01-01
An evaluation procedure for the characterization of industrial platinum resistance thermometers (PRTs) for use in the temperature range -120 to 160 C was investigated. This evaluation procedure consisted of calibration, thermal stability and hysteresis testing of four surface-measuring PRTs. Five different calibration schemes were investigated for these sensors. The IPTS-68 formulation produced the most accurate results, yielding an average sensor systematic error of 0.02 C and a random error of 0.1 C. The sensors were checked for thermal stability by successive thermal cycling between room temperature, 160 C, and the boiling point of nitrogen. All the PRTs suffered from instability and hysteresis. The applicability of the self-heating technique as an in situ method for checking the calibration of PRTs located inside wind tunnels was investigated.
Group 4: Instructor training and qualifications
NASA Technical Reports Server (NTRS)
Sessa, R.
1981-01-01
Each professional instructor or check airman used in a LOFT training course should complete an FAA-approved training course in the appropriate aircraft type. Instructors used in such courses need not be type-rated. If an instructor or check airman who is not presently line-qualified is used as a LOFT instructor, he or she should remain current in line-operational procedures by observing operating procedures from the jump seat on three typical line segments per 90 days on the appropriate aircraft type. ("Line qualification" means completion as a flight crew member of at least three typical line segments per 90 days on the appropriate aircraft type.) The training should include the requirement of four hours of LOFT training, in lieu of actual aircraft training or line operating experience.
Blood venous sample collection: Recommendations overview and a checklist to improve quality.
Giavarina, Davide; Lippi, Giuseppe
2017-07-01
The extra-analytical phases of the total testing process have substantial impact on managed care, as well as an inherent high risk of vulnerability to errors which is often greater than that of the analytical phase. The collection of biological samples is a crucial preanalytical activity. Problems or errors occurring shortly before, or soon after, this preanalytical step may impair sample quality and characteristics, or else modify the final results of testing. The standardization of fasting requirements, rest, patient position and psychological state of the patient are therefore crucial for mitigating the impact of preanalytical variability. Moreover, the quality of materials used for collecting specimens, along with their compatibility, can guarantee sample quality and persistence of chemical and physical characteristics of the analytes over time, so safeguarding the reliability of testing. Appropriate techniques and sampling procedures are effective to prevent problems such as hemolysis, undue clotting in the blood tube, draw of insufficient sample volume and modification of analyte concentration. An accurate identification of both patient and blood samples is a key priority as for other healthcare activities. Good laboratory practice and appropriate training of operators, by specifically targeting collection of biological samples, blood in particular, may greatly improve this issue, thus lowering the risk of errors and their adverse clinical consequences. The implementation of a simple and rapid check-list, including verification of blood collection devices, patient preparation and sampling techniques, was found to be effective for enhancing sample quality and reducing some preanalytical errors associated with these procedures. The use of this tool, along with implementation of objective and standardized systems for detecting non-conformities related to unsuitable samples, can be helpful for standardizing preanalytical activities and improving the quality of laboratory diagnostics, ultimately helping to reaffirm a "preanalytical" culture founded on knowledge and real risk perception. Copyright © 2017 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
Use of food label information by urban consumers in India - a study among supermarket shoppers.
Vemula, Sudershan R; Gavaravarapu, SubbaRao M; Mendu, Vishnu Vardhana Rao; Mathur, Pulkit; Avula, Laxmaiah
2014-09-01
To study consumer knowledge and use of food labels. A cross-sectional study employing both quantitative and qualitative methods. Intercept interviews were conducted with 1832 consumers at supermarket sites selected using a stratified random sampling procedure. This information was triangulated with twenty-one focus group discussions. New Delhi and Hyderabad, two metro-cities from north and south India. Adolescent (10-19 years), adult (20-59 years) and elderly (≥60 years) consumers. While the national urban literacy rate is 84 %, about 99 % of the study participants were educated. About 45 % reported that they buy pre-packaged foods once weekly and about a fifth buy them every day. Taste, quality, convenience and ease of use are the main reasons for buying pre-packaged foods. Although 90 % of consumers across the age groups read food labels, the majority (81 %) looked only for the manufacturing date or expiry/best before date. Of those who read labels, only a third checked nutrition information and ingredients. Nutrient information on labels was not often read because most consumers either lacked nutrition knowledge or found the information too technical to understand. About 60 % read quality symbols. A positive association was found between education level and checking various aspects of food labels. Women and girls concerned about 'fat' and 'sugar' intake read the nutrition facts panel. The intention of promoting healthy food choices through use of food labels is not being completely met. Since a majority of people found it difficult to comprehend nutrition information, there is a need to take up educational activities and/or introduce new forms of labelling.
The Accounting Network: How Financial Institutions React to Systemic Crisis
Puliga, Michelangelo; Flori, Andrea; Pappalardo, Giuseppe; Chessa, Alessandro; Pammolli, Fabio
2016-01-01
The role of Network Theory in the study of financial crises has received wide attention in recent years. It has been shown how the network topology and the dynamics running on top of it can trigger the outbreak of large systemic crises. Following this methodological perspective, we introduce here the Accounting Network, i.e., the network that can be extracted through vector similarity techniques from companies' financial statements. We build the Accounting Network on a large database of worldwide banks in the period 2001-2013, covering the onset of the global financial crisis of mid-2007. After a careful data cleaning, we apply a quality check in the construction of the network, introducing a parameter (the Quality Ratio) capable of trading off the size of the sample (coverage) and the representativeness of the financial statements (accuracy). We compute several basic network statistics and check, with the Louvain community detection algorithm, for emerging communities of banks. Remarkably, sensible regional aggregations show up, with the Japanese and US clusters dominating the community structure, although the presence of a geographically mixed community points to a gradual convergence of banks toward similar supranational practices. Finally, a Principal Component Analysis procedure reveals the main economic components that influence community heterogeneity. Even using the most basic vector similarity hypotheses on the composition of the financial statements, the signature of the financial crisis clearly arises across the years around 2008. We finally discuss how Accounting Networks can be improved to reflect best practices in financial statement analysis. PMID:27736865
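A toy version of the similarity-network construction can be sketched with NumPy and NetworkX: cosine similarity between synthetic "balance-sheet" vectors, an assumed similarity threshold to draw edges, and Louvain community detection. The threshold and data are illustrative assumptions and do not reproduce the paper's Quality Ratio procedure.

```python
import numpy as np
import networkx as nx
from networkx.algorithms.community import louvain_communities

rng = np.random.default_rng(1)
n_banks, n_items = 30, 12
# Synthetic "financial statement" vectors (rows = banks, columns = statement items).
statements = rng.random((n_banks, n_items))

# Cosine similarity between banks' statement compositions.
normed = statements / np.linalg.norm(statements, axis=1, keepdims=True)
sim = normed @ normed.T

# Assumed similarity threshold for drawing an edge (illustrative only).
THRESHOLD = 0.85
G = nx.Graph()
G.add_nodes_from(range(n_banks))
for i in range(n_banks):
    for j in range(i + 1, n_banks):
        if sim[i, j] >= THRESHOLD:
            G.add_edge(i, j, weight=float(sim[i, j]))

communities = louvain_communities(G, seed=1)
print(f"{G.number_of_edges()} edges, {len(communities)} communities")
```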
2015-12-01
... Contamination in Distillate Fuels (Visual Inspection Procedures), as a final check of fuel to ensure aviation fuel is clear and bright before flight ... Laboratories at the Detroit Arsenal. The online procedure for evaluating the light obscuration particle counters was modified from the concepts found ...
Wanja, Elizabeth; Achilla, Rachel; Obare, Peter; Adeny, Rose; Moseti, Caroline; Otieno, Victor; Morang'a, Collins; Murigi, Ephantus; Nyamuni, John; Monthei, Derek R; Ogutu, Bernhards; Buff, Ann M
2017-05-25
One objective of the Kenya National Malaria Strategy 2009-2017 is scaling up access to prompt diagnosis and effective treatment. In 2013, a quality assurance (QA) pilot was implemented to improve accuracy of malaria diagnostics at selected health facilities in low-transmission counties of Kenya. Trends in malaria diagnostic and QA indicator performance during the pilot are described. From June to December 2013, 28 QA officers provided on-the-job training and mentoring for malaria microscopy, malaria rapid diagnostic tests and laboratory QA/quality control (QC) practices over four 1-day visits at 83 health facilities. QA officers observed and recorded laboratory conditions and practices and cross-checked blood slides for malaria parasite presence, and a portion of cross-checked slides were confirmed by reference laboratories. Eighty (96%) facilities completed the pilot. Among 315 personnel at pilot initiation, 13% (n = 40) reported malaria diagnostics training within the previous 12 months. Slide positivity ranged from 3 to 7%. Compared to the reference laboratory, microscopy sensitivity ranged from 53 to 96% and positive predictive value from 39 to 53% for facility staff, and from 60 to 96% and 52 to 80%, respectively, for QA officers. Compared to reference, specificity ranged from 88 to 98% and negative predictive value from 98 to 99% for health-facility personnel, and from 93 to 99% and 99%, respectively, for QA officers. The kappa value ranged from 0.48 to 0.66 for facility staff and from 0.57 to 0.84 for QA officers compared to reference. The only significant test performance improvement observed for facility staff was for specificity, from 88% (95% CI 85-90%) to 98% (95% CI 97-99%). QA/QC practices, including use of positive-control slides, internal and external slide cross-checking and recording of QA/QC activities, all increased significantly across the pilot (p < 0.001). Reference material availability also increased significantly; availability of six microscopy job aids and seven microscopy standard operating procedures increased by a mean of 32 percentage points (p < 0.001) and 38 percentage points (p < 0.001), respectively. Significant gains were observed in malaria QA/QC practices over the pilot. However, these advances did not translate into improved accuracy of malaria diagnostic performance, perhaps because of the limited duration of the QA pilot implementation.
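The test-performance measures quoted above follow directly from a 2x2 confusion table. The sketch below computes sensitivity, specificity, predictive values, and Cohen's kappa from illustrative, made-up counts (not the pilot's data).

```python
def diagnostic_metrics(tp: int, fp: int, fn: int, tn: int) -> dict:
    """Sensitivity, specificity, PPV, NPV and Cohen's kappa from a 2x2 table."""
    total = tp + fp + fn + tn
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    ppv = tp / (tp + fp)
    npv = tn / (tn + fn)
    # Cohen's kappa: observed vs. chance agreement with the reference standard.
    p_obs = (tp + tn) / total
    p_chance = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / total**2
    kappa = (p_obs - p_chance) / (1 - p_chance)
    return {"sensitivity": sens, "specificity": spec,
            "PPV": ppv, "NPV": npv, "kappa": kappa}

# Illustrative counts only: facility slide readings cross-checked against reference.
print(diagnostic_metrics(tp=45, fp=40, fn=10, tn=905))
```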
Sub-pixel analysis to support graphic security after scanning at low resolution
NASA Astrophysics Data System (ADS)
Haas, Bertrand; Cordery, Robert; Gou, Hongmei; Decker, Steve
2006-02-01
Whether in the domain of audio, video or finance, our world tends to become increasingly digital. However, for diverse reasons, the transition from analog to digital is often much extended in time, and proceeds by long steps (and sometimes never completes). One such step is the conversion of information on analog media to digital information. We focus in this paper on the conversion (scanning) of printed documents to digital images. Analog media have the advantage over digital channels that they can harbor much imperceptible information that can be used for fraud detection and forensic purposes. But this secondary information usually fails to be retrieved during the conversion step. This is particularly relevant since the Check 21 Act (Check Clearing for the 21st Century Act) became effective in 2004 and allows banks to handle images of checks like the usual paper checks. We use here this situation of check scanning as our primary benchmark for graphic security features after scanning. We will first present a quick review of the most common graphic security features currently found on checks, with their specific purpose, qualities and disadvantages, and we demonstrate their poor survivability after scanning under the average scanning conditions expected from the Check 21 Act. We will then present a novel method of measurement of distances between and rotations of line elements in a scanned image: based on an appropriate print model, we refine direct measurements to an accuracy beyond the size of a scanning pixel, so we can then determine expected distances, periodicity, sharpness and print quality of known characters, symbols and other graphic elements in a document image. Finally we will apply our method to fraud detection of documents after gray-scale scanning at 300 dpi resolution. We show in particular that alterations on legitimate checks or copies of checks can be successfully detected by measuring with sub-pixel accuracy the irregularities inherently introduced by the illegitimate process.
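The general idea of locating a feature to better than one pixel can be illustrated with a simple parabolic interpolation around the peak of a sampled line profile. This generic sketch is not the authors' print-model-based refinement; the profile is synthetic.

```python
import numpy as np

def subpixel_peak(profile: np.ndarray) -> float:
    """Locate the peak of a 1-D profile with sub-pixel precision.

    Fits a parabola through the maximum sample and its two neighbours and
    returns the fractional index of the vertex.
    """
    i = int(np.argmax(profile))
    i = max(1, min(i, len(profile) - 2))          # keep neighbours in range
    y0, y1, y2 = profile[i - 1], profile[i], profile[i + 1]
    denom = y0 - 2.0 * y1 + y2
    offset = 0.0 if denom == 0 else 0.5 * (y0 - y2) / denom
    return i + offset

# Synthetic scan line: a blurred line feature centred at pixel 12.37.
x = np.arange(25, dtype=float)
profile = np.exp(-0.5 * ((x - 12.37) / 1.5) ** 2)
print(f"estimated centre: {subpixel_peak(profile):.2f} px (true 12.37)")
```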
Pascucci, Simone; Bassani, Cristiana; Palombo, Angelo; Poscolieri, Maurizio; Cavalli, Rosa
2008-01-01
This paper describes a fast procedure for evaluating asphalt pavement surface defects using airborne emissivity data. To develop this procedure, we used airborne multispectral emissivity data covering an urban test area close to Venice (Italy). For this study, we first identified and selected the roads' asphalt pavements on Multispectral Infrared Visible Imaging Spectrometer (MIVIS) imagery using a segmentation procedure. Next, since the surface defects of asphalt pavements are strictly related to the decrease of oily components, which causes an increase in the abundance of surfacing limestone, the diagnostic absorption emissivity peak of limestone at 11.2 μm was used to retrieve from MIVIS emissivity data the areas exhibiting defects on the asphalt pavement surface. The results showed that MIVIS emissivity allows a threshold to be established that points out the asphalt road sites on which a check for a maintenance intervention is required. Therefore, this technique can supply local government authorities with an efficient, rapid and repeatable road mapping procedure providing the location of the asphalt pavements to be checked. PMID:27879765
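In spirit, the thresholding step reduces to masking road pixels whose emissivity in the limestone band departs from a reference value by more than a chosen limit. The NumPy sketch below uses invented band values, an assumed reference, and an assumed threshold purely to show the mechanics; it is not the paper's calibrated procedure.

```python
import numpy as np

rng = np.random.default_rng(2)

# Invented emissivity values at the 11.2 um band for pixels already
# classified as road asphalt (the segmentation step is assumed done).
emissivity_11_2 = rng.normal(loc=0.95, scale=0.01, size=(100, 100))
road_mask = np.ones_like(emissivity_11_2, dtype=bool)

REFERENCE = 0.95   # assumed emissivity of sound asphalt (illustrative)
THRESHOLD = 0.02   # assumed departure flagging a possible surface defect

defect_mask = road_mask & (np.abs(emissivity_11_2 - REFERENCE) > THRESHOLD)
fraction = defect_mask.sum() / road_mask.sum()
print(f"road pixels flagged for a maintenance check: {fraction:.1%}")
```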
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tan, J; Pompos, A; Jiang, S
Purpose: To put forth an innovative clinical paradigm for weekly chart checking so that treatment status is periodically checked accurately and efficiently. This study also aims to help optimize the chart checking clinical workflow in a busy radiation therapy clinic. Methods: The Texas Administrative Code mandates that radiation therapy patient charts be checked once a week or every five fractions; however, when and how this is done varies drastically among institutions. Some check charts every day, but a lot of effort is wasted on opening ineligible charts; others check on a fixed day, but the distribution of intervals between subsequent checks is not optimal. To establish an optimal chart checking procedure, a new paradigm was developed to achieve the following: 1) charts are checked more accurately and more efficiently; 2) charts are checked on optimal days without any misses; 3) the workload is evened out throughout the week when multiple physicists are involved. All active charts are accessed by querying the R&V system. A priority is assigned to each chart based on the number of days before the next due date, followed by sorting and workload distribution steps. New charts are also taken into account when distributing the workload so that it is reasonably even throughout the week. Results: Our clinical workflow became more streamlined and smooth. In addition, charts are checked in a more timely fashion, so that errors would be caught earlier should they occur. Conclusion: We developed a new weekly chart checking paradigm. It helps physicists check charts in a timely manner, saves their time in busy clinics, and consequently reduces possible errors.
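A minimal sketch of the scheduling idea, priority by days remaining until the next weekly-check due date followed by an even split of the day's charts among physicists, is shown below. The chart records and the round-robin assignment are assumptions for illustration, not the authors' implementation.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class Chart:
    patient_id: str
    next_check_due: date   # e.g., one week / five fractions after the last check

def assign_daily_checks(charts, physicists, today=None):
    """Sort active charts by urgency and spread them evenly across physicists."""
    today = today or date.today()
    # Fewest days until the due date = highest priority.
    ordered = sorted(charts, key=lambda c: (c.next_check_due - today).days)
    assignments = {p: [] for p in physicists}
    for i, chart in enumerate(ordered):
        assignments[physicists[i % len(physicists)]].append(chart.patient_id)
    return assignments

charts = [Chart("PT001", date(2024, 6, 3)),
          Chart("PT002", date(2024, 6, 5)),
          Chart("PT003", date(2024, 6, 4))]
print(assign_daily_checks(charts, ["physicist_A", "physicist_B"],
                          today=date(2024, 6, 3)))
```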
AZ-101 Mixer Pump Test Qualification Test Procedures (QTP)
DOE Office of Scientific and Technical Information (OSTI.GOV)
THOMAS, W.K.
2000-01-10
Describes the qualification test procedure for the AZ-101 Mixer Pump Data Acquisition System (DAS). The purpose of this Qualification Test Procedure (QTP) is to confirm that the AZ-101 Mixer Pump System has been properly programmed and that the hardware is configured correctly. This QTP will test the software setpoints for the alarms and also check the wiring configuration from the SIMcart to the HMI. An Acceptance Test Procedure (ATP), similar to this QTP, will be performed to test field devices and connections from the field.
Quality assessment concept of the World Data Center for Climate and its application to CMIP5 data
NASA Astrophysics Data System (ADS)
Stockhause, M.; Höck, H.; Toussaint, F.; Lautenschlager, M.
2012-08-01
The preservation of data in a state of high quality suitable for interdisciplinary use is one of the most pressing and challenging current issues in long-term archiving. For high-volume data such as climate model data, the data and data replicas are no longer stored centrally but are distributed over several local data repositories, e.g. the data of the Coupled Model Intercomparison Project Phase 5 (CMIP5). The most important part of the data is to be archived, assigned a DOI, and published according to the World Data Center for Climate's (WDCC) application of the DataCite regulations. The data quality assessment, an integrated part of WDCC's data publication process, was adapted to the requirements of a federated data infrastructure. A concept of a distributed and federated quality assessment procedure was developed, in which the workload and responsibility for quality control are shared between the three primary CMIP5 data centers: the Program for Climate Model Diagnosis and Intercomparison (PCMDI), the British Atmospheric Data Centre (BADC), and WDCC. This distributed quality control concept, its pilot implementation for CMIP5, and first experiences are presented. The distributed quality control approach is capable of identifying data inconsistencies and of making quality results immediately available to data creators, data users and data infrastructure managers. Continuous publication of new data versions and slow data replication prevent the quality checks from being completed. This, together with ongoing developments of the data and metadata infrastructure, requires adaptations in the code and concept of the distributed quality control approach.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barbee, D; McCarthy, A; Galavis, P
Purpose: Errors found during initial physics plan checks frequently require replanning and reprinting, resulting in decreased departmental efficiency. Additionally, errors may be missed during physics checks, resulting in potential treatment errors or interruption. This work presents a process control created using the Eclipse Scripting API (ESAPI) enabling dosimetrists and physicists to detect potential errors in the Eclipse treatment planning system prior to performing any plan approvals or printing. Methods: Potential failure modes for five categories were generated based on available ESAPI (v11) patient object properties: Images, Contours, Plans, Beams, and Dose. An Eclipse script plugin (PlanCheck) was written in C# to check errors most frequently observed clinically in each of the categories. The PlanCheck algorithms were devised to check technical aspects of plans, such as deliverability (e.g., minimum EDW MUs), in addition to ensuring that policies and procedures relating to planning were being followed. The effect on clinical workflow efficiency was measured by tracking the plan document error rate and plan revision/retirement rates in the Aria database over monthly intervals. Results: The number of potential failure modes the PlanCheck script is currently capable of checking in each category is: Images (6), Contours (7), Plans (8), Beams (17), and Dose (4). Prior to implementation of the PlanCheck plugin, the observed error rates for plan documents and revised/retired plans in the Aria database were 20% and 22%, respectively. Error rates were seen to decrease gradually over time as adoption of the script improved. Conclusion: A process control created using the Eclipse Scripting API enabled plan checks to occur within the planning system, resulting in a reduction in error rates and improved efficiency. Future work includes initiating a full FMEA for the planning workflow, extending categories to include additional checks outside of ESAPI via Aria database queries, and eventually automating plan checks.
IEC 61511 and the capital project process--a protective management system approach.
Summers, Angela E
2006-03-17
This year, the process industry has reached an important milestone in process safety: the acceptance of an internationally recognized standard for safety instrumented systems (SIS). This standard, IEC 61511, documents good engineering practice for the assessment, design, operation, maintenance, and management of SISs. The foundation of the standard is established by several requirements in Part 1, Clauses 5-7, which cover the development of a management system aimed at ensuring that functional safety is achieved. The management system includes a quality assurance process for the entire SIS lifecycle, requiring the development of procedures, identification of resources and acquisition of tools. For maximum benefit, the deliverables and quality control checks required by the standard should be integrated into the capital project process, addressing safety, environmental, plant productivity, and asset protection. Industry has become inundated with a multitude of programs focusing on safety, quality, and cost performance. This paper introduces a protective management system, which builds upon the work process identified in IEC 61511. Typical capital project phases are integrated with the management system to yield one comprehensive program to efficiently manage process risk. Finally, the paper highlights areas where internal practices or guidelines should be developed to improve program performance and cost effectiveness.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chitsazzadeh, S; Wells, D; Mestrovic, A
2016-06-15
Purpose: To develop a QA procedure for gated VMAT stereotactic ablative radiotherapy (SABR) treatments. Methods: An interface was constructed to attach the translational stage of a Quasar respiratory motion phantom to a pinpoint ion chamber insert and move the ion chamber inside an ArcCheck diode array. The Quasar phantom controller used a patient-specific breathing pattern to translate the ion chamber in a superior-inferior direction inside the ArcCheck. An amplitude-based RPM tracking system was specified to turn the beam on during the exhale phase of the breathing pattern. SABR plans were developed using Eclipse for liver PTVs ranging in size from 3 to 12 cm in diameter using a 2-arc VMAT technique. Dose was measured in the middle of the penumbra region, where the high dose gradient allowed for sensitive detection of any inaccuracies in gated dose delivery. The overall fidelity of the dose distribution was confirmed using ArcCheck. The sensitivity of the gating QA procedure was investigated with respect to the following four parameters: PTV size, duration of exhale, baseline drift, and gating window size. Results: The difference between the measured dose to a point in the penumbra and the Eclipse-calculated dose was under 2% for small residual motions. The QA procedure was independent of PTV size and duration of exhale. Baseline drift and gating window size, however, significantly affected the penumbral dose measurement, with differences of up to 30% compared to Eclipse. Conclusion: This study described a highly sensitive QA procedure for gated VMAT SABR treatments. The QA outcome was dependent on the gating window size and baseline drift. Analysis of additional patient breathing patterns will be required to determine a clinically relevant gating window size and an appropriate tolerance level for this procedure.
Retrospective review of Contura HDR breast cases to improve our standardized procedure
DOE Office of Scientific and Technical Information (OSTI.GOV)
Iftimia, Ileana, E-mail: Ileana.n.iftimia@lahey.org; Cirino, Eileen T.; Ladd, Ron
2013-07-01
To retrospectively review our first 20 Contura high dose rate breast cases to improve and refine our standardized procedure and checklists. We prepared in advance checklists for all steps, developed an in-house Excel spreadsheet for second checking the plan, and generated a procedure for efficient contouring and a set of optimization constraints to meet the dose volume histogram criteria. Templates were created in our treatment planning system for structures, isodose levels, optimization constraints, and plan report. This study reviews our first 20 high dose rate Contura breast treatment plans. We followed our standardized procedure for contouring, planning, and second checking. The established dose volume histogram criteria were successfully met for all plans. For the cases studied here, the balloon-skin and balloon-ribs distances ranged between 5 and 43 mm and 1 and 33 mm, respectively; air/seroma volume to PTV_Eval volume ratio ≤ 5.5% (allowed ≤ 10%); asymmetry < 1.2 mm (goal ≤ 2 mm); PTV_Eval V90% ≥ 97.6%; PTV_Eval V95% ≥ 94.9%; skin maximum dose ≤ 98% of Rx; ribs maximum dose ≤ 137% of Rx; V150% ≤ 29.8 cc; V200% ≤ 7.8 cc; the total dwell time ranged from 225.4 to 401.9 seconds; and the second-check agreement was within 3%. Based on this analysis, more appropriate ranges for the total dwell time and balloon diameter tolerance were found. Three major problems were encountered: balloon migration toward the skin for small balloon-to-skin distances, lumen obstruction, and length change for the flexible balloon. Solutions were found for these issues and our standardized procedure and checklists were updated accordingly. Based on our review of these cases, the use of checklists produced consistent results, indicating good coverage for the target without sacrificing the critical structures. This review helped us to refine our standardized procedure and update our checklists.
Is a quasi-3D dosimeter better than a 2D dosimeter for Tomotherapy delivery quality assurance?
NASA Astrophysics Data System (ADS)
Xing, Aitang; Deshpande, Shrikant; Arumugam, Sankar; George, Armia; Holloway, Lois; Vial, Philip; Goozee, Gary
2015-01-01
Delivery quality assurance (DQA) has been performed for each Tomotherapy patient using either ArcCHECK or MatriXX Evolution in our clinic since 2012. ArcCHECK is a quasi-3D dosimeter, whereas MatriXX is a 2D detector. A review of DQA results was performed for all patients over the last three years, a total of 221 DQA plans. These DQA plans came from 215 patients with a variety of treatment sites, including head and neck, pelvis, and chest wall. The acceptable Gamma pass rate in our clinic is over 95% using 3 mm and 3% of the maximum planned dose with a 10% dose threshold. The mean value and standard deviation of Gamma pass rates were 98.2% ± 1.98 (1 SD) for MatriXX and 98.5% ± 1.88 (1 SD) for ArcCHECK. A paired t-test was also performed for the group of patients whose DQA was performed with both the ArcCHECK and MatriXX. No statistically significant difference in Gamma pass rate was found between ArcCHECK and MatriXX. The considered 3D and 2D dosimeters achieved similar results in performing routine patient-specific DQA for patients treated on a TomoTherapy unit.
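The paired comparison mentioned above can be reproduced in a few lines with SciPy; the pass-rate arrays here are invented stand-ins for the per-patient DQA results, used only to show the mechanics of the test.

```python
import numpy as np
from scipy import stats

# Invented per-patient gamma pass rates (%) measured with both devices.
arccheck = np.array([98.9, 97.5, 99.2, 98.1, 96.8, 99.5, 98.4, 97.9])
matrixx  = np.array([98.4, 97.9, 98.8, 98.3, 96.5, 99.1, 98.6, 97.4])

t_stat, p_value = stats.ttest_rel(arccheck, matrixx)
print(f"paired t = {t_stat:.2f}, p = {p_value:.3f}")
```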
Garrido-Delgado, Rocío; Arce, Lourdes; Valcárcel, Miguel
2012-01-01
The potential of a headspace device coupled to multi-capillary column-ion mobility spectrometry has been studied as a screening system to differentiate virgin olive oils ("lampante," "virgin," and "extra virgin" olive oil). The last two types are virgin olive oil samples of very similar characteristics, which are very difficult to distinguish with the existing analytical method. The procedure involves the direct introduction of the virgin olive oil sample into a vial, headspace generation, and automatic injection of the volatiles into a gas chromatograph-ion mobility spectrometer. The data obtained after the analysis, in duplicate, of 98 samples of three different categories of virgin olive oils were preprocessed and submitted to a detailed chemometric treatment to classify the virgin olive oil samples according to their sensory quality. The same virgin olive oil samples were also analyzed by an expert panel to establish their category, and these data were used as reference values to check the potential of this new screening system. This comparison confirms the potential of the results presented here. The model was able to classify 97% of virgin olive oil samples into their corresponding group. Finally, the chemometric method was validated, obtaining a prediction rate of 87%. These results provide promising perspectives for the use of ion mobility spectrometry to differentiate virgin olive oil samples according to their quality instead of using the classical analytical procedure.
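As a generic stand-in for the chemometric treatment (the abstract does not name the classifier), the sketch below cross-validates a linear discriminant model on synthetic features and reports the fraction of samples assigned to the correct quality category. The feature counts and class shifts are invented for illustration.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)
n_per_class, n_features = 33, 20   # roughly 99 spectra, invented feature count

# Synthetic ion-mobility features for three quality categories with a small
# class-dependent shift (extra virgin / virgin / lampante, illustrative only).
X = np.vstack([rng.normal(loc=shift, size=(n_per_class, n_features))
               for shift in (0.0, 0.4, 0.8)])
y = np.repeat(["extra_virgin", "virgin", "lampante"], n_per_class)

scores = cross_val_score(LinearDiscriminantAnalysis(), X, y, cv=5)
print(f"cross-validated classification rate: {scores.mean():.0%}")
```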
40 CFR 90.320 - Carbon dioxide analyzer calibration.
Code of Federal Regulations, 2011 CFR
2011-07-01
... (64 percent) is required (see following table). Example calibration points (%) Acceptable for... periodic interference, system check, and calibration test procedures specified in 40 CFR part 1065...
40 CFR 90.320 - Carbon dioxide analyzer calibration.
Code of Federal Regulations, 2013 CFR
2013-07-01
... (64 percent) is required (see following table). Example calibration points (%) Acceptable for... periodic interference, system check, and calibration test procedures specified in 40 CFR part 1065...
40 CFR 90.320 - Carbon dioxide analyzer calibration.
Code of Federal Regulations, 2012 CFR
2012-07-01
... (64 percent) is required (see following table). Example calibration points (%) Acceptable for... periodic interference, system check, and calibration test procedures specified in 40 CFR part 1065...
Quality control for federal clean water act and safe drinking water act regulatory compliance.
Askew, Ed
2013-01-01
QC sample results are required in order to have confidence in the results from analytical tests. Some of the AOAC water methods include specific QC procedures, frequencies, and acceptance criteria. These are considered to be the minimum controls needed to perform the method successfully. Some regulatory programs, such as those in 40 CFR Part 136.7, require additional QC or have alternative acceptance limits. Essential QC measures include method calibration, reagent standardization, assessment of each analyst's capabilities, analysis of blind check samples, determination of the method's sensitivity (method detection level or quantification limit), and daily evaluation of bias, precision, and the presence of laboratory contamination or other analytical interference. The details of these procedures, their performance frequency, and expected ranges of results are set out in this manuscript. The specific regulatory requirements of 40 CFR Part 136.7 for the Clean Water Act, the laboratory certification requirements of 40 CFR Part 141 for the Safe Drinking Water Act, and the ISO 17025 accreditation requirements under The NELAC Institute are listed.
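One of the QC measures listed, determination of the method detection limit, is conventionally computed from replicate low-level spikes as the one-sided 99% Student's t value times the replicate standard deviation. The sketch below applies that textbook formula to invented replicate results; it is a general illustration, not the regulatory text of 40 CFR Part 136.

```python
import numpy as np
from scipy import stats

# Invented concentrations (ug/L) from seven replicate low-level spiked samples.
replicates = np.array([0.52, 0.49, 0.55, 0.47, 0.51, 0.53, 0.48])

n = replicates.size
s = replicates.std(ddof=1)                  # sample standard deviation
t99 = stats.t.ppf(0.99, df=n - 1)           # one-sided 99% Student's t
mdl = t99 * s
print(f"s = {s:.3f} ug/L, t(0.99, {n-1}) = {t99:.3f}, MDL = {mdl:.3f} ug/L")
```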
DNA origami-based standards for quantitative fluorescence microscopy.
Schmied, Jürgen J; Raab, Mario; Forthmann, Carsten; Pibiri, Enrico; Wünsch, Bettina; Dammeyer, Thorben; Tinnefeld, Philip
2014-01-01
Validating and testing a fluorescence microscope or a microscopy method requires defined samples that can be used as standards. DNA origami is a new tool that provides a framework to place defined numbers of small molecules such as fluorescent dyes or proteins in a programmed geometry with nanometer precision. The flexibility and versatility in the design of DNA origami microscopy standards makes them ideally suited for the broad variety of emerging super-resolution microscopy methods. As DNA origami structures are durable and portable, they can become a universally available specimen to check the everyday functionality of a microscope. The standards are immobilized on a glass slide, and they can be imaged without further preparation and can be stored for up to 6 months. We describe a detailed protocol for the design, production and use of DNA origami microscopy standards, and we introduce a DNA origami rectangle, bundles and a nanopillar as fluorescent nanoscopic rulers. The protocol provides procedures for the design and realization of fluorescent marks on DNA origami structures, their production and purification, quality control, handling, immobilization, measurement and data analysis. The procedure can be completed in 1-2 d.
Check out the Atmospheric Science User Forum
Atmospheric Science Data Center
2016-11-16
Check out the Atmospheric Science User Forum Tuesday, November 15, 2016 The ASDC would like to bring your attention to the Atmospheric Science User Forum. The purpose of this forum is to improve user service, quality, and efficiency of NASA atmospheric science data. The forum intends to provide a quick and easy way to facilitate ...
A procedure and program to calculate shuttle mask advantage
NASA Astrophysics Data System (ADS)
Balasinski, A.; Cetin, J.; Kahng, A.; Xu, X.
2006-10-01
A well-known recipe for reducing the mask cost component in product development is to place non-redundant elements of layout databases related to multiple products on one reticle plate [1,2]. Such reticles are known as multi-product, multi-layer, or, in general, multi-IP masks. The composition of the mask set should minimize not only the layout placement cost, but also the cost of the manufacturing process, design flow setup, and product design and introduction to market. An important factor is the quality check, which should be expeditious and enable thorough visual verification to avoid costly modifications once the data is transferred to the mask shop. In this work, in order to enable the layer placement and quality check procedure, we proposed an algorithm where mask layers are first lined up according to price and field tone [3]. Then, depending on the product die size, expected fab throughput, and scribeline requirements, the subsequent product layers are placed on masks of different grades. The actual reduction of this concept to practice allowed us to understand the tradeoffs between the automation of layer placement and setup-related constraints. For example, the limited options for the number of layers per plate, dictated by the die size and other design feedback, made us consider layer pairing based not only on the final price of the mask set, but also on the cost of mask design and fab-friendliness. We showed that it may be advantageous to introduce manual layer pairing to ensure that, e.g., all interconnect layers are placed on the same plate, allowing for easy and simultaneous design fixes. Another enhancement was to allow some flexibility in mixing and matching of the layers such that non-critical ones requiring a low mask grade would be placed in a less restrictive way, to reduce the count of orphan layers. In summary, we created a program to automatically propose and visualize shuttle mask architectures for design verification, with enhancements arising from the actual application of the code.
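The placement idea, line layers up by mask grade and then fill plates subject to a per-plate layer limit, can be sketched as a simple greedy grouping. The grades, layer names, and layers-per-plate limit below are invented for illustration and ignore die-size and scribeline constraints, so this is a toy version of the approach rather than the authors' program.

```python
# Invented (layer, mask grade) pairs: lower grade number = more expensive plate.
layers = [("poly", 1), ("contact", 1), ("metal1", 2), ("via1", 2),
          ("metal2", 2), ("implant_n", 3), ("implant_p", 3), ("pad", 4)]

LAYERS_PER_PLATE = 2   # assumed limit set by die size and field layout

def pair_layers(layers, per_plate):
    """Greedy grouping: sort by grade so each plate holds layers of one price class."""
    plates = []
    for layer, grade in sorted(layers, key=lambda item: item[1]):
        # Reuse the last plate if it has room and the same grade, else open a new one.
        if plates and len(plates[-1]["layers"]) < per_plate and plates[-1]["grade"] == grade:
            plates[-1]["layers"].append(layer)
        else:
            plates.append({"grade": grade, "layers": [layer]})
    return plates

for i, plate in enumerate(pair_layers(layers, LAYERS_PER_PLATE), start=1):
    print(f"plate {i} (grade {plate['grade']}): {plate['layers']}")
```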
Broad-band BOS (BBYB) development and calibration in Taiwan
NASA Astrophysics Data System (ADS)
Lin, C. R.; Wang, C. C.; Kuo, B. Y.; Chen, P.; Jang, J. P.; Chang, H.; Laio, Y. C.; Chang, K. H.; Lin, F. S.
2016-12-01
Since 2009, Academia Sinica, the National Applied Research Laboratories, and National Sun Yat-sen University have formed an ocean bottom seismograph (OBS) development team to develop a sub-broadband OBS (called the Yardbird OBS). Through a series of deployment experiments on the seafloor offshore Taiwan, the team has collected a large amount of data that can be used to study plate tectonics, seismic activity, and source characteristics, with good results to date. Because of the bandwidth limitations of the Yardbird OBS, it is inadequate for analyzing global-scale earthquakes, so developing a broadband ocean bottom seismograph is an important goal for the development team. Currently, the broadband OBS (called BBYB) design and construction have completed the initial experimental phase. Underwater instruments always carry high risk: an accident can sink equipment that cannot be recovered, and even recovered equipment may have performed poorly and recorded no data, so the experiment mission cannot be accomplished. To improve OBS performance and to avoid losing an OBS or collecting incomplete data, every component (such as the seismic data recording device, balanced body, sonar dashboard, instruments, and internal wiring) must be of verified quality before assembly. Each component goes through very rigorous testing, and only good components are selected during the assembly process, to make sure that every OBS deployed under water can be successfully recovered after a long deployment with valuable data. In this presentation we show a series of testing procedures and results for qualifying each BBYB component, such as: data logger (digitizer sensitivity, sampling rate, clock timing); acoustic controller (functions of Enable, Disable, Range, Release 1, Release 2, Option 1 (release disable)); air pressure gauge for the glass ball (accuracy); a checklist of connector wiring checks for instrument assembly; and the deployment and recovery procedure for operation on deck. We hope that repeated testing will ensure the performance of the BBYB and that the testing concept can be applied to other equipment to improve instrument performance. Keywords: ocean bottom seismograph (OBS); data logger; acoustic controller; air pressure gauge.
An analysis of the ArcCHECK-MR diode array's performance for ViewRay quality assurance.
Ellefson, Steven T; Culberson, Wesley S; Bednarz, Bryan P; DeWerd, Larry A; Bayouth, John E
2017-07-01
The ArcCHECK-MR diode array utilizes a correction system with a virtual inclinometer to correct the angular response dependencies of the diodes. However, this correction system cannot be applied to measurements on the ViewRay MR-IGRT system because the virtual inclinometer is incompatible with the ViewRay's multiple simultaneous beams. Additionally, the ArcCHECK's current correction factors were determined without taking magnetic field effects into account. In the course of performing ViewRay IMRT quality assurance with the ArcCHECK, measurements were observed to be consistently higher than the ViewRay TPS predictions. The goals of this study were to quantify the observed discrepancies and test whether applying the current factors improves the ArcCHECK's accuracy for measurements on the ViewRay. Gamma and frequency analysis were performed on 19 ViewRay patient plans. Ion chamber measurements were performed at a subset of diode locations using a PMMA phantom with the same dimensions as the ArcCHECK. A new method for applying directionally dependent factors utilizing beam information from the ViewRay TPS was developed in order to analyze the current ArcCHECK correction factors. To test the current factors, nine ViewRay plans were altered to be delivered with only a single simultaneous beam and were measured with the ArcCHECK. The current correction factors were applied using both the new and current methods. The new method was also used to apply corrections to the original 19 ViewRay plans. It was found that the ArcCHECK systematically reports doses higher than those actually delivered by the ViewRay. Application of the current correction factors by either method did not consistently improve measurement accuracy. As dose deposition and diode response have both been shown to change under the influence of a magnetic field, it can be concluded that the current ArcCHECK correction factors are invalid and/or inadequate to correct measurements on the ViewRay system. © 2017 The Authors. Journal of Applied Clinical Medical Physics published by Wiley Periodicals, Inc. on behalf of American Association of Physicists in Medicine.
40 CFR 86.1422 - Analyzer calibration.
Code of Federal Regulations, 2010 CFR
2010-07-01
... Trucks; Certification Short Test Procedures § 86.1422 Analyzer calibration. (a) Determine that the... check. Prior to its introduction into service and at specified periods thereafter, the analyzer must...
40 CFR 91.320 - Carbon dioxide analyzer calibration.
Code of Federal Regulations, 2011 CFR
2011-07-01
... (64 percent) is required (see following table). Example calibration points (percent) Acceptable for...) The initial and periodic interference, system check, and calibration test procedures specified in 40...
40 CFR 91.320 - Carbon dioxide analyzer calibration.
Code of Federal Regulations, 2013 CFR
2013-07-01
... (64 percent) is required (see following table). Example calibration points (percent) Acceptable for...) The initial and periodic interference, system check, and calibration test procedures specified in 40...