Telecommunications Systems Career Ladder, AFSC 307X0.
1981-01-01
standard test tone levels; perform impulse noise tests; make in-service or out-of-service quality checks on composite signal transmission levels ... in-service or out-of-service quality control (QC) reports; maintain trouble and restoration record forms (DD Form 1443); direct circuit or system checks ... include: perform fault isolation on analog circuits; make in-service or out-of-service quality checks on voice frequency carrier telegraph (VFCT) terminals
NHEXAS PHASE I ARIZONA STUDY--STANDARD OPERATING PROCEDURE FOR FORM QA/QC CHECKS (UA-C-2.0)
The purpose of this SOP is to outline the process of Field Quality Assurance and Quality Control checks. This procedure was followed to ensure consistent data retrieval during the Arizona NHEXAS project and the "Border" study. Keywords: custody; QA/QC; field checks.
The Nation...
40 CFR 51.363 - Quality assurance.
Code of Federal Regulations, 2011 CFR
2011-07-01
... test, the evaporative system tests, and emission control component checks (as applicable); (vi...) A check of the Constant Volume Sampler flow calibration; (5) A check for the optimization of the... selection, and power absorption; (9) A check of the system's ability to accurately detect background...
The purpose of this SOP is to outline the process of field quality assurance and quality control checks. This procedure was followed to ensure consistent data retrieval during the Arizona NHEXAS project and the Border study. Keywords: custody; QA/QC; field checks.
The U.S.-Mex...
40 CFR 51.359 - Quality control.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 40 Protection of Environment 2 2014-07-01 2014-07-01 false Quality control. 51.359 Section 51.359 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS REQUIREMENTS FOR... to assure test accuracy. Computer control of quality assurance checks and quality control charts...
49 CFR 40.235 - What are the requirements for proper use and care of ASDs?
Code of Federal Regulations, 2014 CFR
2014-10-01
... ASD on the CPL. Your QAP must specify the methods used for quality control checks, temperatures at which the ASD must be stored and used, the shelf life of the device, and environmental conditions (e.g... the specified quality control checks or that has passed its expiration date. (e) As an employer, with...
The Automation of Nowcast Model Assessment Processes
2016-09-01
that will automate real-time WRE-N model simulations, collect and quality-control check weather observations for assimilation and verification, and...domains centered near White Sands Missile Range, New Mexico, where the Meteorological Sensor Array (MSA) will be located. The MSA will provide...observations and performing quality-control checks for the pre-forecast data assimilation period. 2. Run the WRE-N model to generate model forecast data
Impact of dose calibrators quality control programme in Argentina
NASA Astrophysics Data System (ADS)
Furnari, J. C.; de Cabrejas, M. L.; del C. Rotta, M.; Iglicki, F. A.; Milá, M. I.; Magnavacca, C.; Dima, J. C.; Rodríguez Pasqués, R. H.
1992-02-01
The national Quality Control (QC) programme for radionuclide calibrators started 12 years ago. Accuracy and the implementation of a QC programme were evaluated over all these years at 95 nuclear medicine laboratories where dose calibrators were in use. During all that time, the Metrology Group of CNEA has distributed 137Cs sealed sources to check stability and has been performing periodic "checking rounds" and postal surveys using unknown samples (external quality control). An account of the results of both methods is presented. At present, more than 65% of the dose calibrators measure activities with an error of less than 10%.
Langley Wind Tunnel Data Quality Assurance-Check Standard Results
NASA Technical Reports Server (NTRS)
Hemsch, Michael J.; Grubb, John P.; Krieger, William B.; Cler, Daniel L.
2000-01-01
A framework for statistical evaluation, control and improvement of wind tunnel measurement processes is presented. The methodology is adapted from elements of the Measurement Assurance Plans developed by the National Bureau of Standards (now the National Institute of Standards and Technology) for standards and calibration laboratories. The present methodology is based on the notions of statistical quality control (SQC) together with check standard testing and a small number of customer repeat-run sets. The results of check standard and customer repeat-run sets are analyzed using the statistical control chart methods of Walter A. Shewhart, long familiar to the SQC community. Control chart results are presented for various measurement processes in five facilities at Langley Research Center. The processes include test section calibration, force and moment measurements with a balance, and instrument calibration.
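The Shewhart individuals (X) chart at the heart of such check-standard monitoring can be sketched in a few lines. The following is an illustrative Python implementation, not the Langley analysis code; the 1.128 divisor is the standard d2 constant for a moving range of span 2, and the data values in the test are made up.

```python
# Sketch of a Shewhart individuals (X) control chart check, as used in
# statistical quality control of repeat-run and check-standard data.

def control_limits(values):
    """Return (center, lcl, ucl) for an individuals chart."""
    n = len(values)
    center = sum(values) / n
    # Average moving range of successive observations
    mr = sum(abs(values[i] - values[i - 1]) for i in range(1, n)) / (n - 1)
    sigma = mr / 1.128          # d2 constant for moving ranges of span 2
    return center, center - 3 * sigma, center + 3 * sigma

def out_of_control(values):
    """Indices of points outside the 3-sigma limits."""
    center, lcl, ucl = control_limits(values)
    return [i for i, v in enumerate(values) if v < lcl or v > ucl]
```

A check-standard result falling outside the 3-sigma limits would signal a measurement process out of statistical control.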
Ford, Eric C; Terezakis, Stephanie; Souranis, Annette; Harris, Kendra; Gay, Hiram; Mutic, Sasa
2012-11-01
To quantify the error-detection effectiveness of commonly used quality control (QC) measures. We analyzed incidents from 2007-2010 logged into voluntary in-house electronic incident learning systems at 2 academic radiation oncology clinics. None of the incidents resulted in patient harm. Each incident was graded for potential severity using the French Nuclear Safety Authority scoring scale; high potential severity incidents (score >3) were considered, along with a subset of 30 randomly chosen low severity incidents. Each report was evaluated to identify which of 15 common QC checks could have detected it. The effectiveness was calculated, defined as the percentage of incidents that each QC measure could detect, both for individual QC checks and for combinations of checks. In total, 4407 incidents were reported, 292 of which had high potential severity. High- and low-severity incidents were detectable by 4.0 ± 2.3 (mean ± SD) and 2.6 ± 1.4 QC checks, respectively (P<.001). All individual checks were less than 50% sensitive, with the exception of pretreatment plan review by a physicist (63%). An effectiveness of 97% was achieved with 7 checks used in combination and was not further improved with more checks. The combination of checks with the highest effectiveness includes physics plan review, physician plan review, Electronic Portal Imaging Device-based in vivo portal dosimetry, radiation therapist timeout, weekly physics chart check, the use of checklists, port films, and source-to-skin distance checks. Some commonly used QC checks, such as pretreatment intensity modulated radiation therapy QA, do not substantially add to the ability to detect errors in these data. The effectiveness of QC measures in radiation oncology depends sensitively on which checks are used and in which combinations. A small percentage of errors cannot be detected by any of the standard formal QC checks currently in broad use, suggesting that further improvements are needed.
These data require confirmation with a broader incident-reporting database. Copyright © 2012 Elsevier Inc. All rights reserved.
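The effectiveness calculation described above (the percentage of incidents detectable by at least one check in a set) can be sketched as follows. This is a hypothetical reconstruction, not the authors' analysis code; the check names and incident data in the test are invented for illustration.

```python
# Each incident is represented by the set of QC checks able to detect it;
# a combination's effectiveness is the fraction of incidents caught by at
# least one check in the combination.

def effectiveness(incidents, checks):
    """incidents: list of sets of check names able to detect each incident."""
    caught = sum(1 for detectable in incidents if detectable & set(checks))
    return caught / len(incidents)

def greedy_combination(incidents, all_checks, k):
    """Greedily pick k checks that maximize combined effectiveness."""
    chosen = []
    for _ in range(k):
        best = max(all_checks,
                   key=lambda c: effectiveness(incidents, chosen + [c]))
        chosen.append(best)
        all_checks = [c for c in all_checks if c != best]
    return chosen
```

A greedy search like this illustrates why effectiveness saturates: once the most broadly sensitive checks are chosen, additional checks mostly re-catch incidents already covered.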
NASA Astrophysics Data System (ADS)
Espinar, B.; Blanc, P.; Wald, L.; Hoyer-Klick, C.; Schroedter-Homscheidt, M.; Wanderer, T.
2012-04-01
Meteorological data measured by ground stations are often a key element in the development and validation of methods exploiting satellite images. These data are considered as a reference against which satellite-derived estimates are compared. Long-term radiation and meteorological measurements are available from a large number of measuring stations. However, close examination of the data often reveals a lack of quality, often for extended periods of time. This lack of quality has, in many cases, led to the rejection of large amounts of available data. Data quality must be checked before use in order to guarantee the inputs for the methods used in modelling, monitoring, forecasting, etc. To control their quality, data should be submitted to several conditions or tests; data not flagged by any of the tests are released as plausible. In this work, a bibliographical survey of quality control tests has been performed for the common meteorological variables (ambient temperature, relative humidity and wind speed) and for the usual solar radiometric variables (the horizontal global and diffuse components of solar radiation and the beam normal component). The different tests have been grouped according to the variable and the averaging period (sub-hourly, hourly, daily and monthly averages). The quality tests may be classified as follows: • Range checks: tests that verify that values are within a specific range. There are two types of range checks, those based on extrema and those based on rare observations. • Step checks: tests aimed at detecting unrealistic jumps or stagnation in the time series. • Consistency checks: tests that verify the relationship between two or more time series. The gathered quality tests are applicable to all latitudes, as they have not been optimized regionally or seasonally, with the aim of being generic.
They have been applied to ground measurements at several geographic locations, which resulted in the detection of some control tests that are no longer adequate, for various reasons. After the modification of some tests, based on our experience, a set of quality control tests is now presented, updated according to technology advances and classified. The presented set of quality tests allows radiation and meteorological data to be tested in order to assess their plausibility for use as inputs in theoretical or empirical methods for scientific research. The research leading to these results has partly received funding from the European Union's Seventh Framework Programme (FP7/2007-2013) under Grant Agreement no. 262892 (ENDORSE project).
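The three families of tests (range, step and consistency checks) can be illustrated with a minimal sketch; the thresholds and example series below are invented, not the published test limits.

```python
# Minimal illustration of the three QC test families for a time series.

def range_check(values, lo, hi):
    """Flag values outside a physically plausible range."""
    return [not (lo <= v <= hi) for v in values]

def step_check(values, max_step):
    """Flag unrealistic jumps between consecutive values."""
    flags = [False]  # first value has no predecessor
    for prev, cur in zip(values, values[1:]):
        flags.append(abs(cur - prev) > max_step)
    return flags

def consistency_check(global_rad, diffuse_rad):
    """Diffuse radiation cannot exceed global radiation (one example
    of a relationship between two time series)."""
    return [d > g for g, d in zip(global_rad, diffuse_rad)]
```

In a real QC chain, a value flagged by any of the tests would be withheld, and only unflagged data released as plausible.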
40 CFR Appendix B to Part 75 - Quality Assurance and Quality Control Procedures
Code of Federal Regulations, 2012 CFR
2012-07-01
... Systems 1.2.1 Calibration Error Test and Linearity Check Procedures Keep a written record of the procedures used for daily calibration error tests and linearity checks (e.g., how gases are to be injected..., and when calibration adjustments should be made). Identify any calibration error test and linearity...
Quality Control of Meteorological Observations
NASA Technical Reports Server (NTRS)
Collins, William; Dee, Dick; Rukhovets, Leonid
1999-01-01
The problem of meteorological observation quality control (QC) was first formulated by L.S. Gandin at the Main Geophysical Observatory in the 1970s. Later, in 1988, L.S. Gandin began adapting his ideas on complex quality control (CQC) to the operational environment at the National Centers for Environmental Prediction. The CQC was first applied by L.S. Gandin and his colleagues to the detection and correction of errors in rawinsonde heights and temperatures using a complex of hydrostatic residuals. Later, a full complex of residuals, vertical and horizontal optimal interpolations and baseline checks were added for the checking and correction of a wide range of meteorological variables. Some of Gandin's other ideas were applied and substantially developed at other meteorological centers. A new statistical QC was recently implemented in the Goddard Data Assimilation System. The central component of any quality control is a buddy check, which is a test of individual suspect observations against available nearby non-suspect observations. A novel feature of this test is that the error variances used for QC decisions are re-estimated on-line. As a result, the allowed tolerances for suspect observations can depend on local atmospheric conditions. The system is then better able to accept extreme values observed in deep cyclones, jet streams and so on. The basic statements of this adaptive buddy check are described. Some results of the on-line QC, including moisture QC, are presented.
User's manual for computer program BASEPLOT
Sanders, Curtis L.
2002-01-01
The checking and reviewing of daily records of streamflow within the U.S. Geological Survey is traditionally accomplished by hand-plotting and mentally collating tables of data. The process is time consuming, difficult to standardize, and subject to errors in computation, data entry, and logic. In addition, the presentation of flow data on the internet requires more timely and accurate computation of daily flow records. BASEPLOT was developed for checking and review of primary streamflow records within the U.S. Geological Survey. Use of BASEPLOT enables users to (1) provide efficiencies during the record checking and review process, (2) improve quality control, (3) achieve uniformity of checking and review techniques of simple stage-discharge relations, and (4) provide a tool for teaching streamflow computation techniques. The BASEPLOT program produces tables of quality control checks and produces plots of rating curves and discharge measurements; variable shift (V-shift) diagrams; and V-shifts converted to stage-discharge plots, using data stored in the U.S. Geological Survey Automatic Data Processing System database. In addition, the program plots unit-value hydrographs that show unit-value stages, shifts, and datum corrections; input shifts, datum corrections, and effective dates; discharge measurements; effective dates for rating tables; and numeric quality control checks. Checklist/tutorial forms are provided for reviewers to ensure completeness of review and standardize the review process. The program was written for the U.S. Geological Survey SUN computer using the Statistical Analysis System (SAS) software produced by SAS Institute, Incorporated.
40 CFR 60.2735 - Is there a minimum amount of monitoring data I must obtain?
Code of Federal Regulations, 2014 CFR
2014-07-01
... activities including, as applicable, calibration checks and required zero and span adjustments. A monitoring... monitoring system quality assurance or control activities in calculations used to report emissions or...-control periods, and required monitoring system quality assurance or quality control activities including...
Standard Reference Specimens in Quality Control of Engineering Surfaces
Song, J. F.; Vorburger, T. V.
1991-01-01
In the quality control of engineering surfaces, we aim to understand and maintain a good relationship between the manufacturing process and surface function. This is achieved by controlling the surface texture. The control process involves: 1) learning the functional parameters and their control values through controlled experiments or through a long history of production and use; 2) maintaining high accuracy and reproducibility with measurements not only of roughness calibration specimens but also of real engineering parts. In this paper, the characteristics, utilizations, and limitations of different classes of precision roughness calibration specimens are described. A measuring procedure of engineering surfaces, based on the calibration procedure of roughness specimens at NIST, is proposed. This procedure involves utilization of check specimens with waveform, wavelength, and other roughness parameters similar to functioning engineering surfaces. These check specimens would be certified under standardized reference measuring conditions, or by a reference instrument, and could be used for overall checking of the measuring procedure and for maintaining accuracy and agreement in engineering surface measurement. The concept of “surface texture design” is also suggested, which involves designing the engineering surface texture, the manufacturing process, and the quality control procedure to meet the optimal functional needs. PMID:28184115
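As an illustration of the kind of parameter a check specimen is certified for, the arithmetic-mean roughness Ra and a simple tolerance check can be sketched as follows. The profile values and the tolerance are invented; this is not the NIST calibration procedure itself.

```python
# Illustrative computation of the arithmetic-mean roughness Ra from a
# sampled profile, and an overall check of a measuring procedure against
# a certified check specimen.

def ra(profile):
    """Ra: mean absolute deviation of the profile from its mean line."""
    mean_line = sum(profile) / len(profile)
    return sum(abs(z - mean_line) for z in profile) / len(profile)

def within_tolerance(measured_ra, certified_ra, tol_percent):
    """Accept the measurement if it agrees with the certified value
    to within the stated percentage tolerance."""
    return abs(measured_ra - certified_ra) <= certified_ra * tol_percent / 100
```

Measuring a certified check specimen and applying such a tolerance test gives an overall check of the whole measuring procedure, as proposed above.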
Heudorf, U; Gasteyer, S; Samoiski, Y; Voigt, K
2012-08-01
Due to the Infectious Disease Prevention Act, public health services in Germany are obliged to check infection prevention in hospitals and other medical facilities as well as in nursing homes. In Frankfurt/Main, Germany, standardized control visits have been performed for many years. In 2011 the focus was laid on cleaning and disinfection of surfaces. All 41 nursing homes were checked according to a standardized checklist covering quality of structure (i.e. staffing, hygiene concept), quality of process (observation of the cleaning processes in the homes) and quality of output, which was monitored by checking the cleaning of fluorescent marks which had been applied some days before and should have been removed via cleaning in the days before the final check. In more than two thirds of the homes, cleaning personnel were salaried; in one third, external personnel were hired. 85% of the homes provided service clothing and all of them offered protective clothing. All homes had established hygiene and cleaning concepts; however, in 15% of the homes concepts for the handling of norovirus and in 30% concepts for the handling of Clostridium difficile were missing. Regarding process quality, only half of the processes observed, i.e. cleaning of hand-contact surfaces such as handrails, washing areas and bins, were correct. Only 44% of the cleaning controls were correct, with enormous differences between the homes (0-100%). The correlation between quality of process and quality of output was significant. There was good quality of structure in the homes, but regarding quality of process and outcome there was great need for improvement. This was especially due to faults in communication and coordination between cleaning personnel and nursing personnel. Quality outcome was associated neither with the number of places for residents nor with staffing.
Thus, not only quality of structure but also quality of process and outcome should be checked by the public health services.
Austrian Daily Climate Data Rescue and Quality Control
NASA Astrophysics Data System (ADS)
Jurkovic, A.; Lipa, W.; Adler, S.; Albenberger, J.; Lechner, W.; Swietli, R.; Vossberg, I.; Zehetner, S.
2010-09-01
Checked climate datasets are a "conditio sine qua non" for all projects that are relevant for environment and climate. In the framework of climate change studies and analyses it is essential to work with quality-controlled and trustworthy data. Furthermore, these datasets are used as input for various simulation models. With regard to investigations of extreme events, like strong precipitation periods, drought periods and similar ones, we need climate data in high temporal resolution (at least daily resolution). Because of the historical background - during the Second World War the majority of our climate sheets were sent to Berlin, where the historical sheets were destroyed by a bomb attack and important information was lost - only a few climate sheets, mostly duplicates, from before 1939 are available and stored in our climate data archive. In 1970 the Central Institute for Meteorology and Geodynamics in Vienna started a first attempt to digitize climate data by means of punch cards. With the introduction of routine climate data quality control in 1984 we can speak of high-class-checked daily data (finally checked data, quality flag 6). Our group has been working for 18 years on the digitization and quality control of the historical data for the period 1872 to 1983. Since 2007 it has been possible to intensify this work in the framework of an internal project, namely Austrian Climate Data Rescue and Quality Control. The aim of this initiative was - and still is - to supply daily data in an outstandingly good and uniform quality. So this project is a kind of pre-project for all scientific projects which work with daily data. In addition to the routine quality checks (running since 1984) using the commercial Bull software, we are testing our data with additional open-source software, namely ProClim.db.
By the use of this spatial and statistical test procedure, the elements air temperature and precipitation - for several sites in Carinthia - could already be checked, flagged and corrected. Checking the output (the so-called error list) of ProClim is very time consuming and needs trained staff; in the last instance, however, it is necessary. Following the guideline "Your archive is your business card for quality", the sub-project NEW ARCHIVE was initialized and started at the end of 2009. Our paper archive contains historical, up to 150-year-old climate sheets that are valuable cultural assets. Unfortunately, the storage of these historical and current data treasures turned out to be more than suboptimal (insufficient protection against dust, dirt, humidity and incident light). Because of this, a concept for a new storage system and archive database was generated and has already been partly realized. In a nutshell, this presentation shows on the one hand the importance of recovering historical climate sheets for climate change research - even if it is exhausting and time consuming - and gives on the other hand a general overview of the quality control procedures used at our institute.
DOT National Transportation Integrated Search
2009-07-01
Current roadway quality control and quality acceptance (QC/QA) procedures for Louisiana include coring for thickness, density, and air void checks in hot mix asphalt (HMA) pavements and thickness and compressive strength for Portland cement con...
Root, Patsy; Hunt, Margo; Fjeld, Karla; Kundrat, Laurie
2014-01-01
Quality assurance (QA) and quality control (QC) data are required in order to have confidence in the results from analytical tests and the equipment used to produce those results. Some AOAC water methods include specific QA/QC procedures, frequencies, and acceptance criteria, but these are considered to be the minimum controls needed to perform a microbiological method successfully. Some regulatory programs, such as those at Code of Federal Regulations (CFR), Title 40, Part 136.7 for chemistry methods, require additional QA/QC measures beyond those listed in the method, which can also apply to microbiological methods. Essential QA/QC measures include sterility checks, reagent specificity and sensitivity checks, assessment of each analyst's capabilities, analysis of blind check samples, and evaluation of the presence of laboratory contamination and instrument calibration and checks. The details of these procedures, their performance frequency, and expected results are set out in this report as they apply to microbiological methods. The specific regulatory requirements of CFR Title 40 Part 136.7 for the Clean Water Act, the laboratory certification requirements of CFR Title 40 Part 141 for the Safe Drinking Water Act, and the International Organization for Standardization 17025 accreditation requirements under The NELAC Institute are also discussed.
An Adaptive Buddy Check for Observational Quality Control
NASA Technical Reports Server (NTRS)
Dee, Dick P.; Rukhovets, Leonid; Todling, Ricardo; DaSilva, Arlindo M.; Larson, Jay W.; Einaudi, Franco (Technical Monitor)
2000-01-01
An adaptive buddy check algorithm is presented that adjusts tolerances for outlier observations based on the variability of surrounding data. The algorithm derives from a statistical hypothesis test combined with maximum-likelihood covariance estimation. Its stability is shown to depend on the initial identification of outliers by a simple background check. The adaptive feature ensures that the final quality control decisions are not very sensitive to prescribed statistics of first-guess and observation errors, nor to other approximations introduced into the algorithm. The implementation of the algorithm in a global atmospheric data assimilation system is described. Its performance is contrasted with that of a non-adaptive buddy check for the surface analysis of an extreme storm that took place in Europe on 27 December 1999. The adaptive algorithm allowed the inclusion of many important observations that differed greatly from the first guess and that would have been excluded on the basis of prescribed statistics. The analysis of the storm development was much improved as a result of these additional observations.
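A drastically simplified sketch of the adaptive idea, with the tolerance widened where nearby data vary strongly, might look like the following. The constants and the use of a plain mean and standard deviation are illustrative simplifications of the maximum-likelihood covariance estimation actually described.

```python
# Simplified buddy check: a suspect observation is tested against the
# mean of nearby non-suspect "buddies", with the allowed departure
# scaled by the local variability of those buddies.

from statistics import mean, pstdev

def buddy_check(suspect, buddies, base_tol, k=3.0):
    """Return True if the suspect observation is accepted."""
    local = mean(buddies)
    # Adaptive feature: widen the tolerance where nearby data vary
    # strongly (e.g. deep cyclones), so genuine extremes are kept.
    tol = max(base_tol, k * pstdev(buddies))
    return abs(suspect - local) <= tol
```

With quiet surroundings an extreme value is rejected, while the same value embedded in highly variable surroundings is accepted, which is the behavior credited with improving the December 1999 storm analysis.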
Data Quality Control of the French Permanent Broadband Network in the RESIF Framework.
NASA Astrophysics Data System (ADS)
Grunberg, M.; Lambotte, S.; Engels, F.
2014-12-01
In the framework of the RESIF (Réseau Sismologique et géodésique Français) project, a new information system is being set up to improve the management and distribution of high-quality data from the different elements of RESIF. Within this information system, EOST (in Strasbourg) is in charge of collecting real-time permanent broadband seismic waveforms and performing quality control on these data. The real-time and validated data sets are pushed to the French National Distribution Center (Isterre/Grenoble) to make them publicly available. Furthermore, EOST hosts the BCSF-ReNaSS, in charge of the French metropolitan seismic bulletin, which allows it to benefit from high-end quality control based on the national and world-wide seismicity. Here we present the real-time seismic data flow from the stations of the French National Broadband Network to EOST, and then the data quality control procedures that were recently installed, including some new developments. The data quality control consists in applying a variety of processes to check the consistency of the whole system from the stations to the data center. This allows us to verify that instruments and data transmission are operating correctly. Moreover, time quality is critical for most scientific data applications. To face this challenge and to check the consistency of polarities and amplitudes, we deployed several high-end processes, including a noise correlation procedure to check for timing accuracy (instrumental time errors result in a time-shift of the whole cross-correlation, clearly distinct from shifts due to changes in medium physical properties), and a systematic comparison of synthetic and real data for teleseismic earthquakes of magnitude larger than 6.5 to detect timing errors as well as polarity and amplitude problems.
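The timing check exploits the fact that an instrumental clock error shifts the whole cross-correlation by a constant lag. A toy illustration of locating that lag (a pure-Python correlation on synthetic signals, not the RESIF production code) might look like this:

```python
# Find the lag at which the cross-correlation of two signals peaks,
# using the convention val(lag) = sum_i a[i] * b[i - lag]; a constant
# clock error between the two records shows up as a shifted peak.

def xcorr_peak_lag(a, b, max_lag):
    """Lag (in samples) at which the correlation of a and b peaks."""
    best_lag, best_val = 0, float("-inf")
    for lag in range(-max_lag, max_lag + 1):
        val = sum(a[i] * b[i - lag]
                  for i in range(len(a))
                  if 0 <= i - lag < len(b))
        if val > best_val:
            best_lag, best_val = lag, val
    return best_lag
```

Under this convention, a pulse arriving 3 samples later in the second record produces a peak at lag -3; a stable nonzero lag across many correlations would point to a station clock error rather than a change in the medium.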
The Modern Measurement Technology And Checking Of Shaft Parameters
NASA Astrophysics Data System (ADS)
Tichá, Šárka; Botek, Jan
2015-12-01
This paper is focused on the rationalization of checking shaft parameters in companies engaged in the production of components for electric motors, wind turbines and vacuum systems. Customers are constantly increasing their requirements to ensure the overall quality of the product, i.e. the quality of machining, dimensional and shape accuracy, and the overall cleanliness of the delivered products. The aim of this paper is to introduce the use of modern measurement technology in checking these components and to compare the results with the existing control methodology. The main objective of this rationalization is to eliminate the mistakes and shortcomings of current inspection methods.
Mezzenga, Emilio; D'Errico, Vincenzo; Sarnelli, Anna; Strigari, Lidia; Menghi, Enrico; Marcocci, Francesco; Bianchini, David; Benassi, Marcello
2016-01-01
The purpose of this study was to retrospectively evaluate the results from a Helical TomoTherapy Hi-Art treatment system relating to quality controls based on daily static and dynamic output checks using statistical process control methods. Individual value X-charts, exponentially weighted moving average charts, and process capability and acceptability indices were used to monitor the treatment system performance. Daily output values measured from January 2014 to January 2015 were considered. The results obtained showed that, although the process was in control, there was an out-of-control situation in the principal maintenance intervention for the treatment system. In particular, process capability indices showed a decreasing percentage of points in control which was, however, acceptable according to AAPM TG148 guidelines. Our findings underline the importance of restricting the acceptable range of daily output checks and suggest a future line of investigation for a detailed process control of daily output checks for the Helical TomoTherapy Hi-Art treatment system.
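The exponentially weighted moving average chart used above can be sketched in a few lines; lambda = 0.2 and the 3-sigma multiplier are common textbook defaults, not the clinic's actual settings, and using the sample mean as the target value is a simplification.

```python
# Sketch of an EWMA control chart for daily output checks.

def ewma_series(values, lam=0.2):
    """EWMA z_t = lam*x_t + (1-lam)*z_{t-1}, started at the overall mean."""
    z = sum(values) / len(values)   # simplification: sample mean as target
    out = []
    for x in values:
        z = lam * x + (1 - lam) * z
        out.append(z)
    return out

def ewma_out_of_control(values, sigma, lam=0.2, L=3.0):
    """Flag points where the EWMA drifts beyond its asymptotic L-sigma band."""
    target = sum(values) / len(values)
    half_width = L * sigma * (lam / (2 - lam)) ** 0.5
    return [abs(z - target) > half_width for z in ewma_series(values, lam)]
```

Because the EWMA accumulates small departures over time, it flags a sustained shift in daily output, such as one following a maintenance intervention, earlier than an individuals chart would.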
Quality Work, Quality Control in Technical Services.
ERIC Educational Resources Information Center
Horny, Karen L.
1985-01-01
Quality in library technical services is explored in light of changes produced by automation. Highlights include a definition of quality; new opportunities and shifting priorities; cataloging (fullness of records, heading consistency, accountability, local standards, automated checking); need for new skills (management, staff); and boons of…
Statistical Quality Control of Moisture Data in GEOS DAS
NASA Technical Reports Server (NTRS)
Dee, D. P.; Rukhovets, L.; Todling, R.
1999-01-01
A new statistical quality control algorithm was recently implemented in the Goddard Earth Observing System Data Assimilation System (GEOS DAS). The final step in the algorithm consists of an adaptive buddy check that either accepts or rejects outlier observations based on a local statistical analysis of nearby data. A basic assumption in any such test is that the observed field is spatially coherent, in the sense that nearby data can be expected to confirm each other. However, the buddy check resulted in excessive rejection of moisture data, especially during the Northern Hemisphere summer. The analysis moisture variable in GEOS DAS is water vapor mixing ratio. Observational evidence shows that the distribution of mixing ratio errors is far from normal. Furthermore, spatial correlations among mixing ratio errors are highly anisotropic and difficult to identify. Both factors contribute to the poor performance of the statistical quality control algorithm. To alleviate the problem, we applied the buddy check to relative humidity data instead. This variable explicitly depends on temperature and therefore exhibits a much greater spatial coherence. As a result, reject rates of moisture data are much more reasonable and homogeneous in time and space.
[Quality control in herbal supplements].
Oelker, Luisa
2005-01-01
Quality and safety of food and herbal supplements are the result of a combination of different elements, such as good manufacturing practice and process control. Process control must be active and able to identify and correct all possible hazards. The main and most widely used instrument is the hazard analysis and critical control point (HACCP) system, whose correct application can guarantee the safety of the product. Herbal supplements need, in addition to standard quality control, a set of checks to assure the harmlessness and safety of the plants used.
Selvakumar, N; Murthy, B N; Prabhakaran, E; Sivagamasundari, S; Vasanthan, Samuel; Perumal, M; Govindaraju, R; Chauhan, L S; Wares, Fraser; Santha, T; Narayanan, P R
2005-02-01
Assessment of 12 microscopy centers in a tuberculosis unit by blinded checking of eight sputum smears selected by using a lot quality assurance sampling (LQAS) method and by unblinded checking of all positive and five negative slides, among the slides examined in a month in a microscopy centre, revealed that the LQAS method can be implemented in the field to monitor the performance of acid-fast bacillus microscopy centers in national tuberculosis control programs. PMID:15695704
Check Calibration of the NASA Glenn 10- by 10-Foot Supersonic Wind Tunnel (2014 Test Entry)
NASA Technical Reports Server (NTRS)
Johnson, Aaron; Pastor-Barsi, Christine; Arrington, E. Allen
2016-01-01
A check calibration of the 10- by 10-Foot Supersonic Wind Tunnel (SWT) was conducted in May/June 2014 using an array of five supersonic wedge probes to verify the 1999 calibration. This check calibration was necessary following a control systems upgrade and an integrated systems test (IST), to verify that the tunnel flow quality was unchanged by the upgrade before the next test customer began their test entry. The previous check calibration of the tunnel occurred in 2007, prior to the Mars Science Laboratory test program. Secondary objectives of this test entry included validation of the new Cobra data acquisition system (DAS) against the current Escort DAS and the creation of statistical process control (SPC) charts through the collection of a series of repeated test points at certain predetermined tunnel parameters. The SPC chart secondary objective was not completed due to schedule constraints; it is hoped that this effort will be readdressed and completed in the near future.
NASA Technical Reports Server (NTRS)
Garcia-Gorriz, E.; Front, J.; Candela, J.
1997-01-01
A systematic data quality checking protocol for vessel-mounted Acoustic Doppler Current Profiler (ADCP) observations is proposed. Conditions prior to acquisition are considered along with conditions at acquisition time.
Revisiting the Procedures for the Vector Data Quality Assurance in Practice
NASA Astrophysics Data System (ADS)
Erdoğan, M.; Torun, A.; Boyacı, D.
2012-07-01
The intensive use of topographic data in spatial data visualization, business GIS (Geographic Information Systems) solutions and applications, and mobile and location-based services has forced topographic data providers to create standard, up-to-date and complete data sets within a sustainable framework. Data quality has been studied and researched for more than two decades; there are countless references on its semantics, its conceptual and logical representations, and many applications to spatial databases and GIS. However, a gap remains between research and practice in spatial data quality, which increases the cost and decreases the efficiency of data production. Spatial data quality is well known to both academia and industry, but usually in different contexts. Research on spatial data quality has identified several issues of practical use, such as descriptive information, metadata, fulfillment of spatial relationships among data, integrity measures, and geometric constraints. Industry and data producers realize them in three stages: pre-, co- and post-data capturing. The pre-data-capturing stage covers semantic modelling, data definition, cataloguing, modelling, and data dictionary and schema creation. The co-data-capturing stage covers general rules of spatial relationships; data- and model-specific rules such as topologic and model-building relationships; geometric thresholds; data extraction guidelines; and object-object, object-belonging-class, object-non-belonging-class and class-class relationships to be taken into account during data capturing. The post-data-capturing stage covers specified QC (quality check) benchmarks and checking compliance with general and specific rules. Vector data quality criteria differ between the views of producers and users, but these criteria are generally driven by the needs, expectations and feedback of the users. This paper presents a practical method that closes this gap between theory and practice.
Developing spatial data quality concepts into applications requires the conceptual, logical and, most importantly, physical existence of a data model, rules and knowledge of realization in the form of geo-spatial data. The applicable metrics and thresholds are determined on this concrete base. This study discusses the application of geo-spatial data quality issues and QA (quality assurance) and QC procedures in topographic data production. We first introduce the MGCP (Multinational Geospatial Co-production Program) data profile of the NATO (North Atlantic Treaty Organization) DFDD (DGIWG Feature Data Dictionary), the requirements of the data owner, and the view of data producers for both data capturing and QC and, finally, QA to fulfil user needs. Then our practical new approach, which divides quality into three phases, is introduced. Finally, the implementation of our approach to accomplish the metrics, measures and thresholds of the quality definitions is discussed. In this paper, geometry and semantic quality in particular, and the quality control procedures that can be performed by the producers, are discussed. Some applicable best practices that we experienced in quality control techniques, and regulations that define the objectives and data production procedures, are given in the final remarks. These quality control procedures should include visual checks of the source data, the captured vector data and printouts; automatic checks that can be performed by software; and semi-automatic checks involving interaction with quality control personnel. Finally, these quality control procedures should ensure the geometric, semantic, attribution and metadata quality of the vector data.
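As a toy illustration of the kind of post-capture geometric QC rules discussed above (a sketch under assumed thresholds, not the authors' production checks): polygon rings must be closed, and no segment may be shorter than a minimum length:

```python
# Toy vector-QC sketch: two automatic geometric checks under assumed
# thresholds. Real topographic QC uses many more rules (topology,
# attribution, metadata); this only illustrates the pattern.
def check_ring_closed(ring):
    """A polygon ring is valid only if first and last vertex coincide."""
    return ring[0] == ring[-1]

def check_min_segment(ring, min_len=0.5):
    """Flag sliver segments shorter than an assumed geometric threshold."""
    def dist(a, b):
        return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5
    return all(dist(a, b) >= min_len for a, b in zip(ring, ring[1:]))

ring = [(0, 0), (4, 0), (4, 3), (0, 3), (0, 0)]
ok = check_ring_closed(ring) and check_min_segment(ring)
```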
A special ionisation chamber for quality control of diagnostic and mammography X ray equipment.
Costa, A M; Caldas, L V E
2003-01-01
A quality control program for X ray equipment used for conventional radiography and mammography requires a constancy check of the beam qualities in terms of the half-value layers. In this work, a special double-faced parallel-plate ionisation chamber was developed with inner electrodes of different materials, forming a tandem system. It will be applied in quality control programs for diagnostic and mammography X ray equipment to confirm half-value layers previously determined by the conventional method. Moreover, the chamber may also be utilised for measurements of air kerma values (and air kerma rates) in the X radiation fields used for conventional radiography and mammography. The chamber was studied in relation to saturation characteristics, ion collection efficiency, polarity effects, leakage current, and short-term stability. The energy dependence of the response of each of the two faces of the chamber was determined over the conventional radiography and mammography X ray ranges (unattenuated beams). The different energy responses of the two faces of the chamber allowed the formation of a tandem system useful for the constancy check of beam qualities.
Applicability of refractometry for fast routine checking of hospital preparations.
Hendrickx, Stijn; Verón, Aurora Monteagudo; Van Schepdael, Ann; Adams, Erwin
2016-04-30
Quality control of hospital pharmacy formulations is of the utmost importance to ensure constant quality and to avoid potential mistakes before administration to the patient. In this study we investigated the applicability of refractometry as a fast, inexpensive and easy-to-use quality control measurement. Refractive indices (RI) of a multitude of different hospital formulations with varying concentrations of active compound were measured. The samples consisted of a number of binary aqueous solutions (one compound in water), complex aqueous solutions (multiple compounds in water or in a constant matrix), two suspensions and one emulsion. For all these formulations, linear regression analysis was performed, quality control limits determined and accuracy and repeatability were checked. Subsequently, actual hospital pharmacy samples were analyzed to check whether they were within the specified limits. For both binary and complex aqueous formulations, repeatability was good and a linear correlation for all samples could be observed on condition that the concentration of the active compound was sufficiently high. The refractometer was not sensitive enough for solutions of folic acid and levothyroxine, which had too low a concentration of active compound. Due to lack of homogeneity and light scattering, emulsions and suspensions do not seem suitable for quality control by refractometry. A mathematical equation was generated to predict the refractive index of an aqueous solution containing clonidine HCl as active compound. Values calculated from the equation were compared with measured values and deviations of all samples were found to be lower than 1.3%. In order to use refractometry in a hospital pharmacy for quality control of multicomponent samples, additional intermediate measurements would be required, to overcome the fact that refractometry is not compound specific. 
In conclusion, we found that refractometry could potentially be useful for daily, fast quality measurements of relatively concentrated binary and more complex aqueous solutions in the hospital pharmacy.
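The calibration-and-limits workflow described above can be sketched as follows; the concentrations, refractive indices and ±10% acceptance limit are hypothetical values, not data from the paper:

```python
# Sketch: calibrate refractive index (RI) vs. concentration by least
# squares, then accept a sample if its back-calculated concentration is
# within an assumed tolerance of nominal. All numbers are invented.
def fit_line(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

conc = [0.0, 2.0, 4.0, 6.0, 8.0]                 # g/100 mL (hypothetical)
ri = [1.3330, 1.3358, 1.3386, 1.3414, 1.3442]    # measured RI (hypothetical)
slope, intercept = fit_line(conc, ri)

def within_limits(measured_ri, nominal_conc, rel_tol=0.10):
    """Accept if back-calculated concentration is within ±10% of nominal."""
    est = (measured_ri - intercept) / slope
    return abs(est - nominal_conc) <= rel_tol * nominal_conc
```

Because refractometry is not compound-specific, such a limit check on a multicomponent sample only verifies total refractive contribution, which is why the paper calls for additional intermediate measurements.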
Enhancement of the Automated Quality Control Procedures for the International Soil Moisture Network
NASA Astrophysics Data System (ADS)
Heer, Elsa; Xaver, Angelika; Dorigo, Wouter; Messner, Romina
2017-04-01
In-situ soil moisture observations are still trusted to be the most reliable data to validate remotely sensed soil moisture products. Thus, the quality of in-situ soil moisture observations is of high importance. The International Soil Moisture Network (ISMN; http://ismn.geo.tuwien.ac.at/) provides in-situ soil moisture data from all around the world. The data is collected from individual networks and data providers, measured by different sensors in various depths. The data sets which are delivered in different units, time zones and data formats are then transformed into homogeneous data sets. An erroneous behavior of soil moisture data is very difficult to detect, due to annual and daily changes and most significantly the high influence of precipitation and snow melting processes. Only few of the network providers have a quality assessment for their data sets. Therefore, advanced quality control procedures have been developed for the ISMN (Dorigo et al. 2013). Three categories of quality checks were introduced: exceeding boundary values, geophysical consistency checks and a spectrum based approach. The spectrum based quality control algorithms aim to detect erroneous measurements which occur within plausible geophysical ranges, e.g. a sudden drop in soil moisture caused by a sensor malfunction. By defining several conditions which have to be met by the original soil moisture time series and their first and second derivative, such error types can be detected. Since the development of these sophisticated methods many more data providers shared their data with the ISMN and new types of erroneous measurements were identified. Thus, an enhancement of the automated quality control procedures became necessary. In the present work, we introduce enhancements of the existing quality control algorithms. Additionally, six completely new quality checks have been developed, e.g. 
detection of suspicious values before or after NaN values, of constant values, and of values lying in a range where a high majority of the values before and after are flagged, so that a sensor malfunction is almost certain. For the evaluation of the enhanced automated quality control system, many test data sets were chosen and manually validated, in order to compare the existing quality control procedures with the new algorithms. Improvements will be shown that ensure an appropriate assessment of the ISMN data sets, which are used for validation of soil moisture data retrieved from satellites and are the foundation of many other scientific publications.
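Two of the new flag types described above (values adjacent to missing data, and constant runs) can be sketched as follows; the run-length threshold is an assumption, not the actual ISMN setting:

```python
# Sketch of two ISMN-style flags: values neighboring missing data, and
# runs of constant readings (possible stuck sensor). Thresholds assumed.
NAN = float("nan")

def flag_nan_neighbors(series):
    isnan = [x != x for x in series]   # NaN is the only value != itself
    return [not isnan[i] and ((i > 0 and isnan[i - 1]) or
                              (i + 1 < len(series) and isnan[i + 1]))
            for i in range(len(series))]

def flag_constant_runs(series, min_run=3):
    flags = [False] * len(series)
    i = 0
    while i < len(series):
        j = i
        while j + 1 < len(series) and series[j + 1] == series[i]:
            j += 1
        if j - i + 1 >= min_run:       # run long enough to be suspicious
            for k in range(i, j + 1):
                flags[k] = True
        i = j + 1
    return flags

# soil-moisture-like series with a gap and a constant run
sm = [0.21, 0.22, NAN, 0.20, 0.30, 0.30, 0.30, 0.19]
```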
46 CFR 160.132-9 - Preapproval review.
Code of Federal Regulations, 2014 CFR
2014-10-01
... inventory of materials; (iii) The method for checking quality of fabrication and joints, including welding... §§ 160.132-19 and 160.132-21 of this subpart; (5) A description of the quality control procedures and... between the independent laboratory and Commandant under 46 CFR subpart 159.010. (d) Plan quality. All...
46 CFR 160.132-9 - Preapproval review.
Code of Federal Regulations, 2012 CFR
2012-10-01
... inventory of materials; (iii) The method for checking quality of fabrication and joints, including welding... §§ 160.132-19 and 160.132-21 of this subpart; (5) A description of the quality control procedures and... between the independent laboratory and Commandant under 46 CFR subpart 159.010. (d) Plan quality. All...
46 CFR 160.132-9 - Preapproval review.
Code of Federal Regulations, 2013 CFR
2013-10-01
... inventory of materials; (iii) The method for checking quality of fabrication and joints, including welding... §§ 160.132-19 and 160.132-21 of this subpart; (5) A description of the quality control procedures and... between the independent laboratory and Commandant under 46 CFR subpart 159.010. (d) Plan quality. All...
Assessing Educational Processes Using Total-Quality-Management Measurement Tools.
ERIC Educational Resources Information Center
Macchia, Peter, Jr.
1993-01-01
Discussion of the use of Total Quality Management (TQM) assessment tools in educational settings highlights and gives examples of fishbone diagrams, or cause and effect charts; Pareto diagrams; control charts; histograms and check sheets; scatter diagrams; and flowcharts. Variation and quality are discussed in terms of continuous process…
NASA Astrophysics Data System (ADS)
Servilla, M. S.; O'Brien, M.; Costa, D.
2013-12-01
Considerable ecological research performed today occurs through the analysis of data downloaded from various repositories and archives, often resulting in derived or synthetic products generated by automated workflows. These data are only meaningful for research if they are well documented by metadata; otherwise, semantic or data-type errors may occur in interpretation or processing. The Long Term Ecological Research (LTER) Network now screens all data packages entering its long-term archive to ensure that each package contains metadata that is complete, of high quality, and accurately describes the structure of its associated data entity, and that the data are structurally congruent with the metadata. Screening occurs prior to the upload of a data package into the Provenance Aware Synthesis Tracking Architecture (PASTA) data management system through a series of quality checks, thus preventing ambiguously or incorrectly documented data packages from entering the system. The quality checks within PASTA are designed to work specifically with the Ecological Metadata Language (EML), the metadata standard adopted by the LTER Network to describe data generated by its 26 research sites. Each quality check is codified in Java as part of the ecological-community-supported Data Manager Library, which is a resource of the EML specification and is used as a component of the PASTA software stack. Quality checks test for metadata quality, data integrity, or metadata-data congruence, and are further classified as either conditional or informational. Conditional checks issue a 'valid', 'warning' or 'error' response; only an 'error' response blocks the data package from upload into PASTA. Informational checks only provide descriptive content pertaining to a particular facet of the data package. Quality checks are designed by a group of LTER information managers and reviewed by the LTER community before deployment into PASTA. A total of 32 quality checks have been deployed to date.
Quality checks can be customized through a configurable template, which includes turning checks 'on' or 'off' and setting the severity of conditional checks. This feature is important to other potential users of the Data Manager Library who wish to configure its quality checks in accordance with the standards of their community. Executing the complete set of quality checks produces a report that describes the result of each check. The report is an XML document that is stored by PASTA for future reference.
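The conditional-check behaviour described above ('valid', 'warning' or 'error', with template-style on/off configuration and only 'error' blocking upload) can be sketched as follows; the check names and package fields are hypothetical, not the PASTA implementation:

```python
# Sketch of a conditional quality-check runner. Checks return 'valid',
# 'warning' or 'error'; a config template can switch checks off; only an
# 'error' blocks the upload. Check names and fields are hypothetical.
def check_has_title(pkg):
    return "valid" if pkg.get("title") else "error"

def check_units_documented(pkg):
    return "valid" if pkg.get("units") else "warning"

CHECKS = {"has_title": check_has_title,
          "units_documented": check_units_documented}

def run_checks(pkg, config):
    """Run enabled checks and report whether the package is blocked."""
    report = {name: fn(pkg) for name, fn in CHECKS.items()
              if config.get(name, "on") == "on"}
    blocked = any(status == "error" for status in report.values())
    return report, blocked

report, blocked = run_checks({"title": "Soil data"}, {"units_documented": "on"})
```

The report dictionary plays the role of the XML quality report PASTA stores alongside each package.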
40 CFR 63.10010 - What are my monitoring, installation, operation, and maintenance requirements?
Code of Federal Regulations, 2013 CFR
2013-07-01
... that emissions are controlled with a common control device or series of control devices, are discharged... parallel control devices or multiple series of control devices are discharged to the atmosphere through... quality control activities (including, as applicable, calibration checks and required zero and span...
40 CFR 63.10010 - What are my monitoring, installation, operation, and maintenance requirements?
Code of Federal Regulations, 2014 CFR
2014-07-01
... that emissions are controlled with a common control device or series of control devices, are discharged... parallel control devices or multiple series of control devices are discharged to the atmosphere through... quality control activities (including, as applicable, calibration checks and required zero and span...
NASA Astrophysics Data System (ADS)
Klaessens, John H.; van der Veen, Albert; Verdaasdonk, Rudolf M.
2017-03-01
Recently, low-cost smartphone-based thermal cameras are being considered for use in a clinical setting for monitoring physiological temperature responses such as body temperature change, local inflammation, perfusion changes or (burn) wound healing. These thermal cameras contain uncooled micro-bolometers with an internal calibration check and have a temperature resolution of 0.1 degree. For clinical applications, a fast quality measurement before use is required (absolute temperature check), and quality control (stability, repeatability, absolute temperature, absolute temperature differences) should be performed regularly. Therefore, a calibrated temperature phantom has been developed based on thermistor heating at both ends of a black-coated metal strip to create a controllable temperature gradient from room temperature (26 °C) up to 100 °C. The absolute temperatures on the strip are determined with five software-controlled PT-1000 sensors using lookup tables. In this study, three FLIR ONE cameras and one high-end camera were checked with this temperature phantom. The results show relatively good agreement between both the low-cost and high-end cameras and the phantom temperature gradient, with temperature differences of 1 degree up to 6 degrees between the cameras and the phantom. The measurements were repeated for absolute temperature and for temperature stability over the sensor area. Both low-cost and high-end thermal cameras measured relative temperature changes with high accuracy and absolute temperatures with constant deviations. Low-cost smartphone-based thermal cameras can be a good alternative to high-end thermal cameras for routine clinical measurements, appropriate to the research question, provided regular calibration checks for quality control are performed.
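The PT-1000 readout behind the phantom's reference temperatures can be sketched with the standard IEC 60751 Callendar-Van Dusen relation for T ≥ 0 °C; the paper's actual lookup tables are not shown, so this closed-form conversion is an assumed stand-in:

```python
# PT-1000 resistance <-> temperature via the Callendar-Van Dusen relation
# for T >= 0 degC (standard IEC 60751 coefficients). A stand-in sketch for
# the lookup tables used by the phantom's control software.
A, B, R0 = 3.9083e-3, -5.775e-7, 1000.0

def pt1000_resistance(t_celsius):
    """R(T) = R0 * (1 + A*T + B*T^2) for T >= 0 degC."""
    return R0 * (1 + A * t_celsius + B * t_celsius ** 2)

def pt1000_temperature(r_ohms):
    """Invert the quadratic R(T) for T >= 0 degC."""
    disc = (A ** 2 - 4 * B * (1 - r_ohms / R0)) ** 0.5
    return (-A + disc) / (2 * B)

r = pt1000_resistance(50.0)   # resistance at 50 degC
t = pt1000_temperature(r)     # should recover 50 degC
```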
Data Quality Control of the French Permanent Broadband Network in the RESIF Framework
NASA Astrophysics Data System (ADS)
Grunberg, Marc; Lambotte, Sophie; Engels, Fabien; Dretzen, Remi; Hernandez, Alain
2014-05-01
In the framework of the RESIF (Réseau Sismologique et géodésique Français) project, a new information system is being set up to improve the management and distribution of high-quality data from the different elements of RESIF and the associated networks. Within this information system, EOST (in Strasbourg) is in charge of collecting the real-time permanent broadband seismic waveforms and performing quality control on these data. The real-time and validated data sets are pushed to the French National Distribution Center (ISTerre/Grenoble) to make them publicly available. Furthermore, EOST hosts the BCSF-ReNaSS, in charge of the French metropolitan seismic bulletin, which allows it to benefit from high-end quality control based on the national and world-wide seismicity. Here we present first the real-time seismic data flow from the stations of the French National Broadband Network to EOST, and then the data quality control procedures that were recently installed, including some new developments. The data quality control consists of applying a variety of subprocesses to check the consistency of the whole system and processing chain from the stations to the data center. This allows us to verify that instruments and data transmission are operating correctly. Moreover, analysis of the ambient noise helps to characterize the intrinsic seismic quality of the stations and to identify other kinds of disturbances. The deployed quality control consists of a pipeline that starts with low-level procedures: checking the real-time miniSEED data files (file naming convention, data integrity), checking for inconsistencies between waveforms and metadata (channel name, sample rate, etc.), and computing waveform statistics (data availability, gaps/overlaps, mean, rms, time quality, spikes).
It is followed by high-level procedures such as power spectral density (PSD) computation, STA/LTA computation to be correlated with the seismicity, phase picking, and station magnitude discrepancies. The results of quality control are visualized through a web interface, which gathers data from different information systems to provide a global view of recent events that could impact the data (such as interventions on site or seismic events). This work is still an ongoing project. We intend to add more sophisticated procedures to enhance our data quality control. Among them, we will deploy a seismic moment tensor inversion tool for amplitude, time and polarity control, and a noise correlation procedure for time drift detection.
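The STA/LTA computation mentioned among the high-level procedures can be sketched as follows; the window lengths and the test signal are illustrative assumptions, not the RESIF settings:

```python
# Minimal classic STA/LTA sketch: ratio of a short-term average to a
# long-term average of absolute amplitude; a high ratio indicates an
# emergent signal. Window lengths and data are illustrative only.
def sta_lta(signal, nsta, nlta):
    ratios = []
    for i in range(nlta, len(signal) + 1):
        sta = sum(abs(x) for x in signal[i - nsta:i]) / nsta
        lta = sum(abs(x) for x in signal[i - nlta:i]) / nlta
        ratios.append(sta / lta if lta > 0 else 0.0)
    return ratios

quiet = [0.1] * 8                       # background noise only
event = [0.1] * 6 + [1.0, 1.0]          # amplitude jump at the end
r_quiet = sta_lta(quiet, nsta=2, nlta=8)
r_event = sta_lta(event, nsta=2, nlta=8)
```

Correlating such trigger ratios with catalogued seismicity is one way to verify that a station actually records the events it should.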
NASA Astrophysics Data System (ADS)
Chen, Min; Zhang, Yu
2017-04-01
A wind profiler network with a total of 65 profiling radars was operated by the MOC/CMA in China until July 2015. In this study, a quality control procedure is constructed to incorporate the profiler data from the wind-profiling network into the local data assimilation and forecasting system (BJRUC). The procedure applies a blacklisting check that removes stations with gross errors and an outlier check that rejects data with large deviations from the background. Instead of the bi-weighting method, which has commonly been used for outlier elimination in one-dimensional scalar observations, an outlier elimination method is developed based on the iterated reweighted minimum covariance determinant (IRMCD) for multivariate observations such as wind profiler data. A quality control experiment is performed separately for subsets of profiler data tagged with and without rain flags at every 00 UTC/12 UTC from 20 June to 30 September 2015. The results show that, with the quality control, the frequency distributions of the differences between the observations and the model background become more Gaussian-like and meet the Gaussian-distribution requirements of data assimilation. Further assessment of each quality control step reveals that the stations rejected by blacklisting produce data of poor quality, and that the IRMCD rejects outliers in a robust and physically reasonable manner.
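A greatly simplified, univariate stand-in for the robust outlier elimination described above can be sketched as follows; the actual IRMCD operates on multivariate wind profiler data, and the departures and cutoff used here are invented:

```python
# Simplified robust outlier rejection: iteratively drop observation-minus-
# background departures beyond k robust standard deviations (median/MAD)
# of the retained set. A univariate stand-in for IRMCD, not the real method.
def robust_reject(departures, k=4.0, max_iter=10):
    kept = list(range(len(departures)))
    for _ in range(max_iter):
        vals = sorted(departures[i] for i in kept)
        med = vals[len(vals) // 2]
        mad = sorted(abs(departures[i] - med) for i in kept)[len(kept) // 2]
        scale = max(1.4826 * mad, 1e-9)   # MAD -> sigma for Gaussian data
        new = [i for i in kept if abs(departures[i] - med) <= k * scale]
        if new == kept:                   # converged: nothing more rejected
            break
        kept = new
    return kept

d = [0.2, -0.1, 0.0, 0.3, 8.0, -0.2, 0.1]   # m/s departures; one gross outlier
kept = robust_reject(d)
```

Because the location and scale are estimated robustly, the gross outlier cannot inflate the cutoff and mask itself, which is the weakness of plain mean/standard-deviation screening.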
10 CFR 63.142 - Quality assurance criteria.
Code of Federal Regulations, 2014 CFR
2014-01-01
...) Design control. (1) DOE shall establish measures to assure that applicable regulatory requirements and... control design interfaces and for coordination among participating design organizations. These measures... control measures must provide for verifying or checking the adequacy of design, such as by the performance...
10 CFR 63.142 - Quality assurance criteria.
Code of Federal Regulations, 2013 CFR
2013-01-01
...) Design control. (1) DOE shall establish measures to assure that applicable regulatory requirements and... control design interfaces and for coordination among participating design organizations. These measures... control measures must provide for verifying or checking the adequacy of design, such as by the performance...
10 CFR 63.142 - Quality assurance criteria.
Code of Federal Regulations, 2012 CFR
2012-01-01
...) Design control. (1) DOE shall establish measures to assure that applicable regulatory requirements and... control design interfaces and for coordination among participating design organizations. These measures... control measures must provide for verifying or checking the adequacy of design, such as by the performance...
10 CFR 63.142 - Quality assurance criteria.
Code of Federal Regulations, 2011 CFR
2011-01-01
...) Design control. (1) DOE shall establish measures to assure that applicable regulatory requirements and... control design interfaces and for coordination among participating design organizations. These measures... control measures must provide for verifying or checking the adequacy of design, such as by the performance...
46 CFR 160.115-9 - Preapproval review.
Code of Federal Regulations, 2012 CFR
2012-10-01
... inventory of materials; (iii) The method for checking quality of fabrication and joints, including welding... §§ 160.115-19 and 160.115-21 of this subpart; (5) A description of the quality control procedures and... between the independent laboratory and Commandant under 46 CFR part 159, subpart 159.010. (d) Plan quality...
46 CFR 160.115-9 - Preapproval review.
Code of Federal Regulations, 2014 CFR
2014-10-01
... inventory of materials; (iii) The method for checking quality of fabrication and joints, including welding... §§ 160.115-19 and 160.115-21 of this subpart; (5) A description of the quality control procedures and... between the independent laboratory and Commandant under 46 CFR part 159, subpart 159.010. (d) Plan quality...
46 CFR 160.115-9 - Preapproval review.
Code of Federal Regulations, 2013 CFR
2013-10-01
... inventory of materials; (iii) The method for checking quality of fabrication and joints, including welding... §§ 160.115-19 and 160.115-21 of this subpart; (5) A description of the quality control procedures and... between the independent laboratory and Commandant under 46 CFR part 159, subpart 159.010. (d) Plan quality...
Quality control of FWC during assembly and commissioning in SST-1 Tokamak
NASA Astrophysics Data System (ADS)
Patel, Hitesh; Santra, Prosenjit; Parekh, Tejas; Biswas, Prabal; Jayswal, Snehal; Chauhan, Pradeep; Paravastu, Yuvakiran; George, Siju; Semwal, Pratibha; Thankey, Prashant; Ramesh, Gattu; Prakash, Arun; Dhanani, Kalpesh; Raval, D. C.; Khan, Ziauddin; Pradhan, Subrata
2017-04-01
First Wall Components (FWC) of the SST-1 tokamak, which are in the immediate vicinity of the plasma, comprise limiters, divertors, baffles and passive stabilizers designed for long-duration (∼1000 s) discharges of elongated plasma. All FWC consist of copper alloy heat sink modules with SS cooling tubes brazed onto them and graphite tiles acting as armour material facing the plasma, and are mounted to the vacuum vessel with suitable Inconel support structures at interconnected ring and port locations. The FWC were recently assembled and commissioned successfully inside the vacuum vessel of SST-1, undergoing rigorous quality control and checks at every stage of the assembly process. This paper presents the quality control aspects and checks of the FWC from commencement of the assembly procedure, namely material test reports, leak testing of high-temperature-baked components, assembled dimensional tolerances, leak testing of all welded joints, graphite tile tightening torques, electrical continuity and electrical isolation of the passive stabilizers from the vacuum vessel, and baking and cooling hydraulic connections inside the vacuum vessel.
Basal Area Growth Estimators for Survivor Component: A Quality Control Application
Charles E. Thomas; Francis A. Roesch
1990-01-01
Several possible estimators are available for basal area growth of survivor trees, when horizontal prism (or point) plots (HPP) are remeasured. This study's comparison of three estimators not only provides a check for the estimate of basal area growth but suggests that they can provide a quality control indicator for yield procedures. An example is derived from...
40 CFR 85.2233 - Steady state test equipment calibrations, adjustments, and quality control-EPA 91.
Code of Federal Regulations, 2013 CFR
2013-07-01
... tolerance range. The pressure in the sample cell must be the same with the calibration gas flowing during... this chapter. The check is done at 30 mph (48 kph), and a power absorption load setting to generate a... in § 85.2225(c)(1) are not met. (2) Leak checks. Each time the sample line integrity is broken, a...
40 CFR 85.2233 - Steady state test equipment calibrations, adjustments, and quality control-EPA 91.
Code of Federal Regulations, 2011 CFR
2011-07-01
... tolerance range. The pressure in the sample cell must be the same with the calibration gas flowing during... this chapter. The check is done at 30 mph (48 kph), and a power absorption load setting to generate a... in § 85.2225(c)(1) are not met. (2) Leak checks. Each time the sample line integrity is broken, a...
40 CFR 85.2233 - Steady state test equipment calibrations, adjustments, and quality control-EPA 91.
Code of Federal Regulations, 2012 CFR
2012-07-01
... tolerance range. The pressure in the sample cell must be the same with the calibration gas flowing during... this chapter. The check is done at 30 mph (48 kph), and a power absorption load setting to generate a... in § 85.2225(c)(1) are not met. (2) Leak checks. Each time the sample line integrity is broken, a...
Design of experiments enhanced statistical process control for wind tunnel check standard testing
NASA Astrophysics Data System (ADS)
Phillips, Ben D.
The current wind tunnel check standard testing program at NASA Langley Research Center is focused on increasing data quality, quantifying uncertainty, and improving overall control of wind tunnel measurement processes. The statistical process control (SPC) methodology employed in the check standard testing program allows variations in measurements to be tracked over time and provides an overall assessment of facility health. While the SPC approach can and does provide researchers with valuable information, it has certain limitations in the areas of process improvement and uncertainty quantification. It is thought that by utilizing design-of-experiments methodology in conjunction with current SPC practices, one can efficiently and more robustly characterize uncertainties and develop enhanced process improvement procedures. In this research, methodologies were developed to generate regression models for wind tunnel calibration coefficients, balance force coefficients, and wind tunnel flow angularities. The coefficients of these regression models were then tracked in statistical process control charts, giving a higher level of understanding of the processes. The methodology outlined is sufficiently generic that this research is applicable to any wind tunnel check standard testing program.
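The combination described above (regression coefficients tracked on control charts) can be sketched as follows; the calibration data and the individuals-chart limits estimated from the moving range are illustrative assumptions, not the program's actual models:

```python
# Sketch: fit a calibration slope per test entry, then track the slopes on
# a Shewhart individuals chart with 3-sigma limits estimated from the
# average moving range. All data are invented for illustration.
def fit_slope(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

def individuals_limits(values):
    mean = sum(values) / len(values)
    moving_ranges = [abs(a - b) for a, b in zip(values[1:], values)]
    sigma = (sum(moving_ranges) / len(moving_ranges)) / 1.128  # d2 for n=2
    return mean - 3 * sigma, mean + 3 * sigma

# hypothetical calibration slope recomputed at four repeated test entries
slopes = [fit_slope([0, 1, 2], [0.0, c, 2 * c]) for c in (1.00, 1.02, 0.98, 1.01)]
lcl, ucl = individuals_limits(slopes)
in_control = all(lcl <= s <= ucl for s in slopes)
```

Charting model coefficients rather than raw measurements is what lets a drift in the facility's calibration show up as a single out-of-control signal.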
Safe and effective nursing shift handover with NURSEPASS: An interrupted time series.
Smeulers, Marian; Dolman, Christine D; Atema, Danielle; van Dieren, Susan; Maaskant, Jolanda M; Vermeulen, Hester
2016-11-01
Implementation of a locally developed evidence-based nursing shift handover blueprint with a bedside-safety-check to determine the effect size on quality of handover. A mixed methods design with: (1) an interrupted time series analysis to determine the effect on handover quality in six domains; (2) descriptive statistics to analyze the intercepted discrepancies by the bedside-safety-check; (3) evaluation sessions to gather experiences with the new handover process. We observed a continued trend of improvement in handover quality and a significant improvement in two domains of handover: organization/efficiency and contents. The bedside-safety-check successfully identified discrepancies in drains, intravenous medications, bandages, or general condition and was highly appreciated. Use of the nursing shift handover blueprint showed promising results on effectiveness as well as on feasibility and acceptability. However, to enable long-term measurement of effectiveness, evaluation with large-scale interrupted time series or statistical process control is needed. Copyright © 2016 Elsevier Inc. All rights reserved.
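A common way to quantify the effect measured by an interrupted time series, sketched here on synthetic data (not the study's), is segmented regression with level-change and slope-change terms:

```python
import numpy as np

def its_level_change(y, intervention_index):
    """Segmented regression for an interrupted time series.

    Model: y = b0 + b1*t + b2*post + b3*(t - t0)*post,
    where post = 1 from the intervention onward. Returns the estimated
    immediate level change (b2) and slope change (b3).
    """
    t = np.arange(len(y), dtype=float)
    post = (t >= intervention_index).astype(float)
    X = np.column_stack([np.ones_like(t), t, post,
                         (t - intervention_index) * post])
    beta, *_ = np.linalg.lstsq(X, np.asarray(y, float), rcond=None)
    return beta[2], beta[3]

# Synthetic handover-quality scores: flat at 60, then an immediate +10 jump
y = [60] * 6 + [70] * 6
level, slope = its_level_change(y, 6)
```

On this idealized series the fit recovers a level change of 10 and no slope change.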
7 CFR 58.243 - Checking quality.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 7 Agriculture 3 2010-01-01 2010-01-01 false Checking quality. 58.243 Section 58.243 Agriculture... Procedures § 58.243 Checking quality. All milk, milk products and dry milk products shall be subject to inspection and analysis by the dairy plant for quality and condition throughout each processing operation...
40 CFR 63.8465 - How do I monitor and collect data to demonstrate continuous compliance?
Code of Federal Regulations, 2010 CFR
2010-07-01
... use data recorded during monitoring malfunctions, associated repairs, out-of-control periods, or required quality assurance or control activities for purposes of calculating data averages. A monitoring... assurance or control activities (including, as applicable, calibration checks and required zero and span...
40 CFR 63.5355 - How do I monitor and collect data to demonstrate continuous compliance?
Code of Federal Regulations, 2010 CFR
2010-07-01
... periods in assessing the compliance ratio, and, if an emission control device is used, in assessing the...) For emission control devices, except for monitor malfunctions, associated repairs, and required quality assurance or control activities (including, as applicable, calibration checks and required zero...
46 CFR 160.133-9 - Preapproval review.
Code of Federal Regulations, 2012 CFR
2012-10-01
... manual as described in §§ 160.133-19 and 160.133-21 of this subpart; (7) A description of the quality... suppliers; (ii) The method for controlling the inventory of materials; (iii) The method for checking quality of fabrication and joints, including welding inspection procedures; and (iv) The inspection...
46 CFR 160.133-9 - Preapproval review.
Code of Federal Regulations, 2013 CFR
2013-10-01
... manual as described in §§ 160.133-19 and 160.133-21 of this subpart; (7) A description of the quality... suppliers; (ii) The method for controlling the inventory of materials; (iii) The method for checking quality of fabrication and joints, including welding inspection procedures; and (iv) The inspection...
46 CFR 160.170-9 - Preapproval review.
Code of Federal Regulations, 2014 CFR
2014-10-01
... manual as described in §§ 160.170-19 and 160.170-21 of this subpart; (7) A description of the quality... suppliers; (ii) The method for controlling the inventory of materials; (iii) The method for checking quality of fabrication and joints, including welding inspection procedures; and (iv) The inspection...
46 CFR 160.170-9 - Preapproval review.
Code of Federal Regulations, 2013 CFR
2013-10-01
... manual as described in §§ 160.170-19 and 160.170-21 of this subpart; (7) A description of the quality... suppliers; (ii) The method for controlling the inventory of materials; (iii) The method for checking quality of fabrication and joints, including welding inspection procedures; and (iv) The inspection...
46 CFR 160.133-9 - Preapproval review.
Code of Federal Regulations, 2014 CFR
2014-10-01
... manual as described in §§ 160.133-19 and 160.133-21 of this subpart; (7) A description of the quality... suppliers; (ii) The method for controlling the inventory of materials; (iii) The method for checking quality of fabrication and joints, including welding inspection procedures; and (iv) The inspection...
46 CFR 160.170-9 - Preapproval review.
Code of Federal Regulations, 2012 CFR
2012-10-01
... manual as described in §§ 160.170-19 and 160.170-21 of this subpart; (7) A description of the quality... suppliers; (ii) The method for controlling the inventory of materials; (iii) The method for checking quality of fabrication and joints, including welding inspection procedures; and (iv) The inspection...
40 CFR 63.6135 - How do I monitor and collect data to demonstrate continuous compliance?
Code of Federal Regulations, 2012 CFR
2012-07-01
... Combustion Turbines Continuous Compliance Requirements § 63.6135 How do I monitor and collect data to... quality assurance or quality control activities (including, as applicable, calibration checks and required... times the stationary combustion turbine is operating. (b) Do not use data recorded during monitor...
40 CFR 63.6135 - How do I monitor and collect data to demonstrate continuous compliance?
Code of Federal Regulations, 2013 CFR
2013-07-01
... Combustion Turbines Continuous Compliance Requirements § 63.6135 How do I monitor and collect data to... quality assurance or quality control activities (including, as applicable, calibration checks and required... times the stationary combustion turbine is operating. (b) Do not use data recorded during monitor...
40 CFR 63.6135 - How do I monitor and collect data to demonstrate continuous compliance?
Code of Federal Regulations, 2014 CFR
2014-07-01
... Combustion Turbines Continuous Compliance Requirements § 63.6135 How do I monitor and collect data to... quality assurance or quality control activities (including, as applicable, calibration checks and required... times the stationary combustion turbine is operating. (b) Do not use data recorded during monitor...
Data quality in a DRG-based information system.
Colin, C; Ecochard, R; Delahaye, F; Landrivon, G; Messy, P; Morgon, E; Matillon, Y
1994-09-01
The aim of this study initiated in May 1990 was to evaluate the quality of the medical data collected from the main hospital of the "Hospices Civils de Lyon", Edouard Herriot Hospital. We studied a random sample of 593 discharge abstracts from 12 wards of the hospital. Quality control was performed by checking multi-hospitalized patients' personal data, checking that each discharge abstract was exhaustive, examining the quality of abstracting, studying diagnoses and medical procedures coding, and checking data entry. Assessment of personal data showed a 4.4% error rate. It was mainly accounted for by spelling mistakes in surnames and first names, and mistakes in dates of birth. The quality of a discharge abstract was estimated according to the two purposes of the medical information system: description of hospital morbidity per patient and Diagnosis Related Group's case mix. Error rates in discharge abstracts were expressed in two ways: an overall rate for errors of concordance between Discharge Abstracts and Medical Records, and a specific rate for errors modifying classification in Diagnosis Related Groups (DRG). For abstracting medical information, these error rates were 11.5% (SE +/- 2.2) and 7.5% (SE +/- 1.9) respectively. For coding diagnoses and procedures, they were 11.4% (SE +/- 1.5) and 1.3% (SE +/- 0.5) respectively. For data entry on the computerized data base, the error rates were 2% (SE +/- 0.5) and 0.2% (SE +/- 0.05). Quality control must be performed regularly because it demonstrates the degree of participation from health care teams and the coherence of the database.(ABSTRACT TRUNCATED AT 250 WORDS)
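The quoted rates and standard errors are consistent with simple proportions and binomial standard errors; for illustration (the count of 26 errors is assumed, back-calculated from the quoted 4.4% of 593 abstracts, not taken from the paper):

```python
import math

def error_rate_with_se(errors, n):
    """Proportion of records in error and its binomial standard error,
    SE = sqrt(p * (1 - p) / n)."""
    p = errors / n
    se = math.sqrt(p * (1 - p) / n)
    return p, se

# Hypothetical: 26 personal-data errors in 593 abstracts is roughly 4.4%
p, se = error_rate_with_se(26, 593)
```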
Application of reiteration of Hankel singular value decomposition in quality control
NASA Astrophysics Data System (ADS)
Staniszewski, Michał; Skorupa, Agnieszka; Boguszewicz, Łukasz; Michalczuk, Agnieszka; Wereszczyński, Kamil; Wicher, Magdalena; Konopka, Marek; Sokół, Maria; Polański, Andrzej
2017-07-01
Medical centres are obliged to store past medical records, including the results of quality assurance (QA) tests of the medical equipment, which is especially useful in checking the reproducibility of medical devices and procedures. Analysis of multivariate time series is an important part of quality control of NMR data. In this work we propose an anomaly detection tool based on the Reiteration of Hankel Singular Value Decomposition method. The presented method was compared with external software and the authors obtained comparable results.
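The paper's reiteration variant is not reproduced here, but the core idea of Hankel-SVD anomaly detection can be sketched as a residual-energy score on the trajectory matrix (window and rank are assumptions):

```python
import numpy as np

def hankel(series, window):
    """Build a Hankel (trajectory) matrix from a 1-D series:
    each column is one sliding window."""
    x = np.asarray(series, float)
    n = len(x) - window + 1
    return np.stack([x[i:i + window] for i in range(n)], axis=1)

def hankel_svd_score(series, window=8, rank=2):
    """Anomaly score: fraction of energy outside the leading `rank`
    singular values. A stable QA series concentrates energy in a few
    components; a drift or glitch raises the residual fraction."""
    s = np.linalg.svd(hankel(series, window), compute_uv=False)
    return 1.0 - (s[:rank] ** 2).sum() / (s ** 2).sum()

series = np.sin(0.3 * np.arange(40))   # stable periodic QA signal
spiked = series.copy()
spiked[20] += 1.0                      # a glitch in one measurement
low, high = hankel_svd_score(series), hankel_svd_score(spiked)
```

A pure sinusoid has a rank-2 trajectory matrix, so its score is near zero, while the glitched series scores higher.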
Shikata, Satoru; Nakayama, Takeo; Yamagishi, Hisakazu
2008-01-01
In this study, we conducted a limited survey of reports of surgical randomized controlled trials, using the consolidated standards of reporting trials (CONSORT) statement and additional check items to clarify problems in the evaluation of surgical reports. A total of 13 randomized trials were selected from two recent review articles on biliary surgery. Each randomized trial was evaluated according to 28 quality measures that comprised items from the CONSORT statement plus additional items. Analysis focused on relationships between the quality of each study and the estimated effect gap (the "pooled estimate in meta-analysis" minus the "estimated effect of each study"). No definite relationships were found between individual study quality and the estimated effect gap. The following items could have been described but were not provided in almost all the surgical RCT reports: "clearly defined outcomes"; "details of randomization"; "participant flow charts"; "intention-to-treat analysis"; "ancillary analyses"; and "financial conflicts of interest". The item "participation of a trial methodologist in the study" was not found in any of the reports. Although the quality of reporting trials is not always related to a biased estimation of treatment effect, the items used for quality measures must be described to enable readers to evaluate the quality and applicability of the reporting. Further development of an assessment tool is needed for items specific to surgical randomized controlled trials.
Quality Control System using Simple Implementation of Seven Tools for Batik Textile Manufacturing
NASA Astrophysics Data System (ADS)
Ragil Suryoputro, Muhammad; Sugarindra, Muchamad; Erfaisalsyah, Hendy
2017-06-01
In order to produce better products and mitigate product defects, every company must implement a quality control system that is capable and reliable. One of the methods is a simple implementation of the seven tools of quality control. The case studied in this research was the defect level of xyz grey fabric on shuttle loom 2 at a Batik manufacturing company. The seven tools include: flowchart, check sheet, histogram, scatter diagram combined with control charts, Pareto diagram, and fishbone (cause-and-effect) diagram. The check sheet identified the defect types in the woven xyz grey fabric as warp, double warp, warp break, empty warp, loose warp, ugly edges, thick warp, and rust. Analysis of the control chart indicates that the process is out of control; this can be seen in the chart, where there are still many outlier data points. The scatter diagram shows a positive correlation between the defect percentage and the production volume. Based on the Pareto diagram, repair priority goes to the dominant defect type, warp (44%), and the histogram shows double warp is also the highest, with a value of 23,635.11 m. In addition, the fishbone diagram analysis attributes double warp and the other defect types to causes in materials, methods, machines, measurements, man, and environment. With these results the company can take preventive and corrective action to minimize defects and improve product quality.
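A Pareto analysis like the one described can be sketched in a few lines; the defect counts below are hypothetical, chosen only to echo the 44% dominant-defect figure:

```python
def pareto(defect_counts):
    """Sort defect categories by count (descending) and report the
    cumulative percentage share, as on a Pareto diagram."""
    total = sum(defect_counts.values())
    rows, cum = [], 0
    for name, count in sorted(defect_counts.items(), key=lambda kv: -kv[1]):
        cum += count
        rows.append((name, count, round(100 * cum / total, 1)))
    return rows

# Hypothetical defect counts per category
counts = {"warp": 44, "double warp": 23, "empty warp": 12,
          "ugly edges": 11, "rust": 10}
for name, count, cum_pct in pareto(counts):
    print(f"{name:12s} {count:3d} {cum_pct:5.1f}%")
```

The first rows of the output identify the "vital few" categories that repair effort should target first.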
Quality monitored distributed voting system
Skogmo, David
1997-01-01
A quality monitoring system can detect certain system faults and fraud attempts in a distributed voting system. The system uses decoy voters to cast predetermined check ballots. Absent check ballots can indicate system faults. Altered check ballots can indicate attempts at counterfeiting votes. The system can also cast check ballots at predetermined times to provide another check on the distributed voting system.
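The decoy-ballot audit described above amounts to comparing predetermined check ballots against what the tally recorded; a minimal sketch (ballot ids and votes are invented):

```python
def audit_check_ballots(expected, received):
    """Compare predetermined decoy ballots against what the tally saw.

    Missing check ballots suggest a system fault; altered check ballots
    suggest tampering. `expected` and `received` map ballot id -> vote.
    """
    missing = sorted(set(expected) - set(received))
    altered = sorted(bid for bid in expected
                     if bid in received and received[bid] != expected[bid])
    return missing, altered

# Hypothetical decoy ballots: d2 never arrived, d3 was changed in transit
expected = {"d1": "A", "d2": "B", "d3": "A"}
received = {"d1": "A", "d3": "C"}
missing, altered = audit_check_ballots(expected, received)
```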
[Quality control of laser imagers].
Winkelbauer, F; Ammann, M; Gerstner, N; Imhof, H
1992-11-01
Multiformat imagers based on laser systems are used for documentation in an increasing number of investigations. The specific problems of quality control are explained, and the constancy of film processing is investigated in imager systems of different configuration, with (Machine 1: 3M Laser Imager Plus M952 with connected 3M film processor, 3M IRB film, 3M-XPM X-ray chemical mixer, 3M developer and fixer) or without (Machine 2: 3M Laser Imager Plus M952 with separate DuPont Cronex film processor, Kodak IR film, Kodak automixer, Kodak developer and fixer) a connected film processing unit. In our checks based on DIN 6868 and ONORM S 5240, the equipment with the directly adapted film processing unit showed constancy of film processing in accordance with DIN and ONORM. The constancy checks of film processing demanded by DIN 6868 could therefore be performed at longer intervals for this equipment. Systems with conventional darkroom processing show clearly increased fluctuation by comparison, and hence the demanded daily control is essential to guarantee appropriate reaction and constant quality of documentation.
Quality monitored distributed voting system
Skogmo, D.
1997-03-18
A quality monitoring system can detect certain system faults and fraud attempts in a distributed voting system. The system uses decoy voters to cast predetermined check ballots. Absent check ballots can indicate system faults. Altered check ballots can indicate attempts at counterfeiting votes. The system can also cast check ballots at predetermined times to provide another check on the distributed voting system. 6 figs.
40 CFR 75.59 - Certification, quality assurance, and quality control record provisions.
Code of Federal Regulations, 2010 CFR
2010-07-01
... and the run average); (B) The raw data and results for all required pre-test, post-test, pre-run and...-day calibration error tests, all daily system integrity checks (Hg monitors, only), and all off-line calibration demonstrations, including any follow-up tests after corrective action: (i) Component-system...
Radiation Safety and Quality Assurance in North American Dental Schools.
ERIC Educational Resources Information Center
Farman, Allan G.; Hines, Vickie G.
1986-01-01
A survey of dental schools that revealed processing quality control and routine maintenance checks on x-ray generators are being carried out in a timely manner is discussed. However, methods for reducing patient exposure to radiation are not being fully implemented, and some dental students are being exposed to x-rays. (Author/MLW)
NASA Astrophysics Data System (ADS)
Manzella, G. M. R.; Scoccimarro, E.; Pinardi, N.; Tonani, M.
2003-01-01
A "ship of opportunity" program was launched as part of the Mediterranean Forecasting System Pilot Project. During the operational period (September 1999 to May 2000), six tracks covered the Mediterranean from the northern to southern boundaries approximately every 15 days, while a long east-west track from Haifa to Gibraltar was covered approximately every month. XBT data were collected, sub-sampled at 15 inflection points and transmitted through a satellite communication system to a regional data centre. It was found that this data transmission system has limitations in terms of quality of the temperature profiles and quantity of data successfully transmitted. At the end of the MFSPP operational period, a new strategy for data transmission and management was developed. First of all, VOS-XBT data are transmitted at full resolution. Secondly, a new data management system, called Near Real Time Quality Control for XBT (NRT.QC.XBT), was defined to produce a parallel stream of high-quality XBT data for further scientific analysis. The procedure includes: (1) Position control; (2) Elimination of spikes; (3) Re-sampling at a 1 metre vertical interval; (4) Filtering; (5) General malfunctioning check; (6) Comparison with climatology (and distance from this in terms of standard deviations); (7) Visual check; and (8) Data consistency check. The first six steps of the new procedure are completely automated; they are also performed using a new climatology developed as part of the project. The visual checks are finally done with a free-market software that allows NRT final data assessment.
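Two of the listed NRT QC steps, spike elimination (2) and the climatology comparison in standard deviations (6), can be sketched as follows; the thresholds are assumptions, not the project's values:

```python
import statistics

def despike(profile, threshold=2.0):
    """Spike elimination: flag points that differ from the median of
    their three-point neighbourhood by more than `threshold` degrees."""
    clean = list(profile)
    for i in range(1, len(profile) - 1):
        local = statistics.median([profile[i - 1], profile[i], profile[i + 1]])
        if abs(profile[i] - local) > threshold:
            clean[i] = None          # flagged for rejection
    return clean

def climatology_flag(value, clim_mean, clim_std, max_sigmas=3.0):
    """Climatology check: distance from climatology in standard
    deviations, and whether it exceeds the allowed limit."""
    z = abs(value - clim_mean) / clim_std
    return z, z > max_sigmas

# Hypothetical temperature profile with one obvious spike
profile = [18.2, 18.1, 25.0, 17.9, 17.8]
print(despike(profile))
```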
Agricultural Baseline (BL0) scenario
Davis, Maggie R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)] (ORCID:0000000181319328); Hellwinckel, Chad M [University of Tennessee] (ORCID:0000000173085058); Eaton, Laurence [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)] (ORCID:0000000312709626); Turhollow, Anthony [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)] (ORCID:0000000228159350); Brandt, Craig [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)] (ORCID:0000000214707379); Langholtz, Matthew H. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)] (ORCID:0000000281537154)
2016-07-13
Scientific reason for data generation: to serve as the reference case for the BT16 volume 1 agricultural scenarios. The agricultural baseline runs from 2015 through 2040; a starting year of 2014 is used. Date the data set was last modified: 02/12/2016 How each parameter was produced (methods), format, and relationship to other data in the data set: simulation was developed without offering a farmgate price to energy crops or residues (i.e., building on both the USDA 2015 baseline and the agricultural census data (USDA NASS 2014). Data generated are .txt output files by year, simulation identifier, county code (1-3109). Instruments used: POLYSYS (version POLYS2015_V10_alt_JAN22B) supplied by the University of Tennessee APAC The quality assurance and quality control that have been applied: • Check for negative planted area, harvested area, production, yield and cost values. • Check if harvested area exceeds planted area for annuals. • Check FIPS codes.
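The three listed QA/QC checks can be sketched directly; the record field names and FIPS values here are hypothetical, not the POLYSYS output schema:

```python
def qc_county_row(row, valid_fips):
    """Apply the three listed baseline QC checks to one county record:
    no negative values, harvested area not exceeding planted area for
    annuals, and a known FIPS code. Returns a list of failure messages."""
    problems = []
    for field in ("planted", "harvested", "production", "yield", "cost"):
        if row[field] < 0:
            problems.append(f"negative {field}")
    if row["harvested"] > row["planted"]:
        problems.append("harvested exceeds planted")
    if row["fips"] not in valid_fips:
        problems.append("unknown FIPS code")
    return problems

# Hypothetical county record with two deliberate errors
row = {"planted": 100.0, "harvested": 120.0, "production": 5000.0,
       "yield": -1.2, "cost": 300.0, "fips": "47001"}
print(qc_county_row(row, valid_fips={"47001", "47003"}))
```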
Software Quality Control at Belle II
NASA Astrophysics Data System (ADS)
Ritter, M.; Kuhr, T.; Hauth, T.; Gebard, T.; Kristof, M.; Pulvermacher, C.;
2017-10-01
Over the last seven years the software stack of the next-generation B factory experiment Belle II has grown to over one million lines of C++ and Python code, counting only the part included in offline software releases. There are several thousand commits to the central repository by about 100 individual developers per year. Keeping a coherent software stack of high quality, such that it can be sustained and used efficiently for data acquisition, simulation, reconstruction, and analysis over the lifetime of the Belle II experiment, is a challenge. A set of tools is employed to monitor the quality of the software and provide fast feedback to the developers. They are integrated in a machinery that is controlled by a buildbot master and automates the quality checks. The tools include different compilers, cppcheck, the clang static analyzer, valgrind memcheck, doxygen, a geometry overlap checker, a check for missing or extra library links, unit tests, steering file level tests, a sophisticated high-level validation suite, and an issue tracker. The technological development infrastructure is complemented by organizational means to coordinate the development.
PACS quality control and automatic problem notifier
NASA Astrophysics Data System (ADS)
Honeyman-Buck, Janice C.; Jones, Douglas; Frost, Meryll M.; Staab, Edward V.
1997-05-01
One side effect of installing a clinical PACS is that users become dependent upon the technology, and in some cases it can be very difficult to revert back to a film-based system if components fail. The nature of system failures ranges from slow deterioration of function, as seen in the loss of monitor luminance, to sudden catastrophic loss of the entire PACS network. This paper describes the quality control procedures in place at the University of Florida and the automatic notification system that alerts PACS personnel when a failure has happened or is anticipated. The goal is to recover from a failure with a minimum of downtime and no data loss. Routine quality control is practiced on all aspects of PACS, from acquisition, through network routing, through display, and including archiving. Whenever possible, the system components perform self and between-platform checks for active processes, file system status, errors in log files, and system uptime. When an error is detected or an exception occurs, an automatic page is sent to a pager with a diagnostic code. Documentation on each code, troubleshooting procedures, and repairs are kept on an intranet server accessible only to people involved in maintaining the PACS. In addition to the automatic paging system for error conditions, acquisition is assured by an automatic fax report sent on a daily basis to all technologists acquiring PACS images, to be used as a cross check that all studies are archived prior to being removed from the acquisition systems. Daily quality control is performed to assure that studies can be moved from each acquisition and contrast adjustment. The results of selected quality control reports will be presented. The intranet documentation server will be described with the automatic pager system. Monitor quality control reports will be described and the cost of quality control will be quantified.
As PACS is accepted as a clinical tool, the same standards of quality control must be established as are expected on other equipment used in the diagnostic process.
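The automatic paging behaviour described, running checks and paging on-call staff with a diagnostic code on failure, can be sketched as follows; the check names and codes are invented, not the University of Florida system's:

```python
import shutil

# Hypothetical diagnostic codes, in the spirit of the paged error codes
CODES = {"disk": "E01", "process": "E02"}

def check_free_space(path="/", min_fraction=0.10):
    """Example check: file system status (free space above a floor)."""
    usage = shutil.disk_usage(path)
    return usage.free / usage.total >= min_fraction

def run_checks(checks, page):
    """Run named checks; page the on-call person with a diagnostic
    code whenever a check fails or raises."""
    for name, check in checks.items():
        try:
            ok = check()
        except Exception:
            ok = False
        if not ok:
            page(CODES.get(name, "E99"), name)

# Demo with a stubbed pager and a deliberately failing check
sent = []
run_checks({"disk": lambda: False},
           page=lambda code, name: sent.append((code, name)))
```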
NASA Astrophysics Data System (ADS)
Rzonca, A.
2013-12-01
The paper presents the state of the art of quality control of photogrammetric and laser scanning data captured by airborne sensors. The described subject is very important for photogrammetric and LiDAR project execution, because the data quality decides a priori the quality of the final product. On the other hand, a precise and effective quality control process allows the missions to be executed without a wide margin of safety, especially in the case of mountain-area projects. As an introduction, the author presents the theoretical background of quality control, based on his own experience, instructions, and technical documentation. He describes several variants of organizational solutions. Basically, there are two main approaches: quality control of the captured data, and control of discrepancies between the flight plan and the results of its execution. Both can make use of automated tests and analysis of the data. A test is an automatic algorithm that checks the data and generates a control report. Analysis is a less complicated process, based on manual checks of documentation, data, and metadata. An example of a quality control system for a large-area project is presented. The project is realized periodically for the whole territory of Spain and is named the National Plan of Aerial Orthophotography (Plan Nacional de Ortofotografía Aérea, PNOA). The system of internal control guarantees results soon after the flight and informs the company's flight team. This allows all errors to be corrected shortly after the flight, and the transfer of the data to another team or company for further processing can be stopped. The described system of data quality control contains geometric and radiometric control of photogrammetric data and geometric control of LiDAR data. It checks all specified parameters and generates reports, which are very helpful in the case of errors or low-quality data.
The paper includes the author's experience in the field of data quality control, and presents conclusions and suggestions on the organizational and technical aspects, with a short definition of the necessary control software.
NASA Technical Reports Server (NTRS)
Orcutt, John M.; Brenton, James C.
2016-01-01
An accurate database of meteorological data is essential for designing any aerospace vehicle and for preparing launch commit criteria. Meteorological instrumentation was recently placed on the three Lightning Protection System (LPS) towers at Kennedy Space Center (KSC) launch complex 39B (LC-39B), providing a unique meteorological dataset at the launch complex over an extensive altitude range. Data records of temperature, dew point, relative humidity, wind speed, and wind direction are produced at 40, 78, 116, and 139 m at each tower. The Marshall Space Flight Center Natural Environments Branch (EV44) received an archive that consists of one-minute averaged measurements for the period of record of January 2011 - April 2015. However, before the received database could be used, EV44 needed to remove any erroneous data through a comprehensive quality control (QC) process. The QC process applied to the LPS towers' meteorological data is similar to other QC processes developed by EV44, which were used in the creation of meteorological databases for other towers at KSC, but has been modified specifically for use with the LPS tower database. The QC process first includes a check of each individual sensor, removing any unrealistic data and checking the temporal consistency of each variable. Next, data from all three sensors at each height are checked against each other, checked against climatology, and checked for sensors that erroneously report a constant value. Then, a vertical consistency check of each variable at each tower is completed. Last, the upwind sensor at each level is selected to minimize the influence of the towers and other structures at LC-39B on the measurements. The selection process for the upwind sensor implemented a study of tower-induced turbulence.
This paper describes in detail the QC process, QC results, and the attributes of the LPS towers meteorological database.
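Two of the single-sensor checks described, removing unrealistic data and detecting sensors that erroneously report a constant value, can be sketched as follows (the bounds and window length are assumptions, not EV44's values):

```python
def range_check(values, lo, hi):
    """Flag physically unrealistic readings (e.g. temperature bounds)."""
    return [not (lo <= v <= hi) for v in values]

def stuck_sensor(values, window=5):
    """Flag runs where a sensor reports exactly the same value for
    `window` consecutive samples, a sign of an erroneously constant
    (stuck) sensor."""
    flags = [False] * len(values)
    for i in range(len(values) - window + 1):
        if len(set(values[i:i + window])) == 1:
            for j in range(i, i + window):
                flags[j] = True
    return flags

# Hypothetical one-minute temperatures: a stuck run, then a wild value
temps = [21.3, 21.4, 21.4, 21.4, 21.4, 21.4, 95.0]
print(range_check(temps, -20.0, 50.0))
print(stuck_sensor(temps))
```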
Valid internal standard technique for arson detection based on gas chromatography-mass spectrometry.
Salgueiro, Pedro A S; Borges, Carlos M F; Bettencourt da Silva, Ricardo J N
2012-09-28
The most popular procedures for the detection of residues of accelerants in fire debris are the ones published by the American Society for Testing and Materials (ASTM E1412-07 and E1618-10). The most critical stages of these tests are the conservation of fire debris from sampling to the laboratory, the extraction of accelerant residues from the debris to the activated charcoal strips (ACS) and from those to the final solvent, the analysis of the sample extract by gas chromatography-mass spectrometry (GC-MS), and the interpretation of the instrumental signal. This work proposes a strategy for checking the quality of sample conservation, accelerant residue transfer to the final solvent, and GC-MS analysis, using internal standard additions. Internal standards are used ranging from a highly volatile compound, for checking debris conservation, to a low-volatility compound, for checking GC-MS repeatability. The developed quality control (QC) parameters are not affected by GC-MS sensitivity variation and, specifically, the GC-MS performance control is not affected by ACS adsorption saturation that may mask test performance deviations. The proposed QC procedure proved to be adequate to check GC-MS repeatability, ACS extraction, and sample conservation since: (1) standard additions are affected by negligible uncertainty and (2) the observed dispersion of QC parameters is fit for its intended use. Copyright © 2012 Elsevier B.V. All rights reserved.
DOT National Transportation Integrated Search
2010-06-01
This manual provides information and recommended procedures to be utilized by an agency's Weigh-in-Motion (WIM) Office Data Analyst to perform validation and quality control (QC) checks of WIM traffic data. This manual focuses on data generated by ...
Quality Assurance and Quality Control, Part 2.
Akers, Michael J
2015-01-01
The tragedy surrounding the New England Compounding Center and contaminated steroid syringe preparations clearly points out what can happen if quality-assurance and quality-control procedures are not strictly practiced in the compounding of sterile preparations. This article is part 2 of a two-part article on requirements to comply with United States Pharmacopeia general chapters <797> and <1163> with respect to quality assurance of compounded sterile preparations. Part 1 covered documentation requirements, inspection procedures, compounding accuracy checks, and part of a discussion on bacterial endotoxin testing. Part 2 covers sterility testing, the completion of the part 1 discussion on bacterial endotoxin testing, a brief discussion of United States Pharmacopeia <1163>, and advances in pharmaceutical quality systems.
Flow Control and Design Assessment for Drainage System at McMurdo Station, Antarctica
2014-11-24
Council BMP Best Management Practice CASQUA California Storm Water Quality Task Force CRREL Cold Regions Research and Engineering Laboratory DS...ponds The California Storm Water Quality Task Force (CASQUA 1993) defines a sediment basin as “a pond created by excavation or constructing an em...British Standards Institution. California Storm Water Quality Task Force (CASQUA). 1993. ESC41: Check Dams. In Stormwater Best Management Practices
Implementing Model-Check for Employee and Management Satisfaction
NASA Technical Reports Server (NTRS)
Jones, Corey; LaPha, Steven
2013-01-01
This presentation will discuss methods by which ModelCheck can be implemented to not only improve model quality but also satisfy both employees and management through different sets of quality checks. This approach allows a standard set of modeling practices to be upheld throughout a company, with minimal interaction required by the end user. The presenter will demonstrate how to create multiple ModelCheck standards, preventing users from evading the system, and how it can improve the quality of drawings and models.
The Effect of the MassHealth Hospital Pay-for-Performance Program on Quality
Ryan, Andrew M; Blustein, Jan
2011-01-01
Objective: To test the effect of Massachusetts Medicaid's (MassHealth) hospital-based pay-for-performance (P4P) program, implemented in 2008, on quality of care for pneumonia and surgical infection prevention (SIP). Data: Hospital Compare process of care quality data from 2004 to 2009 for acute care hospitals in Massachusetts (N = 62) and other states (N = 3,676) and American Hospital Association data on hospital characteristics from 2005. Study Design: Panel data models with hospital fixed effects and hospital-specific trends are estimated to test the effect of P4P on composite quality for pneumonia and SIP. This base model is extended to control for the completeness of measure reporting. Further sensitivity checks include estimation with propensity-score matched control hospitals, excluding hospitals in other P4P programs, varying the time period during which the program was assumed to have an effect, and testing the program effect across hospital characteristics. Principal Findings: Estimates from our preferred specification, including hospital fixed effects, trends, and the control for measure completeness, indicate small and nonsignificant program effects for pneumonia (−0.67 percentage points, p>.10) and SIP (−0.12 percentage points, p>.10). Sensitivity checks indicate a similar pattern of findings across specifications. Conclusions: Despite offering substantial financial incentives, the MassHealth P4P program did not improve quality in the first years of implementation. PMID:21210796
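The hospital fixed-effects estimation described can be illustrated with a minimal within (demeaning) estimator; the data are synthetic, and this sketch omits the trends, measure-completeness control, and other covariates used in the paper:

```python
import numpy as np

def within_estimator(y, treat, unit):
    """Fixed-effects (within) estimate of a treatment effect.

    Demeans the outcome and the treatment indicator within each unit
    (absorbing unit fixed effects), then regresses the demeaned outcome
    on the demeaned treatment.
    """
    y = np.asarray(y, float)
    treat = np.asarray(treat, float)
    unit = np.asarray(unit)
    yd, td = y.copy(), treat.copy()
    for u in np.unique(unit):
        m = unit == u
        yd[m] -= y[m].mean()
        td[m] -= treat[m].mean()
    return float((td @ yd) / (td @ td))

# Synthetic two-period panel: hospital A adopts P4P in period 2
y     = [50.0, 53.0, 70.0, 70.0]      # composite quality scores
treat = [0, 1, 0, 0]                  # P4P indicator
unit  = ["A", "A", "B", "B"]          # hospital ids
est = within_estimator(y, treat, unit)
```

Here hospital A improves by 3 points on adoption while untreated hospital B is flat, so the within estimate is 3.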
AutoLock: a semiautomated system for radiotherapy treatment plan quality control
Lowe, Matthew; Hardy, Mark J.; Boylan, Christopher J.; Whitehurst, Philip; Rowbottom, Carl G.
2015-01-01
A semiautomated system for radiotherapy treatment plan quality control (QC), named AutoLock, is presented. AutoLock is designed to augment treatment plan QC by automatically checking aspects of treatment plans that are well suited to computational evaluation, whilst summarizing more subjective aspects in the form of a checklist. The treatment plan must pass all automated checks and all checklist items must be acknowledged by the planner as correct before the plan is finalized. Thus AutoLock uniquely integrates automated treatment plan QC, an electronic checklist, and plan finalization. In addition to reducing the potential for the propagation of errors, the integration of AutoLock into the plan finalization workflow has improved efficiency at our center. Detailed audit data are presented, demonstrating that the treatment plan QC rejection rate fell by around a third following the clinical introduction of AutoLock. PACS number: 87.55.Qr PMID:26103498
AutoLock: a semiautomated system for radiotherapy treatment plan quality control.
Dewhurst, Joseph M; Lowe, Matthew; Hardy, Mark J; Boylan, Christopher J; Whitehurst, Philip; Rowbottom, Carl G
2015-05-08
A semiautomated system for radiotherapy treatment plan quality control (QC), named AutoLock, is presented. AutoLock is designed to augment treatment plan QC by automatically checking aspects of treatment plans that are well suited to computational evaluation, whilst summarizing more subjective aspects in the form of a checklist. The treatment plan must pass all automated checks and all checklist items must be acknowledged by the planner as correct before the plan is finalized. Thus AutoLock uniquely integrates automated treatment plan QC, an electronic checklist, and plan finalization. In addition to reducing the potential for the propagation of errors, the integration of AutoLock into the plan finalization workflow has improved efficiency at our center. Detailed audit data are presented, demonstrating that the treatment plan QC rejection rate fell by around a third following the clinical introduction of AutoLock.
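The finalization logic described above, in which a plan can only be finalized once every automated check passes and every checklist item is acknowledged, can be sketched as follows. This is a minimal illustration of the described workflow, not the actual AutoLock implementation; all names and check items are hypothetical.

```python
# Sketch of an AutoLock-style finalization gate: automated checks plus an
# acknowledged checklist must both be fully satisfied before a plan is final.
from dataclasses import dataclass, field

@dataclass
class PlanQC:
    automated_checks: dict = field(default_factory=dict)  # check name -> pass/fail
    checklist_acks: dict = field(default_factory=dict)    # item -> acknowledged?

    def can_finalize(self) -> bool:
        # Every automated check must pass AND every checklist item
        # must be explicitly acknowledged by the planner.
        return all(self.automated_checks.values()) and all(self.checklist_acks.values())

qc = PlanQC(
    automated_checks={"dose_limits": True, "field_names": True},
    checklist_acks={"isocentre_reviewed": True, "prescription_matches_intent": False},
)
print(qc.can_finalize())  # False: one unacknowledged item blocks finalization
```

The key design point is the conjunction: neither a clean automated run nor a fully ticked checklist is sufficient on its own.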
Amelogenin test: From forensics to quality control in clinical and biochemical genomics.
Francès, F; Portolés, O; González, J I; Coltell, O; Verdú, F; Castelló, A; Corella, D
2007-01-01
The increasing number of samples in biomedical genetic studies, and the number of centers participating in them, involves an increasing risk of mistakes at the different sample-handling stages. We have evaluated the usefulness of the amelogenin test for quality control in sample identification. The amelogenin test (frequently used in forensics) was performed on 1224 individuals participating in a biomedical study. Concordance between the sex recorded in the database and the amelogenin test result was estimated. Additional genetic systems for detecting sex errors were developed. The overall concordance rate was 99.84% (1222/1224). Two samples showed a female amelogenin test outcome while being coded as male in the database. The first, after checking sex-specific biochemical and clinical profile data, was found to be due to a coding error in the database. In the second, after checking the database, no apparent error was discovered because a correct male profile was found. False negatives in amelogenin male sex determination were ruled out by additional tests, and female sex was confirmed; a sample-labeling error was revealed after a new DNA extraction. The amelogenin test is a useful quality control tool for detecting sex-identification errors in large genomic studies and can contribute to increasing their validity.
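The concordance check at the heart of this QC approach is simple to state: compare the recorded sex for each sample against the amelogenin result and flag mismatches. A minimal sketch (illustrative function and record layout, not the authors' code) reproduces the reported figure of 1222/1224:

```python
# Concordance between database-recorded sex and amelogenin test outcome.
def concordance(records):
    """records: iterable of (db_sex, amelogenin_sex) pairs.
    Returns (concordance_rate, list of (index, db_sex, amelogenin_sex) mismatches)."""
    mismatches = [(i, db, am) for i, (db, am) in enumerate(records) if db != am]
    rate = 1 - len(mismatches) / len(records)
    return rate, mismatches

# 1222 concordant samples plus the two discordant ones reported in the study.
records = [("M", "M")] * 1222 + [("M", "F"), ("M", "F")]
rate, bad = concordance(records)
print(f"{rate:.2%}", len(bad))  # 99.84% 2
```

Each flagged sample would then go to manual follow-up (database review, repeat extraction), as the study describes.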
Continuous integration and quality control for scientific software
NASA Astrophysics Data System (ADS)
Neidhardt, A.; Ettl, M.; Brisken, W.; Dassing, R.
2013-08-01
Modern software has to be stable, portable, fast and reliable. This is becoming more and more important for scientific software as well. But it requires a sophisticated way to inspect, check and evaluate the quality of source code with a suitable, automated infrastructure. A centralized server with a software repository and a version control system is one essential part, to manage the code base and to control the different development versions. While each project can be compiled separately, the whole code base can also be compiled with one central “Makefile”. This is used to create automated, nightly builds. Additionally, all sources are inspected automatically with static code analysis and inspection tools, which check for well-known error situations, memory and resource leaks, performance issues, and style issues. In combination with an automatic documentation generator it is possible to create the developer documentation directly from the code and the inline comments. All reports and generated information are presented as HTML pages on a web server. Because this environment increased the stability and quality of the software of the Geodetic Observatory Wettzell tremendously, it is now also available to scientific communities. One regular customer is already the developer group of the DiFX software correlator project.
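The automated source inspection described can be approximated in miniature. The real infrastructure uses dedicated static-analysis tools; the toy stand-in below (all names hypothetical) just flags two simple issues in a source string, to show the shape of such a check:

```python
# Toy static inspection: flag over-long lines and unresolved TODO markers.
def style_issues(source, max_len=100):
    """Return a list of (line_number, issue) tuples for a source string."""
    issues = []
    for n, line in enumerate(source.splitlines(), start=1):
        if len(line) > max_len:
            issues.append((n, "line too long"))
        if "TODO" in line:
            issues.append((n, "unresolved TODO"))
    return issues

src = "int main() {\n    // TODO: check return codes\n    return 0;\n}"
print(style_issues(src))  # [(2, 'unresolved TODO')]
```

In the nightly-build setting, a report like this would be generated per project and rendered to the HTML status pages.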
40 CFR 63.7535 - How do I monitor and collect data to demonstrate continuous compliance?
Code of Federal Regulations, 2012 CFR
2012-07-01
... activities, including, as applicable, calibration checks and required zero and span adjustments. A monitoring...-control periods, or required monitoring system quality assurance or control activities in data averages... data according to this section and the site-specific monitoring plan required by § 63.7505(d). (b) You...
40 CFR 63.7535 - How do I monitor and collect data to demonstrate continuous compliance?
Code of Federal Regulations, 2011 CFR
2011-07-01
... activities, including, as applicable, calibration checks and required zero and span adjustments. A monitoring...-control periods, or required monitoring system quality assurance or control activities in data averages... data according to this section and the site-specific monitoring plan required by § 63.7505(d). (b) You...
NASA Technical Reports Server (NTRS)
Tighe, R. J.; Shen, M. Y. H.
1984-01-01
The Nimbus 7 ERB MATRIX Tape is a computer program in which radiances and irradiances are converted into fluxes which are used to compute the basic scientific output parameters, emitted flux, albedo, and net radiation. They are spatially averaged and presented as time averages over one-day, six-day, and monthly periods. MATRIX data for the period November 16, 1978 through October 31, 1979 are presented. Described are the Earth Radiation Budget experiment, the Science Quality Control Report, Items checked by the MATRIX Science Quality Control Program, and Science Quality Control Data Analysis Report. Additional material from the detailed scientific quality control of the tapes which may be very useful to a user of the MATRIX tapes is included. Known errors and data problems and some suggestions on how to use the data for further climatologic and atmospheric physics studies are also discussed.
Data services provided by the Ukrainian NODC (MHI NASU)
NASA Astrophysics Data System (ADS)
Eremeev, V.; Godin, E.; Khaliulin, A.; Ingerov, A.; Zhuk, E.
2009-04-01
At the modern stage of World Ocean study, information support of investigations based on advanced computer technologies becomes particularly important. These abstracts present several data services developed in the Ukrainian NODC on the basis of the Marine Environmental and Information Technologies Department of MHI NASU. The Data Quality Control Service. Using the experience of international collaboration in the field of data collection and quality checking, we have developed quality control (QC) software providing both preliminary (automatic) and expert (manual) data quality check procedures. The current version of the QC software works for the Mediterranean and Black seas and includes climatic arrays for hydrological and a few hydrochemical parameters based on such products as MEDAR/MEDATLAS II, Physical Oceanography of the Black Sea, and the Climatic Atlas of Oxygen and Hydrogen Sulfide in the Black Sea. The data quality check procedure includes metadata control and hydrological and hydrochemical data control. Metadata control provides checking of duplicate cruises and profiles, date and chronology, ship velocity, station location, sea depth, and observation depth. The data QC procedure includes climatic (or range, for parameters with a small number of observations) data QC, a density inversion check for hydrological data, and searching for spikes. The use of climatic fields and profiles prepared by regional oceanography experts leads to more reliable results of the data quality check procedure.
The Data Access Services. The Ukrainian NODC provides two products for data access: on-line software and a data access module for the MHI NASU local network. This software allows selecting data by rectangular area, date, month, and cruise. The result of a query is metadata, presented in a table together with a visual presentation of the stations on the map. It is possible to view both metadata and data; for this purpose it is necessary to select a station in the table of metadata or on the map. There is also an option to export data in ODV format. The product is available at http://www.ocean.nodc.org.ua/DataAccess.php The local network version provides access to the oceanological database of the MHI NASU. The current version allows selecting data by spatial and temporal limits, depth, parameter values, and quality flags, and works for the Mediterranean and Black seas. It provides visualization of metadata and data, statistics of data selections, and data export into several formats. The Operational Data Management Services. The collaborators of the MHI Experimental Branch developed a system for obtaining information on water pressure and temperature, as well as on atmospheric pressure. Sea level observations are also conducted. The obtained data are transferred online. An interface for operational data access was developed; it allows selecting parameters (sea level, water temperature, atmospheric pressure, wind, and water pressure) and a time interval in order to view parameter graphics. The product is available at http://www.ocean.nodc.org.ua/Katsively.php .
The Climatic Products. The current version of the Climatic Atlas includes maps of such parameters as temperature, salinity, density, heat storage, dynamic heights, the upper boundary of hydrogen sulfide, and the lower boundary of oxygen for the Black Sea basin. Maps of temperature, salinity, and density were calculated on 19 standard depths and averaged monthly for depths of 0-300 m and annually for lower depths. The climatic maps of the upper boundary of hydrogen sulfide and the lower boundary of oxygen were averaged by decade, from the 1920s to the 1990s, and by season. Two versions of the climatic atlas viewer, on-line and desktop, were developed for presentation of the climatic maps. They provide similar functions for selecting and viewing maps by parameter, month, and depth, and for saving maps in various formats. The on-line version of the atlas is available at http://www.ocean.nodc.org.ua/Main_Atlas.php .
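Two of the profile QC steps named above, the climatic range check and the density inversion check, reduce to very short procedures. The sketch below shows assumed logic only, not the NODC implementation; thresholds and values are illustrative:

```python
# Climatic range check: flag values outside the expected [lo, hi] envelope.
def range_check(values, lo, hi):
    return [i for i, v in enumerate(values) if not (lo <= v <= hi)]

# Density inversion check: density should not decrease with depth.
def density_inversions(density_by_depth):
    return [i for i in range(1, len(density_by_depth))
            if density_by_depth[i] < density_by_depth[i - 1]]

temps = [24.1, 23.8, 9.5, 21.0, 18.7]    # 9.5 is a suspected spike
print(range_check(temps, 15.0, 30.0))    # [2]
dens = [1025.0, 1025.4, 1025.2, 1026.1]  # inversion at index 2
print(density_inversions(dens))          # [2]
```

In the service described, flagged points would be passed to the expert (manual) stage rather than discarded automatically.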
40 CFR Appendix B to Part 75 - Quality Assurance and Quality Control Procedures
Code of Federal Regulations, 2010 CFR
2010-07-01
... in section 2.3 of this appendix and the Hg emission tests described in §§ 75.81(c) and 75.81(d)(4). 1.2Specific Requirements for Continuous Emissions Monitoring Systems 1.2.1Calibration Error Test and Linearity Check Procedures Keep a written record of the procedures used for daily calibration error tests and...
Some Inspection Methods for Quality Control and In-service Inspection of GLARE
NASA Astrophysics Data System (ADS)
Sinke, J.
2003-07-01
Quality control of materials and structures is an important issue, and this also holds for GLARE. During the manufacturing stage the processes and materials should be monitored and checked frequently in order to obtain a qualified product. During operation of the aircraft, frequent monitoring and inspections are performed to maintain the quality at a prescribed level; in-service inspection methods are applied and, when necessary, repair activities are conducted. For quality control of GLARE panels and components during manufacturing, the C-scan method has proved to be an effective tool. For in-service inspection, the eddy current method is one of the suitable options. This paper presents a brief overview of both methods and their application to GLARE products.
Agricultural Baseline (BL0) scenario of the 2016 Billion-Ton Report
Davis, Maggie R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)] (ORCID:0000000181319328); Hellwinkel, Chad [University of Tennessee, APAC] (ORCID:0000000173085058); Eaton, Laurence [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)] (ORCID:0000000312709626); Langholtz, Matthew H [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)] (ORCID:0000000281537154); Turhollow, Anthony [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)] (ORCID:0000000228159350); Brandt, Craig [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)] (ORCID:0000000214707379); Myers, Aaron [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)] (ORCID:0000000320373827)
2016-07-13
Scientific reason for data generation: to serve as the reference case for the BT16 volume 1 agricultural scenarios. The agricultural baseline runs from 2015 through 2040; a starting year of 2014 is used. Date the data set was last modified: 02/12/2016. How each parameter was produced (methods), format, and relationship to other data in the data set: the simulation was developed without offering a farmgate price to energy crops or residues (i.e., building on both the USDA 2015 baseline and the agricultural census data (USDA NASS 2014)). Data generated are .txt output files by year, simulation identifier, and county code (1-3109). Instruments used: POLYSYS (version POLYS2015_V10_alt_JAN22B) supplied by the University of Tennessee APAC. The quality assurance and quality control that have been applied:
• Check for negative planted area, harvested area, production, yield, and cost values.
• Check whether harvested area exceeds planted area for annuals.
• Check FIPS codes.
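The three QA/QC checks listed can be sketched against a single county-year record. Field names and values here are illustrative, not the actual POLYSYS output schema:

```python
# Apply the three listed QA/QC checks to one simulated county-year record.
def qc_record(rec, valid_fips):
    issues = []
    # 1. No negative area, production, yield, or cost values.
    for key in ("planted_area", "harvested_area", "production", "yield", "cost"):
        if rec[key] < 0:
            issues.append(f"negative {key}")
    # 2. Harvested area may not exceed planted area (annual crops).
    if rec["harvested_area"] > rec["planted_area"]:
        issues.append("harvested exceeds planted")
    # 3. County FIPS code must be known.
    if rec["fips"] not in valid_fips:
        issues.append("bad FIPS code")
    return issues

rec = {"fips": "04013", "planted_area": 100.0, "harvested_area": 120.0,
       "production": 500.0, "yield": 4.2, "cost": -3.0}
print(qc_record(rec, {"04013", "04019"}))  # ['negative cost', 'harvested exceeds planted']
```

Run over every record in the .txt output files, such a pass produces an issue list per county-year for review.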
KCBX Letter to EPA - Aug. 27, 2014
A quality control check revealed that temperature data for June 26-Aug. 6, 2014 from the meteorological monitor at NT-NW was not accurate. KCBX replaced the invalidated data with hourly ambient temperature data from other equipment and sent updated files.
1997-12-16
An image of the F-16XL #1 during its functional flight check of the Digital Flight Control System (DFCS) on December 16, 1997. The mission was flown by NASA research pilot Dana Purifoy and lasted 1 hour and 25 minutes. The tests included pilot familiarity, functional check, and handling qualities evaluation maneuvers to a speed of Mach 0.6 and 300 knots. Purifoy completed all the briefed data points with no problems and reported that the DFCS handled as well as, if not better than, the analog computer system it replaced.
Model Checking Verification and Validation at JPL and the NASA Fairmont IV and V Facility
NASA Technical Reports Server (NTRS)
Schneider, Frank; Easterbrook, Steve; Callahan, Jack; Montgomery, Todd
1999-01-01
We show how a technology transfer effort was carried out. The successful use of model checking on a pilot JPL flight project demonstrates the usefulness and the efficacy of the approach. The pilot project was used to model a complex spacecraft controller. Software design and implementation validation were carried out successfully. To suggest future applications we also show how the implementation validation step can be automated. The effort was followed by the formal introduction of the modeling technique as a part of the JPL Quality Assurance process.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Montano, Joshua Daniel
2015-03-23
Coordinate Measuring Machines (CMM) are widely used in industry, throughout the Nuclear Weapons Complex and at Los Alamos National Laboratory (LANL) to verify part conformance to design definition. Calibration cycles for CMMs at LANL are predominantly one year in length. Unfortunately, several nonconformance reports have been generated to document the discovery of a certified machine found out of tolerance during a calibration closeout. In an effort to reduce risk to product quality two solutions were proposed: shorten the calibration cycle, which could be costly, or perform an interim check to monitor the machine’s performance between cycles. The CMM interim check discussed makes use of Renishaw’s Machine Checking Gauge. This off-the-shelf product simulates a large sphere within a CMM’s measurement volume and allows for error estimation. Data was gathered, analyzed, and simulated from seven machines in seventeen different configurations to create statistical process control run charts for on-the-floor monitoring.
CRN5EXP: Expert system for statistical quality control
NASA Technical Reports Server (NTRS)
Hentea, Mariana
1991-01-01
The purpose of the Expert System CRN5EXP is to assist in checking the quality of the coils at two very important mills in a steel plant: Hot Rolling and Cold Rolling. The system interprets the statistical quality control charts, and diagnoses and predicts the quality of the steel. Measurements of process control variables are recorded in a database, and sample statistics such as the mean and the range are computed and plotted on a control chart. The chart is analyzed for patterns using the C Language Integrated Production System (CLIPS) and a forward-chaining technique to reach a conclusion about the causes of defects and to take management measures for the improvement of the quality control techniques. The Expert System combines the certainty factors associated with the process control variables to predict the quality of the steel. The paper presents the approach to extracting data from the database, the rationale for combining certainty factors, and the architecture and use of the Expert System. The interpretation of control chart patterns requires the human expert's knowledge, which lends itself naturally to Expert System rules.
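The sample statistics the system plots, subgroup means and ranges with control limits, follow the standard X-bar/R chart construction. A minimal sketch (standard textbook formulas, not the CRN5EXP code; A2 = 0.577 is the tabulated constant for subgroups of size 5):

```python
# X-bar chart control limits from subgroup means and ranges:
# UCL/LCL = x-double-bar +/- A2 * R-bar.
def xbar_r_limits(subgroups, a2=0.577):  # A2 = 0.577 for subgroup size n = 5
    means = [sum(s) / len(s) for s in subgroups]
    ranges = [max(s) - min(s) for s in subgroups]
    xbar = sum(means) / len(means)   # grand mean (chart centre line)
    rbar = sum(ranges) / len(ranges)  # average range
    return xbar - a2 * rbar, xbar, xbar + a2 * rbar

subgroups = [[9.9, 10.1, 10.0, 10.2, 9.8],
             [10.0, 10.3, 9.9, 10.1, 10.0],
             [9.7, 10.0, 10.2, 9.9, 10.1]]
lcl, centre, ucl = xbar_r_limits(subgroups)
print(lcl < centre < ucl)  # True
```

Points falling outside [LCL, UCL], or forming suspicious runs between them, are the patterns the CLIPS rules then interpret.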
The design of temporary sediment controls with special reference to water quality.
DOT National Transportation Integrated Search
1975-01-01
The laboratory and field trapping efficiencies of several types of flow barriers were ascertained. The materials used to fabricate the barriers were various types of hay straw crushed stone and crushed stone/straw mixes. Field checks of systems of ba...
The Method of Manufacturing Nonmetallic Test-Blocks on Different Sensitivity Classes
NASA Astrophysics Data System (ADS)
Kalinichenko, N. P.; Kalinichenko, A. N.; Lobanova, I. S.; Zaitseva, A. A.; Loboda, E. L.
2016-01-01
Quality control of parts made from nonmetallic materials is an increasingly important question, owing to their widespread use. Nondestructive penetrant testing is effective and, in some cases, the only possible method of accident prevention at high-risk sites. A brief review is given of the check samples necessary for evaluating the quality of penetrant materials. A method is proposed for manufacturing test blocks for checking the quality of penetrant materials according to the different sensitivity classes of liquid penetrant testing.
Application Research of Quality Control Technology of Asphalt Pavement based on GPS Intelligent
NASA Astrophysics Data System (ADS)
Wang, Min; Gao, Bo; Shang, Fei; Wang, Tao
2017-10-01
Because of the difficulty of compacting the asphalt layer of steel deck pavement, caused by the flexible supporting system (the orthotropic steel deck plate), it is usually hard to control the site compactness so that it reaches the design goal. The intelligent compaction technology is based on GPS control and real-time acquisition of actual compaction tracks, from which it forms a cloud map of compaction passes that guides the roller operator to compact in accordance with the design requirements, ensuring the deck compaction process and compaction quality. The construction of an actual bridge and the inspection data show that the intelligent compaction technology is significant in guaranteeing the compactness and quality stability of steel deck asphalt pavement.
7 CFR 90.2 - General terms defined.
Code of Federal Regulations, 2011 CFR
2011-01-01
... agency, or other agency, organization or person that defines in the general terms the basis on which the... analytical data using proficiency check sample or analyte recovery techniques. In addition, the certainty.... Quality control. The system of close examination of the critical details of an analytical procedure in...
10 CFR 74.59 - Quality assurance and accounting requirements.
Code of Federal Regulations, 2010 CFR
2010-01-01
... capabilities described in paragraphs (b) through (h) of this section. (b) Management structure. The licensee shall: (1) Establish and maintain a management structure that includes clear overall responsibility for... such that the activities of one individual or organizational unit serve as controls over and checks of...
NASA Astrophysics Data System (ADS)
Pastorello, G.; Agarwal, D.; Poindexter, C.; Papale, D.; Trotta, C.; Ribeca, A.; Canfora, E.; Faybishenko, B.; Gunter, D.; Chu, H.
2015-12-01
The flux-measuring sites that are part of AmeriFlux are operated and maintained in a fairly independent fashion, both in terms of scientific goals and operational practices. This is also the case for most sites from other networks in FLUXNET. This independence leads to a degree of heterogeneity in the data sets collected at the sites, which is also reflected in data quality levels. The generation of derived data products and data synthesis efforts, two of the main goals of these networks, are directly affected by the heterogeneity in data quality. In a collaborative effort between AmeriFlux and ICOS, a series of quality checks are being conducted for the data sets before any network-level data processing and product generation take place. From these checks, a set of common data issues were identified, and are being cataloged and classified into data quality patterns. These patterns are now being used as a basis for implementing automation for certain data quality checks, speeding up the process of applying the checks and evaluating the data. Currently, most data checks are performed individually in each data set, requiring visual inspection and inputs from a data curator. This manual process makes it difficult to scale the quality checks, creating a bottleneck for the data processing. One goal of the automated checks is to free up time of data curators so they can focus on new or less common issues. As new issues are identified, they can also be cataloged and classified, extending the coverage of existing patterns or potentially generating new patterns, helping both improve existing automated checks and create new ones. This approach is helping make data quality evaluation faster, more systematic, and reproducible. Furthermore, these patterns are also helping with documenting common causes and solutions for data problems. This can help tower teams with diagnosing problems in data collection and processing, and also in correcting historical data sets.
In this presentation, using AmeriFlux fluxes and micrometeorological data, we discuss our approach to creating observational data patterns, and how we are using them to implement new automated checks. We also detail examples of these observational data patterns, illustrating how they are being used.
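One recurring pattern in tower time series that lends itself to this kind of automation is a sensor "stuck" at a constant value for an implausibly long run. The sketch below is an illustrative automated check of that pattern (the run-length threshold and variable are assumed, not taken from the AmeriFlux/ICOS pipeline):

```python
# Flag runs of identical consecutive values longer than min_run samples,
# a common symptom of a stuck or saturated sensor in half-hourly records.
def stuck_runs(series, min_run=4):
    """Return (start_index, run_length) for each suspiciously long constant run."""
    runs, start = [], 0
    for i in range(1, len(series) + 1):
        if i == len(series) or series[i] != series[start]:
            if i - start >= min_run:
                runs.append((start, i - start))
            start = i
    return runs

ta = [21.3, 21.4, 25.0, 25.0, 25.0, 25.0, 25.0, 21.1]  # air temperature, degC
print(stuck_runs(ta))  # [(2, 5)]
```

Once a pattern like this is codified, it can be run across all submitted data sets, reserving the curator's attention for the flagged segments.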
Consumers' perceptions of food risks: A snapshot of the Italian Triveneto area.
Tiozzo, Barbara; Mari, Silvia; Ruzza, Mirko; Crovato, Stefania; Ravarotto, Licia
2017-04-01
This study investigated the food risk perceptions of people living in the Triveneto area (Northeast Italy), a territory characterized by a particular interest in the production of quality foodstuffs, to determine what aspects people associate with food risk and to understand what beliefs underlie these perceptions. Four focus groups were conducted in the major towns of the target area (N = 45). A semi-structured interview was used that focused on beliefs about food risks, the use of information and media sources in relation to food risk, and the behaviours adopted when eating outside the home. A homogeneous view of food risk emerged among the respondents, and a common definition of risky food was identified. The concept of risk was in opposition to the quality and controllability of food, which emerged as major strategies to cope with food risks. Quality was linked to freshness and local origin, whereas controllability reflected a direct (e.g., checking labels, having a relationship with the vendor, cultivating one's own vegetable garden) or indirect (e.g., control guarantees provided by suppliers and the government) means to check the safety and quality of food. Although people seemed quite informed about food risks, a common sense of impotence with regard to one's own protection prevailed, together with a fatalistic sense of incomplete control over risk. The results identified food concerns for consumers living in this specific territory and might represent a starting point for public health authorities to increase compliance with responsible behaviours for risk mitigation and to define successful food policies for this area. Copyright © 2016 Elsevier Ltd. All rights reserved.
Norman, Laura M.; Niraula, Rewati
2016-01-01
The objective of this study was to evaluate the effect of check dam infrastructure on soil and water conservation at the catchment scale using the Soil and Water Assessment Tool (SWAT). This paired watershed study includes a watershed treated with over 2000 check dams and a Control watershed which has none, in the West Turkey Creek watershed, Southeast Arizona, USA. SWAT was calibrated for streamflow using discharge documented during the summer of 2013 at the Control site. Model results depict the necessity to eliminate lateral flow from SWAT models of aridland environments, the urgency to standardize geospatial soils data, and the care for which modelers must document altering parameters when presenting findings. Performance was assessed using the percent bias (PBIAS), with values of ±2.34%. The calibrated model was then used to examine the impacts of check dams at the Treated watershed. Approximately 630 tons of sediment is estimated to be stored behind check dams in the Treated watershed over the 3-year simulation, increasing water quality for fish habitat. A minimum precipitation event of 15 mm was necessary to instigate the detachment of soil, sediments, or rock from the study area, which occurred 2% of the time. The resulting watershed model is useful as a predictive framework and decision-support tool to consider long-term impacts of restoration and potential for future restoration.
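The PBIAS figure quoted (±2.34%) follows the standard percent-bias definition for comparing observed and simulated streamflow. A short sketch (illustrative data, standard formula):

```python
# Percent bias: PBIAS (%) = 100 * sum(obs - sim) / sum(obs).
# 0 is a perfect fit; positive values indicate model underestimation.
def pbias(observed, simulated):
    return 100.0 * sum(o - s for o, s in zip(observed, simulated)) / sum(observed)

obs = [1.2, 0.8, 2.5, 1.9]  # observed discharge (illustrative units)
sim = [1.1, 0.9, 2.4, 1.8]  # simulated discharge
print(round(pbias(obs, sim), 2))  # 3.12
```

Because PBIAS aggregates signed errors, compensating over- and under-predictions can cancel, which is why it is usually reported alongside other fit statistics.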
40 CFR 63.7742 - How do I monitor and collect data to demonstrate continuous compliance?
Code of Federal Regulations, 2013 CFR
2013-07-01
... continuous compliance? (a) Except for monitoring malfunctions, associated repairs, and required quality assurance or control activities (including as applicable, calibration checks and required zero and span... emissions is operating. (b) You may not use data recorded during monitoring malfunctions, associated repairs...
40 CFR 63.7742 - How do I monitor and collect data to demonstrate continuous compliance?
Code of Federal Regulations, 2012 CFR
2012-07-01
... continuous compliance? (a) Except for monitoring malfunctions, associated repairs, and required quality assurance or control activities (including as applicable, calibration checks and required zero and span... emissions is operating. (b) You may not use data recorded during monitoring malfunctions, associated repairs...
40 CFR 63.9922 - How do I monitor and collect data to demonstrate continuous compliance?
Code of Federal Regulations, 2013 CFR
2013-07-01
... demonstrate continuous compliance? (a) Except for monitoring malfunctions, associated repairs, and required quality assurance or control activities (including, as applicable, calibration checks and required zero... all times an affected source is operating. (b) You may not use data recorded during monitoring...
40 CFR 63.9922 - How do I monitor and collect data to demonstrate continuous compliance?
Code of Federal Regulations, 2014 CFR
2014-07-01
... demonstrate continuous compliance? (a) Except for monitoring malfunctions, associated repairs, and required quality assurance or control activities (including, as applicable, calibration checks and required zero... all times an affected source is operating. (b) You may not use data recorded during monitoring...
40 CFR 63.7742 - How do I monitor and collect data to demonstrate continuous compliance?
Code of Federal Regulations, 2010 CFR
2010-07-01
... continuous compliance? (a) Except for monitoring malfunctions, associated repairs, and required quality assurance or control activities (including as applicable, calibration checks and required zero and span... emissions is operating. (b) You may not use data recorded during monitoring malfunctions, associated repairs...
40 CFR 63.9633 - How do I monitor and collect data to demonstrate continuous compliance?
Code of Federal Regulations, 2013 CFR
2013-07-01
... continuous compliance? (a) Except for monitoring malfunctions, associated repairs, and required quality assurance or control activities (including as applicable, calibration checks and required zero and span... affected source is operating. (b) You may not use data recorded during monitoring malfunctions, associated...
40 CFR 63.9922 - How do I monitor and collect data to demonstrate continuous compliance?
Code of Federal Regulations, 2012 CFR
2012-07-01
... demonstrate continuous compliance? (a) Except for monitoring malfunctions, associated repairs, and required quality assurance or control activities (including, as applicable, calibration checks and required zero... all times an affected source is operating. (b) You may not use data recorded during monitoring...
40 CFR 63.9922 - How do I monitor and collect data to demonstrate continuous compliance?
Code of Federal Regulations, 2010 CFR
2010-07-01
... demonstrate continuous compliance? (a) Except for monitoring malfunctions, associated repairs, and required quality assurance or control activities (including, as applicable, calibration checks and required zero... all times an affected source is operating. (b) You may not use data recorded during monitoring...
40 CFR 63.7742 - How do I monitor and collect data to demonstrate continuous compliance?
Code of Federal Regulations, 2014 CFR
2014-07-01
... continuous compliance? (a) Except for monitoring malfunctions, associated repairs, and required quality assurance or control activities (including as applicable, calibration checks and required zero and span... emissions is operating. (b) You may not use data recorded during monitoring malfunctions, associated repairs...
40 CFR 63.7742 - How do I monitor and collect data to demonstrate continuous compliance?
Code of Federal Regulations, 2011 CFR
2011-07-01
... continuous compliance? (a) Except for monitoring malfunctions, associated repairs, and required quality assurance or control activities (including as applicable, calibration checks and required zero and span... emissions is operating. (b) You may not use data recorded during monitoring malfunctions, associated repairs...
40 CFR 63.9633 - How do I monitor and collect data to demonstrate continuous compliance?
Code of Federal Regulations, 2010 CFR
2010-07-01
... continuous compliance? (a) Except for monitoring malfunctions, associated repairs, and required quality assurance or control activities (including as applicable, calibration checks and required zero and span... affected source is operating. (b) You may not use data recorded during monitoring malfunctions, associated...
40 CFR 63.9633 - How do I monitor and collect data to demonstrate continuous compliance?
Code of Federal Regulations, 2014 CFR
2014-07-01
... continuous compliance? (a) Except for monitoring malfunctions, associated repairs, and required quality assurance or control activities (including as applicable, calibration checks and required zero and span... affected source is operating. (b) You may not use data recorded during monitoring malfunctions, associated...
40 CFR 63.9633 - How do I monitor and collect data to demonstrate continuous compliance?
Code of Federal Regulations, 2012 CFR
2012-07-01
... continuous compliance? (a) Except for monitoring malfunctions, associated repairs, and required quality assurance or control activities (including as applicable, calibration checks and required zero and span... affected source is operating. (b) You may not use data recorded during monitoring malfunctions, associated...
Program Manual for Producing Weight Scaling Conversion Tables
Gary L. Tyre; Clyde A. Fasick; Frank M. Riley; Frank O. Lege
1973-01-01
Three computer programs are presented which can be applied by individual firms to establish a weight-scaling information system. The first generates volume estimates from truckload weights for any combination of veneer, sawmill, and pulpwood volumes. The second provides quality-control information by tabulating differences between estimated volumes and observed check-...
40 CFR 60.63 - Monitoring of operations.
Code of Federal Regulations, 2011 CFR
2011-07-01
... assurance or quality control activities (including, as applicable, calibration checks and required zero and... period. (7) The flow rate sensor must have provisions to determine the daily zero and upscale calibration... chapter for a discussion of CD). (i) Conduct the CD tests at two reference signal levels, zero (e.g., 0 to...
40 CFR 60.63 - Monitoring of operations.
Code of Federal Regulations, 2012 CFR
2012-07-01
... assurance or quality control activities (including, as applicable, calibration checks and required zero and... period. (7) The flow rate sensor must have provisions to determine the daily zero and upscale calibration... chapter for a discussion of CD). (i) Conduct the CD tests at two reference signal levels, zero (e.g., 0 to...
Principles of continuous quality improvement applied to intravenous therapy.
Dunavin, M K; Lane, C; Parker, P E
1994-01-01
Documentation of the application of the principles of continuous quality improvement (CQI) to the health care setting is crucial for understanding the transition from traditional management models to CQI models. A CQI project was designed and implemented by the IV Therapy Department at Lawrence Memorial Hospital to test the application of these principles to intravenous therapy and as a learning tool for the entire organization. Through a prototype inventory project, significant savings in cost and time were demonstrated using check sheets, flow diagrams, control charts, and other statistical tools, as well as using the Plan-Do-Check-Act cycle. As a result, a primary goal, increased time for direct patient care, was achieved. Eight hours per week in nursing time was saved, relationships between two work areas were improved, and $6,000 in personnel costs, storage space, and inventory were saved.
Methods to achieve high interrater reliability in data collection from primary care medical records.
Liddy, Clare; Wiens, Miriam; Hogg, William
2011-01-01
We assessed interrater reliability (IRR) of chart abstractors within a randomized trial of cardiovascular care in primary care. We report our findings, and outline issues and provide recommendations related to determining sample size, frequency of verification, and minimum thresholds for 2 measures of IRR: the κ statistic and percent agreement. We designed a data quality monitoring procedure having 4 parts: use of standardized protocols and forms, extensive training, continuous monitoring of IRR, and a quality improvement feedback mechanism. Four abstractors checked a 5% sample of charts at 3 time points for a predefined set of indicators of the quality of care. We set our quality threshold for IRR at a κ of 0.75, a percent agreement of 95%, or both. Abstractors reabstracted a sample of charts in 16 of 27 primary care practices, checking a total of 132 charts with 38 indicators per chart. The overall κ across all items was 0.91 (95% confidence interval, 0.90-0.92) and the overall percent agreement was 94.3%, signifying excellent agreement between abstractors. We gave feedback to the abstractors to highlight items that had a κ of less than 0.70 or a percent agreement less than 95%. No practice had to have its charts abstracted again because of poor quality. A 5% sampling of charts for quality control using IRR analysis yielded κ and agreement levels that met or exceeded our quality thresholds. Using 3 time points during the chart audit phase allows for early quality control as well as ongoing quality monitoring. Our results can be used as a guide and benchmark for other medical chart review studies in primary care.
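The two IRR measures used in the study, percent agreement and Cohen's κ, are simple to compute. A minimal Python sketch with hypothetical rater data (not the study's charts or indicators):

```python
from collections import Counter

def percent_agreement(a, b):
    """Fraction of items on which two abstractors agree."""
    matches = sum(1 for x, y in zip(a, b) if x == y)
    return matches / len(a)

def cohens_kappa(a, b):
    """Cohen's kappa: observed agreement corrected for chance agreement."""
    n = len(a)
    po = percent_agreement(a, b)                         # observed agreement
    ca, cb = Counter(a), Counter(b)
    pe = sum(ca[c] * cb[c] for c in set(a) | set(b)) / n**2  # chance agreement
    return (po - pe) / (1 - pe)

# Two hypothetical abstractors coding the same 10 chart items as met/not met
r1 = ["met", "met", "not", "met", "not", "met", "met", "not", "met", "met"]
r2 = ["met", "met", "not", "met", "met", "met", "met", "not", "met", "met"]
print(round(percent_agreement(r1, r2), 2))  # 0.9
print(round(cohens_kappa(r1, r2), 2))       # 0.74
```

Note how a high percent agreement (90%) can coexist with a lower κ, which is why the study sets thresholds on both measures.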
Data quality assessment for comparative effectiveness research in distributed data networks
Brown, Jeffrey; Kahn, Michael; Toh, Sengwee
2015-01-01
Background: Electronic health information routinely collected during healthcare delivery and reimbursement can help address the need for evidence about the real-world effectiveness, safety, and quality of medical care. Often, distributed networks that combine information from multiple sources are needed to generate this real-world evidence. Objective: We provide a set of field-tested best practices and a set of recommendations for data quality checking for comparative effectiveness research (CER) in distributed data networks. Methods: We explore the requirements for data quality checking and describe data quality approaches undertaken by several existing multi-site networks. Results: There are no established standards regarding how to evaluate the quality of electronic health data for CER within distributed networks. Data checks of increasing complexity are often employed, ranging from consistency with syntactic rules to evaluation of semantics and consistency within and across sites. Temporal trends within and across sites are widely used, as are checks of each data refresh or update. Rates of specific events and exposures by age group, sex, and month are also common. Discussion: Secondary use of electronic health data for CER holds promise but is complex, especially in distributed data networks that incorporate periodic data refreshes. The viability of a learning health system is dependent on a robust understanding of the quality, validity, and optimal secondary uses of routinely collected electronic health data within distributed health data networks. Robust data quality checking can strengthen confidence in findings based on distributed data networks. PMID:23793049
Quality assurance and ergonomics in the mammography department.
Reynolds, April
2014-01-01
Quality assurance (QA) in mammography is a system of checks that helps ensure the proper functioning of imaging equipment and processes. Ergonomics is a scientific approach to arranging the work environment to reduce the risk of work-related injuries while increasing staff productivity and job satisfaction. This article reviews both QA and ergonomics in mammography and explains how they work together to create a safe and healthy environment for radiologic technologists and their patients. QA and quality control requirements in mammography are discussed, along with ergonomic best practices in the mammography setting.
Clinical implementation of RNA signatures for pharmacogenomic decision-making
Tang, Weihua; Hu, Zhiyuan; Muallem, Hind; Gulley, Margaret L
2011-01-01
RNA profiling is increasingly used to predict drug response, dose, or toxicity based on analysis of drug pharmacokinetic or pharmacodynamic pathways. Before implementing multiplexed RNA arrays in clinical practice, validation studies are carried out to demonstrate sufficient evidence of analytic and clinical performance, and to establish an assay protocol with quality assurance measures. Pathologists assure quality by selecting input tissue and by interpreting results in the context of the input tissue as well as the technologies that were used and the clinical setting in which the test was ordered. A strength of RNA profiling is the array-based measurement of tens to thousands of RNAs at once, including redundant tests for critical analytes or pathways to promote confidence in test results. Instrument and reagent manufacturers are crucial for supplying reliable components of the test system. Strategies for quality assurance include careful attention to RNA preservation and quality checks at pertinent steps in the assay protocol, beginning with specimen collection and proceeding through the various phases of transport, processing, storage, analysis, interpretation, and reporting. Specimen quality is checked by probing housekeeping transcripts, while spiked and exogenous controls serve as a check on analytic performance of the test system. Software is required to manipulate abundant array data and present it for interpretation by a laboratory physician who reports results in a manner facilitating therapeutic decision-making. Maintenance of the assay requires periodic documentation of personnel competency and laboratory proficiency. These strategies are shepherding genomic arrays into clinical settings to provide added value to patients and to the larger health care system. PMID:23226056
MOM: A meteorological data checking expert system in CLIPS
NASA Technical Reports Server (NTRS)
Odonnell, Richard
1990-01-01
Meteorologists have long faced the problem of verifying the data they use. Experience shows that there is a sizable number of errors in the data reported by meteorological observers. This is unacceptable for computer forecast models, which depend on accurate data for accurate results. Most errors that occur in meteorological data are obvious to the meteorologist, but time constraints prevent hand-checking. For this reason, it is necessary to have a 'front end' to the computer model to ensure the accuracy of input. Various approaches to automatic data quality control have been developed by several groups. MOM is a rule-based system implemented in CLIPS and utilizing 'consistency checks' and 'range checks'. The system is generic in the sense that it knows some meteorological principles, regardless of specific station characteristics. Specific constraints kept as CLIPS facts in a separate file provide for system flexibility. Preliminary results show that the expert system has detected some inconsistencies not noticed by a local expert.
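Range checks and consistency checks of the kind MOM encodes as CLIPS rules can be illustrated in Python. The field names and climatological limits below are hypothetical, chosen only to show the two check types:

```python
def range_check(obs, limits):
    """Flag fields whose values fall outside climatological limits."""
    return [f for f, (lo, hi) in limits.items()
            if f in obs and not lo <= obs[f] <= hi]

def consistency_checks(obs):
    """Flag physically inconsistent combinations of fields."""
    errors = []
    if obs.get("dewpoint_c", -99) > obs.get("temperature_c", 99):
        errors.append("dewpoint exceeds temperature")
    if obs.get("wind_speed_kt", 0) == 0 and obs.get("wind_gust_kt", 0) > 0:
        errors.append("gust reported with calm wind")
    return errors

LIMITS = {"temperature_c": (-60.0, 55.0),
          "dewpoint_c": (-60.0, 35.0),
          "pressure_hpa": (870.0, 1085.0)}

report = {"temperature_c": 21.0, "dewpoint_c": 24.0, "pressure_hpa": 1013.2}
print(range_check(report, LIMITS))   # []
print(consistency_checks(report))    # ['dewpoint exceeds temperature']
```

As in MOM, the generic rules (consistency logic) are separate from the station-specific constraints (the limits table), so the same checker can serve different stations.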
NASA Astrophysics Data System (ADS)
Kapanen, Mika; Tenhunen, Mikko; Hämäläinen, Tuomo; Sipilä, Petri; Parkkinen, Ritva; Järvinen, Hannu
2006-07-01
Quality control (QC) data of radiotherapy linear accelerators, collected by Helsinki University Central Hospital between the years 2000 and 2004, were analysed. The goal was to provide information for the evaluation and elaboration of QC of accelerator outputs and to propose a method for QC data analysis. Short- and long-term drifts in outputs were quantified by fitting empirical mathematical models to the QC measurements. Normally, long-term drifts were well (<=1%) modelled by either a straight line or a single-exponential function. A drift of 2% occurred in 18 ± 12 months. The shortest drift times of only 2-3 months were observed for some new accelerators just after the commissioning but they stabilized during the first 2-3 years. The short-term reproducibility and the long-term stability of local constancy checks, carried out with a sealed plane parallel ion chamber, were also estimated by fitting empirical models to the QC measurements. The reproducibility was 0.2-0.5% depending on the positioning practice of a device. Long-term instabilities of about 0.3%/month were observed for some checking devices. The reproducibility of local absorbed dose measurements was estimated to be about 0.5%. The proposed empirical model fitting of QC data facilitates the recognition of erroneous QC measurements and abnormal output behaviour, caused by malfunctions, offering a tool to improve dose control.
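A drift analysis of this kind can be sketched by fitting a straight line to periodic output checks and solving for the time at which the modelled drift reaches 2%. The monthly readings below are hypothetical, and the least-squares line is a minimal stand-in for the empirical model fitting the study describes:

```python
def fit_line(t, y):
    """Ordinary least-squares fit of y = a + b*t; returns (a, b)."""
    n = len(t)
    mt, my = sum(t) / n, sum(y) / n
    b = (sum((ti - mt) * (yi - my) for ti, yi in zip(t, y))
         / sum((ti - mt) ** 2 for ti in t))
    return my - b * mt, b

# Hypothetical monthly output-constancy readings, % deviation from baseline
months = [0, 1, 2, 3, 4, 5, 6]
output = [0.00, 0.12, 0.21, 0.35, 0.44, 0.58, 0.66]

a, b = fit_line(months, output)
months_to_2pct = (2.0 - a) / b     # time for the modelled drift to reach 2%
print(round(b, 3))                 # drift rate in % per month
print(round(months_to_2pct, 1))    # ~18 months, comparable to the 18 +/- 12 reported
```

Fitting a model rather than eyeballing individual readings makes an abnormal output trend (or an erroneous QC measurement far off the fitted curve) easy to spot.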
NASA Astrophysics Data System (ADS)
Xie, Chang; Wen, Jing; Liu, Wenying; Wang, Jiaming
With the development of intelligent dispatching, the intelligence level of the network control center's full range of services urgently needs to be raised. As an important part of the daily work of the network control center, the intelligent arrangement of maintenance scheduling is essential to the high-quality, safe operation of the power grid. By analyzing the shortcomings of traditional maintenance scheduling software, this paper designs a power grid maintenance scheduling intelligent arrangement support system based on power flow forecasting, which applies advanced technologies such as artificial intelligence, online security checking, and intelligent visualization to maintenance scheduling. It implements online security checking of maintenance schedules based on power flow forecasting, and power flow adjustment based on visualization, in order to make maintenance scheduling arrangement more intelligent and visual.
Take the Reins on Model Quality with ModelCHECK and Gatekeeper
NASA Technical Reports Server (NTRS)
Jones, Corey
2012-01-01
Model quality and consistency has been an issue for us due to the diverse experience level and imaginative modeling techniques of our users. Fortunately, setting up ModelCHECK and Gatekeeper to enforce our best practices has helped greatly, but it wasn't easy. There were many challenges associated with setting up ModelCHECK and Gatekeeper including: limited documentation, restrictions within ModelCHECK, and resistance from end users. However, we consider ours a success story. In this presentation we will describe how we overcame these obstacles and present some of the details of how we configured them to work for us.
Use of Longitudinal Regression in Quality Control. Research Report. ETS RR-14-31
ERIC Educational Resources Information Center
Lu, Ying; Yen, Wendy M.
2014-01-01
This article explores the use of longitudinal regression as a tool for identifying scoring inaccuracies. Student progression patterns, as evaluated through longitudinal regressions, typically are more stable from year to year than are scale score distributions and statistics, which require representative samples to conduct credibility checks.…
1991-12-01
materials.

TABLE I. DRMO Market Prices
  Paper       $45/ton
  Canvas      $0.024/lb
  Aluminum    $0.26/lb
  Tires *     $0.02/lb
  Corrugated  $63/ton
  Silver      Reclaimed...

quality control check in accordance with their permit requirements. They pull samples and do a fingerprint analysis. If during that analysis they find that
STS-34 onboard view of iodine comparator assembly used to check water quality
NASA Technical Reports Server (NTRS)
1989-01-01
STS-34 closeup view taken onboard Atlantis, Orbiter Vehicle (OV) 104, is of the iodine comparator assembly. Potable water quality is checked by comparing the water color to the color chart on the surrounding board.
Schönherr, Sebastian; Neuner, Mathias; Forer, Lukas; Specht, Günther; Kloss-Brandstätter, Anita; Kronenberg, Florian; Coassin, Stefan
2013-01-01
Single nucleotide polymorphisms (SNPs) play a prominent role in modern genetics. Current genotyping technologies such as Sequenom iPLEX, ABI TaqMan and KBioscience KASPar made the genotyping of huge SNP sets in large populations straightforward and allow the generation of hundreds of thousands of genotypes even in medium sized labs. While data generation is straightforward, the subsequent data conversion, storage and quality control steps are time-consuming, error-prone and require extensive bioinformatic support. In order to ease this tedious process, we developed SNPflow. SNPflow is a lightweight, intuitive and easily deployable application, which processes genotype data from Sequenom MassARRAY (iPLEX) and ABI 7900HT (TaqMan, KASPar) systems and is extendible to other genotyping methods as well. SNPflow automatically converts the raw output files to ready-to-use genotype lists, calculates all standard quality control values such as call rate, expected and real amount of replicates, minor allele frequency, absolute number of discordant replicates, discordance rate and the p-value of the HWE test, checks the plausibility of the observed genotype frequencies by comparing them to HapMap/1000-Genomes, provides a module for the processing of SNPs, which allow sex determination for DNA quality control purposes and, finally, stores all data in a relational database. SNPflow runs on all common operating systems and comes as both stand-alone version and multi-user version for laboratory-wide use. The software, a user manual, screenshots and a screencast illustrating the main features are available at http://genepi-snpflow.i-med.ac.at. PMID:23527209
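The standard QC values the abstract lists (call rate, minor allele frequency, HWE test p-value) can be computed directly from genotype calls. A minimal sketch with hypothetical data, using a 1-degree-of-freedom chi-square goodness-of-fit test for Hardy-Weinberg equilibrium:

```python
import math

def genotype_qc(genotypes):
    """Per-SNP QC metrics: call rate, minor allele frequency, HWE p-value."""
    called = [g for g in genotypes if g is not None]
    call_rate = len(called) / len(genotypes)
    n_aa, n_ab, n_bb = (called.count(g) for g in ("AA", "AB", "BB"))
    n = n_aa + n_ab + n_bb
    p = (2 * n_aa + n_ab) / (2 * n)                 # frequency of allele A
    maf = min(p, 1 - p)
    expected = [n * p * p, 2 * n * p * (1 - p), n * (1 - p) ** 2]
    chi2 = sum((o - e) ** 2 / e
               for o, e in zip((n_aa, n_ab, n_bb), expected))
    p_hwe = math.erfc(math.sqrt(chi2 / 2))          # chi-square tail, 1 df
    return call_rate, maf, p_hwe

# Hypothetical plate of 100 samples; None marks a failed call
calls = ["AA"] * 40 + ["AB"] * 40 + ["BB"] * 18 + [None] * 2
call_rate, maf, p_hwe = genotype_qc(calls)
print(round(call_rate, 2), round(maf, 3))  # 0.98 0.388
```

A non-significant HWE p-value and a call rate near 1.0 are the usual pass criteria; SNPflow additionally cross-checks observed frequencies against reference panels such as HapMap/1000 Genomes.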
A novel method for routine quality assurance of volumetric-modulated arc therapy.
Wang, Qingxin; Dai, Jianrong; Zhang, Ke
2013-10-01
Volumetric-modulated arc therapy (VMAT) is delivered through synchronized variation of gantry angle, dose rate, and multileaf collimator (MLC) leaf positions. The dynamic nature of the delivery challenges the parameter-setting accuracy of the linac control system. The purpose of this study was to develop a novel method for routine quality assurance (QA) of VMAT linacs. ArcCheck is a detector array with diodes distributed in a spiral pattern on a cylindrical surface. Utilizing its features, a QA plan was designed to strictly test all varying parameters during VMAT delivery on an Elekta Synergy linac. In this plan, there are 24 control points. The gantry rotates clockwise from 181° to 179°. The dose rate, gantry speed, and MLC positions cover the ranges commonly used in clinic. The two borders of the MLC-shaped field sit over two columns of diodes of the ArcCheck when the gantry rotates to the angle specified by each control point. The ratio of dose rate between each of these diodes and the diode closest to the field center is a fixed value, sensitive to the MLC positioning error of the leaf crossing the diode. Consequently, the positioning error can be determined from this ratio with the help of a relationship curve. The time when the gantry reaches the angle specified by each control point can be acquired from the virtual inclinometer, a feature of the ArcCheck. The gantry speed between two consecutive control points is then calculated. The aforementioned dose rate is calculated from an acm file generated during ArcCheck measurements. This file stores the data measured by each detector in 50 ms updates, with each update in a separate row. A computer program was written in MATLAB to process the data. The program output included the MLC positioning errors and the dose rate at each control point, as well as the gantry speed between control points. To evaluate this method, the plan was delivered for four consecutive weeks.
The actual dose rate and gantry speed were compared with the QA plan specified. Additionally, leaf positioning errors were intentionally introduced to investigate the sensitivity of this method. The relationship curves were established for detecting MLC positioning errors during VMAT delivery. For four consecutive weeks measured, 98.4%, 94.9%, 89.2%, and 91.0% of the leaf positioning errors were within ± 0.5 mm, respectively. For the intentionally introduced leaf positioning systematic errors of -0.5 and +1 mm, the detected leaf positioning errors of 20 Y1 leaf were -0.48 ± 0.14 and 1.02 ± 0.26 mm, respectively. The actual gantry speed and dose rate closely followed the values specified in the VMAT QA plan. This method can assess the accuracy of MLC positions and the dose rate at each control point as well as the gantry speed between control points at the same time. It is efficient and suitable for routine quality assurance of VMAT.
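The gantry-speed part of this analysis reduces to dividing angular travel between consecutive control points by the elapsed time from the virtual inclinometer. A toy sketch (the angles and times below are hypothetical, not the study's plan; the study's own processing was done in MATLAB):

```python
def gantry_speeds(angles_deg, times_s):
    """Average gantry speed (deg/s) between consecutive control points."""
    return [(a2 - a1) / (t2 - t1)
            for (a1, t1), (a2, t2) in zip(zip(angles_deg, times_s),
                                          zip(angles_deg[1:], times_s[1:]))]

# Hypothetical control-point angles and virtual-inclinometer timestamps
angles = [181.0, 196.0, 226.0, 286.0]   # degrees along a clockwise sweep
times = [0.0, 5.0, 10.0, 20.0]          # seconds
print(gantry_speeds(angles, times))     # [3.0, 6.0, 6.0]
```

Comparing each computed speed against the value the QA plan specifies for that segment flags any control-system deviation.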
[Pharmaceutical product quality control and good manufacturing practices].
Hiyama, Yukio
2010-01-01
This report describes the roles of Good Manufacturing Practices (GMP) in pharmaceutical product quality control. There are three keys to pharmaceutical product quality control. They are specifications, thorough product characterization during development, and adherence to GMP as the ICH Q6A guideline on specifications provides the most important principles in its background section. Impacts of the revised Pharmaceutical Affairs Law (rPAL) which became effective in 2005 on product quality control are discussed. Progress of ICH discussion for Pharmaceutical Development (Q8), Quality Risk Management (Q9) and Pharmaceutical Quality System (Q10) are reviewed. In order to reconstruct GMP guidelines and GMP inspection system in the regulatory agencies under the new paradigm by rPAL and the ICH, a series of Health Science studies were conducted. For GMP guidelines, product GMP guideline, technology transfer guideline, laboratory control guideline and change control system guideline were written. For the GMP inspection system, inspection check list, inspection memo and inspection scenario were proposed also by the Health Science study groups. Because pharmaceutical products and their raw materials are manufactured and distributed internationally, collaborations with other national authorities are highly desired. In order to enhance the international collaborations, consistent establishment of GMP inspection quality system throughout Japan will be essential.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ford, Eric C., E-mail: eford@uw.edu; Terezakis, Stephanie; Souranis, Annette
Purpose: To quantify the error-detection effectiveness of commonly used quality control (QC) measures. Methods: We analyzed incidents from 2007-2010 logged into voluntary in-house electronic incident learning systems at 2 academic radiation oncology clinics. None of the incidents resulted in patient harm. Each incident was graded for potential severity using the French Nuclear Safety Authority scoring scale; high potential severity incidents (score >3) were considered, along with a subset of 30 randomly chosen low severity incidents. Each report was evaluated to identify which of 15 common QC checks could have detected it. The effectiveness was calculated, defined as the percentage of incidents that each QC measure could detect, both for individual QC checks and for combinations of checks. Results: In total, 4407 incidents were reported, 292 of which had high potential severity. High- and low-severity incidents were detectable by 4.0 ± 2.3 (mean ± SD) and 2.6 ± 1.4 QC checks, respectively (P<.001). All individual checks were less than 50% sensitive, with the exception of pretreatment plan review by a physicist (63%). An effectiveness of 97% was achieved with 7 checks used in combination and was not further improved with more checks. The combination of checks with the highest effectiveness includes physics plan review, physician plan review, Electronic Portal Imaging Device-based in vivo portal dosimetry, radiation therapist timeout, weekly physics chart check, the use of checklists, port films, and source-to-skin distance checks. Some commonly used QC checks, such as pretreatment intensity modulated radiation therapy QA, do not substantially add to the ability to detect errors in these data. Conclusions: The effectiveness of QC measures in radiation oncology depends sensitively on which checks are used and in which combinations. A small percentage of errors cannot be detected by any of the standard formal QC checks currently in broad use, suggesting that further improvements are needed. These data require confirmation with a broader incident-reporting database.
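The combined-check effectiveness defined above, the percentage of incidents detectable by at least one check in a set, can be illustrated with a toy incident log. The check names and incident data here are hypothetical, not the study's:

```python
def effectiveness(incidents, checks):
    """Share of incidents detectable by at least one check in `checks`."""
    caught = sum(1 for detectors in incidents if detectors & set(checks))
    return caught / len(incidents)

# Toy incident log: each entry is the set of QC checks able to detect it
incidents = [
    {"physics_review"},
    {"physics_review", "physician_review"},
    {"in_vivo_dosimetry"},
    {"chart_check", "physics_review"},
    set(),  # an incident no listed check can catch
]
print(effectiveness(incidents, ["physics_review"]))                       # 0.6
print(effectiveness(incidents, ["physics_review", "in_vivo_dosimetry"]))  # 0.8
```

As in the study, effectiveness saturates: once the combination covers most incidents, adding further checks only duplicates coverage, and incidents detectable by no check (the empty set above) cap the achievable maximum.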
New simple and low-cost methods for periodic checks of Cyclone® Plus Storage Phosphor System.
Edalucci, Elisabetta; Maffione, Anna Margherita; Fornasier, Maria Rosa; De Denaro, Mario; Scian, Giovanni; Dore, Franca; Rubello, Domenico
2017-01-01
The recent large use of the Cyclone® Plus Storage Phosphor System, especially in European countries, as imaging system for quantification of radiochemical purity of radiopharmaceuticals raised the problem of setting the periodic controls as required by European Legislation. We described simple, low-cost methods for Cyclone® Plus quality controls, which can be useful to evaluate the performance measurement of this imaging system.
Control Chart on Semi Analytical Weighting
NASA Astrophysics Data System (ADS)
Miranda, G. S.; Oliveira, C. C.; Silva, T. B. S. C.; Stellato, T. B.; Monteiro, L. R.; Marques, J. R.; Faustino, M. G.; Soares, S. M. V.; Ulrich, J. C.; Pires, M. A. F.; Cotrim, M. E. B.
2018-03-01
Semi-analytical balance verification aims to assess balance performance using graphs that illustrate measurement dispersion through time, and to demonstrate that measurements were performed in a reliable manner. This study presents the internal quality control of a semi-analytical balance (GEHAKA BG400) using control charts. From 2013 to 2016, two weight standards were monitored before any balance operation. This work evaluated whether any significant difference or bias appeared in the weighing procedure over time, in order to check the reliability of the generated data. It also exemplifies how control intervals are established.
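Control intervals for check weighings of this kind are commonly set as Shewhart individuals-chart limits. A minimal Python sketch with hypothetical daily weighings of a standard (not data from the study):

```python
def control_limits(values):
    """Shewhart individuals chart: lower limit, centre line, upper limit."""
    n = len(values)
    mean = sum(values) / n
    # average moving range estimates short-term sigma (d2 = 1.128 for n = 2)
    mr = sum(abs(b - a) for a, b in zip(values, values[1:])) / (n - 1)
    sigma = mr / 1.128
    return mean - 3 * sigma, mean, mean + 3 * sigma

# Hypothetical daily check weighings of a 100 g standard, in grams
weighings = [100.001, 99.999, 100.002, 100.000, 99.998, 100.001, 100.000]
lcl, cl, ucl = control_limits(weighings)
out_of_control = [w for w in weighings if not lcl <= w <= ucl]
print(out_of_control)  # [] -- all points inside the 3-sigma limits
```

Points falling outside the limits, or systematic runs on one side of the centre line, would indicate bias or drift in the weighing procedure.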
A rigorous approach to self-checking programming
NASA Technical Reports Server (NTRS)
Hua, Kien A.; Abraham, Jacob A.
1986-01-01
Self-checking programming is shown to be an effective concurrent error detection technique. The reliability of a self-checking program, however, relies on the quality of its assertion statements. A self-checking program written without formal guidelines could provide poor coverage of errors. A constructive technique for self-checking programming is presented. A Structured Program Design Language (SPDL) suitable for self-checking software development is defined. A set of formal rules was also developed that allows the transformation of SPDL designs into self-checking designs to be done in a systematic manner.
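The core idea, executable assertions that check a routine's own output at run time, can be illustrated in a few lines. This sketch is a generic example of the technique, not the paper's SPDL transformation rules:

```python
def sorted_merge(a, b):
    """Merge two sorted lists, with an executable assertion as a self-check."""
    out = []
    i = j = 0
    while i < len(a) or j < len(b):
        if j >= len(b) or (i < len(a) and a[i] <= b[j]):
            out.append(a[i]); i += 1
        else:
            out.append(b[j]); j += 1
    # self-checking assertions: output must be sorted and conserve length
    assert all(x <= y for x, y in zip(out, out[1:])), "output not sorted"
    assert len(out) == len(a) + len(b), "elements lost or duplicated"
    return out

print(sorted_merge([1, 4, 9], [2, 3, 10]))  # [1, 2, 3, 4, 9, 10]
```

As the abstract notes, the coverage of such a self-check is only as good as the assertions: the checks above would catch a dropped element but not, say, a swapped equal pair.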
31 CFR 240.6 - Provisional credit; first examination; declination; final payment.
Code of Federal Regulations, 2010 CFR
2010-07-01
... alteration without examining the original check or a better quality image of the check and Treasury is on... after the check is presented to a Federal Reserve Processing Center for payment, Treasury will be deemed...
31 CFR 240.6 - Provisional credit; first examination; declination; final payment.
Code of Federal Regulations, 2011 CFR
2011-07-01
... alteration without examining the original check or a better quality image of the check and Treasury is on... after the check is presented to a Federal Reserve Processing Center for payment, Treasury will be deemed...
31 CFR 240.6 - Provisional credit; first examination; declination; final payment.
Code of Federal Regulations, 2014 CFR
2014-07-01
... alteration without examining the original check or a better quality image of the check and Treasury is on... after the check is presented to a Federal Reserve Processing Center for payment, Treasury will be deemed...
31 CFR 240.6 - Provisional credit; first examination; declination; final payment.
Code of Federal Regulations, 2013 CFR
2013-07-01
... alteration without examining the original check or a better quality image of the check and Treasury is on... after the check is presented to a Federal Reserve Processing Center for payment, Treasury will be deemed...
31 CFR 240.6 - Provisional credit; first examination; declination; final payment.
Code of Federal Regulations, 2012 CFR
2012-07-01
... alteration without examining the original check or a better quality image of the check and Treasury is on... after the check is presented to a Federal Reserve Processing Center for payment, Treasury will be deemed...
The System of Checks and Balances in General Education Management
ERIC Educational Resources Information Center
Kravtsov, Sergei
2015-01-01
The project of modernization of regional education systems is now in its second year, but by no means all schools in Russia currently meet modern requirements. The division between strong and weak schools is still preserved. Quality control is frequently regarded as a formality and the effectiveness of how a specific educational institution is run…
The relation of mechanical properties of wood and nosebar pressure in the production of veneer
Charles W. McMillin
1958-01-01
Observations of checking frequency, depth of check penetration, veneer thickness, and surface quality were made at 20 machining conditions. An inverse relationship between depth of check and frequency of checking was established. The effect of cutting temperature was demonstrated, and strength in compression perpendicular to the grain, tension perpendicular to the...
[A re-evaluation of the program for diabetes mellitus type 2. A proposal for quality indices].
Espinàs, J; Salla, R M; Bellvehí, M; Reig, E; Iruela, T; Muñoz, E; Isern, R; Molas, M
1993-02-28
To find out how accurate our records are and the state of health of the patients with diabetes mellitus type II (DM) in our Base Health Area (BHA) in Osona county (Barcelona), both before and after introducing a new procedure. Quality control study based on the medical records (PCMR) of DM patients. The evaluation took place between 1.1.90 and 31.12.90; and the re-evaluation between 1.1.91 and 31.12.91, after the DM procedure had been put in place as a corrective measure. 198 patients: all of those suffering from DM type II. 110 women and 88 men, with an average age of 65.4 +/- 11.9, were under study. We observed from the records of attendance that 94.4% were or had been smokers, whereas the question of the eye fundus was only mentioned in 36.8%. The introduction of a procedure has improved the records in almost every parameter. In 1991, 36.8% of the patients had normal-weight criteria, 33.3% had good biochemical control and 15.6% fulfilled both these criteria. Those tests which could be performed with few instruments were carried out much better than those which needed more complex technology or specialist support. Arising from this study, the authors propose four indicators of quality control: 1) Weight normality. 2) Annual plasmatic fructosamine. 3) Annual eye fundus check. 4) Annual proteinuria check.
Dosimetry audits and intercomparisons in radiotherapy: A Malaysian profile
NASA Astrophysics Data System (ADS)
M. Noor, Noramaliza; Nisbet, A.; Hussein, M.; Chu S, Sarene; Kadni, T.; Abdullah, N.; Bradley, D. A.
2017-11-01
Quality audits and intercomparisons are important in ensuring control of processes in any system of endeavour. Present interest is in control of dosimetry in teletherapy, there being a need to assess the extent to which radiation dose is delivered consistently to the patient. In this study we review significant factors that impact upon radiotherapy dosimetry, focusing upon the example situation of radiotherapy delivery in Malaysia and examining the existing literature in support of such efforts. A number of recommendations are made to provide for increased quality assurance and control. In addition, the first level of intercomparison audit, i.e. measuring beam output under reference conditions at eight selected Malaysian radiotherapy centres, was carried out, use being made of 9 μm core diameter Ge-doped silica fibres (Ge-9 μm). The results of Malaysian Secondary Standard Dosimetry Laboratory (SSDL) participation in the IAEA/WHO TLD postal dose audit services between 2011 and 2015 are also discussed. In conclusion, following review of the development of dosimetry audits and the conduct of one such exercise in Malaysia, it is apparent that regular periodic radiotherapy audits and intercomparison programmes should be strongly supported and implemented worldwide. The programmes to date demonstrate these to be a good indicator of errors and of consistency between centres. A total of eight beams were checked in eight Malaysian radiotherapy centres. One of the eight beams produced an unacceptable deviation; this was found to be due to unfamiliarity with the irradiation procedures. Prior to a repeat measurement, the mean ratio of measured to quoted dose was 0.99 with a standard deviation of 3%. Subsequent to the repeat measurement, the mean was 1.00 and the standard deviation 1.3%.
Quality Assurance and Control Considerations in Environmental Measurements and Monitoring
NASA Astrophysics Data System (ADS)
Sedlet, Jacob
1982-06-01
Quality assurance and quality control have become accepted as essential parts of all environmental surveillance, measurements, and monitoring programs, both nuclear and non-nuclear. The same principles and details apply to each; it is primarily the final measurement technique that differs. As the desire and need to measure smaller amounts of pollutants with greater accuracy have increased, it has been recognized that quality assurance and control programs are cost-effective in achieving the expected results. Quality assurance (QA) consists of all the actions necessary to provide confidence in the results. Quality control (QC) is a part of QA, and consists of those actions and activities that permit the control of the individual steps in the environmental program. The distinction between the two terms is not always clearly defined, but a sharp division is not necessary. The essential principle of QA and QC is a commitment to high-quality results. The essential components of a QA and QC program are a complete, written procedures manual for all parts of the environmental program, the use of standard or validated procedures, participation in applicable interlaboratory comparison or QA programs, replicate analysis and measurement, training of personnel, and a means of auditing or checking that the QA and QC programs are properly conducted. These components are discussed below in some detail.
Chinman, Matthew; Ebener, Patricia; Burkhart, Q; Osilla, Karen Chan; Imm, Pamela; Paddock, Susan M.; Wright, Patricia Ann
2017-01-01
Underage drinking is a significant problem facing US communities. Several environmental alcohol prevention (EAP) strategies (laws, regulations, responsible beverage service training and practices) successfully address underage drinking. Communities, however, face challenges carrying out these EAP strategies effectively. This small-scale, three-year, randomized controlled trial assessed whether providing prevention coalitions with Getting To Outcomes-Underage Drinking (GTO-UD), a tool kit and implementation support intervention, helped improve implementation of two common EAP strategies, responsible beverage service training (RBS) and Compliance Checks. Three coalitions in South Carolina and their RBS and Compliance Check programs received the 16-month GTO-UD intervention, including the GTO-UD manual, training, and onsite technical assistance, while another three in South Carolina maintained routine operations. The measures, collected at baseline and after the intervention, were a structured interview assessing how well coalitions carried out their work and a survey of merchant attitudes and practices in the six counties served by the participating coalitions. Over time, the quality of some RBS and Compliance Check activities improved more in GTO-UD coalitions than in the control sites. No changes in merchant practices or attitudes significantly differed between the GTO-UD and control groups, although merchants in the GTO-UD counties did significantly improve on refusing sales to minors while control merchants did not. PMID:23564504
QCScreen: a software tool for data quality control in LC-HRMS based metabolomics.
Simader, Alexandra Maria; Kluger, Bernhard; Neumann, Nora Katharina Nicole; Bueschl, Christoph; Lemmens, Marc; Lirk, Gerald; Krska, Rudolf; Schuhmacher, Rainer
2015-10-24
Metabolomics experiments often comprise large numbers of biological samples, resulting in huge amounts of data. These data need to be inspected for plausibility before data evaluation, to detect putative sources of error such as retention-time or mass-accuracy shifts. Especially in liquid chromatography-high resolution mass spectrometry (LC-HRMS) based metabolomics research, proper quality control checks (e.g. for precision, signal drifts or offsets) are crucial prerequisites for achieving reliable and comparable results within and across experimental measurement sequences. Software tools can support this process. The software tool QCScreen was developed to offer a quick and easy data quality check of LC-HRMS derived data. It allows a flexible investigation and comparison of basic quality-related parameters within user-defined target features, and the possibility to automatically evaluate multiple sample types within or across different measurement sequences in a short time. It offers a user-friendly interface that allows an easy selection of processing steps and parameter settings. The generated results include a coloured overview plot of data quality across all analysed samples and targets and, in addition, detailed illustrations of the stability and precision of the chromatographic separation, the mass accuracy and the detector sensitivity. The use of QCScreen is demonstrated with experimental data from metabolomics experiments using selected standard compounds in pure solvent. The application of the software identified problematic features, samples and analytical parameters and suggested which data files or compounds required closer manual inspection. QCScreen is an open source software tool which provides a useful basis for assessing the suitability of LC-HRMS data prior to time-consuming, detailed data processing and subsequent statistical analysis.
It accepts the generic mzXML format and thus can be used with many different LC-HRMS platforms to process both multiple quality control sample types as well as experimental samples in one or more measurement sequences.
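The kind of basic check QCScreen automates can be illustrated with a toy sketch: compute the relative standard deviation of a target feature's peak areas across QC injections, and the retention-time spread across the sequence. The function names, limits, and data below are hypothetical illustrations, not QCScreen's actual API.

```python
def rsd_percent(values):
    """Relative standard deviation (precision) of repeated measurements."""
    mean = sum(values) / len(values)
    var = sum((v - mean) ** 2 for v in values) / (len(values) - 1)
    return 100 * var ** 0.5 / mean

def qc_flags(areas, retention_times, rsd_limit=15.0, rt_window=0.1):
    """Flag a target feature whose precision or RT stability is poor.

    rsd_limit (%) and rt_window (min) are assumed illustrative limits.
    """
    return {
        "precision_ok": rsd_percent(areas) <= rsd_limit,
        "rt_ok": max(retention_times) - min(retention_times) <= rt_window,
    }

# A hypothetical target feature: stable areas, but a drifting retention time
flags = qc_flags([100.0, 105.0, 98.0, 102.0], [5.02, 5.01, 5.25, 5.03])
```

A coloured overview plot such as QCScreen's can then be built by evaluating these flags for every target across every QC sample.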
Fontes, Tânia; Li, Peilin; Barros, Nelson; Zhao, Pengjun
2018-08-01
Air quality traffic-related measures have been implemented worldwide to control the pollution levels of urban areas. Although some of those measures claim environmental improvements, few studies have checked their real impact. In fact, quantitative estimates are often focused on reducing emissions rather than on evaluating the measures' actual effect on air quality, and even when air quality studies are conducted, results are frequently unclear. In order to properly assess the real impact of traffic-related measures on air quality, a statistical method is proposed. The method compares the pollutant concentration levels observed after the implementation of a measure with the concentration values of the previous year. Short- and long-term impact is assessed considering not only the influence on the average pollutant concentration, but also on its maximum level. To control the effect of the main confounding factors, only days with similar environmental conditions are analysed. The variability of the key meteorological variables that affect the transport and dispersion of the pollutant studied is used to identify and group the days categorized as similar; resemblance of the previous day's pollutant concentration is also taken into account. The impact of the road traffic measures on the air pollutants' concentration is then checked for those similar days using specific statistical functions. To evaluate the proposed method, the impact on PM2.5 concentrations of two air quality traffic-related measures (M1 and M2) implemented in the city of Beijing is considered: M1, implemented in 2009, restricted the circulation of yellow-labelled vehicles, while M2, implemented in 2014, restricted the circulation of heavy-duty vehicles. To compare the results of each measure, a time period when these measures were not applied is used as a control. Copyright © 2018 Elsevier Ltd. All rights reserved.
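A minimal version of the before/after comparison on matched days might look like the sketch below, which reports the percent change in both the mean and the maximum concentration. The data are illustrative; the paper's actual statistical functions and day-matching criteria are more elaborate.

```python
def impact_percent(before, after):
    """Percent change in the mean and maximum pollutant concentration
    between two sets of days matched for environmental conditions."""
    mean_b = sum(before) / len(before)
    mean_a = sum(after) / len(after)
    mean_change = 100 * (mean_a - mean_b) / mean_b
    max_change = 100 * (max(after) - max(before)) / max(before)
    return round(mean_change, 1), round(max_change, 1)

# Hypothetical PM2.5 (ug/m3) on days matched for meteorology,
# before and after a traffic restriction takes effect
before = [80.0, 95.0, 70.0, 110.0]
after = [72.0, 85.0, 60.0, 100.0]
```

In the paper's design, the same comparison run over a control period with no measure in force separates the measure's effect from background trends.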
Analyses of Blood Bank Efficiency, Cost-Effectiveness and Quality
NASA Astrophysics Data System (ADS)
Lam, Hwai-Tai Chen
In view of the increasing costs of hospital care, it is essential to investigate methods to improve the labor efficiency and the cost-effectiveness of the hospital technical core in order to control costs while maintaining the quality of care. This study was conducted to develop indices to measure the efficiency, cost-effectiveness, and quality of blood banks; to identify factors associated with efficiency, cost-effectiveness, and quality; and to generate strategies to improve blood bank labor efficiency and cost-effectiveness. Indices developed in this study for labor efficiency and cost-effectiveness were not affected by patient case mix and illness severity. Factors associated with labor efficiency were identified as managerial styles and organizational designs that balance workload and labor resources. Medical directors' managerial involvement was not associated with labor efficiency, but their continuing education and specialty in blood banking were found to reduce the performance of unnecessary tests. Surprisingly, performing unnecessary tests had no association with labor efficiency; this suggested the existence of labor slack in blood banks. Cost-effectiveness was associated with workers' benefits, wages, and the production of high-end transfusion products by hospital-based donor rooms. Quality indices used in this study included autologous transfusion rates, platelet transfusion rates, and the check points available in an error-control system. Because the autologous transfusion rate was related to patient case mix, severity of illness, and possible inappropriate transfusion, it was not recommended as a quality index. Platelet-pheresis transfusion rates were associated with the transfusion preferences of the blood bank medical directors. The total number of check points in an error-control system was negatively associated with government ownership and workers' experience.
Recommendations for improving labor efficiency and cost-effectiveness were focused on an incentive system that encourages team effort, and the use of appropriate measurements for laboratory efficiency and operational system designs.
NASA Technical Reports Server (NTRS)
Biezad, Daniel
1997-01-01
Handling qualities analysis and control law design would seem to be naturally complementary components of aircraft flight control system design; however, these two closely coupled disciplines are often not well integrated in practice. Handling qualities engineers and control system engineers may work in separate groups within an aircraft company. Flight control system engineers and handling quality specialists may come from different backgrounds and schooling and are often not aware of the other group's research. Thus, while the handling qualities specifications represent desired aircraft response characteristics, these are rarely incorporated directly in the control system design process. Instead, modern control system design techniques are based on servo-loop robustness specifications and simple representations of the desired control response. Comprehensive handling qualities analysis is often left until the end of the design cycle and performed as a check of the completed design for satisfactory performance. This can lead to costly redesign or less than satisfactory aircraft handling qualities when the flight testing phase is reached. The desire to integrate the fields of handling qualities and flight control systems led to the development of the CONDUIT system. This tool facilitates control system designs that achieve desired handling quality requirements and servo-loop specifications in a single design process. With CONDUIT, the control system engineer is now able to directly design control systems to meet the complete handling qualities specifications. CONDUIT allows the designer to retain a preferred control law structure, but then tunes the system parameters to meet the handling quality requirements.
Huang, Hsiao-Yun
2015-01-01
tRNAs perform an essential role in translating the genetic code. They are long-lived RNAs that are generated via numerous posttranscriptional steps. Eukaryotic cells have evolved numerous layers of quality control mechanisms to ensure that the tRNAs are appropriately structured, processed, and modified. We describe the known tRNA quality control processes that check tRNAs and correct or destroy aberrant tRNAs. These mechanisms employ two types of exonucleases, CCA end addition, tRNA nuclear aminoacylation, and tRNA subcellular traffic. We arrange these processes in order of the steps that occur from generation of precursor tRNAs by RNA polymerase (Pol) III transcription to end maturation and modification in the nucleus to splicing and additional modifications in the cytoplasm. Finally, we discuss the tRNA retrograde pathway, which allows tRNA reimport into the nucleus for degradation or repair. PMID:25848089
Sasaki, Kei; Sasaki, Hiroto; Takahashi, Atsuki; Kang, Siu; Yuasa, Tetsuya; Kato, Ryuji
2016-02-01
In recent years, cell and tissue therapy in regenerative medicine have advanced rapidly towards commercialization. However, conventional invasive cell quality assessment is incompatible with direct evaluation of the cells produced for such therapies, especially in the case of regenerative medicine products. Our group has demonstrated the potential of quantitative assessment of cell quality, using information obtained from cell images, for non-invasive real-time evaluation of regenerative medicine products. However, images of cells in the confluent state are often difficult to evaluate, because accurate recognition of individual cells is technically difficult and the morphological features of confluent cells are non-characteristic. To overcome these challenges, we developed a new image-processing algorithm, heterogeneity of orientation (H-Orient) processing, to describe the heterogeneous density of cells in the confluent state. In this algorithm, we introduced a Hessian calculation that converts pixel intensity data to orientation data and a statistical profiling calculation that evaluates the heterogeneity of orientations within an image, generating novel parameters that yield a quantitative profile of an image. Using such parameters, we tested the algorithm's performance in discriminating different qualities of cellular images with three types of clinically important cell quality check (QC) models: remaining lifespan check (QC1), manipulation error check (QC2), and differentiation potential check (QC3). Our results show that our orientation analysis algorithm could predict with high accuracy the outcomes of all types of cellular quality checks (>84% average accuracy with cross-validation). Copyright © 2015 The Society for Biotechnology, Japan. Published by Elsevier B.V. All rights reserved.
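The core of the orientation idea can be sketched with NumPy: estimate the image Hessian from finite differences, take the orientation of its dominant direction per pixel, and score heterogeneity as the entropy of the orientation histogram. This is a simplified reconstruction of the H-Orient concept under assumed formulas; the published algorithm's exact statistical profiling parameters are not given in the abstract.

```python
import numpy as np

def orientation_map(img):
    """Per-pixel orientation estimated from the image Hessian
    (finite-difference approximation)."""
    gy, gx = np.gradient(img.astype(float))
    gxy, gxx = np.gradient(gx)   # second derivatives of x-gradient
    gyy, _ = np.gradient(gy)     # second derivative along rows
    # Orientation of the dominant Hessian eigenvector, in (-pi/2, pi/2]
    return 0.5 * np.arctan2(2 * gxy, gxx - gyy)

def orientation_heterogeneity(img, bins=18):
    """Entropy of the orientation histogram as a heterogeneity score:
    0 for a perfectly uniform orientation field, larger for mixed ones."""
    theta = orientation_map(img)
    hist, _ = np.histogram(theta, bins=bins, range=(-np.pi / 2, np.pi / 2))
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log(p)).sum())
```

A linear intensity ramp yields a single orientation everywhere (score 0), while textured confluent-cell images would spread mass across orientation bins.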
Earth Observation Data Quality Monitoring and Control: A Case Study of STAR Central Data Repository
NASA Astrophysics Data System (ADS)
Han, W.; Jochum, M.
2017-12-01
Earth observation data quality is very important for researchers and decision makers involved in weather forecasting, severe weather warning, disaster and emergency response, environmental monitoring, etc. Monitoring and controlling earth observation data quality, especially accuracy, completeness, and timeliness, is very useful in data management and governance to optimize data flow, discover potential transmission issues, and better connect data providers and users. Taking a centralized near real-time satellite data repository, the STAR (Center for Satellite Applications and Research of NOAA) Central Data Repository (SCDR), as an example, this paper describes how to develop a new mechanism to verify data integrity, check data completeness, and monitor data latency in an operational data management system. Such quality monitoring and control of large-volume satellite data helps data providers and managers improve the transmission of near real-time satellite data, enhance its acquisition and management, and overcome performance and management issues to better serve research and development activities.
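The three checks named here reduce to simple primitives: integrity as a checksum comparison, completeness as a set intersection against an expected granule list, and latency as a timestamp difference. The sketch below is illustrative; the hash algorithm, granule naming, and latency unit are assumptions, not SCDR's actual implementation.

```python
import datetime
import hashlib

def verify_integrity(payload: bytes, expected_md5: str) -> bool:
    """Compare a received file's checksum against the provider's value."""
    return hashlib.md5(payload).hexdigest() == expected_md5

def completeness(received, expected):
    """Fraction of expected granules actually received."""
    return len(set(received) & set(expected)) / len(expected)

def latency_minutes(observed, received):
    """Minutes between observation time and arrival in the repository."""
    return (received - observed).total_seconds() / 60.0
```

Run periodically over an ingest log, these checks can surface transmission gaps and slowdowns before downstream users notice them.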
Braz, Nayara Felicidade Tomaz; Carneiro, Ana Paula Scalia; Avelar, Núbia Carelli Pereira de; Miranda, Aline Silva de; Lacerda, Ana Cristina Rodrigues; Teixeira, Mauro Martins; Teixeira, Antônio Lúcio; Mendonça, Vanessa Amaral
2016-03-01
The aim of the study was to evaluate the plasma levels of inflammatory mediators in subjects exposed to silica, with and without silicosis, compared with an unexposed control group, and to check the association of inflammatory mediators with pulmonary function, quality of life, functional capacity, and dyspnea grade. Inflammatory mediators were measured by enzyme-linked immunosorbent assay. There were 30 subjects exposed to silica and 24 in the control group. Interleukin-6 plasma levels were higher in subjects exposed to silica, with and without silicosis, than in the control group. There was a positive correlation between radiological severity and quality of life, whereas there was a negative correlation between radiological severity and pulmonary function. A negative correlation between sTNFR1 plasma level and functional capacity was found. Interleukin-10 was negatively correlated with the quality of life total score and positively correlated with functional capacity and pulmonary function.
Military Suicide Research Consortium
2014-10-01
increasing and decreasing (or even ceasing entirely) across different periods of time but still building on itself with each progressive episode...community from suicide. One study found that social norms, high levels of support, identification with role models, and high self-esteem help protect...in follow-up. o Conducted quality control checks of clinical data. Monitored safety, adverse events for DSMB reporting. Initiated Database
Financial Record Checking in Surveys: Do Prompts Improve Data Quality?
ERIC Educational Resources Information Center
Murphy, Joe; Rosen, Jeffrey; Richards, Ashley; Riley, Sarah; Peytchev, Andy; Lindblad, Mark
2016-01-01
Self-reports of financial information in surveys, such as wealth, income, and assets, are particularly prone to inaccuracy. We sought to improve the quality of financial information captured in a survey conducted by phone and in person by encouraging respondents to check records when reporting on income and assets. We investigated whether…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hollister, R
QC sample results (daily background check drum and 100-gram SGS check drum) were within acceptance criteria established by WIPP's Quality Assurance Objectives for TRU Waste Characterization. Replicate runs were performed on drum LL85501243TRU. Replicate measurement results are identical at the 95% confidence level as established by WIPP criteria. HWM NCAR No. 02-1000168 was issued on 17-Oct-2002 regarding a partially dislodged Cd sheet filter on the HPGe coaxial detector. This physical geometry occurred on 01-Oct-2002 and was not corrected until 10-Oct-2002, a period that includes the present batch run of drums. Per discussions among the Independent Technical Reviewer, Expert Reviewer and the Technical QA Supervisor, as well as in consultation with John Fleissner, Technical Point of Contact from Canberra, the analytical results are technically reliable. All QC standard runs during this period were in control. The data packet for SGS Batch 2002-13, generated using passive gamma-ray spectroscopy with the Pu Facility SGS unit, is technically reasonable. All QC samples are in compliance with established control limits. The batch data packet has been reviewed for correctness, completeness, consistency and compliance with WIPP's Quality Assurance Objectives and determined to be acceptable.
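The replicate criterion quoted here, two results "identical at the 95% confidence level", is typically operationalized as a normalized-error test: the difference must fall within the combined uncertainty at ~95% coverage. The sketch below assumes that form and hypothetical numbers; WIPP's exact formula is not reproduced in this summary.

```python
def replicates_agree(x1, u1, x2, u2, coverage=1.96):
    """True if two measurements agree within their combined standard
    uncertainties at ~95% coverage (normalized error <= 1)."""
    combined = (u1 ** 2 + u2 ** 2) ** 0.5
    return abs(x1 - x2) <= coverage * combined

# Hypothetical replicate assay results (value, standard uncertainty)
agree = replicates_agree(10.0, 0.5, 10.5, 0.5)
```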
NASA Astrophysics Data System (ADS)
Lee, Rena; Kim, Kyubo; Cho, Samju; Lim, Sangwook; Lee, Suk; Shim, Jang Bo; Huh, Hyun Do; Lee, Sang Hoon; Ahn, Sohyun
2017-11-01
This study applied statistical process control to set and verify a quality assurance (QA) tolerance standard suited to our hospital's characteristics, using criteria standards applied to all treatment sites in this analysis. The gamma test criterion for delivery quality assurance (DQA) was 3%/3 mm. Head and neck, breast, and prostate cases of intensity modulated radiation therapy (IMRT) or volumetric arc radiation therapy (VMAT) were selected for the analysis of the QA treatment sites. The numbers of data used in the analysis were 73 and 68 for head and neck patients; prostate and breast were 49 and 152, measured by MapCHECK and ArcCHECK respectively. The Cp values of head and neck and prostate QA were above 1.0, and Cpml was 1.53 and 1.71 respectively, which is close to the target value of 100%. The Cpml value of breast (IMRT) was 1.67, with data values close to the target value of 95%; however, the Cp value was 0.90, which means that the data values are widely distributed. Cp and Cpml of breast VMAT QA were 1.07 and 2.10 respectively. This suggests that the VMAT QA has better process capability than the IMRT QA. Consequently, we should pay more attention to planning and QA before treatment for breast radiotherapy.
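The capability indices quoted here can be computed as below. Cp is the standard two-sided index; for Cpml a one-sided, target-penalized form is assumed, since conventions for that symbol vary between texts. Data and limits are illustrative.

```python
import statistics

def cp(data, lsl, usl):
    """Two-sided process capability: spec width over 6-sigma spread."""
    return (usl - lsl) / (6 * statistics.stdev(data))

def cpml(data, lsl, target):
    """One-sided capability toward the lower limit, penalized for
    off-target centring (assumed form; definitions vary by source)."""
    m = statistics.mean(data)
    s = statistics.stdev(data)
    tau = (s ** 2 + (m - target) ** 2) ** 0.5
    return (m - lsl) / (3 * tau)

# Hypothetical gamma passing rates (%) with lower limit 90 and target 100
rates = [96.0, 98.0, 100.0, 102.0, 104.0]
```

A Cp below 1.0 with an acceptable Cpml, as reported for breast IMRT, indicates a centred but widely spread process.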
Kachani, Adriana Trejger; Barroso, Lucia Pereira; Brasiliano, Silvia; Cordás, Táki Athanássios; Hochgraf, Patrícia Brunfentrinker
2015-12-01
Compare inadequate eating behaviors and their relationship to body checking in three groups: patients with anorexia nervosa (AN), patients with bulimia nervosa (BN) and a control group (C). Eighty-three outpatients with eating disorders (ED) and 40 controls completed eating attitudes and body checking questionnaires. The overall relationship between eating attitude and body checking was statistically significant in all three groups: the worse the eating attitude, the greater the body checking behavior. However, when each group is examined individually, the relationship was only statistically significant in the AN group (r=.354, p=0.020). The lower the desired weight and the worse the eating attitude, the more people check themselves, although in the presence of an ED the relationship between body checking and food restriction is greater. In the AN subgroup, body checking is also related to continued dietary control. Copyright © 2015 Elsevier Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lawrence Livermore National Laboratory
2009-12-09
QC sample results (daily background checks, 20-gram and 100-gram SGS drum checks) were within the acceptance criteria established by WIPP's Quality Assurance Objectives for TRU Waste Characterization. Replicate runs were performed on 5 drums with IDs LL85101099TRU, LL85801147TRU, LL85801109TRU, LL85300999TRU and LL85500979TRU. All replicate measurement results are identical at the 95% confidence level as established by WIPP criteria. Note that the batch covered 5 weeks of SGS measurements from 23-Jan-2002 through 22-Feb-2002. The data packet for SGS Batch 2002-02, generated using gamma spectroscopy with the Pu Facility SGS unit, is technically reasonable. All QC samples are in compliance with established control limits. The batch data packet has been reviewed for correctness, completeness, consistency and compliance with WIPP's Quality Assurance Objectives and determined to be acceptable. An Expert Review was performed on the data packet between 28-Feb-02 and 09-Jul-02 to check for potential U-235, Np-237 and Am-241 interferences and to address drum cases where specific scan segments showed the gamma-ray transmission for the 136-keV gamma to be below 0.1%. Two drums in the batch showed Pu-238 at a relative mass ratio of more than 2% of all the Pu isotopes.
Control of crankshaft finish by scattering technique
NASA Astrophysics Data System (ADS)
Fontani, Daniela; Francini, Franco; Longobardi, Giuseppe; Sansoni, Paola
2001-06-01
The paper describes a new sensor dedicated to measuring and checking the surface quality of mechanical products. The results were obtained by comparing the light scattered over two different ranges of angles by means of 16 photodiodes. The device is designed to obtain valid data from curved surfaces such as that of a crankshaft. Experimental measurements show that the ratio between scattered and reflected light intensity increases with surface roughness. This device was developed for the off-tolerance detection of mechanical pieces in industrial production. Measurements of surface quality were carried out on crankshafts supplied by Renault.
Argo workstation: a key component of operational oceanography
NASA Astrophysics Data System (ADS)
Dong, Mingmei; Xu, Shanshan; Miao, Qingsheng; Yue, Xinyang; Lu, Jiawei; Yang, Yang
2018-02-01
Operational oceanography requires quantity, quality, and availability of data sets, together with timeliness and effectiveness of data products. Without a steady and robust operational system in support, operational oceanography cannot proceed far. In this paper we describe an integrated platform named Argo Workstation. It operates as a data processing and management system capable of data collection, automatic data quality control, visual data checking, statistical data search, and data service. Once set up, the Argo workstation provides high-quality global Argo data to users every day, timely and effectively. It has not only played a key role in operational oceanography but also serves as an example of an operational system.
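Typical automatic QC tests in such a platform include a global range test and a spike test on each profile. The sketch below follows the commonly published Argo real-time QC forms; the temperature limits and spike threshold are illustrative assumptions, since operational thresholds depend on variable and depth.

```python
def range_check(temps, tmin=-2.5, tmax=40.0):
    """Global range test for sea temperature; limits follow the commonly
    cited Argo real-time QC bounds (treat them as assumptions here)."""
    return [tmin <= t <= tmax for t in temps]

def spike_check(values, threshold=6.0):
    """Spike test: a point is flagged when its excursion from the
    neighbour midpoint exceeds half the neighbour difference by more
    than the threshold (Argo-style test value, simplified)."""
    flags = [False] * len(values)
    for i in range(1, len(values) - 1):
        test = (abs(values[i] - (values[i - 1] + values[i + 1]) / 2)
                - abs((values[i - 1] - values[i + 1]) / 2))
        if test > threshold:
            flags[i] = True
    return flags
```

Profiles failing automatic tests would then go to the visual data check stage the paper describes.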
NASA Technical Reports Server (NTRS)
Shewhart, Mark
1991-01-01
Statistical Process Control (SPC) charts are one of several tools used in quality control. Other tools include flow charts, histograms, cause and effect diagrams, check sheets, Pareto diagrams, graphs, and scatter diagrams. A control chart is simply a graph which indicates process variation over time. The purpose of drawing a control chart is to detect any changes in the process signalled by abnormal points or patterns on the graph. The Artificial Intelligence Support Center (AISC) of the Acquisition Logistics Division has developed a hybrid machine learning expert system prototype which automates the process of constructing and interpreting control charts.
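A minimal version of the control-chart computation such an expert system automates: estimate the centre line and 3-sigma limits from in-control baseline data, then flag points beyond the limits. This implements only the simplest out-of-control rule; run and trend patterns (Western Electric-style rules) are omitted, and the data are illustrative.

```python
import statistics

def control_limits(baseline, k=3.0):
    """Centre +/- k-sigma limits estimated from in-control baseline data."""
    mu = statistics.mean(baseline)
    sigma = statistics.stdev(baseline)
    return mu - k * sigma, mu + k * sigma

def out_of_control(points, lcl, ucl):
    """Indices of points beyond the control limits."""
    return [i for i, x in enumerate(points) if x < lcl or x > ucl]

# Hypothetical process measurements: baseline, then new observations
lcl, ucl = control_limits([10.0, 10.2, 9.8, 10.1, 9.9])
```

Interpreting *patterns* on the chart, not just single excursions, is where the machine learning component described above comes in.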
Strekalov performing maintenance on Core module control panel
1995-07-01
NM18-302-025 (March-July 1995) --- Onboard Mir's base block module cosmonaut Gennadiy M. Strekalov, flight engineer, prepares to check the air quality control and the propulsion system of the station. Strekalov told reporters at a July 18 press conference in Houston that even though he tried not to awaken astronaut Norman E. Thagard, who was asleep nearby, he was unable to keep from disturbing the cosmonaut researcher. He went on to point out that Thagard was always very cooperative and tolerant of such interruptions.
NASA Astrophysics Data System (ADS)
Daughtrey, E. Hunter; Adams, Jeffrey R.; Oliver, Karen D.; Kronmiller, Keith G.; McClenny, William A.
1998-09-01
A trailer-deployed automated gas chromatograph-mass spectrometer (autoGC-MS) system capable of making continuous hourly measurements was used to determine volatile organic compounds (VOCs) in ambient air at New Hendersonville, Tennessee, and Research Triangle Park, North Carolina, in 1995. The system configuration, including the autoGC-MS, trailer and transfer line, siting, and sampling plan and schedule, is described. The autoGC-MS system employs a pair of matched sorbent traps to allow simultaneous sampling and desorption. Desorption is followed by Stirling engine cryofocusing and subsequent GC separation and mass spectral identification and quantification. Quality control measurements described include evaluating precision and accuracy of replicate analyses of independently supplied audit and round-robin canisters and determining the completeness of the data sets taken in Tennessee. Data quality objectives for precision (±10%) and accuracy (±20%) of 10- to 20-ppbv audit canisters and a completeness of >75% data capture were met. Quality assurance measures used in reviewing the data set include retention time stability, calibration checks, frequency distribution checks, and checks of the mass spectra. Special procedures and tests were used to minimize sorbent trap artifacts, to verify the quality of a standard prepared in our laboratory, and to prove the integrity of the insulated, heated transfer line. A rigorous determination of total system blank concentration levels using humidified scientific air spiked with ozone allowed estimation of method detection limits, ranging from 0.01 to 1.0 ppb C, for most of the 100 target compounds, which were a composite list of the target compounds for the Photochemical Assessment Monitoring Station network, those for Environmental Protection Agency method TO-14, and selected oxygenated VOCs.
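The precision and accuracy data-quality objectives quoted here (±10% and ±20% on audit canisters) can be checked with two small functions. The replicate values and audit concentration below are illustrative, not the study's data.

```python
def precision_percent(replicates):
    """Percent relative standard deviation of replicate analyses."""
    mean = sum(replicates) / len(replicates)
    var = sum((x - mean) ** 2 for x in replicates) / (len(replicates) - 1)
    return 100.0 * var ** 0.5 / mean

def accuracy_percent(measured_mean, audit_value):
    """Percent bias relative to the audit canister's assigned value."""
    return 100.0 * (measured_mean - audit_value) / audit_value

# Hypothetical replicate results (ppbv) against a 10 ppbv audit canister
meets_dqo = (abs(precision_percent([9.8, 10.2, 10.0])) <= 10.0
             and abs(accuracy_percent(10.0, 10.0)) <= 20.0)
```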
Chang, Hyein; Shaw, Daniel S; Shelleby, Elizabeth C; Dishion, Thomas J; Wilson, Melvin N
2017-05-01
We examined the longitudinal effects of the Family Check-Up (FCU) intervention beginning in toddlerhood on children's peer preference at school-age. Specifically, a sequential mediational model was proposed in which the FCU was hypothesized to promote peer preference (i.e., higher acceptance and lower rejection by peers) in middle childhood through its positive effects on parent-child interaction and child effortful control in early childhood. Participants were 731 low-income families (49 % female). Qualities of parent-child interaction were observed during structured activities at 2 to 5 years, child effortful control was assessed using behavioral tasks at 5 years, and peer acceptance and rejection were rated by teachers at 7.5 to 10.5 years. Results indicated that the FCU indirectly predicted peer preference by sequentially improving parent-child interaction and child effortful control. The findings are discussed with respect to implications for understanding mechanisms by which early parenting-focused programs may enhance child functioning across time and context.
[Study of quality of a branch laboratory--an opinion of a laboratory manager].
Yazawa, Naoyuki
2006-11-01
At the stage of establishing a branch laboratory, quality evaluation is extremely difficult; even the results of a control survey by the headquarters of the branch laboratory are unhelpful. For a clinical laboratory, the most important function is to provide reliable data at all times and to maintain the confidence of clinical doctors through well-informed responses. We mostly refer to control surveys and daily quality control data to evaluate a clinical laboratory, but we rarely check its fundamental abilities, such as planning events, preserving statistical data about the standard range, and using the right methods for quality control. This is generally disregarded, and it is taken for granted that these will be correct from the start. From my six years of experience working with X's branch laboratory, I realized that there may be some relation between the quality of a branch laboratory and the fundamental abilities of the company itself. I would never argue that all branch laboratories are ineffective, but they should be conscious of these fundamental activities. The referring laboratory, not the referral laboratory, should be responsible for ensuring that the referral laboratory's examination results and findings are correct.
Salkovskis, Paul M; Millar, Josie; Gregory, James D; Wahl, Karina
2017-03-01
Repeated checking in OCD can be understood from a cognitive perspective as the motivated need to achieve certainty about the outcome of a potentially risky action, leading to the application of Elevated Evidence Requirements (EER) and overuse of subjective criteria. Twenty-four obsessional checkers, 22 anxious controls, and 26 non-clinical controls were interviewed about and rated recent episodes where they felt (a) they needed to check and (b) checked mainly out of habit (i.e. not obsessionally). Both subjective and objective criteria were rated as significantly more important by obsessional checkers than by controls; obsessional checkers also used more criteria overall for the termination of the check, and rated more criteria as "extremely important" than the control groups. The termination of the check was rated as more effortful by obsessional checkers than by the comparison groups. Analysis of the interview data was consistent with the ratings. Feelings of "rightness" were associated with the termination of a check for obsessional checkers but not for controls. Results were consistent with the proposal that the use of "just right" feelings to terminate checking is related to EER.
Tracing and control of raw materials sourcing for vaccine manufacturers.
Faretra Peysson, Laurence
2010-05-01
The control of the raw materials used to manufacture vaccines is mandatory; therefore, a very clear process must be in place to guarantee that raw materials are traced. Those who make products or supplies used in vaccine manufacture (suppliers of culture media, diagnostic tests, etc.) must apply quality systems proving that they adhere to certain standards. ISO certification, Good Manufacturing Practices for production sites and the registration of culture media with a 'Certificate of Suitability' from the European Directorate for the Quality of Medicines and Healthcare are reliable quality systems pertaining to vaccine production. Suppliers must assure that each lot of raw materials used in a product that will be used in vaccine manufacture adheres to the level of safety and traceability required. Incoming materials must be controlled in a single 'Enterprise Resource Planning' system which is used to document important information, such as the assignment of lot number, expiration date, etc. Ingredients for culture media in particular must conform to certain specifications. The specifications that need to be checked vary according to the ingredient, based on the level of risk. The way a raw material is produced is also important, and any aspect relative to cross-contamination, such as the sanitary measures used in producing and storing the raw material must be checked as well. In addition, suppliers can reduce the risk of viral contamination of raw materials by avoiding purchases in countries where a relevant outbreak is currently declared. 2010 The International Association for Biologicals. Published by Elsevier Ltd. All rights reserved.
Varet, Hugo; Brillet-Guéguen, Loraine; Coppée, Jean-Yves; Dillies, Marie-Agnès
2016-01-01
Several R packages exist for the detection of differentially expressed genes from RNA-Seq data. The analysis process includes three main steps, namely normalization, dispersion estimation and testing for differential expression. Quality control steps along this process are recommended but not mandatory, and failing to check the characteristics of the dataset may lead to spurious results. In addition, normalization methods and statistical models are not exchangeable across the packages without adequate transformations of which users are often unaware. Thus, dedicated analysis pipelines are needed to include systematic quality control steps and prevent errors arising from misuse of the proposed methods. SARTools is an R pipeline for differential analysis of RNA-Seq count data. It can handle designs involving two or more conditions of a single biological factor with or without a blocking factor (such as a batch effect or a sample pairing). It is based on DESeq2 and edgeR and is composed of an R package and two R script templates (for DESeq2 and edgeR, respectively). By tuning a small number of parameters and executing one of the R scripts, users have access to the full results of the analysis, including lists of differentially expressed genes and an HTML report that (i) displays diagnostic plots for quality control and model hypothesis checking and (ii) keeps track of the whole analysis process, parameter values and versions of the R packages used. SARTools provides systematic quality controls of the dataset as well as diagnostic plots that help to tune the model parameters. It gives access to the main parameters of DESeq2 and edgeR and prevents untrained users from misusing some functionalities of both packages. By keeping track of all the parameters of the analysis process, it fits the requirements of reproducible research.
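SARTools itself is an R pipeline built on DESeq2 and edgeR; as a language-neutral illustration of the normalization step it wraps, here is a minimal Python sketch of DESeq2-style median-of-ratios size factors (the function name and array shapes are assumptions for this sketch, not part of the package):

```python
import numpy as np

def size_factors(counts):
    """DESeq2-style median-of-ratios size factors.

    counts: (n_samples, n_genes) raw count matrix.
    Each sample is compared gene-by-gene to a pseudo-reference sample
    (the per-gene geometric mean); the size factor is the median ratio.
    """
    with np.errstate(divide="ignore"):
        logc = np.log(np.asarray(counts, dtype=float))
    ref = logc.mean(axis=0)              # log geometric mean per gene
    usable = np.isfinite(ref)            # skip genes with any zero count
    return np.exp(np.median(logc[:, usable] - ref[usable], axis=1))

# A sample sequenced exactly twice as deeply gets a size factor 2x larger:
counts = np.array([[10, 100, 20],
                   [20, 200, 40]])
sf = size_factors(counts)
```

Dividing each sample's counts by its size factor makes samples comparable before dispersion estimation and testing, which is why a QC check on these factors (e.g. flagging extreme values) comes first in such pipelines.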
Development of an analysis tool for cloud base height and visibility
NASA Astrophysics Data System (ADS)
Umdasch, Sarah; Reinhold, Steinacker; Manfred, Dorninger; Markus, Kerschbaum; Wolfgang, Pöttschacher
2014-05-01
The meteorological variables cloud base height (CBH) and horizontal atmospheric visibility (VIS) at surface level are of vital importance for safety and effectiveness in aviation. Around 20% of all civil aviation accidents in the USA from 2003 to 2007 were due to weather-related causes, around 18% of which were owing to decreased visibility or ceiling (main CBH). The aim of this study is to develop a system generating quality-controlled gridded analyses of the two parameters based on the integration of various kinds of observational data. Upon completion, the tool is planned to provide guidance for nowcasting during take-off and landing as well as for flights operated under visual flight rules. Primary input data consist of manual as well as instrumental observations of CBH and VIS. In Austria, restructuring of part of the standard meteorological stations from human observation to automatic measurement of VIS and CBH is currently in progress. As ancillary data, satellite-derived products can add 2-dimensional information, e.g. Cloud Type by NWC SAF (Nowcasting Satellite Application Facilities) MSG (Meteosat Second Generation). Other useful available data are meteorological surface measurements (in particular of temperature, humidity, wind and precipitation), radiosonde, radar and high-resolution topography data. A one-year data set is used to study the spatial and weather-dependent representativeness of the CBH and VIS measurements. The VERA (Vienna Enhanced Resolution Analysis) system of the Institute of Meteorology and Geophysics of the University of Vienna provides the framework for the analysis development. Its integrated "Fingerprint" technique allows the insertion of empirical prior knowledge and ancillary information in the form of spatial patterns. Prior to the analysis, a quality control of input data is performed. For CBH and VIS, quality control can consist of internal consistency checks between different data sources.
The possibility of two-dimensional consistency checks has to be explored. First results in the development of quality control features and fingerprints will be shown.
Upper-Air Quality Control, A Comparison Study
1993-12-01
The hydrostatic check is based on redundancy: the reported value at the point in question is interpolated from four increments in different quadrants. [A summary table of check statistics by region (Europe, Former USSR, Asia) is not recoverable from this excerpt.]
NASA Astrophysics Data System (ADS)
Carson, Richard T.; Mitchell, Robert Cameron
1993-07-01
This paper presents the findings of a study designed to determine the national benefits of freshwater pollution control. By using data from a national contingent valuation survey, we estimate the aggregate benefits of meeting the goals of the Clean Water Act. A valuation function is estimated which depicts willingness to pay as a function of water quality, income, and other variables. Several validation checks and tests for specific biases are performed, and the benefit estimates are corrected for missing and invalid responses. The two major policy implications from our work are that the benefits and costs of water pollution control efforts are roughly equal and that many of the new policy actions necessary to ensure that all water bodies reach at least a swimmable quality level will not have positive net benefits.
NASA Astrophysics Data System (ADS)
Toussaint, F.; Hoeck, H.; Stockhause, M.; Lautenschlager, M.
2014-12-01
The classical goals of a quality assessment system in the data life cycle are (1) to encourage data creators to improve their quality assessment procedures to reach the next quality level and (2) to enable data consumers to decide whether a dataset has a quality that is sufficient for usage in the target application, i.e. to appraise the data usability for their own purpose. As the data volumes of projects and the interdisciplinarity of data usage grow, the need for homogeneous structure and standardised notation of data and metadata increases. This third aspect is especially valid for data repositories, as they manage data through machine agents. Checks for homogeneity and consistency in early parts of the workflow thus become essential to cope with today's data volumes. Selected parts of the workflow in the model intercomparison project CMIP5, the archival of the data for the interdisciplinary user community of the IPCC-DDC AR5, and the associated quality checks are reviewed. We compare data and metadata checks and relate different types of checks to their positions in the data life cycle. The project's data citation approach is included in the discussion, with focus on the time necessary to comply with the project's requirements for formal data citations and the demand for the availability of such citations. In order to make the quality assessments of different projects comparable, WDCC developed a generic Quality Assessment System. Based on the self-assessment approach of a maturity matrix, an objective and uniform quality level system for all data at WDCC is derived, consisting of five maturity quality levels.
Parks, Donovan H.; Imelfort, Michael; Skennerton, Connor T.; Hugenholtz, Philip; Tyson, Gene W.
2015-01-01
Large-scale recovery of genomes from isolates, single cells, and metagenomic data has been made possible by advances in computational methods and substantial reductions in sequencing costs. Although this increasing breadth of draft genomes is providing key information regarding the evolutionary and functional diversity of microbial life, it has become impractical to finish all available reference genomes. Making robust biological inferences from draft genomes requires accurate estimates of their completeness and contamination. Current methods for assessing genome quality are ad hoc and generally make use of a limited number of “marker” genes conserved across all bacterial or archaeal genomes. Here we introduce CheckM, an automated method for assessing the quality of a genome using a broader set of marker genes specific to the position of a genome within a reference genome tree and information about the collocation of these genes. We demonstrate the effectiveness of CheckM using synthetic data and a wide range of isolate-, single-cell-, and metagenome-derived genomes. CheckM is shown to provide accurate estimates of genome completeness and contamination and to outperform existing approaches. Using CheckM, we identify a diverse range of errors currently impacting publicly available isolate genomes and demonstrate that genomes obtained from single cells and metagenomic data vary substantially in quality. In order to facilitate the use of draft genomes, we propose an objective measure of genome quality that can be used to select genomes suitable for specific gene- and genome-centric analyses of microbial communities. PMID:25977477
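CheckM's core quantities can be illustrated in simplified form (ignoring its lineage-specific, collocated marker sets, which is what distinguishes the tool from earlier approaches) with a short Python sketch; the function, marker names, and counts below are hypothetical:

```python
def genome_quality(marker_counts, markers):
    """Estimate completeness and contamination from single-copy marker genes.

    markers: marker genes expected exactly once in a complete genome.
    marker_counts: observed copy number of each marker in the draft genome.
    Completeness ~ fraction of markers found at least once; contamination
    ~ extra copies beyond one, relative to the number of markers.
    """
    present = sum(1 for m in markers if marker_counts.get(m, 0) >= 1)
    extra = sum(max(0, marker_counts.get(m, 0) - 1) for m in markers)
    n = len(markers)
    return 100.0 * present / n, 100.0 * extra / n

# Illustrative four-marker set: one marker missing, one duplicated.
markers = ["rpsC", "rpsE", "gyrA", "recA"]
comp, cont = genome_quality({"rpsC": 1, "rpsE": 2, "gyrA": 0}, markers)
```

A draft genome missing half its markers scores 50% complete, and each duplicated single-copy marker raises the contamination estimate, which is the intuition behind selecting genomes for downstream analyses by quality thresholds.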
Checking the possibility of controlling fuel element by X-ray computerized tomography
NASA Astrophysics Data System (ADS)
Trinh, V. B.; Zhong, Y.; Osipov, S. P.; Batranin, A. V.
2017-08-01
The article considers the possibility of checking fuel elements by X-ray computerized tomography. The checking tasks are based on the detection of particles of active material, evaluation of the heterogeneity of the distribution of uranium salts, and the detection of clusters of uranium particles. First, a scanning scheme that improves the performance and quality of the resulting three-dimensional images of the internal structure is determined. The possibility of detecting clusters of uranium particles 1 mm3 in size, and of measuring the coordinates of such clusters in the middle layer to within a voxel size (about 80 μm for the experiments considered), is then proved experimentally. Finally, the problem of estimating the heterogeneity of the distribution of the active material in the middle layer, and of detecting particles of active material with a nominal diameter of 0.1 mm in the "blank", is solved.
Check & Connect: The Importance of Relationships for Promoting Engagement with School
ERIC Educational Resources Information Center
Anderson, Amy R.; Christenson, Sandra L.; Sinclair, Mary F.; Lehr, Camilla A.
2004-01-01
The purpose of this study was to examine whether the closeness and quality of relationships between intervention staff and students involved in the Check & Connect program were associated with improved student engagement in school. Participants included 80 elementary and middle school students referred to the Check & Connect program for poor…
Rural-Urban Differences in Medicare Quality Outcomes and the Impact of Risk Adjustment.
Henning-Smith, Carrie; Kozhimannil, Katy; Casey, Michelle; Prasad, Shailendra; Moscovice, Ira
2017-09-01
There has been considerable debate in recent years about whether, and how, to risk-adjust quality measures for sociodemographic characteristics. However, geographic location, especially rurality, has been largely absent from the discussion. To examine differences by rurality in quality outcomes, and the impact of adjustment for individual and community-level sociodemographic characteristics on quality outcomes. The 2012 Medicare Current Beneficiary Survey, Access to Care module, combined with the 2012 County Health Rankings. All data used were publicly available, secondary data. We merged the 2012 Medicare Current Beneficiary Survey data with the 2012 County Health Rankings data using county of residence. We compared 6 unadjusted quality of care measures for Medicare beneficiaries (satisfaction with care, blood pressure checked, cholesterol checked, flu shot receipt, change in health status, and all-cause annual readmission) by rurality (rural noncore, micropolitan, and metropolitan). We then ran nested multivariable logistic regression models to assess the impact of adjusting for community and individual-level sociodemographic characteristics to determine whether these mediate the rurality difference in quality of care. The relationship between rurality and change in health status was mediated by the inclusion of community-level characteristics; however, adjusting for community and individual-level characteristics caused differences by rurality to emerge in 2 of the measures: blood pressure checked and cholesterol checked. For all quality scores, model fit improved after adding community and individual characteristics. Quality is multifaceted and is impacted by individual and community-level socio-demographic characteristics, as well as by geographic location. Current debates about risk-adjustment procedures should take rurality into account.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harpool, K; De La Fuente Herman, T; Ahmad, S
Purpose: To evaluate the performance of a two-dimensional (2D) array diode detector for geometric and dosimetric quality assurance (QA) tests of high-dose-rate (HDR) brachytherapy with an Ir-192 source. Methods: A phantom setup was designed that encapsulated a 2D array diode detector (MapCheck2) and a catheter for the HDR brachytherapy Ir-192 source. This setup was used to perform both geometric and dosimetric quality assurance for the HDR Ir-192 source. The geometric tests included (a) measurement of the position of the source and (b) spacing between different dwell positions. The dosimetric tests included (a) linearity of output with time, (b) end effect, and (c) relative dose verification. The 2D dose distribution measured with MapCheck2 was used to perform these tests, and the results were compared with the corresponding quality assurance tests performed with Gafchromic film and a well ionization chamber. Results: The position of the source and the spacing between different dwell positions were reproducible within 1 mm accuracy by measuring the position of maximal dose using MapCheck2, in contrast to the film, which showed a blurred image of the dwell positions due to limited film sensitivity to irradiation. The linearity of the dose with dwell times measured with MapCheck2 was superior to that measured with the ionization chamber, due to the higher signal-to-noise ratio of the diode readings. MapCheck2 provided a more accurate measurement of the end effect, with uncertainty < 1.5% in comparison with the ionization chamber uncertainty of 3%. Although MapCheck2 did not provide an absolute calibration dosimeter for the activity of the source, it provided an accurate tool for relative dose verification in HDR brachytherapy. Conclusion: The 2D array diode detector provides a practical, compact, and accurate tool to perform quality assurance for HDR brachytherapy with an Ir-192 source. The diodes in MapCheck2 have high radiation sensitivity and linearity, superior to the Gafchromic films and ionization chamber used for geometric and dosimetric QA in HDR brachytherapy, respectively.
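The end-effect test described above amounts to a linear fit of delivered dose against dwell time: the intercept, expressed as an equivalent dwell time, is the source-transit ("end") contribution. A minimal sketch with synthetic numbers (the 2 cGy/s rate and 0.3 s end effect are invented for illustration, not measured values from the study):

```python
import numpy as np

def end_effect(dwell_times, doses):
    # Fit D = a*t + b; the end effect is the equivalent extra dwell time
    # b/a contributed by source transit into and out of the dwell position.
    a, b = np.polyfit(dwell_times, doses, 1)
    return b / a, a

t = np.array([5.0, 10.0, 20.0, 40.0])   # programmed dwell times (s)
d = 2.0 * (t + 0.3)                     # synthetic doses: 2 cGy/s, 0.3 s end effect
dt_end, rate = end_effect(t, d)
```

With real detector readings the fit residuals also give a direct linearity check, which is why a high signal-to-noise detector tightens the end-effect uncertainty.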
Code of Federal Regulations, 2010 CFR
2010-07-01
... until the leak check is passed. Post-test leak check ≤4% of average sampling rate After sampling ** See... the test site. The sorbent media must be obtained from a source that can demonstrate the quality...-traceable calibration gas standards and reagents shall be used for the tests and procedures required under...
ERIC Educational Resources Information Center
GRITTNER, FRANK; PAVLAT, RUSSELL
IN ORDER TO ASSIST NON-TECHNICAL PEOPLE IN SCHOOLS TO CONDUCT A FIELD CHECK OF LANGUAGE LABORATORY EQUIPMENT BEFORE THEY MAKE FINAL PAYMENTS, THIS MANUAL OFFERS CRITERIA, TESTS, AND METHODS OF SCORING THE QUALITY OF THE EQUIPMENT. CHECKLISTS ARE PROVIDED FOR EVALUATING CONSOLE FUNCTIONS, TAPE RECORDERS, AMPLIFIERS, SOUND QUALITY (INCLUDING…
Bez, Yasin; Yesilova, Yavuz; Arı, Mustafa; Kaya, Mehmet Cemal; Alpak, Gokay; Bulut, Mahmut
2013-11-01
Acne is one of the most common dermatological diseases, and obsessive compulsive disorder is among the most frequent psychiatric conditions seen in dermatology clinics. Comorbidity of these conditions may therefore be expected. The aim of this study was to measure obsessive compulsive symptoms and quality of life in patients with acne vulgaris, compare them with those of healthy control subjects, and determine whether there is any predictive value of obsessive compulsive symptoms for quality of life in patients with acne. Obsessive compulsive symptoms and quality of life measurements of 146 patients with acne vulgaris and 94 healthy control subjects were made using the Maudsley Obsessive Compulsive Questionnaire and Short Form-36 in a cross-sectional design. Patients with acne vulgaris had lower scores for physical functioning, physical role dysfunction, general health perception, vitality, and emotional role dysfunction. They also had higher scores for checking, slowness, and rumination. The only predictor of physical functioning and vitality dimensions of health-related quality of life in these patients was rumination score. Obsessive compulsive symptoms in patients with acne vulgaris are higher than in controls, and this may correlate with both disease severity and quality of life for patients.
Design and performance of daily quality assurance system for carbon ion therapy at NIRS
NASA Astrophysics Data System (ADS)
Saotome, N.; Furukawa, T.; Hara, Y.; Mizushima, K.; Tansho, R.; Saraya, Y.; Shirai, T.; Noda, K.
2017-09-01
At the National Institute of Radiological Sciences (NIRS), we have been commissioning a rotating-gantry system for carbon-ion radiotherapy. This rotating gantry can transport heavy ions at 430 MeV/u to an isocenter and can rotate around the patient over irradiation angles of ±180°, so that the tumor can be irradiated from any direction. A three-dimensional pencil-beam scanning irradiation system equipped on the rotating gantry enables the optimal use of the physical characteristics of carbon ions to provide accurate treatment. To ensure treatment quality with such a complex system, the calibration of the primary dose monitor, output check, range check, dose rate check, machine safety check, and some mechanical tests should be performed efficiently. For this purpose, we have developed a measurement system dedicated to quality assurance (QA) of this gantry system: the Daily QA system. The system consists of an ionization chamber system and a scintillator system. The ionization chamber system is used for the calibration of the primary dose monitor, the output check, and the dose rate check, and the scintillator system is used for the range, isocenter, and gantry-angle checks. The performance of the Daily QA system was verified by a beam test. The stability of the output was within 0.5%, and the range was within 0.5 mm. The coincidence of the coordinates between the patient-positioning system and the irradiation system was verified using the Daily QA system. Our present findings verified that the new Daily QA system for a rotating gantry is capable of verifying the irradiation system with sufficient accuracy.
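The pass/fail logic of such a daily check reduces to comparing each measurement against its baseline with the stated tolerances (0.5% for output, 0.5 mm for range). A hedged sketch, with invented baseline and measured values:

```python
# Tolerances from the abstract: 0.5% output stability, 0.5 mm range.
TOLERANCES = {"output": 0.005, "range_mm": 0.5}

def daily_qa(measured, baseline):
    """Return per-item pass flags and an overall verdict."""
    results = {
        "output": abs(measured["output"] / baseline["output"] - 1.0)
                  <= TOLERANCES["output"],
        "range_mm": abs(measured["range_mm"] - baseline["range_mm"])
                    <= TOLERANCES["range_mm"],
    }
    return results, all(results.values())

baseline = {"output": 1.000, "range_mm": 150.0}   # illustrative reference values
ok_items, ok = daily_qa({"output": 1.003, "range_mm": 150.3}, baseline)
bad_items, bad = daily_qa({"output": 1.011, "range_mm": 150.3}, baseline)
```

Keeping the tolerances in one table mirrors QA practice: the daily procedure only reports against limits that were fixed at commissioning.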
ERIC Educational Resources Information Center
Mani, Bonnie G.
1995-01-01
In an Internal Revenue Service office using total quality management (TQM), the management development program uses Myers Briggs Type Indicator and Adjective Check List for manager self-assessment. Because management commitment is essential to TQM, the process is a way of enhancing leadership skills and demonstrating appreciation of diversity. (SK)
An index to quantify street cleanliness: the case of Granada (Spain).
Sevilla, Aitana; Rodríguez, Miguel Luis; García-Maraver, Angela; Zamorano, Montserrat
2013-05-01
Urban surfaces receive waste deposits from natural and human sources, which create a negative visual impact and are identified as potentially significant contributors to water and air pollution. Local councils are usually responsible for the sweep of roads and footpaths to keep the environment clean and free of litter. Quality controls are useful in order to check whether the services are being executed according to the quantity, quality and performance standards that are provided. In this sense, several factors might affect the efficiency of the management of cleaning and waste collection services; however, only a few contributions are available in the literature on the various aspects associated with the level of street cleanliness. In this paper, the suitability of a Cleanliness Index has been checked, for the case of Granada (South of Spain), in order to contribute to the proper management of public expenditure, improving the quality and cost of an essential service for any municipality. Results have concluded that the city exhibits a good level of cleanliness, although the standard of cleaning varied from one area of the city to another. The Cleaning Index fits well to the general situation of the different districts of Granada and thus, it could be considered a useful tool for measuring the level of cleanliness of the streets of the city and for evaluating the organization of the cleaning service, such that an outsourced company would not be responsible for controlling all the cleaning services. Copyright © 2013 Elsevier Ltd. All rights reserved.
Quality assessment concept of the World Data Center for Climate and its application to CMIP5 data
NASA Astrophysics Data System (ADS)
Stockhause, M.; Höck, H.; Toussaint, F.; Lautenschlager, M.
2012-08-01
The preservation of data in a high state of quality which is suitable for interdisciplinary use is one of the most pressing and challenging current issues in long-term archiving. For high-volume data such as climate model data, the data and data replicas are no longer stored centrally but distributed over several local data repositories, e.g. the data of the Climate Model Intercomparison Project Phase 5 (CMIP5). The most important part of the data is to be archived, assigned a DOI, and published according to the World Data Center for Climate's (WDCC) application of the DataCite regulations. The integral part of WDCC's data publication process, the data quality assessment, was adapted to the requirements of a federated data infrastructure. A concept of a distributed and federated quality assessment procedure was developed, in which the workload and responsibility for quality control is shared between the three primary CMIP5 data centers: Program for Climate Model Diagnosis and Intercomparison (PCMDI), British Atmospheric Data Centre (BADC), and WDCC. This distributed quality control concept, its pilot implementation for CMIP5, and first experiences are presented. The distributed quality control approach is capable of identifying data inconsistencies and of making quality results immediately available to data creators, data users, and data infrastructure managers. Continuous publication of new data versions and slow data replication prevent the quality control checks from reaching completion. This, together with ongoing developments of the data and metadata infrastructure, requires adaptations in the code and concept of the distributed quality control approach.
Maindal, Helle Terkildsen; Støvring, Henrik; Sandbaek, Annelli
2014-08-29
The periodic health check-up has been a fundamental part of routine medical practice for decades, despite a lack of consensus regarding its value in health promotion and disease prevention. A large-scale Danish population-based preventive programme 'Check your health' was developed based on available evidence of screening and successive accepted treatment, prevention for diseases and health promotion, and is closely aligned with the current health care system.The objective of the 'Check your health' [CORE] trial is to investigate effectiveness on health outcomes of a preventive health check offered at a population-level to all individuals aged 30-49 years, and to establish the cost-effectiveness. The trial will be conducted as a pragmatic household-cluster randomised controlled trial involving 10,505 individuals. All individuals within a well-defined geographical area in the Central Denmark Region, Denmark (DK) were randomised to be offered a preventive health check (Intervention group, n = 5250) or to maintain routine access to healthcare until a delayed intervention (Comparison group, n = 5255). The programme consists of a health examination which yields an individual risk profile, and according to this participants are assigned to one of the following interventions: (a) referral to a health promoting consultation in general practice, (b) behavioural programmes at the local Health Centre, or (c) no need for follow-up.The primary outcomes at 4 years follow-up are: ten-year-risk of fatal cardiovascular event (Heart-SCORE model), physical activity level (self-report and cardiorespiratory fitness), quality of life (SF12), sick leave and labour market attachment. Cost-effectiveness will be evaluated according to life years gained, direct costs and total health costs. Intention to treat analysis will be performed. 
Results from the largest Danish health check programme conducted within the current healthcare system, spanning the sectors that share responsibility for the individual, will provide a scientific basis for developing systems to optimise population health in the 21st century. The trial is registered at ClinicalTrials.gov (ID: NCT02028195; 7 March 2014).
Charnock, P; Jones, R; Fazakerley, J; Wilde, R; Dunn, A F
2011-09-01
Data are currently being collected from hospital radiology information systems in the North West of the UK for the purposes of both clinical audit and patient dose audit. Could these data also be used to satisfy quality assurance (QA) requirements according to UK guidance? From 2008 to 2009, 731 653 records were submitted from 8 hospitals in North West England. For automatic exposure control (AEC) QA, the protocol from Institute of Physics and Engineering in Medicine (IPEM) report 91 recommends that milliampere seconds (mAs) can be monitored for repeatability and reproducibility using a suitable phantom, at 70-81 kV. Abdomen AP and chest PA examinations were analysed to find the most common kilovoltage used; these records were then used to plot average monthly mAs over time. IPEM report 91 also recommends that a range of commonly used clinical settings is used to check output reproducibility and repeatability. For each tube, the dose area product values were plotted over time for the two most common exposure factor sets. Results show that it is possible to do performance checks of AEC systems; however, more work is required before tube output performance can be monitored. Procedurally, the management system requires work, and the benefits to the workflow would need to be demonstrated.
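One way to use such RIS exposure records for AEC constancy checking is to trend the monthly mean mAs for a fixed examination and kV against an overall baseline and flag drifts. A minimal sketch (the 10% tolerance and the records are placeholders, not values from IPEM report 91 or the study):

```python
from statistics import mean

def monthly_drift_flags(records, tolerance=0.10):
    """records: (month, mAs) pairs pulled from a RIS for one exam/kV combination.

    Flags months whose mean mAs deviates from the all-data baseline by
    more than `tolerance` (fractional).
    """
    by_month = {}
    for month, mas in records:
        by_month.setdefault(month, []).append(mas)
    baseline = mean(mas for _, mas in records)
    return {month: abs(mean(values) / baseline - 1.0) > tolerance
            for month, values in by_month.items()}

records = [("2009-01", 10.0), ("2009-02", 10.0), ("2009-03", 13.0)]
flags = monthly_drift_flags(records)
```

Because clinical records mix patient sizes and techniques, restricting to one view and the most common kV (as the abstract does) is what makes this trend interpretable at all.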
Improving the quality of marine geophysical track line data: Along-track analysis
NASA Astrophysics Data System (ADS)
Chandler, Michael T.; Wessel, Paul
2008-02-01
We have examined 4918 track line geophysics cruises archived at the U.S. National Geophysical Data Center (NGDC) using comprehensive error checking methods. Each cruise was checked for observation outliers, excessive gradients, metadata consistency, and general agreement with satellite altimetry-derived gravity and predicted bathymetry grids. Thresholds for error checking were determined empirically through inspection of histograms for all geophysical values, gradients, and differences with gridded data sampled along ship tracks. Robust regression was used to detect systematic scale and offset errors found by comparing ship bathymetry and free-air anomalies to the corresponding values from global grids. We found many recurring error types in the NGDC archive, including poor navigation, inappropriately scaled or offset data, excessive gradients, and extended offsets in depth and gravity when compared to global grids. While ~5-10% of bathymetry and free-air gravity records fail our conservative tests, residual magnetic errors may exceed twice this proportion. These errors hinder the effective use of the data and may lead to mistakes in interpretation. To enable the removal of gross errors without overwriting original cruise data, we developed an errata system that concisely reports all errors encountered in a cruise. With such errata files, scientists may share cruise corrections, thereby preventing redundant processing. We have implemented these quality control methods in the modified MGD77 supplement to the Generic Mapping Tools software suite.
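The along-track screening described above, empirical gradient thresholds plus a robust fit of ship observations against gridded reference values, can be sketched as follows. The function names, the threshold handling, and the simple Theil-Sen-style estimator are illustrative assumptions, not the actual GMT/mgd77 implementation:

```python
import numpy as np

def flag_gradient_outliers(values, distance_km, max_grad):
    """Flag records whose along-track gradient exceeds an empirically
    chosen threshold (both endpoints of a bad segment are flagged)."""
    grad = np.abs(np.diff(values) / np.diff(distance_km))
    bad = np.zeros(len(values), dtype=bool)
    bad[1:] |= grad > max_grad
    bad[:-1] |= grad > max_grad
    return bad

def robust_scale_offset(ship, grid):
    """Estimate a systematic scale/offset between ship observations and
    values sampled from a global grid, using a Theil-Sen-style fit
    (median of pairwise slopes) that resists outliers."""
    i, j = np.triu_indices(len(ship), k=1)
    dx = ship[j] - ship[i]
    dy = grid[j] - grid[i]
    ok = dx != 0
    scale = np.median(dy[ok] / dx[ok])
    offset = np.median(grid - scale * ship)
    return scale, offset
```

A cruise whose robust scale is far from 1 (or offset far from 0) would then be a candidate for an errata entry rather than an in-place correction.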
Boggia, Raffaella; Turrini, Federica; Anselmo, Marco; Zunin, Paola; Donno, Dario; Beccaro, Gabriele L
2017-07-01
Bud extracts, also named "gemmoderivatives", are a new category of natural products obtained by macerating fresh meristematic tissues of trees and plants. In the European Community these botanical remedies are classified as plant food supplements. These products are still poorly studied, even though they are widely used and commercialized. Analytical tools for the quality control of these very expensive supplements are urgently needed in order to avoid mislabelling and fraud. In fact, besides the usual quality controls common to other botanical dietary supplements, these extracts should be checked in order to quickly detect whether the cheaper adult parts of the plants have been deceptively used in place of the corresponding buds, whose harvest period and production are extremely limited. This study aims to provide a screening analytical method based on UV-VIS fluorescence spectroscopy coupled with multivariate analysis for rapid, inexpensive and non-destructive quality control of these products.
Getting Ready for Inspection of Investigational Site at Short Notice
Talele, Rajendra
2010-01-01
India is becoming an attractive destination for drug development and clinical research, as evidenced by the threefold increase in clinical trial applications over the last four years to the office of the Drugs Controller General of India (DCGI). This upward trend reflects the collaborative efforts of all stakeholders and the quality of Indian data. To sustain it, stakeholders such as regulators, sponsors, CROs, monitors, investigators and trial subjects must maintain high standards in the data and conduct of clinical trials. Indian regulations and the role of the DCGI in quality checks for Indian clinical trials have always been a topic of discussion in various forums. A recent move by the DCGI to conduct random inspections of investigational sites and companies at short notice, check their compliance with the guidelines, and take action against non-compliers is welcome. This will certainly increase the overall quality of clinical trials. The quality of clinical trial conduct is measured by the appropriateness and correctness of the essential documents. It is observed that stakeholders engaged in multitasking often overlook the requirements or appropriateness of a document because of their focused approach on a specific activity that is on priority. This can lead to serious quality problems and issues. Understanding the process and the documents reviewed by the auditor is important to maintain such high quality. Proper planning and time management in working on essential documents can minimise quality issues, so that we are always ready for any type of inspection: announced, unannounced, or at "short notice". PMID:21829785
Bodner, Martin; Bastisch, Ingo; Butler, John M; Fimmers, Rolf; Gill, Peter; Gusmão, Leonor; Morling, Niels; Phillips, Christopher; Prinz, Mechthild; Schneider, Peter M; Parson, Walther
2016-09-01
The statistical evaluation of autosomal Short Tandem Repeat (STR) genotypes is based on allele frequencies. These are empirically determined from sets of randomly selected human samples, compiled into STR databases that have been established in the course of population genetic studies. There is currently no agreed procedure of performing quality control of STR allele frequency databases, and the reliability and accuracy of the data are largely based on the responsibility of the individual contributing research groups. It has been demonstrated with databases of haploid markers (EMPOP for mitochondrial mtDNA, and YHRD for Y-chromosomal loci) that centralized quality control and data curation is essential to minimize error. The concepts employed for quality control involve software-aided likelihood-of-genotype, phylogenetic, and population genetic checks that allow the researchers to compare novel data to established datasets and, thus, maintain the high quality required in forensic genetics. Here, we present STRidER (http://strider.online), a publicly available, centrally curated online allele frequency database and quality control platform for autosomal STRs. STRidER expands on the previously established ENFSI DNA WG STRbASE and applies standard concepts established for haploid and autosomal markers as well as novel tools to reduce error and increase the quality of autosomal STR data. The platform constitutes a significant improvement and innovation for the scientific community, offering autosomal STR data quality control and reliable STR genotype estimates. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
Quality control in the year 2000.
Schade, B
1992-01-01
'Just-in-time' production is a prerequisite for a company to meet the challenges of competition. Manufacturing cycles have been so successfully optimized that release time now has become a significant factor. A vision for a major quality-control (QC) contribution to profitability in this decade seems to be the just-in-time release. Benefits will go beyond cost savings for lower inventory. The earlier detection of problems will reduce rejections and scrap. In addition, problem analysis and problem-solving will be easier. To achieve just-in-time release, advanced automated systems like robots will become the workhorses in QC for high volume pharmaceutical production. The requirements for these systems are extremely high in terms of quality, reliability and ruggedness. Crucial for the success might be advances in use of microelectronics for error checks, system recording, trouble shooting, etc. as well as creative new approaches (for example the use of redundant assay systems).
Quality assurance of the international computerised 24 h dietary recall method (EPIC-Soft).
Crispim, Sandra P; Nicolas, Genevieve; Casagrande, Corinne; Knaze, Viktoria; Illner, Anne-Kathrin; Huybrechts, Inge; Slimani, Nadia
2014-02-01
The interview-administered 24 h dietary recall (24-HDR) EPIC-Soft® has a series of controls to guarantee the quality of dietary data across countries. These comprise all steps that are part of fieldwork preparation, data collection and data management; however, a complete characterisation of these quality controls is still lacking. The present paper describes in detail the quality controls applied in EPIC-Soft, which are, to a large extent, built on the basis of the EPIC-Soft error model and are present in three phases: (1) before, (2) during and (3) after the 24-HDR interviews. Quality controls for consistency and harmonisation are implemented before the interviews while preparing the seventy databases constituting an EPIC-Soft version (e.g. pre-defined and coded foods and recipes). During the interviews, EPIC-Soft uses a cognitive approach by helping the respondent to recall the dietary intake information in a stepwise manner and includes controls for consistency (e.g. probing questions) as well as for completeness of the collected data (e.g. system calculation for some unknown amounts). After the interviews, a series of controls can be applied by dietitians and data managers to further guarantee data quality. For example, the interview-specific 'note files' that were created to track any problems or missing information during the interviews can be checked to clarify the information initially provided. Overall, the quality controls employed in the EPIC-Soft methodology are not always perceivable, but prove to be of assistance for its overall standardisation and possibly for the accuracy of the collected data.
Fernández-Sanjuan, María; Lacorte, Silvia; Rigol, Anna; Sahuquillo, Angels
2012-11-01
The determination of alkylphenols in sewage sludge is still hindered by the complexity of the matrix and of the analytes, some of which are a mixture of isomers. Most of the methods published in the literature have not been validated, due to the lack of reference materials for the determination of alkylphenols in sludge. Given this situation, the objectives of the present study were to develop a new quality-control material for determining octylphenol, nonylphenol and nonylphenol monoethoxylate in sludge. The material was prepared from an anaerobically digested sewage sludge, which was thermally dried, sieved, homogenized and bottled after checking for the bulk homogeneity of the processed material. Together with the sewage sludge, an extract was also prepared, in order to provide a quality-control material for allowing laboratories to test the measuring step. The homogeneity and 1-year stability of the two materials were evaluated. Statistical analysis proved that the materials were homogeneous and stable for at least 1 year stored at different temperatures. These materials are intended to assist in the quality control of the determination of alkylphenols and alkylphenol ethoxylates in sewage sludge.
Statistical process control analysis for patient-specific IMRT and VMAT QA.
Sanghangthum, Taweap; Suriyapee, Sivalee; Srisatit, Somyot; Pawlicki, Todd
2013-05-01
This work applied statistical process control to establish the control limits of the % gamma pass of patient-specific intensity modulated radiotherapy (IMRT) and volumetric modulated arc therapy (VMAT) quality assurance (QA), and to evaluate the efficiency of the QA process by using the process capability index (Cpml). A total of 278 IMRT QA plans in nasopharyngeal carcinoma were measured with MapCHECK, while 159 VMAT QA plans were undertaken with ArcCHECK. Six megavolts with nine fields were used for the IMRT plans, and 2.5 arcs were used to generate the VMAT plans. The gamma (3%/3 mm) criteria were used to evaluate the QA plans. The % gamma passes were plotted on a control chart. The first 50 data points were employed to calculate the control limits. The Cpml was calculated to evaluate the capability of the IMRT/VMAT QA process. The results showed higher systematic errors in IMRT QA than VMAT QA due to the more complicated setup used in IMRT QA. The variation of random errors was also larger in IMRT QA than VMAT QA because the VMAT plan has more continuity of dose distribution. The average % gamma pass was 93.7% ± 3.7% for IMRT and 96.7% ± 2.2% for VMAT. The Cpml value of IMRT QA was 1.60 and VMAT QA was 1.99, which implied that the VMAT QA process was more accurate than the IMRT QA process. Our lower control limit for % gamma pass of IMRT is 85.0%, while the limit for VMAT is 90%. Both the IMRT and VMAT QA processes are of good quality because the Cpml values are higher than 1.0.
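The control-chart and capability calculations described above can be sketched as follows. The ±3σ individuals-chart limits follow the standard SPC recipe; the Cpml expression shown is one common one-sided form of the Cpm-style index (lower specification limit LSL, target T) and may differ in detail from the formula used in the study:

```python
import numpy as np

def control_limits(baseline):
    """Individuals-chart limits (mean +/- 3 sigma) from a baseline sample,
    e.g. the first 50 QA plans."""
    mu = np.mean(baseline)
    sigma = np.std(baseline, ddof=1)  # sample standard deviation
    return mu - 3 * sigma, mu + 3 * sigma

def cpml(data, lsl, target=100.0):
    """One-sided process capability index against a lower specification
    limit; penalises both spread and deviation from the target (here a
    100% gamma pass). One common form of the Cpm-style index."""
    mu = np.mean(data)
    sigma = np.std(data, ddof=1)
    return (mu - lsl) / (3.0 * np.sqrt(sigma**2 + (mu - target)**2))
```

A value above 1.0, as reported for both QA processes, indicates the process comfortably clears the lower specification limit.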
ERIC Educational Resources Information Center
Chauhan, U.; Kontopantelis, E.; Campbell, S.; Jarrett, H.; Lester, H.
2010-01-01
Background: Routine health checks have gained prominence as a way of detecting unmet need in primary care for adults with intellectual disabilities (ID) and general practitioners are being incentivised in the UK to carry out health checks for many conditions through an incentivisation scheme known as the Quality and Outcomes Framework (QOF).…
The IEO Data Center Management System: Tools for quality control, analysis and access marine data
NASA Astrophysics Data System (ADS)
Casas, Antonia; Garcia, Maria Jesus; Nikouline, Andrei
2010-05-01
Since 1994, the Data Centre of the Spanish Oceanographic Institute (IEO) has been developing systems for archiving and quality control of oceanographic data. The work started in the frame of the European Marine Science & Technology Programme (MAST), when a consortium of several Mediterranean data centres began to work on the MEDATLAS project. Over the years, old software modules for MS-DOS were rewritten, improved and migrated to the Windows environment. Oceanographic data quality control now includes not only vertical profiles (mainly CTD and bottle observations) but also time series of current and sea level observations. New powerful routines for analysis and graphic visualization were added. Data presented originally in ASCII format were recently organized in an open-source MySQL database. Nowadays the IEO, as part of the SeaDataNet infrastructure, has designed and developed a new information system, consistent with the ISO 19115 and SeaDataNet standards, in order to manage the large and diverse marine data and information originated in Spain by different sources, and to interoperate with SeaDataNet. The system works with data stored in ASCII files (MEDATLAS, ODV) as well as data stored within the relational database. The components of the system are: 1. MEDATLAS format and quality control. - QCDAMAR: quality control of marine data, the main set of tools for working with data presented as text files; includes extended quality control (searching for duplicated cruises and profiles; checking date, position, ship velocity, constant profiles, spikes, density inversion, sounding, acceptable data and impossible regional values) and input/output filters. - QCMareas: a set of procedures for the quality control of tide gauge data according to the standard international Sea Level Observing System; these procedures include checking for unexpected anomalies in the time series, interpolation, filtering, and computation of basic statistics and residuals. 2. DAMAR: a relational database (MySQL) designed to manage the wide variety of marine information, such as common vocabularies, catalogues (CSR & EDIOS), data and metadata. 3. Other tools for analysis and data management. - Import_DB: a script to import data and metadata from the MEDATLAS ASCII files into the database. - SelDamar/Selavi: an interface to the database for local and web access; allows selective retrievals applying criteria introduced by the user, such as geographical bounds, data responsible, cruises, platform and time periods; also includes statistical reference-value calculation and plotting of original and mean profiles together with vertical interpolation. - ExtractDAMAR: a script to extract data archived in ASCII files that meet the criteria of a user request through the SelDamar interface and export them in ODV format, also performing unit conversion.
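One of the MEDATLAS-style checks listed above, flagging pairs of consecutive stations that imply an impossible ship velocity, might be sketched as follows. The 15-knot default and the tuple layout are illustrative assumptions, not the IEO implementation:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between two (lat, lon) points in degrees."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = p2 - p1
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def flag_impossible_speed(stations, max_knots=15.0):
    """Flag each leg between consecutive stations whose implied ship speed
    exceeds max_knots, or whose timestamps are not increasing.
    stations: list of (lat_deg, lon_deg, time_hours) tuples."""
    flags = []
    for a, b in zip(stations, stations[1:]):
        dist_nm = haversine_km(a[0], a[1], b[0], b[1]) / 1.852  # km -> nautical miles
        dt = b[2] - a[2]
        flags.append(dt <= 0 or dist_nm / dt > max_knots)
    return flags
```

The same pattern (compute a derived quantity per record or per pair, compare against an empirical limit) covers the spike, gradient and density-inversion checks as well.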
Riazuddin, -; Khan, Sarzamin; Imtiaz, Naila; Aslam, Saima; Ahmad, Shakoor; Khan, Hamayun; Rabbani, Masood; Muhammad, Javed; Tanveer, Zafar Iqbal; Ali, Sakhawat
2017-03-01
The present study was conducted to investigate the quality and efficacy of commercially available preparations of tylosin and doxycycline in the local market at Peshawar for poultry. In vitro and in vivo tests were conducted to check the quality of these antimicrobial drugs. The in vitro quality control test was performed by high-performance liquid chromatography (HPLC) and the microdilution method. In vivo, the efficacy of the test drugs was checked in broilers infected with Mycoplasma gallisepticum. The HPLC results indicated that test drug 2 contained doxycycline hydrochloride within the specified limits but a high quantity of the active ingredient tylosin tartrate (120%). The recovery percentages of test drugs 3, 4 and 5 were below the pharmacopoeial limit, containing low quantities of tylosin tartrate (85%, 87.5% and 85%, respectively); however, the percent recovery of doxycycline was within the appropriate limits. All the tested drugs were effective against Mycoplasma gallisepticum and showed a minimum inhibitory concentration (MIC) of 1.9 μg/ml. The in vivo results indicated that all tested drugs decreased morbidity and mortality in infected chicks. The birds treated with test drugs 3 and 5 showed a mortality of 9.5%, which was slightly higher than in the other test groups. The current study suggests that substandard drugs are in circulation in Pakistan and that the drug regulatory authorities should take strict action against the manufacturing companies concerned.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Covington, E; Younge, K; Chen, X
Purpose: To evaluate the effectiveness of an automated plan check tool to improve first-time plan quality as well as standardize and document the performance of physics plan checks. Methods: The Plan Checker Tool (PCT) uses the Eclipse Scripting API to check and compare data from the treatment planning system (TPS) and treatment management system (TMS). PCT was created to improve first-time plan quality, reduce patient delays, increase the efficiency of our electronic workflow, and standardize and partially automate plan checks in the TPS. A framework was developed which can be configured with different reference values and types of checks. One example is the prescribed dose check, where PCT flags the user when the planned dose and the prescribed dose disagree. PCT includes a comprehensive checklist of automated and manual checks that are documented when performed by the user. A PDF report is created and automatically uploaded into the TMS. Prior to and during PCT development, errors caught during plan checks and patient delays were tracked in order to prioritize which checks should be automated, and the most common and significant errors were determined. Results: Nineteen of 33 checklist items were automated, with data extracted with the PCT. These include checks for prescription, reference point and machine scheduling errors, which are three of the top six causes of patient delays related to physics and dosimetry. Since the clinical roll-out, no delays have been due to errors that are automatically flagged by the PCT. Development continues to automate the remaining checks. Conclusion: With PCT, 57% of the physics plan checklist has been partially or fully automated. Treatment delays have declined since the release of the PCT for clinical use. By tracking delays and errors, we have been able to measure the effectiveness of automating checks and are using this information to prioritize future development. This project was supported in part by P01CA059827.
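A configurable check framework of the kind described, comparing TPS and TMS values and producing a named pass/fail report, could be sketched as below. The field names and the 0.01 Gy tolerance are hypothetical; the real PCT is built on the Eclipse Scripting API rather than plain dictionaries:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Check:
    name: str
    passed: Callable[[dict, dict], bool]

def run_checks(tps: dict, tms: dict, checks: list) -> dict:
    """Run each automated check against planning-system (TPS) and
    management-system (TMS) data; the name -> pass/fail report could
    then be rendered to PDF and uploaded to the TMS."""
    return {c.name: bool(c.passed(tps, tms)) for c in checks}

# Hypothetical example checks, configured with reference values.
checks = [
    Check("prescribed dose matches planned dose",
          lambda tps, tms: abs(tps["planned_dose_gy"] - tms["prescribed_dose_gy"]) < 0.01),
    Check("planned machine matches scheduled machine",
          lambda tps, tms: tps["machine"] == tms["machine"]),
]
```

Keeping each check as a named, data-driven object is what makes it easy to add checks incrementally and to document which ones ran in the generated report.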
Power quality analysis of DC arc furnace operation using the Bowman model for electric arc
NASA Astrophysics Data System (ADS)
Gherman, P. L.
2018-01-01
This work concerns a relatively new domain: the DC electric arc furnace, which is superior to the AC electric arc furnace and is not yet used in Romania. We therefore analysed the operation of these furnaces by simulation and by checking the simulation results against the model. The conclusions are favourable; the work to be carried forward is to develop a real-time control system for the steel elaboration process.
Kraus, Nicole; Lindenberg, Julia; Zeeck, Almut; Kosfelder, Joachim; Vocks, Silja
2015-09-01
Cognitive-behavioural models of eating disorders state that body checking arises in response to negative emotions in order to reduce the aversive emotional state and is therefore negatively reinforced. This study empirically tests this assumption. For a seven-day period, women with eating disorders (n = 26) and healthy controls (n = 29) were provided with a handheld computer for assessing occurring body checking strategies as well as negative and positive emotions. Serving as control condition, randomized computer-emitted acoustic signals prompted reports on body checking and emotions. There was no difference in the intensity of negative emotions before body checking and in control situations across groups. However, from pre- to post-body checking, an increase in negative emotions was found. This effect was more pronounced in women with eating disorders compared with healthy controls. Results are contradictory to the assumptions of the cognitive-behavioural model, as body checking does not seem to reduce negative emotions. Copyright © 2015 John Wiley & Sons, Ltd and Eating Disorders Association.
Day, Niamh; Criss, Joshua; Griffiths, Benjamin; Gujral, Shireen Kaur; John-Leader, Franklin; Johnston, Jennifer; Pit, Sabrina
2018-01-05
Drug checking is a harm reduction strategy which allows users to check the content and purity of illicit drugs. Although drug checking has been trialled internationally, with demonstrated value as a harm reduction and health promotion strategy, the use of such services in Australia remains a contentious issue. This study aimed to investigate the proportion and patterns of illicit drug use among young people, their attitudes towards drug checking at festivals and the potential impact of drug checking on intended drug use behaviour. The survey was conducted at a major Australian music festival in 2016. Data was collected from a sample of festival attendees (n = 642) aged between 18 and 30 years. A descriptive analysis of the data was performed. Nearly three-quarters (73.4%) of participants reported that they had used illicit drugs in the past 12 months, most commonly cannabis (63.9%) and ecstasy (59.8%). A large proportion of participants believed 'somewhat' or 'a lot' that drug checking services could help users seek help to reduce harm (86.5%) and that drug checking services should be combined with harm reduction advice (84.9%). However, two thirds of the participants agreed 'somewhat' or 'a lot' that drug sellers may use this service as a quality control mechanism (68.6%). Approximately half (54.4%) indicated they would be highly likely and a third (32.7%) would be somewhat likely to utilise free drug checking services should they be available at music festivals. When asked whether the results of drug checking would influence their drug use behaviour, participants reported that they would not take substances shown to contain methamphetamine (65.1%), ketamine (57.5%) or para-methoxyamphetamine (PMA) (58.4%). The majority of festival attendees aged 18-30 participating in this study reported a history of illicit drug use and were in favour of the provision of free drug checking at festivals. 
A considerable proportion reported that the results of drug checking would influence their drug use behaviour. The findings of this study can contribute to the debate regarding whether drug checking services could potentially play a major role in harm reduction and health promotion programming for young people attending festivals.
End-user perspective of low-cost sensors for outdoor air pollution monitoring.
Rai, Aakash C; Kumar, Prashant; Pilla, Francesco; Skouloudis, Andreas N; Di Sabatino, Silvana; Ratti, Carlo; Yasar, Ansar; Rickerby, David
2017-12-31
Low-cost sensor technology can potentially revolutionise the area of air pollution monitoring by providing high-density spatiotemporal pollution data. Such data can be utilised for supplementing traditional pollution monitoring, improving exposure estimates, and raising community awareness about air pollution. However, data quality remains a major concern that hinders the widespread adoption of low-cost sensor technology. Unreliable data may mislead unsuspecting users and potentially lead to alarming consequences such as reporting acceptable air pollutant levels when they are above the limits deemed safe for human health. This article provides scientific guidance to the end-users for effectively deploying low-cost sensors for monitoring air pollution and people's exposure, while ensuring reasonable data quality. We review the performance characteristics of several low-cost particle and gas monitoring sensors and provide recommendations to end-users for making proper sensor selection by summarizing the capabilities and limitations of such sensors. The challenges, best practices, and future outlook for effectively deploying low-cost sensors, and maintaining data quality are also discussed. For data quality assurance, a two-stage sensor calibration process is recommended, which includes laboratory calibration under controlled conditions by the manufacturer supplemented with routine calibration checks performed by the end-user under final deployment conditions. For large sensor networks where routine calibration checks are impractical, statistical techniques for data quality assurance should be utilised. Further advancements and adoption of sophisticated mathematical and statistical techniques for sensor calibration, fault detection, and data quality assurance can indeed help to realise the promised benefits of a low-cost air pollution sensor network. Copyright © 2017 Elsevier B.V. All rights reserved.
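The second calibration stage recommended above, a routine field check against a collocated reference instrument, often reduces to fitting and applying a linear correction. A minimal sketch using ordinary least squares (real deployments may need temperature/humidity covariates or nonlinear terms):

```python
def fit_linear(raw, reference):
    """Ordinary least-squares fit of collocated reference readings against
    raw low-cost sensor readings: reference ~= a * raw + b."""
    n = len(raw)
    mx = sum(raw) / n
    my = sum(reference) / n
    sxx = sum((x - mx) ** 2 for x in raw)
    sxy = sum((x - mx) * (y - my) for x, y in zip(raw, reference))
    a = sxy / sxx
    b = my - a * mx
    return a, b

def correct(raw_reading, a, b):
    """Map a raw sensor reading onto the reference scale."""
    return a * raw_reading + b
```

Refitting (a, b) periodically under the final deployment conditions is what catches the sensor drift that a one-off laboratory calibration cannot.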
Integrating automated support for a software management cycle into the TAME system
NASA Technical Reports Server (NTRS)
Sunazuka, Toshihiko; Basili, Victor R.
1989-01-01
Software managers are interested in the quantitative management of software quality, cost and progress. An integrated software management methodology, which can be applied throughout the software life cycle for any number of purposes, is required. The TAME (Tailoring A Measurement Environment) methodology is based on the improvement paradigm and the goal/question/metric (GQM) paradigm. This methodology helps generate a software engineering process and measurement environment based on the project characteristics. The SQMAR (software quality measurement and assurance technology) is a software quality metric system and methodology applied to the development processes. It is based on the feed-forward control principle: quality target setting is carried out before the plan-do-check-act activities are performed. These methodologies are integrated to realize goal-oriented measurement, process control and visual management. A metric setting procedure based on the GQM paradigm, a management system called the software management cycle (SMC), and its application to a case study based on NASA/SEL data are discussed. The expected effects of SMC are quality improvement, managerial cost reduction, accumulation and reuse of experience, and a highly visual management reporting system.
NASA Astrophysics Data System (ADS)
Raghavan, Ajay; Saha, Bhaskar
2013-03-01
Photo enforcement devices for traffic rules such as red lights, tolls, stops, and speed limits are increasingly being deployed in cities and counties around the world to ensure smooth traffic flow and public safety. These are typically unattended fielded systems, so it is important to periodically check them for potential image/video quality problems that might interfere with their intended functionality. There is interest in automating such checks to reduce the operational overhead and human error involved in manually checking large camera device fleets. Examples of problems affecting such camera devices include exposure issues, focus drifts, obstructions, misalignment, download errors, and motion blur. Furthermore, in some cases, in addition to the sub-algorithms for individual problems, one also has to carefully design the overall algorithm and logic to check for and accurately classify these individual problems. Some of these issues can occur in tandem or have the potential to be confused for each other by automated algorithms. Examples include camera misalignment that can cause some scene elements to go out of focus for wide-area scenes, or download errors that can be misinterpreted as an obstruction. Therefore, the sequence in which the sub-algorithms are utilized is also important. This paper presents an overview of these problems along with no-reference and reduced-reference image and video quality solutions to detect and classify such faults.
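The ordering concern discussed above, where faults can masquerade as one another, can be captured by running detectors in a fixed precedence order and letting the first hit explain the frame. All metric and threshold names here are illustrative assumptions, not the paper's actual detectors:

```python
def diagnose(metrics, thresholds):
    """Run fault detectors in a fixed precedence order; the first hit
    explains the frame, so faults that can mimic one another (a download
    error looking like an obstruction, say) are disambiguated by ordering."""
    ordered_checks = [
        # A partially downloaded frame must be ruled out before anything else.
        ("download_error", lambda m: m["valid_rows_fraction"] < thresholds["valid_rows"]),
        ("exposure", lambda m: not thresholds["lum_min"] <= m["mean_luminance"] <= thresholds["lum_max"]),
        ("obstruction", lambda m: m["dark_area_fraction"] > thresholds["dark_area"]),
        ("focus_drift", lambda m: m["sharpness"] < thresholds["sharpness"]),
    ]
    for name, check in ordered_checks:
        if check(metrics):
            return name
    return "ok"
```

Placing the most specific, most reliable detectors first is what prevents, for example, a truncated download from being misread as an obstruction.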
Quevauviller, P; Bennink, D; Bøwadt, S
2001-05-01
It is now well recognised that the quality control (QC) of all types of analyses, including environmental analyses depends on the appropriate use of reference materials. One of the ways to check the accuracy of methods is based on the use of Certified Reference Materials (CRMs), whereas other types of (not certified) Reference Materials (RMs) are used for routine quality control (establishment of control charts) and interlaboratory testing (e.g. proficiency testing). The perception of these materials, in particular with respect to their production and use, differs widely according to various perspectives (e.g. RM producers, routine laboratories, researchers). This review discusses some critical aspects of RM use and production for the QC of environmental analyses and describes the new approach followed by the Measurements & Testing Generic Activity (European Commission) to tackle new research and production needs.
Anti Rohumaa; Christopher G. Hunt; Mark Hughes; Charles R. Frihart; Janne Logren
2013-01-01
During the rotary peeling of veneer for plywood or laminated veneer lumber manufacture, checks are formed in the veneer that can be as deep as 70-80% of the veneer thickness. The results of this study show that, during adhesive bond testing, deep lathe checks in birch (Betula pendula Roth.) veneer significantly reduce the shear strength and the...
Effectiveness of nonresuscitative first aid training in laypersons: a systematic review.
Van de Velde, Stijn; Heselmans, Annemie; Roex, Ann; Vandekerckhove, Philippe; Ramaekers, Dirk; Aertgeerts, Bert
2009-09-01
This study reviewed evidence on the effects of nonresuscitative first aid training on competence and helping behavior in laypersons. We identified randomized and nonrandomized controlled trials and interrupted time series on nonresuscitative first aid training for laypersons by using 12 databases (including MEDLINE, EMBASE, and PsycINFO), hand searching, reference checking, and author communication. Two reviewers independently evaluated selected studies with the Cochrane Effective Practice and Organisation of Care Review Group quality criteria. One reviewer extracted data with a standard form and another checked them. In anticipation of substantial heterogeneity across studies, we elected a descriptive summary of the included studies. We included 4 studies, 3 of which were randomized trials. We excluded 11 studies on quality issues. Two studies revealed that participants trained in first aid demonstrated higher written test scores than controls (poisoning first aid: relative risk 2.11, 95% confidence interval [CI] 1.64 to 2.72; various first aid cases: mean difference 4.75, 95% CI 3.02 to 6.48). Two studies evaluated helping responses during unannounced simulations. First aid training improved the quality of help for a bleeding emergency (relative risk 25.94; 95% CI 3.60 to 186.93), not the rate of helping (relative risk 1.13; 95% CI 0.88 to 1.45). Training in first aid and helping behavior increased the helping rates in a chest pain emergency compared with training in first aid only (relative risk 2.80; 95% CI 1.05 to 7.50) or controls (relative risk 3.81; 95% CI 0.98 to 14.89). Participants trained in first aid only did not help more than controls (relative risk 1.36; 95% CI 0.28 to 6.61). First aid programs that also train participants to overcome inhibitors of emergency helping behavior could lead to better help and higher helping rates.
A new dataset validation system for the Planetary Science Archive
NASA Astrophysics Data System (ADS)
Manaud, N.; Zender, J.; Heather, D.; Martinez, S.
2007-08-01
The Planetary Science Archive is the official archive for the Mars Express mission. It received its first data by the end of 2004. These data are delivered by the PI teams to the PSA team as datasets formatted in conformance with the Planetary Data System (PDS). The PI teams are responsible for analyzing and calibrating the instrument data, for producing reduced and calibrated data, and for the scientific validation of these data. ESA is responsible for the long-term archiving and distribution of the data to the scientific community and must therefore ensure that all archived products meet quality standards. To this end, an archive peer review is used to control the quality of the Mars Express science data archiving process; however, a full validation of the archive's content has been missing. An independent review board recently recommended that the completeness of the archive and the consistency of the delivered data be validated following well-defined procedures. A new validation software tool is being developed to complete the overall data quality control system. This tool aims to improve the quality of the data and services provided to the scientific community through the PSA, and it shall make it possible to track anomalies in, and to control the completeness of, datasets. It shall ensure that PSA end users (1) can rely on the results of their queries, (2) will get data products that are suitable for scientific analysis, and (3) can find all science data acquired during a mission. We define dataset validation as the verification and assessment process that checks dataset content against predefined top-level criteria representing the general characteristics of good-quality datasets. The checked content includes the data and all types of information that are essential to deriving scientific results or that interface with the PSA database.
The validation software tool is a multi-mission tool that has been designed to provide the user with the flexibility of defining and implementing various types of validation criteria, to iteratively and incrementally validate datasets, and to generate validation reports.
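A criterion-driven validation pass of the kind described above can be sketched as follows. The criterion names, the dataset dictionary layout, and the report format are invented for illustration and do not reflect the actual PSA tool:

```python
# Hypothetical sketch: run a set of named top-level criteria against a
# dataset and collect a pass/fail report, catching errors per criterion
# so one broken check does not abort the whole validation run.

def validate_dataset(dataset, criteria):
    """Evaluate each named criterion against the dataset; return a
    report mapping criterion name -> (status, detail)."""
    report = {}
    for name, check in criteria.items():
        try:
            report[name] = ("PASS", None) if check(dataset) else ("FAIL", None)
        except Exception as exc:
            report[name] = ("ERROR", str(exc))
    return report

# Illustrative criteria; real PSA criteria would be far more detailed.
criteria = {
    "has_label": lambda d: "label" in d,
    "products_nonempty": lambda d: len(d.get("products", [])) > 0,
    "all_products_calibrated":
        lambda d: all(p.get("calibrated") for p in d.get("products", [])),
}
```

Defining criteria as plain callables keeps the tool multi-mission: new missions register new criteria without changing the validation loop, matching the flexibility described in the abstract.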
Allemani, Claudia; Harewood, Rhea; Johnson, Christopher J; Carreira, Helena; Spika, Devon; Bonaventure, Audrey; Ward, Kevin; Weir, Hannah K; Coleman, Michel P
2017-12-15
Robust comparisons of population-based cancer survival estimates require tight adherence to the study protocol, standardized quality control, appropriate life tables of background mortality, and centralized analysis. The CONCORD program established worldwide surveillance of population-based cancer survival in 2015, analyzing individual data on 26 million patients (including 10 million US patients) diagnosed between 1995 and 2009 with 1 of 10 common malignancies. In this Cancer supplement, we analyzed data from 37 state cancer registries that participated in the second cycle of the CONCORD program (CONCORD-2), covering approximately 80% of the US population. Data quality checks were performed in 3 consecutive phases: protocol adherence, exclusions, and editorial checks. One-, 3-, and 5-year age-standardized net survival was estimated using the Pohar Perme estimator and state- and race-specific life tables of all-cause mortality for each year. The cohort approach was adopted for patients diagnosed between 2001 and 2003, and the complete approach for patients diagnosed between 2004 and 2009. Articles in this supplement report population coverage, data quality indicators, and age-standardized 5-year net survival by state, race, and stage at diagnosis. Examples of tables, bar charts, and funnel plots are provided in this article. Population-based cancer survival is a key measure of the overall effectiveness of services in providing equitable health care. The high quality of US cancer registry data, 80% population coverage, and use of an unbiased net survival estimator ensure that the survival trends reported in this supplement are robustly comparable by race and state. The results can be used by policymakers to identify and address inequities in cancer survival in each state and for the United States nationally. Cancer 2017;123:4982-93. Published 2017. This article is a U.S. Government work and is in the public domain in the USA.
Helping You Choose Quality Ambulatory Care
Helping you choose: Quality ambulatory care When you need ambulatory care, you should find out some information to help you choose the best ... the center follows rules for patient safety and quality. Go to Quality Check® at www.qualitycheck.org ...
Helping You Choose Quality Hospice Care
Helping you choose: Quality hospice care When you need hospice care, you should find out some information to help you choose the best ... the service follows rules for patient safety and quality. Go to Quality Check® at www.qualitycheck.org ...
Implementation of a GPS-RO data processing system for the KIAPS-LETKF data assimilation system
NASA Astrophysics Data System (ADS)
Kwon, H.; Kang, J.-S.; Jo, Y.; Kang, J. H.
2014-11-01
The Korea Institute of Atmospheric Prediction Systems (KIAPS) has been developing a new global numerical weather prediction model and an advanced data assimilation system. As part of the KIAPS Package for Observation Processing (KPOP) system for data assimilation, preprocessing and quality control modules for bending angle measurements of global positioning system radio occultation (GPS-RO) data have been implemented and examined. The GPS-RO data processing system comprises several steps that check observation locations, missing values, and physical values for the Earth radius of curvature and geoid undulation. An observation-minus-background check is implemented using a one-dimensional bending-angle observation operator, and tangent-point drift is also considered in the quality control process. We tested GPS-RO observations utilized by the Korea Meteorological Administration (KMA) within KPOP, using both the KMA global model and the National Center for Atmospheric Research (NCAR) Community Atmosphere Model-Spectral Element (CAM-SE) as the model background. Background fields from the CAM-SE model were incorporated in the preparation of assimilation experiments with the KIAPS-LETKF data assimilation system, which has been successfully implemented for a cubed-sphere model with fully unstructured quadrilateral meshes. As a result of the data processing, the bending-angle departure statistics between observation and background show significant improvement. The first experiment assimilating GPS-RO bending angles from KPOP within KIAPS-LETKF also shows encouraging results.
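The observation-minus-background (O-B) check described above can be sketched as a simple departure filter: reject any observation whose departure from the model background exceeds a multiple of the expected error. The function name, threshold, and numbers below are illustrative assumptions, not KPOP code:

```python
def omb_quality_control(obs, background, sigma, threshold=3.0):
    """Keep observations whose observation-minus-background departure
    is within `threshold` times the expected error standard deviation.

    obs        -- observed values (e.g. bending angles)
    background -- model-equivalent values from the observation operator
    sigma      -- expected standard deviation of the O-B departure
    """
    accepted = []
    for y, b, s in zip(obs, background, sigma):
        if abs(y - b) <= threshold * s:
            accepted.append(y)
    return accepted

# Illustrative numbers: the middle observation departs far from the
# background and is rejected by the 3-sigma gross-error check.
passed = omb_quality_control(
    obs=[1.00, 1.50, 0.98],
    background=[1.02, 1.01, 1.00],
    sigma=[0.05, 0.05, 0.05],
)
```

In a real system the background equivalents would come from the one-dimensional bending-angle operator applied to the model state, and sigma would vary with height.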
Chen, Guang-Pei; Ahunbay, Ergun; Li, X Allen
2016-04-01
To develop an integrated quality assurance (QA) software tool for online replanning capable of efficiently and automatically checking radiation treatment (RT) planning parameters and gross plan quality, verifying treatment plan data transfer from the treatment planning system (TPS) to the record-and-verify (R&V) system, performing a secondary monitor unit (MU) calculation with or without the presence of a magnetic field from an MR-Linac, and validating the consistency of the delivery record with the plan. The software tool, named ArtQA, was developed to obtain and compare plan and treatment parameters from both the TPS and the R&V system database. The TPS data are accessed via direct file reading, and the R&V data are retrieved via open database connectivity and structured query language. Plan quality is evaluated with both the logical consistency of planning parameters and the achieved dose-volume histograms. Beams in the TPS and the R&V system are matched based on geometry configurations. To account for the effect of a 1.5 T transverse magnetic field from the MR-Linac in the secondary MU calculation, a method based on a modified Clarkson integration algorithm was developed and tested for a series of clinical situations. ArtQA has been used in our clinic and can quickly detect inconsistencies and deviations in the entire RT planning process. With the ArtQA tool, the efficiency of plan checking, including plan quality, data transfer, and delivery checks, can be improved by at least 60%. The newly developed independent MU calculation tool for the MR-Linac reduces the difference between planned and calculated MUs by 10%. The software tool ArtQA can be used to perform a comprehensive QA check from planning to delivery with a conventional Linac or an MR-Linac and is an essential tool for online replanning, where the QA check must be performed rapidly.
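The TPS-versus-R&V verification step can be illustrated as a field-by-field comparison of matched beam parameters, with a tolerance for floating-point values. The parameter names and tolerance below are hypothetical, not ArtQA's actual data model:

```python
def compare_plan_parameters(tps, rnv, numeric_tol=1e-3):
    """Compare matched beam parameters between a TPS export and an
    R&V record; return the keys that disagree.

    Numeric values are compared within `numeric_tol`; everything else
    (energies, beam names) must match exactly.
    """
    mismatches = []
    for key in sorted(set(tps) | set(rnv)):
        a, b = tps.get(key), rnv.get(key)
        if isinstance(a, float) and isinstance(b, float):
            if abs(a - b) > numeric_tol:
                mismatches.append(key)
        elif a != b:  # exact match for strings and missing keys
            mismatches.append(key)
    return mismatches

# Illustrative parameters for one beam: the gantry angle was corrupted
# in transfer, so the check flags it.
tps = {"mu": 100.0, "gantry_deg": 180.0, "energy": "6MV"}
rnv = {"mu": 100.0, "gantry_deg": 179.0, "energy": "6MV"}
bad = compare_plan_parameters(tps, rnv)
```

Iterating over the union of keys means a parameter missing from either side is also reported, which is the kind of transfer fault this class of tool is meant to catch.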
Variations in Daily Sleep Quality and Type 1 Diabetes Management in Late Adolescents
Queen, Tara L.; Butner, Jonathan; Wiebe, Deborah; Berg, Cynthia A.
2016-01-01
Objective To determine how between- and within-person variability in perceived sleep quality were associated with adolescent diabetes management. Methods A total of 236 older adolescents with type 1 diabetes reported daily for 2 weeks on sleep quality, self-regulatory failures, frequency of blood glucose (BG) checks, and BG values. Average, inconsistent, and daily deviations in sleep quality were examined. Results Hierarchical linear models indicated that poorer average and worse daily perceived sleep quality (compared with one’s average) was each associated with more self-regulatory failures. Sleep quality was not associated with frequency of BG checking. Poorer average sleep quality was related to greater risk of high BG. Furthermore, inconsistent and daily deviations in sleep quality interacted to predict higher BG, with more consistent sleepers benefitting more from a night of high-quality sleep. Conclusions Good, consistent sleep quality during late adolescence may benefit diabetes management by reducing self-regulatory failures and risk of high BG. PMID:26994852
Bellaloui, Nacer; Smith, James R; Mengistu, Alemu
2017-01-01
The timing of harvest is a major factor affecting seed quality in soybean, particularly in Midsouthern USA, when rain during harvest period is not uncommon. The objective of this research was to evaluate the effects of time of harvest on soybean seed quality (seed composition, germination, seed coat boron, and lignin) in high germinability (HG) breeding lines (50% exotic) developed under high heat. The hypothesis was that seeds of HG lines possess physiological and genetic traits for a better seed quality at harvest maturity and delayed harvest. A 2-year field experiment was conducted under irrigated conditions. Results showed that, at harvest maturity, the exotic HG lines had higher seed protein, oleic acid, sugars, seed coat boron, and seed coat lignin, but lower seed oil compared with the non-exotic checks (Control), confirming our hypothesis. At 28 days after harvest maturity (delayed harvest), the content of seed protein, oleic acid, sugars, seed coat boron, and seed coat lignin were higher in some of the HG lines compared with the checks, indicating a possible involvement of these seed constituents, especially seed coat boron and seed coat lignin, in maintaining seed coat integrity and protecting seed coat against physical damage. Highly significant positive correlations were found between germination and seed protein, oleic acid, sugars, and seed coat boron and seed coat lignin. Highly significant negative correlation was found between germination and oil, linoleic acid, seed coat wrinkling, shattering, and hard seed. Yields of some HG lines were competitive with checks. This research demonstrated that time of harvesting is an important factor influencing seed protein and oil production. Also, since high oleic acid is desirable for oxidative stability, shelf-life and biodiesel properties, using HG lines could positively influence these important traits. 
These results suggest to breeders some of the advantages of selecting for high seed coat boron and lignin, and inform growers of the importance of timely harvest for maintaining high seed quality.
[Update on microbiological quality assurance meat and meat products in Morocco].
Rachidi, H; Latrache, H
2018-03-01
Food safety has become an absolute necessity in all countries. As a result, Morocco has taken several measures and actions to develop food safety and food-borne disease control. This study aimed to highlight the level of improvement in the quality assurance of meat and meat products in Morocco. It is based on a non-exhaustive review of the regulatory texts governing food safety in the country, as well as a statistical study on establishments of meat and meat products adopting a self-checking system and approved by the National Office of Sanitary Safety of Food. Morocco has introduced several laws and regulations requiring sanitary control of food products. Also, the number of establishments of meat and meat products adopting a system of self-control and approved by the National Office of Sanitary Safety of Food has improved significantly. It has increased from 58 in 2007 to 273 in 2016. The adoption of self-monitoring systems allows better access to international markets, improved quality of food products and a considerable reduction in microbial contamination. Copyright © 2018 Elsevier Masson SAS. All rights reserved.
Lake water quality mapping from Landsat
NASA Technical Reports Server (NTRS)
Scherz, J. P.
1977-01-01
In the project described remote sensing was used to check the quality of lake waters. The lakes of three Landsat scenes were mapped with the Bendix MDAS multispectral analysis system. From the MDAS color coded maps, the lake with the worst algae problem was easily located. The lake was closely checked, and the presence of 100 cows in the springs which fed the lake could be identified as the pollution source. The laboratory and field work involved in the lake classification project is described.
Whole grain cereals for the primary or secondary prevention of cardiovascular disease.
Kelly, Sarah Am; Hartley, Louise; Loveman, Emma; Colquitt, Jill L; Jones, Helen M; Al-Khudairy, Lena; Clar, Christine; Germanò, Roberta; Lunn, Hannah R; Frost, Gary; Rees, Karen
2017-08-24
There is evidence from observational studies that whole grains can have a beneficial effect on risk for cardiovascular disease (CVD). Earlier versions of this review found mainly short-term intervention studies. There are now longer-term randomised controlled trials (RCTs) available. This is an update and expansion of the original review conducted in 2007. The aim of this systematic review was to assess the effect of whole grain foods or diets on total mortality, cardiovascular events, and cardiovascular risk factors (blood lipids, blood pressure) in healthy people or people who have established cardiovascular disease or related risk factors, using all eligible RCTs. We searched CENTRAL (Issue 8, 2016) in the Cochrane Library, MEDLINE (1946 to 31 August 2016), Embase (1980 to week 35 2016), and CINAHL Plus (1937 to 31 August 2016) on 31 August 2016. We also searched ClinicalTrials.gov on 5 July 2017 and the World Health Organization International Clinical Trials Registry Platform (WHO ICTRP) on 6 July 2017. We checked reference lists of relevant articles and applied no language restrictions. We selected RCTs assessing the effects of whole grain foods or diets containing whole grains compared to foods or diets with a similar composition, over a minimum of 12 weeks, on cardiovascular disease and related risk factors. Eligible for inclusion were healthy adults, those at increased risk of CVD, or those previously diagnosed with CVD. Two review authors independently selected studies. Data were extracted and quality-checked by one review author and checked by a second review author. A second review author checked the analyses. We assessed treatment effect using mean difference in a fixed-effect model and heterogeneity using the I² statistic and the Chi² test of heterogeneity. We assessed the overall quality of evidence using GRADE with GRADEpro software.
We included nine RCTs randomising a total of 1414 participants (age range 24 to 70; mean age 45 to 59, where reported) to whole grain versus lower whole grain or refined grain control groups. We found no studies that reported the effect of whole grain diets on total cardiovascular mortality or cardiovascular events (total myocardial infarction, unstable angina, coronary artery bypass graft surgery, percutaneous transluminal coronary angioplasty, total stroke). All included studies reported the effect of whole grain diets on risk factors for cardiovascular disease including blood lipids and blood pressure. All studies were in primary prevention populations and had an unclear or high risk of bias, and no studies had an intervention duration greater than 16 weeks. Overall, we found no difference between whole grain and control groups for total cholesterol (mean difference 0.07, 95% confidence interval -0.07 to 0.21; 6 studies (7 comparisons); 722 participants; low-quality evidence). Using GRADE, we assessed the overall quality of the available evidence on cholesterol as low. Four studies were funded by independent national and government funding bodies, while the remaining studies reported funding or partial funding by organisations with commercial interests in cereals. There is insufficient evidence from RCTs of an effect of whole grain diets on cardiovascular outcomes or on major CVD risk factors such as blood lipids and blood pressure. Trials were at unclear or high risk of bias with small sample sizes and relatively short-term interventions, and the overall quality of the evidence was low. There is a need for well-designed, adequately powered RCTs with longer durations assessing cardiovascular events as well as cardiovascular risk factors.
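The fixed-effect pooled mean difference and the I² heterogeneity statistic used in the review follow standard inverse-variance formulas, sketched below. The example effect sizes are invented for illustration; they are not the review's data:

```python
def fixed_effect_md(effects, ses):
    """Inverse-variance fixed-effect pooled mean difference, with
    Cochran's Q and the I^2 heterogeneity statistic (in percent).

    effects -- per-study mean differences
    ses     -- per-study standard errors
    """
    weights = [1 / se ** 2 for se in ses]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    # Cochran's Q: weighted squared deviations from the pooled estimate
    q = sum(w * (e - pooled) ** 2 for w, e in zip(weights, effects))
    df = len(effects) - 1
    # I^2 = (Q - df) / Q, floored at 0, expressed as a percentage
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    return pooled, q, i2

# Two illustrative studies with equal precision but differing effects
pooled, q, i2 = fixed_effect_md([0.0, 0.2], [0.05, 0.05])
```

With equal weights the pooled estimate is simply the average (0.1 here), and the large Q relative to its single degree of freedom yields I² = 87.5%, i.e. substantial heterogeneity.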
Member Checking: A Tool to Enhance Trustworthiness or Merely a Nod to Validation?
Birt, Linda; Scott, Suzanne; Cavers, Debbie; Campbell, Christine; Walter, Fiona
2016-06-22
The trustworthiness of results is the bedrock of high quality qualitative research. Member checking, also known as participant or respondent validation, is a technique for exploring the credibility of results. Data or results are returned to participants to check for accuracy and resonance with their experiences. Member checking is often mentioned as one in a list of validation techniques. This simplistic reporting might not acknowledge the value of using the method, nor its juxtaposition with the interpretative stance of qualitative research. In this commentary, we critique how member checking has been used in published research, before describing and evaluating an innovative in-depth member checking technique, Synthesized Member Checking. The method was used in a study with patients diagnosed with melanoma. Synthesized Member Checking addresses the co-constructed nature of knowledge by providing participants with the opportunity to engage with, and add to, interview and interpreted data, several months after their semi-structured interview. © The Author(s) 2016.
42 CFR 493.1254 - Standard: Maintenance and function checks.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 42 Public Health 5 2010-10-01 2010-10-01 false Standard: Maintenance and function checks. 493.1254 Section 493.1254 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) STANDARDS AND CERTIFICATION LABORATORY REQUIREMENTS Quality System for Nonwaived...
Rah, Jeong-Eun; Shin, Dongho; Oh, Do Hoon; Kim, Tae Hyun; Kim, Gwe-Ya
2014-09-01
To evaluate and improve the reliability of proton quality assurance (QA) processes and to provide an optimal customized tolerance level using the statistical process control (SPC) methodology. The authors investigated the consistency check of dose per monitor unit (D/MU) and range in proton beams to see whether they were within the tolerance level of the daily QA process. This study analyzed the difference between the measured and calculated ranges along the central axis to improve the patient-specific QA process in proton beams by using process capability indices. The authors established a customized tolerance level of ±2% for D/MU and ±0.5 mm for beam range in the daily proton QA process. In the authors' analysis of the process capability indices, the patient-specific range measurements were capable of a specification limit of ±2% in clinical plans. SPC methodology is a useful tool for customizing the optimal QA tolerance levels and improving the quality of proton machine maintenance, treatment delivery, and ultimately patient safety.
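The process capability analysis referred to above rests on the standard Cp and Cpk indices, which compare the process spread to the specification limits (e.g. a ±2% D/MU tolerance). A minimal sketch; the sample values and limits are illustrative, not the authors' QA data:

```python
def process_capability(values, lsl, usl):
    """Cp and Cpk from a sample of measurements against lower and
    upper specification limits.

    Cp  = (USL - LSL) / 6*sigma        (potential capability)
    Cpk = min(USL - mean, mean - LSL) / 3*sigma  (accounts for centering)
    """
    n = len(values)
    mean = sum(values) / n
    # Sample standard deviation (n - 1 denominator)
    sd = (sum((v - mean) ** 2 for v in values) / (n - 1)) ** 0.5
    cp = (usl - lsl) / (6 * sd)
    cpk = min(usl - mean, mean - lsl) / (3 * sd)
    return cp, cpk

# Illustrative D/MU deviations in percent, against a ±3% specification
cp, cpk = process_capability([-1.0, 0.0, 1.0], lsl=-3.0, usl=3.0)
```

A Cpk of 1.0 or higher is conventionally read as the process staying within specification; a centered process has Cpk equal to Cp.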
The need for a formalised system of Quality Control for environmental policy-science.
Larcombe, Piers; Ridd, Peter
2018-01-01
Research science used to inform public policy decisions, herein defined as "Policy-Science", is rarely subjected to rigorous checking, testing and replication. Studies of biomedical and other sciences indicate that a considerable fraction of published peer-reviewed scientific literature, perhaps half, has significant flaws. To demonstrate the potential failings of the present approaches to scientific Quality Control (QC), we describe examples of science associated with perceived threats to the Great Barrier Reef (GBR), Australia. There appears a serious risk of efforts to improve the health of the GBR being directed inefficiently and/or away from the more serious threats. We suggest the need for a new organisation to undertake quality reviews and audits of important scientific results that underpin government spending decisions on the environment. Logically, such a body could also examine policy science in other key areas where governments rely heavily upon scientific results, such as education, health and criminology. Copyright © 2017 Elsevier Ltd. All rights reserved.
A study to assess the long-term stability of the ionization chamber reference system in the LNMRI
NASA Astrophysics Data System (ADS)
Trindade Filho, O. L.; Conceição, D. A.; da Silva, C. J.; Delgado, J. U.; de Oliveira, A. E.; Iwahara, A.; Tauhata, L.
2018-03-01
Ionization chambers are used as secondary standards to maintain the calibration factors of radionuclides for activity measurements in metrology laboratories. Although they are used as radionuclide calibrators in nuclear medicine clinics to control the dose given to patients, their long-term performance is rarely evaluated systematically. Here, a methodology for monitoring and checking long-term stability is applied: historical data produced monthly from 2012 until 2017 by an ionization chamber, an electrometer, and a 226Ra check source were analyzed via a control chart in order to follow long-term performance. The monitored values remained consistent within the control limits, demonstrating the quality of the measurements in compliance with ISO/IEC 17025.
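The control-chart monitoring described can be sketched with ordinary Shewhart ±3σ limits computed from historical readings; the readings below are invented for illustration:

```python
def control_limits(history, k=3.0):
    """Center line and +/- k-sigma Shewhart control limits estimated
    from a list of historical readings."""
    n = len(history)
    mean = sum(history) / n
    # Sample standard deviation of the historical readings
    sd = (sum((x - mean) ** 2 for x in history) / (n - 1)) ** 0.5
    return mean - k * sd, mean, mean + k * sd

def in_control(reading, limits):
    """True if a new reading falls within the control limits."""
    lo, _, hi = limits
    return lo <= reading <= hi

# Illustrative monthly chamber readings (arbitrary units)
limits = control_limits([10.0, 10.2, 9.8, 10.1, 9.9])
```

A reading outside the limits signals a systematic drift (e.g. chamber or electrometer degradation) that warrants investigation before the instrument is used for further calibrations.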
NASA Technical Reports Server (NTRS)
1972-01-01
The guidelines for selecting hardware to be used in manned spacecraft to obtain a five year operational lifetime without maintenance were developed. An analysis was conducted on the design, application, failure mechanisms, manufacturing processes and controls, screen and burn-in techniques, and quality control of hardware items. The equipment considered for evaluation include: (1) electric motors and bearings; (2) accelerometers; (3) gyroscopes and bearings; (4) compressors and pumps, (5) magnetic tape recorders; (6) plumbing components and tubing; (7) check valves; (8) pressure regulators and solenoid valves; (9) thermal control valves; (10) pressure vessels and positive expulsion devices; (11) nickel cadmium batteries; and (12) transducers.
Rostami, R; Dehghani-Arani, F
2015-09-01
This study aimed to compare the effectiveness of neurofeedback (NFB) plus pharmacotherapy with pharmacotherapy alone on addiction severity, mental health, and quality of life in crystal methamphetamine-dependent (CMD) patients. The study included 100 CMD patients undergoing medical treatment who volunteered for this randomized controlled trial. After being evaluated with a battery of questionnaires that included the Addiction Severity Index, the Symptom Checklist-90, and the World Health Organization Quality of Life questionnaire, the participants were randomly assigned to an experimental or a control group. The experimental group received thirty 50-min sessions of NFB in addition to their usual medication over a 2-month period; meanwhile, the control group received only their usual medication. In accordance with this study's pre-test-post-test design, both study groups were evaluated again after completing their respective treatment regimens. Multivariate analysis of covariance showed the experimental group to have lower severity of addiction, better psychological health, and better quality of life than the control group. The differences between the two groups were statistically significant. These findings suggest that NFB can be used to improve the effectiveness of treatment in CMD patients.
[New idea of traditional Chinese medicine quality control based on "composition structure" theory].
Liu, Dan; Jia, Xiaobin; Yu, Danhong
2012-03-01
For modern Chinese medicine to develop internationally, a key issue is establishing a reasonable, accurate, and quantifiable quality evaluation system that complies with the basic theory of Chinese medicine. Based on a holistic understanding of the role of traditional Chinese medicine components, the authors suggest that the "component structure" theory should be embedded into such a system: Chinese medicine exerts multi-target, multi-channel pharmacodynamic effects founded on specific microcosmic structural relationships between components and between the ingredients within each component. At present, the Chinese Pharmacopoeia checks the quality of Chinese medicine mainly by controlling single or multiple target ingredients. This approach is divorced from the overall effectiveness of Chinese medicine, so it cannot control quality from the essence of Chinese medicine. Moreover, the Pharmacopoeia controls only the macro-level quantity of a few effective ingredients, which is not enough to reflect the integrity and systematic nature of the internal microstructure; in other words, it cannot reflect the component structure of Chinese medicine (its essence). For these reasons, the authors propose a new idea for quality control: quantify the structural ratios between components and between the ingredients within each component, and set optimal control proportions for both. At the same time, in-depth study of micro-quantified multi-component, multi-ingredient relationships should be conducted in the course of studying the material basis of Chinese medicine. This could establish a more rational basis for the Chinese medicine quality control system.
77 FR 67344 - Proposed Information Collection; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-09
... Criminal History Checks. DATES: Written comments must be submitted to the individual and office listed in... methodology and assumptions used; Enhance the quality, utility, and clarity of the information to be collected... Criminal History Check. CNCS and its grantees must ensure that national service beneficiaries are protected...
The importance of reference materials in doping-control analysis.
Mackay, Lindsey G; Kazlauskas, Rymantas
2011-08-01
Currently a large range of pure substance reference materials are available for calibration of doping-control methods. These materials enable traceability to the International System of Units (SI) for the results generated by World Anti-Doping Agency (WADA)-accredited laboratories. Only a small number of prohibited substances have threshold limits for which quantification is highly important. For these analytes only the highest quality reference materials that are available should be used. Many prohibited substances have no threshold limits and reference materials provide essential identity confirmation. For these reference materials the correct identity is critical and the methods used to assess identity in these cases should be critically evaluated. There is still a lack of certified matrix reference materials to support many aspects of doping analysis. However, in key areas a range of urine matrix materials have been produced for substances with threshold limits, for example 19-norandrosterone and testosterone/epitestosterone (T/E) ratio. These matrix-certified reference materials (CRMs) are an excellent independent means of checking method recovery and bias and will typically be used in method validation and then regularly as quality-control checks. They can be particularly important in the analysis of samples close to threshold limits, in which measurement accuracy becomes critical. Some reference materials for isotope ratio mass spectrometry (IRMS) analysis are available and a matrix material certified for steroid delta values is currently under production. In other new areas, for example the Athlete Biological Passport, peptide hormone testing, designer steroids, and gene doping, reference material needs still need to be thoroughly assessed and prioritised.
PLC based automatic control of pasteurize mix in ice cream production
NASA Astrophysics Data System (ADS)
Yao, Xudong; Liang, Kai
2013-03-01
This paper describes the automatic control device for pasteurizing mix in the ice cream production process. We design a control system using the FBD programming language, develop the program in the STEP 7-Micro/WIN software, and check for bugs before downloading it into the PLC. The developed device provides flexibility and accuracy in controlling the pasteurization steps. The operator simply inputs the duration and temperature of the pasteurization through the control panel; all steps then finish automatically, without any intervention, in a preprogrammed sequence stored in the programmable logic controller (PLC). With the help of this equipment we can not only control the quality of ice cream under various conditions but also simplify the production process. The control system is inexpensive and can be widely used in the ice cream production industry.
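The preprogrammed heat-and-hold sequence described above can be sketched in Python as a stand-in for the FBD logic. The step structure, setpoint, and I/O callbacks are illustrative assumptions, not the actual STEP 7 program:

```python
# Hedged sketch of a heat-then-hold pasteurisation sequence. `read_temp`
# and `heater` stand in for PLC inputs/outputs; in the real device these
# would be an analog temperature input and a digital heater output.

def pasteurize(target_temp_c, hold_minutes, read_temp, heater):
    """Heat until the operator-set temperature is reached, hold it for
    the operator-set duration with simple on/off control, then stop."""
    # Heating phase: drive the heater until the setpoint is reached
    while read_temp() < target_temp_c:
        heater(True)
    heater(False)
    # Holding phase: maintain temperature for the set duration
    elapsed = 0
    while elapsed < hold_minutes:
        heater(read_temp() < target_temp_c)  # on/off (bang-bang) control
        elapsed += 1  # one simulated minute per loop iteration
    heater(False)
    return "done"
```

A real PLC scan cycle would evaluate these conditions continuously with timers rather than a blocking loop; the sketch only shows the sequencing the operator panel triggers.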
Gowrisankar, G; Jagadeshan, G; Elango, L
2017-04-01
In many regions around the globe, including India, degradation in the quality of groundwater is of great concern. The objective of this investigation is to determine the effect of recharge from a check dam on the quality of groundwater in a region of Krishnagiri District of Tamil Nadu State, India. For this study, water samples from 15 wells were periodically obtained and analysed for major ions and fluoride concentrations. The major-ion content of the groundwater was compared with the drinking water guideline values of the Bureau of Indian Standards. With respect to sodium and fluoride concentrations, 38% of the groundwater samples collected were not suitable for direct use as drinking water. Suitability of water for agricultural use was determined considering the electrical conductivity, sodium adsorption ratio, sodium percentage, permeability index, and the Wilcox and United States Salinity Laboratory diagrams. The influence of freshwater recharge from the dam is evident, as the groundwater in wells nearer to the check dam was suitable for both irrigation and domestic purposes, whereas the groundwater away from the dam had a high ionic composition. This study demonstrated that in other fluoride-affected areas, the fluoride concentration can be reduced by dilution through the construction of check dams as a measure of managed aquifer recharge.
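The irrigation suitability indices mentioned above have standard definitions; as a quick reference, the sodium adsorption ratio and percent sodium (with all ion concentrations in meq/L) can be computed as:

```python
from math import sqrt

def sar(na, ca, mg):
    """Sodium adsorption ratio: Na / sqrt((Ca + Mg) / 2), all in meq/L."""
    return na / sqrt((ca + mg) / 2.0)

def sodium_percent(na, k, ca, mg):
    """Percent sodium: 100 * (Na + K) / (Na + K + Ca + Mg), all in meq/L."""
    return 100.0 * (na + k) / (na + k + ca + mg)
```

These are the textbook formulas; the study's specific laboratory procedures and classification cut-offs are not reproduced here.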
NASA Technical Reports Server (NTRS)
Lane, J. H.
1976-01-01
Performance tests completed on the Space Shuttle Carrier Aircraft (SCA) transceiver console, verifying its design objectives, were described. These tests included: (1) check of power supply voltages for correct output voltage and energization at the proper point in the turn on sequence, (2) check of cooling system (LRU blower, overload sensors and circuitry, and thermocouple probe), (3) check of control circuits logic, including the provisions for remote control and display, (4) check of the LRU connector for presence of correct voltages and absence of incorrect voltages under both energized and deenergized conditions, and (5) check of the AGC and power output monitor circuits.
Smoking habits and health-related quality of life in a rural Japanese population.
Funahashi, Koichi; Takahashi, Ippei; Danjo, Kazuma; Matsuzaka, Masashi; Umeda, Takashi; Nakaji, Shigeyuki
2011-03-01
To investigate the association between smoking and health-related quality of life (HRQOL) in a rural Japanese population. A cross-sectional study of data from 823 subjects in the Iwaki area of Hirosaki City, Japan. SF-36 scores of non-smokers and smokers were compared. To test the sensitivity of SF-36 scores in detecting health deterioration, the effects of having diseases and of having deviations from normal thresholds in a health check-up were analyzed by adding them as covariates in ANCOVA. There was no significant difference in SF-36 scores between non-smokers and smokers. The presence of diseases significantly decreased the physical components of SF-36 scores, while the results of the health check-up had no significant influence on SF-36 scores. The results suggest that in Japan, where smoking prevalence is still relatively high, smokers may be less sensitive to sub-clinical deterioration in their own health status than smokers in Western countries that have already experienced a major decline in smoking rates. The importance of making smokers more sensitive to the sub-clinical adverse effects of cigarette smoking should be stressed for the success of smoking control programs.
Data Quality Control and Maintenance for the Qweak Experiment
NASA Astrophysics Data System (ADS)
Heiner, Nicholas; Spayde, Damon
2014-03-01
The Qweak collaboration seeks to quantify the weak charge of the proton through analysis of the parity-violating electron asymmetry in elastic electron-proton scattering. The asymmetry is calculated by measuring how many electrons scatter from a hydrogen target at the chosen scattering angle for aligned and anti-aligned electron spins, then evaluating the difference between the counts for the two polarization states. The weak charge can then be extracted from these data. Knowing the weak charge will allow us to calculate the electroweak mixing angle at the particular Q2 value of the chosen electrons, for which the Standard Model makes a firm prediction. Any significant deviation from this prediction would be a prime indicator of physics beyond what the Standard Model describes. After the experiment was conducted at Jefferson Lab, the collected data were stored in a MySQL database for further analysis. I will present an overview of the database and its functions, as well as a demonstration of the quality checks and maintenance performed on the data itself. These checks include an analysis of errors occurring throughout the experiment, specifically data acquisition errors within the main detector array, and an analysis of data cuts.
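The helicity-correlated asymmetry described above is, in its simplest form, a normalized count difference; a minimal illustration of the defining ratio (ignoring the corrections a real analysis applies for beam polarization, backgrounds, and dead time):

```python
def pv_asymmetry(n_plus, n_minus):
    """Raw parity-violating asymmetry from counts in the two helicity states.

    A = (N+ - N-) / (N+ + N-). Real analyses further correct this for beam
    polarization, backgrounds, and helicity-correlated beam properties.
    """
    return (n_plus - n_minus) / (n_plus + n_minus)
```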
Geochemical and analytical implications of extensive sulfur retention in ash from Indonesian peats
Kane, Jean S.; Neuzil, Sandra G.
1993-01-01
Sulfur is an analyte of considerable importance to the complete major element analysis of ash from low-sulfur, low-ash Indonesian peats. Most analytical schemes for major element peat- and coal-ash analyses, including the inductively coupled plasma atomic emission spectrometry method used in this work, do not permit measurement of sulfur in the ash. As a result, oxide totals cannot be used as a check on accuracy of analysis. Alternative quality control checks verify the accuracy of the cation analyses. Cation and sulfur correlations with percent ash yield suggest that silicon and titanium, and to a lesser extent, aluminum, generally originate as minerals, whereas magnesium and sulfur generally originate from organic matter. Cation correlations with oxide totals indicate that, for these Indonesian peats, magnesium dominates sulfur fixation during ashing because it is considerably more abundant in the ash than calcium, the next most important cation in sulfur fixation.
MilxXplore: a web-based system to explore large imaging datasets.
Bourgeat, P; Dore, V; Villemagne, V L; Rowe, C C; Salvado, O; Fripp, J
2013-01-01
As large-scale medical imaging studies become more common, there is an increasing reliance on automated software to extract quantitative information from these images. As cohort sizes keep increasing in large studies, there is also a need for tools that allow results from automated image processing and analysis to be presented in a way that enables fast and efficient quality checking, tagging, and reporting on cases in which automatic processing failed or was problematic. MilxXplore is an open-source visualization platform which provides an interface to navigate and explore imaging data in a web browser, giving the end user the opportunity to perform quality control and reporting in a user-friendly, collaborative, and efficient way. Compared to existing software solutions that often provide an overview of results at the subject level, MilxXplore pools the results of individual subjects and time points together, allowing easy and efficient navigation through the different acquisitions of a subject over time and comparison of the results against the rest of the population. MilxXplore is fast and flexible, allows remote quality checks of processed imaging data, facilitates data sharing and collaboration across multiple locations, and can be easily integrated into a cloud computing pipeline. With the growing trend of open data and open science, such a tool will become increasingly important for sharing and publishing the results of imaging analysis.
Direct to consumer advertising via the Internet, a study of hip resurfacing.
Ogunwale, B; Clarke, J; Young, D; Mohammed, A; Patil, S; Meek, R M D
2009-02-01
With increased use of the internet for health information and direct to consumer advertising from medical companies, there is concern about the quality of information available to patients. The aim of this study was to examine the quality of health information on the internet for hip resurfacing. An assessment tool was designed to measure quality of information. Websites were measured on credibility of source; usability; currentness of the information; content relevance; content accuracy/completeness and disclosure/bias. Each website assessed was given a total score, based on number of scores achieved from the above categories websites were further analysed on author, geographical origin and possession of an independent credibility check. There was positive correlation between the overall score for the website and the score of each website in each assessment category. Websites by implant companies, doctors and hospitals scored poorly. Websites with an independent credibility check such as Health on the Net (HoN) scored twice the total scores of websites without. Like other internet health websites, the quality of information on hip resurfacing websites is variable. This study highlights methods by which to assess the quality of health information on the internet and advocates that patients should look for a statement of an "independent credibility check" when searching for information on hip resurfacing.
Does integrated care lead to both improved service quality and lower care cost
Waldeyer, Regina; Siegel, Achim; Daul, Gisela; Gaiser, Karin; Hildebrandt, Helmut; Köster, Ingrid; Schubert, Ingrid; Stunder, Brigitte; Stützle, Yvonne
2010-01-01
Purpose and context ‘Gesundes Kinzigtal’ is one of the few population-based integrated care approaches in Germany, organising care across all health service sectors and indications. The management company and its contracting partners (the physicians’ network in the region and two statutory health insurers) strive to reach a higher quality of care at a lower overall cost as compared with the German standard. During its first two years of operation (2006–2007), the Kinzigtal project achieved surprisingly positive financial results compared with its reference value. To gain independent evidence on the quality aspects of the system, the management company and its partners provided a remarkable budget for its evaluation by independent scientific institutions. Case description and data sources We will present interim results of a population-based controlled cohort study. In this study, quality of care is checked by relying on health and service quality indicators that have been constructed from health insurers’ administrative data (claims data). Interim results are presented for the intervention region (Kinzigtal area) and the control region (the rest of Baden-Württemberg, i.e., Southwest Germany). Preliminary conclusions and discussion The evaluation of ‘Gesundes Kinzigtal’ is in full progress. Until now, there is no evidence that the surprisingly positive financial results of the Kinzigtal system have been achieved at the expense of care quality. Rather, Gesundes Kinzigtal Integrated Care seems to be about to increasingly realize comparative advantages regarding health service quality (in comparison to the control region).
Sub-pixel analysis to support graphic security after scanning at low resolution
NASA Astrophysics Data System (ADS)
Haas, Bertrand; Cordery, Robert; Gou, Hongmei; Decker, Steve
2006-02-01
Whether in the domain of audio, video or finance, our world tends to become increasingly digital. However, for diverse reasons, the transition from analog to digital is often much extended in time, and proceeds by long steps (and sometimes never completes). One such step is the conversion of information on analog media to digital information. We focus in this paper on the conversion (scanning) of printed documents to digital images. Analog media have the advantage over digital channels that they can harbor much imperceptible information that can be used for fraud detection and forensic purposes. But this secondary information usually fails to be retrieved during the conversion step. This is particularly relevant since the Check 21 Act (Check Clearing for the 21st Century Act) became effective in 2004 and allows images of checks to be handled by banks as usual paper checks. We use here this situation of check scanning as our primary benchmark for graphic security features after scanning. We will first present a quick review of the most common graphic security features currently found on checks, with their specific purpose, qualities and disadvantages, and we demonstrate their poor survivability after scanning in the average scanning conditions expected under the Check 21 Act. We will then present a novel method of measurement of distances between and rotations of line elements in a scanned image: based on an appropriate print model, we refine direct measurements to an accuracy beyond the size of a scanning pixel, so we can then determine expected distances, periodicity, sharpness and print quality of known characters, symbols and other graphic elements in a document image. Finally we will apply our method to fraud detection of documents after gray-scale scanning at 300 dpi resolution.
We show in particular that alterations on legitimate checks or copies of checks can be successfully detected by measuring with sub-pixel accuracy the irregularities inherently introduced by the illegitimate process.
NASA Astrophysics Data System (ADS)
Wang, Xiaofeng; Jiang, Qin; Zhang, Lei
2016-04-01
A quality control system for the Aircraft Meteorological Data Relay (AMDAR) data has been implemented in China. This system is an extension to the AMDAR quality control system used at the US National Centers for Environmental Prediction. We present a study in which the characteristics of each AMDAR data quality type were examined and the impact of the AMDAR data quality system on short-range convective weather forecasts using the WRF model was investigated. The main results obtained from this study are as follows. (1) The hourly rejection rate of AMDAR data during 2014 was 5.79%, and most of the rejections happened in Near Duplicate Check. (2) There was a significant diurnal variation for both quantity and quality of AMDAR data. Duplicated reports increased with the increase of data quantity, while suspicious and disorderly reports decreased with the increase of data quantity. (3) The characteristics of the data quality were different in each model layer, with the quality problems occurring mainly at the surface as well as at the height where the power or the flight mode of the aircraft underwent adjustment. (4) Assimilating the AMDAR data improved the forecast accuracy, particularly over the region where strong convection occurred. (5) Significant improvements made by assimilating AMDAR data were found after six hours into the model forecast. The conclusion from this study is that the newly implemented AMDAR data quality system can help improve the accuracy of short-range convection forecasts using the WRF model.
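The near-duplicate check, where most AMDAR rejections occurred, can be sketched as a screening pass over incoming reports. The record layout and the time/position tolerances below are hypothetical, chosen only to illustrate the idea:

```python
def near_duplicate_flags(reports, dt_s=60, dxy_deg=0.01):
    """Flag reports that nearly duplicate an earlier accepted report.

    Each report is a tuple (aircraft_id, time_s, lat, lon). A report is a
    near duplicate when the same aircraft reports within dt_s seconds and
    dxy_deg degrees of an already accepted report. Thresholds are assumed
    values for illustration, not the operational QC settings.
    """
    accepted = []
    flags = []
    for rid, t, lat, lon in reports:
        dup = any(
            rid == aid and abs(t - at) <= dt_s
            and abs(lat - alat) <= dxy_deg and abs(lon - alon) <= dxy_deg
            for aid, at, alat, alon in accepted
        )
        flags.append(dup)
        if not dup:
            accepted.append((rid, t, lat, lon))
    return flags
```

A linear scan like this is quadratic in the worst case; an operational system would bucket reports by aircraft and time window.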
Simşek, Hülya; Ceyhan, Ismail; Tarhan, Gülnur; Güner, Uğur
2010-10-01
Recently, the diagnosis of pulmonary tuberculosis (TB) has been based on smear microscopy within the Direct Observed Treatment Strategy (DOTS) programme, which provides the basis of treatment worldwide. Microscopic detection of AFB (acid-fast bacilli) is one of the main components of National TB Control Programmes (NTCP). Precision in microscopy procedures and evaluations is the most important step for accurate diagnosis of the disease and for initiating proper treatment. External quality assessment (EQA) is therefore the most important tool for ensuring the reliability and validity of these tests. In countries where NTCPs are implemented, this task is fulfilled by the National Reference Laboratories (NRL) according to the guidelines of the World Health Organization (WHO). For this purpose, a pilot study was initiated by the central NRL of Turkey for EQA of AFB smear microscopy as part of the NTCP on January 1, 2005. A total of 5 laboratories participated in this study: 2 district TB laboratories (A, B), 2 tuberculosis control dispensaries (C, D), and 1 national reference laboratory (E). The blind re-checking method (re-examination of randomly selected slides) was used for the evaluation; slides selected according to LQAS (Lot Quality Assurance Sampling) guidelines were sent to the central NRL at 3-month intervals, four times a year. In the re-evaluation of the slides, false positives (FP), false negatives (FN) and quantification errors (QE) were noted. Laboratory A sent a total of 525 slides between January 1, 2005 and April 1, 2008; on re-checking, 514 (97.9%) slides were concordant and 11 (2.1%) were discordant (10 FP, 1 FN). Laboratory B participated in the study between October 1, 2005 and July 1, 2006; of the 67 re-examined slides, 60 (89.5%) were concordant and 7 (10.5%) were discordant (2 FP, 0 FN, 5 QE).
Laboratory C sent 235 slides between January 1, 2005 and April 1, 2006; of these, 218 (92.8%) were concordant and 17 (7.2%) were discordant (4 FP, 9 FN, 4 QE). Laboratory D participated in QC only once, between January 1, 2008 and April 1, 2008; all 50 slides were concordant, with no FP, FN or QE. Laboratory E was included in the study between January 1, 2005 and January 1, 2008; of the 696 re-checked slides, 690 (99.1%) were concordant and 6 (0.9%) were discordant (3 FN, 3 QE). Following EQA, on-site evaluation of the laboratories with major errors was performed, and the necessary adjustments and training were carried out. In conclusion, external quality control of AFB microscopy is crucial and essential for tuberculosis laboratory performance and for accurate and reliable results.
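The per-laboratory scores above follow directly from the slide and error counts; a small helper (names assumed for illustration) reproduces the percentages:

```python
def concordance_percent(total_slides, fp, fn, qe):
    """Percent of re-checked slides concordant with the reference reading.

    fp, fn, qe are counts of false positives, false negatives, and
    quantification errors among the re-examined slides.
    """
    discordant = fp + fn + qe
    return 100.0 * (total_slides - discordant) / total_slides
```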
Reporting the accuracy of biochemical measurements for epidemiologic and nutrition studies.
McShane, L M; Clark, L C; Combs, G F; Turnbull, B W
1991-06-01
Procedures for reporting and monitoring the accuracy of biochemical measurements are presented. They are proposed as standard reporting procedures for laboratory assays in epidemiologic and clinical-nutrition studies. The recommended procedures require identification and estimation of all major sources of variability and explanations of the laboratory quality control procedures employed. Variance-components techniques are used to model the total variability and to calculate a maximum percent error that provides an easily understandable measure of laboratory precision accounting for all sources of variability. This avoids the ambiguities encountered when reporting an SD that may take into account only a few of the potential sources of variability. Other proposed uses of the total-variability model include estimating the precision of laboratory methods under various replication schemes and developing effective quality-control checking schemes. These procedures are demonstrated with an example of the analysis of alpha-tocopherol in human plasma using high-performance liquid chromatography.
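One plausible reading of the variance-components calculation above: independent components (between-run, within-run, and so on) add in variance, and a k-SD bound on the relative error yields a single "maximum percent error" figure. The function names and the choice k = 2 are assumptions for illustration, not the paper's exact definition:

```python
from math import sqrt

def total_sd(variance_components):
    """Total SD from independent variance components (e.g. between-run,
    within-run): variances add, so the total SD is the root of their sum."""
    return sqrt(sum(variance_components))

def max_percent_error(mean_value, variance_components, k=2.0):
    """A k-SD bound on the relative measurement error, as a percentage.

    k = 2 (roughly 95% coverage for a normal error model) is an assumed
    convention here.
    """
    return 100.0 * k * total_sd(variance_components) / mean_value
```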
Is a quasi-3D dosimeter better than a 2D dosimeter for Tomotherapy delivery quality assurance?
NASA Astrophysics Data System (ADS)
Xing, Aitang; Deshpande, Shrikant; Arumugam, Sankar; George, Armia; Holloway, Lois; Vial, Philip; Goozee, Gary
2015-01-01
Delivery quality assurance (DQA) has been performed for each Tomotherapy patient using either ArcCHECK or MatriXX Evolution in our clinic since 2012. ArcCHECK is a quasi-3D dosimeter, whereas MatriXX is a 2D detector. A review of DQA results was performed for all patients over the last three years, a total of 221 DQA plans. These DQA plans came from 215 patients with a variety of treatment sites, including head and neck, pelvis, and chest wall. The acceptable Gamma pass rate in our clinic is over 95% using 3 mm and 3% of the maximum planned dose with a 10% dose threshold. The mean Gamma pass rates were 98.2% ± 1.98 (1 SD) for MatriXX and 98.5% ± 1.88 (1 SD) for ArcCHECK. A paired t-test was also performed for the group of patients whose DQA was performed with both the ArcCHECK and the MatriXX; no statistically significant difference in Gamma pass rate was found between the two. The considered 3D and 2D dosimeters thus achieve similar results when performing routine patient-specific DQA for patients treated on a TomoTherapy unit.
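The paired t-test used above compares matched DQA results from the two dosimeters; with the per-plan pass rates in hand, the statistic itself is straightforward to compute. This is a sketch, not the clinic's actual analysis code:

```python
from math import sqrt
from statistics import mean, stdev

def paired_t(a, b):
    """Paired t statistic for matched measurements (e.g. per-plan Gamma
    pass rates from ArcCHECK vs MatriXX on the same patients).

    t = mean(d) / (stdev(d) / sqrt(n)), where d are paired differences.
    """
    d = [x - y for x, y in zip(a, b)]
    return mean(d) / (stdev(d) / sqrt(len(d)))
```

The resulting t value would then be compared against a t distribution with n - 1 degrees of freedom to obtain a p-value.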
Implementation of a GPS-RO data processing system for the KIAPS-LETKF data assimilation system
NASA Astrophysics Data System (ADS)
Kwon, H.; Kang, J.-S.; Jo, Y.; Kang, J. H.
2015-03-01
The Korea Institute of Atmospheric Prediction Systems (KIAPS) has been developing a new global numerical weather prediction model and an advanced data assimilation system. As part of the KIAPS package for observation processing (KPOP) system for data assimilation, preprocessing, and quality control modules for bending-angle measurements of global positioning system radio occultation (GPS-RO) data have been implemented and examined. The GPS-RO data processing system is composed of several steps for checking observation locations, missing values, physical values for Earth radius of curvature, and geoid undulation. An observation-minus-background check is implemented by use of a one-dimensional observational bending-angle operator, and tangent point drift is also considered in the quality control process. We have tested GPS-RO observations utilized by the Korean Meteorological Administration (KMA) within KPOP, based on both the KMA global model and the National Center for Atmospheric Research Community Atmosphere Model with Spectral Element dynamical core (CAM-SE) as a model background. Background fields from the CAM-SE model are incorporated for the preparation of assimilation experiments with the KIAPS local ensemble transform Kalman filter (LETKF) data assimilation system, which has been successfully implemented to a cubed-sphere model with unstructured quadrilateral meshes. As a result of data processing, the bending-angle departure statistics between observation and background show significant improvement. Also, the first experiment in assimilating GPS-RO bending angle from KPOP within KIAPS-LETKF shows encouraging results.
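The observation-minus-background (O-B) check described above rejects observations that depart too far from the model background relative to the expected error. A minimal sketch; the factor k = 3 is a common gross-error convention assumed here, not necessarily the KPOP threshold:

```python
def omb_accept(obs, bkg, expected_err, k=3.0):
    """Accept a bending-angle observation when |O - B| <= k * expected error.

    obs is the observed value, bkg the model-equivalent value from the
    observation operator, and expected_err the combined observation and
    background error estimate. k = 3 is an assumed threshold.
    """
    return abs(obs - bkg) <= k * expected_err
```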
Check out the Atmospheric Science User Forum
Atmospheric Science Data Center
2016-11-16
Tuesday, November 15, 2016. The ASDC would like to bring your attention to the Atmospheric Science User Forum. The purpose of this forum is to improve user service, quality, and efficiency of NASA atmospheric science data. The forum intends to provide a quick and easy way to facilitate ...
Improving Quality of Shoe Soles Product using Six Sigma
NASA Astrophysics Data System (ADS)
Jesslyn Wijaya, Athalia; Trusaji, Wildan; Akbar, Muhammad; Ma’ruf, Anas; Irianto, Dradjad
2018-03-01
A manufacturer in Bandung produces various rubber-based products, e.g. trim, rice rollers, and shoe soles. After penetrating the shoe-sole market, the manufacturer encountered customers with tight quality control. Based on past data, the defect level of this product was 18.08%, costing the manufacturer time and money. A quality improvement effort was carried out using the six sigma method, which comprises the phases define, measure, analyse, improve, and control (DMAIC). In the define phase, the problem and its scope were defined; the Delphi method was also used in this phase to identify critical factors. In the measure phase, the stability of the existing process and its sigma quality level were measured. A fishbone diagram and failure mode and effects analysis (FMEA) were used in the analyse phase to find root causes and determine priority issues. The improve phase consisted of designing alternative improvement strategies using the 5W1H method. Several improvement efforts were identified, i.e. (i) modifying the design of the hanging rack, (ii) creating a Pantone colour book and check sheet, (iii) providing a pedestrian line in the compound department, (iv) buying stopwatches, and (v) modifying the shoe-sole dies. Control strategies for continuous improvement, such as SOPs and a reward-and-punishment system, were also proposed.
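A long-term defect level such as the 18.08% above can be translated into a sigma quality level using the customary 1.5-sigma shift convention of six sigma practice; a sketch (the shift is an industry convention, not a derivation):

```python
from statistics import NormalDist

def sigma_level(defect_rate, shift=1.5):
    """Short-term sigma level corresponding to a long-term defect rate.

    Uses the inverse normal CDF plus the conventional 1.5-sigma shift;
    e.g. a 3.4-per-million defect rate maps to roughly six sigma.
    """
    return NormalDist().inv_cdf(1.0 - defect_rate) + shift
```

By this convention a defect level around 18% corresponds to a sigma level of roughly 2.4, well below the six sigma ideal.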
Nobels, Frank; Debacker, Noëmi; Brotons, Carlos; Elisaf, Moses; Hermans, Michel P; Michel, Georges; Muls, Erik
2011-09-22
To investigate the effect of physician- and patient-specific feedback with benchmarking on the quality of care in adults with type 2 diabetes mellitus (T2DM). Study centres in six European countries were randomised to either a benchmarking or control group. Physicians in both groups received feedback on modifiable outcome indicators (glycated haemoglobin [HbA1c], glycaemia, total cholesterol, high density lipoprotein-cholesterol, low density lipoprotein [LDL]-cholesterol and triglycerides) for each patient at 0, 4, 8 and 12 months, based on the four times yearly control visits recommended by international guidelines. The benchmarking group also received comparative results on three critical quality indicators of vascular risk (HbA1c, LDL-cholesterol and systolic blood pressure [SBP]), checked against the results of their colleagues from the same country, and versus pre-set targets. After 12 months of follow up, the percentage of patients achieving the pre-determined targets for the three critical quality indicators will be assessed in the two groups. Recruitment was completed in December 2008 with 3994 evaluable patients. This paper discusses the study rationale and design of OPTIMISE, a randomised controlled study, that will help assess whether benchmarking is a useful clinical tool for improving outcomes in T2DM in primary care. NCT00681850.
Helical tomotherapy quality assurance with ArcCHECK.
Chapman, David; Barnett, Rob; Yartsev, Slav
2014-01-01
To design a quality assurance (QA) procedure for helical tomotherapy that measures multiple beam parameters with 1 delivery and uses a rotating gantry to simulate treatment conditions. The customized QA procedure was preprogrammed on the tomotherapy operator station. The dosimetry measurements were performed using an ArcCHECK diode array and an A1SL ion chamber inserted in the central holder. The ArcCHECK was positioned 10 cm above the isocenter so that the 21-cm diameter detector array could measure the 40-cm wide tomotherapy beam. During the implementation of the new QA procedure, separate comparative measurements were made using ion chambers in both liquid and solid water, the tomotherapy onboard detector array, and a MapCHECK diode array for a period of 10 weeks. There was good agreement (within 1.3%) for the beam output and cone ratio obtained with the new procedure and the routine QA measurements. The measured beam energy was comparable (0.3%) to the solid water measurement during the 10-week evaluation period, excluding 2 of the 10 measurements with unusually high background. The symmetry reading was similarly compromised for those 2 weeks, and on the other weeks, it deviated from the solid water reading by ~2.5%. The ArcCHECK phantom presents a suitable alternative for performing helical tomotherapy QA, provided the background is collected properly. The proposed weekly procedure using ArcCHECK and water phantom makes the QA process more efficient. Copyright © 2014 American Association of Medical Dosimetrists. Published by Elsevier Inc. All rights reserved.
Quality assurance of weather data for agricultural system model input
USDA-ARS?s Scientific Manuscript database
It is well known that crop production and hydrologic variation on watersheds are weather-related. Rarely, however, are meteorological data quality checks reported for agricultural systems model research. We present quality assurance procedures for agricultural system model weather data input. Problems...
Building validation tools for knowledge-based systems
NASA Technical Reports Server (NTRS)
Stachowitz, R. A.; Chang, C. L.; Stock, T. S.; Combs, J. B.
1987-01-01
The Expert Systems Validation Associate (EVA), a validation system under development at the Lockheed Artificial Intelligence Center for more than a year, provides a wide range of validation tools to check the correctness, consistency and completeness of a knowledge-based system. A declarative meta-language (higher-order language), is used to create a generic version of EVA to validate applications written in arbitrary expert system shells. The architecture and functionality of EVA are presented. The functionality includes Structure Check, Logic Check, Extended Structure Check (using semantic information), Extended Logic Check, Semantic Check, Omission Check, Rule Refinement, Control Check, Test Case Generation, Error Localization, and Behavior Verification.
An analysis of the ArcCHECK-MR diode array's performance for ViewRay quality assurance.
Ellefson, Steven T; Culberson, Wesley S; Bednarz, Bryan P; DeWerd, Larry A; Bayouth, John E
2017-07-01
The ArcCHECK-MR diode array utilizes a correction system with a virtual inclinometer to correct the angular response dependencies of the diodes. However, this correction system cannot be applied to measurements on the ViewRay MR-IGRT system due to the virtual inclinometer's incompatibility with the ViewRay's multiple simultaneous beams. Additionally, the ArcCHECK's current correction factors were determined without magnetic field effects taken into account. In the course of performing ViewRay IMRT quality assurance with the ArcCHECK, measurements were observed to be consistently higher than the ViewRay TPS predictions. The goals of this study were to quantify the observed discrepancies and test whether applying the current factors improves the ArcCHECK's accuracy for measurements on the ViewRay. Gamma and frequency analysis were performed on 19 ViewRay patient plans. Ion chamber measurements were performed at a subset of diode locations using a PMMA phantom with the same dimensions as the ArcCHECK. A new method for applying directionally dependent factors utilizing beam information from the ViewRay TPS was developed in order to analyze the current ArcCHECK correction factors. To test the current factors, nine ViewRay plans were altered to be delivered with only a single simultaneous beam and were measured with the ArcCHECK. The current correction factors were applied using both the new and current methods. The new method was also used to apply corrections to the original 19 ViewRay plans. It was found the ArcCHECK systematically reports doses higher than those actually delivered by the ViewRay. Application of the current correction factors by either method did not consistently improve measurement accuracy. As dose deposition and diode response have both been shown to change under the influence of a magnetic field, it can be concluded the current ArcCHECK correction factors are invalid and/or inadequate to correct measurements on the ViewRay system. 
© 2017 The Authors. Journal of Applied Clinical Medical Physics published by Wiley Periodicals, Inc. on behalf of American Association of Physicists in Medicine.
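The gamma analysis used in the study above compares a measured dose distribution against a TPS prediction by combining a dose-difference criterion with a distance-to-agreement (DTA) criterion. As an illustrative sketch only (not the ArcCHECK vendor implementation), a global 1D gamma index with hypothetical dose profiles and a 3 mm / 3% criterion could look like:

```python
import math

def gamma_1d(ref_pos, ref_dose, eval_pos, eval_dose, dta_mm=3.0, dose_tol=0.03):
    """Global 1D gamma: for each reference point, take the minimum over all
    evaluated points of the combined (distance/DTA, dose-diff/tolerance) metric."""
    d_max = max(ref_dose)  # global normalization dose
    gammas = []
    for rp, rd in zip(ref_pos, ref_dose):
        g2 = min(
            ((ep - rp) / dta_mm) ** 2 + ((ed - rd) / (dose_tol * d_max)) ** 2
            for ep, ed in zip(eval_pos, eval_dose)
        )
        gammas.append(math.sqrt(g2))
    return gammas

# hypothetical reference (TPS) and evaluated (measured) profiles
pos = [0.0, 1.0, 2.0, 3.0]            # mm
ref = [1.00, 0.98, 0.50, 0.10]        # relative dose
ev = [1.01, 0.97, 0.52, 0.11]
g = gamma_1d(pos, ref, pos, ev)
pass_rate = sum(1 for x in g if x <= 1.0) / len(g)  # fraction passing gamma <= 1
```

With all points within 3% of the global maximum at zero spatial offset, every gamma value is below 1 and the pass rate is 100%.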
Symbolic LTL Compilation for Model Checking: Extended Abstract
NASA Technical Reports Server (NTRS)
Rozier, Kristin Y.; Vardi, Moshe Y.
2007-01-01
In Linear Temporal Logic (LTL) model checking, we check LTL formulas representing desired behaviors against a formal model of the system designed to exhibit these behaviors. To accomplish this task, the LTL formulas must be translated into automata [21]. We focus on LTL compilation by investigating LTL satisfiability checking via a reduction to model checking. Having shown that symbolic LTL compilation algorithms are superior to explicit automata construction algorithms for this task [16], we concentrate here on seeking a better symbolic algorithm. We present experimental data comparing algorithmic variations such as normal forms, encoding methods, and variable ordering and examine their effects on performance metrics including processing time and scalability. Safety critical systems, such as air traffic control, life support systems, hazardous environment controls, and automotive control systems, pervade our daily lives, yet testing and simulation alone cannot adequately verify their reliability [3]. Model checking is a promising approach to formal verification for safety critical systems which involves creating a formal mathematical model of the system and translating desired safety properties into a formal specification for this model. The complement of the specification is then checked against the system model. When the model does not satisfy the specification, model-checking tools accompany this negative answer with a counterexample, which points to an inconsistency between the system and the desired behaviors and aids debugging efforts.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Spirydovich, S; Huq, M
2014-06-15
Purpose: The improvement of quality in healthcare can be assessed by Failure Mode and Effects Analysis (FMEA). In radiation oncology, FMEA, as applied to the billing CPT code 77336, can improve both charge capture and, most importantly, quality of the performed services. Methods: We created an FMEA table for the process performed under CPT code 77336. For a given process step, each member of the assembled team (physicist, dosimetrist, and therapist) independently assigned numerical values for: probability of occurrence (O, 1–10), severity (S, 1–10), and probability of detection (D, 1–10) for every failure mode cause and effect combination. The risk priority number, RPN, was then calculated as a product of O, S and D, from which an average RPN was calculated for each combination mentioned above. A fault tree diagram, with each process sorted into 6 categories, was created with linked RPNs. For processes with high RPNs, recommended actions were assigned. Two separate record-and-verify (R and V) systems (Lantis and EMR-based ARIA) were considered. Results: We identified 9 potential failure modes and 19 corresponding potential causes of these failure modes, all resulting in an unjustified 77336 charge and compromised quality of care. In Lantis, the range of RPN was 24.5–110.8, and of S values 2–10. The highest-ranking RPN of 110.8 came from the failure mode described as “end-of-treatment check not done before the completion of treatment”, and the highest S value of 10 (RPN=105) from “overrides not checked”. For the same failure modes, within the ARIA electronic environment with its additional controls, RPN values were significantly lower (44.3 for the missing end-of-treatment check and 20.0 for overrides not checked). Conclusion: Our work has shown that missed charge capture also corresponded to some services not being performed. Absence of such necessary services may result in sub-optimal quality of care rendered to patients.
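The RPN arithmetic described in the abstract above (RPN = O × S × D, averaged over independent raters) can be sketched as follows; the rater scores are hypothetical:

```python
def rpn(o, s, d):
    """Risk priority number: product of occurrence, severity and
    detectability scores, each on a 1-10 scale."""
    for v in (o, s, d):
        if not 1 <= v <= 10:
            raise ValueError("O, S and D must be in 1..10")
    return o * s * d

def average_rpn(scores):
    """Average RPN over independent raters; scores is a list of (O, S, D) tuples."""
    return sum(rpn(*t) for t in scores) / len(scores)

# three hypothetical raters scoring the "overrides not checked" failure mode
avg = average_rpn([(3, 10, 4), (2, 10, 5), (3, 10, 3)])  # -> 103.33...
```

High-RPN failure modes are then the ones prioritized for recommended actions, as in the fault tree described above.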
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tan, J; Yan, Y; Hager, F
Purpose: Radiation therapy has evolved to become not only more precise and potent, but also more complicated to monitor and deliver. More rigorous and comprehensive quality assurance is needed to safeguard ever advancing radiation therapy. ICRU standards dictate that an ever growing set of treatment parameters are manually checked weekly by medical physicists. This “weekly chart check” procedure is laborious and subject to human error. A computer-assisted chart checking process will enable more complete and accurate human review of critical parameters, reduce the risk of medical errors, and improve efficiency. Methods: We developed a web-based software system that enables thorough weekly quality assurance checks. In the backend, the software retrieves all machine parameters from a Treatment Management System (TMS) and compares them against the corresponding ones from the treatment planning system. They are also checked for validity against preset rules. The results are displayed as a web page in the front-end for physicists to review. Then a summary report is generated and uploaded automatically to the TMS as a record for weekly chart checking. Results: The software system has been deployed on a web server in our department’s intranet, and has been tested thoroughly by our clinical physicists. A plan parameter is highlighted when it is off the preset limit. The developed system has changed the way of checking charts, with significantly improved accuracy, efficiency, and completeness. It has been shown to be robust, fast, and easy to use. Conclusion: A computer-assisted system has been developed for efficient, accurate, and comprehensive weekly chart checking. The system has been extensively validated and is being implemented for routine clinical use.
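The core of the comparison step above, checking delivered machine parameters against planned values with preset tolerances, can be sketched as below. The parameter names and tolerance values are hypothetical, not those of the system described:

```python
# hypothetical parameter names and preset tolerances
TOLERANCES = {"mu": 0.5, "gantry_deg": 0.2, "field_x_cm": 0.1}

def check_chart(plan, delivered, tol=TOLERANCES):
    """Flag any delivered parameter that deviates from the planned value
    by more than its preset tolerance; returns a list of findings."""
    findings = []
    for name, limit in tol.items():
        diff = abs(delivered[name] - plan[name])
        if diff > limit:
            findings.append(
                f"{name}: planned {plan[name]}, delivered {delivered[name]} "
                f"(|diff| {diff:.2f} > {limit})"
            )
    return findings

plan = {"mu": 150.0, "gantry_deg": 180.0, "field_x_cm": 10.0}
delivered = {"mu": 150.2, "gantry_deg": 180.9, "field_x_cm": 10.0}
issues = check_chart(plan, delivered)  # only gantry_deg exceeds its tolerance
```

In the described system, a finding like this would highlight the parameter on the physicist's review page rather than block delivery.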
An efficient visualization method for analyzing biometric data
NASA Astrophysics Data System (ADS)
Rahmes, Mark; McGonagle, Mike; Yates, J. Harlan; Henning, Ronda; Hackett, Jay
2013-05-01
We introduce a novel application for biometric data analysis. This technology can be used as part of a unique and systematic approach designed to augment existing processing chains. Our system provides image quality control and analysis capabilities. We show how analysis and efficient visualization are used as part of an automated process. The goal of this system is to provide a unified platform for the analysis of biometric images that reduces manual effort and increases the likelihood of a match being brought to an examiner's attention from either a manual or lights-out application. We discuss the functionality of FeatureSCOPE™, which provides an efficient tool for feature analysis and quality control of biometric extracted features. Biometric databases must be checked for accuracy for a large volume of data attributes. Our solution accelerates review of features by a factor of up to 100 times. Review of qualitative results and cost reduction is shown by using efficient parallel visual review for quality control. Our process automatically sorts and filters features for examination, and packs these into a condensed view. An analyst can then rapidly page through screens of features and flag and annotate outliers as necessary.
An Integrated Framework for Multipollutant Air Quality Management and Its Application in Georgia
NASA Astrophysics Data System (ADS)
Cohan, Daniel S.; Boylan, James W.; Marmur, Amit; Khan, Maudood N.
2007-10-01
Air protection agencies in the United States increasingly confront non-attainment of air quality standards for multiple pollutants sharing interrelated emission origins. Traditional approaches to attainment planning face important limitations that are magnified in the multipollutant context. Recognizing those limitations, the Georgia Environmental Protection Division has adopted an integrated framework to address ozone, fine particulate matter, and regional haze in the state. Rather than applying atmospheric modeling merely as a final check of an overall strategy, photochemical sensitivity analysis is conducted upfront to compare the effectiveness of controlling various precursor emission species and source regions. Emerging software enables the modeling of health benefits and associated economic valuations resulting from air pollution control. Photochemical sensitivity and health benefits analyses, applied together with traditional cost and feasibility assessments, provide a more comprehensive characterization of the implications of various control options. The fuller characterization both informs the selection of control options and facilitates the communication of impacts to affected stakeholders and the public. Although the integrated framework represents a clear improvement over previous attainment-planning efforts, key remaining shortcomings are also discussed.
An integrated framework for multipollutant air quality management and its application in Georgia.
Cohan, Daniel S; Boylan, James W; Marmur, Amit; Khan, Maudood N
2007-10-01
Air protection agencies in the United States increasingly confront non-attainment of air quality standards for multiple pollutants sharing interrelated emission origins. Traditional approaches to attainment planning face important limitations that are magnified in the multipollutant context. Recognizing those limitations, the Georgia Environmental Protection Division has adopted an integrated framework to address ozone, fine particulate matter, and regional haze in the state. Rather than applying atmospheric modeling merely as a final check of an overall strategy, photochemical sensitivity analysis is conducted upfront to compare the effectiveness of controlling various precursor emission species and source regions. Emerging software enables the modeling of health benefits and associated economic valuations resulting from air pollution control. Photochemical sensitivity and health benefits analyses, applied together with traditional cost and feasibility assessments, provide a more comprehensive characterization of the implications of various control options. The fuller characterization both informs the selection of control options and facilitates the communication of impacts to affected stakeholders and the public. Although the integrated framework represents a clear improvement over previous attainment-planning efforts, key remaining shortcomings are also discussed.
Timmermans, Catherine; Doffagne, Erik; Venet, David; Desmet, Lieven; Legrand, Catherine; Burzykowski, Tomasz; Buyse, Marc
2016-01-01
Data quality may impact the outcome of clinical trials; hence, there is a need to implement quality control strategies for the data collected. Traditional approaches to quality control have primarily used source data verification during on-site monitoring visits, but these approaches are hugely expensive as well as ineffective. There is growing interest in central statistical monitoring (CSM) as an effective way to ensure data quality and consistency in multicenter clinical trials. CSM with SMART™ uses advanced statistical tools that help identify centers with atypical data patterns which might be the sign of an underlying quality issue. This approach was used to assess the quality and consistency of the data collected in the Stomach Cancer Adjuvant Multi-institutional Trial Group Trial, involving 1495 patients across 232 centers in Japan. In the Stomach Cancer Adjuvant Multi-institutional Trial Group Trial, very few atypical data patterns were found among the participating centers, and none of these patterns were deemed to be related to a quality issue that could significantly affect the outcome of the trial. CSM can be used to provide a check of the quality of the data from completed multicenter clinical trials before analysis, publication, and submission of the results to regulatory agencies. It can also form the basis of a risk-based monitoring strategy in ongoing multicenter trials. CSM aims at improving data quality in clinical trials while also reducing monitoring costs.
WQEP - a computer spreadsheet program to evaluate water quality data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liddle, R.G.
1996-12-31
A flexible spreadsheet Water Quality Evaluation Program (WQEP) has been developed for mining companies, consultants, and regulators to interpret the results of water quality sampling. To evaluate hydrologic data properly, unit conversions and chemical calculations are done, quality control checks are needed, and a complete and up-to-date listing of water quality standards is necessary. This process is time consuming and tends not to be done for every sample. This program speeds the process by allowing the input of up to 115 chemical parameters from one sample. WQEP compares concentrations with EPA primary and secondary drinking water MCLs or MCLGs, EPA warmwater and coldwater acute and chronic aquatic life criteria, irrigation criteria, livestock criteria, EPA human health criteria, and several other categories of criteria. The spreadsheet allows the input of State or local water standards of interest. Water quality checks include: anion/cation balance, TDS_m/TDS_c (where m = measured and c = calculated), EC_m/EC_c, EC_m/ion sums, TDS_c/EC ratio, TDS_m/EC, EC vs. alkalinity, two hardness values, and EC vs. Σ cations. WQEP computes the dissolved transport index of 23 parameters, computes ratios of 26 species for trend analysis, calculates non-carbonate alkalinity to adjust the bicarbonate concentration, and calculates 35 interpretive formulas (pE, SAR, S.I., unionized ammonia, ionized sulfide HS-, pK_x values, etc.). Fingerprinting is conducted by automatic generation of Stiff diagrams and ion histograms. Mass loading calculations, mass balance calculations, conversions of concentrations, ionic strength, and the activity coefficient and chemical activity of 33 parameters are calculated. This program allows a speedy and thorough evaluation of water quality data from metal mines, coal mining, and natural surface water systems and has been tested against hand calculations.
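The first QC check listed above, the anion/cation balance, rests on the fact that a correctly analyzed water sample is electrically neutral when concentrations are expressed in milliequivalents per liter. A minimal sketch of that check, using a small subset of major ions (the sample values are hypothetical, not WQEP output):

```python
# equivalent weights (mg per meq) for a standard subset of major ions
EQ_WT = {"Ca": 20.04, "Mg": 12.15, "Na": 22.99, "K": 39.10,   # cations
         "HCO3": 61.02, "SO4": 48.03, "Cl": 35.45}            # anions
CATIONS = {"Ca", "Mg", "Na", "K"}

def charge_balance_error(sample_mg_l):
    """Percent charge-balance error, 100*(cations - anions)/(cations + anions),
    after converting each concentration from mg/L to meq/L."""
    cat = sum(c / EQ_WT[ion] for ion, c in sample_mg_l.items() if ion in CATIONS)
    an = sum(c / EQ_WT[ion] for ion, c in sample_mg_l.items() if ion not in CATIONS)
    return 100.0 * (cat - an) / (cat + an)

sample = {"Ca": 40.0, "Mg": 12.0, "Na": 23.0, "K": 4.0,
          "HCO3": 122.0, "SO4": 48.0, "Cl": 35.0}
cbe = charge_balance_error(sample)
acceptable = abs(cbe) <= 5.0  # a common QC rule of thumb flags |CBE| > 5%
```

The other ratio checks (TDS_m/TDS_c, EC vs. ion sums, etc.) follow the same pattern: compute both quantities independently and flag the sample when the ratio leaves an expected band.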
Diagnosis checking of statistical analysis in RCTs indexed in PubMed.
Lee, Paul H; Tse, Andy C Y
2017-11-01
Statistical analysis is essential for reporting of the results of randomized controlled trials (RCTs), as well as evaluating their effectiveness. However, the validity of a statistical analysis also depends on whether the assumptions of that analysis are valid. To review all RCTs published in journals indexed in PubMed during December 2014 to provide a complete picture of how RCTs handle assumptions of statistical analysis. We reviewed all RCTs published in December 2014 that appeared in journals indexed in PubMed using the Cochrane highly sensitive search strategy. The 2014 impact factors of the journals were used as proxies for their quality. The type of statistical analysis used and whether the assumptions of the analysis were tested were reviewed. In total, 451 papers were included. Of the 278 papers that reported a crude analysis for the primary outcomes, 31 (27·2%) reported whether the outcome was normally distributed. Of the 172 papers that reported an adjusted analysis for the primary outcomes, diagnosis checking was rarely conducted, with only 20%, 8·6% and 7% checked for generalized linear model, Cox proportional hazard model and multilevel model, respectively. Study characteristics (study type, drug trial, funding sources, journal type and endorsement of CONSORT guidelines) were not associated with the reporting of diagnosis checking. The diagnosis of statistical analyses in RCTs published in PubMed-indexed journals was usually absent. Journals should provide guidelines about the reporting of a diagnosis of assumptions. © 2017 Stichting European Society for Clinical Investigation Journal Foundation.
Plan-Do-Check-Act and the Management of Institutional Research. AIR 1992 Annual Forum Paper.
ERIC Educational Resources Information Center
McLaughlin, Gerald W.; Snyder, Julie K.
This paper describes the application of a Total Quality Management strategy called Plan-Do-Check-Act (PDCA) to the projects and activities of an institutional research office at the Virginia Polytechnic Institute and State University. PDCA is a cycle designed to facilitate incremental continual improvement through change. The specific steps are…
Simple colonoscopy reporting system checking the detection rate of colon polyps.
Kim, Jae Hyun; Choi, Youn Jung; Kwon, Hye Jung; Park, Seun Ja; Park, Moo In; Moon, Won; Kim, Sung Eun
2015-08-21
To present a simple colonoscopy reporting system with which the detection rate of colon polyps can be checked easily. A simple colonoscopy reporting system, the Kosin Gastroenterology (KG) quality reporting system, was developed. The polyp detection rate (PDR), adenoma detection rate (ADR), serrated polyp detection rate (SDR), and advanced adenoma detection rate (AADR) are easily calculated using this system. In our gastroenterology center, the PDR, ADR, SDR, and AADR results for each gastroenterologist were updated every month. Between June 2014, when the program was started, and December 2014, the overall PDR and ADR in our center were 62.5% and 41.4%, respectively, and the overall SDR and AADR were 7.5% and 12.1%, respectively. We envision that the KG quality reporting system can be applied to develop a comprehensive system to check colon polyp detection rates in other gastroenterology centers.
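The detection rates above are conventionally defined as the percentage of colonoscopies in which at least one lesion of the given type is found. A minimal sketch with hypothetical monthly tallies (not the KG system's data):

```python
def detection_rate(n_positive_exams, n_exams):
    """Detection rate in percent: colonoscopies with at least one lesion
    of the given type, divided by total colonoscopies performed."""
    if n_exams == 0:
        raise ValueError("no exams recorded")
    return 100.0 * n_positive_exams / n_exams

# hypothetical monthly tallies for one endoscopist
pdr = detection_rate(75, 120)  # polyp detection rate -> 62.5
adr = detection_rate(50, 120)  # adenoma detection rate -> 41.66...
```

A reporting system like the one described would recompute these per endoscopist each month and surface them on a dashboard.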
Wanja, Elizabeth; Achilla, Rachel; Obare, Peter; Adeny, Rose; Moseti, Caroline; Otieno, Victor; Morang'a, Collins; Murigi, Ephantus; Nyamuni, John; Monthei, Derek R; Ogutu, Bernhards; Buff, Ann M
2017-05-25
One objective of the Kenya National Malaria Strategy 2009-2017 is scaling access to prompt diagnosis and effective treatment. In 2013, a quality assurance (QA) pilot was implemented to improve accuracy of malaria diagnostics at selected health facilities in low-transmission counties of Kenya. Trends in malaria diagnostic and QA indicator performance during the pilot are described. From June to December 2013, 28 QA officers provided on-the-job training and mentoring for malaria microscopy, malaria rapid diagnostic tests and laboratory QA/quality control (QC) practices over four 1-day visits at 83 health facilities. QA officers observed and recorded laboratory conditions and practices and cross-checked blood slides for malaria parasite presence, and a portion of cross-checked slides were confirmed by reference laboratories. Eighty (96%) facilities completed the pilot. Among 315 personnel at pilot initiation, 13% (n = 40) reported malaria diagnostics training within the previous 12 months. Slide positivity ranged from 3 to 7%. Compared to the reference laboratory, microscopy sensitivity ranged from 53 to 96% and positive predictive value from 39 to 53% for facility staff and from 60 to 96% and 52 to 80%, respectively, for QA officers. Compared to reference, specificity ranged from 88 to 98% and negative predictive value from 98 to 99% for health-facility personnel and from 93 to 99% and 99%, respectively, for QA officers. The kappa value ranged from 0.48-0.66 for facility staff and 0.57-0.84 for QA officers compared to reference. The only significant test performance improvement observed for facility staff was for specificity from 88% (95% CI 85-90%) to 98% (95% CI 97-99%). QA/QC practices, including use of positive-control slides, internal and external slide cross-checking and recording of QA/QC activities, all increased significantly across the pilot (p < 0.001). 
Reference material availability also increased significantly; availability of six microscopy job aids and seven microscopy standard operating procedures increased by a mean of 32 percentage points (p < 0.001) and 38 percentage points (p < 0.001), respectively. Significant gains were observed in malaria QA/QC practices over the pilot. However, these advances did not translate into improved accuracy of malaria diagnostic performance perhaps because of the limited duration of the QA pilot implementation.
Recall intervals for oral health in primary care patients.
Beirne, P; Forgie, A; Clarkson, Je; Worthington, H V
2005-04-18
The frequency with which patients should attend for a dental check-up and the potential effects on oral health of altering recall intervals between check-ups have been the subject of ongoing international debate for almost 3 decades. Although recommendations regarding optimal recall intervals vary between countries and dental healthcare systems, 6-monthly dental check-ups have traditionally been advocated by general dental practitioners in many developed countries. To determine the beneficial and harmful effects of different fixed recall intervals (for example 6 months versus 12 months) for the following different types of dental check-up: a) clinical examination only; b) clinical examination plus scale and polish; c) clinical examination plus preventive advice; d) clinical examination plus preventive advice plus scale and polish. To determine the relative beneficial and harmful effects between any of these different types of dental check-up at the same fixed recall interval. To compare the beneficial and harmful effects of recall intervals based on clinicians' assessment of patients' disease risk with fixed recall intervals. To compare the beneficial and harmful effects of no recall interval/patient driven attendance (which may be symptomatic) with fixed recall intervals. We searched the Cochrane Oral Health Group Trials Register, the Cochrane Central Register of Controlled Trials (CENTRAL), MEDLINE and EMBASE. Reference lists from relevant articles were scanned and the authors of some papers were contacted to identify further trials and obtain additional information. Date of most recent searches: 9th April 2003. 
Trials were selected if they met the following criteria: design- random allocation of participants; participants - all children and adults receiving dental check-ups in primary care settings, irrespective of their level of risk for oral disease; interventions -recall intervals for the following different types of dental check-ups: a) clinical examination only; b) clinical examination plus scale and polish; c) clinical examination plus preventive advice; d) clinical examination plus scale and polish plus preventive advice; e) no recall interval/patient driven attendance (which may be symptomatic); f) clinician risk-based recall intervals; outcomes - clinical status outcomes for dental caries (including, but not limited to, mean dmft/DMFT, dmfs/DMFS scores, caries increment, filled teeth (including replacement restorations), early carious lesions arrested or reversed); periodontal disease (including, but not limited to, plaque, calculus, gingivitis, periodontitis, change in probing depth, attachment level); oral mucosa (presence or absence of mucosal lesions, potentially malignant lesions, cancerous lesions, size and stage of cancerous lesions at diagnosis). In addition the following outcomes were considered where reported: patient-centred outcomes, economic cost outcomes, other outcomes such as improvements in oral health knowledge and attitudes, harms, changes in dietary habits and any other oral health-related behavioural change. Information regarding methods, participants, interventions, outcome measures and results were independently extracted, in duplicate, by two authors. Authors were contacted, where deemed necessary and where possible, for further details regarding study design and for data clarification. A quality assessment of the included trial was carried out. The Cochrane Oral Health Group's statistical guidelines were followed. Only one study (with 188 participants) was included in this review and was assessed as having a high risk of bias. 
This study provided limited data for dental caries outcomes (dmfs/DMFS increment) and economic cost outcomes (reported time taken to provide examinations and treatment). There is insufficient evidence from randomised controlled trials (RCTs) to draw any conclusions regarding the potential beneficial and harmful effects of altering the recall interval between dental check-ups. There is insufficient evidence to support or refute the practice of encouraging patients to attend for dental check-ups at 6-monthly intervals. It is important that high quality RCTs are conducted for the outcomes listed in this review in order to address the objectives of this review.
76 FR 50881 - Airworthiness Directives; M7 Aerospace LP Airplanes
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-17
... interconnect primary control cables, and checking and setting of flight control cable tension. This AD was prompted by a report of a failure of a rudder control cable. We are issuing this AD to correct the unsafe... paragraphs (g)(2) or (h)(1) of this AD, check (set) flight control cable tension. (i) Alternative Methods of...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rah, Jeong-Eun; Oh, Do Hoon; Shin, Dongho
Purpose: To evaluate and improve the reliability of proton quality assurance (QA) processes and to provide an optimal customized tolerance level using the statistical process control (SPC) methodology. Methods: The authors investigated the consistency check of dose per monitor unit (D/MU) and range in proton beams to see whether it was within the tolerance level of the daily QA process. This study analyzed the difference between the measured and calculated ranges along the central axis to improve the patient-specific QA process in proton beams by using process capability indices. Results: The authors established a customized tolerance level of ±2% for D/MU and ±0.5 mm for beam range in the daily proton QA process. In the authors’ analysis of the process capability indices, the patient-specific range measurements were capable of a specification limit of ±2% in clinical plans. Conclusions: SPC methodology is a useful tool for customizing the optimal QA tolerance levels and improving the quality of proton machine maintenance, treatment delivery, and ultimately patient safety.
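A process capability index of the kind used above measures how comfortably a process sits inside its specification limits. As a hedged sketch (the daily D/MU deviations below are hypothetical, and the paper does not state which index variant was used), the common Cpk index against the ±2% tolerance could be computed as:

```python
import statistics

def cpk(values, lsl, usl):
    """Process capability index Cpk = min(USL - mean, mean - LSL) / (3 * sigma),
    where LSL/USL are the lower/upper specification limits."""
    mu = statistics.mean(values)
    sigma = statistics.stdev(values)  # sample standard deviation
    return min(usl - mu, mu - lsl) / (3 * sigma)

# hypothetical daily D/MU deviations (%) against the customized +/-2% tolerance
devs = [0.1, -0.3, 0.2, 0.0, -0.1, 0.4, -0.2, 0.1]
index = cpk(devs, lsl=-2.0, usl=2.0)
capable = index >= 1.33  # a common rule of thumb for a "capable" process
```

A large Cpk (well above 1.33 here) indicates the tolerance could even be tightened; values near or below 1 indicate the process cannot reliably stay within specification.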
[Service quality in health care: the application of the results of marketing research].
Verheggen, F W; Harteloh, P P
1993-01-01
This paper deals with quality assurance in health care and its relation to quality assurance in trade and industry. We present the service quality model--a model of quality from marketing research--and discuss how it can be applied to health care. Traditional quality assurance appears to have serious flaws. It lacks a general theory of the sources of hazards in the complex process of patient care and tends to stagnate, for no real improvement takes place. Departing from this criticism, modern quality assurance in health care is marked by: defining quality in a preferential sense as "fitness for use"; the use of theories and models of trade and industry (process-control); an emphasis on analyzing the process, instead of merely inspecting it; use of the Deming problem solving technique (plan, do, check, act); improvement of the process of care by altering perceptions of parties involved. We present an experience of application and utilization of this method in the University Hospital Maastricht, The Netherlands. The successful application of this model requires a favorable corporate culture and motivation of the health care workers. This model provides a useful framework to uplift the traditional approach to quality assurance in health care.
Are greenhouse gas emissions and cognitive skills related? Cross-country evidence.
Omanbayev, Bekhzod; Salahodjaev, Raufhon; Lynn, Richard
2018-01-01
Are greenhouse gas emissions (GHG) and cognitive skills (CS) related? We attempt to answer this question by exploring this relationship, using cross-country data for 150 countries, for the period 1997-2012. After controlling for the level of economic development, quality of political regimes, population size and a number of other controls, we document that CS robustly predict GHG. In particular, when CS at a national level increase by one standard deviation, the average annual rate of air pollution changes by nearly 1.7% (slightly less than one half of a standard deviation). This significance holds for a number of robustness checks. Copyright © 2017 Elsevier Inc. All rights reserved.
MilxXplore: a web-based system to explore large imaging datasets
Bourgeat, P; Dore, V; Villemagne, V L; Rowe, C C; Salvado, O; Fripp, J
2013-01-01
Objective: As large-scale medical imaging studies are becoming more common, there is an increasing reliance on automated software to extract quantitative information from these images. As the size of the cohorts keeps increasing with large studies, there is also a need for tools that allow results from automated image processing and analysis to be presented in a way that enables fast and efficient quality checking, tagging and reporting on cases in which automatic processing failed or was problematic. Materials and methods: MilxXplore is an open source visualization platform, which provides an interface to navigate and explore imaging data in a web browser, giving the end user the opportunity to perform quality control and reporting in a user-friendly, collaborative and efficient way. Discussion: Compared to existing software solutions that often provide an overview of the results at the subject's level, MilxXplore pools the results of individual subjects and time points together, allowing easy and efficient navigation and browsing through the different acquisitions of a subject over time, and comparing the results against the rest of the population. Conclusions: MilxXplore is fast, flexible and allows remote quality checks of processed imaging data, facilitating data sharing and collaboration across multiple locations, and can be easily integrated into a cloud computing pipeline. With the growing trend of open data and open science, such a tool will become increasingly important to share and publish results of imaging analysis. PMID:23775173
Visplause: Visual Data Quality Assessment of Many Time Series Using Plausibility Checks.
Arbesser, Clemens; Spechtenhauser, Florian; Muhlbacher, Thomas; Piringer, Harald
2017-01-01
Trends like decentralized energy production lead to an exploding number of time series from sensors and other sources that need to be assessed regarding their data quality (DQ). While the identification of DQ problems for such routinely collected data is typically based on existing automated plausibility checks, an efficient inspection and validation of check results for hundreds or thousands of time series is challenging. The main contribution of this paper is the validated design of Visplause, a system to support an efficient inspection of DQ problems for many time series. The key idea of Visplause is to utilize meta-information concerning the semantics of both the time series and the plausibility checks for structuring and summarizing results of DQ checks in a flexible way. Linked views enable users to inspect anomalies in detail and to generate hypotheses about possible causes. The design of Visplause was guided by goals derived from a comprehensive task analysis with domain experts in the energy sector. We reflect on the design process by discussing design decisions at four stages and we identify lessons learned. We also report feedback from domain experts after using Visplause for a period of one month. This feedback suggests significant efficiency gains for DQ assessment, increased confidence in the DQ, and the applicability of Visplause to summarize indicators also outside the context of DQ.
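The kind of automated plausibility checks that Visplause summarizes can be sketched as simple per-series rules. The check names, thresholds, and data below are illustrative assumptions, not taken from the paper:

```python
# Hypothetical point-based plausibility checks on one time series:
# a range check and a spike check, plus the kind of aggregate summary
# a data-quality overview would display per series.

def range_check(values, lo, hi):
    """Flag samples outside the physically plausible range [lo, hi]."""
    return [not (lo <= v <= hi) for v in values]

def spike_check(values, max_step):
    """Flag samples that jump more than max_step from the previous sample."""
    flags = [False]
    for prev, cur in zip(values, values[1:]):
        flags.append(abs(cur - prev) > max_step)
    return flags

def summarize(flags):
    """Fraction of flagged samples, the aggregate shown in an overview."""
    return sum(flags) / len(flags)

series = [20.1, 20.3, 55.0, 20.2, 19.9, -40.0]  # e.g. temperature readings
r = range_check(series, lo=-30.0, hi=50.0)      # flags 55.0 and -40.0
s = spike_check(series, max_step=10.0)
print(summarize(r))  # fraction of out-of-range samples
```

An inspection tool such as Visplause would compute summaries like these for thousands of series and many checks at once, then let the analyst drill into the flagged points.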
NASA Astrophysics Data System (ADS)
Kirmani, Sheeraz; Kumar, Brijesh
2018-01-01
“Electric Power Quality (EPQ) is a term that refers to maintaining the near sinusoidal waveform of power distribution bus voltages and currents at rated magnitude and frequency”. Today customers are more aware of the seriousness of power quality issues, which prompts utilities to assure good quality of power to their customers. Power quality is fundamentally customer centric. The increased focus of utilities on maintaining reliable power supply by employing power quality improvement tools has considerably reduced power outages and blackouts. Good power quality is the characteristic of a reliable power supply. Low power factor, harmonic pollution, load imbalance, and fast voltage variations are some common parameters used to define power quality. If power quality issues go unchecked, i.e., the parameters that define power quality do not fall within predefined standards, the result is high electricity bills, high running costs in industries, malfunctioning of equipment, and challenges in connecting renewables. Capacitor banks, FACTS devices, harmonic filters, SVCs (static VAR compensators), and STATCOMs (static synchronous compensators) are solutions for achieving power quality. The performance of wind turbine generators is affected by poor-quality power, while at the same time wind power generating plants affect power quality negatively. This paper presents a STATCOM-BESS (battery energy storage system) scheme and studies its impact on power quality in a system consisting of a wind turbine generator, a nonlinear load, a hysteresis controller for controlling the operation of the STATCOM, and the grid. The model is simulated in MATLAB/Simulink. This scheme mitigates power quality issues, improves the voltage profile, and reduces harmonic distortion of the waveforms. The BESS levels out the imbalances in real power caused by the intermittent nature of wind power at varying wind speeds.
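Harmonic pollution, one of the power quality parameters named above, is commonly quantified by total harmonic distortion (THD): the RMS of the harmonic components relative to the fundamental. A minimal sketch, with illustrative amplitudes:

```python
import math

def thd(fundamental, harmonics):
    """Total harmonic distortion: RMS of the harmonic amplitudes divided
    by the fundamental amplitude (all in the same units)."""
    return math.sqrt(sum(h * h for h in harmonics)) / fundamental

# A bus voltage with a 230 V fundamental and small 5th/7th harmonics:
print(round(100 * thd(230.0, [11.5, 6.9]), 2))  # THD as a percent -> 5.83
```

A harmonic filter or STATCOM-based scheme like the one studied in the paper aims to drive this figure down toward the limits in the applicable standard.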
A School-Based Quality Improvement Program.
ERIC Educational Resources Information Center
Rappaport, Lewis A.
1993-01-01
As one Brooklyn high school discovered, quality improvement begins with administrator commitment and participants' immersion in the literature. Other key elements include ongoing training of personnel involved in the quality-improvement process, tools such as the Deming Cycle (plan-do-check-act), voluntary and goal-oriented teamwork, and a worthy…
In pursuit of quality by viable quality assurance system: the controllers' perceptions.
Aziz, Anwar
2011-01-01
Patients, families and communities expect safe, competent and compassionate nursing care, which has always been a core value of nursing. To meet these expectations, a valid and reliable quality assurance (QA) system is crucial to ensure that nurse graduates are competent, confident and fit to practice. As the QA approach is fundamental to quality improvement, it is appropriate to consider its influence on nursing education in Pakistan, where no such system currently exists to assure quality. The data are drawn from a qualitative case study conducted in 2004. A purposive sample of 71 nurses, including a group of Controllers, was interviewed on a one-to-one basis. Interviews were audio taped to reduce the risk of misinterpretation and to facilitate an exact record of what was said. A non-directive, semi-structured, open-ended questionnaire was used to collect data. Thematic analysis of verbatim transcripts of the interviews was carried out. The study findings reveal a unanimous desire among the nurses to gauge the quality of nurse education through an efficient and effective quality assurance system. A crucial need is felt to develop a viable quality assurance system to ensure an approved level of quality in nursing education, so as to deliver the right care to the right patient at the right time, every time. A continuous quality assurance and improvement (CQAI) framework based on the Deming quality cycle (Plan, Do, Check and Act) could facilitate the appropriate design and development of such a mechanism.
Frommenwiler, Débora Arruda; Kim, Jonghwan; Yook, Chang-Soo; Tran, Thi Thu Trang; Cañigueral, Salvador; Reich, Eike
2018-04-01
The quality of herbal drugs is usually controlled using several tests recommended in a monograph. HPTLC is the method of choice for identification in many pharmacopoeias. If combined with a suitable reference material for comparison, HPTLC can provide information beyond identification and thus may simplify quality control. This paper describes, as a proof of concept, how HPTLC can be applied to define specifications for an herbal reference material and to control the quality of an herbal drug according to these specifications. Based on multiple batches of cultivated Angelica gigas root, a specific HPTLC method for identification was optimized. This method can distinguish 27 related species. It also can detect the presence of mixtures of A. gigas with two other Angelica species traded as "Dang gui" and is suitable as well for quantitative assessment of samples in a test for minimum content of the sum of decursin and decursinol angelate. The new concept of "comprehensive HPTLC fingerprinting" is proposed: HPTLC fingerprints (images), which are used for identification, are converted into peak profiles and the intensities of selected zones are quantitatively compared to those of the corresponding zones of the reference material. Following a collaborative trial involving three laboratories in three countries, the method was applied to check the quality of further candidates for establishing an appropriate reference material. In conclusion, this case demonstrates that a single HPTLC analysis can provide information about identity, purity, and minimum content of markers of an herbal drug. Georg Thieme Verlag KG Stuttgart · New York.
Real-time simulation model of the HL-20 lifting body
NASA Technical Reports Server (NTRS)
Jackson, E. Bruce; Cruz, Christopher I.; Ragsdale, W. A.
1992-01-01
A proposed manned spacecraft design, designated the HL-20, has been under investigation at Langley Research Center. Included in that investigation are flight control design and flying qualities studies utilizing a man-in-the-loop real-time simulator. This report documents the current real-time simulation model of the HL-20 lifting body vehicle, known as version 2.0, presently in use at NASA Langley Research Center. Included are data on vehicle aerodynamics, inertias, geometries, guidance and control laws, and cockpit displays and controllers. In addition, trim case and dynamic check case data are provided. The intent of this document is to provide the reader with sufficient information to develop and validate an equivalent simulation of the HL-20 for use in real-time or analytical studies.
Forster, Alice S; Burgess, Caroline; McDermott, Lisa; Wright, Alison J; Dodhia, Hiten; Conner, Mark; Miller, Jane; Rudisill, Caroline; Cornelius, Victoria; Gulliford, Martin C
2014-08-30
NHS Health Checks is a new program for primary prevention of heart disease, stroke, diabetes, chronic kidney disease, and vascular dementia in adults aged 40 to 74 years in England. Individuals without existing cardiovascular disease or diabetes are invited for a Health Check every 5 years. Uptake among those invited is lower than anticipated. The project is a three-arm randomized controlled trial to test the hypothesis that enhanced invitation methods, using the Question-Behaviour Effect (QBE), will increase uptake of NHS Health Checks compared with a standard invitation. Participants comprise individuals eligible for an NHS Health Check registered in two London boroughs. Participants are randomized into one of three arms. Group A receives the standard NHS Health Check invitation letter, information sheet, and reminder letter at 12 weeks for nonattenders. Group B receives a QBE questionnaire 1 week before receiving the standard invitation, information sheet, and reminder letter where appropriate. Group C is the same as Group B, but participants are offered a £5 retail voucher if they return the questionnaire. Participants are randomized in equal proportions, stratified by general practice. The primary outcome is uptake of NHS Health Checks 6 months after invitation from electronic health records. We will estimate the incremental health service cost per additional completed Health Check for trial groups B and C versus trial arm A, as well as evaluating the impact of the QBE questionnaire, and questionnaire plus voucher, on the socioeconomic inequality in uptake of Health Checks.The trial includes a nested comparison of two methods for implementing allocation, one implemented manually at general practices and the other implemented automatically through the information systems used to generate invitations for the Health Check. 
The research will provide evidence on whether asking individuals to complete a preliminary questionnaire, by using the QBE, is effective in increasing uptake of Health Checks and whether an incentive alters questionnaire return rates as well as uptake of Health Checks. The trial interventions can be readily translated into routine service delivery if they are shown to be cost-effective. Current Controlled Trials ISRCTN42856343. Date registered: 21.03.2013.
Class Model Development Using Business Rules
NASA Astrophysics Data System (ADS)
Skersys, Tomas; Gudas, Saulius
New developments in the area of computer-aided system engineering (CASE) greatly improve processes of the information systems development life cycle (ISDLC). Much effort is put into quality improvement issues, but IS development projects still suffer from the poor quality of models during the system analysis and design cycles. To some degree, the quality of models developed using CASE tools can be assured using various automated model comparison and syntax checking procedures. It is also reasonable to check these models against business domain knowledge, but the domain knowledge stored in the repository of a CASE tool (enterprise model) is insufficient (Gudas et al. 2004). Involvement of business domain experts in these processes is complicated because non-IT people often find it difficult to understand models that were developed by IT professionals using some specific modeling language.
Audit and internal quality control in immunohistochemistry
Maxwell, P; McCluggage, W
2000-01-01
Aims—Although positive and negative controls are performed and checked in surgical pathology cases undergoing immunohistochemistry, internal quality control procedures for immunohistochemistry are not well described. This study, comprising a retrospective audit, aims to describe a method of internal quality control for immunohistochemistry. A scoring system that allows comparison between cases is described. Methods—Two positive tissue controls for each month over a three year period (1996–1998) of the 10 antibodies used most frequently were evaluated. All test cases undergoing immunohistochemistry in the months of April in this three year period were also studied. When the test case was completely negative for a given antibody, the corresponding positive tissue control from that day was examined. A marking system was devised whereby each immunohistochemical slide was assessed out of a possible score of 8 to take account of staining intensity, uniformity, specificity, background, and counterstaining. Using this scoring system, cases were classified as showing optimal (7–8), borderline (5–6), or unacceptable (0–4) staining. Results—Most positive tissue controls showed either optimal or borderline staining with the exception of neurone specific enolase (NSE), where most slides were unacceptable or borderline as a result of a combination of low intensity, poor specificity, and excessive background staining. All test cases showed either optimal or borderline staining with the exception of a single case stained for NSE, which was unacceptable. Conclusions—This retrospective audit shows that immunohistochemically stained slides can be assessed using this scoring system. With most antibodies, acceptable staining was achieved in most cases. However, there were problems with staining for NSE, which needs to be reviewed. Laboratories should use a system such as this to evaluate which antibodies regularly result in poor staining so that they can be excluded from panels. 
Routine evaluation of immunohistochemical staining should become part of everyday internal quality control procedures. Key Words: immunohistochemistry • audit • internal quality control PMID:11265178
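The 8-point slide-scoring classification described in the abstract (7-8 optimal, 5-6 borderline, 0-4 unacceptable) can be expressed directly; the function name and structure below are illustrative:

```python
# Classification of an immunohistochemistry slide score out of 8, using
# the category cut-offs reported in the audit above.

def classify_slide(score):
    """Map a 0-8 slide score to a staining-quality category."""
    if not 0 <= score <= 8:
        raise ValueError("score must be between 0 and 8")
    if score >= 7:
        return "optimal"
    if score >= 5:
        return "borderline"
    return "unacceptable"

print(classify_slide(8))  # optimal
print(classify_slide(5))  # borderline
print(classify_slide(4))  # unacceptable
```

Tallying these categories per antibody over time is exactly the kind of routine internal quality control the authors recommend.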
Membrane oxygenator heat exchanger failure detected by unique blood gas findings.
Hawkins, Justin L
2014-03-01
Failure of components integrated into the cardiopulmonary bypass circuit, although rare, can bring about catastrophic results. One of these components is the heat exchanger of the membrane oxygenator. In this compartment, unsterile water from the heater cooler device is separated from the sterile blood by stainless steel, aluminum, or polyurethane. These areas are glued or welded to keep the two compartments separate, maintaining sterility of the blood. Although quality control testing is performed by the manufacturer at the factory level, transport presents a real possibility for damage. Because of this, each manufacturer has included in the instructions for use a procedure for testing the integrity of the heat exchanger component. Water is circulated through the heat exchanger before priming and a visual check of the oxygenator bundle is made for leaks. If none are apparent, then priming of the oxygenator is performed. In this particular case, this procedure was not useful in detecting communication between the water and blood chambers of the oxygenator.
Cognitive responses to hypobaric hypoxia: implications for aviation training
Neuhaus, Christopher; Hinkelbein, Jochen
2014-01-01
The aim of this narrative review is to provide an overview on cognitive responses to hypobaric hypoxia and to show relevant implications for aviation training. A principal element of hypoxia-awareness training is the intentional evocation of hypoxia symptoms during specific training sessions within a safe and controlled environment. Repetitive training should enable pilots to learn and recognize their personal hypoxia symptoms. A time span of 3–6 years is generally considered suitable to refresh knowledge of the more subtle and early symptoms especially. Currently, there are two different technical approaches available to induce hypoxia during training: hypobaric chamber training and reduced-oxygen breathing devices. Hypoxia training for aircrew is extremely important and effective, and the hypoxia symptoms should be emphasized clearly to aircrews. The use of tight-fitting masks, leak checks, and equipment checks should be taught to all aircrew and reinforced regularly. It is noteworthy that there are major differences in the required quality and quantity of hypoxia training for both military and civilian pilots. PMID:25419162
White, Jacquie; Lucas, Joanne; Swift, Louise; Barton, Garry R; Johnson, Harriet; Irvine, Lisa; Abotsie, Gabriel; Jones, Martin; Gray, Richard J
2018-05-01
This study tested the effectiveness of a nurse-delivered health check with the Health Improvement Profile (HIP), which takes approximately 1.5 hours to complete and code, for persons with severe mental illness. A single-blind, cluster-randomized controlled trial was conducted in England to test whether health checks improved the general medical well-being of persons with severe mental illness at 12-month follow-up. Sixty nurses were randomly assigned to the HIP group or the treatment-as-usual group. From their case lists, 173 patients agreed to participate. HIP group nurses completed health checks for 38 of their 90 patients (42%) at baseline and 22 (24%) at follow-up. No significant between-group differences were noted in patients' general medical well-being at follow-up. Nurses who had volunteered for a clinical trial administered health checks only to a minority of participating patients, suggesting that it may not be feasible to undertake such lengthy structured health checks in routine practice.
Song, Wenqi; Shen, Ying; Peng, Xiaoxia; Tian, Jian; Wang, Hui; Xu, Lili; Nie, Xiaolu; Ni, Xin
2015-05-26
The program of continuous quality improvement in clinical laboratory processes for the complete blood count (CBC) was launched via the platform of the Beijing Children's Hospital Group in order to improve the quality of pediatric clinical laboratories. Fifteen children's hospitals of the Beijing Children's Hospital Group were investigated using a Chinese-adapted continuous quality improvement method with PDCA (Plan-Do-Check-Act). A questionnaire survey and inter-laboratory comparison were conducted to find existing problems, analyze their causes, set quality targets and put them into practice. Targeted training was then provided to the 15 children's hospitals, and a second questionnaire survey and self-examinations by the clinical laboratories were performed. At the same time, the Group's online internal quality control platform was established. The overall effects of the program were evaluated, laying a foundation for the next stage of PDCA. Both the quality control system documents and the CBC internal quality control schemes of all the clinical laboratories were improved through this program. In addition, standardization of performance verification also improved, with the comparability verification rate for precision and inter-laboratory results reaching 100%. In terms of instrument calibration and mandatory diagnostic rates, only three of the 15 hospitals (20%) failed to meet the standard in 2014, down from 46.67% (seven of 15) in 2013. Abnormal intraday precision coefficients of variation for the five CBC parameters (WBC, RBC, Hb, Plt and Hct) across all 15 laboratories accounted for 1.2% (2/165) of data in 2014, a marked decrease from 9.6% (14/145) in 2013. The number of hospitals using only one level of quality control material for daily quality control dropped from five to three. The 15 hospitals organized a total of 263 training sessions in 2014, up 160% from 101 in 2013.
The quality improvement program for the clinical laboratories launched via the Hospital Group platform can promote the joint development of the pediatric clinical laboratory discipline of all the member hospitals with remarkable improvement results, and the experience is recommendable for further rollout.
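The intraday precision metric tracked above, the coefficient of variation (CV), is the sample standard deviation over the mean, usually reported as a percent. A minimal sketch; the WBC values and the 3% acceptance limit are illustrative assumptions:

```python
import statistics

def cv_percent(measurements):
    """Coefficient of variation (%): sample SD over mean, the usual
    intraday-precision metric for repeated runs of one QC material."""
    return 100 * statistics.stdev(measurements) / statistics.mean(measurements)

# Repeated WBC counts (10^9/L) on one QC specimen within a single day:
wbc = [6.1, 6.0, 6.2, 6.1, 6.0]
ok = cv_percent(wbc) < 3.0  # hypothetical acceptance limit for WBC
print(round(cv_percent(wbc), 2), ok)
```

A laboratory flags the run as abnormal when the CV exceeds the acceptance limit set for that parameter, which is the kind of datum the inter-laboratory comparison above aggregates.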
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lu, SH; Tsai, YC; Lan, HT
2016-06-15
Purpose: Intensity-modulated radiotherapy (IMRT) and volumetric modulated arc therapy (VMAT) have been widely investigated for use in radiotherapy and found to have a highly conformal dose distribution. Delta4 is a novel cylindrical phantom consisting of 1069 p-type diodes that measures true treatments in the 3D target volume. The goal of this study was to compare the performance of a Delta4 diode array for IMRT and VMAT planning with an ion chamber and MapCHECK2. Methods: Fifty-four IMRT (n=9) and VMAT (n=45) plans were imported into the Philips Pinnacle Planning System 9.2 for recalculation with a solid water phantom, MapCHECK2, and the Delta4 phantom. To evaluate the difference between the measured and calculated dose, we used MapCHECK2 and Delta4 for a dose-map comparison and an ion chamber (PTW 31010 Semiflex 0.125 cc) for a point-dose comparison. Results: All 54 plans met the criterion of <3% difference for the point dose (at least two points) by ion chamber. The mean difference was 0.784% with a standard deviation of 1.962%. With criteria of 3 mm/3% in a gamma analysis, the average passing rates were 96.86%±2.19% and 98.42%±1.97% for MapCHECK2 and Delta4, respectively. The Student's t-test p-values for MapCHECK2/Delta4, ion chamber/Delta4, and ion chamber/MapCHECK2 were 0.0008, 0.2944, and 0.0002, respectively. There was no significant difference in passing rates between MapCHECK2 and Delta4 for the IMRT plans (p = 0.25). However, a higher passing rate was observed for Delta4 (98.36%) than for MapCHECK2 (96.64%, p < 0.0001) for the VMAT plans. Conclusion: The Pinnacle planning system can accurately calculate doses for VMAT and IMRT plans. The Delta4 shows similar results when compared to the ion chamber and MapCHECK2, and is an efficient tool for patient-specific quality assurance, especially for rotational therapy.
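The 3 mm/3% gamma criterion used in the study can be illustrated with a simplified one-dimensional sketch. Real QA systems such as MapCHECK2 and Delta4 evaluate gamma on 2D/3D dose grids with interpolation; the profile data and the global-dose normalization below are illustrative assumptions:

```python
import math

# Simplified 1D gamma analysis: for each measured point, find the minimum
# combined distance-to-agreement / dose-difference metric against the
# calculated profile; the point passes if that minimum gamma is <= 1.

def gamma_pass_rate(positions, measured, calculated, dta=3.0, dd=0.03):
    """Fraction of measured points with gamma <= 1 (global normalization)."""
    d_max = max(calculated)
    passed = 0
    for xm, dm in zip(positions, measured):
        gammas = (
            math.hypot((xm - xc) / dta, (dm - dc) / (dd * d_max))
            for xc, dc in zip(positions, calculated)
        )
        if min(gammas) <= 1.0:
            passed += 1
    return passed / len(measured)

x = [0.0, 1.0, 2.0, 3.0, 4.0]          # positions in mm
calc = [100.0, 98.0, 95.0, 90.0, 80.0]  # planned dose profile
meas = [101.0, 99.5, 94.0, 89.0, 86.0]  # measured; last point deliberately off
print(gamma_pass_rate(x, meas, calc))   # -> 0.8
```

The passing rates reported in the abstract (e.g. 98.42% for Delta4) are this fraction computed over all detector points of a plan.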
Wils, Julien; Fonfrède, Michèle; Augereau, Christine; Watine, Joseph
2014-01-01
Several tools are available to help evaluate the quality of clinical practice guidelines (CPG). The AGREE instrument (Appraisal of Guidelines for Research & Evaluation) is the most consensual tool, but it was designed to assess CPG methodology only. The European Federation of Laboratory Medicine (EFLM) recently designed a check-list dedicated to laboratory medicine that is intended to be comprehensive and therefore makes it possible to evaluate more thoroughly the quality of CPG in laboratory medicine. In the present work we test the comprehensiveness of this check-list on a sample of CPG written in French and published in Annales de biologie clinique (ABC). We show that some work remains to be done before a truly comprehensive check-list is available. We also show that there is room for improvement in the CPG published in ABC; for example, some of these CPG do not provide any information about allowed durations of transport and storage of biological samples before analysis, about standards of minimal analytical performance, or about the sensitivities and specificities of the recommended tests.
The SeaDataNet data products: regional temperature and salinity historical data collections
NASA Astrophysics Data System (ADS)
Simoncelli, Simona; Coatanoan, Christine; Bäck, Orjan; Sagen, Helge; Scoy, Serge; Myroshnychenko, Volodymyr; Schaap, Dick; Schlitzer, Reiner; Iona, Sissy; Fichaut, Michele
2016-04-01
Temperature and Salinity (TS) historical data collections covering the time period 1900-2013 were created for each European marginal sea (Arctic Sea, Baltic Sea, Black Sea, North Sea, North Atlantic Ocean and Mediterranean Sea) within the framework of the SeaDataNet2 (SDN) EU-Project and they are now available as ODV collections through the SeaDataNet web catalog at http://sextant.ifremer.fr/en/web/seadatanet/. Two versions have been published and they represent a snapshot of the SDN database content at two different times: V1.1 (January 2014) and V2 (March 2015). A Quality Control Strategy (QCS) has been developed and continuously refined in order to improve the quality of the SDN database content and to create the best product deriving from SDN data. The QCS was originally implemented in collaboration with the MyOcean2 and MyOcean Follow On projects in order to develop a true synergy at the regional level to serve the operational oceanography and climate change communities. The QCS involved the Regional Coordinators, responsible for the scientific assessment, the National Oceanographic Data Centers (NODC) and the data providers who, on the basis of the data quality assessment outcome, checked and, where necessary, corrected anomalies in the original data. The QCS consists of four main phases: 1) data harvesting from the central CDI; 2) file and parameter aggregation; 3) quality check analysis at the regional level; 4) analysis and correction of data anomalies. The approach is iterative to facilitate the upgrade of the SDN database content, and it also allows the versioning of data products with the release of new regional data collections at the end of each QCS loop. The SDN data collections and the QCS will be presented and the results summarized.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barbee, D; McCarthy, A; Galavis, P
Purpose: Errors found during initial physics plan checks frequently require replanning and reprinting, resulting in decreased departmental efficiency. Additionally, errors may be missed during physics checks, resulting in potential treatment errors or interruption. This work presents a process control created using the Eclipse Scripting API (ESAPI), enabling dosimetrists and physicists to detect potential errors in the Eclipse treatment planning system prior to performing any plan approvals or printing. Methods: Potential failure modes for five categories were generated based on available ESAPI (v11) patient object properties: Images, Contours, Plans, Beams, and Dose. An Eclipse script plugin (PlanCheck) was written in C# to check the errors most frequently observed clinically in each of the categories. The PlanCheck algorithms were devised to check technical aspects of plans, such as deliverability (e.g. minimum EDW MUs), in addition to ensuring that policy and procedures relating to planning were being followed. The effect on clinical workflow efficiency was measured by tracking the plan document error rate and plan revision/retirement rates in the Aria database over monthly intervals. Results: The PlanCheck script is currently capable of checking for the following numbers of potential failure modes per category: Images (6), Contours (7), Plans (8), Beams (17), and Dose (4). Prior to implementation of the PlanCheck plugin, the observed error rates in errored plan documents and revised/retired plans in the Aria database were 20% and 22%, respectively. Error rates decreased gradually over time as adoption of the script improved. Conclusion: A process control created using the Eclipse Scripting API enabled plan checks to occur within the planning system, resulting in a reduction in error rates and improved efficiency.
Future work includes initiating a full FMEA for the planning workflow, extending categories to include additional checks outside of ESAPI via Aria database queries, and eventual fully automated plan checks.
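The rule-based checking the abstract describes can be sketched in miniature. The actual PlanCheck plugin is a C# ESAPI script; the plan fields, the rule set, and the 20 MU wedge threshold below are hypothetical illustrations, not the real ESAPI object model:

```python
# Hypothetical rule engine for pre-approval plan checks: each rule is a
# (condition, failure message) pair evaluated against plan properties.

from dataclasses import dataclass

@dataclass
class Plan:
    machine: str
    wedge_mu: float            # monitor units on an EDW field, 0 if none
    has_primary_ref_point: bool

def check_plan(plan):
    """Run every rule; return the list of failure messages (empty = clean)."""
    rules = [
        (plan.machine != "", "no treatment machine assigned"),
        (plan.wedge_mu == 0 or plan.wedge_mu >= 20,
         "EDW field below minimum deliverable MU"),
        (plan.has_primary_ref_point, "missing primary reference point"),
    ]
    return [msg for ok, msg in rules if not ok]

plan = Plan(machine="TrueBeam1", wedge_mu=12.0, has_primary_ref_point=True)
print(check_plan(plan))  # ['EDW field below minimum deliverable MU']
```

Running such checks before approval or printing is what moves error detection upstream of the physics plan check, which is the efficiency gain the abstract reports.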
NASA Astrophysics Data System (ADS)
Sturtevant, C.; Hackley, S.; Lee, R.; Holling, G.; Bonarrigo, S.
2017-12-01
Quality assurance and control (QA/QC) is one of the most important yet challenging aspects of producing research-quality data. Data quality issues are multi-faceted, including sensor malfunctions, unmet theoretical assumptions, and measurement interference from humans or the natural environment. Tower networks such as Ameriflux, ICOS, and NEON continue to grow in size and sophistication, yet tools for robust, efficient, scalable QA/QC have lagged. Quality control remains a largely manual process heavily relying on visual inspection of data. In addition, notes of measurement interference are often recorded on paper without an explicit pathway to data flagging. As such, an increase in network size requires a near-proportional increase in personnel devoted to QA/QC, quickly stressing the human resources available. We present a scalable QA/QC framework in development for NEON that combines the efficiency and standardization of automated checks with the power and flexibility of human review. This framework includes fast-response monitoring of sensor health, a mobile application for electronically recording maintenance activities, traditional point-based automated quality flagging, and continuous monitoring of quality outcomes and longer-term holistic evaluations. This framework maintains the traceability of quality information along the entirety of the data generation pipeline, and explicitly links field reports of measurement interference to quality flagging. Preliminary results show that data quality can be effectively monitored and managed for a multitude of sites with a small group of QA/QC staff. Several components of this framework are open-source, including an R Shiny application for efficiently monitoring, synthesizing, and investigating data quality issues.
Evaluation plan for state gas heating system retrofit pilot programs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Berry, L.; Vineyard, T.
This report presents a detailed plan for the evaluation of state gas heating system retrofit pilot programs. The major goals of the evaluation procedures are to document the fuel savings and cost effectiveness of (1) the programs implemented by the states and (2) the four retrofit types installed. The major tasks involved in the evaluation include identification of program-eligible households, screening for data quality, assignment of eligible households to treatment or control groups, assembling cost data, collecting pre- and postretrofit consumption data, obtaining pre- and postretrofit weather data, checking for data quality, and analyzing the data. Data analysis relies on the calculation of weather-adjusted normalized annual consumption (NAC) figures for pre- and postretrofit years for treatment and control groups. The differences between the treatment and control groups' NACs for the pre- and postretrofit years are the measure of the program's impact. Cost effectiveness analysis will combine the NAC results with cost data and with a variety of assumptions concerning future fuel prices, retrofit lifetimes, and discount rates to produce benefit/cost indicators.
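The impact calculation described above (the treatment group's pre-to-post NAC change net of the control group's change) is a simple difference-in-differences. A sketch with illustrative NAC values; real NACs come from weather-adjusted regression of billing data:

```python
# Difference-in-differences on normalized annual consumption (NAC):
# savings attributable to the program are the treatment-group change
# minus the change observed in the untreated control group.

def program_impact(treat_pre, treat_post, ctrl_pre, ctrl_post):
    """Fuel savings attributable to the program (positive = fuel saved)."""
    treat_change = treat_pre - treat_post  # raw savings in treatment group
    ctrl_change = ctrl_pre - ctrl_post     # change expected without retrofit
    return treat_change - ctrl_change

# Mean NAC in therms/year for each group, pre- and post-retrofit:
print(program_impact(treat_pre=1200.0, treat_post=1050.0,
                     ctrl_pre=1180.0, ctrl_post=1150.0))  # -> 120.0
```

Netting out the control-group change removes year-to-year effects (fuel prices, occupant behavior) that weather adjustment alone does not capture.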
Kuhn, Stefan; Schlörer, Nils E
2015-08-01
nmrshiftdb2's laboratory information management system supports the integration of electronic lab administration and management into academic NMR facilities. It also offers the setup of a local database, while full access to nmrshiftdb2's World Wide Web database is granted. This freely available system allows, on the one hand, the submission of orders for measurement, transfers recorded data automatically or manually, and enables download of spectra via a web interface, as well as integrated access to the prediction, search, and assignment tools of the NMR database for lab users. On the other hand, for the staff and lab administration, the flow of all orders can be supervised; administrative tools also include user and hardware management, a statistics function for accounting purposes, and a 'QuickCheck' function for assignment control, to facilitate quality control of assignments submitted to the (local) database. The laboratory information management system and database are based on a web interface as front end and are therefore independent of the operating system in use. Copyright © 2015 John Wiley & Sons, Ltd.
Spacelab Data Processing Facility
NASA Technical Reports Server (NTRS)
1983-01-01
The Spacelab Data Processing Facility (SLDPF) processes, monitors, and accounts for the payload data from Spacelab and other Shuttle missions and forwards relevant data to various user facilities worldwide. The SLDPF is divided into the Spacelab Input Processing System (SIPS) and the Spacelab Output Processing System (SOPS). The SIPS division demultiplexes, synchronizes, time-tags, quality-checks, accounts for, and formats the data onto tapes. The SOPS division further edits, blocks, formats, and records the data on tape for shipment to users. User experiments must conform to the Spacelab's onboard High Rate Multiplexer (HRM) format for maximum processability. Audio, analog, instrumentation, high-density, experiment data, input/output data, quality control and accounting, and experimental channel tapes, along with a variety of Spacelab ancillary tapes, are provided to the user by the SLDPF.
Ground-water levels in Huron County, Michigan, January 1995 through December 1995
Sweat, M.J.
1996-01-01
In 1990, the U.S. Geological Survey (USGS) completed a study of the hydrogeology of Huron County, Michigan (Sweat, 1991). In 1993, Huron County and the USGS entered into an agreement to continue collecting water levels at selected wells throughout Huron County. As part of the agreement, the USGS has provided training and instrumentation for County personnel to measure, on a quarterly basis, the depth to water below the land surface in selected wells. The agreement includes the operation of continuous water-level recorders installed on four wells in Bingham, Fairhaven, Grant and Lake Townships (fig. 1). County personnel make quarterly water-level measurements of 22 other wells. Once each year, County personnel are accompanied by USGS personnel who provide a quality assurance/quality control check of all measurements being made.
SU-F-T-165: Daily QA Analysis for Spot Scanning Beamline
DOE Office of Scientific and Technical Information (OSTI.GOV)
Poenisch, F; Gillin, M; Sahoo, N
2016-06-15
Purpose: The dosimetric results of our daily quality assurance over the last 8 years for discrete pencil beam scanning proton therapy will be presented. Methods: To perform the dosimetric checks, a multi-ion chamber detector is used, which consists of an array of 5 single parallel plate ion chambers that are aligned as a cross separated by 10cm each. The Tracker is snapped into a jig, which is placed on the tabletop. Different amounts of Solid Water buildup are added to shift the dose distribution. The dosimetric checks consist of 3 parts: position check, range check and volume dose check. Results: The average deviation of all position-check data was 0.2±1.3%. For the range check, the average deviation was 0.1%±1.2%, which also corresponds to a range stability of better than 1 mm over all measurements. The volumetric dose output readings were all within ±1% with the exception of 2 occasions when the cable to the dose monitor was being repaired. Conclusion: Morning QA using the Tracker device gives very stable dosimetric readings but is also sensitive to mechanical and output changes in the proton therapy delivery system.
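The pass/fail logic behind such daily constancy checks can be sketched as below. This is an illustrative example, not the clinic's actual QA software, and the numbers are hypothetical.

```python
# Illustrative daily-QA constancy check: the deviation of a measured
# reading from its baseline must stay within a tolerance band.
def qa_check(measured, baseline, tolerance_pct):
    """Return (deviation_pct, passed) for one daily constancy reading."""
    deviation_pct = 100.0 * (measured - baseline) / baseline
    return deviation_pct, abs(deviation_pct) <= tolerance_pct

# Hypothetical output reading 0.8% above baseline, checked against a
# ±1% tolerance such as the one cited in the abstract.
dev, ok = qa_check(measured=1.008, baseline=1.000, tolerance_pct=1.0)
```

The same comparison applies to each of the three check types (position, range, volume dose), with tolerances set per quantity.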
Wheat Quality Council, Hard Spring Wheat Technical Committee, 2015 Crop
USDA-ARS?s Scientific Manuscript database
Nine experimental lines of hard spring wheat were grown at up to five locations in 2015 and evaluated for kernel, milling, and bread baking quality against the check variety Glenn. Wheat samples were submitted through the Wheat Quality Council and processed and milled at the USDA-ARS Hard Red Sprin...
Wheat Quality Council, Hard Spring Wheat Technical Committee, 2017 Crop
USDA-ARS?s Scientific Manuscript database
Nine experimental lines of hard spring wheat were grown at up to six locations in 2017 and evaluated for kernel, milling, and bread baking quality against the check variety Glenn. Wheat samples were submitted through the Wheat Quality Council and processed and milled at the USDA-ARS Hard Red Spring...
Wheat Quality Council, Hard Spring Wheat Technical Committee, 2014 Crop
USDA-ARS?s Scientific Manuscript database
Eleven experimental lines of hard spring wheat were grown at up to five locations in 2014 and evaluated for kernel, milling, and bread baking quality against the check variety Glenn. Wheat samples were submitted through the Wheat Quality Council and processed and milled at the USDA-ARS Hard Red Spr...
Qualities of Early Childhood Teachers: Reflections from Teachers and Administrators.
ERIC Educational Resources Information Center
Weitman, Catheryn J.; Humphries, Janie H.
Data were collected from elementary school principals and kindergarten teachers in Texas and Louisiana in an effort to identify qualities that are thought to be important for kindergarten teachers. A questionnaire listing 462 qualities of early childhood teachers was compiled from literature reviews. Subjects were asked to check a maximum of 50…
Affect adjective check list assessment of mood variations in air traffic controllers.
DOT National Transportation Integrated Search
1971-04-01
Three groups of subjects completed Composite Mood Adjective Check Lists (CMACL) before and after selected shifts at two air traffic control (ATC) facilities as part of a multi-discipline study of stress in ATC work. : At one facility, a high traffic ...
Prediction Interval Development for Wind-Tunnel Balance Check-Loading
NASA Technical Reports Server (NTRS)
Landman, Drew; Toro, Kenneth G.; Commo, Sean A.; Lynn, Keith C.
2014-01-01
Results from the Facility Analysis Verification and Operational Reliability project revealed a critical gap in capability in ground-based aeronautics research applications. Without a standardized process for check-loading the wind-tunnel balance or the model system, the quality of the aerodynamic force data collected varied significantly between facilities. A prediction interval is required in order to confirm a check-loading. The prediction interval provides an expected upper and lower bound on balance load prediction at a given confidence level. A method has been developed which accounts for sources of variability due to calibration and check-load application. The prediction interval method of calculation and a case study demonstrating its use are provided. Validation of the methods is demonstrated for the case study based on the probability of capture of confirmation points.
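A simplified version of a prediction interval for one future check-load reading might look like the following. This is a sketch using a normal-quantile approximation based on calibration residuals; the paper's actual method additionally propagates calibration and load-application variability and is not reproduced here.

```python
import math
import statistics

# Simplified prediction interval for a single new observation:
# predicted +/- z * s * sqrt(1 + 1/n), with s the residual standard
# deviation from n calibration residuals and z an approximate quantile.
def prediction_interval(residuals, predicted, z=1.96):
    n = len(residuals)
    s = statistics.stdev(residuals)          # residual standard deviation
    half_width = z * s * math.sqrt(1.0 + 1.0 / n)
    return predicted - half_width, predicted + half_width

# Hypothetical residuals (load units) and a predicted check load of 100.
lo, hi = prediction_interval([-0.2, 0.1, 0.3, -0.1, -0.05], predicted=100.0)
```

A confirmation point is "captured" if its measured value falls inside (lo, hi); the capture rate over many points is the validation statistic mentioned above.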
Ultrasound use during cardiopulmonary resuscitation is associated with delays in chest compressions.
Huis In 't Veld, Maite A; Allison, Michael G; Bostick, David S; Fisher, Kiondra R; Goloubeva, Olga G; Witting, Michael D; Winters, Michael E
2017-10-01
High-quality chest compressions are a critical component of the resuscitation of patients in cardiopulmonary arrest. Point-of-care ultrasound (POCUS) is used frequently during emergency department (ED) resuscitations, but there has been limited research assessing its benefits and harms during the delivery of cardiopulmonary resuscitation (CPR). We hypothesized that use of POCUS during cardiac arrest resuscitation adversely affects high-quality CPR by lengthening the duration of pulse checks beyond the current cardiopulmonary resuscitation guidelines recommendation of 10s. We conducted a prospective cohort study of adults in cardiac arrest treated in an urban ED between August 2015 and September 2016. Resuscitations were recorded using video equipment in designated resuscitation rooms, and the use of POCUS was documented and timed. A linear mixed-effects model was used to estimate the effect of POCUS on pulse check duration. Twenty-three patients were enrolled in our study. The mean duration of pulse checks with POCUS was 21.0s (95% CI, 18-24) compared with 13.0s (95% CI, 12-15) for those without POCUS. POCUS increased the duration of pulse checks and CPR interruption by 8.4s (95% CI, 6.7-10.0 [p<0.0001]). Age, body mass index (BMI), and procedures did not significantly affect the duration of pulse checks. The use of POCUS during cardiac arrest resuscitation was associated with significantly increased duration of pulse checks, nearly doubling the 10-s maximum duration recommended in current guidelines. It is important for acute care providers to pay close attention to the duration of interruptions in the delivery of chest compressions when using POCUS during cardiac arrest resuscitation. Copyright © 2017 Elsevier B.V. All rights reserved.
The impact of gender on the assessment of body checking behavior.
Alfano, Lauren; Hildebrandt, Tom; Bannon, Katie; Walker, Catherine; Walton, Kate E
2011-01-01
Body checking includes any behavior aimed at global or specific evaluations of appearance characteristics. Men and women are believed to express these behaviors differently, possibly reflecting different socialization. However, there has been no empirical test of the impact of gender on body checking. A total of 1024 male and female college students completed two measures of body checking, the Body Checking Questionnaire and the Male Body Checking Questionnaire. Using multiple group confirmatory factor analysis, differential item functioning (DIF) was explored in a composite of these measures. Two global latent factors were identified (female and male body checking severity), and there were expected gender differences in these factors even after controlling for DIF. Ten items were found to be unbiased by gender and provide a suitable brief measure of body checking for mixed gender research. Practical applications for body checking assessment and theoretical implications are discussed. Copyright © 2010 Elsevier Ltd. All rights reserved.
De Boer, Jan L M; Ritsema, Rob; Piso, Sjoerd; Van Staden, Hans; Van Den Beld, Wilbert
2004-07-01
Two screening methods were developed for rapid analysis of a great number of urine and blood samples within the framework of an exposure check of the population after a firework explosion. A total of 56 elements was measured including major elements. Sample preparation consisted of simple dilution. Extensive quality controls were applied including element addition and the use of certified reference materials. Relevant results at levels similar to those found in the literature were obtained for Co, Ni, Cu, Zn, Sr, Cd, Sn, Sb, Ba, Tl, and Pb in urine and for the same elements except Ni, Sn, Sb, and Ba in blood. However, quadrupole ICP-MS has limitations, mainly related to spectral interferences, for the analysis of urine and blood, and these cause higher detection limits. The general aspects discussed in the paper give it wider applicability than just for analysis of blood and urine; it can, for example, be used in environmental analysis.
Geometric facial comparisons in speed-check photographs.
Buck, Ursula; Naether, Silvio; Kreutz, Kerstin; Thali, Michael
2011-11-01
In many cases, it is not possible to hold motorists to account for considerably exceeding the speed limit, because they deny being the driver in the speed-check photograph. An anthropological comparison of facial features using a photo-to-photo comparison can be very difficult depending on the quality of the photographs. One difficulty of that analysis method is that the comparison photographs of the presumed driver are taken with a different camera or camera lens and from a different angle than the speed-check photo. To take a comparison photograph with exactly the same camera setup is almost impossible. Therefore, only an imprecise comparison of the individual facial features is possible. The geometry and position of each facial feature, for example the distances between the eyes or the positions of the ears, etc., cannot be taken into consideration. We applied a new method using 3D laser scanning, optical surface digitalization, and photogrammetric calculation of the speed-check photo, which enables a geometric comparison. Thus, the influence of the focal length and the distortion of the objective lens are eliminated and the precise position and the viewing direction of the speed-check camera are calculated. Even in cases of low-quality images or when the face of the driver is partly hidden, good results are delivered using this method. This new method, Geometric Comparison, is evaluated and validated in a dedicated study, which is described in this article.
Methods for Geometric Data Validation of 3d City Models
NASA Astrophysics Data System (ADS)
Wagner, D.; Alam, N.; Wewetzer, M.; Pries, M.; Coors, V.
2015-12-01
Geometric quality of 3D city models is crucial for data analysis and simulation tasks, which are part of modern applications of the data (e.g. potential heating energy consumption of city quarters, solar potential, etc.). Geometric quality in these contexts is, however, a different concept than for 2D maps. In the latter case, aspects such as positional or temporal accuracy and correctness represent typical quality metrics of the data. They are defined in ISO 19157 and should be mentioned as part of the metadata. 3D data has a far wider range of aspects which influence its quality, and the idea of quality itself is application-dependent. Thus, concepts for definition of quality are needed, including methods to validate these definitions. Quality in this sense means internal validation and detection of inconsistent or wrong geometry according to a predefined set of rules. A useful starting point would be to have correct geometry in accordance with ISO 19107. A valid solid should consist of planar faces which touch their neighbours exclusively in defined corner points and edges. No gaps between them are allowed, and the whole feature must be 2-manifold. In this paper, we present methods to validate common geometric requirements for building geometry. Different checks based on several algorithms have been implemented to validate a set of rules derived from the solid definition mentioned above (e.g. water tightness of the solid or planarity of its polygons), as they were developed for the software tool CityDoctor. The method of each check is specified, with a special focus on the discussion of tolerance values where they are necessary. The checks include polygon level checks to validate the correctness of each polygon, i.e. closure of the bounding linear ring and planarity. 
On the solid level, which is only validated if the polygons have passed validation, correct polygon orientation is checked, after self-intersections outside of defined corner points and edges are detected, among additional criteria. Self-intersection might lead to different results, e.g. intersection points, lines or areas. Depending on the geometric constellation, they might represent gaps between bounding polygons of the solids, overlaps, or violations of the 2-manifoldness. Not least due to the floating point problem in digital numbers, tolerances must be considered in some algorithms, e.g. planarity and solid self-intersection. Effects of different tolerance values and their handling is discussed; recommendations for suitable values are given. The goal of the paper is to give a clear understanding of geometric validation in the context of 3D city models. This should also enable the data holder to get a better comprehension of the validation results and their consequences on the deployment fields of the validated data set.
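One of the polygon-level checks described, planarity within a tolerance, can be sketched as follows. This is an illustrative algorithm, not CityDoctor's implementation; it assumes the first three vertices are non-collinear.

```python
# Illustrative polygon planarity check with tolerance: every vertex must
# lie within `tol` of the plane spanned by the first three vertices.
def cross(u, v):
    return (u[1]*v[2] - u[2]*v[1],
            u[2]*v[0] - u[0]*v[2],
            u[0]*v[1] - u[1]*v[0])

def is_planar(points, tol=1e-6):
    p0, p1, p2 = points[0], points[1], points[2]
    u = tuple(b - a for a, b in zip(p0, p1))
    v = tuple(b - a for a, b in zip(p0, p2))
    n = cross(u, v)                                  # plane normal
    norm = sum(c * c for c in n) ** 0.5
    n = tuple(c / norm for c in n)                   # unit normal
    for p in points[3:]:
        w = tuple(b - a for a, b in zip(p0, p))
        dist = abs(sum(a * b for a, b in zip(n, w)))  # point-plane distance
        if dist > tol:
            return False
    return True

square = [(0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0)]       # planar
bent   = [(0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0.01)]    # out of plane
```

The tolerance `tol` plays exactly the role discussed in the paper: too tight and floating-point noise produces false failures, too loose and genuinely warped polygons pass.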
Effects of a growth check on daily age estimates of age-0 alligator gar
Snow, Richard A.; Long, James M.
2016-01-01
Accurate age and growth information is essential for a complete knowledge of life history, growth rates, age at sexual maturity, and average life span in fishes. Alligator gar are becoming increasingly managed throughout their range and because this species spawns in backwater flooded areas, their offspring are prone to stranding in areas with limited prey, potentially affecting their growth. Because fish growth is tightly linked with otolith growth and annulus formation, the ability to discern marks not indicative of annuli (age checks) in alligator gar would give managers some insight when estimating ages. Previous studies have suggested that checks are often present prior to the first annulus in otoliths of alligator gar, affecting age estimates. We investigated check formation in otoliths of alligator gar in relation to growth and food availability. Sixteen age-0 alligator gar were marked with oxytetracycline (OTC) to give a reference point and divided equitably into two groups: a control group with abundant prey and an experimental group with limited prey. The experimental group was given 2 g of food per week for 20 days and then given the same prey availability as the control group for the next 20 days. After 40 days, the gar were measured, sacrificed, and their sagittae removed to determine if checks were present. Checks were visible on 14 of the 16 otoliths in the experimental group, associated with low growth during the first 20 days when prey was limited and accelerated growth after prey availability was increased. No checks were observed on otoliths of the control group, where growth and prey availability were consistent. Age estimates of fish in the control group were more accurate than those in the experimental group, showing that fish growth as a function of prey availability likely induced the checks by compressing daily ring formation.
TU-D-201-06: HDR Plan Prechecks Using Eclipse Scripting API
DOE Office of Scientific and Technical Information (OSTI.GOV)
Palaniswaamy, G; Morrow, A; Kim, S
Purpose: Automate brachytherapy treatment plan quality check using Eclipse v13.6 scripting API based on pre-configured rules to minimize human error and maximize efficiency. Methods: The HDR Precheck system is developed based on a rules-driven approach using Eclipse scripting API. This system checks for critical plan parameters like channel length, first source position, source step size and channel mapping. The planned treatment time is verified independently based on analytical methods. For interstitial or SAVI APBI treatment plans, a Patterson-Parker system calculation is performed to verify the planned treatment time. For endobronchial treatments, an analytical formula from TG-59 is used. Acceptable tolerances were defined based on clinical experiences in our department. The system was designed to show PASS/FAIL status levels. Additional information, if necessary, is indicated appropriately in a separate comments field in the user interface. Results: The HDR Precheck system has been developed and tested to verify the treatment plan parameters that are routinely checked by the clinical physicist. The report also serves as a reminder or checklist for the planner to perform any additional critical checks such as applicator digitization or scenarios where the channel mapping was intentionally changed. It is expected to reduce the current manual plan check time from 15 minutes to <1 minute. Conclusion: Automating brachytherapy plan prechecks significantly reduces treatment plan precheck time and reduces human errors. When fully developed, this system will be able to perform TG-43 based second check of the treatment planning system’s dose calculation using random points in the target and critical structures. A histogram will be generated along with tabulated mean and standard deviation values for each structure. 
A knowledge database will also be developed for Brachyvision plans which will then be used for knowledge-based plan quality checks to further reduce treatment planning errors and increase confidence in the planned treatment.
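A rules-driven precheck of the kind described can be sketched as below. The parameter names and tolerance values are illustrative assumptions, not clinical settings or the system's actual rules.

```python
# Hypothetical rules-driven plan precheck: each rule compares a planned
# value against an expected value within a tolerance, reporting PASS/FAIL.
def run_prechecks(plan, rules):
    report = {}
    for name, (expected, tol) in rules.items():
        ok = abs(plan[name] - expected) <= tol
        report[name] = "PASS" if ok else "FAIL"
    return report

# Illustrative rule set and plan values (units: mm).
rules = {
    "channel_length_mm": (1300.0, 1.0),
    "first_source_position_mm": (1295.0, 1.0),
    "step_size_mm": (5.0, 0.01),
}
plan = {
    "channel_length_mm": 1300.2,
    "first_source_position_mm": 1297.5,   # outside tolerance -> FAIL
    "step_size_mm": 5.0,
}
report = run_prechecks(plan, rules)
```

Keeping the rules in a data structure rather than hard-coded logic is what makes such a system "pre-configured": tolerances can be revised without touching the checking code.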
Focant, Jean-François; Eppe, Gauthier; Massart, Anne-Cécile; Scholl, Georges; Pirard, Catherine; De Pauw, Edwin
2006-10-13
We report on the use of a state-of-the-art method for the measurement of selected polychlorinated dibenzo-p-dioxins, polychlorinated dibenzofurans and polychlorinated biphenyls in human serum specimens. The sample preparation procedure is based on manual small size solid-phase extraction (SPE) followed by automated clean-up and fractionation using multi-sorbent liquid chromatography columns. SPE cartridges and all clean-up columns are disposable. Samples are processed in batches of 20 units, including one blank control (BC) sample and one quality control (QC) sample. The analytical measurement is performed using gas chromatography coupled to isotope dilution high-resolution mass spectrometry. The sample throughput corresponds to one series of 20 samples per day, from sample reception to data quality cross-check and reporting, once the procedure has been started and successive series of samples are being produced. Four analysts are required to ensure proper performance of the procedure. The entire procedure has been validated under International Organization for Standardization (ISO) 17025 criteria and further tested over more than 1500 unknown samples during various epidemiological studies. The method is further discussed in terms of reproducibility, efficiency and long-term stability regarding the 35 target analytes. Data related to quality control and limit of quantification (LOQ) calculations are also presented and discussed.
Gunetti, Monica; Castiglia, Sara; Rustichelli, Deborah; Mareschi, Katia; Sanavio, Fiorella; Muraro, Michela; Signorino, Elena; Castello, Laura; Ferrero, Ivana; Fagioli, Franca
2012-05-31
The quality and safety of advanced therapy products must be maintained throughout their production and quality control cycle to ensure their final use in patients. We validated the cell count method according to the International Conference on Harmonization of Technical Requirements for Registration of Pharmaceuticals for Human Use and European Pharmacopoeia, considering the tests' accuracy, precision, repeatability, linearity and range. As the cell count is a potency test, we checked accuracy, precision, and linearity, according to ICH Q2. Briefly our experimental approach was first to evaluate the accuracy of Fast Read 102® compared to the Bürker chamber. Once the accuracy of the alternative method was demonstrated, we checked the precision and linearity test only using Fast Read 102®. The data were statistically analyzed by average, standard deviation and coefficient of variation percentages inter and intra operator. All the tests performed met the established acceptance criteria of a coefficient of variation of less than ten percent. For the cell count, the precision reached by each operator had a coefficient of variation of less than ten percent (total cells) and under five percent (viable cells). The best range of dilution, to obtain a slope line value very similar to 1, was between 1:8 and 1:128. Our data demonstrated that the Fast Read 102® count method is accurate, precise and ensures the linearity of the results obtained in a range of cell dilution. Under our standard method procedures, this assay may thus be considered a good quality control method for the cell count as a batch release quality control test. Moreover, the Fast Read 102® chamber is a plastic, disposable device that allows a number of samples to be counted in the same chamber. Last but not least, it overcomes the problem of chamber washing after use and so allows a cell count in a clean environment such as that in a Cell Factory. 
In a good manufacturing practice setting, the disposable cell counting devices allow a single use of the count chamber; they can then be thrown away, thus avoiding the waste disposal of vital dye (e.g. Trypan Blue) or lysing solution (e.g. Tuerk solution).
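The coefficient-of-variation acceptance criterion described above can be sketched as follows. The replicate counts are invented for illustration; the thresholds (<10% for total cells, <5% for viable cells) are the ones stated in the abstract.

```python
import statistics

# Precision acceptance check: the coefficient of variation (CV%) across
# replicate counts of the same sample must fall below a threshold.
def cv_percent(counts):
    return 100.0 * statistics.stdev(counts) / statistics.mean(counts)

# Hypothetical replicate total-cell counts from one operator.
replicates = [98, 102, 101, 99, 100]
passed = cv_percent(replicates) < 10.0   # <10% criterion for total cells
```

The same calculation with a 5% threshold would apply to the viable-cell counts.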
Interventions for raising breast cancer awareness in women.
O'Mahony, Máirín; Comber, Harry; Fitzgerald, Tony; Corrigan, Mark A; Fitzgerald, Eileen; Grunfeld, Elizabeth A; Flynn, Maura G; Hegarty, Josephine
2017-02-10
Breast cancer continues to be the most commonly diagnosed cancer in women globally. Early detection, diagnosis and treatment of breast cancer are key to better outcomes. Since many women will discover a breast cancer symptom themselves, it is important that they are breast cancer aware i.e. have the knowledge, skills and confidence to detect breast changes and present promptly to a healthcare professional. To assess the effectiveness of interventions for raising breast cancer awareness in women. We searched the Cochrane Breast Cancer Group's Specialised Register (searched 25 January 2016), Cochrane Central Register of Controlled Trials (CENTRAL; 2015, Issue 12) in the Cochrane Library (searched 27 January 2016), MEDLINE OvidSP (2008 to 27 January 2016), Embase (Embase.com, 2008 to 27 January 2016), the World Health Organization's International Clinical Trials Registry Platform (ICTRP) search portal and ClinicalTrials.gov (searched 27 February 2016). We also searched the reference lists of identified articles and reviews and the grey literature for conference proceedings and published abstracts. No language restriction was applied. Randomised controlled trials (RCTs) focusing on interventions for raising women's breast cancer awareness i.e. knowledge of potential breast cancer symptoms/changes and the confidence to look at and feel their breasts, using any means of delivery, i.e. one-to-one/group/mass media campaign(s). Two authors selected studies, independently extracted data and assessed risk of bias. We reported the odds ratio (OR) and 95% confidence intervals (CIs) for dichotomous outcomes and mean difference (MD) and standard deviation (SD) for continuous outcomes. Since it was not possible to combine data from included studies due to their heterogeneity, we present a narrative synthesis. We assessed the quality of evidence using GRADE methods. 
We included two RCTs involving 997 women: one RCT (867 women) randomised women to receive either a written booklet and usual care (intervention group 1), a written booklet and usual care plus a verbal interaction with a radiographer or research psychologist (intervention group 2) or usual care (control group); and the second RCT (130 women) randomised women to either an educational programme (three sessions of 60 to 90 minutes) or no intervention (control group). Knowledge of breast cancer symptoms: In the first study, knowledge of non-lump symptoms increased in intervention group 1 compared to the control group at two years postintervention, but not significantly (OR 1.1, 95% CI 0.7 to 1.6; P = 0.66; 449 women; moderate-quality evidence). Similarly, at two years postintervention, knowledge of symptoms increased in intervention group 2 compared to the control group but not significantly (OR 1.4, 95% CI 0.9 to 2.1; P = 0.11; 434 women; moderate-quality evidence). In the second study, women's awareness of breast cancer symptoms had increased one month postintervention in the educational group (MD 3.45, SD 5.11; 65 women; low-quality evidence) compared to the control group (MD -0.68, SD 5.93; 65 women; P < 0.001), where there was a decrease in awareness. Knowledge of age-related risk: In the first study, women's knowledge of age-related risk of breast cancer increased, but not significantly, in intervention group 1 compared to control at two years postintervention (OR 1.8; 95% CI 0.9 to 3.5; P = 0.08; 447 women; moderate-quality evidence). Women's knowledge of risk increased significantly in intervention group 2 compared to control at two years postintervention (OR 4.8, 95% CI 2.6 to 9.0; P < 0.001; 431 women; moderate-quality evidence). 
In the second study, women's perceived susceptibility (how at risk they considered themselves) to breast cancer had increased significantly one month postintervention in the educational group (MD 1.31, SD 3.57; 65 women; low-quality evidence) compared to the control group (MD -0.55, SD 3.31; 65 women; P = 0.005), where a decrease in perceived susceptibility was noted. Frequency of breast checking: In the first study, no significant change was noted for intervention group 1 compared to control at two years postintervention (OR 1.1, 95% CI 0.8 to 1.6; P = 0.54; 457 women; moderate-quality evidence). Monthly breast checking increased, but not significantly, in intervention group 2 compared to control at two years postintervention (OR 1.3, 95% CI 0.9 to 1.9; P = 0.14; 445 women; moderate-quality evidence). In the second study, women's breast cancer preventive behaviours increased significantly one month postintervention in the educational group (MD 1.21, SD 2.54; 65 women; low-quality evidence) compared to the control group (MD 0.15, SD 2.94; 65 women; P = 0.045). Breast cancer awareness: Women's overall breast cancer awareness did not change in intervention group 1 compared to control at two years postintervention (OR 1.8, 95% CI 0.6 to 5.3; P = 0.32; 435 women; moderate-quality evidence), while overall awareness increased in intervention group 2 compared to control at two years postintervention (OR 8.1, 95% CI 2.7 to 25.0; P < 0.001; 420 women; moderate-quality evidence). 
In the second study, there was a significant increase in scores on the Health Belief Model (that included the constructs of awareness and perceived susceptibility) at one month postintervention in the educational group (mean 1.21, SD 2.54; 65 women) compared to the control group (mean 0.15, SD 2.94; 65 women; P = 0.045). Neither study reported outcomes relating to motivation to check their breasts, confidence to seek help, time from breast symptom discovery to presentation to a healthcare professional, intentions to seek help, quality of life, adverse effects of the interventions, stages of breast cancer, survival estimates or breast cancer mortality rates. Based on the results of two RCTs, a brief intervention has the potential to increase women's breast cancer awareness. However, findings of this review should be interpreted with caution, as GRADE assessment identified moderate-quality evidence in only one of the two studies reviewed. In addition, the included trials were heterogeneous in terms of the interventions, population studied and outcomes measured. Therefore, current evidence cannot be generalised to the wider context. Further studies including larger samples, validated outcome measures and longitudinal approaches are warranted.
Edited Synoptic Cloud Reports from Ships and Land Stations Over the Globe, 1982-1991 (NDP-026B)
Hahn, Carole J. [University of Arizona; Warren, Stephen G. [University of Washington; London, Julius [University of Colorado
1996-01-01
Surface synoptic weather reports for the entire globe for the 10-year period from December 1981 through November 1991 have been processed, edited, and rewritten to provide a data set designed for use in cloud analyses. The information in these reports relating to clouds, including the present weather information, was extracted and put through a series of quality control checks. Reports not meeting certain quality control standards were rejected, as were reports from buoys and automatic weather stations. Correctable inconsistencies within reports were edited for consistency, so that the "edited cloud report" can be used for cloud analysis without further quality checking. Cases of "sky obscured" were interpreted by reference to the present weather code as to whether they indicated fog, rain or snow and were given appropriate cloud type designations. Nimbostratus clouds, which are not specifically coded for in the standard synoptic code, were also given a special designation. Changes made to an original report are indicated in the edited report so that the original report can be reconstructed if desired. While low cloud amount is normally given directly in the synoptic report, the edited cloud report also includes the amounts, either directly reported or inferred, of middle and high clouds, both the non-overlapped amounts and the "actual" amounts (which may be overlapped). Since illumination from the moon is important for the adequate detection of clouds at night, both the relative lunar illuminance and the solar altitude are given, as well as a parameter that indicates whether our recommended illuminance criterion was satisfied. This data set contains 124 million reports from land stations and 15 million reports from ships. Each report is 56 characters in length. The archive consists of 240 files, one file for each month of data for land and ocean separately. 
With this data set a user can develop a climatology for any particular cloud type or group of types, for any geographical region and any spatial and temporal resolution desired.
Nicolay, C R; Purkayastha, S; Greenhalgh, A; Benn, J; Chaturvedi, S; Phillips, N; Darzi, A
2012-03-01
The demand for the highest-quality patient care coupled with pressure on funding has led to the increasing use of quality improvement (QI) methodologies from the manufacturing industry. The aim of this systematic review was to identify and evaluate the application and effectiveness of these QI methodologies in the field of surgery. MEDLINE, the Cochrane Database, Allied and Complementary Medicine Database, British Nursing Index, Cumulative Index to Nursing and Allied Health Literature, Embase, Health Business Elite, the Health Management Information Consortium and PsycINFO were searched according to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses statement. Empirical studies were included that implemented a described QI methodology in surgical care and analysed a named outcome statistically. Some 34 of 1595 articles identified met the inclusion criteria after consensus from two independent investigators. Nine studies described continuous quality improvement (CQI), five Six Sigma, five total quality management (TQM), five plan-do-study-act (PDSA) or plan-do-check-act (PDCA) cycles, five statistical process control (SPC) or statistical quality control (SQC), four Lean and one Lean Six Sigma; 20 of the studies were undertaken in the USA. The most common aims were to reduce complications or improve outcomes (11), to reduce infection (7), and to reduce theatre delays (7). There was one randomized controlled trial. QI methodologies from industry can have significant effects on improving surgical care, from reducing infection rates to increasing operating room efficiency. The evidence is generally of suboptimal quality, and rigorous randomized multicentre studies are needed to bring evidence-based management into the same league as evidence-based medicine. Copyright © 2011 British Journal of Surgery Society Ltd. Published by John Wiley & Sons, Ltd.
Final Project Report - ARM CLASIC CIRPAS Twin Otter Aerosol
DOE Office of Scientific and Technical Information (OSTI.GOV)
John A. Ogren
2010-04-05
The NOAA/ESRL/GMD aerosol group made three types of contributions related to airborne measurements of aerosol light scattering and absorption for the Cloud and Land Surface Interaction Campaign (CLASIC) in June 2007 on the Twin Otter research airplane operated by the Center for Interdisciplinary Remotely-Piloted Aircraft Studies (CIRPAS). GMD scientists served as the instrument mentor for the integrating nephelometer and particle soot absorption photometer (PSAP) on the Twin Otter during CLASIC, and were responsible for (1) instrument checks/comparisons; (2) instrument troubleshooting/repair; and (3) data quality control (QC) and submittal to the archive.
NASA Technical Reports Server (NTRS)
Buehler, Martin G. (Inventor)
1988-01-01
A set of addressable test structures, each of which uses addressing schemes to access individual elements of the structure in a matrix, is used to test the quality of a wafer before integrated circuits produced thereon are diced, packaged and subjected to final testing. The electrical characteristic of each element is checked and compared to the electrical characteristic of all other like elements in the matrix. The effectiveness of the addressable test matrix is in readily analyzing the electrical characteristics of the test elements and in providing diagnostic information.
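The comparison logic described above — checking each element's electrical characteristic against all other like elements in the matrix — can be sketched as a simple median-deviation screen. The address names, readings, and tolerance below are illustrative assumptions, not values from the patent.

```python
from statistics import median

def screen_elements(measurements, rel_tol=0.2):
    """Flag elements whose measured characteristic deviates from the
    median of all like elements by more than rel_tol (fractional)."""
    m = median(measurements.values())
    return sorted(k for k, v in measurements.items()
                  if abs(v - m) / m > rel_tol)

# Hypothetical sheet-resistance readings (ohms/sq) for like elements at
# four matrix addresses; the defective one stands out from the rest.
readings = {"r0c0": 1.00, "r0c1": 1.01, "r1c0": 0.99, "r1c1": 5.00}
print(screen_elements(readings))  # ['r1c1']
```

A real wafer-level screen would use per-structure tolerances and absolute limits as well, but the core diagnostic idea — comparing each element to its peers — is the same.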
[Investigation of Elekta linac characteristics for VMAT].
Luo, Guangwen; Zhang, Kunyi
2012-01-01
The aim of this study is to investigate the characteristics of the Elekta delivery system for volumetric modulated arc therapy (VMAT). Five VMAT plans were delivered in service mode, and the dose rates and the speeds of the gantry and MLC leaves were analyzed from log files. Results showed that the dose rate varied among six dose-rate levels, and gantry and MLC leaf speeds varied dynamically during delivery. The VMAT technique requires the linac to dynamically control more parameters, and these key dynamic variables during VMAT delivery can be checked from log files. Quality assurance procedures should be carried out for VMAT-related parameters.
Long-term behaviour of timber structures in torrent control
NASA Astrophysics Data System (ADS)
Rickli, Christian; Graf, Frank
2014-05-01
Timber is widely used for protection measures in torrent control. However, the life span of wooden constructions such as timber check dams is limited by fungal decay, and only sparse scientific information is available on the long-term behaviour of timber structures and their colonisation by decay fungi. Related to this, practitioners have long debated whether Norway Spruce (Picea abies) or Silver Fir (Abies alba) is more durable and whether bark removal increases resistance to fungal decay. To address these questions, a series of 15 timber check dams built in 1996 has been monitored. The structures were built alternately of Norway Spruce and Silver Fir, half of each with the bark retained and removed, respectively. The scientific investigations included documentation of colonisation by rot fungi and identification of decayed zones, both with a simple practical approach and based on drilling resistance. Colonisation by decay fungi (e.g. Gloeophyllum sepiarium) started three years after construction, and the first zones of reduced wood resistance were detected two years later. Sixteen years after construction, decay was found on all check dams but two. Wood quality was markedly better in watered sections than in the occasionally dry lateral abutment sections. Taking the whole check dams into consideration, slightly more decay was detected in Norway Spruce logs than in Silver Fir logs, and both the practical approach and the drilling resistance measurement revealed more defects on logs without bark. However, due to the limited number of replications and fungal data, it was not possible to statistically verify these results. Statistical analysis was restricted to the drilling resistance data and the fruit-bodies of decay fungi on the uppermost log of each check dam.
Based on this limited analysis, significant differences in the effect on drilling resistance were found for watered sections versus lateral abutments, brown versus white rot, and fir with versus without bark. Taking further into account that brown rot reduces wood strength faster than white rot, it may be speculated that spruce logs without bark and fir logs with bark are more resistant to fungal decay than logs of spruce with and fir without bark, respectively. However, this has to be treated with caution, as only the uppermost logs were considered, the observation period was only 15 years, and the relative abundance of the most important decay fungi varied considerably between as well as within the check dams. Consequently, for statistically sound and well-founded recommendations, further investigations over a longer period are indispensable.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ellefson, S; Department of Human Oncology, University of Wisconsin, Madison, WI; Culberson, W
Purpose: Discrepancies in absolute dose values have been detected between the ViewRay treatment planning system and ArcCHECK readings when performing delivery quality assurance on the ViewRay system with the ArcCHECK-MR diode array (Sun Nuclear Corporation). In this work, we investigate whether these discrepancies are due to errors in the ViewRay planning and/or delivery system or due to errors in the ArcCHECK’s readings. Methods: Gamma analysis was performed on 19 ViewRay patient plans using the ArcCHECK. Frequency analysis on the dose differences was performed. To investigate whether discrepancies were due to measurement or delivery error, 10 diodes in low-gradient dose regions were chosen to compare with ion chamber measurements in a PMMA phantom with the same size and shape as the ArcCHECK, provided by Sun Nuclear. The diodes chosen all had significant discrepancies in absolute dose values compared to the ViewRay TPS. Absolute doses to PMMA were compared between the ViewRay TPS calculations, ArcCHECK measurements, and measurements in the PMMA phantom. Results: Three of the 19 patient plans had 3%/3mm gamma passing rates less than 95%, and ten of the 19 plans had 2%/2mm passing rates less than 95%. Frequency analysis implied a non-random error process. Out of the 10 diode locations measured, ion chamber measurements were all within 2.2% error relative to the TPS and had a mean error of 1.2%. ArcCHECK measurements ranged from 4.5% to over 15% error relative to the TPS and had a mean error of 8.0%. Conclusion: The ArcCHECK performs well for quality assurance on the ViewRay under most circumstances. However, under certain conditions the absolute dose readings are significantly higher compared to the planned doses. As the ion chamber measurements consistently agree with the TPS, it can be concluded that the discrepancies are due to ArcCHECK measurement error and not TPS or delivery system error.
This work was funded by the Bhudatt Paliwal Professorship and the University of Wisconsin Medical Radiation Research Center.
OntoCheck: verifying ontology naming conventions and metadata completeness in Protégé 4.
Schober, Daniel; Tudose, Ilinca; Svatek, Vojtech; Boeker, Martin
2012-09-21
Although policy providers have outlined minimal metadata guidelines and naming conventions, ontologies of today still display inter- and intra-ontology heterogeneities in class labelling schemes and metadata completeness. This fact is at least partially due to missing or inappropriate tools. Software support can ease this situation and contribute to overall ontology consistency and quality by helping to enforce such conventions. We provide a plugin for the Protégé ontology editor to allow for easy checks on compliance with ontology naming conventions and metadata completeness, as well as curation in case of found violations. In a requirement analysis, derived from a prior standardization approach carried out within the OBO Foundry, we investigate the capabilities needed for software tools to check, curate and maintain class naming conventions. A Protégé tab plugin was implemented accordingly using the Protégé 4.1 libraries. The plugin was tested on six different ontologies. Based on these test results, the plugin was refined, including the integration of new functionalities. The new Protégé plugin, OntoCheck, allows ontology tests to be carried out on OWL ontologies. In particular, the OntoCheck plugin helps to clean up an ontology with regard to lexical heterogeneity, i.e. enforcing naming conventions and metadata completeness, meeting most of the requirements outlined for such a tool. Found test violations can be corrected to foster consistency in entity naming and meta-annotation within an artefact. Once specified, check constraints like name patterns can be stored and exchanged for later re-use. Here we describe a first version of the software, illustrate its capabilities and use within running ontology development efforts, and briefly outline improvements resulting from its application. Further, we discuss OntoCheck's capabilities in the context of related tools and highlight potential future expansions.
The OntoCheck plugin facilitates labelling error detection and curation, contributing to lexical quality assurance in OWL ontologies. Ultimately, we hope this Protégé extension will ease ontology alignments as well as lexical post-processing of annotated data and hence can increase overall secondary data usage by humans and computers.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xie, J; Wang, J; Peng, J
Purpose: To implement an entire-workflow quality assurance (QA) process in the radiotherapy department and to reduce the error rates of radiotherapy based on entire-workflow management in a developing country. Methods: The entire-workflow QA process starts from patient registration and runs to the end of the last treatment, covering all steps of the radiotherapy process. The error rate of chart checks is used to evaluate the entire-workflow QA process. Two to three qualified senior medical physicists checked the documents before the first treatment fraction of every patient. Random checks of the treatment history during treatment were also performed. Treatment data for a total of around 6000 patients before and after implementing the entire-workflow QA process were compared from May 2014 to December 2015. Results: A systematic checklist was established. It mainly includes patient registration, treatment plan QA, information export to the OIS (Oncology Information System), documentation of treatment QA, and QA of the treatment history. The error rate derived from the chart checks decreased from 1.7% to 0.9% after the entire-workflow QA process was introduced. All errors found before the first treatment fraction were corrected as soon as the oncologist re-confirmed them, and reinforced staff training followed to prevent those errors. Conclusion: The entire-workflow QA process improved the safety and quality of radiotherapy in our department, and we consider that our QA experience can be applicable to heavily-loaded radiotherapy departments in developing countries.
Using computer models to design gully erosion control structures for humid northern Ethiopia
USDA-ARS?s Scientific Manuscript database
Classic gully erosion control measures such as check dams have been unsuccessful in halting gully formation and growth in the humid northern Ethiopian highlands. Gullies are typically formed in vertisols and flow often bypasses the check dams as elevated groundwater tables make gully banks unstable....
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hardin, M; Harrison, A; Lockamy, V
Purpose: Desire to improve efficiency and throughput inspired a review of our physics chart check procedures. Departmental policy mandates plan checks pre-treatment, after the first treatment, and weekly (every 3 to 5 treatment days). This study examined the effectiveness of the “after first” check with respect to improving patient safety and clinical efficiency. The type and frequency of variations discovered during this redundant secondary review were examined over seven months. Methods: A community spreadsheet was created to record variations in care discovered during chart review following the first fraction of treatment and before the second fraction (each plan was reviewed prior to treatment). Entries were recorded from August 2014 through February 2015, amounting to 43 recorded variations out of 906 reviewed charts. The variations were divided into categories and frequencies were assessed month-to-month. Results: Analysis of recorded variations indicates an overall variation rate of 4.7%. The initial rate was 13.5%; months 2–7 averaged 3.7%. The majority of variations related to discrepancies in documentation at 46.5%, followed by prescription, plan deficiency, and dose tracking related variations at 25.5%, 12.8%, and 12.8%, respectively. Minor variations (negligible consequence on patient treatment) outweighed major variations 3 to 1. Conclusion: This work indicates that this redundant secondary check is effective. The first-month spike in rates could be due to the Hawthorne/observer effect, but the consistent 4% variation rate suggests the need for periodic re-training on variations noted as frequent to improve awareness and the quality of the initial chart review process, which may lead to improved treatment quality, patient safety and increased clinical efficiency. Utilizing these results, a continuous quality improvement process following Deming’s Plan-Do-Study-Act (PDSA) methodology was generated.
The first iteration of this PDSA cycle was adding a specific dose tracking checklist item to the pre-treatment plan check assessment; the ramifications of this change will be assessed in future data.
AMBER instrument control software
NASA Astrophysics Data System (ADS)
Le Coarer, Etienne P.; Zins, Gerard; Gluck, Laurence; Duvert, Gilles; Driebe, Thomas; Ohnaka, Keiichi; Heininger, Matthias; Connot, Claus; Behrend, Jan; Dugue, Michel; Clausse, Jean Michel; Millour, Florentin
2004-09-01
AMBER (Astronomical Multiple BEam Recombiner) is a 3-aperture interferometric recombiner operating between 1 and 2.5 µm for the Very Large Telescope Interferometer (VLTI). The control software of the instrument, based on the VLT Common Software, has been written to comply with specific features of the AMBER hardware, such as the infrared detector read-out modes or piezo stage drivers, as well as with the very specific operation modes of an interferometric instrument. In this respect, the AMBER control software was designed to ensure that all operations, from the preparation of the observations to the control/command of the instrument during the observations, would be kept as simple as possible for the users and operators, opening the use of an interferometric instrument to the largest community of astronomers. Particular attention was given to internal checks and calibration procedures, both to evaluate data quality in real time and to improve the success of long-term UV-plane coverage observations.
Pasqualone, Antonella; Montemurro, Cinzia; di Rienzo, Valentina; Summo, Carmine; Paradiso, Vito Michele; Caponio, Francesco
2016-08-01
In recent years, an increasing number of typicality marks has been awarded to high-quality olive oils produced from local cultivars. In this case, quality control requires effective varietal checks of the starting materials. Moreover, accurate cultivar identification is essential in vegetative-propagated plants distributed by nurseries and is a pre-requisite to register new cultivars. Food genomics provides many tools for cultivar identification and traceability from tree to oil and table olives. The results of the application of different classes of DNA markers to olive with the purpose of checking cultivar identity and variability of plant material are extensively discussed in this review, with special regard to repeatability issues and polymorphism degree. The characterization of olive germplasm from all countries of the Mediterranean basin and from less studied geographical areas is described and innovative high-throughput molecular tools to manage reference collections are reviewed. Then the transferability of DNA markers to processed products - virgin olive oils and table olives - is overviewed to point out strengths and weaknesses, with special regard to (i) the influence of processing steps and storage time on the quantity and quality of residual DNA, (ii) recent advances to overcome the bottleneck of DNA extraction from processed products, (iii) factors affecting whole comparability of DNA profiles between fresh plant materials and end-products, (iv) drawbacks in the analysis of multi-cultivar versus single-cultivar end-products and (v) the potential of quantitative polymerase chain reaction (PCR)-based techniques. © 2016 Society of Chemical Industry.
40 CFR Appendix A to Part 58 - Quality Assurance Requirements for SLAMS, SPMs and PSD Air Monitoring
Code of Federal Regulations, 2014 CFR
2014-07-01
... monitor. 3.3.4.4Pb Performance Evaluation Program (PEP) Procedures. Each year, one performance evaluation... Information 2. Quality System Requirements 3. Measurement Quality Check Requirements 4. Calculations for Data... 10 of this appendix) and at a national level in references 1, 2, and 3 of this appendix. 1...
40 CFR Appendix A to Part 58 - Quality Assurance Requirements for SLAMS, SPMs and PSD Air Monitoring
Code of Federal Regulations, 2013 CFR
2013-07-01
... monitor. 3.3.4.4Pb Performance Evaluation Program (PEP) Procedures. Each year, one performance evaluation... Information 2. Quality System Requirements 3. Measurement Quality Check Requirements 4. Calculations for Data... 10 of this appendix) and at a national level in references 1, 2, and 3 of this appendix. 1...
25 CFR 542.14 - What are the minimum internal control standards for the cage?
Code of Federal Regulations, 2011 CFR
2011-04-01
..., collecting and recording checks returned to the gaming operation after deposit, re-deposit, and write-off... person approving the counter check transaction. (4) When traveler's checks or other guaranteed drafts... identity, including photo identification. (8) A file for customers shall be prepared prior to acceptance of...
Noge, Sachiko; Ohishi, Tatsuo; Yoshida, Takuya; Kumagai, Hiromichi
2017-01-01
[Purpose] Locomotive syndrome (LS) is a condition in which older people may require care services because of problems with locomotive organs. This study examined whether the loco-check, a 7-item questionnaire, is useful for quantitatively assessing the severity of LS. [Subjects and Methods] Seventy-one community-dwelling Japanese females aged 64–96 years (81.7 ± 8.0 years) participated in this study. The associations of the loco-check with thigh muscle mass measured by X-ray CT, physical performance, nutritional status, and quality of life (QOL) were investigated. [Results] The results showed that the number of times that “yes” was selected in the loco-check was significantly correlated with thigh muscle mass, major measures of physical performance, nutritional status, and QOL. This number was also significantly larger in participants who had experienced falls, fractures, or lumbar pain than in those without these episodes. [Conclusion] These results suggest that the loco-check might be useful for quantitatively evaluating LS. PMID:28932003
pcr: an R package for quality assessment, analysis and testing of qPCR data
Ahmed, Mahmoud
2018-01-01
Background: Real-time quantitative PCR (qPCR) is a broadly used technique in biomedical research. Currently, a few different analysis models are used to determine the quality of data and to quantify the mRNA level across experimental conditions. Methods: We developed an R package to implement methods for quality assessment, analysis and statistical testing of qPCR data. Double Delta CT and standard curve models were implemented to quantify the relative expression of target genes from CT values in standard qPCR control-group experiments. In addition, calculation of amplification efficiency and curves from serial-dilution qPCR experiments is used to assess the quality of the data. Finally, two-group tests and linear models were used to test for significance of the difference in expression between control groups and conditions of interest. Results: Using two datasets from qPCR experiments, we applied different quality assessment, analysis and statistical testing methods in the pcr package and compared the results to the original published articles. The final relative expression values from the different models, as well as the intermediary outputs, were checked against the expected results in the original papers and were found to be accurate and reliable. Conclusion: The pcr package provides an intuitive and unified interface for its main functions, allowing biologists to perform all necessary steps of qPCR analysis and produce graphs in a uniform way. PMID:29576953
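The Double Delta CT model mentioned here is, at its core, the standard 2^(-ΔΔCT) calculation: normalize the target gene's CT to a reference gene within each group, then compare groups. A minimal sketch (not the pcr package's actual code; the CT values are made up for illustration):

```python
def ddct(ct_target_treated, ct_ref_treated, ct_target_control, ct_ref_control):
    """Relative expression of a target gene by the Double Delta CT model."""
    d_treated = ct_target_treated - ct_ref_treated   # normalize to reference gene
    d_control = ct_target_control - ct_ref_control
    return 2.0 ** -(d_treated - d_control)           # 2^(-ddCT)

# Target crosses threshold one cycle later (relative to the reference gene)
# in the treated group -> roughly half the expression of the control group.
print(ddct(25.0, 20.0, 24.0, 20.0))  # 0.5
```

Real analyses average technical replicates and propagate the CT standard deviations, which is part of what the package automates.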
Operative blood transfusion quality improvement audit.
Al Sohaibani, Mazen; Al Malki, Assaf; Pogaku, Venumadhav; Al Dossary, Saad; Al Bernawi, Hanan
2014-01-01
To determine how the current anesthesia team handles the identification of the anaesthetized surgical patient (right patient) and the checking of blood units before collection and immediately before administration (right blood) in operating rooms where nurses have minimal duties and responsibilities in handling blood for transfusion to anaesthetized patients; and to elicit the degree of anesthesia staff compliance with new policies and procedures for blood transfusion administration to anaesthetized surgical patients. A large tertiary care reference and teaching hospital. A prospective quality improvement audit. Elaboration of the steps for administration of transfusion to anaesthetized patients from policies and procedures, and analysis of the audit forms for conducted transfusions. An audit form was used to capture key performance indicators (KPIs) observed in all procedures involving blood transfusion; each item was ticked as met, partially met, not met or not applicable. Descriptive statistics (numbers and percentages) were computed in Microsoft Excel 2003. The central quality improvement committee presented the results as numbers, percentages and graphs. The degree of compliance of anesthesia staff in performing the phases of blood transfusion reached a high percentage, assuring us that the internal policies and procedures (IPP) are followed in the great majority of transfusions of red cells and other blood products, from the initial request for the blood or blood product to the prescribed check of the patient in the immediate post-transfusion period. The specific problem area of giving blood transfusions to anaesthetized patients was audited via the KPIs covering the phases of blood transfusion, and the results assured the investigators of high-quality performance in transfusion procedures.
Software tool for physics chart checks.
Li, H Harold; Wu, Yu; Yang, Deshan; Mutic, Sasa
2014-01-01
The physics chart check has long been a central quality assurance (QA) measure in radiation oncology. The purpose of this work is to describe a software tool that aims to accomplish simplification, standardization, automation, and forced functions in the process. Nationally recognized guidelines, including American College of Radiology and American Society for Radiation Oncology guidelines and technical standards, and the American Association of Physicists in Medicine Task Group reports were identified, studied, and summarized. Meanwhile, the reported events related to the physics chart check service were analyzed using an event reporting and learning system. A number of shortfalls in the chart check process were identified. To address these problems, a software tool was designed and developed under Microsoft .NET in C# to hardwire as many components as possible at each stage of the process. The software consists of the following 4 independent modules: (1) chart check management; (2) pretreatment and during-treatment chart check assistant; (3) posttreatment chart check assistant; and (4) quarterly peer-review management. The users were a large group of physicists in the authors' radiation oncology clinic. During over 1 year of use, the tool has proven very helpful in chart check management, communication, documentation, and maintaining consistency. The software tool presented in this work aims to assist physicists at each stage of the physics chart check process. It is potentially useful for any radiation oncology clinic that is either pursuing or maintaining American College of Radiology accreditation.
Verifying Multi-Agent Systems via Unbounded Model Checking
NASA Technical Reports Server (NTRS)
Kacprzak, M.; Lomuscio, A.; Lasica, T.; Penczek, W.; Szreter, M.
2004-01-01
We present an approach to the problem of verification of epistemic properties in multi-agent systems by means of symbolic model checking. In particular, it is shown how to extend the technique of unbounded model checking from a purely temporal setting to a temporal-epistemic one. In order to achieve this, we base our discussion on interpreted systems semantics, a popular semantics used in multi-agent systems literature. We give details of the technique and show how it can be applied to the well-known train, gate and controller problem. Keywords: model checking, unbounded model checking, multi-agent systems
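As a much-simplified illustration of checking a safety property on a train/gate/controller-style system — explicit-state reachability rather than the SAT-based unbounded model checking of the paper, over a hand-made toy model rather than interpreted systems — one might write:

```python
from collections import deque

# Toy model: two trains may request the gate; the controller grants it to
# at most one at a time. A state is a pair of train statuses, each one of
# "away" | "waiting" | "in_gate". This model is an illustrative assumption,
# not the formalization used in the paper.
def successors(state):
    for i in range(2):
        t = list(state)
        if t[i] == "away":
            t[i] = "waiting"                              # train approaches
        elif t[i] == "waiting" and "in_gate" not in state:
            t[i] = "in_gate"                              # controller grants gate
        elif t[i] == "in_gate":
            t[i] = "away"                                 # train leaves
        else:
            continue                                      # waiting while gate busy
        yield tuple(t)

def check_safety(init, bad):
    """Explicit-state reachability: True iff no reachable state satisfies bad."""
    seen, queue = {init}, deque([init])
    while queue:
        s = queue.popleft()
        if bad(s):
            return False
        for n in successors(s):
            if n not in seen:
                seen.add(n)
                queue.append(n)
    return True

# Safety property: the two trains are never in the gate simultaneously.
print(check_safety(("away", "away"), lambda s: s.count("in_gate") == 2))  # True
```

Symbolic techniques like the paper's replace this explicit state enumeration with Boolean encodings, which is what makes much larger state spaces tractable.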
The method of a joint intraday security check system based on cloud computing
NASA Astrophysics Data System (ADS)
Dong, Wei; Feng, Changyou; Zhou, Caiqi; Cai, Zhi; Dan, Xu; Dai, Sai; Zhang, Chuancheng
2017-01-01
The intraday security check is the core application in the dispatching control system. The existing security check calculation uses only the dispatch center’s local model and data as the functional margin. This paper introduces the design of an all-grid intraday joint security check system based on cloud computing and its implementation. To reduce the effect of subarea bad data on the all-grid security check, a new power flow algorithm based on comparison and adjustment with the inter-provincial tie-line plan is presented. A numerical example illustrates the effectiveness and feasibility of the proposed method.
Šupak-Smolčić, Vesna; Šimundić, Ana-Maria
2013-01-01
In February 2013, Biochemia Medica joined CrossRef, which enabled us to implement the CrossCheck plagiarism detection service. Therefore, all manuscripts submitted to Biochemia Medica are now first assigned to a Research Integrity Editor (RIE) before the manuscript is sent for peer review. The RIE submits the text to CrossCheck analysis and is responsible for reviewing the results of the text similarity analysis. Based on the CrossCheck analysis results, the RIE subsequently provides a recommendation to the Editor-in-Chief (EIC) on whether the manuscript should be forwarded to peer review, corrected for suspect parts prior to peer review or immediately rejected. The final decision on the manuscript, however, rests with the EIC. We hope that our new policy and manuscript processing algorithm will help us to further increase the overall quality of our Journal. PMID:23894858
Morphology, geology and water quality assessment of former tin mining catchment.
Ashraf, Muhammad Aqeel; Maah, Mohd Jamil; Yusoff, Ismail
2012-01-01
Bestari Jaya, a former tin mining catchment, covers an area of 2656.31 hectares comprising four hundred and forty-two lakes and ponds of different sizes. The present study area comprises 92 hectares of the catchment, including four large lakes. ArcGIS version 9.2 was used to develop a bathymetric map, a Global Positioning System (GPS) was used for the hydrographical survey, and a flow meter was used for water discharge analysis (flow routing) of the catchment. The water quality parameters (pH, temperature, electrical conductivity, dissolved oxygen (DO), total dissolved solids (TDS), chlorides, ammonium, nitrates) were analyzed using a Hydrolab. Quality assurance (QA) and quality control (QC) procedures were strictly followed throughout the field work and data analysis. Different procedures were employed to evaluate the analytical data and to check for possible transcription or dilution errors, changes during analysis, or unusual or unlikely values. Comparison of the results with the interim national water quality standards for Malaysia indicates that the water quality of the area is highly degraded. It is concluded that the Bestari Jaya ex-mining catchment has a high pollution potential due to mining activities and that River Ayer Hitam, the recipient of the catchment water, is a highly polluted river. PMID:22761549
THE APPLICATION OF RADIOISOTOPES IN SHIPBUILDING FOR NONDESTRUCTIVE MATERIAL TESTING
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gerlach, H.
1962-02-01
Safety and reliability in shipbuilding require exact testing methods for all materials, such as plates, sheets, rods, cast pieces, welded joints, etc. Non-visible defects in parts exposed to great stress may cause great damage. Since both x-ray and gamma radiography can detect these defects, a choice has to be made between these radiation sources. In general, for very thick pieces, gamma emitters such as Co-60 or Ta-182 are used; for medium-thick pieces, gamma emitters such as Ir-192 or Cs-137 are used; for very thin pieces, weak gamma emitters, such as Eu-155, or x rays are used. There is no competition between x rays and gamma rays in nondestructive testing because the two methods supplement each other. The technical and economical advantages and disadvantages of both methods are discussed. In shipbuilding, Ir-192 has extremely good irradiation qualities for sheets from 6 to 50 mm thick and is superior to the x-ray method for checking welds of sheets of this thickness. However, for materials thicker than 50 mm x rays are useless, and defects in this material must be located with hard gamma emitters, such as Co-60. Recently, mobile test stations with radioisotopes were established which are of great value for checking and quality control in shipbuilding at the shipyard. (OID)
TH-AB-201-12: Using Machine Log-Files for Treatment Planning and Delivery QA
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stanhope, C; Liang, J; Drake, D
2016-06-15
Purpose: To determine the segment reduction and dose resolution necessary for machine log-files to effectively replace current phantom-based patient-specific quality assurance, while minimizing computational cost. Methods: Elekta's Log File Convertor R3.2 records linac delivery parameters (dose rate, gantry angle, leaf position) every 40 ms. Five VMAT plans [4 H&N, 1 pulsed brain], comprising 2 arcs each, were delivered on the ArcCHECK phantom. Log-files were reconstructed in Pinnacle on the phantom geometry using 1/2/3/4° control point spacing and 2/3/4 mm dose grid resolution. Reconstruction effectiveness was quantified by comparing 2%/2mm gamma passing rates of the original and log-file plans. Modulation complexity scores (MCS) were calculated for each beam to correlate reconstruction accuracy and beam modulation. Percent error in absolute dose for each plan-pair combination (log-file vs. ArcCHECK, original vs. ArcCHECK, log-file vs. original) was calculated for each arc and every diode greater than 10% of the maximum measured dose (per beam). By comparing standard deviations of the three plan-pair distributions, the relative noise of the ArcCHECK and log-file systems was elucidated. Results: The original plans exhibit a mean passing rate of 95.1±1.3%. The eight more modulated H&N arcs [MCS=0.088±0.014] and two less modulated brain arcs [MCS=0.291±0.004] yielded log-file pass rates most similar to the original plan when using 1°/2mm [0.05±1.3% lower] and 2°/3mm [0.35±0.64% higher] log-file reconstructions, respectively. Log-file and original plans displayed percent diode dose errors 4.29±6.27% and 3.61±6.57% higher than measurement. Excluding the phantom eliminates diode miscalibration and setup errors; log-file dose errors were 0.72±3.06% higher than the original plans, significantly less noisy.
Conclusion: For log-file reconstructed VMAT arcs, 1° control point spacing and 2 mm dose resolution are recommended; however, less modulated arcs may allow less stringent reconstructions. Following these reconstruction recommendations, the log-file technique is capable of detecting delivery errors with equivalent accuracy and less noise than ArcCHECK QA. This work was funded by an Elekta Research Grant.
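The 2%/2mm comparison used in this study is an instance of the gamma index. A minimal 1-D global-gamma sketch follows, illustrative only and not the Pinnacle/ArcCHECK implementation; the function name and the brute-force search over all reference points are assumptions for clarity.

```python
import math

def gamma_pass_rate(ref_pos, ref_dose, eval_pos, eval_dose,
                    dose_tol=0.02, dist_tol=2.0):
    """1-D global gamma: dose_tol is a fraction of the maximum reference
    dose (e.g. 0.02 for 2%), dist_tol is in position units (e.g. mm).
    A point passes when its minimum gamma over all reference points <= 1."""
    dmax = max(ref_dose)
    passed = 0
    for xe, de in zip(eval_pos, eval_dose):
        gamma = min(
            math.sqrt(((de - dr) / (dose_tol * dmax)) ** 2 +
                      ((xe - xr) / dist_tol) ** 2)
            for xr, dr in zip(ref_pos, ref_dose))
        passed += gamma <= 1.0
    return passed / len(eval_pos)

# Identical distributions pass everywhere (rate = 1.0).
rate = gamma_pass_rate([0, 1, 2, 3], [10, 20, 30, 40],
                       [0, 1, 2, 3], [10, 20, 30, 40])
```

Production tools interpolate the reference distribution and restrict the search radius; this brute-force version only shows the metric itself.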
Quality Assurance Specifications for Planetary Protection Assays
NASA Astrophysics Data System (ADS)
Baker, Amy
As the European Space Agency planetary protection (PP) activities move forward to support the ExoMars and other planetary missions, it will become necessary to increase staffing of laboratories that provide analyses for these programs. Standardization of procedures, a comprehensive quality assurance program, and unilateral training of personnel will be necessary to ensure that the planetary protection goals and schedules are met. The PP Quality Assurance/Quality Control (QAQC) program is designed to regulate and monitor procedures performed by laboratory personnel to ensure that all work meets data quality objectives through the assembly and launch process. Because personnel time is at a premium and sampling schedules are often dependent on engineering schedules, it is necessary to have flexible staffing to support all sampling requirements. The most productive approach to having a competent and flexible work force is to establish well defined laboratory procedures and training programs that clearly address the needs of the program and the work force. The quality assurance specification for planetary protection assays has to ensure that laboratories and associated personnel can demonstrate the competence to perform assays according to the applicable standard AD4. Detailed subjects included in the presentation are as follows: field and laboratory control criteria; data reporting; personnel training requirements and certification; laboratory audit criteria. Based upon RD2 for primary and secondary validation and RD3 for data quality objectives, the QAQC will provide traceable quality assurance safeguards by providing structured laboratory requirements for guidelines and oversight including training and technical updates, standardized documentation, standardized QA/QC checks, data review and data archiving.
Improving treatment plan evaluation with automation.
Covington, Elizabeth L; Chen, Xiaoping; Younge, Kelly C; Lee, Choonik; Matuszak, Martha M; Kessler, Marc L; Keranen, Wayne; Acosta, Eduardo; Dougherty, Ashley M; Filpansick, Stephanie E; Moran, Jean M
2016-11-08
The goal of this work is to evaluate the effectiveness of the Plan-Checker Tool (PCT), which was created to improve first-time plan quality, reduce patient delays, increase the efficiency of our electronic workflow, and standardize and automate the physics plan review in the treatment planning system (TPS). PCT uses an application programming interface to check and compare data from the TPS and treatment management system (TMS). PCT includes a comprehensive checklist of automated and manual checks that are documented when performed by the user as part of a plan readiness check for treatment. Prior to and during PCT development, errors identified during the physics review and causes of patient treatment start delays were tracked to prioritize which checks should be automated. Nineteen of 33 checklist items were automated, with data extracted by PCT. There was a 60% reduction in the number of patient delays in the six months after PCT release. PCT was successfully implemented for use on all external beam treatment plans in our clinic. While the number of errors found during the physics check did not decrease, automation of checks increased the visibility of errors during the physics check, which led to decreased patient delays. The methods used here can be applied to any TMS and TPS that allows queries of the database. © 2016 The Authors.
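The automated portion of a checklist like the one described reduces to field-by-field comparison between the planning and management systems. A minimal hedged sketch follows; the field names and the dictionary interface are invented for illustration, since the real tool works through the vendors' APIs.

```python
# Illustrative automated plan check: compare selected fields exported
# from the planning system (TPS) against the management system (TMS)
# and record pass/fail per checklist item. All names are hypothetical.

def run_checklist(tps, tms, fields):
    """Return {field: True/False} where True means the systems agree."""
    return {field: tps.get(field) == tms.get(field) for field in fields}

tps = {"prescription_dose": 60.0, "fractions": 30, "machine": "Linac1"}
tms = {"prescription_dose": 60.0, "fractions": 28, "machine": "Linac1"}
report = run_checklist(tps, tms,
                       ["prescription_dose", "fractions", "machine"])
# 'fractions' disagrees, so it would be flagged for manual review
```

Mismatches are surfaced to the physicist rather than silently corrected, mirroring the documented-checklist idea in the abstract.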
Evaluation of Dosimetry Check software for IMRT patient-specific quality assurance.
Narayanasamy, Ganesh; Zalman, Travis; Ha, Chul S; Papanikolaou, Niko; Stathakis, Sotirios
2015-05-08
The purpose of this study is to evaluate the use of the Dosimetry Check system for patient-specific IMRT QA. Typical QA methods measure the dose in an array dosimeter surrounded by homogeneous medium for which the treatment plan has been recomputed. With the Dosimetry Check system, fluence measurements acquired on a portal dosimeter are applied to the patient's CT scans. Instead of making dose comparisons in a plane, the Dosimetry Check system produces isodose lines and dose-volume histograms based on the planning CT images. By exporting the dose distribution from the treatment planning system into the Dosimetry Check system, one is able to make a direct comparison between the calculated dose and the planned dose. The versatility of the software was evaluated with respect to two IMRT techniques: step-and-shoot and volumetric arc therapy. The system analyzed measurements made using an EPID, the PTW seven29, and the IBA MatriXX, and an intercomparison study was performed. Plans from patients previously treated at our institution, with treated anatomical sites including brain, head and neck, liver, lung, and prostate, were analyzed using the Dosimetry Check system for any anatomical site dependence. We present recommendations and possible precautions that may be necessary to ensure proper QA with the Dosimetry Check system.
31 CFR 596.307 - Monetary instruments.
Code of Federal Regulations, 2010 CFR
2010-07-01
... FOREIGN ASSETS CONTROL, DEPARTMENT OF THE TREASURY TERRORISM LIST GOVERNMENTS SANCTIONS REGULATIONS... includes coin or currency of the United States or of any other country, travelers' checks, personal checks...
Code of Federal Regulations, 2012 CFR
2012-01-01
... 10 Energy 1 2012-01-01 2012-01-01 false Records of alarm system and entrance control checks at permanent radiographic installations. 34.75 Section 34.75 Energy NUCLEAR REGULATORY COMMISSION LICENSES FOR INDUSTRIAL RADIOGRAPHY AND RADIATION SAFETY REQUIREMENTS FOR INDUSTRIAL RADIOGRAPHIC OPERATIONS Recordkeeping...
Code of Federal Regulations, 2014 CFR
2014-01-01
... 10 Energy 1 2014-01-01 2014-01-01 false Records of alarm system and entrance control checks at permanent radiographic installations. 34.75 Section 34.75 Energy NUCLEAR REGULATORY COMMISSION LICENSES FOR INDUSTRIAL RADIOGRAPHY AND RADIATION SAFETY REQUIREMENTS FOR INDUSTRIAL RADIOGRAPHIC OPERATIONS Recordkeeping...
Code of Federal Regulations, 2011 CFR
2011-01-01
... 10 Energy 1 2011-01-01 2011-01-01 false Records of alarm system and entrance control checks at permanent radiographic installations. 34.75 Section 34.75 Energy NUCLEAR REGULATORY COMMISSION LICENSES FOR INDUSTRIAL RADIOGRAPHY AND RADIATION SAFETY REQUIREMENTS FOR INDUSTRIAL RADIOGRAPHIC OPERATIONS Recordkeeping...
Code of Federal Regulations, 2010 CFR
2010-01-01
... 10 Energy 1 2010-01-01 2010-01-01 false Records of alarm system and entrance control checks at permanent radiographic installations. 34.75 Section 34.75 Energy NUCLEAR REGULATORY COMMISSION LICENSES FOR INDUSTRIAL RADIOGRAPHY AND RADIATION SAFETY REQUIREMENTS FOR INDUSTRIAL RADIOGRAPHIC OPERATIONS Recordkeeping...
Code of Federal Regulations, 2013 CFR
2013-01-01
... 10 Energy 1 2013-01-01 2013-01-01 false Records of alarm system and entrance control checks at permanent radiographic installations. 34.75 Section 34.75 Energy NUCLEAR REGULATORY COMMISSION LICENSES FOR INDUSTRIAL RADIOGRAPHY AND RADIATION SAFETY REQUIREMENTS FOR INDUSTRIAL RADIOGRAPHIC OPERATIONS Recordkeeping...
Improving Software Quality and Management Through Use of Service Level Agreements
2005-03-01
many who believe that the quality of the development process is the best predictor of software product quality. (Fenton) Repeatable software processes...reduced errors per KLOC for small projects (Fenton), and the quality management metric (QMM) (Machniak, Osmundson). There are also numerous IEEE 14...attention to cosmetic user interface issues and any problems that may arise with the prototype. (Sawyer) The validation process is also another check
SU-F-T-294: The Analysis of Gamma Criteria for Delta4 Dosimetry Using Statistical Process Control
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cho, S; Ahn, S; Kim, J
Purpose: To evaluate the sensitivity of gamma criteria for patient-specific volumetric modulated arc therapy (VMAT) quality assurance with the Delta4 dosimetry program using the statistical process control (SPC) methodology. Methods: The authors selected 20 patient-specific VMAT QA cases that had been verified with MapCHECK and ArcCHECK with gamma pass rates better than 97%. The QA data were collected with the Delta4 Phantom+ and an Elekta Agility at six megavolts, without using an angle incrementer. The gamma index (GI) was calculated in 2D planes, normalizing the dose deviation to the local dose (local gamma). The sensitivity of the GI methodology using criteria of 3%/3mm, 3%/2mm and 2%/3mm was analyzed using process acceptability indices. We used the local confidence (LC) level and the upper control limit (UCL) and lower control limit (LCL) of the I-MR chart for the process capability index (Cp) and the process acceptability index (Cpk). Results: The lower local confidence levels for 3%/3mm, 3%/2mm and 2%/3mm were 92.0%, 83.6% and 78.8%, respectively. All of the Cp and Cpk values calculated using the LC level were under 1.0 in this study. The calculated LCLs of the I-MR charts were 89.5%, 79.0% and 70.5%, respectively; the capability indices derived from these limits were higher than 1.0, which indicates good QA quality. For the generally used lower limit of 90%, we obtained a Cp value over 1.3 for the 3%/3mm gamma criterion and values lower than 1.0 for the remaining criteria. Conclusion: We applied the SPC methodology to evaluate the sensitivity of gamma criteria and established lower control limits of VMAT QA for Delta4 dosimetry; the Delta4 Phantom+ measurements were more affected by position errors, and the I-MR chart-derived values are more suitable for establishing lower limits. Acknowledgement: This research was supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Education (No. 2015R1D1A1A01060463)
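The I-MR control limits and the one-sided capability index used in studies like this follow standard SPC formulas. A minimal sketch, with invented sample pass rates rather than the study's data:

```python
# Standard SPC formulas for an individuals (I-MR) chart:
#   sigma is estimated from the average moving range, MRbar / d2 (d2 = 1.128),
#   and the I-chart limits are xbar +/- 2.66 * MRbar.
# For a one-sided lower spec limit (e.g. a 90% pass-rate floor),
#   Cpk = (xbar - LSL) / (3 * sigma).

def imr_limits(samples):
    """Individuals-chart control limits from the average moving range."""
    n = len(samples)
    xbar = sum(samples) / n
    mrbar = sum(abs(samples[i] - samples[i - 1])
                for i in range(1, n)) / (n - 1)
    return xbar - 2.66 * mrbar, xbar + 2.66 * mrbar

def cpk_lower(samples, lsl):
    """One-sided capability index against a lower spec limit."""
    n = len(samples)
    xbar = sum(samples) / n
    mrbar = sum(abs(samples[i] - samples[i - 1])
                for i in range(1, n)) / (n - 1)
    sigma = mrbar / 1.128    # d2 for moving ranges of size 2
    return (xbar - lsl) / (3 * sigma)

rates = [96.0, 97.0, 95.0, 98.0, 96.0]   # illustrative pass rates (%)
lcl, ucl = imr_limits(rates)
cpk = cpk_lower(rates, 90.0)
```

A Cpk above roughly 1.0 against the 90% floor would indicate a capable QA process under these formulas.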
A Mixed-Method Efficacy and Fidelity Study of Check and Connect
ERIC Educational Resources Information Center
Powers, Kristin; Hagans, Kristi; Linn, Megan
2017-01-01
The effectiveness of the Check and Connect dropout prevention program was examined, over the course of 2.5 years, with 54 middle school students from diverse backgrounds experiencing one or more conditions of risk for dropout. Participants were randomly assigned to receive the Check and Connect intervention or business as usual (i.e., control) in…
40 CFR 86.322-79 - NDIR CO2 rejection ratio check.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 40 Protection of Environment 19 2012-07-01 2012-07-01 false NDIR CO2 rejection ratio check. 86.322... PROGRAMS (CONTINUED) CONTROL OF EMISSIONS FROM NEW AND IN-USE HIGHWAY VEHICLES AND ENGINES Emission....322-79 NDIR CO2 rejection ratio check. (a) Zero and span the analyzer on the lowest range that will be...
40 CFR 86.322-79 - NDIR CO2 rejection ratio check.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 40 Protection of Environment 19 2013-07-01 2013-07-01 false NDIR CO2 rejection ratio check. 86.322... PROGRAMS (CONTINUED) CONTROL OF EMISSIONS FROM NEW AND IN-USE HIGHWAY VEHICLES AND ENGINES Emission....322-79 NDIR CO2 rejection ratio check. (a) Zero and span the analyzer on the lowest range that will be...
40 CFR 86.322-79 - NDIR CO2 rejection ratio check.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 18 2011-07-01 2011-07-01 false NDIR CO2 rejection ratio check. 86.322... PROGRAMS (CONTINUED) CONTROL OF EMISSIONS FROM NEW AND IN-USE HIGHWAY VEHICLES AND ENGINES Emission....322-79 NDIR CO2 rejection ratio check. (a) Zero and span the analyzer on the lowest range that will be...
40 CFR 86.322-79 - NDIR CO2 rejection ratio check.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 18 2010-07-01 2010-07-01 false NDIR CO2 rejection ratio check. 86.322... PROGRAMS (CONTINUED) CONTROL OF EMISSIONS FROM NEW AND IN-USE HIGHWAY VEHICLES AND ENGINES Emission....322-79 NDIR CO2 rejection ratio check. (a) Zero and span the analyzer on the lowest range that will be...
Informatics and data quality at collaborative multicenter Breast and Colon Cancer Family Registries.
McGarvey, Peter B; Ladwa, Sweta; Oberti, Mauricio; Dragomir, Anca Dana; Hedlund, Erin K; Tanenbaum, David Michael; Suzek, Baris E; Madhavan, Subha
2012-06-01
Quality control and harmonization of data is a vital and challenging undertaking for any successful data coordination center and a responsibility shared between the multiple sites that produce, integrate, and utilize the data. Here we describe a coordinated effort between scientists and data managers in the Cancer Family Registries to implement a data governance infrastructure consisting of both organizational and technical solutions. The technical solution uses a rule-based validation system that facilitates error detection and correction for data centers submitting data to a central informatics database. Validation rules comprise both standard checks on allowable values and a crosscheck of related database elements for logical and scientific consistency. Evaluation over a 2-year timeframe showed a significant decrease in the number of errors in the database and a concurrent increase in data consistency and accuracy. PMID:22323393
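The two rule types described, range checks on single fields and cross-checks of related fields, can be sketched as follows. The field names and rules here are invented for illustration and are not taken from the registries' actual rule set.

```python
# Illustrative rule-based validation: each record is run through
# (1) range rules on individual fields and (2) cross-field consistency
# predicates; failures are collected rather than raised, so a submitting
# center gets a full error report in one pass.

def validate(record, range_rules, cross_rules):
    errors = []
    for field, (lo, hi) in range_rules.items():
        value = record.get(field)
        if value is None or not (lo <= value <= hi):
            errors.append(f"{field}: out of range")
    for name, predicate in cross_rules:
        if not predicate(record):
            errors.append(name)
    return errors

range_rules = {"age_at_diagnosis": (0, 120)}          # hypothetical rule
cross_rules = [("diagnosis before death",             # hypothetical rule
                lambda r: r["age_at_diagnosis"] <= r["age_at_death"])]
rec = {"age_at_diagnosis": 67, "age_at_death": 61}
issues = validate(rec, range_rules, cross_rules)
```

Collecting all failures per record, instead of stopping at the first, matches the batch error-report workflow a coordination center needs.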
Hydraulic accumulator-compressor for geopressured enhanced oil recovery
Goldsberry, Fred L.
1988-01-01
A hydraulic accumulator-compressor vessel using geothermal brine under pressure as a piston to compress waste (CO2-rich) gas is used in a system having a plurality of gas separators in tandem to recover pipeline-quality gas from geothermal brine. A first high-pressure separator feeds gas to a membrane separator which separates low-pressure waste gas from high-pressure quality gas. A second separator produces low-pressure waste gas. Waste gas from both separators is combined and fed into the vessel through a port at the top as the vessel is drained for another compression cycle. High-pressure brine is then admitted into the vessel through a port at the bottom of the vessel. Check valves control the flow of low-pressure waste gas into the vessel and high-pressure waste gas out of the vessel.
[Coronary artery bypass surgery: methods of performance monitoring and quality control].
Albert, A; Sergeant, P; Ennker, J
2009-10-01
The strength of coronary bypass operations depends on the preservation of their benefits regarding freedom from symptoms, quality of life and survival over decades. Significant variability of the results of an operative intervention according to the hospital or the operating surgeon is considered a weakness of the procedure. External quality assurance aims to create a transparent market of care providers by making hospital rankings comparable; widely available information and competition should promote improvement of overall quality. The structured dialog acts as a control instrument for the BQS (Federal Office for Quality Assurance). It is launched in case of deviations from the standard references or statistically significant differences between the results of operations in a given hospital and the national average. Compared with external control, hospital-internal control is better able to reach medically meaningful statements about treatment results and to correct mistakes in time. An online information portal based on a departmental databank (data warehouse, data mart) is an attractive solution for physicians to be informed transparently and promptly about variability in performance. The individual surgeon significantly influences short- and long-term treatment results; accordingly, selection, targeted training and performance measurement are necessary. Strict risk management and failure analysis of individual cases are among the methods of internal quality control, aiming to identify and correct inadequacies in the system and in the course of treatment. According to international as well as our own experience, at least 30% of deaths after bypass operations are avoidable. A functioning quality control is especially important in minimally invasive interventions because they are often technically more demanding than conventional procedures.
In the field of OPCAB surgery, the special advantages of the procedure can be utilized to achieve nearly complete avoidance of postoperative stroke by combining the procedure with an aorta no-touch technique. The long-term success of the bypass operation depends on the type of bypass material in addition to many other factors; both internal mammary arteries are considered the most durable. Using an operation preparation checklist contributes to operative success.
A System Approach to Navy Medical Education and Training. Appendix 22. Otolaryngology Technician.
1974-08-31
...procedures to patient; 12. explain lumbar puncture procedures to patient; 13. measure/weigh patient or personnel; 14. check central venous pressure; 15. take blood pressure; 16. check radial (wrist) pulse; 17. check femoral pulse for presence and quality; 18. determine apical pulse rate/rhythm with stethoscope; 19. check patient's temperature; 20. check/count respirations; 21. perform circulation check, e.g. color, pulse, temperature of skin, capillary return; 22. ...
MoniQA: a general approach to monitor quality assurance
NASA Astrophysics Data System (ADS)
Jacobs, J.; Deprez, T.; Marchal, G.; Bosmans, H.
2006-03-01
MoniQA ("Monitor Quality Assurance") is a new, non-commercial, independent quality assurance software application developed in our medical physics team. It is a complete Java-based modular environment for the evaluation of radiological viewing devices, and it thus fits into the global quality assurance network of our filmless radiology department. The purpose of the software tool is to guide the medical physicist through an acceptance protocol and the radiologist through a constancy check protocol by presenting the necessary test patterns and by automated data collection. Data are then sent to a central management system for further analysis. At the moment more than 55 patterns have been implemented, which can be grouped in schemes to implement protocols (i.e. AAPM TG18, DIN and EUREF). Some test patterns are dynamically created and 'drawn' on the viewing device with random parameters, as is the case in a recently proposed new pattern for constancy testing. The software is installed on 35 diagnostic stations (70 monitors) in a filmless radiology department. Learning time was very limited. A constancy check (with the new pattern that assesses luminance decrease, resolution problems and geometric distortion) takes only 2 minutes and 28 seconds per monitor. The modular approach of the software allows the evaluation of new or emerging test patterns. We report on the software and its usability: the practicality of the constancy check tests in our hospital and the results from acceptance tests of viewing stations for digital mammography.
Delivery quality assurance with ArcCHECK
DOE Office of Scientific and Technical Information (OSTI.GOV)
Neilson, Christopher; Klein, Michael; Barnett, Rob
2013-04-01
Radiation therapy requires delivery quality assurance (DQA) to ensure that treatment is accurate and closely follows the plan. We report our experience with the ArcCHECK phantom and investigate its potential optimization for the DQA process. One-hundred seventy DQA plans from 84 patients were studied. Plans were classified into 2 groups: those with the target situated on the diodes of the ArcCHECK (D plans) and those with the target situated at the center (C plans). Gamma pass rates for 8 target sites were examined. The parameters used to analyze the data included 3%/3 mm with the Van Dyk percent difference criteria (VD) on, 3%/3 mm with the VD off, 2%/2 mm with the VD on, and x/3 mm with the VD on and the percentage dosimetric agreement "x" for diode plans adjusted. D plans typically displayed a higher maximum planned dose (MPD) on the cylindrical surface containing the ArcCHECK diodes than C plans, resulting in inflated gamma pass rates. When this was taken into account by adjusting the percentage dosimetric agreement, C plans outperformed D plans by an average of 3.5%. ArcCHECK can streamline the DQA process, consuming less time and resources than radiographic films. It is unnecessary to generate 2 DQA plans for each patient; a single center plan will suffice. Six of 8 target sites consistently displayed pass rates well within our acceptance criteria; the lesser performance of the head and neck and spinal sites can be attributed to marginally lower doses and the increased high-gradient regions of those plans.
A method of setting limits for the purpose of quality assurance
NASA Astrophysics Data System (ADS)
Sanghangthum, Taweap; Suriyapee, Sivalee; Kim, Gwe-Ya; Pawlicki, Todd
2013-10-01
The result from any assurance measurement needs to be checked against some limits for acceptability. There are two types of limits: those that define clinical acceptability (action limits) and those that are meant to serve as a warning that the measurement is close to the action limits (tolerance limits). Currently, there is no standard procedure to set these limits. In this work, we propose an operational procedure to set tolerance limits and action limits. The approach to establishing the limits is based on techniques of quality engineering using control charts and a process capability index. The method differs for tolerance limits and action limits, with action limits being categorized into those that are specified and those that are unspecified. The procedure is to first ensure process control using the I-MR control charts. Then, the tolerance limits are set equal to the control chart limits on the I chart. Action limits are determined using the Cpm process capability index, with the requirement that the process must be in control. The limits from the proposed procedure are compared to an existing, conventional method. Four examples are investigated: two of volumetric modulated arc therapy (VMAT) point dose quality assurance (QA) and two of routine linear accelerator output QA. The tolerance limits range from about 6% larger to 9% smaller than conventional action limits for the VMAT QA cases. For the linac output QA, tolerance limits are about 60% smaller than conventional action limits. The operational procedure described in this work is based on established quality management tools and will provide a systematic guide to setting up tolerance and action limits for different equipment and processes.
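The two-tier scheme described (tolerance limits from the I chart, action limits from Cpm) can be sketched with the standard formulas. This is a hedged reading of the procedure, not the authors' code; the sample values, the target of zero, and the required Cpm of 1.0 are assumptions for the example.

```python
import math

def tolerance_limits(samples):
    """Tolerance limits = individuals-chart control limits:
    xbar +/- 2.66 * (average moving range)."""
    n = len(samples)
    xbar = sum(samples) / n
    mrbar = sum(abs(samples[i] - samples[i - 1])
                for i in range(1, n)) / (n - 1)
    return xbar - 2.66 * mrbar, xbar + 2.66 * mrbar

def action_limits(samples, target, required_cpm=1.0):
    """Symmetric action limits about the target chosen so the process just
    achieves the required Cpm, using
    Cpm = (USL - LSL) / (6 * sqrt(var + (mean - target)**2))."""
    n = len(samples)
    mean = sum(samples) / n
    var = sum((x - mean) ** 2 for x in samples) / (n - 1)
    half_width = 3 * required_cpm * math.sqrt(var + (mean - target) ** 2)
    return target - half_width, target + half_width

# Illustrative point-dose percent differences about a target of 0%.
diffs = [0.0, 1.0, -1.0, 0.5, -0.5]
tol_lo, tol_hi = tolerance_limits(diffs)
act_lo, act_hi = action_limits(diffs, target=0.0)
```

Note that Cpm penalizes off-target means as well as spread, which is why it suits action limits tied to a clinical target value.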
Machine vision method for online surface inspection of easy open can ends
NASA Astrophysics Data System (ADS)
Mariño, Perfecto; Pastoriza, Vicente; Santamaría, Miguel
2006-10-01
Easy open can end manufacturing in the food canning sector currently relies on a manual, non-destructive testing procedure to guarantee can end repair coating quality. This surface inspection is based on a visual check made by human inspectors. Due to the high production rate (100 to 500 ends per minute), only a small part of each lot is verified (statistical sampling); an automatic, online inspection system based on machine vision has therefore been developed to improve this quality control. The inspection system uses a fuzzy model to make the acceptance/rejection decision for each can end from the information obtained by the vision sensor. In this work, the inspection method is presented. This surface inspection system checks the total production, classifies the ends in agreement with an expert human inspector, provides interpretability so that operators can identify failure causes and reduce mean time to repair during failures, and allows the minimum can end repair coating quality to be adjusted.
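A fuzzy accept/reject rule of the general kind described can be sketched very simply. The single "coating coverage" feature, the membership shape, and the 0.5 decision threshold are assumptions for illustration, not the published model.

```python
# Minimal fuzzy-style decision sketch: a membership function maps a
# measured feature to a [0, 1] "good coating" score, and a threshold on
# that score yields the accept/reject decision. All values hypothetical.

def mu_good(coverage):
    """Shoulder-shaped membership of 'good coating': 0 below 0.85,
    1 above 0.95, linear in between."""
    if coverage >= 0.95:
        return 1.0
    if coverage <= 0.85:
        return 0.0
    return (coverage - 0.85) / 0.10

def decide(coverage, threshold=0.5):
    score = mu_good(coverage)
    return ("accept" if score >= threshold else "reject"), score

decision, score = decide(0.92)
```

Exposing the score alongside the decision is what gives operators the interpretability the abstract emphasizes: a borderline score points to a marginal coating rather than a hard failure.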
Chen, Nai-dong; Gao, Feng; Lin, Xin; Jin, Hui
2014-06-01
To compare the composition and content of alkaloids of Dendrobium huoshanense tissue-culture seedlings and wild plants, a comparative evaluation of quality was carried out by HPLC and TLC methods, covering both the composition and the content of alkaloids. Remarkable variation existed between the two kinds of Dendrobium huoshanense. For the tissue-culture plant, only two alkaloids were detected by both HPLC and TLC, while four alkaloids were observed in the wild plant. The alkaloid content of tissue-culture seedlings and wild plants was (0.29 ± 0.11)‰ and (0.43 ± 0.15)‰, respectively. A distinct difference is observed in both composition and content of alkaloids from the annual shoots of different provenances of Dendrobium huoshanense. This suggests that the quality of tissue-culture seedlings of Dendrobium huoshanense may be inconsistent with the wild plant. Furthermore, the established alkaloid-knock-out HPLC method provides a new research tool for quality control of Chinese medicinal materials that contain unknown alkaloids.
Anderer, Peter; Gruber, Georg; Parapatics, Silvia; Woertz, Michael; Miazhynskaia, Tatiana; Klosch, Gerhard; Saletu, Bernd; Zeitlhofer, Josef; Barbanoj, Manuel J; Danker-Hopfe, Heidi; Himanen, Sari-Leena; Kemp, Bob; Penzel, Thomas; Grozinger, Michael; Kunz, Dieter; Rappelsberger, Peter; Schlogl, Alois; Dorffner, Georg
2005-01-01
To date, the only standard for the classification of sleep-EEG recordings that has found worldwide acceptance is the set of rules published in 1968 by Rechtschaffen and Kales. Even though several attempts have been made to automate the classification process, so far no method has been published that has proven its validity in a study including a sufficiently large number of controls and patients of all adult age ranges. The present paper describes the development and optimization of an automatic classification system that is based on one central EEG channel, two EOG channels and one chin EMG channel. It adheres to the decision rules for visual scoring as closely as possible and includes a structured quality control procedure by a human expert. The final system (Somnolyzer 24 x 7) consists of a raw data quality check, a feature extraction algorithm (density and intensity of sleep/wake-related patterns such as sleep spindles, delta waves, SEMs and REMs), a feature matrix plausibility check, a classifier designed as an expert system, a rule-based smoothing procedure for the start and the end of stage REM, and finally a statistical comparison to age- and sex-matched normal healthy controls (Siesta Spot Report). The expert system considers different prior probabilities of stage changes depending on the preceding sleep stage, the occurrence of a movement arousal and the position of the epoch within the NREM/REM sleep cycles. Moreover, results obtained with and without using the chin EMG signal are combined. The Siesta polysomnographic database (590 recordings from both normal healthy subjects aged 20-95 years and patients suffering from organic or nonorganic sleep disorders) was split into two halves, which were randomly assigned to a training and a validation set, respectively.
The final validation revealed an overall epoch-by-epoch agreement of 80% (Cohen's kappa: 0.72) between the Somnolyzer 24 x 7 and the human expert scoring, as compared with an inter-rater reliability of 77% (Cohen's kappa: 0.68) between two human experts scoring the same dataset. Two Somnolyzer 24 x 7 analyses (including a structured quality control by two human experts) revealed an inter-rater reliability close to 1 (Cohen's kappa: 0.991), which confirmed that the variability induced by the quality control procedure, whereby approximately 1% of the epochs (in 9.5% of the recordings) are changed, can definitely be neglected. Thus, the validation study proved the high reliability and validity of the Somnolyzer 24 x 7 and demonstrated its applicability in clinical routine and sleep studies.
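The epoch-by-epoch agreement figures above are chance-corrected with Cohen's kappa, which can be computed directly from two scorers' label sequences. A minimal sketch (the sleep-stage labels below are illustrative):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Epoch-by-epoch agreement corrected for chance agreement, as used to
    compare automatic and human sleep scorings (illustrative implementation)."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    ca, cb = Counter(rater_a), Counter(rater_b)
    # Chance agreement: product of the two raters' marginal label frequencies.
    expected = sum(ca[label] * cb[label] for label in set(ca) | set(cb)) / n**2
    return (observed - expected) / (1 - expected)

a = ["W", "N2", "N2", "REM", "N3", "N2"]   # scorer 1, six epochs
b = ["W", "N2", "N1", "REM", "N3", "N2"]   # scorer 2
print(round(cohens_kappa(a, b), 3))        # 0.778
```

A raw agreement of 80% with kappa 0.72, as reported, reflects exactly this correction: some epoch-level agreement is expected by chance from the stage distribution alone.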
NASA Astrophysics Data System (ADS)
Robichaud, A.; Ménard, R.
2013-05-01
We present multi-year objective analyses (OA) at high spatio-temporal resolution (15 or 21 km, every hour) for the warm season (1 May-31 October) for ground-level ozone (2002-2012) and for fine particulate matter (particles with diameter less than 2.5 microns, PM2.5) (2004-2012). The OA used here combines the Canadian Air Quality forecast suite with US and Canadian surface air quality monitoring sites. The analysis is based on an optimal interpolation with capabilities for adaptive error statistics for ozone and PM2.5 and an explicit bias correction scheme for the PM2.5 analyses. The error statistics have been estimated using a modified version of the Hollingsworth-Lönnberg (H-L) method. Various quality controls (gross error check, sudden jump test and background check) have been applied to the observations to remove outliers. An additional quality control is applied to check the consistency of the error statistics estimation model at each observing station and for each hour. The error statistics are further tuned "on the fly" using a χ2 (chi-square) diagnostic, a procedure which yields significantly better verification scores than the untuned analysis. Successful cross-validation experiments were performed with an OA set-up using 90% of the observations to build the objective analysis, with the remaining 10% left out as an independent dataset for verification. Furthermore, comparisons with other external sources of information (global models and PM2.5 satellite-derived surface measurements) show reasonable agreement. The multi-year analyses obtained provide relatively high precision, with an absolute yearly averaged systematic error of less than 0.6 ppbv (parts per billion by volume) for ozone and 0.7 μg m-3 (micrograms per cubic meter) for PM2.5, and a random error generally less than 9 ppbv for ozone and under 12 μg m-3 for PM2.5.
In this paper, we focus on two applications: (1) presenting long term averages of objective analysis and analysis increments as a form of summer climatology and (2) analyzing long term (decadal) trends and inter-annual fluctuations using OA outputs. Our results show that high percentiles of ozone and PM2.5 are both following a decreasing trend overall in North America with the eastern part of United States (US) presenting the highest decrease likely due to more effective pollution controls. Some locations, however, exhibited an increasing trend in the mean ozone and PM2.5 such as the northwestern part of North America (northwest US and Alberta). The low percentiles are generally rising for ozone which may be linked to increasing emissions from emerging countries and the resulting pollution brought by the intercontinental transport. After removing the decadal trend, we demonstrate that the inter-annual fluctuations of the high percentiles are significantly correlated with temperature fluctuations for ozone and precipitation fluctuations for PM2.5. We also show that there was a moderately significant correlation between the inter-annual fluctuations of the high percentiles of ozone and PM2.5 with economic indices such as the Industrial Dow Jones and/or the US gross domestic product growth rate.
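The three observation quality controls named in this abstract (gross error check, sudden jump test, background check) can be sketched in a few lines. The thresholds below are illustrative assumptions, not the operational values used in the Canadian OA system:

```python
# Minimal sketch of the observation quality controls described above.
# Range, jump, and rejection thresholds are illustrative assumptions.

def quality_control(obs, background, obs_sigma, bkg_sigma,
                    valid_range=(0.0, 200.0), max_jump=50.0, k=4.0):
    """Return indices of hourly observations passing all three checks.
    obs: observed time series; background: model first guess per hour."""
    passed, last_accepted = [], None
    for t, (y, b) in enumerate(zip(obs, background)):
        # 1. Gross error check: physically plausible range.
        if not (valid_range[0] <= y <= valid_range[1]):
            continue
        # 2. Sudden jump test: implausible change vs. last accepted value.
        if last_accepted is not None and abs(y - last_accepted) > max_jump:
            continue
        # 3. Background check: innovation within k standard deviations of
        #    the combined observation + background error.
        if abs(y - b) > k * (obs_sigma**2 + bkg_sigma**2) ** 0.5:
            continue
        passed.append(t)
        last_accepted = y
    return passed

obs = [30.0, 32.0, 250.0, 31.0, 95.0]   # e.g. hourly ozone in ppbv
bkg = [31.0, 31.5, 32.0, 30.5, 33.0]
print(quality_control(obs, bkg, obs_sigma=5.0, bkg_sigma=8.0))  # [0, 1, 3]
```

Here index 2 fails the gross error check and index 4 fails both the jump and background checks, so only the plausible hours feed the analysis.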
A source-channel coding approach to digital image protection and self-recovery.
Sarreshtedari, Saeed; Akhaee, Mohammad Ali
2015-07-01
Watermarking algorithms have recently been widely applied in the field of image forensics. One of these forensic applications is the protection of images against tampering. For this purpose, a watermarking algorithm must fulfill two purposes in case of image tampering: 1) detecting the tampered area of the received image and 2) recovering the lost information in the tampered zones. State-of-the-art techniques accomplish these tasks using watermarks consisting of check bits and reference bits. Check bits are used for tampering detection, whereas reference bits carry information about the whole image. The problem of recovering the lost reference bits still stands. This paper aims to show that, with the tampering location known, image tampering can be modeled and dealt with as an erasure error. Therefore, an appropriate design of channel code can protect the reference bits against tampering. In the proposed method, the total watermark bit-budget is divided among three groups: 1) source encoder output bits; 2) channel code parity bits; and 3) check bits. In the watermark embedding phase, the original image is source coded and the output bit stream is protected using an appropriate channel encoder. For image recovery, the erasure locations detected by the check bits help the channel erasure decoder retrieve the original source-encoded image. Experimental results show that the proposed scheme significantly outperforms recent techniques in terms of image quality for both the watermarked and the recovered image. The watermarked image quality gain is achieved by spending less bit-budget on the watermark, while the image recovery quality is considerably improved as a consequence of the consistent performance of the designed source and channel codes.
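The key idea, that known tamper locations turn lost reference bits into correctable erasures, can be shown with the simplest possible channel code. The paper uses a stronger code; this toy XOR-parity sketch recovers only a single erased block per parity group, but illustrates why knowing *where* the damage is makes recovery possible:

```python
# Toy erasure-correction illustration: once check bits reveal WHICH block
# was tampered (erased), one XOR parity block restores it exactly.
# The actual scheme uses a stronger channel code; this handles only one
# erasure per parity group.

def make_parity(blocks):
    """XOR of all data blocks (equal-length byte strings)."""
    parity = bytearray(len(blocks[0]))
    for blk in blocks:
        for i, byte in enumerate(blk):
            parity[i] ^= byte
    return bytes(parity)

def recover(blocks, parity, erased_index):
    """Rebuild the erased block from the surviving blocks and the parity."""
    rebuilt = bytearray(parity)
    for j, blk in enumerate(blocks):
        if j == erased_index:
            continue
        for i, byte in enumerate(blk):
            rebuilt[i] ^= byte
    return bytes(rebuilt)

data = [b"refs", b"bits", b"img0"]          # reference-bit blocks
p = make_parity(data)
assert recover(data, p, 1) == b"bits"       # tampered block restored
```

Without the erasure location, the same parity bits could only detect, not correct, the damage; locating tampering via check bits is what buys the recovery capability.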
HISTORICAL EMISSION AND OZONE TRENDS IN THE HOUSTON AREA
An analysis of historical trend data for emissions and air quality in Houston for the period 1974-78 is conducted for the purposes of checking the EKMA O3-predicting model and of exploring empirical relations between emission changes and O3 air quality in the Houston area. Results...
14 CFR 141.83 - Quality of training.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 14 Aeronautics and Space 3 2011-01-01 2011-01-01 false Quality of training. 141.83 Section 141.83 Aeronautics and Space FEDERAL AVIATION ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED) SCHOOLS AND... the FAA to administer any knowledge test, practical test, stage check, or end-of-course test to its...
14 CFR 141.83 - Quality of training.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 14 Aeronautics and Space 3 2010-01-01 2010-01-01 false Quality of training. 141.83 Section 141.83 Aeronautics and Space FEDERAL AVIATION ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED) SCHOOLS AND... the FAA to administer any knowledge test, practical test, stage check, or end-of-course test to its...
Objectivity of the Subjective Quality: Convergence on Competencies Expected of Doctoral Graduates
ERIC Educational Resources Information Center
Kariyana, Israel; Sonn, Reynold A.; Marongwe, Newlin
2017-01-01
This study assessed the competencies expected of doctoral graduates. Twelve purposefully sampled education experts provided the data. A case study design within a qualitative approach was adopted. Data were gathered through interviews and thematically analysed. Member checking ensured data trustworthiness. Factors affecting the quality of a…
Hard Spring Wheat Technical Committee 2016 Crop
USDA-ARS?s Scientific Manuscript database
Seven experimental lines of hard spring wheat were grown at up to five locations in 2016 and evaluated for kernel, milling, and bread baking quality against the check variety Glenn. Wheat samples were submitted through the Wheat Quality Council and processed and milled at the USDA-ARS Hard Red Spri...
ERIC Educational Resources Information Center
Downing, Christopher O., Jr.; Geller, E. Scott
2012-01-01
A participative goal-setting and feedback intervention increased cashiers' identification-checking behavior at a large grocery store. The cashiers' identification-checking percentages increased from 0.2% at baseline to 9.7% during the intervention phase and then declined to 2.3% during withdrawal. At the control store, the percentages of…
Quality in the molecular microbiology laboratory.
Wallace, Paul S; MacKay, William G
2013-01-01
In the clinical microbiology laboratory, advances in nucleic acid detection, quantification, and sequence analysis have led to considerable improvements in the diagnosis, management, and monitoring of infectious diseases. Molecular diagnostic methods are routinely used to make clinical decisions about when and how to treat a patient, as well as to monitor the effectiveness of a therapeutic regime and identify any potential drug-resistant strains that may impact the long-term patient treatment program. Therefore, confidence in the reliability of the result provided by the laboratory service to the clinician is essential for patient treatment. Hence, suitable quality assurance and quality control measures are important to ensure that the laboratory methods and service meet the necessary regulatory requirements at both the national and international level. In essence, the modern clinical microbiology laboratory ensures the appropriateness of its services through a quality management system that monitors all aspects of the laboratory service, pre- and post-analytical: from patient sample receipt to reporting of results, and from checking and upholding staff competency within the laboratory to identifying areas for quality improvement within the service offered. For most European clinical microbiology laboratories this means following the common ISO 9001 framework and ISO 15189, which sets out the quality management requirements for the medical laboratory (BS EN ISO 15189 (2003) Medical laboratories-particular requirements for quality and competence. British Standards Institute, Bristol, UK). In the United States, clinical laboratories performing human diagnostic tests are regulated by the Centers for Medicare and Medicaid Services (CMS) under the requirements of the Clinical Laboratory Improvement Amendments of 1988 (CLIA-88).
This chapter focuses on the key quality assurance and quality control requirements within the modern microbiology laboratory providing molecular diagnostics.
STS-92 MS Wisoff gets suit checked in the White Room before launch
NASA Technical Reports Server (NTRS)
2000-01-01
STS-92 Mission Specialist Peter J.K. "Jeff" Wisoff reaches out to shake the hand of Danny Wyatt, KSC NASA Quality Assurance specialist, after completing final check of his launch and entry suit in the White Room before entering Discovery. The White Room is an environmentally controlled area at the end of the Orbiter Access Arm that provides entry to the orbiter as well as emergency egress if needed. The arm remains in the extended position until 7 minutes 24 seconds before launch. Wisoff and the rest of the crew are undertaking the fifth flight to the International Space Station for construction. Discovery carries a payload that includes the Integrated Truss Structure Z-1, first of 10 trusses that will form the backbone of the Space Station, and the third Pressurized Mating Adapter that will provide a Shuttle docking port for solar array installation on the sixth Station flight and Lab installation on the seventh Station flight. The mission includes four spacewalks for the construction activities. Discovery's landing is expected Oct. 22 at 2:10 p.m. EDT.
Good health checks according to the general public; expectations and criteria: a focus group study.
Stol, Yrrah H; Asscher, Eva C A; Schermer, Maartje H N
2018-06-22
Health checks or health screenings identify (risk factors for) disease in people without a specific medical indication. So far, the perspective of (potential) health check users has remained underexposed in discussions about the ethics and regulation of health checks. In 2017, we conducted a qualitative study with lay people from the Netherlands (four focus groups). We asked what participants consider characteristics of good and bad health checks, and whether they saw a role for the Dutch government. Participants consider a good predictive value the most important characteristic of a good health check. Information before, during and after the test, knowledgeable and reliable providers, tests for treatable (risk factors for) disease, respect for privacy, no unnecessary health risks and accessibility are also mentioned as criteria for good health checks. Participants make many assumptions about health check offers. They assume health checks provide certainty about the presence or absence of disease, that health checks offer opportunities for health benefits and that the privacy of health check data is guaranteed. In their choice for provider and test they tend to rely more on heuristics than information. Participants trust physicians to put the interest of potential health check users first and expect the Dutch government to intervene if providers other than physicians failed to do so by offering tests with a low predictive value, or tests that may harm people, or by infringing the privacy of users. Assumptions of participants are not always justified, but they may influence the choice to participate. This is problematic because choices for checks with a low predictive value that do not provide health benefits may create uncertainty and may cause harm to health; an outcome diametrically opposite to the one intended. Also, this may impair the relationship of trust with physicians and the Dutch government. 
To promote and protect autonomous choice and to maintain trust, we recommend the following measures to correct false expectations in a timely manner: advertisements that give an accurate impression of health check offers, and the installation of a quality mark.
The Single Soldier Quality of Life Initiative: Great Expectations of Privacy
1995-04-01
without regard to their marital status and to hold them accountable to established standards. To many "old soldiers," some of the ideas contained...Family Housing Office: assign and terminate quarters, conduct check-in and check-out inspections, maintain accountability of SQ furniture, follow up on...integrity is a second priority." Further hindering unit integrity is that smoking preference of the soldiers must be taken into account when making
A Quality Control study of the distribution of NOAA MIRS Cloudy retrievals during Hurricane Sandy
NASA Astrophysics Data System (ADS)
Fletcher, S. J.
2013-12-01
Cloudy radiances present a difficult challenge to data assimilation (DA) systems, through both the radiative transfer modeling and the hydrometeors required to resolve cloud and precipitation. In most DA systems the hydrometeors are not control variables, due to many limitations. The National Oceanic and Atmospheric Administration's (NOAA) Microwave Integrated Retrieval System (MIRS) produces products from the NPP-ATMS satellite where the scene is cloud- and precipitation-affected. The test case presented here is the lifetime of Hurricane and then Superstorm Sandy in October 2012. As a quality control study, we compare the retrieved water vapor content during the lifetime of Sandy with the first guess and the analysis from the NOAA Gridpoint Statistical Interpolation (GSI) system. The assessment involves the gross error check against the first guess with different values for the observational error variance, to see whether the difference is within three standard deviations. We also compare against the final analysis at the relevant cycles to see whether the products retrieved through cloudy radiances are similar, given that the DA system does not yet assimilate cloudy radiances.
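The gross error check described here reduces to a single inequality: the departure of the retrieval from the first guess must lie within three standard deviations of the assumed observation error. A sketch with illustrative numbers (the variable and values are hypothetical):

```python
# Sketch of the gross error check described above: flag a retrieval whose
# departure from the first guess exceeds n_sigma standard deviations of the
# assumed observation error. All numbers are illustrative.

def gross_error_check(retrieved, first_guess, obs_std, n_sigma=3.0):
    """Return True if the retrieval passes (departure within n_sigma)."""
    return abs(retrieved - first_guess) <= n_sigma * obs_std

# Hypothetical total precipitable water values (mm):
print(gross_error_check(42.0, 45.0, obs_std=2.0))   # |-3| <= 6  -> True
print(gross_error_check(58.0, 45.0, obs_std=2.0))   # |13| > 6   -> False
```

Varying `obs_std`, as the study does with different observational error variances, directly widens or tightens the acceptance band.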
Mikels, Joseph A; Löckenhoff, Corinna E; Maglio, Sam J; Goldstein, Mary K; Garber, Alan; Carstensen, Laura L
2010-03-01
Research on aging has indicated that whereas deliberative cognitive processes decline with age, emotional processes are relatively spared. To examine the implications of these divergent trajectories in the context of health care choices, we investigated whether instructional manipulations emphasizing a focus on feelings or details would have differential effects on decision quality among younger and older adults. We presented 60 younger and 60 older adults with health care choices that required them to hold in mind and consider multiple pieces of information. Instructional manipulations in the emotion-focus condition asked participants to focus on their emotional reactions to the options, report their feelings about the options, and then make a choice. In the information-focus condition, participants were instructed to focus on the specific attributes, report the details about the options, and then make a choice. In a control condition, no directives were given. Manipulation checks indicated that the instructions were successful in eliciting different modes of processing. Decision quality data indicate that younger adults performed better in the information-focus than in the control condition whereas older adults performed better in the emotion-focus and control conditions than in the information-focus condition. Findings support and extend extant theorizing on aging and decision making as well as suggest that interventions to improve decision-making quality should take the age of the decision maker into account.
Pollution of surface water in Europe
Key, A.
1956-01-01
This paper discusses pollution of surface water in 18 European countries. For each an account is given of its physical character, population, industries, and present condition of water supplies; the legal, administrative, and technical means of controlling pollution are then described, and an outline is given of current research on the difficulties peculiar to each country. A general discussion of various aspects common to the European problem of water pollution follows; standards of quality are suggested; some difficulties likely to arise in the near future are indicated, and international collaboration, primarily by the exchange of information, is recommended to check or forestall these trends. PMID:13374532
[Quality assessment of sulfur-fumigated paeoniae alba radix].
Wang, Zhao; Chen, Yu-Wu; Wang, Qiong; Sun, Lei; Xu, Wei-Yi; Jin, Hong-Yu; Ma, Shuang-Cheng
2014-08-01
Samples of sulfur-fumigated Paeoniae Alba Radix, acquired both by random spot checks of the domestic market and by self-production in the laboratory, were used to evaluate the effects of sulfur fumigation on the quality of Paeoniae Alba Radix by comparing the degree and character of fumigation, the content of paeoniflorin and paeoniflorin sulfurous acid ester, and changes in the fingerprint. We used the methods of the Chinese Pharmacopoeia to evaluate the character of sulfur-fumigated Paeoniae Alba Radix and to determine its paeoniflorin content. An LC-MS method was used to analyze the conversion products of paeoniflorin. HPLC fingerprint methods were established to evaluate quality differences by similarity. Results showed that fumigated Paeoniae Alba Radix became white and its unique fragrance disappeared, along with the production of a pungent sour gas. Fumigation also had a significant effect on paeoniflorin content: as the degree of sulfur fumigation increased, paeoniflorin content decreased, some of it being converted irreversibly into paeoniflorin sulfurous acid ester. The fingerprint also showed obvious changes. Clearly, sulfur fumigation has a severe influence on the quality of Paeoniae Alba Radix, but its quality can be controlled by testing the paeoniflorin sulfurous acid ester content.
NASA Astrophysics Data System (ADS)
Stockhause, M.; Höck, H.; Toussaint, F.; Weigel, T.; Lautenschlager, M.
2012-12-01
We present the publication process for the CMIP5 (Coupled Model Intercomparison Project Phase 5) data, with special emphasis on the current role of identifiers and the potential future role of PIDs in such distributed technical infrastructures. The DataCite data publication with DOI assignment finalizes the three-level quality control procedure for CMIP5 data (Stockhause et al., 2012). WDCC utilizes the Assistant System Atarrabi to support the publication process. Atarrabi is a web-based workflow system for metadata reviews by data creators and Publication Agents (PAs). Within the level 3 quality checks, all available information in the different infrastructure components is cross-checked for consistency by the DataCite PA. This information includes: metadata on data, metadata in the long-term archive of the Publication Agency, quality information, and external metadata on model and simulation (CIM). For these consistency checks, metadata related to the data publication has to be identified. The Data Reference Syntax (DRS) convention functions as a global identifier for data. Since the DRS structures the data hierarchically, it can be used to identify data collections such as DataCite publication units, i.e. all data belonging to a CMIP5 simulation. Every technical component of the infrastructure uses the DRS or maps to it, but there is no central repository storing DRS ids, so they occasionally have to be mapped. Additional local identifiers are used within the different technical infrastructure components. Identification of related pieces of information in their repositories is cumbersome and tricky for the PA. How could PIDs improve the situation? To establish a reliable distributed data and metadata infrastructure, PIDs are needed for all objects, as well as the relations between them. An ideal data publication scenario for federated community projects within the Earth System Sciences, e.g. CMIP, would be: 1.
Data creators at the modeling centers define their simulation, related metadata, and software, which are assigned PIDs. 2. During ESGF data publication the data entities are assigned PIDs with references to the PIDs of 1. Since we deal with different hierarchical levels, the definition of collections on these levels is advantageous. A possible implementation concept using Handles is described by Weigel et al. (2012). 3. Quality results are assigned PID(s) and a reference to the data. A quality PID is added as a reference to the data collection PID. 4. The PA accesses the PID on the data collection to get the data and all related information for cross-checking. The presented example of the technical infrastructure for the CMIP5 data distribution shows the importance of PIDs, especially as the data is distributed over multiple repositories world-wide and additional separate pieces of data related information are independently collected from the data. References: Stockhause, M., Höck, H., Toussaint, F., Lautenschlager, M. (2012): 'Quality assessment concept of the World Data Center for Climate and its application to CMIP5 data', Geosci. Model Dev. Discuss., 5, 781-802, doi:10.5194/gmdd-5-781-2012. Weigel, T., et al. (2012): 'Structural Elements in a Persistent Identifier Infrastructure and Resulting Benefits for the Earth Science Community', submitted to AGU 2012 Session IN009.
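Because the DRS is a dot-separated, hierarchically ordered identifier, identifying a publication unit amounts to selecting a subset of its facets. A hypothetical sketch (the facet names and the choice of facets forming a simulation-level collection are assumptions made for illustration; the authoritative ordering is in the CMIP5 DRS document):

```python
# Hypothetical sketch: parse a CMIP5 DRS id into facets and derive a
# simulation-level (publication-unit) collection id. Facet names and the
# publication-unit facet subset are illustrative assumptions.

DRS_FACETS = ["activity", "product", "institute", "model", "experiment",
              "frequency", "modeling_realm", "mip_table", "ensemble_member"]

def parse_drs(drs_id):
    """Split a dot-separated DRS id into a facet dictionary."""
    return dict(zip(DRS_FACETS, drs_id.split(".")))

def publication_unit(facets):
    """Collection id for all data of one simulation (one ensemble member)."""
    keys = ["activity", "product", "institute", "model", "experiment",
            "ensemble_member"]
    return ".".join(facets[k] for k in keys)

f = parse_drs("cmip5.output1.MPI-M.MPI-ESM-LR.historical.mon.atmos.Amon.r1i1p1")
print(publication_unit(f))  # cmip5.output1.MPI-M.MPI-ESM-LR.historical.r1i1p1
```

This kind of facet selection is what lets the DRS identify DataCite publication units even though each infrastructure component stores its own local identifiers.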
OntoCheck: verifying ontology naming conventions and metadata completeness in Protégé 4
2012-01-01
Background Although policy providers have outlined minimal metadata guidelines and naming conventions, ontologies of today still display inter- and intra-ontology heterogeneities in class labelling schemes and metadata completeness. This fact is at least partially due to missing or inappropriate tools. Software support can ease this situation and contribute to overall ontology consistency and quality by helping to enforce such conventions. Objective We provide a plugin for the Protégé Ontology editor to allow for easy checks on compliance with ontology naming conventions and metadata completeness, as well as curation in case of found violations. Implementation In a requirement analysis, derived from a prior standardization approach carried out within the OBO Foundry, we investigate the capabilities needed for software tools to check, curate and maintain class naming conventions. A Protégé tab plugin was implemented accordingly using the Protégé 4.1 libraries. The plugin was tested on six different ontologies. Based on these test results, the plugin was refined, including the integration of new functionality. Results The new Protégé plugin, OntoCheck, allows for ontology tests to be carried out on OWL ontologies. In particular, the OntoCheck plugin helps to clean up an ontology with regard to lexical heterogeneity, i.e. enforcing naming conventions and metadata completeness, meeting most of the requirements outlined for such a tool. Found test violations can be corrected to foster consistency in entity naming and meta-annotation within an artefact. Once specified, check constraints like name patterns can be stored and exchanged for later re-use. Here we describe a first version of the software, illustrate its capabilities and use within running ontology development efforts and briefly outline improvements resulting from its application. Further, we discuss OntoCheck's capabilities in the context of related tools and highlight potential future expansions.
Conclusions The OntoCheck plugin facilitates labelling error detection and curation, contributing to lexical quality assurance in OWL ontologies. Ultimately, we hope this Protégé extension will ease ontology alignments as well as lexical post-processing of annotated data and hence can increase overall secondary data usage by humans and computers. PMID:23046606
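The core of such a tool, checking class labels against a stored name pattern and flagging missing metadata, can be sketched outside Protégé in a few lines. The convention enforced below (lowercase, space-separated labels; mandatory textual definition) is only an example, not OntoCheck's actual rule set:

```python
import re

# Illustrative re-implementation of the kind of checks OntoCheck performs:
# enforce a label naming convention and flag missing definitions.
# The convention and metadata fields shown are example assumptions.

LABEL_PATTERN = re.compile(r"^[a-z][a-z0-9]*( [a-z0-9]+)*$")

def check_classes(classes):
    """classes: dict mapping label -> metadata dict. Returns violations."""
    violations = []
    for label, meta in classes.items():
        if not LABEL_PATTERN.match(label):
            violations.append((label, "label violates naming convention"))
        if not meta.get("definition"):
            violations.append((label, "missing textual definition"))
    return violations

onto = {
    "cell membrane": {"definition": "The lipid bilayer around a cell."},
    "Golgi_Apparatus": {"definition": ""},
}
for label, problem in check_classes(onto):
    print(label, "->", problem)
```

As in OntoCheck, the pattern itself is data (a compiled regular expression here), so a convention can be stored and exchanged for re-use across ontologies.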
Rmax: A systematic approach to evaluate instrument sort performance using center stream catch☆
Riddell, Andrew; Gardner, Rui; Perez-Gonzalez, Alexis; Lopes, Telma; Martinez, Lola
2015-01-01
Sorting performance can be evaluated with regard to the Purity, Yield and/or Recovery of the sorted fraction. Purity is a check on the quality of the sample and the sort decisions made by the instrument. Definitions of Recovery and Yield vary: some authors regard both as measures of how efficiently the instrument sorts the target particles from the original sample, while others distinguish Recovery from Yield, using the former to describe the accuracy of the instrument's sort count. Yield and Recovery are often neglected, mostly due to difficulties in their measurement. Purity of the sort product is often cited alone but is not sufficient to evaluate sorting performance. All three performance metrics require re-sampling of the sorted fraction. But, unlike Purity, calculating Yield and/or Recovery calls for the absolute counting of particles in the sorted fraction, which may not be feasible, particularly when dealing with rare populations and precious samples. In addition, the counting process itself involves large errors. Here we describe a new metric for evaluating instrument sort Recovery, defined as the number of particles sorted relative to the number of original particles to be sorted. This calculation requires only measuring the ratios of target and non-target populations in the original pre-sort sample and in the waste stream or center stream catch (CSC), avoiding re-sampling of the sorted fraction and absolute counting. We call this new metric Rmax, since it corresponds to the maximum expected Recovery for a particular set of instrument parameters. Rmax is ideal for evaluating and troubleshooting the optimum drop-charge delay of the sorter, or any instrument-related failures that affect sort performance. It can be used as a daily quality control check but is particularly useful for assessing instrument performance before single-cell sorting experiments.
Because we do not perturb the sort fraction we can calculate Rmax during the sort process, being especially valuable to check instrument performance during rare population sorts. PMID:25747337
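One plausible reading of the Rmax calculation can be sketched as follows. The formula below is an assumption, not necessarily the one in the paper: it assumes non-target particles pass to the CSC essentially unchanged, so the drop in the target:non-target ratio between the pre-sort sample and the CSC reflects the fraction of targets removed by sorting.

```python
# Illustrative sketch of an Rmax-style calculation. ASSUMPTION: non-target
# particles reach the CSC unchanged, so
#   Rmax = 1 - (target:non-target ratio in CSC) / (same ratio pre-sort).
# Consult the paper for the authoritative formula.

def rmax(target_frac_pre, target_frac_csc):
    """Maximum expected recovery from target fractions measured in the
    pre-sort sample and in the center stream catch (CSC)."""
    ratio_pre = target_frac_pre / (1.0 - target_frac_pre)
    ratio_csc = target_frac_csc / (1.0 - target_frac_csc)
    return 1.0 - ratio_csc / ratio_pre

# 20% targets before sorting, 2% remaining in the CSC afterwards:
print(round(rmax(0.20, 0.02), 3))  # 0.918
```

Note that both inputs are population ratios from ordinary cytometer acquisitions of the pre-sort sample and the CSC, which is why no absolute counting or re-sampling of the sorted fraction is needed.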
[Design and implementation of data checking system for Chinese materia medica resources survey].
Wang, Hui; Zhang, Xiao-Bo; Ge, Xiao-Guang; Jin, Yan; Jing, Zhi-Xian; Qi, Yuan-Hua; Wang, Ling; Zhao, Yu-Ping; Wang, Wei; Guo, Lan-Ping; Huang, Lu-Qi
2017-11-01
The Chinese materia medica resources (CMMR) national survey information management system has collected a large amount of data. To assist with data rechecking, reduce the in-house workload, and improve the rechecking of survey data at the provincial and county levels, the National Resource Center for Chinese Materia Medica has designed a data checking system for the Chinese materia medica resources survey based on J2EE technology, the Java language, and an Oracle database, in accordance with an SOA framework. The system provides single-record checks, check scoring, content management, and both manual and automatic checking of survey data covering nine aspects (census implementation plans, key research information, general survey information, cultivation of medicinal materials, germplasm resources, medicinal material information, market research, traditional knowledge, and specimen information), comprising 20 classes and 175 indicators assessed for both quantity and quality. The established system assists in verifying data consistency and accuracy and prompts county survey teams to complete data entry and organization in a timely manner, so as to improve the integrity, consistency and accuracy of the survey data and ensure that the data are valid and usable, laying a foundation for accurate data support for the national CMMR survey results summary, display, and sharing. Copyright© by the Chinese Pharmaceutical Association.
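An automatic indicator check with a simple "check score" of the kind described can be sketched as follows. The field names and validation rules are invented for illustration; the real system evaluates 175 indicators across nine survey aspects:

```python
# Hypothetical sketch of automatic record checking with a check score:
# each rule validates one indicator; the score is the fraction passed.
# Field names and rules are illustrative assumptions.

RULES = {
    "latitude": lambda v: v is not None and -90 <= v <= 90,
    "longitude": lambda v: v is not None and -180 <= v <= 180,
    "species_name": lambda v: bool(v),
}

def check_record(record):
    """Return (check_score, failed_fields) for one survey record."""
    failed = [f for f, rule in RULES.items() if not rule(record.get(f))]
    return 1.0 - len(failed) / len(RULES), failed

rec = {"latitude": 34.2, "longitude": 200.0,
       "species_name": "Paeonia lactiflora"}
print(check_record(rec))  # longitude out of range -> score 2/3
```

Automatic rules like these catch range and completeness errors cheaply, leaving only the judgment-dependent indicators for manual rechecking.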
Improving treatment plan evaluation with automation
Covington, Elizabeth L.; Chen, Xiaoping; Younge, Kelly C.; Lee, Choonik; Matuszak, Martha M.; Kessler, Marc L.; Keranen, Wayne; Acosta, Eduardo; Dougherty, Ashley M.; Filpansick, Stephanie E.
2016-01-01
The goal of this work is to evaluate the effectiveness of the Plan-Checker Tool (PCT), which was created to improve first-time plan quality, reduce patient delays, increase the efficiency of our electronic workflow, and standardize and automate the physics plan review in the treatment planning system (TPS). PCT uses an application programming interface to check and compare data from the TPS and the treatment management system (TMS). PCT includes a comprehensive checklist of automated and manual checks that are documented when performed by the user as part of a plan readiness check for treatment. Prior to and during PCT development, errors identified during the physics review and causes of patient treatment start delays were tracked to prioritize which checks should be automated. Nineteen of 33 checklist items were automated, with data extracted by PCT. There was a 60% reduction in the number of patient delays in the six months after PCT release. PCT was successfully implemented for use on all external beam treatment plans in our clinic. While the number of errors found during the physics check did not decrease, automation of checks increased the visibility of errors during the physics check, which led to decreased patient delays. The methods used here can be applied to any TMS and TPS that allows queries of the database. PACS number(s): 87.55.-x, 87.55.N-, 87.55.Qr, 87.55.tm, 89.20.Bb PMID:27929478
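A plan-readiness checker of the kind PCT describes can be sketched as a table of named comparisons between values queried from the two systems. The field names and checks below are illustrative assumptions, not Varian's API or the actual PCT checklist.

```python
# Hypothetical plan-readiness checker in the spirit of PCT: each check
# compares a value from the planning system (tps) against the treatment
# management system (tms). Fields and check names are illustrative.
def run_checks(tps, tms):
    checks = {
        "prescription dose matches": tps["dose_gy"] == tms["dose_gy"],
        "fraction count matches": tps["fractions"] == tms["fractions"],
        "plan approved": tps["status"] == "approved",
    }
    # report each check as PASS/FAIL so the result can be documented
    return {name: ("PASS" if ok else "FAIL") for name, ok in checks.items()}

tps = {"dose_gy": 60.0, "fractions": 30, "status": "approved"}
tms = {"dose_gy": 60.0, "fractions": 28}
print(run_checks(tps, tms))
```

Automating such comparisons makes discrepancies (here, a fraction-count mismatch) visible before the manual physics review, which is the mechanism the abstract credits for the reduction in patient delays.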
Sommer, A; Girnus, R; Wendt, B; Czwoydzinski, J; Wüstenbecker, C; Heindel, W; Lenzen, H
2009-05-01
German breast cancer screening is monitored by a large physical quality assurance program. This report refers to the first experiences of the Reference Center (RC) Muenster after three years of technical quality control of digital and analog mammography units (MUs). This paper also shows whether the presently used quality assurance (QA) method is able to ensure that the MUs in the screening program are functioning without any serious problems. RC Muenster supervises 95 units (May 2008). The daily, weekly and monthly quality assurance of these units is controlled by web-based QA software named "MammoControl", developed by RC Muenster. The annual QA for the units must be conducted in the form of an on-site inspection by medical physics experts of the RC and is scored by an objective ranking system. The results of these QA routines were evaluated and analyzed for this paper. During the period from 3/1/2006 to 5/31/2008, 8% of the analog systems and 1% of the digital systems exhibited problems in the daily QA. For 9% of the analog MUs and 17% of the digital MUs, failures appeared in the monthly QA. In the annual control, 86.7% of the analog units exhibited slight problems and 13.3% had serious problems. With respect to the digital units, 12% were without any defects, 58% had slight problems, 27% had serious failures and 3% had to be reported to the responsible authorities and were temporarily shut down. The special quality control requirements for German breast cancer screening, including annual on-site checks of the units, have shown in the last three years that QA with a high monitoring standard can be ensured for a large number of decentralized MUs. The currently used QA method sufficiently ensures that the screening program is technically safe. Further studies must show whether the density and focus of the QA measures must be reconfigured.
Checking-up of optical graduated rules by laser interferometry
NASA Astrophysics Data System (ADS)
Miron, Nicolae P.; Sporea, Dan G.
1996-05-01
The main aspects related to the operating principle, design, and implementation of high-productivity equipment for checking-up the graduation accuracy of optical graduated rules used as a length reference in optical measuring instruments for precision machine tools are presented. The graduation error checking-up is done with a Michelson interferometer as a length transducer. The instrument operation is managed by a computer, which controls the equipment, data acquisition, and processing. The evaluation is performed for rule lengths from 100 to 3000 mm, with a checking-up error less than 2 micrometers/m. The checking-up time is about 15 min for a 1000-mm rule, with averaging over four measurements.
Engle, Martha; Ferguson, Allison; Fields, Willa
2016-01-01
The purpose of this quality improvement project was to redesign a hospital meal delivery process in order to shorten the time between blood glucose monitoring and corresponding insulin administration and improve glycemic control. This process change redesigned the workflow of the dietary and nursing departments. Modifications included nursing, rather than dietary, delivering meal trays to patients receiving insulin. Dietary marked the appropriate meal trays and phoned each unit prior to arrival on the unit. The process change was trialed on 2 acute care units prior to implementation hospital wide. Elapsed time between blood glucose monitoring and insulin administration was analyzed before and after the process change, along with glucometrics: percentage of patients with blood glucose between 70 and 180 mg/dL (percent perfect), blood glucose greater than 300 mg/dL (extreme hyperglycemia), and blood glucose less than 70 mg/dL (hypoglycemia). Percent perfect glucose results improved from 45% to 53%, and extreme hyperglycemia (blood glucose >300 mg/dL) fell from 11.7% to 5%. Hypoglycemia showed a downward trend line, demonstrating that hypoglycemia rates did not increase as glycemic control improved. The percentage of patients receiving meal insulin within 30 minutes of the blood glucose check increased from 35% to 73%. In the hospital, numerous obstacles were present that interfered with on-time meal insulin delivery. Establishing a meal delivery process with the nurse performing the premeal blood glucose check, delivering the meal, and administering the insulin improves overall blood glucose control. Nurse-led process improvement of blood glucose monitoring, meal tray delivery, and insulin administration does lead to improved glycemic control for the inpatient population.
The Pan European Phenological Database PEP725: Data Content and Data Quality Control Procedures
NASA Astrophysics Data System (ADS)
Jurkovic, Anita; Hübner, Thomas; Koch, Elisabeth; Lipa, Wolfgang; Scheifinger, Helfried; Ungersböck, Markus; Zach-Hermann, Susanne
2014-05-01
Phenology - the study of the timing of recurring biological events in the animal and plant world - has become an important approach for climate change impact studies in recent years. It is therefore a "conditio sine qua non" to collect, archive, digitize, control and update phenological datasets. Thus, and with regard to cross-border cooperation and activities, it was necessary to establish, operate and promote a pan-European phenological database (PEP725). Such a database - designed and tested under COST Action 725 in 2004 and further developed and maintained in the framework of the EUMETNET program PEP725 - collects data from different European governmental and nongovernmental institutions and thus offers a unique compilation of plant phenological observations. The data follow the same classification scheme - the so-called BBCH coding system - which makes datasets comparable. Europe has a long tradition in the observation of phenological events: the history of collecting phenological data and their usage in climatology began in 1751. The first datasets in PEP725 date back to 1868; however, only a few observations are available before 1950. From 1951 onwards, the phenological networks all over Europe developed rapidly: currently, PEP725 provides about 9 million records from 23 European countries (covering approximately 50% of Europe). To supply the data in a good and uniform quality it is essential and worthwhile to establish and develop data quality control procedures. Consequently, one of the main tasks within PEP725 is the conception of a multi-stage quality control. Currently the tests are applied stepwise: completeness, plausibility, time-consistency, climatological and statistical checks. In a nutshell: the poster presents the status quo of the data content of the PEP725 database and the initial stages of the quality controls in use and planned, respectively.
For more details we refer to the PEP725 website (http://www.pep725.eu), and we invite additional institutions and regional services to join our program.
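The first stages of a stepwise quality control of this kind (completeness, plausibility, time consistency) might be sketched as follows. The record fields, the BBCH field, and the year bounds are illustrative assumptions taken from the abstract's description of the archive, not the actual PEP725 rules.

```python
# Sketch of a stepwise QC pipeline for phenological records:
# completeness -> plausibility -> time consistency.
def qc_record(rec):
    flags = []
    # completeness: all required fields must be present
    for field in ("species", "bbch", "year", "doy"):
        if rec.get(field) is None:
            flags.append(f"missing:{field}")
    # plausibility: day-of-year must fall in a valid calendar range
    if rec.get("doy") is not None and not (1 <= rec["doy"] <= 366):
        flags.append("implausible:doy")
    # time consistency: observation year within the archive period
    if rec.get("year") is not None and not (1868 <= rec["year"] <= 2014):
        flags.append("inconsistent:year")
    return flags

print(qc_record({"species": "Betula pendula", "bbch": 11,
                 "year": 2010, "doy": 400}))
```

The later stages the abstract names (climatological and statistical checks) would compare each record against station climatology and the distribution of past observations, which requires reference data beyond a single record.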
Geomorphological and ecological effects of check dams in mountain torrents of Southern Italy
NASA Astrophysics Data System (ADS)
Zema, Demetrio Antonio; Bombino, Giuseppe; Denisi, Pietro; Tamburino, Vincenzo; Marcello Zimbone, Santo
2017-04-01
It is known that installation of check dams noticeably influences torrent morphology and ecology. However, the effects of check dams on channel section and riparian vegetation of torrents are not yet completely understood. This paper provides a further contribution to a better comprehension of the actions played by check dams on hydrological and geomorphological processes in headwaters and their effects on riparian ecosystem. Field surveys on channel morphology, bed material and riparian vegetation were carried out close to five check dams in each of four mountain reaches of Calabria (Southern Italy). For each check dam three transects (one upstream, one downstream and one far from the check dam, located in the undisturbed zone and adopted as control) were identified; at each transect, a set of geomorphological and ecological indicators were surveyed as follows. Channel section morphology was assessed by the width/depth ratio (w/d); the median particle size (D50) and the finer sediment fraction (%fines) were chosen to characterize channel bed material; the specific discharge (q, the discharge per channel unit width) was assumed as measure of the flow regime. Vegetation cover and structure were evaluated by Global Canopy Cover (GCC) and Weighted Canopy Height (WCH) respectively (Bombino et al., 2008); the index of alpha-diversity (H-alpha, Hill, 1973) and the ratio between the number of alien species and the number of native species (NSA/NSN) were chosen as indicators of species richness/abundance and degree of vegetation integrity, respectively. Compared to the control transects, the values of w/d were higher upstream of check dams and lower downstream; conversely, q was lower upstream and higher in downstream sites. Upstream of the check dams D50 of bed material was lower and %fines was higher compared to the control transects; vice versa, the downstream transects showed higher D50 and lower %fines. 
The differences in the riparian vegetation among transects were found to be the torrent's ecological response to the strong contrasts surveyed in hydrological (q) and geomorphological (w/d, D50 and %fines) characteristics. Compared to control transects, vegetation was more extensive (higher GCC) and developed (higher WCH) in the upstream zones; the reverse pattern was noticed in the downstream transects (lower GCC and WCH). The indices H-alpha and NSA/NSN were higher upstream of check dams: the presence of the check dams induced higher species richness and evenness, with alien species prevailing over native ones in the sedimentation wedge. Conversely, downstream of check dams H-alpha and NSA/NSN were lower: here, riparian vegetation lost some herbaceous species and assumed a terrestrial character. Overall, this study confirms quantitatively that check dams have far-reaching effects on the geomorphology and ecology of mountain torrent channels; as a consequence, important and complex changes occur not only in the extent and development of riparian vegetation, but also in species diversity and distribution. REFERENCES - Bombino G., Gurnell A.M., Tamburino V., Zema D.A., Zimbone S.M. 2008. Sediment size variation in torrents with check-dams: effects on riparian vegetation. Ecological Engineering 32(2), 166-177. - Hill MO. 1973. Diversity and evenness: a unifying notation and its consequences. Ecology 54: 427-431.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schaly, B; Hoover, D; Mitchell, S
2014-08-15
During volumetric modulated arc therapy (VMAT) of head and neck cancer, some patients lose weight, which may result in anatomical deviations from the initial plan. If these deviations are substantial, a new treatment plan can be designed for the remainder of treatment (i.e., adaptive planning). Since the adaptive treatment process is resource intensive, one possible approach to streamlining the quality assurance (QA) process is to use the electronic portal imaging device (EPID) to measure the integrated fluence for the adapted plans instead of the currently-used ArcCHECK device (Sun Nuclear). Although ArcCHECK is recognized as the clinical standard for patient-specific VMAT plan QA, it has limited length (20 cm) for most head and neck field apertures and has coarser detector spacing than the EPID (10 mm vs. 0.39 mm). In this work we compared measurement of the integrated fluence using the EPID with corresponding measurements from the ArcCHECK device. In the past year nine patients required an adapted plan. Each of the plans (the original and adapted) is composed of two arcs. Routine clinical QA was performed using the ArcCHECK device, and the same plans were delivered to the EPID (individual arcs) in integrated mode. The dose difference between the initial plan and adapted plan was compared for ArcCHECK and EPID. In most cases, it was found that the EPID is more sensitive in detecting plan differences. Therefore, we conclude that EPID provides a viable alternative for QA of the adapted head and neck plans and should be further explored.
TU-FG-201-05: Varian MPC as a Statistical Process Control Tool
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carver, A; Rowbottom, C
Purpose: Quality assurance in radiotherapy requires the measurement of various machine parameters to ensure they remain within permitted values over time. In Truebeam release 2.0 the Machine Performance Check (MPC) was released, allowing beam output and machine axis movements to be assessed in a single test. We aim to evaluate the Varian MPC as a tool for Statistical Process Control (SPC). Methods: Varian's MPC tool was used on three Truebeam linacs and one EDGE linac for a period of approximately one year. MPC was commissioned against independent systems. After this period the data were reviewed to determine whether or not the MPC was useful as a process control tool. Individual tests were analysed using Shewhart control charts in Matlab. Principal component analysis was used to determine if a multivariate model was of any benefit in analysing the data. Results: Control charts were found to be useful to detect beam output changes, worn T-nuts and jaw calibration issues. Upper and lower control limits were defined at the 95% level. Multivariate SPC was performed using Principal Component Analysis. We found little evidence of clustering beyond that which might be naively expected, such as beam uniformity and beam output. Whilst this makes multivariate analysis of little use, it suggests that each test is giving independent information. Conclusion: The variety of independent parameters tested in MPC makes it a sensitive tool for routine machine QA. We have determined that using control charts in our QA programme would rapidly detect changes in machine performance. The use of control charts allows large quantities of tests to be performed on all linacs without visual inspection of all results. The use of control limits alerts users when data are inconsistent with previous measurements before they become out of specification. A. Carver has received a speaker's honorarium from Varian.
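A minimal Shewhart-style individuals chart with control limits at the 95% level, as the abstract describes, could look like this. The baseline readings below are made-up beam-output values for illustration, not MPC data.

```python
import statistics

# Control limits from a baseline period: mean +/- 1.96 sigma (~95% level).
def control_limits(baseline, z=1.96):
    mean = statistics.mean(baseline)
    sigma = statistics.stdev(baseline)
    return mean - z * sigma, mean + z * sigma

# Flag any new reading that falls outside the control limits.
def out_of_control(baseline, new_points, z=1.96):
    lo, hi = control_limits(baseline, z)
    return [x for x in new_points if not (lo <= x <= hi)]

baseline = [100.1, 99.8, 100.0, 100.2, 99.9, 100.0, 100.1, 99.9]
print(out_of_control(baseline, [100.0, 100.6, 99.9]))
```

The point of such limits, as the abstract notes, is that a reading can be flagged as inconsistent with the machine's own history well before it drifts outside the clinical tolerance.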
Improving NAVFAC's total quality management of construction drawings with CLIPS
NASA Technical Reports Server (NTRS)
Antelman, Albert
1991-01-01
A diagnostic expert system to improve the quality of Naval Facilities Engineering Command (NAVFAC) construction drawings and specification is described. C Language Integrated Production System (CLIPS) and computer aided design layering standards are used in an expert system to check and coordinate construction drawings and specifications to eliminate errors and omissions.
1990-09-01
change barriers, and necessary checks and balances built into processes. Furthermore, this assessment should address management system variables which...organisation's immediate product and their worklife. Focus must be maintained on improving RAAF processes. In addition to a quality committee structure as
DOT National Transportation Integrated Search
1985-12-01
This report documents the review of the MATerials and Test (MATT) Data System to check the validity of data within the system. A computer program to generate the quality level of a construction material was developed. Programs were also developed to ...
SU-F-T-272: Patient Specific Quality Assurance of Prostate VMAT Plans with Portal Dosimetry
DOE Office of Scientific and Technical Information (OSTI.GOV)
Darko, J; Osei, E; University of Waterloo, Waterloo, ON
Purpose: To evaluate the effectiveness of using the Portal Dosimetry (PD) method for patient-specific quality assurance of prostate VMAT plans. Methods: As per institutional protocol, all VMAT plans were measured using the Varian Portal Dosimetry (PD) method. A gamma evaluation criterion of 3%-3mm with a minimum area gamma pass rate (gamma <1) of 95% is used clinically for all plans. We retrospectively evaluated the portal dosimetry results for 170 prostate patients treated with the VMAT technique. Three sets of criteria were adopted for re-evaluating the measurements: 3%-3mm, 2%-2mm and 1%-1mm. For all criteria, two areas, FIELD+1cm and MLC-CIAO, were analysed. To ascertain the effectiveness of the portal dosimetry technique in determining the delivery accuracy of prostate VMAT plans, 10 patients previously measured with portal dosimetry were randomly selected and their measurements repeated using the ArcCHECK method. The same criteria used in the analysis of PD were used for the ArcCHECK measurements. Results: All patient plans reviewed met the institutional criteria for area gamma pass rate. Overall, the gamma pass rate (gamma <1) decreases for the 3%-3mm, 2%-2mm and 1%-1mm criteria. For each criterion the pass rate was significantly reduced when the MLC-CIAO was used instead of FIELD+1cm. There was a noticeable change in sensitivity for MLC-CIAO with the 2%-2mm criterion and a much more significant reduction at 1%-1mm. Comparable results were obtained for the ArcCHECK measurements. Although differences were observed between the clockwise versus the counter-clockwise plans in both the PD and ArcCHECK measurements, these were not deemed to be statistically significant. Conclusion: This work demonstrates that the Portal Dosimetry technique can be effectively used for quality assurance of VMAT plans. Results obtained show similar sensitivity compared to ArcCHECK.
To reveal certain delivery inaccuracies, a combination of criteria may provide an effective way of improving the overall sensitivity of PD. Funding provided in part by the Prostate Ride for Dad, Kitchener-Waterloo, Canada.
Using the Benford's Law as a First Step to Assess the Quality of the Cancer Registry Data.
Crocetti, Emanuele; Randi, Giorgia
2016-01-01
Benford's law states that the distribution of the first digit different from 0 [first significant digit (FSD)] in many collections of numbers is not uniform. The aim of this study is to evaluate whether population-based cancer incidence rates follow Benford's law, and whether this can be used in their data quality check process. We sampled 43 population-based cancer registry populations (CRPs) from Cancer Incidence in Five Continents, Volume X (CI5-X). The distribution of cancer incidence rate FSDs was evaluated overall, by sex, and by CRP. Several statistics, including Pearson's coefficient of correlation and distance measures, were applied to check adherence to Benford's law. In the whole dataset (146,590 incidence rates) and for each sex (70,722 male and 75,868 female incidence rates), the FSD distributions were Benford-like. The coefficient of correlation between the observed and expected FSD distributions was extremely high (0.999), and the distance measures low. Considering single CRPs (from 933 to 7,222 incidence rates), the results were in agreement with Benford's law, and only a few CRPs showed possible discrepancies from it. This study demonstrates for the first time that cancer incidence rates follow Benford's law. This characteristic can be used as a new, simple, and objective tool in data quality evaluation. The analyzed data had already been checked for publication in CI5-X; therefore, their quality was expected to be good. In fact, only a few CRPs showed statistics consistent with possible violations.
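The FSD test used in the study is easy to reproduce in outline: tabulate observed first-significant-digit frequencies and compare them with Benford's log10(1 + 1/d) expectation via Pearson correlation. The sample data below are synthetic (a geometric sequence, which is approximately Benford-distributed), not cancer incidence rates.

```python
import math
from collections import Counter

def first_significant_digit(x):
    # scientific notation puts the first significant digit up front
    return int(f"{abs(x):.10e}"[0])

def benford_correlation(values):
    """Pearson correlation between observed FSD frequencies and the
    Benford expectation P(d) = log10(1 + 1/d), d = 1..9."""
    counts = Counter(first_significant_digit(v) for v in values if v != 0)
    n = sum(counts.values())
    observed = [counts.get(d, 0) / n for d in range(1, 10)]
    expected = [math.log10(1 + 1 / d) for d in range(1, 10)]
    mo, me = sum(observed) / 9, sum(expected) / 9
    num = sum((o - mo) * (e - me) for o, e in zip(observed, expected))
    den = math.sqrt(sum((o - mo) ** 2 for o in observed)
                    * sum((e - me) ** 2 for e in expected))
    return num / den

# a roughly Benford-like synthetic sample spread over many decades
sample = [1.7 ** k for k in range(1, 200)]
print(round(benford_correlation(sample), 3))
```

A correlation near 1 suggests the collection is Benford-like; a registry whose rates gave a markedly lower value would be a candidate for closer data quality review, which is the screening use the study proposes.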
A mask quality control tool for the OSIRIS multi-object spectrograph
NASA Astrophysics Data System (ADS)
López-Ruiz, J. C.; Vaz Cedillo, Jacinto Javier; Ederoclite, Alessandro; Bongiovanni, Ángel; González Escalera, Víctor
2012-09-01
OSIRIS multi object spectrograph uses a set of user-customised masks, which are manufactured on demand. The manufacturing process consists of drilling the specified slits in the mask with the required accuracy. Ensuring that the slits are in the right place when observing is of vital importance. We present a tool for checking the quality of the mask manufacturing process, based on analyzing instrument images obtained with the manufactured masks in place. The tool extracts the slit information from these images, relates the specifications to the extracted slit information, and finally reports to the operator whether the manufactured mask fulfills the expectations of the mask designer. The tool has been built using scripting languages and standard libraries such as opencv, pyraf and scipy. The software architecture, advantages and limits of this tool in the lifecycle of a multi-object acquisition are presented.
Sealed Organic Check Material on Curiosity
2012-09-10
NASA's Mars rover Curiosity carries five cylindrical blocks of organic check material for use in a control experiment if the rover's Sample Analysis at Mars (SAM) laboratory detects any organic compounds in samples of Martian soil or powdered rock.
NASA Technical Reports Server (NTRS)
Hooker, Stanford B. (Editor); Firestone, Elaine R. (Editor); McClain, Charles R.; Darzi, Michael; Barnes, Robert A.; Eplee, Robert E.; Firestone, James K.; Patt, Frederick S.; Robinson, Wayne D.; Schieber, Brian D.;
1996-01-01
This document provides five brief reports that address several quality control procedures under the auspices of the Calibration and Validation Element (CVE) within the Sea-viewing Wide Field-of-view Sensor (SeaWiFS) Project. Chapter 1 describes analyses of the 32 sensor engineering telemetry streams. Anomalies in any of the values may impact sensor performance in direct or indirect ways. The analyses are primarily examinations of parameter time series combined with statistical methods such as auto- and cross-correlation functions. Chapter 2 describes how the various onboard (solar and lunar) and vicarious (in situ) calibration data will be analyzed to quantify sensor degradation, if present. The analyses also include methods for detecting the influence of charged particles on sensor performance, such as might be expected in the South Atlantic Anomaly (SAA). Chapter 3 discusses the quality control of the ancillary environmental data that are routinely received from other agencies or projects and used in the atmospheric correction algorithm (total ozone, surface wind velocity, and surface pressure; surface relative humidity is also obtained, but is not used in the initial operational algorithm). Chapter 4 explains the procedures for screening level-1, level-2, and level-3 products. These quality control operations incorporate both automated and interactive procedures which check for file format errors (all levels), navigation offsets (level-1), mask and flag performance (level-2), and product anomalies (all levels). Finally, Chapter 5 discusses the match-up data set development for comparing SeaWiFS level-2 derived products with in situ observations, as well as the subsequent outlier analyses that will be used for evaluating error sources.
Zema, Demetrio Antonio; Bombino, Giuseppe; Denisi, Pietro; Lucas-Borja, Manuel Esteban; Zimbone, Santo Marcello
2018-06-12
In mountain streams, the installation of check dams can have negative impacts on soil, water and riparian vegetation. In spite of the ample literature on the qualitative effects of engineering works on channel hydrology, morphology, sedimentation and riparian vegetation characteristics, quantitative evaluations of the changes induced by check dams on headwater characteristics are rare. To fill this gap, this study evaluated the effects of check dams located in headwaters of Calabria (Southern Italy) on hydrological and geomorphological processes and on the response of riparian vegetation to these actions. The analysis compared physical and vegetation indicators in transects identified around check dams (upstream and downstream) and far from their direct influence (control transects). Check dams were found to significantly influence unit discharge and surface and subsurface sediments (both upstream and downstream), channel shape and the transverse distribution of riparian vegetation (upstream), and the cover and structure of riparian complexes (downstream). The effects of the structures on torrent longitudinal slope and on the biodiversity of vegetation were less significant. Differences in bed profile slope were significant only between upstream and downstream transects. The results of an Agglomerative Hierarchical Cluster analysis confirmed the substantial similarity between upstream and control transects, highlighting that the construction of check dams, needed to mitigate hydro-geological risks, has not strongly altered the torrent functioning and ecology relative to the pre-construction state.
Moreover, simple and quantitative linkages between torrent hydraulics, geomorphology and vegetation characteristics exist in the analysed headwaters; these relationships among physical adjustments of channels and most of the resulting characteristics of the riparian vegetation are specific for the transect locations with respect of check dams. Conversely, the biodiversity of the riparian vegetation basically eludes any quantitative relations with the physical and other vegetal characteristics of the torrent transects. Copyright © 2018 Elsevier B.V. All rights reserved.
2011-01-01
Background: This article aims to update the existing systematic review evidence elicited by Mickenautsch et al. up to 18 January 2008 (published in the European Journal of Paediatric Dentistry in 2009) and addressing the review question of whether, in the same dentition and same cavity class, glass-ionomer cement (GIC) restored cavities show fewer recurrent carious lesions on cavity margins than cavities restored with amalgam. Methods: The systematic literature search was extended beyond the original search date, and a further hand-search and reference check were done. The quality of accepted trials was assessed using updated quality criteria, and the risk of bias was investigated in more depth than previously reported. In addition, the focus of quantitative synthesis was shifted to single datasets extracted from the accepted trials. Results: The database search (up to 10 August 2010) identified 1 new trial, in addition to the 9 included in the original systematic review, and 11 further trials were included after a hand-search and reference check. Of these 21 trials, 11 were excluded and 10 were accepted for data extraction and quality assessment. Thirteen dichotomous datasets of primary outcomes and 4 datasets with secondary outcomes were extracted. Meta-analysis and cumulative meta-analysis were used in combining clinically homogeneous datasets. The overall results of the computed datasets suggest that GIC has a higher caries-preventive effect than amalgam for restorations in permanent teeth. No difference was found for restorations in the primary dentition. Conclusion: This outcome is in agreement with the conclusions of the original systematic review. Although the findings of the trials identified in this update may be considered less affected by attrition and publication bias, their risk of selection and detection/performance bias is high. Thus, verification of the currently available results requires further high-quality randomised controlled trials.
PMID:21396097
Navigation Algorithms for the SeaWiFS Mission
NASA Technical Reports Server (NTRS)
Hooker, Stanford B. (Editor); Firestone, Elaine R. (Editor); Patt, Frederick S.; McClain, Charles R. (Technical Monitor)
2002-01-01
The navigation algorithms for the Sea-viewing Wide Field-of-view Sensor (SeaWiFS) were designed to meet the requirement of 1-pixel accuracy, a standard deviation (sigma) of 2. The objective has been to extract the best possible accuracy from the spacecraft telemetry and avoid the need for costly manual renavigation or geometric rectification. The requirement is addressed by postprocessing of both the Global Positioning System (GPS) receiver and Attitude Control System (ACS) data in the spacecraft telemetry stream. The navigation algorithms described are separated into four areas: orbit processing, attitude sensor processing, attitude determination, and final navigation processing. There has been substantial modification during the mission of the attitude determination and attitude sensor processing algorithms. For the former, the basic approach was completely changed during the first year of the mission, from a single-frame deterministic method to a Kalman smoother. This was done for several reasons: a) to improve the overall accuracy of the attitude determination, particularly near the sub-solar point; b) to reduce discontinuities; c) to support the single-ACS-string spacecraft operation that was started after the first mission year, which causes gaps in attitude sensor coverage; and d) to handle data quality problems (which became evident after launch) in the direct-broadcast data. The changes to the attitude sensor processing algorithms primarily involved the development of a model for the Earth horizon height, also needed for single-string operation; the incorporation of improved sensor calibration data; and improved data quality checking and smoothing to handle the data quality issues. The attitude sensor alignments have also been revised multiple times, generally in conjunction with the other changes. The orbit and final navigation processing algorithms have remained largely unchanged during the mission, aside from refinements to data quality checking.
Although further improvements are certainly possible, future evolution of the algorithms is expected to be limited to refinements of the methods presented here, and no substantial changes are anticipated.
Reliability of electromagnetic induction data in near surface application
NASA Astrophysics Data System (ADS)
Nüsch, A.; Werban, U.; Sauer, U.; Dietrich, P.
2012-12-01
Use of the Electromagnetic Induction (EMI) method for measuring electrical conductivities is widespread in applied geosciences, since the method is easy to perform and sensitive to soil parameters. The vast range of applications of EMI measurements, across different spatial resolutions and for the derivation of different soil parameters, necessitates a unified handling of EMI data. The requirements on the method have thus shifted from qualitative overview to quantitative use of the data. Quantitative treatment of the data, however, is limited by the available instruments, which were designed only for qualitative use. Nevertheless, these limitations can be reduced by observing a few conditions. In this study, we introduce possibilities for enhancing the quality of EMI data with regard to large-scale investigations. In a set of systematic investigations, we show which aspects have to be taken into account when using a commercially available instrument, related to long-term stability, comparability and repeatability. In-depth knowledge of the instruments used, concerning aspects such as their calibration procedure, long-term stability, battery life and thermal behaviour, is an essential prerequisite before starting the measurement process. A further aspect highlighted is quality control during measurements and, if necessary, subsequent data correction, which is a prerequisite for quantitative analysis of the data. Quality control during the measurement process is crucial. Before a measurement starts, it is recommended that a short test is carried out on-site to check environmental noise. The signal-to-noise ratio is a decisive factor in whether or not the method is applicable at the chosen field site. Measurements need to be monitored for possible drifts; this can be done with different accuracies, ranging from a simple quality check, through the use of reference lines, up to quantitative control with reference points.
Furthermore, global reference lines are necessary if measurements take place at the landscape scale. In some cases it is possible to eliminate drifts that occur by applying a data correction based on binding lines. The suggested procedure can raise the explanatory power of the data considerably, and artefacts caused by drifts or inadequate handling are minimized. This work was supported by iSOIL (Interactions between soil related sciences - Linking geophysics, soil science and digital soil mapping), a Collaborative Project (Grant Agreement number 211386) co-funded by the Research DG of the European Commission within the RTD activities of the FP7 Thematic Priority Environment; iSOIL is one member of the SOIL TECHNOLOGY CLUSTER of research projects funded by the EC.
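The drift monitoring and correction described in this abstract can be sketched in a few lines. The sketch below is an illustrative assumption, not the authors' code: it supposes repeated visits to a single reference point of known conductivity, fits a linear drift to the reference residuals, and subtracts that drift from the survey readings.

```python
# Hypothetical sketch of a linear drift correction for EMI conductivity
# readings, based on repeated measurements at one reference point.
# All names and values are illustrative, not from the study.

def drift_correct(readings, ref_visits, ref_value):
    """readings: list of (time, value) survey measurements.
    ref_visits: list of (time, value) measured at a fixed reference point
    whose true conductivity is ref_value.
    Fits a linear drift to the reference residuals and removes it."""
    n = len(ref_visits)
    ts = [t for t, _ in ref_visits]
    rs = [v - ref_value for _, v in ref_visits]  # apparent drift at each visit
    t_mean = sum(ts) / n
    r_mean = sum(rs) / n
    slope = sum((t - t_mean) * (r - r_mean) for t, r in zip(ts, rs)) / \
            sum((t - t_mean) ** 2 for t in ts)
    intercept = r_mean - slope * t_mean
    # subtract the fitted drift from every survey reading
    return [(t, v - (intercept + slope * t)) for t, v in readings]
```

With a purely linear drift, the corrected readings recover the undrifted values exactly; in practice thermal drifts are only approximately linear, which is why the abstract recommends reference points for a fully quantitative control.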
Phyllis C. Adams; Glenn A. Christensen
2012-01-01
A rigorous quality assurance (QA) process assures that the data and information provided by the Forest Inventory and Analysis (FIA) program meet the highest possible standards of precision, completeness, representativeness, comparability, and accuracy. FIA relies on its analysts to check the final data quality prior to release of a State's data to the national FIA...
Implementation of Quality Assurance and Quality Control Measures in the National Phenology Database
NASA Astrophysics Data System (ADS)
Gerst, K.; Rosemartin, A.; Denny, E. G.; Marsh, L.; Barnett, L.
2015-12-01
The USA National Phenology Network (USA-NPN; www.usanpn.org) serves science and society by promoting a broad understanding of plant and animal phenology and the relationships among phenological patterns and environmental change. The National Phenology Database has over 5.5 million observation records for plants and animals for the period 1954-2015. These data have been used in a number of science, conservation and resource management applications, including national assessments of historical and potential future trends in phenology, regional assessments of spatio-temporal variation in organismal activity, and local monitoring for invasive species detection. Customizable data downloads are freely available, and data are accompanied by FGDC-compliant metadata, data-use and data-attribution policies, and vetted documented methodologies and protocols. The USA-NPN has implemented a number of measures to ensure both quality assurance and quality control. Here we describe the resources that have been developed so that incoming data submitted by both citizen and professional scientists are reliable; these include training materials, such as a botanical primer and species profiles. We also describe a number of automated quality control processes applied to incoming data streams to optimize data output quality. Existing and planned quality control measures for output of raw and derived data include: (1) Validation of site locations, including latitude, longitude, and elevation; (2) Flagging of records that conflict for a given date for an individual plant; (3) Flagging where species occur outside known ranges; (4) Flagging of records when phenophases occur outside of the plausible order for a species; (5) Flagging of records when intensity measures do not follow a plausible progression for a phenophase; (6) Flagging of records when a phenophase occurs outside of the plausible season, and (7) Quantification of precision and uncertainty for estimation of phenological metrics. 
Finally, we will describe preliminary work to develop methods for outlier detection that will inform plausibility checks. Ultimately we aim to maximize data quality of USA-NPN data and data products to ensure that this database can continue to be reliably applied for science and decision-making for multiple scales and applications.
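Two of the automated quality control measures enumerated above lend themselves to a short illustration. The sketch below is an assumption for illustration only (not USA-NPN code): it implements check (1), site-location validation, and check (4), flagging phenophases that occur outside a plausible order, using an invented phenophase sequence.

```python
# Illustrative sketch of two automated QC checks of the kind the abstract
# lists; the phenophase order below is a hypothetical example.

def validate_site(lat, lon):
    """Check 1: flag records whose coordinates are outside valid ranges."""
    return -90.0 <= lat <= 90.0 and -180.0 <= lon <= 180.0

# Assumed plausible phenophase sequence for some species (illustrative).
PHENOPHASE_ORDER = ["breaking leaf buds", "leaves", "flowers", "fruits"]

def flag_out_of_order(observed):
    """Check 4: return phenophases reported out of the plausible sequence.
    `observed` is a list of (day_of_year, phenophase) records."""
    rank = {p: i for i, p in enumerate(PHENOPHASE_ORDER)}
    flagged = []
    last_rank = -1
    for day, phase in sorted(observed):
        r = rank[phase]
        if r < last_rank:          # phase appears after a later-stage phase
            flagged.append((day, phase))
        else:
            last_rank = r
    return flagged
```

In a production pipeline such checks would attach flags to records rather than reject them, so that downstream users can decide how to treat questionable observations.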
Young, Stacie T.M.; Jamison, Marcael T.J.
2007-01-01
Storm runoff water-quality samples were collected as part of the State of Hawaii Department of Transportation Stormwater Monitoring Program. This program is designed to assess the effects of highway runoff and urban runoff on Halawa Stream. For this program, rainfall data were collected at two stations, continuous streamflow data at three stations, and water-quality data at five stations, which include the two continuous streamflow stations. This report summarizes rainfall, streamflow, and water-quality data collected between July 1, 2006 and June 30, 2007. A total of 13 samples was collected over two storms during July 1, 2006 to June 30, 2007. The goal was to collect grab samples nearly simultaneously at all five stations and flow-weighted time-composite samples at the three stations equipped with automatic samplers. Samples were analyzed for total suspended solids, total dissolved solids, nutrients, chemical oxygen demand, and selected trace metals (cadmium, chromium, copper, lead, nickel, and zinc). Additionally, grab samples were analyzed for oil and grease, total petroleum hydrocarbons, fecal coliform, and biological oxygen demand. Quality-assurance/quality-control samples were also collected during storms and during routine maintenance to verify analytical procedures and check the effectiveness of equipment-cleaning procedures.
NASA Astrophysics Data System (ADS)
Kawka, O. E.; Nelson, J. S.; Manalang, D.; Kelley, D. S.
2016-02-01
The Cabled Array component of the NSF-funded Ocean Observatories Initiative (OOI) provides access to real-time physical, chemical, geological, and biological data from water column and seafloor platforms/instruments at sites spanning the southern half of the Juan de Fuca Plate. The Quality Assurance (QA) program for OOI data is designed to ensure that data products meet OOI science requirements. This overall data QA plan establishes the guidelines for assuring OOI data quality and summarizes Quality Control (QC) protocols and procedures, based on best practices, which can be utilized to ensure the highest quality data across the OOI program. This presentation will highlight, specifically, the QA/QC approach being utilized for the OOI Cabled Array infrastructure and data and will include a summary of both shipboard and shore-based protocols currently in use. Aspects addressed will be pre-deployment instrument testing and calibration checks, post-deployment and pre-recovery field verification of data, and post-recovery "as-found" testing of instruments. Examples of QA/QC data will be presented and specific cases of cabled data will be discussed in the context of quality assessments and adjustment/correction of OOI datasets overall for inherent sensor drift and/or instrument fouling.
Drug supply in in-patient nursing care facilities: reasons for irregularities in quality reviews
Meinck, Matthias; Ernst, Friedemann; Pippel, Kristina; Gehrke, Jörg; Coners, Elise
2017-01-01
Background: Quality checks by the independent German Health Insurance Medical Service in in-patient nursing care facilities, pursuant to Articles 114 et seqq. SGB XI [11th Book of the Social Code], also cover the Pflegerische Medikamentenversorgung (PMV) [drug supply by nursing personnel]. Irregularities are described in the quality reports in the reviewers' own words. This investigation was intended to categorise the reasons for these irregularities. Methods: The basis for the examination is the reports of the quality checks of all in-patient nursing care facilities conducted in 2014 (regular quality checks) in Hamburg and Schleswig-Holstein (N = 671), in which the PMV was examined for 5,742 randomly selected residents. Results: With regard to documentation, inexplicable drug intakes (5.8 %) were the irregularities recorded most frequently for the residents, followed by missing information on dosages and on application provisions (0.8 % each). In the documentation of on-demand medication, insufficient indication data (3.2 %), missing daily maximum doses (0.8 %) and missing single doses (0.6 %) were most commonly ascertained. The most frequent irregularities in the handling of medication for the residents were incorrect setting out of medication (6.0 %), missing or incorrect data on consumption and on when the medical packaging was opened (3.5 %), and medication not administered directly from the blister (0.7 %). Among the sub-categories of incorrect setting out, incorrect dosages were revealed most often, followed by drugs past their expiry date and by out-of-stock drugs. Systematic patient-related factors influencing the PMV could not be determined. Conclusions: The extent and type of the irregularities call for further efforts to improve the quality of nursing care facilities. The results can be used as a basis for designing specific initiatives to improve the PMV.
NASA Astrophysics Data System (ADS)
Kataoka, Haruno; Utsumi, Akira; Hirose, Yuki; Yoshiura, Hiroshi
Disclosure control of natural language information (DCNL), which we are trying to realize, is described. DCNL will be used to secure human communications over the internet, such as through blogs and social network services. Before sentences in these communications are disclosed, they are checked by DCNL, and any phrases that could reveal sensitive information are transformed or omitted so that they are no longer revealing. DCNL checks not only phrases that directly represent sensitive information but also those that indirectly suggest it; combinations of phrases are also checked. DCNL automatically learns the knowledge of sensitive phrases and the suggestive relations between phrases by using co-occurrence analysis and Web retrieval. The users' burden is therefore minimized, i.e., they do not need to define many disclosure control rules. DCNL complements traditional access control in fields where reliability needs to be balanced with enjoyment and where object classes for access control cannot be predefined.
Full-Authority Fault-Tolerant Electronic Engine Control System for Variable Cycle Engines.
1982-04-01
single internally self-checked VLSI microprocessor. The selected configuration is an externally checked pair of commercially available... EEC Electronic Engine Control; FPMH Failures per Million Hours; FTMP Fault Tolerant Multi-Processor; FTSC Fault Tolerant Spaceborne Computer; GRAMP Generalized... Removal; MTBR Mean Time Between Repair; MTTF Mean Time to Failure; NH High Pressure Rotor Speed; O&S Operating
NASA Technical Reports Server (NTRS)
Gamble, Ed; Holzmann, Gerard
2011-01-01
Part of the US DOT investigation of Toyota SUA involved analysis of the throttle control software. JPL LaRS applied several techniques, including static analysis and logic model checking, to the software. A handful of logic models were built. Some weaknesses were identified; however, no cause for SUA was found. The full NASA report includes numerous other analyses
Elderly quality of life impacted by traditional chinese medicine techniques
Figueira, Helena A; Figueira, Olivia A; Figueira, Alan A; Figueira, Joana A; Giani, Tania S; Dantas, Estélio HM
2010-01-01
Background: The shift in age structure is having a profound impact, suggesting that the aged should be consulted as reporters on the quality of their own lives. Objectives: The aim of this research was to establish the possible impact of traditional Chinese medicine (TCM) techniques on the quality of life (QOL) of the elderly. Sample: Two non-selected volunteer groups of inhabitants of the Rio de Janeiro municipality: a control group (36 individuals) not using TCM, and an experimental group (28 individuals) using TCM at the ABACO/Sohaku-in Institute, Brazil. Methods: A questionnaire on elderly QOL devised by the World Health Organization, the WHOQOL-Old, was adopted, and descriptive statistical techniques were used: mean and standard deviation. The Shapiro–Wilk test checked the normality of the distributions. Based on that normality, for the intergroup comparison the Student t test was applied to facets 2, 4, 5, 6 and the total score, and the Mann–Whitney U rank test to facets 1 and 3, both tests analyzing the P value between the experimental and control groups. The significance level adopted was 95% (P < 0.05). Results: The experimental group reported the highest QOL for every facet and the total score. Conclusions: The results suggest that TCM raises the level of QOL. PMID:21103400
3D printing X-Ray Quality Control Phantoms. A Low Contrast Paradigm
NASA Astrophysics Data System (ADS)
Kapetanakis, I.; Fountos, G.; Michail, C.; Valais, I.; Kalyvas, N.
2017-11-01
Current 3D printing technology products may be usable in various biomedical applications. One such application is the creation of X-ray quality control phantoms. In this work a self-assembled 3D printer (Geeetech i3) was used to produce a simple low-contrast phantom. The printing material was polylactic acid (PLA) at 100% printing density. The low-contrast scheme was achieved by creating air holes of different diameters and thicknesses, ranging from 1 mm to 9 mm. The phantom was irradiated on a Philips Diagnost 93 fluoroscopic installation at 40-70 kV in semi-automatic mode. The images were recorded with an Agfa CR 30-X CR system and assessed with ImageJ software. The best contrast value observed was approximately 33%. In the low-contrast detectability check, the 1 mm diameter hole was always visible for thicknesses greater than or equal to 4 mm. A reason for not being able to distinguish the 1 mm holes at smaller thicknesses might be the presence of printing patterns on the final image, which increased the structure noise. In conclusion, the construction of a contrast resolution phantom with a 3D printer is feasible; the quality of the final product depends on the printer accuracy and the material characteristics.
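The reported contrast value can be reproduced with the usual Weber-type definition for such detectability phantoms. The computation below is a sketch of that standard formula; the pixel intensities are invented for illustration and are not taken from the study.

```python
# Weber-type contrast between a low-contrast detail (air hole) and its
# background, as commonly computed for detectability phantoms.
# The example intensities are illustrative assumptions.

def contrast(i_background, i_detail):
    """Return |I_bg - I_detail| / I_bg for mean ROI intensities."""
    return abs(i_background - i_detail) / i_background

# A hole reading about 67% of the background intensity gives about 33%
# contrast, close to the best value the authors report.
```

In practice the two intensities would be mean pixel values measured over regions of interest in the recorded image, e.g. with ImageJ as in the study.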
Designing and evaluating a persuasive child restraint television commercial.
Lewis, Ioni; Ho, Bonnie; Lennon, Alexia
2016-01-01
Relatively high rates of inappropriate use and misuse of child restraints, and faults in their installation, suggest a crucial need for public education messages to raise parental awareness of the need to use restraints correctly. This project involved devising and pilot testing message concepts, filming a television advertisement (the TVC), and evaluating the TVC. This article focuses specifically upon the evaluation of the TVC. The development and evaluation of the TVC were guided by an extended theory of planned behavior that included the standard constructs of attitudes, subjective norms, and perceived behavioral control as well as the additional constructs of group norms and descriptive norms. The study also explored the extent to which parents with low and high intentions to self-check restraints differed on salient beliefs regarding the behavior. An online survey of parents (N = 384) was conducted in which parents were randomly assigned either to the intervention group (n = 161), who viewed the advertisement within the survey, or to the control group (n = 223), who did not. Following a one-off exposure to the TVC, the results indicated that, although the difference was not significant, parents in the intervention group reported stronger intentions (M = 4.43, SD = 0.74) to self-check restraints than parents in the control group (M = 4.18, SD = 0.86). In addition, parents in the intervention group (M = 4.59, SD = 0.47) reported significantly higher levels of perceived behavioral control than parents in the control group (M = 4.40, SD = 0.73). The regression results revealed that, for parents in the intervention group, attitudes and group norms were significant predictors of parental intentions to self-check their child restraint.
Finally, the exploratory analyses of parental beliefs suggested that parents with low intentions to self-check child restraints were significantly more likely than high intenders to agree that they did not have enough time to check restraints, or that having a child in a restraint is more important than checking its installation. Overall, the findings provide some support for the persuasiveness of the child restraint TVC and provide insight into the factors influencing reported parental intentions, as well as the salient beliefs underpinning self-checking of restraints. Interventions that attempt to increase parental perceptions of the importance of self-checking restraints regularly, and of the brevity of the time involved in doing so, may be effective.
1982-07-01
was scheduled for an end-of-phase assessment (equivalent to the stage check for the control group on the sixth flight). If performance was to NATOPS... proficiency was demonstrated. The same procedure was used for B stage flight except that the phase check (fourth flight) was equivalent to the control... experimental group did not differ from the control group on tasks requiring visual cues as a primary reference for successful completion (e.g
2004-01-24
Engineers and technicians in the control room at the Dryden Flight Research Center must constantly monitor critical operations and checks during research projects like NASA's hypersonic X-43A. Visible in the photo, taken two days before the X-43's captive carry flight in January 2004, are [foreground to background]; Tony Kawano (Range Safety Officer), Brad Neal (Mission Controller), and Griffin Corpening (Test Conductor).
Ontology Based Quality Evaluation for Spatial Data
NASA Astrophysics Data System (ADS)
Yılmaz, C.; Cömert, Ç.
2015-08-01
Many institutions will be providing data to the National Spatial Data Infrastructure (NSDI). The current technical background of the NSDI is based on syntactic web services, and it is expected that this will be replaced by semantic web services. The quality of the data provided is important for the decision-making process and the accuracy of transactions, so data quality needs to be tested; this topic has been neglected in Turkey. Data quality control for the NSDI may be done by private or public "data accreditation" institutions, and a methodology is required for data quality evaluation. There are studies on data quality, including ISO standards, academic studies and software to evaluate spatial data quality. The ISO 19157 standard defines the data quality elements. Proprietary software such as 1Spatial's 1Validate and ESRI's Data Reviewer offers quality evaluation based on the vendors' own classifications of rules. Commonly, rule-based approaches are used for geospatial data quality checks. In this study, we look for the technical components needed to devise and implement a rule-based approach with ontologies, using free and open source software in a semantic web context. The semantic web uses ontologies to deliver well-defined web resources and make them accessible to end users and processes. We have created an ontology conforming to the geospatial data and defined some sample rules to show how to test data with respect to data quality elements, including attribute, topo-semantic and geometrical consistency, using free and open source software. To test data against the rules, sample GeoSPARQL queries were created and associated with the specifications.
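The rule-based idea the abstract describes, independently of the ontology machinery, reduces to evaluating named quality rules against each feature. The sketch below is a minimal assumption-laden illustration (the rules and the feature schema are invented, loosely mirroring ISO 19157 data-quality elements), not the authors' implementation.

```python
# Hedged sketch of a rule-based quality check: each rule is a named predicate
# over a feature's attributes. Rule names echo ISO 19157 data-quality
# elements; the specific rules and schema are illustrative assumptions.

RULES = {
    "attribute consistency": lambda f: f.get("lanes", 0) >= 1
        if f["type"] == "road" else True,
    "domain consistency": lambda f: f.get("speed_limit", 0) <= 130,
}

def check_feature(feature):
    """Return the names of all quality rules the feature violates."""
    return [name for name, rule in RULES.items() if not rule(feature)]
```

In the ontology-based variant, the same predicates would be expressed as GeoSPARQL queries over an RDF representation of the features, so that a semantic web service can run them against any conforming dataset.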
Do alcohol compliance checks decrease underage sales at neighboring establishments?
Erickson, Darin J; Smolenski, Derek J; Toomey, Traci L; Carlin, Bradley P; Wagenaar, Alexander C
2013-11-01
Underage alcohol compliance checks conducted by law enforcement agencies can reduce the likelihood of illegal alcohol sales at checked alcohol establishments, and theory suggests that an alcohol establishment that is checked may warn nearby establishments that compliance checks are being conducted in the area. In this study, we examined whether the effects of compliance checks diffuse to neighboring establishments. We used data from the Complying with the Minimum Drinking Age trial, which included more than 2,000 compliance checks conducted at more than 900 alcohol establishments. The primary outcome was the sale of alcohol to a pseudo-underage buyer without the need for age identification. A multilevel logistic regression was used to model the effect of a compliance check at each establishment as well as the effect of compliance checks at neighboring establishments within 500 m (stratified into four equal-radius concentric rings), after buyer, license, establishment, and community-level variables were controlled for. We observed a decrease in the likelihood of establishments selling alcohol to underage youth after they had been checked by law enforcement, but these effects quickly decayed over time. Establishments that had a close neighbor (within 125 m) checked in the past 90 days were also less likely to sell alcohol to young-appearing buyers. The spatial effect of compliance checks on other establishments decayed rapidly with increasing distance. Results confirm the hypothesis that the effects of police compliance checks do spill over to neighboring establishments. These findings have implications for the development of an optimal schedule of police compliance checks.
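The spatial stratification used in this analysis (neighbors within 500 m split into four equal-radius concentric rings) is simple to make concrete. The sketch below is an illustration of that binning only; the distance metric and function name are assumptions, not the study's code.

```python
# Sketch of the concentric-ring stratification described in the abstract:
# neighbors within 500 m are assigned to four equal-radius rings
# (0-125, 125-250, 250-375, 375-500 m). Euclidean distance is assumed.

def ring_index(distance_m, max_m=500.0, n_rings=4):
    """Return the 0-based ring for a neighboring establishment,
    or None if it lies beyond max_m."""
    if distance_m < 0 or distance_m >= max_m:
        return None
    width = max_m / n_rings  # 125 m per ring
    return int(distance_m // width)
```

In the study's multilevel logistic regression, each ring would enter as a separate covariate (e.g. "a neighbor in ring 0 was checked in the past 90 days"), which is what allows the spatial decay of the effect to be estimated.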
A new kind of universal smart home security safety monitoring system
NASA Astrophysics Data System (ADS)
Li, Biqing; Li, Zhao
2018-04-01
At the current level of social development, with improved quality of life, home security has become an important issue of law and order. This graduation project adopts wireless transmission, with an STC89C52 microcontroller as the host controller, to build a human infrared-induction anti-theft monitoring system. The system mainly consists of a main control circuit, a power supply circuit, a human-body activity detection module, a sound and light alarm circuit, and a recording and display circuit. Its main function is to detect human activity and transmit this information to the control panel; the microcontroller program then drives the sound and light alarm circuit while recording the alarm location and time, and the records can be checked at any time as required, ultimately achieving the purpose of monitoring. The advantage of using a pyroelectric infrared sensor is that it can be installed in a hidden place and is not easy to discover, with low cost and good detection results, and the system has broad prospects for development.
7 CFR 58.243 - Checking quality.
Code of Federal Regulations, 2012 CFR
2012-01-01
.... Periodically samples of product and environmental material shall be tested for salmonella. Test results shall be negative when samples are tested for salmonella. Line samples should be taken periodically as an...
7 CFR 58.243 - Checking quality.
Code of Federal Regulations, 2013 CFR
2013-01-01
.... Periodically samples of product and environmental material shall be tested for salmonella. Test results shall be negative when samples are tested for salmonella. Line samples should be taken periodically as an...
7 CFR 58.243 - Checking quality.
Code of Federal Regulations, 2014 CFR
2014-01-01
.... Periodically samples of product and environmental material shall be tested for salmonella. Test results shall be negative when samples are tested for salmonella. Line samples should be taken periodically as an...
7 CFR 58.243 - Checking quality.
Code of Federal Regulations, 2011 CFR
2011-01-01
.... Periodically samples of product and environmental material shall be tested for salmonella. Test results shall be negative when samples are tested for salmonella. Line samples should be taken periodically as an...
[Practical implementation of a quality management system in a radiological department].
Huber, S; Zech, C J
2011-10-01
This article describes the architecture of a project aiming to implement a DIN EN ISO 9001 quality management system in a radiological department. It is intended as a practical guide, demonstrating each step of the project leading to certification of the system. In the planning phase, resources for the implementation of the project have to be identified and a quality management (QM) group has to be formed as the core team. In the first project phase, all available documents have to be checked and compiled in the QM manual; moreover, all relevant processes of the department have to be described in so-called process descriptions. In a second step, responsibilities for the project are identified. Customer and employee surveys have to be carried out and a nonconformity management system has to be implemented. In this phase internal audits are also needed to check the new QM system, which is finally tested in the external certification audit for its conformity with the standards.
Continuous improvement of the quality reporting system of a medium-size company
NASA Astrophysics Data System (ADS)
Hawkins, Anthony; Onuh, Spencer
2001-10-01
Many companies are faced with quality improvement issues on a daily basis, but their responses to this problem vary. This paper discusses the improvement in the defect reporting system at a medium-sized manufacturing company following the appointment of an experienced, motivated design engineer dedicated to that task. It sets out the situation that the engineer inherited and details the changes that were incorporated; it assesses which were successful and which failed. A survey of the current literature showed that little has been written specifically on the subject of audited defect reporting, and it is felt that this study goes some way towards filling that void. A successful survey of engineering companies in southern Hampshire reinforces the principal findings: that emphasising the Check part of Deming's Plan-Do-Check-Act cycle is a novel approach to the Quality Improvement Process, and that it has reduced the cost of rework by an audited 80% over a period of two years.
PFReports: A program for systematic checking of annual peaks in NWISWeb
Ryberg, Karen R.
2008-01-01
The accuracy, characterization, and completeness of the U.S. Geological Survey (USGS) peak-flow data drive the determination of flood-frequency estimates that are used daily to design water and transportation infrastructure, delineate flood-plain boundaries, and regulate development and utilization of lands throughout the Nation, and are essential to understanding the implications of climate change on flooding. Indeed, this high-profile database reflects and highlights the quality of USGS water-data collection programs. Its extension and improvement are essential to efforts to strengthen USGS networks and science leadership and are worthy of the attention of Water Science Center (WSC) hydrographers. This document describes a computer program, PFReports, and its output, which facilitate efficient and robust review and correction of data in the USGS Peak Flow File (PFF) hosted as part of NWISWeb (the USGS public Web interface to much of the data stored and managed within the National Water Information System, or NWIS). The checks embedded in the program are recommended as part of a more comprehensive assessment of peak-flow data that will eventually include examination of possible regional changes, seasonal changes, and decadal variations in magnitude, timing, and frequency. Just as important as the comprehensive assessment, cleaning up the database will increase the likelihood of improved WSC regional flood-frequency equations. As an example of the value of cleaning up the PFF, data for 26,921 sites in the PFF were obtained. Of those sites, 17,542 had peak streamflow values and daily values. For those 17,542 sites, 1,097 peaks were identified that were less than the daily value for the day on which the peak occurred. Of the 26,921 sites, 11,643 had peak streamflow values, concurrent daily values, and at least 10 peaks. At the 11,643 sites, 2,205 peaks were identified as potential outliers in a regression of peak streamflows on daily values.
Previous efforts to identify problems with the PFF were time consuming, laborious, and often ineffective. This new suite of checks represents an effort to automate identification of specific problems without plotting or printing large amounts of data that may not have problems. In addition, the results of the checks of the peak flow files are delivered through the World Wide Web, with links to individual reports, so that WSCs can focus on specific problems in an organized and standardized fashion. Over the years, technical reviews, regional-flood studies, and user inquiries have identified many minor and some major problems in the PFF. However, the cumbersome nature of the PFF editor and a lack of analytical tools have hampered efforts at quality assurance/quality control (QA/QC), and subsequently at making needed revisions to the database. This document is organized to provide information regarding PFReports, especially the tests involving regression, and to provide an overview of the review procedures for utilizing the output. It also may be used as a reference for the data qualification codes and abbreviations for the tests. Results of the checks for all peak flow files (March 2008) are available at http://nd.water.usgs.gov/internal/pfreports/.
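The two automated checks quantified in this abstract are easy to illustrate. The sketch below is an illustrative re-implementation, not PFReports itself: the first function flags peaks that are less than the concurrent mean daily value (a physical impossibility), and the second flags outliers from a least-squares regression of peak flow on daily value; the 3-standard-deviation threshold is an assumption.

```python
# Illustrative versions (not PFReports code) of two peak-flow checks:
# (a) peaks below the concurrent daily value, (b) regression outliers.

def peaks_below_daily(records):
    """records: list of (peak_flow, daily_value) for the same day.
    An instantaneous peak can never be below that day's mean daily flow."""
    return [i for i, (p, d) in enumerate(records) if p < d]

def regression_outliers(records, z=3.0):
    """Flag peaks whose residual from a least-squares fit of peak flow on
    daily value exceeds z residual standard deviations. Simple sketch;
    assumes at least 10 peaks, as in the report."""
    n = len(records)
    xs = [d for _, d in records]   # daily values
    ys = [p for p, _ in records]   # peak flows
    x_mean, y_mean = sum(xs) / n, sum(ys) / n
    sxx = sum((x - x_mean) ** 2 for x in xs)
    slope = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, ys)) / sxx
    intercept = y_mean - slope * x_mean
    resid = [y - (intercept + slope * x) for x, y in zip(xs, ys)]
    sd = (sum(r * r for r in resid) / (n - 2)) ** 0.5  # residual std. dev.
    return [i for i, r in enumerate(resid) if abs(r) > z * sd]
```

A production version would, as the report notes, pair such flags with station-level reports rather than apply a single global threshold.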
Technical editing of research reports in biomedical journals.
Wager, Elizabeth; Middleton, Philippa
2008-10-08
Most journals try to improve their articles by technical editing processes such as proof-reading, editing to conform to 'house styles', grammatical conventions and checking accuracy of cited references. Despite the considerable resources devoted to technical editing, we do not know whether it improves the accessibility of biomedical research findings or the utility of articles. This is an update of a Cochrane methodology review first published in 2003. To assess the effects of technical editing on research reports in peer-reviewed biomedical journals, and to assess the level of accuracy of references to these reports. We searched The Cochrane Library Issue 2, 2007; MEDLINE (last searched July 2006); EMBASE (last searched June 2007) and checked relevant articles for further references. We also searched the Internet and contacted researchers and experts in the field. Prospective or retrospective comparative studies of technical editing processes applied to original research articles in biomedical journals, as well as studies of reference accuracy. Two review authors independently assessed each study against the selection criteria and assessed the methodological quality of each study. One review author extracted the data, and the second review author repeated this. We located 32 studies addressing technical editing and 66 surveys of reference accuracy. Only three of the studies were randomised controlled trials. A 'package' of largely unspecified editorial processes applied between acceptance and publication was associated with improved readability in two studies and improved reporting quality in another two studies, while another study showed mixed results after stricter editorial policies were introduced. More intensive editorial processes were associated with fewer errors in abstracts and references. 
Providing instructions to authors was associated with improved reporting of ethics requirements in one study and fewer errors in references in two studies, but no difference was seen in the quality of abstracts in one randomised controlled trial. Structuring generally improved the quality of abstracts, but increased their length. The reference accuracy studies showed a median citation error rate of 38% and a median quotation error rate of 20%. Surprisingly few studies have evaluated the effects of technical editing rigorously. However there is some evidence that the 'package' of technical editing used by biomedical journals does improve papers. A substantial number of references in biomedical articles are cited or quoted inaccurately.
Rangé, G; Chassaing, S; Marcollet, P; Saint-Étienne, C; Dequenne, P; Goralski, M; Bardiére, P; Beverilli, F; Godillon, L; Sabine, B; Laure, C; Gautier, S; Hakim, R; Albert, F; Angoulvant, D; Grammatico-Guillon, L
2018-05-01
To assess the reliability and low cost of a computerized interventional cardiology (IC) registry designed to prospectively and systematically collect high-quality data for all consecutive coronary patients referred for coronary angiogram and/or coronary angioplasty. Rigorous clinical practice assessment is a key factor in improving prognosis in IC. A prospective and permanent registry could achieve this goal but, presumably, at high cost and with a low level of data quality. One multicentric IC registry (the CRAC registry), fully integrated into the usual coronary activity report software, started in the Centre-Val de Loire (CVL) French region in 2014. Quality assessment of the CRAC registry was conducted in five IC cath labs of the CVL region, from January 1st to December 31st, 2014. The quality of the collected data was evaluated by measuring procedure exhaustivity (by comparison with data from the hospital information system), data completeness (quality controls) and data consistency (by checking complete medical charts as the gold standard). The cost per procedure (global registry operating cost/number of collected procedures) was also estimated. The CRAC model provided a high quality level, with 98.2% procedure exhaustivity, 99.6% data completeness and 89% data consistency. The operating cost per procedure was €14.70 ($16.51) for data collection and quality control, including ST-segment elevation myocardial infarction (STEMI) preadmission information and one-year follow-up after angioplasty. This integrated computerized IC registry led to the construction of an exhaustive, reliable and low-cost database, including all coronary patients entering the participating IC centers in the CVL region. This solution will be developed in other French regions, setting up a national IC database for coronary patients in 2020: France PCI. Copyright © 2018 Elsevier Masson SAS. All rights reserved.
Second generation experiments in fault tolerant software
NASA Technical Reports Server (NTRS)
Knight, J. C.
1987-01-01
The purpose of the Multi-Version Software (MVS) experiment is to obtain empirical measurements of the performance of multi-version systems. Twenty versions of a program were prepared under reasonably realistic development conditions from the same specifications. The overall structure of the testing environment for the MVS experiment and its status are described, along with a preliminary version of the control system implemented for the MVS experiment to give the experimenter control over the details of the testing. The results of an empirical study of error detection using self-checks are also presented. The analysis of the checks revealed great differences in the ability of individual programmers to design effective checks.
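The core mechanism of a multi-version system is adjudication over independently developed versions, typically by majority vote. The sketch below is a minimal illustration of that idea, not the experiment's actual harness; the three lambda "versions" and the seeded bug are invented for the example.

```python
from collections import Counter

def vote(outputs):
    """Majority vote over the outputs of independently developed versions.
    Returns (result, agreed); agreed is False when no strict majority exists."""
    value, count = Counter(outputs).most_common(1)[0]
    return value, count > len(outputs) // 2

# Three hypothetical versions of the same computation; version_c has a seeded fault.
version_a = lambda x: x * x
version_b = lambda x: x ** 2
version_c = lambda x: x + x  # bug: addition instead of squaring

result, agreed = vote([f(5) for f in (version_a, version_b, version_c)])
print(result, agreed)  # 25 True
```

Self-checks, as studied in the abstract, complement voting: each version can also assert properties of its own output (e.g., non-negativity) before the vote.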
Preparation for a Changing World: Quality Education Program Study. Booklet 10-A (Needs Assessment).
ERIC Educational Resources Information Center
Bucks County Public Schools, Doylestown, PA.
The general needs assessment instrument can provide the means for a school district to assess its needs relative to the Ten Goals of Quality Education. It comprises behavior statements taken from the category schemes. The student must check the appropriate number for each statement, representing "always" through "never."…
The effects of group supervision of nurses: a systematic literature review.
Francke, Anneke L; de Graaff, Fuusje M
2012-09-01
To gain insight into the existing scientific evidence on the effects of group supervision for nurses. A systematic literature study of original research publications. Searches were performed in February 2010 in PubMed, CINAHL, Cochrane Library, Embase, ERIC, the NIVEL catalogue, and PsycINFO. No limitations were applied regarding date of publication, language or country. Original research publications were eligible for review when they described group supervision programmes directed at nurses; used a control group or a pre-test post-test design; and gave information about the effects of group supervision on nurse or patient outcomes. The two review authors independently assessed studies for inclusion. The methodological quality of included studies was also independently assessed by the review authors, using a check list developed by Van Tulder et al. in collaboration with the Dutch Cochrane Centre. Data related to the original publications were extracted by one review author and checked by a second review author. No statistical pooling of outcomes was performed, because there was large heterogeneity of outcomes. A total of 1087 potentially relevant references were found. After screening of the references, eight studies with a control group and nine with a pre-test post-test design were included. Most of the 17 studies included have serious methodological limitations, but four Swedish publications in the field of dementia care had high methodological quality and all point to positive effects on nurses' attitudes and skills and/or nurse-patient interactions. However, in interpreting these positive results, it must be taken into account that these four high-quality publications concern sub-studies of one 'sliced' research project using the same study sample. Moreover, these four publications combined a group supervision intervention with the introduction of individual care planning, which also hampers conclusions about the effectiveness of group supervision alone. 
Although there are many indications that group supervision of nurses is effective, evidence on its effects is still scarce. Further methodologically sound research is needed. Copyright © 2011 Elsevier Ltd. All rights reserved.
Nuin, Maider; Alfaro, Begoña; Cruz, Ziortza; Argarate, Nerea; George, Susie; Le Marc, Yvan; Olley, June; Pin, Carmen
2008-10-31
Kinetic models were developed to predict the microbial spoilage and the sensory quality of fresh fish and to evaluate the efficiency of a commercial time-temperature integrator (TTI) label, Fresh Check(R), to monitor shelf life. Farmed turbot (Psetta maxima) samples were packaged in PVC film and stored at 0, 5, 10 and 15 degrees C. Microbial growth and sensory attributes were monitored at regular time intervals. The response of the Fresh Check device was measured at the same temperatures during the storage period. The sensory perception was quantified according to a global sensory indicator obtained by principal component analysis as well as to the Quality Index Method, QIM, as described by Rahman and Olley [Rahman, H.A., Olley, J., 1984. Assessment of sensory techniques for quality assessment of Australian fish. CSIRO Tasmanian Regional Laboratory. Occasional paper n. 8. Available from the Australian Maritime College library. Newnham. Tasmania]. Both methods were found equally valid to monitor the loss of sensory quality. The maximum specific growth rate of spoilage bacteria, the rate of change of the sensory indicators and the rate of change of the colour measurements of the TTI label were modelled as a function of temperature. The temperature had a similar effect on the bacteria, sensory and Fresh Check kinetics. At the time of sensory rejection, the bacterial load was ca. 10(5)-10(6) cfu/g. The end of shelf life indicated by the Fresh Check label was close to the sensory rejection time. The performance of the models was validated under fluctuating temperature conditions by comparing the predicted and measured values for all microbial, sensory and TTI responses. The models have been implemented in a Visual Basic add-in for Excel called "Fish Shelf Life Prediction (FSLP)". This program predicts sensory acceptability and growth of spoilage bacteria in fish and the response of the TTI at constant and fluctuating temperature conditions. 
The program is freely available at http://www.azti.es/muestracontenido.asp?idcontenido=980&content=15&nodo1=30&nodo2=0.
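The abstract models the maximum specific growth rate of spoilage bacteria as a function of temperature. A common form for such secondary models is the Ratkowsky square-root model; the sketch below uses it with illustrative parameters (b, Tmin), which are assumptions, not the fitted values used in FSLP.

```python
def sqrt_model_rate(T, b=0.03, Tmin=-10.0):
    """Ratkowsky square-root model: sqrt(mu_max) = b * (T - Tmin).
    b and Tmin are illustrative, not fitted turbot-spoilage parameters."""
    return (b * (T - Tmin)) ** 2

def shelf_life_hours(T, log_increase=3.0):
    """Hours for spoilage bacteria to grow ~3 log10 cycles at temperature T (degC),
    a rough proxy for reaching the rejection load of ~10^5-10^6 cfu/g."""
    mu = sqrt_model_rate(T)  # specific growth rate in log10 units per hour
    return log_increase / mu

print(round(shelf_life_hours(0), 1))   # storage at 0 degC
print(round(shelf_life_hours(15), 1))  # much shorter shelf life at 15 degC
```

Under fluctuating temperatures, the same model is applied piecewise by integrating the rate over each temperature segment, which is how the validation described in the abstract is typically carried out.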
Exercise: When to Check with Your Doctor First
... check with your doctor before you start to exercise. By Mayo Clinic Staff Regular exercise can help you control your weight, reduce your ... talk to your doctor before starting a new exercise routine. Although moderate physical activity such as brisk ...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-28
... Request ACTION: 60-Day Notice of Information Collection; G-146; Non-Immigrant Check Letter; OMB Control No... collection. (2) Title of the Form/Collection: Non-Immigrant Check Letter. (3) Agency form number, if any, and...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-20
... Request ACTION: 60-day notice of information collection; G-146; Non-Immigrant Check Letter; OMB Control No... collection. (2) Title of the Form/Collection: Non-Immigrant Check Letter. (3) Agency form number, if any, and...
Presley, Todd K.; Jamison, Marcael T.J.; Young, Stacie T.M.
2008-01-01
Storm runoff water-quality samples were collected as part of the State of Hawaii Department of Transportation Stormwater Monitoring Program. The program is designed to assess the effects of highway runoff and urban runoff on Halawa Stream and to assess the effects from the H-1 storm drain on Manoa Stream. For this program, rainfall data were collected at three stations, continuous discharge data at four stations, and water-quality data at six stations, which include the four continuous discharge stations. This report summarizes rainfall, discharge, and water-quality data collected between July 1, 2007, and June 30, 2008. A total of 16 environmental samples were collected over two storms during July 1, 2007, to June 30, 2008, within the Halawa Stream drainage area. Samples were analyzed for total suspended solids, total dissolved solids, nutrients, chemical oxygen demand, and selected trace metals (cadmium, chromium, copper, lead, and zinc). Additionally, grab samples were analyzed for oil and grease, total petroleum hydrocarbons, fecal coliform, and biological oxygen demand. Some samples were analyzed for only a partial list of these analytes because an insufficient volume of sample was collected by the automatic samplers. Three additional quality-assurance/quality-control samples were collected concurrently with the storm samples. A total of 16 environmental samples were collected over four storms during July 1, 2007, to June 30, 2008 at the H-1 Storm Drain. All samples at this site were collected using an automatic sampler. Samples generally were analyzed for total suspended solids, nutrients, chemical oxygen demand, oil and grease, total petroleum hydrocarbons, and selected trace metals (cadmium, chromium, copper, lead, nickel, and zinc), although some samples were analyzed for only a partial list of these analytes. During the storm of January 29, 2008, 10 discrete samples were collected. 
Varying constituent concentrations were detected for the samples collected at different times during this storm event. Two quality-assurance/quality-control samples were collected concurrently with the storm samples. Three additional quality-assurance/quality-control samples were collected during routine sampler maintenance to check the effectiveness of equipment-cleaning procedures.
Data quality can make or break a research infrastructure
NASA Astrophysics Data System (ADS)
Pastorello, G.; Gunter, D.; Chu, H.; Christianson, D. S.; Trotta, C.; Canfora, E.; Faybishenko, B.; Cheah, Y. W.; Beekwilder, N.; Chan, S.; Dengel, S.; Keenan, T. F.; O'Brien, F.; Elbashandy, A.; Poindexter, C.; Humphrey, M.; Papale, D.; Agarwal, D.
2017-12-01
Research infrastructures (RIs) commonly support observational data provided by multiple, independent sources. Uniformity in the data distributed by such RIs is important in most applications, e.g., in comparative studies using data from two or more sources. Achieving uniformity in terms of data quality is challenging, especially considering that many data issues are unpredictable and cannot be detected until a first occurrence of the issue. As a result, many data quality control activities within RIs require a manual, human-in-the-loop element, making quality control expensive. Our motivating example is the FLUXNET2015 dataset, a collection of ecosystem-level carbon, water, and energy fluxes between land and atmosphere from over 200 sites around the world, some sites with over 20 years of data. About 90% of the human effort to create the dataset was spent on data quality related activities. Based on this experience, we have been working on solutions to increase the automation of data quality control procedures. Since it is nearly impossible to fully automate all quality related checks, we have been drawing on techniques used in software development, which shares several of the same constraints. In both managing scientific data and writing software, human time is a precious resource; code bases, like science datasets, can be large, complex, and full of errors; and both scientific and software endeavors can be pursued by individuals, but collaborative teams can accomplish a lot more. The lucrative and fast-paced nature of the software industry fueled the creation of methods and tools to increase automation and productivity within these constraints. Issue tracking systems, methods for translating problems into automated tests, and powerful version control tools are a few examples. Terrestrial and aquatic ecosystems research relies heavily on many types of observational data.
As the volume of collected data increases, ensuring data quality is becoming an unwieldy challenge for RIs. Business-as-usual approaches to data quality do not work with larger data volumes. We believe RIs can benefit greatly from adapting this body of theory and practice from software quality to data quality, enabling systematic and reproducible safeguards against errors and mistakes in datasets as much as in software.
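The abstract's key idea, turning a once-observed data issue into a permanent automated test, can be sketched as a small check function. The variable names, the plausibility window, and the missing-value code below are illustrative assumptions, not FLUXNET2015 specifics.

```python
def check_physical_range(series, lo, hi):
    """Per-value quality flags: 'OK' inside the plausible physical range,
    'RANGE' outside. Once an issue is seen, a check like this runs forever,
    mirroring automated regression tests in software development."""
    return ["OK" if lo <= v <= hi else "RANGE" for v in series]

# Illustrative CO2 flux values (umol m-2 s-1); -9999 is a common missing-value code.
flags = check_physical_range([2.1, -4.0, -9999.0, 55.0], lo=-50.0, hi=50.0)
print(flags)  # ['OK', 'OK', 'RANGE', 'RANGE']
```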
NASA Astrophysics Data System (ADS)
Orlecka-Sikora, Beata; Kwiatek, Grzegorz; Olszewska, Dorota; Lasocki, Stanisław; Gasparini, Paolo; Kozlovskaya, Elena; Nevalainen, Jouni; Schmittbuhl, Jean; Grasso, Jean Robert; Schaming, Marc; Biggare, Pascal; Saccarotti, Gilberto; Garcia, Alexander; Cassidy, Nigel; Toon, Sam; Mutke, Grzegorz; Sterzel, Mariusz; Szepieniec, Tomasz
2016-04-01
The EPOS integration plan assumes a significant contribution to the research on anthropogenic hazards (AH) associated with the exploration and exploitation of geo-resources. These problems will be dealt with in the Thematic Core Service "Anthropogenic Hazards" (TCS AH). TCS AH is based on the prototype built in the framework of the IS-EPOS platform project (https://tcs.ah-epos.eu/), financed from Polish structural funds (POIG.02.03.00-14-090/13-00), which will be further developed within the EPOS IP project (H2020-INFRADEV-1-2015-1, INFRADEV-3-2015). TCS AH aims to have a measurable impact on innovative research and development as well as on society by providing a comprehensive, wide-scale and high-quality AH research infrastructure. One of the main deliverables is a set of comprehensive induced seismicity datasets called "episodes". An episode is a comprehensive data description of a geophysical process, induced or triggered by technological activity, which under certain circumstances can become hazardous for people, infrastructure and the environment. In addition to the six episodes already implemented during the mentioned IS-EPOS project, at least 20 new episodes related to conventional hydrocarbon extraction, reservoir treatment, underground mining and geothermal energy production are being integrated into the e-environment of the TCS AH. The heterogeneous multi-disciplinary data are transformed into the unified structures developed within the IS-EPOS project to form integrated and validated datasets. Dedicated visualization tools for the multidisciplinary data comprising episodes are also implemented. These tools can aggregate and combine different data types and facilitate specific visualizations (e.g. combining seismic and technological information).
The implementation process, tailored for each episode, consists of four steps: (i) data revision and determination of its accuracy and limitations; (ii) data preparation and homogenization to follow the TCS AH standards; (iii) data collection, i.e. uploading the data to local data centres; and (iv) metadata preparation. Web services for efficient data integration and reduction of possible mistakes have already been developed. The datasets will also pass through quality control according to the established quality control scheme. The data quality control workflow includes five steps; the first three are done in the episode provider's local data centre, whereas the next two are accomplished on the TCS AH side: (1) episode data are transferred to the local data centre, the control group roles are distributed and the workflow observer is appointed; (2) the data are standardized and formats are validated, and the completeness and quality of the data are checked; (3) metadata are prepared according to the TCS AH metadata scheme, and the published data and metadata are checked; (4) the contextual quality of the data is analysed and the episode appears in the TCS AH maintenance area; (5) the new episode data are available in TCS AH. TCS AH will also serve as an integration platform for the episodes gathered in the framework of the "Shale gas exploration and exploitation induced risks" project (Horizon 2020, call LCE 16-2014).
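The staged validation described above (format validation, then completeness checks) can be sketched as a small pipeline of named checks run against an episode record. The field names and check bodies below are illustrative assumptions, not the TCS AH schema.

```python
# Hedged sketch of episode quality control as a pipeline of named checks.
# The episode structure (catalog/technology/metadata) is invented for illustration.

def validate_formats(episode):
    """Step 2 analogue: every catalog entry carries a timestamp as a string."""
    return all(isinstance(row.get("time"), str) for row in episode["catalog"])

def check_completeness(episode):
    """Step 2 analogue: all required top-level sections are present."""
    required = {"catalog", "technology", "metadata"}
    return required.issubset(episode)

def qc_workflow(episode):
    steps = [
        ("formats_valid", validate_formats),
        ("complete", check_completeness),
    ]
    return {name: check(episode) for name, check in steps}

episode = {"catalog": [{"time": "2016-04-01T00:00:00Z", "mag": 1.2}],
           "technology": "underground mining",
           "metadata": {"id": "EP-01"}}
print(qc_workflow(episode))  # {'formats_valid': True, 'complete': True}
```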
Hospital quality: a product of good management as much as good treatment.
Hyde, Andy; Frafjord, Anders
2013-01-01
In Norway, as in most countries, the demands placed on hospitals to reduce costs and improve the quality of services are intense. Although many say that improving quality reduces costs, few can prove it. Furthermore, few can show that improving quality improves patient satisfaction. Diakonhjemmet Hospital in Norway has designed and implemented a hospital management system based on lean principles and the PDCA (Plan-Do-Check-Act) quality circle introduced by W.E. Deming (Deming 2000). The results are impressive, with improvements in both quality and patient satisfaction. The hospital also runs at a profit.
Detection and Analysis of the Quality of Ibuprofen Granules
NASA Astrophysics Data System (ADS)
Yu-bin, Ji; Xin, LI; Guo-song, Xin; Qin-bing, Xue
2017-12-01
Ibuprofen Granules were subjected to comprehensive quality testing to ensure conformance with the Chinese Pharmacopoeia. With reference to the Chinese Pharmacopoeia, the granules were tested by UV and HPLC, including checks of grain size, volume deviation, loss on drying, and dissolution rate, followed by quality evaluation. The results indicated that the Ibuprofen Granules conform to the standards; they are qualified and should be permitted to be marketed.
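Pharmacopoeial dissolution checks are typically pass/fail criteria against a monograph limit Q. The sketch below shows a generic stage-1 style acceptance rule (every unit at least Q + 5%); the Q value and results are illustrative, not the actual Ibuprofen Granules monograph figures.

```python
def dissolution_stage1_pass(percent_dissolved, Q=80.0):
    """Generic stage-1 dissolution acceptance: every tested unit must release
    at least Q + 5% of the labeled amount. Q here is an illustrative limit,
    not the Chinese Pharmacopoeia value for Ibuprofen Granules."""
    return all(p >= Q + 5.0 for p in percent_dissolved)

# Six hypothetical units, percent of labeled amount dissolved at the test time.
print(dissolution_stage1_pass([92.1, 88.4, 95.0, 90.2, 87.5, 91.3]))  # True
```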
Health Effects of Ozone Pollution
Inhaling ozone can cause coughing, shortness of breath, worse asthma or bronchitis symptoms, and irritation and damage to airways. You can reduce your exposure to ozone pollution by checking air quality where you live.
46 CFR 160.176-13 - Approval Tests.
Code of Federal Regulations, 2011 CFR
2011-10-01
... thread count must be at least 400 N (90 lb.). (v) [Reserved] (w) Visual examination. One complete... check the quality of incoming lifejacket components and the production process. Test samples must come...
46 CFR 160.176-13 - Approval Tests.
Code of Federal Regulations, 2013 CFR
2013-10-01
... thread count must be at least 400 N (90 lb.). (v) [Reserved] (w) Visual examination. One complete... check the quality of incoming lifejacket components and the production process. Test samples must come...
46 CFR 160.176-13 - Approval Tests.
Code of Federal Regulations, 2012 CFR
2012-10-01
... thread count must be at least 400 N (90 lb.). (v) [Reserved] (w) Visual examination. One complete... check the quality of incoming lifejacket components and the production process. Test samples must come...
46 CFR 160.176-13 - Approval Tests.
Code of Federal Regulations, 2014 CFR
2014-10-01
... thread count must be at least 400 N (90 lb.). (v) [Reserved] (w) Visual examination. One complete... check the quality of incoming lifejacket components and the production process. Test samples must come...
77 FR 38273 - Science Advisory Board; Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-27
... Administration (NOAA) science programs are of the highest quality and provide optimal support to resource... Environmental Laboratory, 7600 Sand Point Way NE., Seattle, Washington 98115. Please check the SAB Web site http...
[Association between sleep quality and life function among elderly community residents].
Tanaka, Mika; Kusaga, Mari; Tagaya, Hirokuni; Miyoko, I; Oshima, Asami; Watanabe, Chiho
2012-01-01
To investigate the association between sleep quality and life function in an elderly Japanese population. A total of 563 residents of a village in Kumamoto Prefecture aged ≥65 years were asked to fill out a self-administered questionnaire survey from June to July 2010. Sleep quality and life function were respectively evaluated using the Pittsburgh Sleep Quality Index (PSQI) and Basics Check List, which is used to screen elderly individuals at high risk of needing long-term care in the future. As adjustment factors, age, sex, economic situation, residency status, medical history, depression status, and cognitive function were assessed. We examined the relationship between sleep quality and life function using multiple logistic regression analysis, with life function as a dependent variable. Subjects already receiving care or with psychiatric disorders or severe cognitive disturbance were excluded from analysis. Among the subjects (n=395), a significant relationship was found between poor sleep quality and impaired life function in all models. The odds ratio was 1.82 (95% confidence interval: 1.03-3.23) in the final model controlling for all adjustment factors. Our findings here suggest a significant relationship between poor sleep quality and impaired life function among elderly community residents. Given these findings, intervention to improve sleep may help delay or prevent the need for long-term care among elderly individuals.
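The reported odds ratio and 95% confidence interval come from multiple logistic regression; in general, OR = exp(beta) and the 95% CI is exp(beta ± 1.96·SE). The sketch below illustrates the computation; the standard error is an assumption chosen to roughly reproduce the reported interval, not a value from the study.

```python
import math

def odds_ratio(beta, se, z=1.96):
    """Odds ratio and 95% CI from a logistic-regression coefficient and its SE."""
    return (math.exp(beta),
            math.exp(beta - z * se),
            math.exp(beta + z * se))

# beta chosen so the OR matches the reported 1.82; se=0.29 is illustrative.
or_, lo, hi = odds_ratio(beta=math.log(1.82), se=0.29)
print(round(or_, 2), round(lo, 2), round(hi, 2))
```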
Mitchell, Joanna; Hardeman, Wendy; Pears, Sally; Vasconcelos, Joana C; Prevost, A Toby; Wilson, Ed; Sutton, Stephen
2016-06-27
Physical activity interventions that are targeted at individuals can be effective in encouraging people to be more physically active. However, most such interventions are too long or complex and not scalable to the general population. This trial will test the effectiveness and cost-effectiveness of a very brief physical activity intervention when delivered as part of preventative health checks in primary care (National Health Service (NHS) Health Check). The Very Brief Intervention (VBI) Trial is a two parallel-group, randomised, controlled trial with 1:1 individual allocation and follow-up at 3 months. A total of 1,140 participants will be recruited from 23 primary care practices in the east of England. Participants eligible for an NHS Health Check and who are considered suitable to take part by their doctor and able to provide written informed consent are eligible for the trial. Participants are randomly assigned at the beginning of the NHS Health Check to either 1) the control arm, in which they receive only the NHS Health Check, or 2) the intervention arm, in which they receive the NHS Health Check plus 'Step It Up' (a very brief intervention that can be delivered in 5 minutes by nurses and/or healthcare assistants at the end of the Health Check). 'Step It Up' includes (1) a face-to-face discussion, including feedback on current activity level, recommendations for physical activity, and information on how to use a pedometer, set step goals, and monitor progress; (2) written material supporting the discussion and tips and links to further resources to help increase physical activity; and (3) a pedometer to wear and a step chart for monitoring progress. The primary outcome is accelerometer counts per minute at 3-month follow-up. Secondary outcomes include the time spent in the different levels of physical activity, self-reported physical activity and economic measures. Trial recruitment is underway. 
The VBI trial will provide evidence on the effectiveness and cost-effectiveness of the Step It Up intervention delivered during NHS Health Checks and will inform policy decisions about introducing very brief interventions into routine primary care practice. ISRCTN Registry, ISRCTN72691150 . Registered on 17 July 2014.
Dissolution testing of orally disintegrating tablets.
Kraemer, Johannes; Gajendran, Jayachandar; Guillot, Alexis; Schichtel, Julian; Tuereli, Akif
2012-07-01
For industrially manufactured pharmaceutical dosage forms, product quality tests and performance tests are required to ascertain the quality of the final product. Current compendial requirements specify a disintegration and/or a dissolution test to check the quality of oral solid dosage forms. These requirements led to a number of compendial monographs for individual products and, at times, the results obtained may not be reflective of the dosage form performance. Although a general product performance test is desirable for orally disintegrating tablets (ODTs), the complexity of the release controlling mechanisms and short time-frame of release make such tests difficult to establish. For conventional oral solid dosage forms (COSDFs), disintegration is often considered to be the prerequisite for subsequent dissolution. Hence, disintegration testing is usually insufficient to judge product performance of COSDFs. Given the very fast disintegration of ODTs, the relationship between disintegration and dissolution is worthy of closer scrutiny. This article reviews the current status of dissolution testing of ODTs to establish the product quality standards. Based on experimental results, it appears that it may be feasible to rely on the dissolution test without a need for disintegration studies for selected ODTs on the market. © 2012 The Authors. JPP © 2012 Royal Pharmaceutical Society.
Comparison of the performance of intraoral X-ray sensors using objective image quality assessment.
Hellén-Halme, Kristina; Johansson, Curt; Nilsson, Mats
2016-05-01
The main aim of this study was to evaluate the performance of 10 individual sensors of the same make, using objective measures of key image quality parameters. A further aim was to compare 8 brands of sensors. Ten new sensors of 8 different models from 6 manufacturers (i.e., 80 sensors) were included in the study. All sensors were exposed in a standardized way using an X-ray tube voltage of 60 kVp and different exposure times. Sensor response, noise, low-contrast resolution, spatial resolution and uniformity were measured. Individual differences between sensors of the same brand were surprisingly large in some cases. There were clear differences in the characteristics of the different brands of sensors. The largest variations were found for individual sensor response for some of the brands studied. Also, noise level and low contrast resolution showed large variations between brands. Sensors, even of the same brand, vary significantly in their quality. It is thus valuable to establish action levels for the acceptance of newly delivered sensors and to use objective image quality control for commissioning purposes and periodic checks to ensure high performance of individual digital sensors. Copyright © 2016 Elsevier Inc. All rights reserved.
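The study's recommendation, action levels for accepting newly delivered sensors, amounts to comparing measured characteristics against tolerances. The sketch below is a generic illustration; the tolerance values and the single-number "response" and "noise" inputs are assumptions, not thresholds from the study.

```python
def accept_sensor(response, noise, reference_response=1.0,
                  max_response_dev=0.10, max_noise=0.02):
    """Accept a sensor only if its response is within a relative tolerance of a
    reference and its noise is below an action level. Thresholds illustrative."""
    dev = abs(response - reference_response) / reference_response
    return dev <= max_response_dev and noise <= max_noise

print(accept_sensor(response=1.05, noise=0.015))  # True: within both action levels
print(accept_sensor(response=1.30, noise=0.015))  # False: response deviates too much
```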
A new verification film system for routine quality control of radiation fields: Kodak EC-L.
Hermann, A; Bratengeier, K; Priske, A; Flentje, M
2000-06-01
The use of modern irradiation techniques requires better verification films for determining set-up deviations and patient movements during the course of radiation treatment. This is an investigation of the image quality and time requirement of a new verification film system compared to a conventional portal film system. For conventional verifications we used Agfa Curix HT 1000 films, which were compared to the new Kodak EC-L film system. 344 Agfa Curix HT 1000 and 381 Kodak EC-L portal films of different tumor sites (prostate, rectum, head and neck) were visually judged on a light box by 2 experienced physicians. Subjective judgement of image quality, masking of films and time requirement were checked. In this investigation, 68% of 175 Kodak EC-L ap/pa films were judged "good", 18% "moderate" and 14% "poor", whereas only 22% of 173 conventional ap/pa verification films (Agfa Curix HT 1000) were judged "good". The image quality, detail perception and time required for film inspection of the new Kodak EC-L film system were significantly improved compared with standard portal films. The films could be read more accurately and the detection of set-up deviation was facilitated.
The advanced quality control techniques planned for the International Soil Moisture Network
NASA Astrophysics Data System (ADS)
Xaver, A.; Gruber, A.; Hegiova, A.; Sanchis-Dufau, A. D.; Dorigo, W. A.
2012-04-01
In situ soil moisture observations are essential to evaluate and calibrate modeled and remotely sensed soil moisture products. Although a number of meteorological networks and field campaigns measuring soil moisture exist on a global and long-term scale, their observations are not easily accessible and lack standardization of both technique and protocol. Thus, handling and especially comparing these datasets with satellite products or land surface models is a demanding issue. To overcome these limitations the International Soil Moisture Network (ISMN; http://www.ipf.tuwien.ac.at/insitu/) has been initiated to act as a centralized data hosting facility. One advantage of the ISMN is that users are able to access the harmonized datasets easily through a web portal. Another advantage is the fully automated processing chain, including data harmonization in terms of units and sampling interval and, even more important, the advanced quality control system each measurement has to run through. The quality of in situ soil moisture measurements is crucial for the validation of satellite- and model-based soil moisture retrievals; therefore a sophisticated quality control system was developed. After a check for plausibility and geophysical limits, a quality flag is added to each measurement. An enhanced flagging mechanism was recently defined using a spectrum-based approach to detect spurious spikes, jumps and plateaus. The International Soil Moisture Network has already evolved into one of the most important distribution platforms for in situ soil moisture observations and is still growing. Currently, data from 27 networks in total, covering more than 800 stations in Europe, North America, Australia, Asia and Africa, are hosted by the ISMN. Available datasets include historical datasets as well as near real-time measurements. The improved quality control system will provide important information for satellite-based as well as land surface model-based validation studies.
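The two-stage flagging described above (geophysical-limit checks, then spike detection) can be sketched with a simple neighbour comparison. The flag letters, limits, and spike threshold below are illustrative assumptions, not the ISMN's published criteria.

```python
def flag_soil_moisture(series, lo=0.0, hi=0.6, spike=0.15):
    """Assign a quality flag to each volumetric soil moisture value (m3/m3):
    'C' = outside geophysical limits, 'D' = spurious spike versus both
    neighbours, 'G' = good. Thresholds are illustrative only."""
    flags = []
    for i, v in enumerate(series):
        if not (lo <= v <= hi):
            flags.append("C")
        elif 0 < i < len(series) - 1 and \
                abs(v - series[i - 1]) > spike and abs(v - series[i + 1]) > spike:
            flags.append("D")
        else:
            flags.append("G")
    return flags

print(flag_soil_moisture([0.20, 0.21, 0.45, 0.22, 0.23, 0.95]))
# ['G', 'G', 'D', 'G', 'G', 'C']
```

The ISMN's actual spike detection is spectrum based, per the abstract; the neighbour check here is only a minimal stand-in for that idea.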
Behera, M D; Gupta, A K; Barik, S K; Das, P; Panda, R M
2018-06-15
With the availability of satellite data from free data domains, remote sensing has increasingly become a fast, handy tool for monitoring land and water resources development activities with minimal cost and time. Here, we verified the construction of check dams and the implementation of plantation activities in two districts of Tripura state using Landsat and Sentinel-2 images for the years 2008 and 2016-2017, respectively. We applied spectral reflectance curves and index-based proxies to quantify these activities for the two time periods. A subset of the total check dams and plantation sites was chosen on the basis of site condition, nature of check dams, and planted species for identification on satellite images, and another subset was randomly chosen to validate the identification procedure. The normalized difference water index (NDWI) derived from Landsat and Sentinel-2 was used to quantify the water area that evolved, assess the water quality, and gauge the influence of associated tree shadows. Three types of check dams were observed, i.e., fully covered, partially covered, and fully soil-exposed, on the basis of the presence of grass or scrub on the check dams. Based on the nature of the check dam and site characteristics, we classified the water bodies into 11 categories using six interpretation keys (size, shape, water depth, quality, shadow of associated trees, catchment area). The check dams constructed on existing narrow gullies totally covered by branches or associated plants could not be identified without field verification. Further, use of the EVI enabled us to verify the plantation activities and assess the corresponding increase in vegetation vigor. The plantation activities were established based on the presence or absence of existing vegetation. Clearing of the plantation sites shows a differential increase in EVI values during the initial years. The 403 plantation sites were categorized into 12 major groups on the basis of the presence of dominant species and site conditions.
The dominant species were Areca catechu, Musa paradisiaca, Ananas comosus, Bambusa sp., and a mixed plantation of A. catechu and M. paradisiaca. However, the maximum increase in average EVI was observed for the pineapple plantation sites (0.11), followed by Bambusa sp. (0.10). These sites were fully covered with plantation without any exposed soil. The present study successfully demonstrates a satellite-based survey, supplemented with ground information, evaluating the changes in vegetation profile due to plantation activities, the locations of check dams, the extent of water bodies, downstream irrigation, and the catchment areas of water bodies.
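The two indices this study relies on have standard band formulas: the McFeeters NDWI uses the green and NIR bands, and the EVI uses NIR, red, and blue with fixed coefficients. The sketch below computes both; the reflectance values are invented example pixels, not data from the study.

```python
def ndwi(green, nir):
    """McFeeters NDWI = (Green - NIR) / (Green + NIR);
    positive values typically indicate open water."""
    return (green - nir) / (green + nir)

def evi(nir, red, blue):
    """Enhanced vegetation index with the standard MODIS-style coefficients:
    EVI = 2.5 * (NIR - Red) / (NIR + 6*Red - 7.5*Blue + 1)."""
    return 2.5 * (nir - red) / (nir + 6.0 * red - 7.5 * blue + 1.0)

# Illustrative surface reflectances: a water pixel and a densely vegetated pixel.
print(round(ndwi(green=0.12, nir=0.04), 2))          # 0.5  (water: positive NDWI)
print(round(evi(nir=0.45, red=0.06, blue=0.04), 2))  # 0.65 (dense vegetation)
```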
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tietze-Jaensch, Holger; Schneider, Stephan; Aksyutina, Yuliya
2012-07-01
The German product quality control is inter alia responsible for the control of two forms of heat-generating radioactive waste: a) homogeneous vitrified HLW and b) heterogeneous compacted hulls, end-pieces and technological metallic waste. In either case, significantly different metrology is employed at the site of the conditioning plant for the obligatory nuclide inventory declaration. To facilitate an independent evaluation and checking of the accompanying documentation, numerical simulations are carried out. The physical and chemical properties of radioactive waste residues are used to assess data consistency and uncertainty margins, as well as to predict the long-term behavior of the radioactive waste. This is relevant for repository acceptance and safety considerations. Our new numerical approach follows a bottom-up simulation starting from the burn-up behavior of the fuel elements in the reactor core. The output of these burn-up calculations is then coupled with a program that simulates the material separation in the subsequent dissolution and extraction processes, normalized to the mass balance. Follow-up simulations of the separated reprocessing lines of a) the vitrification of highly active liquid and b) the compaction of the residual intermediate-active metallic hulls remaining after fuel pellet dissolution, end-pieces and technological waste, allow calculating expectation values for the various repository-relevant properties of either waste stream. The principles of the German product quality control of radioactive waste residues from spent fuel reprocessing have been introduced and explained, namely for heat-generating homogeneous vitrified HLW and heterogeneous compacted metallic MLW. The advantages of a complementary numerical property simulation have been made clear and examples of the benefits are presented.
We have compiled a new program suite to calculate the physical and radio-chemical properties of common nuclear waste residues. The immediate benefit is the independent assessment of radioactive inventory declarations and much-facilitated product quality control of waste residues that need to be returned to Germany and are subject to German HLW-repository requirements. Wherever possible, internationally accepted standard programs are used and embedded. The innovative coupling of burn-up calculations (SCALE) with neutron and gamma transport codes (MCNP-X) allows application to virtual waste properties. If-then-else scenarios of hypothetical waste material compositions and distributions provide valuable information on long-term nuclide property propagation under repository conditions over a very long time span. Benchmarking the program with real residue data demonstrates the power and remarkable accuracy of this numerical approach, boosting confidence in the reliability of the aforementioned applications, namely a proven tool set for on-the-spot production quality checking, data evaluation and independent verification. Moreover, using the numerical bottom-up approach helps to avoid the accumulation of fake activities that may gradually build up in a repository from so-called conservative or penalizing nuclide inventory declarations. The radioactive waste properties and the hydrolytic and chemical stability can be predicted. The interaction with invasive chemicals can be assessed and propagation scenarios can be developed from reliable and sound data and HLW properties. Hence, the appropriate design of a future HLW repository can be based upon predictable and quality-assured waste characteristics. (authors)
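The long-term propagation of a declared nuclide inventory rests, at its simplest, on first-order radioactive decay. A minimal sketch of that step (the nuclides, half-lives and initial activities here are illustrative assumptions, not data from the paper's program suite):

```python
import math

# Illustrative half-lives in years; Cs-137 and Sr-90 are common HLW heat sources
HALF_LIFE_Y = {"Cs-137": 30.08, "Sr-90": 28.79}

def activity_after(a0_bq: float, half_life_y: float, t_years: float) -> float:
    """First-order decay: A(t) = A0 * exp(-ln(2) * t / T_half)."""
    return a0_bq * math.exp(-math.log(2) * t_years / half_life_y)

inventory = {"Cs-137": 1.0e12, "Sr-90": 8.0e11}  # Bq, hypothetical declaration
for nuclide, a0 in inventory.items():
    a300 = activity_after(a0, HALF_LIFE_Y[nuclide], 300.0)
    print(f"{nuclide}: {a300:.3e} Bq after 300 y")
```

A real repository code additionally tracks daughter in-growth and chemical speciation; the point of the sketch is only the propagation kernel that any bottom-up inventory simulation iterates.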
Exploring Antarctic Land Surface Temperature Extremes Using Condensed Anomaly Databases
NASA Astrophysics Data System (ADS)
Grant, Glenn Edwin
Satellite observations have revolutionized the Earth Sciences and climate studies. However, data and imagery continue to accumulate at an accelerating rate, and efficient tools for data discovery, analysis, and quality checking lag behind. In particular, studies of long-term, continental-scale processes at high spatiotemporal resolutions are especially problematic. The traditional technique of downloading an entire dataset and using customized analysis code is often impractical or consumes too many resources. The Condensate Database Project was envisioned as an alternative method for data exploration and quality checking. The project's premise was that much of the data in any satellite dataset is unneeded and can be eliminated, compacting massive datasets into more manageable sizes. Dataset sizes are further reduced by retaining only anomalous data of high interest. Hosting the resulting "condensed" datasets in high-speed databases enables immediate availability for queries and exploration. Proof of the project's success relied on demonstrating that the anomaly database methods can enhance and accelerate scientific investigations. The hypothesis of this dissertation is that the condensed datasets are effective tools for exploring many scientific questions, spurring further investigations and revealing important information that might otherwise remain undetected. This dissertation uses condensed databases containing 17 years of Antarctic land surface temperature anomalies as its primary data. The study demonstrates the utility of the condensate database methods by discovering new information. In particular, the process revealed critical quality problems in the source satellite data. The results are used as the starting point for four case studies, investigating Antarctic temperature extremes, cloud detection errors, and the teleconnections between Antarctic temperature anomalies and climate indices. 
The results confirm the hypothesis that the condensate databases are a highly useful tool for Earth Science analyses. Moreover, the quality checking capabilities provide an important method for independent evaluation of dataset veracity.
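The condensation idea described above, keeping only anomalous observations and discarding the bulk of the dataset, can be sketched as a per-pixel z-score filter against a climatology. The threshold and toy values are assumptions for illustration, not the dissertation's actual criteria:

```python
import numpy as np

def condense(obs: np.ndarray, clim_mean: np.ndarray,
             clim_std: np.ndarray, z_thresh: float = 3.0) -> np.ndarray:
    """Return indices of observations deviating from climatology by more
    than z_thresh standard deviations -- the 'anomalies' worth storing."""
    z = (obs - clim_mean) / clim_std
    return np.flatnonzero(np.abs(z) > z_thresh)

# Toy land-surface temperatures (K) against a per-pixel climatology
obs  = np.array([250.0, 271.0, 248.0, 290.0])
mean = np.array([250.0, 252.0, 249.0, 251.0])
std  = np.array([5.0,   5.0,   5.0,   5.0])

keep = condense(obs, mean, std)
print(keep)  # only the strongly anomalous pixels survive condensation
```

Only the retained rows would be loaded into the high-speed database, which is what makes continental-scale, multi-year queries tractable.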
Repeal of Comprehensive Background Check Policies and Firearm Homicide and Suicide.
Kagawa, Rose M C; Castillo-Carniglia, Alvaro; Vernick, Jon S; Webster, Daniel; Crifasi, Cassandra; Rudolph, Kara E; Cerdá, Magdalena; Shev, Aaron; Wintemute, Garen J
2018-04-02
In 2016, firearms killed 38,658 people in the United States. Federal law requires licensed gun dealers, but not private parties, to conduct background checks on prospective firearm purchasers with the goal of preventing prohibited persons from obtaining firearms. Our objective was to estimate the effect of the repeal of comprehensive background check laws - requiring a background check for all handgun sales, not just sales by licensed dealers - on firearm homicide and suicide rates in Indiana and Tennessee. We compared age-adjusted firearm homicide and suicide rates, measured annually from 1981-2008 and 1994-2008 in Indiana and Tennessee, respectively, to rates in control groups constructed using the synthetic control method. The average rates of firearm homicide and suicide in Indiana and Tennessee following repeal were within the range of what could be expected given natural variation (differences = 0.7 firearm homicides and 0.5 firearm suicides per 100,000 residents in Indiana and 0.4 firearm homicides and 0.3 firearm suicides per 100,000 residents in Tennessee). Sensitivity analyses resulted in similar findings. We found no evidence of an association between the repeal of comprehensive background check policies and firearm homicide and suicide rates in Indiana and Tennessee. In order to understand whether comprehensive background check policies reduce firearm deaths in the United States generally, more evidence on the impact of such policies from other states is needed.
The rate of cis-trans conformation errors is increasing in low-resolution crystal structures.
Croll, Tristan Ian
2015-03-01
Cis-peptide bonds (with the exception of X-Pro) are exceedingly rare in native protein structures, yet a check for these is not currently included in the standard workflow for some common crystallography packages nor in the automated quality checks that are applied during submission to the Protein Data Bank. This appears to be leading to a growing rate of inclusion of spurious cis-peptide bonds in low-resolution structures both in absolute terms and as a fraction of solved residues. Most concerningly, it is possible for structures to contain very large numbers (>1%) of spurious cis-peptide bonds while still achieving excellent quality reports from MolProbity, leading to concerns that ignoring such errors is allowing software to overfit maps without producing telltale errors in, for example, the Ramachandran plot.
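A cis-peptide flag of the kind proposed above reduces to checking the omega dihedral (CA-C-N-CA across the peptide bond): near 0 degrees is cis, near 180 degrees is trans. A minimal sketch with numpy (the toy coordinates and the 30-degree cutoff are illustrative assumptions, not a validated criterion):

```python
import numpy as np

def dihedral(p0, p1, p2, p3) -> float:
    """Signed dihedral angle in degrees defined by four points."""
    b0, b1, b2 = p1 - p0, p2 - p1, p3 - p2
    b1 = b1 / np.linalg.norm(b1)
    v = b0 - np.dot(b0, b1) * b1      # component of b0 normal to b1
    w = b2 - np.dot(b2, b1) * b1      # component of b2 normal to b1
    x = np.dot(v, w)
    y = np.dot(np.cross(b1, v), w)
    return np.degrees(np.arctan2(y, x))

def is_cis(ca1, c1, n2, ca2, cutoff_deg: float = 30.0) -> bool:
    """Flag a peptide bond as cis when |omega| is within cutoff of 0 degrees."""
    return abs(dihedral(ca1, c1, n2, ca2)) < cutoff_deg

# Toy planar coordinates: omega = 0 deg (cis) vs omega = 180 deg (trans)
ca1, c1, n2 = np.array([0.0, 1.0, 0.0]), np.zeros(3), np.array([1.0, 0.0, 0.0])
print(is_cis(ca1, c1, n2, np.array([1.0, -1.0, 0.0])))  # True  (cis)
print(is_cis(ca1, c1, n2, np.array([1.0, 1.0, 0.0])))   # False (trans)
```

Running such a check over every non-proline residue is cheap, which is the paper's point: it could be added to standard deposition workflows at negligible cost.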
Error image aware content restoration
NASA Astrophysics Data System (ADS)
Choi, Sungwoo; Lee, Moonsik; Jung, Byunghee
2015-12-01
As the resolution of TV has significantly increased, content consumers have become increasingly sensitive to the subtlest defects in TV content. This rising standard of quality demanded by consumers has posed a new challenge in today's context, where the tape-based process has transitioned to the file-based process: the transition necessitated digitalizing old archives, a process which inevitably produces errors such as disordered pixel blocks, scattered white noise, or totally missing pixels. Unsurprisingly, detecting and fixing such errors requires a substantial amount of time and human labor to meet the standard demanded by today's consumers. In this paper, we introduce a novel, automated error restoration algorithm which can be applied to different types of classic errors by utilizing adjacent images while preserving the undamaged parts of an error image as much as possible. We tested our method on error images detected by our quality check system in the KBS (Korean Broadcasting System) video archive. We are also implementing the algorithm as a plugin for a well-known NLE (non-linear editing) system, which is a familiar tool for quality control agents.
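The authors' algorithm is not specified in the abstract; a common baseline for this kind of repair, filling masked pixels from temporally adjacent frames while leaving undamaged pixels untouched, can be sketched as follows (the frame data and the averaging strategy are assumptions, not the paper's method):

```python
import numpy as np

def restore_from_neighbors(frame: np.ndarray, mask: np.ndarray,
                           prev_frame: np.ndarray,
                           next_frame: np.ndarray) -> np.ndarray:
    """Replace damaged pixels (mask == True) with the average of the same
    pixel in the previous and next frames; keep the rest untouched."""
    out = frame.copy()
    fill = (prev_frame.astype(np.float64) + next_frame.astype(np.float64)) / 2.0
    out[mask] = fill[mask].astype(frame.dtype)
    return out

# Toy 3x3 grayscale frames: the center pixel of the middle frame is corrupted
prev_f = np.full((3, 3), 100, dtype=np.uint8)
next_f = np.full((3, 3), 110, dtype=np.uint8)
bad    = np.full((3, 3), 100, dtype=np.uint8); bad[1, 1] = 255
mask   = np.zeros((3, 3), dtype=bool);         mask[1, 1] = True

print(restore_from_neighbors(bad, mask, prev_f, next_f)[1, 1])  # 105
```

Real archive restoration must also handle motion between frames (e.g. by motion-compensated interpolation); the sketch shows only the masking principle the abstract describes.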
Mental health and quality of life in patients with chronic otitis media.
Bakir, Salih; Kinis, Vefa; Bez, Yasin; Gun, Ramazan; Yorgancilar, Ediz; Ozbay, Musa; Aguloglu, Bülent; Meric, Faruk
2013-02-01
The present study focused on the comparison of mental health and quality of life (QoL) between chronic otitis media (COM) patients and the hearing population. Patients with chronic otitis media and a healthy control group were enrolled in the study. The duration and severity of the auditory impairment were recorded. In addition to hearing loss (HL), the findings of each patient's other ear disorders (ear discharge and tinnitus) were also recorded. In both groups, psychological symptom profile and health-related QoL were evaluated and compared using a sociodemographic questionnaire, the Symptom Check List 90-Revised Form (SCL-90-R), and the Short Form-36 (SF-36). According to the SCL-90-R, somatization (p < 0.001), interpersonal sensitivity (p < 0.001), depression (p < 0.001), phobic anxiety (p < 0.001), and other subscores, as well as the global severity index score (p < 0.001), were significantly higher in the patient group than in the control group. The patients with COM reported significantly lower levels of QoL in terms of physical role difficulty (p < 0.001), general health perception (p < 0.004), social functioning (p < 0.001), and mental health (p < 0.017) than control subjects. Our results indicated that COM patients with mild or moderate HL have a poorer quality of life and more psychological problems. Psychological well-being should also be considered in the assessment of COM patients, in addition to the clinical evaluation and audiological tests.
Austin, S L; Mattick, C R; Waterhouse, P J
2015-05-01
To compare the effectiveness of distraction osteogenesis to orthognathic surgery for the treatment of maxillary hypoplasia in individuals with cleft lip and palate. A systematic review of prospective randomized, quasi-randomized or controlled clinical trials. MEDLINE, EMBASE, Scopus, Web of Science, CINAHL, CENTRAL, trial registers and grey literature were searched. Hand searching of five relevant journals was completed. Two reviewers independently completed inclusion assessment. Data extraction and risk of bias assessment were completed by a single reviewer and checked by a second reviewer. Five publications all reporting different outcomes of a single randomized controlled trial are included within the review. The quality of the evidence was low with a high risk of bias. Both surgical interventions produce significant soft tissue improvement. Horizontal relapse of the maxilla was statistically significantly greater following orthognathic surgery. There was no statistically significant difference in speech and velo-pharyngeal function between the interventions. Maxillary distraction initially lowered social self-esteem, but this improved with time resulting in higher satisfaction with life in the long term. The low quality of evidence included within the review means there is insufficient evidence to conclude whether there is a difference in effectiveness between maxillary distraction and osteotomy for the treatment of cleft-related maxillary hypoplasia. There is a need for further high-quality randomized controlled trials to allow conclusive recommendations to be made. © 2014 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Arani, Simindokht Shirvani; Ghasemi, Somaye; Samani, Ali Bahrami; Zafarghandi, Mojtaba Shamsaei
2015-01-01
Introduction: Particle-emitting, bone-seeking radiopharmaceuticals have attracted the attention of the nuclear medicine community over the last three decades for the treatment of the pain of osteoblastic metastases. The objective of this research was to produce quality-controlled 159Gd-EDTMP in order to provide a new therapeutic radiopharmaceutical for use in clinical applications. Methods: The investigation was an experimental study in which 159Gd (T1/2=18.479 h, Eβ(max)=970.60 keV, Eγ=363.55 keV (11.4%)) was produced by thermal neutron bombardment of natural Gd2O3 at the Tehran Research Reactor (TRR) for a period of 7 d at a flux of 3–4×10^13 neutrons/cm2·s. It was then quality-controlled and used to radiolabel the in-house prepared ethylenediamine tetramethylene phosphonic acid (EDTMP). Results: Complexation parameters were optimized to achieve maximum yields (>99%). The radiochemical purity of 159Gd-EDTMP was checked by radio thin-layer chromatography (RTLC). The complex was found to retain its stability at room temperature (>95%). Biodistribution studies of the complexes conducted in wild-type rats showed significant bone uptake with rapid clearance from blood. Conclusion: The properties of the 159Gd-EDTMP that was produced suggest its use as a new, efficient, palliative therapeutic agent for metastatic bone pain instead of some other current radiopharmaceuticals. PMID:26052408
Who benefit from school doctors' health checks: a prospective study of a screening method.
Nikander, Kirsi; Kosola, Silja; Kaila, Minna; Hermanson, Elina
2018-06-27
School health services provide an excellent opportunity for the detection and treatment of children at risk of later health problems. However, the optimal use of school doctors' skills and expertise remains unknown. Furthermore, no validated method for screening children for school doctors' assessments exists. The aims of the study are 1) to evaluate the benefits or harm of school doctors' routine health checks in primary school grades 1 and 5 (at ages 7 and 11) and 2) to explore whether some of the school doctors' routine health checks can be omitted using study questionnaires. This is a prospective, multicenter observational study conducted in four urban municipalities in Southern Finland, comparing the need for a school doctor's assessment to the benefit gained from it. We will recruit a random sample of 1050 children from 21 schools in primary school grades 1 and 5. Before the school doctor's health check, parents, nurses and teachers complete a study questionnaire to identify any potential concerns about each child. Doctors, blinded to the questionnaire responses, complete an electronic report after the appointment, including the instructions given and follow-up plans. The child, parent, doctor and researchers assess the benefit of the health check. The researchers compare the need for a doctor's appointment to the benefit gained from it. At one year after the health check, we will analyze the implementation of the doctors' interventions and follow-up plans. The study will increase our knowledge of the benefits of school doctors' routine health checks and assess the developed screening method. We hypothesize that targeting the health checks to the children in greatest need would increase the quality of school health services. ClinicalTrials.gov Identifier: NCT03178331, date of registration June 6th, 2017.
2015/2016 Quality Risk Management Benchmarking Survey.
Waldron, Kelly; Ramnarine, Emma; Hartman, Jeffrey
2017-01-01
This paper investigates the concept of quality risk management (QRM) maturity as it applies to the pharmaceutical and biopharmaceutical industries, using the results and analysis from a QRM benchmarking survey conducted in 2015 and 2016. QRM maturity can be defined as the effectiveness and efficiency of a quality risk management program, moving beyond "check-the-box" compliance with guidelines such as ICH Q9 Quality Risk Management, to explore the value QRM brings to business and quality operations. While significant progress has been made towards full adoption of QRM principles and practices across industry, the full benefits of QRM have not yet been fully realized. The results of the QRM Benchmarking Survey indicate that the pharmaceutical and biopharmaceutical industries are approximately halfway along the journey towards full QRM maturity. LAY ABSTRACT: The management of risks associated with medicinal product quality and patient safety are an important focus for the pharmaceutical and biopharmaceutical industries. These risks are identified, analyzed, and controlled through a defined process called quality risk management (QRM), which seeks to protect the patient from potential quality-related risks. This paper summarizes the outcomes of a comprehensive survey of industry practitioners performed in 2015 and 2016 that aimed to benchmark the level of maturity with regard to the application of QRM. The survey results and subsequent analysis revealed that the pharmaceutical and biopharmaceutical industries have made significant progress in the management of quality risks over the last ten years, and they are roughly halfway towards reaching full maturity of QRM. © PDA, Inc. 2017.
Kim, Ben Yb; Sharafoddini, Anis; Tran, Nam; Wen, Emily Y; Lee, Joon
2018-03-28
General consumers can now easily access drug information and quickly check for potential drug-drug interactions (PDDIs) through mobile health (mHealth) apps. With an aging population in Canada, more people have chronic diseases and comorbidities leading to increasing numbers of medications. The use of mHealth apps for checking PDDIs can be helpful in ensuring patient safety and empowerment. The aim of this study was to review the characteristics and quality of publicly available mHealth apps that check for PDDIs. Apple App Store and Google Play were searched to identify apps with PDDI functionality. The apps' general and feature characteristics were extracted. The Mobile App Rating Scale (MARS) was used to assess the quality. A total of 23 apps were included for the review: 12 from Apple App Store and 11 from Google Play. Only 5 of these were paid apps, with an average price of $7.19 CAD. The mean MARS score was 3.23 out of 5 (interquartile range 1.34). The mean MARS scores for the apps from Google Play and Apple App Store were not statistically different (P=.84). The information dimension was associated with the highest score (3.63), whereas the engagement dimension resulted in the lowest score (2.75). The total number of features per app, average rating, and price were significantly associated with the total MARS score. Some apps provided accurate and comprehensive information about potential adverse drug effects from PDDIs. Given the potentially severe consequences of incorrect drug information, there is a need for oversight to eliminate low-quality and potentially harmful apps. Because managing PDDIs is complex in the absence of complete information, secondary features such as medication reminder, refill reminder, medication history tracking, and pill identification could help enhance the effectiveness of PDDI apps. ©Ben YB Kim, Anis Sharafoddini, Nam Tran, Emily Y Wen, Joon Lee. Originally published in JMIR Mhealth and Uhealth (http://mhealth.jmir.org), 28.03.2018.
Kim, Ben YB; Sharafoddini, Anis; Tran, Nam; Wen, Emily Y
2018-01-01
Background General consumers can now easily access drug information and quickly check for potential drug-drug interactions (PDDIs) through mobile health (mHealth) apps. With an aging population in Canada, more people have chronic diseases and comorbidities leading to increasing numbers of medications. The use of mHealth apps for checking PDDIs can be helpful in ensuring patient safety and empowerment. Objective The aim of this study was to review the characteristics and quality of publicly available mHealth apps that check for PDDIs. Methods Apple App Store and Google Play were searched to identify apps with PDDI functionality. The apps' general and feature characteristics were extracted. The Mobile App Rating Scale (MARS) was used to assess the quality. Results A total of 23 apps were included for the review: 12 from Apple App Store and 11 from Google Play. Only 5 of these were paid apps, with an average price of $7.19 CAD. The mean MARS score was 3.23 out of 5 (interquartile range 1.34). The mean MARS scores for the apps from Google Play and Apple App Store were not statistically different (P=.84). The information dimension was associated with the highest score (3.63), whereas the engagement dimension resulted in the lowest score (2.75). The total number of features per app, average rating, and price were significantly associated with the total MARS score. Conclusions Some apps provided accurate and comprehensive information about potential adverse drug effects from PDDIs. Given the potentially severe consequences of incorrect drug information, there is a need for oversight to eliminate low-quality and potentially harmful apps. Because managing PDDIs is complex in the absence of complete information, secondary features such as medication reminder, refill reminder, medication history tracking, and pill identification could help enhance the effectiveness of PDDI apps. PMID:29592848
2010-06-01
automatically appended onto the data packet by the CC2420 transceiver. The frame control field (FCF), data sequence number, and frame check sequence (FCS)...by the CC2420 over the MAC protocol data unit (MPDU), i.e., the length field is not part of the FCS. This field is automatically generated and...verified by the CC2420 hardware when the AUTOCRC control bit is set in the MDMCTRL0 control register. If the FCS check indicates that a data
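The FCS that AUTOCRC appends is the IEEE 802.15.4 frame checksum, i.e. the ITU-T CRC-16 (polynomial x^16 + x^12 + x^5 + 1, processed bit-reflected with a zero initial value) computed over the MPDU. A minimal software equivalent of what the transceiver hardware does:

```python
def crc16_itu(data: bytes) -> int:
    """ITU-T CRC-16 as used for the IEEE 802.15.4 FCS:
    reflected polynomial 0x8408 (x^16 + x^12 + x^5 + 1), init 0x0000."""
    crc = 0x0000
    for byte in data:
        crc ^= byte
        for _ in range(8):
            crc = (crc >> 1) ^ 0x8408 if crc & 1 else crc >> 1
    return crc

mpdu = b"123456789"
print(hex(crc16_itu(mpdu)))  # 0x2189, the standard check value for this CRC
```

On receive, recomputing this CRC over the MPDU and comparing it with the trailing FCS bytes is exactly the check the hardware performs before accepting a frame.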
Temporal effects of post-fire check dam construction on soil functionality in SE Spain.
González-Romero, J; Lucas-Borja, M E; Plaza-Álvarez, P A; Sagra, J; Moya, D; De Las Heras, J
2018-06-09
Wildfire has historically been an alteration factor in Mediterranean basins. Despite Mediterranean ecosystems' high resilience, wildfire accelerates erosion and degradation processes, and also affects soil functionality by altering nutrient cycles and soil structure. In semi-arid Mediterranean basins, check dams are usually built in gullies and channels after fire as a measure against soil erosion. Although check dams have proven efficient in reducing erosion rates, studies about how they affect soil functionality are lacking. Our approach focuses on how soil functionality, defined as a combination of physico-chemical and biological indicators, is locally affected by check dam construction, and on the evolution of this effect over time. Soils were sampled at eight check dams in two semi-arid areas of SE Spain, which were affected by wildfire in 2012 and 2016. The study findings reveal that, by altering the sediment cycle and transport, check dams influence the soil's main physico-chemical and biochemical characteristics. Significant differences were found between check dam-affected zones and control zones for many indicators, such as organic matter content, electrical conductivity or enzymatic activity. According to the ANOVA results, the interaction between check-dam influence and time after fire was a crucial factor. PCA results clearly showed the check dams' influence on soil functionality. Copyright © 2018. Published by Elsevier B.V.
Method and apparatus for checking fire detectors
NASA Technical Reports Server (NTRS)
Clawson, G. T. (Inventor)
1974-01-01
A fire detector checking method and device are disclosed for nondestructively verifying the operation of installed fire detectors of the type which operate on the principle of detecting the rate of temperature rise of the ambient air to sound an alarm and/or which sound an alarm when the temperature of the ambient air reaches a preset level. The fire alarm checker uses the principle of effecting a controlled simulated alarm condition to ascertain whether or not the detector will respond. The checker comprises a hand-held instrument employing a controlled heat source, e.g., an electric lamp having a variable input, for heating at a controlled rate an enclosed mass of air in a first compartment, which air mass is then disposed about the fire detector to be checked. A second compartment of the device houses an electronic circuit to sense and adjust the temperature level and heating rate of the heat source.
Quality specification in haematology: the automated blood cell count.
Buttarello, Mauro
2004-08-02
Quality specifications for automated blood cell counts include topics that go beyond the traditional analytic stage (imprecision, inaccuracy, quality control) and extend to the pre- and post-analytic phases. In this review, pre-analytic aspects concerning the choice of anticoagulants, maximum conservation times and differences between storage at room temperature or at 4 °C are considered. For the analytic phase, goals for imprecision and bias obtained with various approaches (ratio to biologic variation, state of the art, specific clinical situations) are evaluated. For the post-analytic phase, medical review criteria (algorithm, decision limit and delta check) and the structure of the report (general part and comments), which constitutes the formal act through which a laboratory communicates with clinicians, are considered. K2EDTA is considered the anticoagulant of choice for automated cell counts. Regarding storage, specimens should be analyzed as soon as possible. Storage at 4 °C may stabilize specimens from 24 to 72 h when a complete blood count (CBC) and differential leucocyte count (DLC) are performed. For precision, analytical goals based on the state of the art are acceptable, while for bias this is satisfactory only for some parameters. In haematology, quality specifications for the pre-analytic and analytic phases are important, but the review criteria and the quality of the report play a central role in assuring a definite clinical value.
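Of the review criteria named above, the delta check is the most algorithmic: compare a patient's current result with the previous one and flag the sample when the change exceeds a limit. A minimal sketch (the analyte and the limit values are illustrative assumptions, not recommendations from the review):

```python
def delta_check(current: float, previous: float,
                abs_limit: float, pct_limit: float) -> bool:
    """Flag a result when the change from the previous value exceeds
    either an absolute limit or a percentage limit."""
    delta = abs(current - previous)
    pct = 100.0 * delta / previous if previous else float("inf")
    return delta > abs_limit or pct > pct_limit

# Hypothetical haemoglobin delta check: flag if change > 2 g/dL or > 15%
print(delta_check(9.0, 13.5, abs_limit=2.0, pct_limit=15.0))   # True: flagged
print(delta_check(13.0, 13.5, abs_limit=2.0, pct_limit=15.0))  # False: passes
```

A flagged result typically triggers a manual review or a specimen-identification check before the report is released, which is how the delta check supports the post-analytic phase.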
Bonacim, Carlos Alberto Grespam; Salgado, André Luís; Girioli, Lumila Souza; de Araujo, Adriana Maria Procópio
2011-05-01
This work focuses on a discussion about the extent to which the level of organizational structure interferes in the internal control practices of non-governmental organizations (NGOs), especially those related to health. The objective of this work was to observe the efficiency of the internal control tests applied within the organizational structure of the Foundation for Cancer Research, Prevention and Care, checking the reliability of the accounting records and operational controls. A case study in a third-sector health organization was the chosen methodology. The case study involved company interviews and the analysis of confidential reports. After an evaluation of the organizational structure (of the relations between officials and volunteers) and the application of evaluation procedures on the quality of the internal controls, the extent to which the organizational structure interferes with the internal control practices of the hospital was assessed. It was revealed that there are structured mechanisms of control in the institution; however, the implementation of these controls is performed inadequately. It was further detected that the level of the organizational structure does indeed interfere in internal control practices at the entity.
Check valve installation in pilot operated relief valve prevents reverse pressurization
NASA Technical Reports Server (NTRS)
Oswalt, L.
1966-01-01
Two check valves prevent reverse flow through pilot-operated relief valves of differential-area piston design. The valves control pressure flow to ensure that the piston dome pressure is always at least as great as the main relief valve discharge pressure.