Sample records for design quality checks

  1. A rigorous approach to self-checking programming

    NASA Technical Reports Server (NTRS)

    Hua, Kien A.; Abraham, Jacob A.

    1986-01-01

    Self-checking programming is shown to be an effective concurrent error detection technique. The reliability of a self-checking program, however, relies on the quality of its assertion statements. A self-checking program written without formal guidelines could provide poor coverage of errors. A constructive technique for self-checking programming is presented. A Structured Program Design Language (SPDL) suitable for self-checking software development is defined. A set of formal rules was also developed that allows the transformation of SPDL designs into self-checking designs to be done in a systematic manner.
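
    The core mechanism this abstract describes, executable assertions that check a computation concurrently with its normal execution, can be sketched generically in Python (an illustrative analogue, not the paper's SPDL notation or transformation rules):

```python
from collections import Counter

def self_checking_sort(xs):
    """Sort with executable assertions serving as concurrent error detectors."""
    result = sorted(xs)
    # Assertion 1: the output is in non-decreasing order.
    assert all(a <= b for a, b in zip(result, result[1:])), "order check failed"
    # Assertion 2: the output is a permutation of the input (nothing lost or invented).
    assert Counter(result) == Counter(xs), "permutation check failed"
    return result
```

    The quality of such a program hinges on the assertions: here the two checks together fully characterize sorting, which is exactly the coverage question the paper formalizes.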

  2. Ensuring the Quality of Data Packages in the LTER Network Provenance Aware Synthesis Tracking Architecture Data Management System and Archive

    NASA Astrophysics Data System (ADS)

    Servilla, M. S.; O'Brien, M.; Costa, D.

    2013-12-01

    Considerable ecological research performed today occurs through the analysis of data downloaded from various repositories and archives, often resulting in derived or synthetic products generated by automated workflows. These data are only meaningful for research if they are well documented by metadata; otherwise, semantic or data-type errors can arise in interpretation or processing. The Long Term Ecological Research (LTER) Network now screens all data packages entering its long-term archive to ensure that each package contains metadata that is complete, of high quality, and accurately describes the structure of its associated data entity, and that the data are structurally congruent with the metadata. Screening occurs prior to the upload of a data package into the Provenance Aware Synthesis Tracking Architecture (PASTA) data management system through a series of quality checks, thus preventing ambiguously or incorrectly documented data packages from entering the system. The quality checks within PASTA are designed to work specifically with the Ecological Metadata Language (EML), the metadata standard adopted by the LTER Network to describe data generated by their 26 research sites. Each quality check is codified in Java as part of the ecological community-supported Data Manager Library, which is a resource of the EML specification and is used as a component of the PASTA software stack. Quality checks test for metadata quality, data integrity, or metadata-data congruence. Quality checks are further classified as either conditional or informational. Conditional checks issue a 'valid', 'warning', or 'error' response. Only an 'error' response blocks the data package from upload into PASTA. Informational checks only provide descriptive content pertaining to a particular facet of the data package. Quality checks are designed by a group of LTER information managers and reviewed by the LTER community before being deployed into PASTA. A total of 32 quality checks have been deployed to date. Quality checks can be customized through a configurable template, which includes turning checks 'on' or 'off' and setting the severity of conditional checks. This feature is important to other potential users of the Data Manager Library who wish to configure its quality checks in accordance with the standards of their community. Executing the complete set of quality checks produces a report that describes the result of each check. The report is an XML document that is stored by PASTA for future reference.
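
    The check taxonomy described above (conditional checks issuing 'valid'/'warning'/'error', an 'error' blocking upload, and a template that turns checks on or off) might be sketched as follows; the names and data shapes here are hypothetical and are not the Data Manager Library API:

```python
from dataclasses import dataclass

@dataclass
class CheckResult:
    name: str
    kind: str      # "conditional" or "informational"
    status: str    # "valid", "warning", "error", or "info"
    message: str

def column_congruence(package):
    # A metadata-data congruence check: declared columns must match the data.
    declared = len(package["metadata"]["columns"])
    actual = len(package["data"][0])
    return CheckResult("column_congruence", "conditional",
                       "valid" if declared == actual else "error",
                       f"declared {declared} columns, found {actual}")

CHECKS = {"column_congruence": column_congruence}

def run_checks(package, template):
    """Run enabled checks; any 'error' result blocks the package from upload."""
    results = []
    for name, check in CHECKS.items():
        cfg = template.get(name, {"enabled": True})   # configurable template
        if not cfg["enabled"]:
            continue
        results.append(check(package))
    blocked = any(r.status == "error" for r in results)
    return results, blocked
```

    A real deployment would serialize the results list as the XML quality report the abstract mentions.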

  3. Visplause: Visual Data Quality Assessment of Many Time Series Using Plausibility Checks.

    PubMed

    Arbesser, Clemens; Spechtenhauser, Florian; Muhlbacher, Thomas; Piringer, Harald

    2017-01-01

    Trends like decentralized energy production lead to an exploding number of time series from sensors and other sources that need to be assessed regarding their data quality (DQ). While the identification of DQ problems for such routinely collected data is typically based on existing automated plausibility checks, an efficient inspection and validation of check results for hundreds or thousands of time series is challenging. The main contribution of this paper is the validated design of Visplause, a system to support an efficient inspection of DQ problems for many time series. The key idea of Visplause is to utilize meta-information concerning the semantics of both the time series and the plausibility checks for structuring and summarizing results of DQ checks in a flexible way. Linked views enable users to inspect anomalies in detail and to generate hypotheses about possible causes. The design of Visplause was guided by goals derived from a comprehensive task analysis with domain experts in the energy sector. We reflect on the design process by discussing design decisions at four stages and we identify lessons learned. We also report feedback from domain experts after using Visplause for a period of one month. This feedback suggests significant efficiency gains for DQ assessment, increased confidence in the DQ, and the applicability of Visplause to summarize indicators also outside the context of DQ.

  4. Algorithm design for automated transportation photo enforcement camera image and video quality diagnostic check modules

    NASA Astrophysics Data System (ADS)

    Raghavan, Ajay; Saha, Bhaskar

    2013-03-01

    Photo enforcement devices for traffic rules such as red lights, tolls, stops, and speed limits are increasingly being deployed in cities and counties around the world to ensure smooth traffic flow and public safety. These are typically unattended fielded systems, and so it is important to periodically check them for potential image/video quality problems that might interfere with their intended functionality. There is interest in automating such checks to reduce the operational overhead and human error involved in manually checking large camera device fleets. Examples of problems affecting such camera devices include exposure issues, focus drifts, obstructions, misalignment, download errors, and motion blur. Furthermore, in some cases, in addition to the sub-algorithms for individual problems, one also has to carefully design the overall algorithm and logic for checking for and accurately classifying these individual problems. Some of these issues can occur in tandem or have the potential to be confused for each other by automated algorithms. Examples include camera misalignment that can cause some scene elements to go out of focus for wide-area scenes, or download errors that can be misinterpreted as an obstruction. Therefore, the sequence in which the sub-algorithms are utilized is also important. This paper presents an overview of these problems along with no-reference and reduced-reference image and video quality solutions to detect and classify such faults.
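
    The ordering concern can be made concrete with a toy pipeline in which earlier checks rule out faults that could confound later ones; the thresholds and fault models are invented for illustration (images are flattened grayscale pixel lists), not the paper's algorithms:

```python
# Hypothetical ordered diagnostics: a download error must be ruled out before
# judging obstruction, since an empty frame would otherwise read as "dark".
CHECKS = [
    ("download_error", lambda img: img is None or len(img) == 0),
    ("obstruction",    lambda img: sum(img) / len(img) < 10),    # mostly dark
    ("exposure",       lambda img: sum(img) / len(img) > 245),   # mostly saturated
]

def diagnose(img):
    """Run checks in a fixed order so earlier faults are not misread as later ones."""
    for fault, detect in CHECKS:
        if detect(img):
            return fault
    return "ok"
```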

  5. [Evaluation of the quality of clinical practice guidelines published in the Annales de Biologie Clinique with the help of the EFLM checklist].

    PubMed

    Wils, Julien; Fonfrède, Michèle; Augereau, Christine; Watine, Joseph

    2014-01-01

    Several tools are available to help evaluate the quality of clinical practice guidelines (CPG). The AGREE instrument (Appraisal of guidelines for research & evaluation) is the most consensual tool but it has been designed to assess CPG methodology only. The European federation of laboratory medicine (EFLM) recently designed a checklist dedicated to laboratory medicine which is supposed to be comprehensive and which therefore makes it possible to evaluate more thoroughly the quality of CPG in laboratory medicine. In the present work we test the comprehensiveness of this checklist on a sample of CPG written in French and published in Annales de biologie clinique (ABC). Thus we show that some work remains to be done before a truly comprehensive checklist is designed. We also show that there is some room for improvement in the CPG published in ABC, for example regarding the fact that some of these CPG do not provide any information about allowed durations of transport and of storage of biological samples before analysis, or about standards of minimal analytical performance, or about the sensitivities or the specificities of the recommended tests.

  6. 10 CFR 63.142 - Quality assurance criteria.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ...) Design control. (1) DOE shall establish measures to assure that applicable regulatory requirements and... control design interfaces and for coordination among participating design organizations. These measures... control measures must provide for verifying or checking the adequacy of design, such as by the performance...

  7. 10 CFR 63.142 - Quality assurance criteria.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ...) Design control. (1) DOE shall establish measures to assure that applicable regulatory requirements and... control design interfaces and for coordination among participating design organizations. These measures... control measures must provide for verifying or checking the adequacy of design, such as by the performance...

  8. 10 CFR 63.142 - Quality assurance criteria.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ...) Design control. (1) DOE shall establish measures to assure that applicable regulatory requirements and... control design interfaces and for coordination among participating design organizations. These measures... control measures must provide for verifying or checking the adequacy of design, such as by the performance...

  9. 10 CFR 63.142 - Quality assurance criteria.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ...) Design control. (1) DOE shall establish measures to assure that applicable regulatory requirements and... control design interfaces and for coordination among participating design organizations. These measures... control measures must provide for verifying or checking the adequacy of design, such as by the performance...

  10. Assessment of Petrological Microscopes.

    ERIC Educational Resources Information Center

    Mathison, Charter Innes

    1990-01-01

    Presented is a set of procedures designed to check the design, ergonomics, illumination, function, optics, accessory equipment, and image quality of a microscope being considered for purchase. Functions for use in a petrology or mineralogy laboratory are stressed. (CW)

  11. Plan-Do-Check-Act and the Management of Institutional Research. AIR 1992 Annual Forum Paper.

    ERIC Educational Resources Information Center

    McLaughlin, Gerald W.; Snyder, Julie K.

    This paper describes the application of a Total Quality Management strategy called Plan-Do-Check-Act (PDCA) to the projects and activities of an institutional research office at the Virginia Polytechnic Institute and State University. PDCA is a cycle designed to facilitate incremental continual improvement through change. The specific steps are…

  12. A source-channel coding approach to digital image protection and self-recovery.

    PubMed

    Sarreshtedari, Saeed; Akhaee, Mohammad Ali

    2015-07-01

    Watermarking algorithms have been widely applied to the field of image forensics recently. One such forensic application is the protection of images against tampering. For this purpose, we need to design a watermarking algorithm fulfilling two purposes in case of image tampering: 1) detecting the tampered area of the received image and 2) recovering the lost information in the tampered zones. State-of-the-art techniques accomplish these tasks using watermarks consisting of check bits and reference bits. Check bits are used for tampering detection, whereas reference bits carry information about the whole image. The problem of recovering the lost reference bits still stands. This paper aims to show that, with the tampering location known, image tampering can be modeled and dealt with as an erasure error. Therefore, an appropriate design of channel code can protect the reference bits against tampering. In the proposed method, the total watermark bit-budget is dedicated to three groups: 1) source encoder output bits; 2) channel code parity bits; and 3) check bits. In the watermark embedding phase, the original image is source coded and the output bit stream is protected using an appropriate channel encoder. For image recovery, erasure locations detected by the check bits help the channel erasure decoder to retrieve the original source-encoded image. Experimental results show that the proposed scheme significantly outperforms recent techniques in terms of image quality for both the watermarked and recovered images. The watermarked image quality gain is achieved by spending less bit-budget on the watermark, while image recovery quality is considerably improved as a consequence of the consistent performance of the designed source and channel codes.
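
    The erasure framing can be illustrated with a toy scheme: per-block digests play the role of check bits (locating the tampered block), and a single XOR parity block stands in for the paper's channel code, able to recover one erased block. This is a simplified sketch under those assumptions, not the authors' construction:

```python
import hashlib
from functools import reduce

def xor(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

def protect(blocks):
    """Digests act as check bits; one XOR parity block is the 'channel code'."""
    parity = reduce(xor, blocks)
    checks = [hashlib.sha256(b).hexdigest() for b in blocks]
    return parity, checks

def recover(blocks, parity, checks):
    """Treat the tampered block, located by the check bits, as an erasure and refill it."""
    bad = [i for i, b in enumerate(blocks)
           if hashlib.sha256(b).hexdigest() != checks[i]]
    if len(bad) != 1:
        return blocks  # no erasure, or more erasures than this toy code can correct
    i = bad[0]
    others = [b for j, b in enumerate(blocks) if j != i]
    blocks = list(blocks)
    blocks[i] = reduce(xor, others + [parity])  # XOR of the rest restores the erasure
    return blocks
```

    The key point mirrors the paper: once the check bits pin down *where* the damage is, recovery is an erasure-decoding problem rather than an error-decoding one, which is strictly easier.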

  13. Design and performance of daily quality assurance system for carbon ion therapy at NIRS

    NASA Astrophysics Data System (ADS)

    Saotome, N.; Furukawa, T.; Hara, Y.; Mizushima, K.; Tansho, R.; Saraya, Y.; Shirai, T.; Noda, K.

    2017-09-01

    At the National Institute of Radiological Sciences (NIRS), we have been commissioning a rotating-gantry system for carbon-ion radiotherapy. This rotating gantry can transport heavy ions at 430 MeV/u to an isocenter with irradiation angles of ±180°, so that the tumor can be irradiated from any direction around the patient. A three-dimensional pencil-beam scanning irradiation system equipped with the rotating gantry enables the optimal use of the physical characteristics of carbon ions to provide accurate treatment. To ensure treatment quality with such a complex system, the calibration of the primary dose monitor, output check, range check, dose rate check, machine safety check, and some mechanical tests should be performed efficiently. For this purpose, we have developed a measurement system dedicated to quality assurance (QA) of this gantry system: the Daily QA system. The system consists of an ionization chamber system and a scintillator system. The ionization chamber system is used for the calibration of the primary dose monitor, the output check, and the dose rate check, and the scintillator system is used for the range, isocenter, and gantry-angle checks. The performance of the Daily QA system was verified by a beam test. The stability of the output was within 0.5%, and the range was within 0.5 mm. The coincidence of the coordinates between the patient-positioning system and the irradiation system was verified using the Daily QA system. Our present findings verified that the new Daily QA system for a rotating gantry is capable of verifying the irradiation system with sufficient accuracy.
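
    A minimal sketch of such tolerance-based daily checks, assuming pass/fail limits equal to the stability figures reported (0.5% on output, 0.5 mm on range); the data shapes are hypothetical:

```python
def daily_qa(measured, baseline):
    """Pass/fail daily checks against assumed tolerances (0.5% output, 0.5 mm range)."""
    results = {
        "output": abs(measured["output"] - baseline["output"]) / baseline["output"] <= 0.005,
        "range_mm": abs(measured["range_mm"] - baseline["range_mm"]) <= 0.5,
    }
    return results, all(results.values())
```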

  14. 40 CFR 75.60 - General provisions.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... data and results of all pretest, post-test, and post-run quality-assurance checks of the reference..., application forms, designated representative signature, and petition-related test results in hardcopy to the... completing the test or within 15 days of receiving the request, whichever is later. The designated...

  15. Objectivity of the Subjective Quality: Convergence on Competencies Expected of Doctoral Graduates

    ERIC Educational Resources Information Center

    Kariyana, Israel; Sonn, Reynold A.; Marongwe, Newlin

    2017-01-01

    This study assessed the competencies expected of doctoral graduates. Twelve purposefully sampled education experts provided the data. A case study design within a qualitative approach was adopted. Data were gathered through interviews and thematically analysed. Member checking ensured data trustworthiness. Factors affecting the quality of a…

  16. Safe and effective nursing shift handover with NURSEPASS: An interrupted time series.

    PubMed

    Smeulers, Marian; Dolman, Christine D; Atema, Danielle; van Dieren, Susan; Maaskant, Jolanda M; Vermeulen, Hester

    2016-11-01

    Implementation of a locally developed, evidence-based nursing shift handover blueprint with a bedside-safety-check, to determine the effect size on quality of handover. A mixed methods design with: (1) an interrupted time series analysis to determine the effect on handover quality in six domains; (2) descriptive statistics to analyze the discrepancies intercepted by the bedside-safety-check; (3) evaluation sessions to gather experiences with the new handover process. We observed a continued trend of improvement in handover quality and a significant improvement in two domains of handover: organization/efficiency and contents. The bedside-safety-check successfully identified discrepancies on drains, intravenous medications, bandages or general condition and was highly appreciated. Use of the nursing shift handover blueprint showed promising results on effectiveness as well as on feasibility and acceptability. However, to enable long-term measurement of effectiveness, evaluation with large-scale interrupted time series or statistical process control is needed. Copyright © 2016 Elsevier Inc. All rights reserved.

  17. Spectrum analysis on quality requirements consideration in software design documents.

    PubMed

    Kaiya, Haruhiko; Umemura, Masahiro; Ogata, Shinpei; Kaijiri, Kenji

    2013-12-01

    Software quality requirements defined in the requirements analysis stage should be implemented in the final products, such as source codes and system deployment. To guarantee this meta-requirement, quality requirements should be considered in the intermediate stages, such as the design stage or the architectural definition stage. We propose a novel method for checking whether quality requirements are considered in the design stage. In this method, a technique called "spectrum analysis for quality requirements" is applied not only to requirements specifications but also to design documents. The technique enables us to derive the spectrum of a document, and quality requirements considerations in the document are numerically represented in the spectrum. We can thus objectively identify whether the considerations of quality requirements in a requirements document are adapted to its design document. To validate the method, we applied it to commercial software systems with the help of a supporting tool, and we confirmed that the method worked well.
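
    One simplified reading of a "quality requirements spectrum" is a normalized frequency vector of quality-characteristic terms, derived from each document and then compared between the requirements specification and its design document. The term list and cosine comparison below are assumptions for illustration, not the paper's exact technique:

```python
import math
import re

QUALITY_TERMS = ["performance", "security", "usability", "reliability"]  # illustrative

def spectrum(text):
    """Normalized frequency of quality-characteristic terms in a document."""
    words = re.findall(r"[a-z]+", text.lower())
    counts = [words.count(t) for t in QUALITY_TERMS]
    total = sum(counts) or 1
    return [c / total for c in counts]

def similarity(req_text, design_text):
    """Cosine similarity of the two spectra: 1.0 means the design document
    emphasizes quality requirements in the same proportions as the requirements."""
    a, b = spectrum(req_text), spectrum(design_text)
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a)) or 1
    nb = math.sqrt(sum(x * x for x in b)) or 1
    return dot / (na * nb)
```

    A low similarity would flag that considerations present in the requirements document were dropped or distorted at the design stage, which is the meta-requirement the paper checks.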

  18. Improving NAVFAC's total quality management of construction drawings with CLIPS

    NASA Technical Reports Server (NTRS)

    Antelman, Albert

    1991-01-01

    A diagnostic expert system to improve the quality of Naval Facilities Engineering Command (NAVFAC) construction drawings and specifications is described. C Language Integrated Production System (CLIPS) and computer-aided design layering standards are used in an expert system to check and coordinate construction drawings and specifications to eliminate errors and omissions.

  19. Verifying Architectural Design Rules of the Flight Software Product Line

    NASA Technical Reports Server (NTRS)

    Ganesan, Dharmalingam; Lindvall, Mikael; Ackermann, Chris; McComas, David; Bartholomew, Maureen

    2009-01-01

    This paper presents experiences of verifying architectural design rules of the NASA Core Flight Software (CFS) product line implementation. The goal of the verification is to check whether the implementation is consistent with the CFS architectural rules derived from the developer's guide. The results indicate that consistency checking helps a) identifying architecturally significant deviations that were eluded during code reviews, b) clarifying the design rules to the team, and c) assessing the overall implementation quality. Furthermore, it helps connecting business goals to architectural principles, and to the implementation. This paper is the first step in the definition of a method for analyzing and evaluating product line implementations from an architecture-centric perspective.
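
    Consistency checking of an implementation against architectural design rules can be sketched as matching extracted dependencies against an allowed-dependency map; the layer names and rules below are hypothetical, not the CFS developer's-guide rules:

```python
# Hypothetical layering rules: each layer may depend only on the listed layers.
RULES = {"app": {"lib", "os_api"}, "lib": {"os_api"}, "os_api": set()}

def violations(dependencies):
    """Return (source_layer, target_layer) pairs that break the architectural rules.

    `dependencies` would come from static analysis of the code base, e.g. an
    extracted call graph mapped onto layers.
    """
    return [(src, dst) for src, dst in dependencies
            if src != dst and dst not in RULES.get(src, set())]
```

    The value of such a check, as the paper reports, is that it surfaces architecturally significant deviations that code reviews tend to miss, because reviewers see individual changes rather than the whole dependency structure.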

  20. Class Model Development Using Business Rules

    NASA Astrophysics Data System (ADS)

    Skersys, Tomas; Gudas, Saulius

    New developments in the area of computer-aided system engineering (CASE) greatly improve processes of the information systems development life cycle (ISDLC). Much effort is put into quality improvement issues, but IS development projects still suffer from the poor quality of models during the system analysis and design cycles. To some degree, the quality of models that are developed using CASE tools can be assured using various automated model comparison and syntax checking procedures. It is also reasonable to check these models against business domain knowledge, but the domain knowledge stored in the repository of a CASE tool (enterprise model) is insufficient (Gudas et al. 2004). Involvement of business domain experts in these processes is complicated because non-IT people often find it difficult to understand models that were developed by IT professionals using some specific modeling language.

  1. SU-F-T-32: Evaluation of the Performance of a Multiple-Array-Diode Detector for Quality Assurance Tests in High-Dose-Rate Brachytherapy with Ir-192 Source

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harpool, K; De La Fuente Herman, T; Ahmad, S

    Purpose: To evaluate the performance of a two-dimensional (2D) array-diode-detector for geometric and dosimetric quality assurance (QA) tests of high-dose-rate (HDR) brachytherapy with an Ir-192 source. Methods: A phantom setup was designed that encapsulated a 2D array-diode-detector (MapCheck2) and a catheter for the HDR brachytherapy Ir-192 source. This setup was used to perform both geometric and dosimetric quality assurance for the HDR Ir-192 source. The geometric tests included: (a) measurement of the position of the source and (b) spacing between different dwell positions. The dosimetric tests included: (a) linearity of output with time, (b) end effect, and (c) relative dose verification. The 2D dose distribution measured with MapCheck2 was used to perform these tests. The results of MapCheck2 were compared with the corresponding quality assurance tests performed with Gafchromic film and a well ionization chamber. Results: The position of the source and the spacing between different dwell positions were reproducible within 1 mm accuracy by measuring the position of maximal dose using MapCheck2, in contrast to the film, which showed a blurred image of the dwell positions due to limited film sensitivity to irradiation. The linearity of the dose with dwell times measured with MapCheck2 was superior to the linearity measured with the ionization chamber due to the higher signal-to-noise ratio of the diode readings. MapCheck2 provided a more accurate measurement of the end effect, with uncertainty < 1.5% in comparison with the ionization chamber uncertainty of 3%. Although MapCheck2 does not provide an absolute calibration of the source activity, it provides an accurate tool for relative dose verification in HDR brachytherapy. Conclusion: The 2D array-diode-detector provides a practical, compact and accurate tool to perform quality assurance for HDR brachytherapy with an Ir-192 source. The diodes in MapCheck2 have high radiation sensitivity and linearity that is superior to the Gafchromic films and ionization chambers used for geometric and dosimetric QA in HDR brachytherapy, respectively.

  2. [Design and implementation of data checking system for Chinese materia medica resources survey].

    PubMed

    Wang, Hui; Zhang, Xiao-Bo; Ge, Xiao-Guang; Jin, Yan; Jing, Zhi-Xian; Qi, Yuan-Hua; Wang, Ling; Zhao, Yu-Ping; Wang, Wei; Guo, Lan-Ping; Huang, Lu-Qi

    2017-11-01

    The Chinese materia medica resources (CMMR) national survey information management system has collected a large amount of data. To help with data rechecking, reduce the internal workload, and improve the rechecking of survey data at the provincial and county levels, the National Resource Center for Chinese Materia Medica designed a data checking system for the CMMR survey based on J2EE technology, the Java language, and an Oracle database, in accordance with an SOA framework. The system provides single-record checks, check scoring, content management, and both manual and automatic checking of census data across nine aspects (census implementation plans, key research information, general survey information, medicinal material cultivation information, germplasm resource information, medicinal material information, market research information, traditional knowledge information, and specimen information), covering 20 classes and 175 indicators in terms of both quantity and quality. The established system assists in verifying data consistency and accuracy and prompts county survey teams to complete data entry and arrangement in a timely manner, so as to improve the integrity, consistency, and accuracy of the survey data and ensure that the data are effective and available, laying a foundation for accurate data support for the summary, display, and sharing of the national CMMR survey results. Copyright© by the Chinese Pharmaceutical Association.
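
    The described completeness (quantity) and validity (quality) checks might look like the following sketch; the field names and ranges are hypothetical placeholders, not the system's actual 175 indicators:

```python
REQUIRED = ["species", "location", "collector", "date"]  # hypothetical indicator fields

def check_record(rec):
    """Automatic checks in both dimensions: quantity (completeness) and quality (validity)."""
    issues = ["missing: " + f for f in REQUIRED if not rec.get(f)]
    lat = rec.get("latitude")
    if lat is not None and not -90 <= lat <= 90:
        issues.append("latitude out of range")
    return issues
```

    In the real system, records that fail automatic checks would be pushed back to the county survey teams for correction, which is the feedback loop the abstract describes.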

  3. Flow Control and Design Assessment for Drainage System at McMurdo Station, Antarctica

    DTIC Science & Technology

    2014-11-24

    BMP, Best Management Practice; CASQUA, California Storm Water Quality Task Force; CRREL, Cold Regions Research and Engineering Laboratory; DS... The California Storm Water Quality Task Force (CASQUA 1993) defines a sediment basin as “a pond created by excavation or constructing an em...” British Standards Institution. California Storm Water Quality Task Force (CASQUA). 1993. ESC41: Check Dams. In Stormwater Best Management Practices

  4. Direct to consumer advertising via the Internet, a study of hip resurfacing.

    PubMed

    Ogunwale, B; Clarke, J; Young, D; Mohammed, A; Patil, S; Meek, R M D

    2009-02-01

    With increased use of the internet for health information and direct-to-consumer advertising from medical companies, there is concern about the quality of information available to patients. The aim of this study was to examine the quality of health information on the internet for hip resurfacing. An assessment tool was designed to measure quality of information. Websites were measured on credibility of source; usability; currentness of the information; content relevance; content accuracy/completeness; and disclosure/bias. Each website assessed was given a total score based on the number of scores achieved in the above categories. Websites were further analysed by author, geographical origin, and possession of an independent credibility check. There was positive correlation between the overall score for the website and the score of each website in each assessment category. Websites by implant companies, doctors and hospitals scored poorly. Websites with an independent credibility check such as Health on the Net (HoN) scored twice the total scores of websites without one. Like other internet health websites, the quality of information on hip resurfacing websites is variable. This study highlights methods by which to assess the quality of health information on the internet and advocates that patients should look for a statement of an "independent credibility check" when searching for information on hip resurfacing.

  5. 7 CFR 58.243 - Checking quality.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 3 2010-01-01 2010-01-01 false Checking quality. 58.243 Section 58.243 Agriculture... Procedures § 58.243 Checking quality. All milk, milk products and dry milk products shall be subject to inspection and analysis by the dairy plant for quality and condition throughout each processing operation...

  6. Quality monitored distributed voting system

    DOEpatents

    Skogmo, David

    1997-01-01

    A quality monitoring system can detect certain system faults and fraud attempts in a distributed voting system. The system uses decoy voters to cast predetermined check ballots. Absent check ballots can indicate system faults. Altered check ballots can indicate attempts at counterfeiting votes. The system can also cast check ballots at predetermined times to provide another check on the distributed voting system.
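
    The decoy-ballot mechanism can be sketched directly: absent check ballots suggest system faults, while altered ones suggest counterfeiting. A minimal illustration under those assumptions, not the patented system's actual protocol:

```python
def audit(cast_ballots, check_ballots):
    """Compare received ballots from decoy voters against the predetermined set.

    cast_ballots:  {voter_id: ballot} as received by the tally system
    check_ballots: {decoy_voter_id: expected_ballot} known in advance
    """
    faults, fraud = [], []
    for voter, expected in check_ballots.items():
        got = cast_ballots.get(voter)
        if got is None:
            faults.append(voter)      # absent check ballot: possible system fault
        elif got != expected:
            fraud.append(voter)       # altered check ballot: possible counterfeiting
    return faults, fraud
```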

  7. Design of experiments enhanced statistical process control for wind tunnel check standard testing

    NASA Astrophysics Data System (ADS)

    Phillips, Ben D.

    The current wind tunnel check standard testing program at NASA Langley Research Center is focused on increasing data quality, uncertainty quantification and overall control and improvement of wind tunnel measurement processes. The statistical process control (SPC) methodology employed in the check standard testing program allows for the tracking of variations in measurements over time as well as an overall assessment of facility health. While the SPC approach can and does provide researchers with valuable information, it has certain limitations in the areas of process improvement and uncertainty quantification. It is thought by utilizing design of experiments methodology in conjunction with the current SPC practices that one can efficiently and more robustly characterize uncertainties and develop enhanced process improvement procedures. In this research, methodologies were developed to generate regression models for wind tunnel calibration coefficients, balance force coefficients and wind tunnel flow angularities. The coefficients of these regression models were then tracked in statistical process control charts, giving a higher level of understanding of the processes. The methodology outlined is sufficiently generic such that this research can be applicable to any wind tunnel check standard testing program.
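
    Tracking a regression coefficient on a control chart can be sketched as follows; a production SPC implementation would estimate sigma from moving ranges and apply additional run rules, so treat this as a minimal individuals-chart illustration:

```python
import statistics

def control_limits(history):
    """Individuals-chart limits: mean +/- 3 sigma of past coefficient values."""
    mean = statistics.mean(history)
    sigma = statistics.stdev(history)
    return mean - 3 * sigma, mean + 3 * sigma

def out_of_control(history, new_value):
    """Flag a newly fitted coefficient that falls outside the control limits."""
    lo, hi = control_limits(history)
    return not lo <= new_value <= hi
```

    The approach in the paper tracks fitted model coefficients rather than raw measurements, so a single out-of-control signal summarizes a whole calibration run.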

  8. Telecommunications Systems Career Ladder, AFSC 307XO.

    DTIC Science & Technology

    1981-01-01

    standard test tone levels perform impulse noise tests make in-service or out-of-service quality checks on composite signal transmission levels Even...service or out-of-service quality control (QC) reports maintain trouble and restoration record forms (DD Form 1443) direct circuit or system checks...include: perform fault isolation on analog circuits make in-service or out-of-service quality checks on voice frequency carrier telegraph (VFCT) terminals

  9. Quality monitored distributed voting system

    DOEpatents

    Skogmo, D.

    1997-03-18

    A quality monitoring system can detect certain system faults and fraud attempts in a distributed voting system. The system uses decoy voters to cast predetermined check ballots. Absent check ballots can indicate system faults. Altered check ballots can indicate attempts at counterfeiting votes. The system can also cast check ballots at predetermined times to provide another check on the distributed voting system. 6 figs.

  10. Helical tomotherapy quality assurance with ArcCHECK.

    PubMed

    Chapman, David; Barnett, Rob; Yartsev, Slav

    2014-01-01

    To design a quality assurance (QA) procedure for helical tomotherapy that measures multiple beam parameters with 1 delivery and uses a rotating gantry to simulate treatment conditions. The customized QA procedure was preprogrammed on the tomotherapy operator station. The dosimetry measurements were performed using an ArcCHECK diode array and an A1SL ion chamber inserted in the central holder. The ArcCHECK was positioned 10cm above the isocenter so that the 21-cm diameter detector array could measure the 40-cm wide tomotherapy beam. During the implementation of the new QA procedure, separate comparative measurements were made using ion chambers in both liquid and solid water, the tomotherapy onboard detector array, and a MapCHECK diode array for a period of 10 weeks. There was good agreement (within 1.3%) for the beam output and cone ratio obtained with the new procedure and the routine QA measurements. The measured beam energy was comparable (0.3%) to solid water measurement during the 10-week evaluation period, excluding 2 of the 10 measurements with unusually high background. The symmetry reading was similarly compromised for those 2 weeks, and on the other weeks, it deviated from the solid water reading by ~2.5%. The ArcCHECK phantom presents a suitable alternative for performing helical tomotherapy QA, provided the background is collected properly. The proposed weekly procedure using ArcCHECK and water phantom makes the QA process more efficient. Copyright © 2014 American Association of Medical Dosimetrists. Published by Elsevier Inc. All rights reserved.

  11. Helical tomotherapy quality assurance with ArcCHECK

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chapman, David; Barnett, Rob; Yartsev, Slav, E-mail: slav.yartsev@lhsc.on.ca

    2014-07-01

    To design a quality assurance (QA) procedure for helical tomotherapy that measures multiple beam parameters with 1 delivery and uses a rotating gantry to simulate treatment conditions. The customized QA procedure was preprogrammed on the tomotherapy operator station. The dosimetry measurements were performed using an ArcCHECK diode array and an A1SL ion chamber inserted in the central holder. The ArcCHECK was positioned 10 cm above the isocenter so that the 21-cm diameter detector array could measure the 40-cm wide tomotherapy beam. During the implementation of the new QA procedure, separate comparative measurements were made using ion chambers in both liquid and solid water, the tomotherapy onboard detector array, and a MapCHECK diode array for a period of 10 weeks. There was good agreement (within 1.3%) for the beam output and cone ratio obtained with the new procedure and the routine QA measurements. The measured beam energy was comparable (0.3%) to the solid water measurement during the 10-week evaluation period, excluding 2 of the 10 measurements with unusually high background. The symmetry reading was similarly compromised for those 2 weeks, and on the other weeks, it deviated from the solid water reading by ∼2.5%. The ArcCHECK phantom presents a suitable alternative for performing helical tomotherapy QA, provided the background is collected properly. The proposed weekly procedure using ArcCHECK and a water phantom makes the QA process more efficient.
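    The weekly tolerance comparison described in these two records can be sketched as a simple percent-deviation check. This is a hypothetical illustration only: the parameter names and the tolerance figures (1.3%, 0.3%, ~2.5%) are recast from the agreement levels quoted in the abstract, and the reading values are invented.

```python
# Hypothetical sketch of a weekly QA tolerance check: each ArcCHECK reading is
# compared against its baseline and flagged when the percent deviation exceeds
# a tolerance.  Tolerances recast from the agreement levels in the abstract.
TOLERANCES = {"output": 1.3, "cone_ratio": 1.3, "energy": 0.3, "symmetry": 2.5}  # percent

def percent_deviation(measured, baseline):
    """Signed percent deviation of a measurement from its baseline."""
    return 100.0 * (measured - baseline) / baseline

def weekly_qa(readings, baselines):
    """Return {parameter: (deviation, 'PASS'|'FAIL')} for one weekly session."""
    results = {}
    for name, tol in TOLERANCES.items():
        dev = percent_deviation(readings[name], baselines[name])
        results[name] = (dev, "PASS" if abs(dev) <= tol else "FAIL")
    return results

report = weekly_qa(
    readings={"output": 1.010, "cone_ratio": 0.985, "energy": 1.002, "symmetry": 1.030},
    baselines={"output": 1.000, "cone_ratio": 1.000, "energy": 1.000, "symmetry": 1.000},
)
```

    A real implementation would also carry the background-subtraction caveat the authors raise: a session with abnormally high background should be excluded rather than flagged as a hardware failure.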

  12. Fatigue design procedure for the American SST prototype

    NASA Technical Reports Server (NTRS)

    Doty, R. J.

    1972-01-01

    For supersonic airline operations, significantly higher environmental temperature is the primary new factor affecting structural service life. Methods for incorporating the influence of temperature in detailed fatigue analyses are shown along with current test indications. Thermal effects investigated include real-time compared with short-time testing, long-time temperature exposure, and stress-temperature cycle phasing. A method is presented which allows designers and stress analysts to check the fatigue resistance of structural design details. A communicative rating system is presented which defines the relative fatigue quality of the detail, so that the analyst can define the cyclic-load capability of the design detail by entering constant-life charts for varying detail quality. If necessary, this system then allows the designer to determine ways to improve the fatigue quality for better life or to determine the operating stresses which will provide the required service life.
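    The idea of entering a constant-life chart at a given detail quality can be sketched with a Basquin-form S-N relation. Everything numeric here is invented for illustration: the slope exponent and the per-rating capacity constants are hypothetical, not values from the SST program.

```python
# Hypothetical sketch of an S-N (constant-life) lookup indexed by detail
# quality rating.  Basquin form N = C / S**m; the exponent and the capacity
# constants per quality rating are invented for illustration only.
BASQUIN_M = 4.0                                       # assumed S-N slope exponent
QUALITY_C = {"A": 4.0e12, "B": 2.0e12, "C": 1.0e12}   # hypothetical capacity per rating

def cycles_to_failure(stress_mpa, quality):
    """Estimated cyclic-load capability for a detail of the given quality rating."""
    return QUALITY_C[quality] / stress_mpa ** BASQUIN_M

def required_stress(quality, life_cycles):
    """Operating stress (MPa) that yields the required service life."""
    return (QUALITY_C[quality] / life_cycles) ** (1.0 / BASQUIN_M)
```

    The two functions mirror the two uses the abstract describes: reading off life for a given detail quality, and inverting the chart to find the operating stress that meets a required service life.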

  13. Hospital quality: a product of good management as much as good treatment.

    PubMed

    Hyde, Andy; Frafjord, Anders

    2013-01-01

    In Norway, as in most countries, the demands placed on hospitals to reduce costs and improve the quality of services are intense. Although many say that improving quality reduces costs, few can prove it. Furthermore, how many can show that improving quality improves patient satisfaction? Diakonhjemmet hospital in Norway has designed and implemented a hospital management system based on lean principles and the PDCA (Plan-Do-Check-Act) quality circle introduced by W. E. Deming (Deming 2000). The results are quite impressive, with improvements in quality and patient satisfaction. The hospital also runs at a profit.

  14. Standard Reference Specimens in Quality Control of Engineering Surfaces

    PubMed Central

    Song, J. F.; Vorburger, T. V.

    1991-01-01

    In the quality control of engineering surfaces, we aim to understand and maintain a good relationship between the manufacturing process and surface function. This is achieved by controlling the surface texture. The control process involves: 1) learning the functional parameters and their control values through controlled experiments or through a long history of production and use; 2) maintaining high accuracy and reproducibility with measurements not only of roughness calibration specimens but also of real engineering parts. In this paper, the characteristics, utilizations, and limitations of different classes of precision roughness calibration specimens are described. A measuring procedure of engineering surfaces, based on the calibration procedure of roughness specimens at NIST, is proposed. This procedure involves utilization of check specimens with waveform, wavelength, and other roughness parameters similar to functioning engineering surfaces. These check specimens would be certified under standardized reference measuring conditions, or by a reference instrument, and could be used for overall checking of the measuring procedure and for maintaining accuracy and agreement in engineering surface measurement. The concept of “surface texture design” is also suggested, which involves designing the engineering surface texture, the manufacturing process, and the quality control procedure to meet the optimal functional needs. PMID:28184115
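    The amplitude roughness parameters at the heart of the control process above can be computed directly from a sampled profile. A minimal sketch of two standard parameters, Ra (arithmetic mean deviation) and Rq (root-mean-square deviation), measured from the profile's mean line:

```python
# Minimal sketch: Ra and Rq roughness parameters from a list of height samples.
def roughness_ra_rq(profile):
    """Return (Ra, Rq) for a sampled profile, deviations taken from the mean line."""
    n = len(profile)
    mean = sum(profile) / n
    deviations = [z - mean for z in profile]
    ra = sum(abs(d) for d in deviations) / n
    rq = (sum(d * d for d in deviations) / n) ** 0.5
    return ra, rq

# For a square-wave profile the two parameters coincide.
ra, rq = roughness_ra_rq([2.0, 0.0, 2.0, 0.0])
```

    A check specimen of the kind proposed in the paper would be measured the same way, and the result compared against its certified value to validate the whole measuring procedure.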

  15. Model Checking Verification and Validation at JPL and the NASA Fairmont IV and V Facility

    NASA Technical Reports Server (NTRS)

    Schneider, Frank; Easterbrook, Steve; Callahan, Jack; Montgomery, Todd

    1999-01-01

    We show how a technology transfer effort was carried out. The successful use of model checking on a pilot JPL flight project demonstrates the usefulness and the efficacy of the approach. The pilot project was used to model a complex spacecraft controller. Software design and implementation validation were carried out successfully. To suggest future applications we also show how the implementation validation step can be automated. The effort was followed by the formal introduction of the modeling technique as a part of the JPL Quality Assurance process.
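    Model checking of the kind piloted at JPL can be illustrated, in miniature, as exhaustive exploration of a controller's state graph to verify a safety property. The state machine below is invented for illustration; production tools work on far larger models and use symbolic techniques.

```python
# Toy explicit-state model check: breadth-first search over a controller's
# state graph to verify that no "unsafe" state is reachable from the start
# state.  The state machine is invented for illustration.
from collections import deque

def reachable_states(transitions, start):
    """All states reachable from `start` in a {state: [successors]} graph."""
    seen, frontier = {start}, deque([start])
    while frontier:
        state = frontier.popleft()
        for nxt in transitions.get(state, []):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return seen

controller = {
    "idle": ["arming"],
    "arming": ["armed", "idle"],
    "armed": ["firing", "idle"],
    "firing": ["idle"],
    "fault": ["fault"],          # hypothetical unsafe state
}
safe = "fault" not in reachable_states(controller, "idle")
```

    The validation step the abstract automates amounts to checking that the implementation's observed state graph stays within the model's reachable set.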

  16. The design of temporary sediment controls with special reference to water quality.

    DOT National Transportation Integrated Search

    1975-01-01

    The laboratory and field trapping efficiencies of several types of flow barriers were ascertained. The materials used to fabricate the barriers were various types of hay straw, crushed stone, and crushed stone/straw mixes. Field checks of systems of ba...

  17. CMM Interim Check (U)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Montano, Joshua Daniel

    2015-03-23

    Coordinate Measuring Machines (CMM) are widely used in industry, throughout the Nuclear Weapons Complex and at Los Alamos National Laboratory (LANL) to verify part conformance to design definition. Calibration cycles for CMMs at LANL are predominantly one year in length. Unfortunately, several nonconformance reports have been generated to document the discovery of a certified machine found out of tolerance during a calibration closeout. In an effort to reduce risk to product quality, two solutions were proposed – shorten the calibration cycle, which could be costly, or perform an interim check to monitor the machine’s performance between cycles. The CMM interim check discussed makes use of Renishaw’s Machine Checking Gauge. This off-the-shelf product simulates a large sphere within a CMM’s measurement volume and allows for error estimation. Data was gathered, analyzed, and simulated from seven machines in seventeen different configurations to create statistical process control run charts for on-the-floor monitoring.
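    The statistical-process-control run chart mentioned above can be sketched with the usual mean ± 3 sigma control limits computed from historical interim-check errors. The data values below are made up; only the control-chart logic is the point.

```python
# Sketch of an SPC run chart: control limits at mean +/- 3 sigma of historical
# interim-check errors; new measurements outside the limits are flagged.
def control_limits(history):
    """(lower, upper) control limits from a list of historical error values."""
    n = len(history)
    mean = sum(history) / n
    sigma = (sum((x - mean) ** 2 for x in history) / (n - 1)) ** 0.5  # sample std dev
    return mean - 3 * sigma, mean + 3 * sigma

def out_of_control(history, new_points):
    """New measurements falling outside the historical control limits."""
    lcl, ucl = control_limits(history)
    return [x for x in new_points if not (lcl <= x <= ucl)]
```

    A point outside the limits between calibrations is exactly the early warning the interim check is designed to give, instead of discovering the drift at calibration closeout.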

  18. Implementing Model-Check for Employee and Management Satisfaction

    NASA Technical Reports Server (NTRS)

    Jones, Corey; LaPha, Steven

    2013-01-01

    This presentation will discuss methods by which ModelCheck can be implemented not only to improve model quality, but also to satisfy both employees and management through different sets of quality checks. This approach allows a standard set of modeling practices to be upheld throughout a company, with minimal interaction required by the end user. The presenter will demonstrate how to create multiple ModelCheck standards, preventing users from evading the system, and how it can improve the quality of drawings and models.

  19. A Multi-Encoding Approach for LTL Symbolic Satisfiability Checking

    NASA Technical Reports Server (NTRS)

    Rozier, Kristin Y.; Vardi, Moshe Y.

    2011-01-01

    Formal behavioral specifications written early in the system-design process and communicated across all design phases have been shown to increase the efficiency, consistency, and quality of the system under development. To prevent introducing design or verification errors, it is crucial to test specifications for satisfiability. Our focus here is on specifications expressed in linear temporal logic (LTL). We introduce a novel encoding of symbolic transition-based Büchi automata and a novel, "sloppy," transition encoding, both of which result in improved scalability. We also define novel BDD variable orders based on tree decomposition of formula parse trees. We describe and extensively test a new multi-encoding approach utilizing these novel encoding techniques to create 30 encoding variations. We show that our novel encodings translate to significant, sometimes exponential, improvement over the current standard encoding for symbolic LTL satisfiability checking.

  20. 30 CFR 57.8535 - Seals.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    § 57.8535 Seals. Seals shall be provided with a means for checking the quality of air behind the seal and a means to prevent a water head from developing unless the seal is designed to impound water.

  1. 30 CFR 57.8535 - Seals.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    § 57.8535 Seals. Seals shall be provided with a means for checking the quality of air behind the seal and a means to prevent a water head from developing unless the seal is designed to impound water.

  2. Principles of continuous quality improvement applied to intravenous therapy.

    PubMed

    Dunavin, M K; Lane, C; Parker, P E

    1994-01-01

    Documentation of the application of the principles of continuous quality improvement (CQI) to the health care setting is crucial for understanding the transition from traditional management models to CQI models. A CQI project was designed and implemented by the IV Therapy Department at Lawrence Memorial Hospital to test the application of these principles to intravenous therapy and as a learning tool for the entire organization. Through a prototype inventory project, significant savings in cost and time were demonstrated using check sheets, flow diagrams, control charts, and other statistical tools, as well as using the Plan-Do-Check-Act cycle. As a result, a primary goal, increased time for direct patient care, was achieved. Eight hours per week in nursing time was saved, relationships between two work areas were improved, and $6,000 in personnel costs, storage space, and inventory were saved.
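    The check sheets used in the CQI project above are, at bottom, tallies of observed problem categories, sorted so the most frequent problems surface first (a Pareto ordering). A minimal sketch, with illustrative category names only:

```python
# Minimal check-sheet tally: count observed problem categories and order them
# most-frequent-first (Pareto ordering).  Category names are illustrative.
from collections import Counter

def check_sheet(observations):
    """Tally observed problem categories, most frequent first."""
    return Counter(observations).most_common()

tally = check_sheet([
    "stockout", "expired item", "stockout", "mislabeled",
    "stockout", "expired item",
])
```

    In a Plan-Do-Check-Act cycle, a tally like this feeds the Check step: the top category becomes the target of the next Plan.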

  3. SEQ-REVIEW: A tool for reviewing and checking spacecraft sequences

    NASA Astrophysics Data System (ADS)

    Maldague, Pierre F.; El-Boushi, Mekki; Starbird, Thomas J.; Zawacki, Steven J.

    1994-11-01

    A key component of JPL's strategy to make space missions faster, better and cheaper is the Advanced Multi-Mission Operations System (AMMOS), a ground software intensive system currently in use and in further development. AMMOS intends to eliminate the cost of re-engineering a ground system for each new JPL mission. This paper discusses SEQ-REVIEW, a component of AMMOS that was designed to facilitate and automate the task of reviewing and checking spacecraft sequences. SEQ-REVIEW is a smart browser for inspecting files created by other sequence generation tools in the AMMOS system. It can parse sequence-related files according to a computer-readable version of a 'Software Interface Specification' (SIS), which is a standard document for defining file formats. It lets users display one or several linked files and check simple constraints using a Basic-like 'Little Language'. SEQ-REVIEW represents the first application of the Quality Function Deployment (QFD) method to sequence software development at JPL. The paper will show how the requirements for SEQ-REVIEW were defined and converted into a design based on object-oriented principles. The process starts with interviews of potential users, a small but diverse group that spans multiple disciplines and 'cultures'. It continues with the development of QFD matrices that related product functions and characteristics to user-demanded qualities. These matrices are then turned into a formal Software Requirements Document (SRD). The process concludes with the design phase, in which the CRC (Class, Responsibility, Collaboration) approach was used to convert requirements into a blueprint for the final product.

  4. SEQ-REVIEW: A tool for reviewing and checking spacecraft sequences

    NASA Technical Reports Server (NTRS)

    Maldague, Pierre F.; El-Boushi, Mekki; Starbird, Thomas J.; Zawacki, Steven J.

    1994-01-01

    A key component of JPL's strategy to make space missions faster, better and cheaper is the Advanced Multi-Mission Operations System (AMMOS), a ground software intensive system currently in use and in further development. AMMOS intends to eliminate the cost of re-engineering a ground system for each new JPL mission. This paper discusses SEQ-REVIEW, a component of AMMOS that was designed to facilitate and automate the task of reviewing and checking spacecraft sequences. SEQ-REVIEW is a smart browser for inspecting files created by other sequence generation tools in the AMMOS system. It can parse sequence-related files according to a computer-readable version of a 'Software Interface Specification' (SIS), which is a standard document for defining file formats. It lets users display one or several linked files and check simple constraints using a Basic-like 'Little Language'. SEQ-REVIEW represents the first application of the Quality Function Deployment (QFD) method to sequence software development at JPL. The paper will show how the requirements for SEQ-REVIEW were defined and converted into a design based on object-oriented principles. The process starts with interviews of potential users, a small but diverse group that spans multiple disciplines and 'cultures'. It continues with the development of QFD matrices that related product functions and characteristics to user-demanded qualities. These matrices are then turned into a formal Software Requirements Document (SRD). The process concludes with the design phase, in which the CRC (Class, Responsibility, Collaboration) approach was used to convert requirements into a blueprint for the final product.
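    The SEQ-REVIEW pattern of parsing records against an interface specification and evaluating simple constraints can be shown in miniature. The field layout and the constraint below are invented stand-ins for a real SIS and a real 'Little Language' rule.

```python
# Toy version of SIS-driven parsing plus constraint checking over sequence
# records.  Field names and the constraint are invented for illustration.
def parse_sequence(lines, fields):
    """Split whitespace-delimited records into dicts keyed by the SIS field names."""
    return [dict(zip(fields, line.split())) for line in lines]

def check_constraint(records, predicate):
    """Return the records that violate a simple constraint predicate."""
    return [r for r in records if not predicate(r)]

records = parse_sequence(
    ["CMD_A 10 ON", "CMD_B 5 OFF", "CMD_C 120 ON"],
    fields=("command", "duration_s", "state"),
)
# Hypothetical constraint: no command may run longer than 60 seconds.
violations = check_constraint(records, lambda r: int(r["duration_s"]) <= 60)
```

    The design point is the separation: the field layout comes from a machine-readable specification, so the same checker works for any file format the SIS describes.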

  5. Continuous improvement of the quality reporting system of a medium-size company

    NASA Astrophysics Data System (ADS)

    Hawkins, Anthony; Onuh, Spencer

    2001-10-01

    Many companies are faced with quality improvement issues on a daily basis, but their response to this problem varies. This paper discusses the improvement in the defect reporting system at a medium-sized manufacturing company following the appointment of an experienced, motivated design engineer dedicated to that task. It sets out the situation that the engineer inherited and details the changes that were incorporated; it assesses which were successful and which failed. Following a survey of current literature, it was seen that there is little written specifically on the subject of audited defect reporting. It is felt that this study goes some way to filling that void. A successful survey of engineering companies in Southern Hampshire reinforces the principal findings: that emphasising the Check part of Deming's Plan-Do-Check-Act cycle is a novel approach to the Quality Improvement Process, and that it has reduced the cost of rework by an audited 80% in a period of two years.

  6. Quality control of FWC during assembly and commissioning in SST-1 Tokamak

    NASA Astrophysics Data System (ADS)

    Patel, Hitesh; Santra, Prosenjit; Parekh, Tejas; Biswas, Prabal; Jayswal, Snehal; Chauhan, Pradeep; Paravastu, Yuvakiran; George, Siju; Semwal, Pratibha; Thankey, Prashant; Ramesh, Gattu; Prakash, Arun; Dhanani, Kalpesh; Raval, D. C.; Khan, Ziauddin; Pradhan, Subrata

    2017-04-01

    First Wall Components (FWC) of the SST-1 tokamak, which are in the immediate vicinity of the plasma, comprise limiters, divertors, baffles, and passive stabilizers designed for long-duration (∼1000 s) discharges of elongated plasma. All FWC consist of copper alloy heat sink modules with SS cooling tubes brazed onto them, with graphite tiles acting as armour material facing the plasma, and are mounted to the vacuum vessel with suitable Inconel support structures at inter-connected ring and port locations. The FWC were very recently assembled and commissioned successfully inside the vacuum vessel of SST-1, undergoing rigorous quality control and checks at every stage of the assembly process. This paper will present the quality control aspects and checks of FWC from commencement of the assembly procedure, namely material test reports, leak testing of high-temperature baked components, assembled dimensional tolerances, leak testing of all welded joints, graphite tile tightening torques, electrical continuity and electrical isolation of passive stabilizers from the vacuum vessel, and baking and cooling hydraulic connections inside the vacuum vessel.

  7. Software tool for physics chart checks.

    PubMed

    Li, H Harold; Wu, Yu; Yang, Deshan; Mutic, Sasa

    2014-01-01

    Physics chart check has long been a central quality assurance (QA) measure in radiation oncology. The purpose of this work is to describe a software tool that aims to accomplish simplification, standardization, automation, and forced functions in the process. Nationally recognized guidelines, including American College of Radiology and American Society for Radiation Oncology guidelines and technical standards, and the American Association of Physicists in Medicine Task Group reports were identified, studied, and summarized. Meanwhile, the reported events related to physics chart check service were analyzed using an event reporting and learning system. A number of shortfalls in the chart check process were identified. To address these problems, a software tool was designed and developed under Microsoft .NET in C# to hardwire as many components as possible at each stage of the process. The software consists of the following 4 independent modules: (1) chart check management; (2) pretreatment and during-treatment chart check assistant; (3) posttreatment chart check assistant; and (4) quarterly peer-review management. The users were a large group of physicists in the authors' radiation oncology clinic. During over 1 year of use, the tool has proven very helpful in chart checking management, communication, documentation, and maintaining consistency. The software tool presented in this work aims to assist physicists at each stage of the physics chart check process. It is potentially useful for any radiation oncology clinic that is either pursuing or maintaining American College of Radiology accreditation.

  8. NHEXAS PHASE I ARIZONA STUDY--STANDARD OPERATING PROCEDURE FOR FORM QA/QC CHECKS (UA-C-2.0)

    EPA Science Inventory

    The purpose of this SOP is to outline the process of Field Quality Assurance and Quality Control checks. This procedure was followed to ensure consistent data retrieval during the Arizona NHEXAS project and the "Border" study. Keywords: custody; QA/QC; field checks.

    The Nation...

  9. Ultrasound use during cardiopulmonary resuscitation is associated with delays in chest compressions.

    PubMed

    Huis In 't Veld, Maite A; Allison, Michael G; Bostick, David S; Fisher, Kiondra R; Goloubeva, Olga G; Witting, Michael D; Winters, Michael E

    2017-10-01

    High-quality chest compressions are a critical component of the resuscitation of patients in cardiopulmonary arrest. Point-of-care ultrasound (POCUS) is used frequently during emergency department (ED) resuscitations, but there has been limited research assessing its benefits and harms during the delivery of cardiopulmonary resuscitation (CPR). We hypothesized that use of POCUS during cardiac arrest resuscitation adversely affects high-quality CPR by lengthening the duration of pulse checks beyond the current cardiopulmonary resuscitation guidelines' recommendation of 10 s. We conducted a prospective cohort study of adults in cardiac arrest treated in an urban ED between August 2015 and September 2016. Resuscitations were recorded using video equipment in designated resuscitation rooms, and the use of POCUS was documented and timed. A linear mixed-effects model was used to estimate the effect of POCUS on pulse check duration. Twenty-three patients were enrolled in our study. The mean duration of pulse checks with POCUS was 21.0 s (95% CI, 18-24) compared with 13.0 s (95% CI, 12-15) for those without POCUS. POCUS increased the duration of pulse checks and CPR interruption by 8.4 s (95% CI, 6.7-10.0 [p<0.0001]). Age, body mass index (BMI), and procedures did not significantly affect the duration of pulse checks. The use of POCUS during cardiac arrest resuscitation was associated with significantly increased duration of pulse checks, nearly doubling the 10-s maximum duration recommended in current guidelines. It is important for acute care providers to pay close attention to the duration of interruptions in the delivery of chest compressions when using POCUS during cardiac arrest resuscitation. Copyright © 2017 Elsevier B.V. All rights reserved.
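    The core comparison in this study is a difference of mean pulse-check durations against a 10-second guideline maximum. A back-of-the-envelope sketch follows; the durations are invented, and the study itself used a linear mixed-effects model, not a simple difference of means.

```python
# Simplified version of the pulse-check comparison: mean duration with and
# without POCUS, and whether the POCUS mean exceeds the 10-s guideline maximum.
# The duration values are invented for illustration.
def mean(xs):
    return sum(xs) / len(xs)

with_pocus = [19.0, 22.0, 21.0, 22.0]     # seconds, hypothetical
without_pocus = [12.0, 14.0, 13.0]        # seconds, hypothetical
delay = mean(with_pocus) - mean(without_pocus)
exceeds_guideline = mean(with_pocus) > 10.0   # guideline-recommended maximum
```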

  10. 10 CFR 73.59 - Relief from fingerprinting, identification and criminal history records checks and other elements...

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    § 73.59 Relief from fingerprinting, identification and criminal history records checks and other elements of background checks for designated categories of individuals. Fingerprinting, and the identification and criminal history records checks...

  11. 10 CFR 73.59 - Relief from fingerprinting, identification and criminal history records checks and other elements...

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    § 73.59 Relief from fingerprinting, identification and criminal history records checks and other elements of background checks for designated categories of individuals. Fingerprinting, and the identification and criminal history records checks...

  12. Application Research of Quality Control Technology of Asphalt Pavement based on GPS Intelligent

    NASA Astrophysics Data System (ADS)

    Wang, Min; Gao, Bo; Shang, Fei; Wang, Tao

    2017-10-01

    Due to the difficulty of compacting the steel deck pavement asphalt layer, caused by the flexible supporting system (orthotropic steel deck plate), it is usually hard to control whether the site compactness reaches the design goal. The intelligent compaction technology is based on GPS control technology and real-time acquisition of actual compaction tracks, from which it forms a cloud map of compaction passes that guides the roller operator to compact in accordance with the design requirement, ensuring the deck compaction technology and compaction quality. The actual construction of a real bridge and the checked data show that the intelligent compaction technology is significant in guaranteeing the steel deck asphalt pavement compactness and quality stability.
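    The coverage map at the heart of this technique can be sketched by binning GPS fixes into grid cells and counting passes per cell. This is a deliberate simplification (one pass per fix; a real system would merge consecutive fixes in the same cell into a single pass), and the coordinates, cell size, and required count are all invented.

```python
# Sketch of a GPS-based compaction coverage map: roller positions binned into
# grid cells, pass counts compared against the design requirement.  Simplified:
# each GPS fix counts as one pass.  All values invented for illustration.
def pass_counts(positions, cell_size):
    """Count roller passes per (row, col) grid cell from GPS (x, y) fixes."""
    counts = {}
    for x, y in positions:
        cell = (int(x // cell_size), int(y // cell_size))
        counts[cell] = counts.get(cell, 0) + 1
    return counts

def under_compacted(counts, required):
    """Cells whose pass count falls short of the design requirement."""
    return sorted(c for c, n in counts.items() if n < required)

track = [(0.5, 0.5), (0.6, 0.4), (1.5, 0.5), (0.2, 0.8), (1.6, 0.6)]
counts = pass_counts(track, cell_size=1.0)
low = under_compacted(counts, required=3)
```

    The cells in `low` are the ones the operator would be guided back to, which is exactly the feedback loop the cloud map provides.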

  13. Towards Behavioral Reflexion Models

    NASA Technical Reports Server (NTRS)

    Ackermann, Christopher; Lindvall, Mikael; Cleaveland, Rance

    2009-01-01

    Software architecture has become essential in the struggle to manage today's increasingly large and complex systems. Software architecture views are created to capture important system characteristics on an abstract and, thus, comprehensible level. As the system is implemented and later maintained, it often deviates from the original design specification. Such deviations can have implications for the quality of the system, such as reliability, security, and maintainability. Software architecture compliance checking approaches, such as the reflexion model technique, have been proposed to address this issue by comparing the implementation to a model of the system's architecture design. However, architecture compliance checking approaches focus solely on structural characteristics and ignore behavioral conformance. This is especially an issue in Systems-of-Systems. Systems-of-Systems (SoS) are decompositions of large systems into smaller systems for the sake of flexibility. Deviations of the implementation from its behavioral design often reduce the reliability of the entire SoS. An approach is needed that supports reasoning about behavioral conformance at the architecture level. In order to address this issue, we have developed an approach for comparing the implementation of a SoS to an architecture model of its behavioral design. The approach follows the idea of reflexion models and adapts it to support the compliance checking of behaviors. In this paper, we focus on sequencing properties as they play an important role in many SoS. Sequencing deviations potentially have a severe impact on SoS correctness and qualities. The desired behavioral specification is defined in UML sequence diagram notation, and behaviors are extracted from the SoS implementation. The behaviors are then mapped to the model of the desired behavior and the two are compared. Finally, a reflexion model is constructed that shows the deviations between behavioral design and implementation. This paper discusses the approach and shows how it can be applied to investigate reliability issues in SoS.

  14. The Quality Control Algorithms Used in the Creation of NASA Kennedy Space Center Lightning Protection System Towers Meteorological Database

    NASA Technical Reports Server (NTRS)

    Orcutt, John M.; Brenton, James C.

    2016-01-01

    An accurate database of meteorological data is essential for designing any aerospace vehicle and for preparing launch commit criteria. Meteorological instrumentation was recently placed on the three Lightning Protection System (LPS) towers at Kennedy Space Center (KSC) launch complex 39B (LC-39B), providing a unique meteorological dataset at the launch complex over an extensive altitude range. Data records of temperature, dew point, relative humidity, wind speed, and wind direction are produced at 40, 78, 116, and 139 m at each tower. The Marshall Space Flight Center Natural Environments Branch (EV44) received an archive that consists of one-minute averaged measurements for the period of record of January 2011 - April 2015. However, before the received database could be used, EV44 needed to remove any erroneous data through a comprehensive quality control (QC) process. The QC process applied to the LPS towers' meteorological data is similar to other QC processes developed by EV44, which were used in the creation of meteorological databases for other towers at KSC; it has been modified specifically for use with the LPS tower database. The QC process first includes a check of each individual sensor, removing any unrealistic data and checking the temporal consistency of each variable. Next, data from all three sensors at each height are checked against each other, checked against climatology, and checked for sensors that erroneously report a constant value. Then, a vertical consistency check of each variable at each tower is completed. Last, the upwind sensor at each level is selected to minimize the influence of the towers and other structures at LC-39B on the measurements. The selection process for the upwind sensor implemented a study of tower-induced turbulence. This paper describes in detail the QC process, QC results, and the attributes of the LPS towers meteorological database.
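    Two of the single-sensor checks described above, the physical-range ("unrealistic data") check and the temporal-consistency check, can be sketched as follows. The limits and sample values are invented for illustration, not the ones used for the LC-39B database.

```python
# Simplified single-sensor QC checks: a physical range check and a temporal
# consistency check that flags jumps between consecutive one-minute samples.
# Limits and data are invented for illustration.
def range_check(values, lo, hi):
    """Indices of samples outside the physically plausible range."""
    return [i for i, v in enumerate(values) if not (lo <= v <= hi)]

def step_check(values, max_step):
    """Indices whose jump from the previous sample exceeds max_step."""
    return [i for i in range(1, len(values))
            if abs(values[i] - values[i - 1]) > max_step]

temps_c = [24.1, 24.3, 55.0, 24.4, 24.2]
bad_range = range_check(temps_c, lo=-10.0, hi=45.0)
bad_steps = step_check(temps_c, max_step=2.0)
```

    Note that a single bad sample trips the step check twice, on the jump in and the jump out, which is why range and step flags are usually reconciled before samples are removed.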

  15. TU-D-201-06: HDR Plan Prechecks Using Eclipse Scripting API

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Palaniswaamy, G; Morrow, A; Kim, S

    Purpose: Automate brachytherapy treatment plan quality checks using the Eclipse v13.6 scripting API based on pre-configured rules to minimize human error and maximize efficiency. Methods: The HDR Precheck system is developed based on a rules-driven approach using the Eclipse scripting API. This system checks for critical plan parameters such as channel length, first source position, source step size, and channel mapping. The planned treatment time is verified independently based on analytical methods. For interstitial or SAVI APBI treatment plans, a Patterson-Parker system calculation is performed to verify the planned treatment time. For endobronchial treatments, an analytical formula from TG-59 is used. Acceptable tolerances were defined based on clinical experience in our department. The system was designed to show PASS/FAIL status levels. Additional information, if necessary, is indicated appropriately in a separate comments field in the user interface. Results: The HDR Precheck system has been developed and tested to verify the treatment plan parameters that are routinely checked by the clinical physicist. The report also serves as a reminder or checklist for the planner to perform any additional critical checks, such as applicator digitization or scenarios where the channel mapping was intentionally changed. It is expected to reduce the current manual plan check time from 15 minutes to <1 minute. Conclusion: Automating brachytherapy plan prechecks significantly reduces treatment plan precheck time and reduces human errors. When fully developed, this system will be able to perform a TG-43 based second check of the treatment planning system's dose calculation using random points in the target and critical structures. A histogram will be generated along with tabulated mean and standard deviation values for each structure. A knowledge database will also be developed for Brachyvision plans, which will then be used for knowledge-based plan quality checks to further reduce treatment planning errors and increase confidence in the planned treatment.
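    The rules-driven PASS/FAIL approach described in this abstract can be sketched as a table of (parameter, expected value, tolerance) rules evaluated against a plan. The parameter names and tolerance values below are illustrative only, not the clinical values used by the authors, and this sketch does not touch the Eclipse scripting API itself.

```python
# Rules-driven plan precheck in miniature: each rule names a plan parameter,
# its expected value, and a tolerance; the report is PASS/FAIL per rule.
# Parameter names and tolerances are illustrative, not clinical values.
RULES = [
    ("channel_length_mm", 1300.0, 1.0),
    ("first_source_position_mm", 1195.0, 1.0),
    ("step_size_mm", 5.0, 0.0),
]

def precheck(plan):
    """Evaluate each (name, expected, tolerance) rule against the plan dict."""
    return {name: "PASS" if abs(plan[name] - expected) <= tol else "FAIL"
            for name, expected, tol in RULES}

report = precheck({"channel_length_mm": 1300.0,
                   "first_source_position_mm": 1193.0,
                   "step_size_mm": 5.0})
```

    Keeping the rules in data rather than code is what makes the approach "pre-configured": tolerances can be adjusted to departmental experience without touching the checking logic.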

  16. AutoLock: a semiautomated system for radiotherapy treatment plan quality control

    PubMed Central

    Lowe, Matthew; Hardy, Mark J.; Boylan, Christopher J.; Whitehurst, Philip; Rowbottom, Carl G.

    2015-01-01

    A semiautomated system for radiotherapy treatment plan quality control (QC), named AutoLock, is presented. AutoLock is designed to augment treatment plan QC by automatically checking aspects of treatment plans that are well suited to computational evaluation, whilst summarizing more subjective aspects in the form of a checklist. The treatment plan must pass all automated checks and all checklist items must be acknowledged by the planner as correct before the plan is finalized. Thus AutoLock uniquely integrates automated treatment plan QC, an electronic checklist, and plan finalization. In addition to reducing the potential for the propagation of errors, the integration of AutoLock into the plan finalization workflow has improved efficiency at our center. Detailed audit data are presented, demonstrating that the treatment plan QC rejection rate fell by around a third following the clinical introduction of AutoLock. PACS number: 87.55.Qr PMID:26103498

  17. AutoLock: a semiautomated system for radiotherapy treatment plan quality control.

    PubMed

    Dewhurst, Joseph M; Lowe, Matthew; Hardy, Mark J; Boylan, Christopher J; Whitehurst, Philip; Rowbottom, Carl G

    2015-05-08

    A semiautomated system for radiotherapy treatment plan quality control (QC), named AutoLock, is presented. AutoLock is designed to augment treatment plan QC by automatically checking aspects of treatment plans that are well suited to computational evaluation, whilst summarizing more subjective aspects in the form of a checklist. The treatment plan must pass all automated checks and all checklist items must be acknowledged by the planner as correct before the plan is finalized. Thus AutoLock uniquely integrates automated treatment plan QC, an electronic checklist, and plan finalization. In addition to reducing the potential for the propagation of errors, the integration of AutoLock into the plan finalization workflow has improved efficiency at our center. Detailed audit data are presented, demonstrating that the treatment plan QC rejection rate fell by around a third following the clinical introduction of AutoLock.

  18. Generation and use of observational data patterns in the evaluation of data quality for AmeriFlux and FLUXNET

    NASA Astrophysics Data System (ADS)

    Pastorello, G.; Agarwal, D.; Poindexter, C.; Papale, D.; Trotta, C.; Ribeca, A.; Canfora, E.; Faybishenko, B.; Gunter, D.; Chu, H.

    2015-12-01

    The flux-measuring sites that are part of AmeriFlux are operated and maintained in a fairly independent fashion, both in terms of scientific goals and operational practices. This is also the case for most sites from other networks in FLUXNET. This independence leads to a degree of heterogeneity in the data sets collected at the sites, which is also reflected in data quality levels. The generation of derived data products and data synthesis efforts, two of the main goals of these networks, are directly affected by the heterogeneity in data quality. In a collaborative effort between AmeriFlux and ICOS, a series of quality checks are being conducted for the data sets before any network-level data processing and product generation take place. From these checks, a set of common data issues was identified, and these are being cataloged and classified into data quality patterns. These patterns are now being used as a basis for implementing automation for certain data quality checks, speeding up the process of applying the checks and evaluating the data. Currently, most data checks are performed individually in each data set, requiring visual inspection and inputs from a data curator. This manual process makes it difficult to scale the quality checks, creating a bottleneck for the data processing. One goal of the automated checks is to free up time of data curators so they can focus on new or less common issues. As new issues are identified, they can also be cataloged and classified, extending the coverage of existing patterns or potentially generating new patterns, helping both improve existing automated checks and create new ones. This approach is helping make data quality evaluation faster, more systematic, and reproducible. Furthermore, these patterns are also helping with documenting common causes and solutions for data problems. This can help tower teams with diagnosing problems in data collection and processing, and also in correcting historical data sets. 
In this presentation, using AmeriFlux fluxes and micrometeorological data, we discuss our approach to creating observational data patterns, and how we are using them to implement new automated checks. We also detail examples of these observational data patterns, illustrating how they are being used.
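One of the automated checks built from such patterns might look like the following sketch: a plausible-range check on a micrometeorological series. The variable and the thresholds are illustrative, not actual AmeriFlux/FLUXNET values:

```python
# Hypothetical automated quality check: flag samples outside a plausible
# physical range (thresholds invented for illustration).

def range_check(values, lo, hi):
    """Return indices of samples outside the plausible physical range."""
    return [i for i, v in enumerate(values) if v is not None and not (lo <= v <= hi)]

air_temp_c = [12.1, 11.8, -75.0, 12.3, None, 13.0]  # -75 is an implausible spike
flags = range_check(air_temp_c, lo=-50.0, hi=55.0)
```

Cataloged patterns of this kind can be run over every incoming data set without curator intervention, reserving visual inspection for the flagged samples.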

  19. U.S.-MEXICO BORDER PROGRAM ARIZONA BORDER STUDY--STANDARD OPERATING PROCEDURE FOR FORM QA AND QC CHECKS (UA-C-2.0)

    EPA Science Inventory

    The purpose of this SOP is to outline the process of field quality assurance and quality control checks. This procedure was followed to ensure consistent data retrieval during the Arizona NHEXAS project and the Border study. Keywords: custody; QA/QC; field checks.

    The U.S.-Mex...

  20. A Microcomputer-Based Program for Printing Check Plots of Integrated Circuits Specified in Caltech Intermediate Form.

    DTIC Science & Technology

    1984-12-01

    only four transistors[5]. Each year since that time, the semiconductor industry has consistently improved the quality of the fabrication techniques...rarely took place at universities and was almost exclusively confined to industry. IC design techniques were developed, tested, and taught only in the...community, it is not uncommon for industry to borrow ideas and even particular programs from these university designed tools. The Very Large Scale Integration

  1. Multicriteria Gain Tuning for Rotorcraft Flight Controls (also entitled The Development of the Conduit Advanced Control System Design and Evaluation Interface with a Case Study Application Fly by Wire Helicopter Design)

    NASA Technical Reports Server (NTRS)

    Biezad, Daniel

    1997-01-01

    Handling qualities analysis and control law design would seem to be naturally complementing components of aircraft flight control system design; however, these two closely coupled disciplines are often not well integrated in practice. Handling qualities engineers and control system engineers may work in separate groups within an aircraft company. Flight control system engineers and handling quality specialists may come from different backgrounds and schooling and are often not aware of the other group's research. Thus, while the handling qualities specifications represent desired aircraft response characteristics, these are rarely incorporated directly in the control system design process. Instead, modern control system design techniques are based on servo-loop robustness specifications and simple representations of the desired control response. Comprehensive handling qualities analysis is often left until the end of the design cycle and performed as a check of the completed design for satisfactory performance. This can lead to costly redesign or less than satisfactory aircraft handling qualities when the flight testing phase is reached. The desire to integrate the fields of handling qualities and flight control systems led to the development of the CONDUIT system. This tool facilitates control system designs that achieve desired handling quality requirements and servo-loop specifications in a single design process. With CONDUIT, the control system engineer is now able to directly design control systems to meet the complete handling specifications. CONDUIT allows the designer to retain a preferred control law structure, but then tunes the system parameters to meet the handling quality requirements.
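The tuning idea can be illustrated with a toy sketch, which is not CONDUIT itself: hold the control-law structure fixed and search its gain until a handling-quality metric meets its specification. The metric model and the spec numbers below are invented:

```python
# Toy illustration of fixed-structure gain tuning against a spec.

def bandwidth(gain):
    """Stand-in handling-quality metric: for this toy first-order loop,
    closed-loop bandwidth scales linearly with the loop gain."""
    return 2.0 * gain  # rad/s, invented model

def tune_gain(spec_min, spec_max, candidates):
    """Return the first candidate gain whose metric falls inside the spec."""
    for k in candidates:
        if spec_min <= bandwidth(k) <= spec_max:
            return k
    return None

# e.g. an illustrative bandwidth spec of 2-4 rad/s
k = tune_gain(2.0, 4.0, candidates=[0.5, 1.0, 1.5, 2.5])
```

CONDUIT performs the same kind of search with multi-objective optimization over many handling-quality and servo-loop specifications at once; this sketch only shows the single-spec principle.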

  2. Check-In Check-Out + Social Skills: Enhancing the Effects of Check-In Check-Out for Students With Social Skill Deficits

    ERIC Educational Resources Information Center

    Ross, Scott W.; Sabey, Christian V.

    2015-01-01

    Check-In Check-Out is a Tier 2 intervention designed to reduce problem behavior and increase prosocial behavior. Although the intervention has demonstrated effects in several studies, few research efforts have considered how the intervention can be modified to support students with social skill deficits. Through a multiple baseline design across…

  3. 40 CFR 51.363 - Quality assurance.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... test, the evaporative system tests, and emission control component checks (as applicable); (vi...) A check of the Constant Volume Sampler flow calibration; (5) A check for the optimization of the... selection, and power absorption; (9) A check of the system's ability to accurately detect background...

  4. Data quality assessment for comparative effectiveness research in distributed data networks

    PubMed Central

    Brown, Jeffrey; Kahn, Michael; Toh, Sengwee

    2015-01-01

    Background Electronic health information routinely collected during healthcare delivery and reimbursement can help address the need for evidence about the real-world effectiveness, safety, and quality of medical care. Often, distributed networks that combine information from multiple sources are needed to generate this real-world evidence. Objective We provide a set of field-tested best practices and a set of recommendations for data quality checking for comparative effectiveness research (CER) in distributed data networks. Methods We explore the requirements for data quality checking and describe data quality approaches undertaken by several existing multi-site networks. Results There are no established standards regarding how to evaluate the quality of electronic health data for CER within distributed networks. Data checks of increasing complexity are often employed, ranging from consistency with syntactic rules to evaluation of semantics and consistency within and across sites. Temporal trends within and across sites are widely used, as are checks of each data refresh or update. Rates of specific events and exposures by age group, sex, and month are also common. Discussion Secondary use of electronic health data for CER holds promise but is complex, especially in distributed data networks that incorporate periodic data refreshes. The viability of a learning health system is dependent on a robust understanding of the quality, validity, and optimal secondary uses of routinely collected electronic health data within distributed health data networks. Robust data quality checking can strengthen confidence in findings based on distributed data networks. PMID:23793049
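A temporal-trend check of the kind mentioned, comparing event counts across data refreshes, might be sketched as follows. The tolerance and the counts are illustrative only:

```python
# Sketch of a refresh-to-refresh trend check: flag months whose event
# count changed by more than a relative tolerance (20% here, invented).

def refresh_drift(prev_counts, new_counts, rel_tol=0.2):
    """Return months whose count changed by more than rel_tol."""
    flagged = []
    for month, prev in prev_counts.items():
        new = new_counts.get(month, 0)
        if prev > 0 and abs(new - prev) / prev > rel_tol:
            flagged.append(month)
    return flagged

prev = {"2014-01": 100, "2014-02": 95}
new = {"2014-01": 102, "2014-02": 40}   # Feb dropped sharply: possible load error
drift = refresh_drift(prev, new)
```

In a real network, such a flag would prompt a site-level investigation rather than automatic rejection, since a genuine clinical trend can also move the counts.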

  5. Building Consistency between Title, Problem Statement, Purpose, & Research Questions to Improve the Quality of Research Plans and Reports

    ERIC Educational Resources Information Center

    Newman, Isadore; Covrig, Duane M.

    2013-01-01

    Consistency in the title, problem, purpose, and research question improve the logic and transparency of research. When these components of research are aligned research design and planning are more coherent and research reports are more readable. This article reviews the process for checking for and improving consistency. Numerous examples of…

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zaks, D; Fletcher, R; Salamon, S

    Purpose: To develop an online framework that tracks a patient’s plan from initial simulation to treatment and that helps automate elements of the physics plan checks usually performed in the record and verify (RV) system and treatment planning system. Methods: We have developed PlanTracker, an online plan tracking system that automatically imports new patient tasks and follows them through treatment planning, physics checks, therapy check, and chart rounds. A survey was designed to collect information about the amount of time spent by medical physicists in non-physics related tasks. We then assessed these non-physics tasks for automation. Using these surveys, we directed our PlanTracker software development towards the automation of intra-plan physics review. We then conducted a systematic evaluation of PlanTracker’s accuracy by generating test plans in the RV system software designed to mimic real plans, in order to test its efficacy in catching errors both real and theoretical. Results: PlanTracker has proven to be an effective improvement to the clinical workflow in a radiotherapy clinic. We present data indicating that roughly 1/3 of the physics plan check can be automated and the workflow optimized, and show the functionality of PlanTracker. When the full system is in clinical use we will present data on improvement of time use in comparison to survey data prior to PlanTracker implementation. Conclusion: We have developed a framework for plan tracking and automatic checks in radiation therapy. We anticipate using PlanTracker as a basis for further development in clinical/research software. We hope that by eliminating the most simple and time consuming checks, medical physicists may be able to spend their time on plan quality and other physics tasks rather than in arithmetic and logic checks. We see this development as part of a broader initiative to advance the clinical/research informatics infrastructure surrounding the radiotherapy clinic. 
This research project has been financially supported by Varian Medical Systems, Palo Alto, CA, through a Varian MRA.
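The plan-tracking idea can be sketched as a simple workflow state machine. This is an assumed model: the stage names follow the abstract, but the code is not PlanTracker itself:

```python
# Sketch of stage tracking: a plan advances through a fixed workflow and
# each transition is validated (stage names taken from the abstract).

STAGES = ["simulation", "planning", "physics_check", "therapy_check", "chart_rounds"]

class PlanRecord:
    def __init__(self, patient_id):
        self.patient_id = patient_id
        self.stage = 0  # index into STAGES

    def advance(self):
        """Move to the next workflow stage and return its name."""
        if self.stage + 1 >= len(STAGES):
            raise ValueError("plan already at final stage")
        self.stage += 1
        return STAGES[self.stage]

rec = PlanRecord("TEST001")
rec.advance()          # simulation -> planning
stage = rec.advance()  # planning -> physics_check
```

The automated physics checks described in the abstract would hang off the "physics_check" transition, blocking advancement until they pass.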

  7. The design and implementation of postprocessing for depth map on real-time extraction system.

    PubMed

    Tang, Zhiwei; Li, Bin; Li, Huosheng; Xu, Zheng

    2014-01-01

    Depth estimation is a key technology in stereo vision. A real-time depth map can be obtained with dedicated hardware such as an FPGA, but hardware cannot implement algorithms as complicated as software can, because of restrictions in the hardware structure. Consequently, some incorrect stereo matches will inevitably occur during hardware-based depth estimation. To solve this problem, a postprocessing function is designed in this paper. After a matching-cost uniqueness test, both left-to-right and right-to-left consistency checks are implemented; the cavities in the depth map are then filled with valid depth values on the basis of the right-to-left consistency check. Experimental results show that depth-map extraction and postprocessing can be implemented in real time in the same system, and the quality of the resulting depth maps is satisfactory.
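The consistency-check and hole-filling steps can be sketched in software as follows. This is an illustration of the technique only; the paper's implementation is in FPGA hardware, and the disparity values here are invented:

```python
# Left-right disparity consistency check: a pixel passes if the right
# image's disparity at the matched location agrees with the left disparity.

def lr_consistency(d_left, d_right, max_diff=1):
    """Return the left disparity row with inconsistent pixels marked None."""
    out = []
    for x, d in enumerate(d_left):
        xr = x - d  # corresponding column in the right image
        if 0 <= xr < len(d_right) and abs(d_right[xr] - d) <= max_diff:
            out.append(d)
        else:
            out.append(None)  # occlusion or mismatch -> hole
    return out

def fill_holes(row):
    """Fill holes with the nearest valid disparity to the right."""
    filled = list(row)
    for x in range(len(filled) - 2, -1, -1):
        if filled[x] is None:
            filled[x] = filled[x + 1]
    return filled

d_left  = [1, 1, 1, 6, 1, 1, 1, 1]   # pixel 3 is a spurious match
d_right = [1, 1, 1, 1, 1, 1, 1, 1]
checked = lr_consistency(d_left, d_right)  # pixels 0 and 3 become holes
depth_row = fill_holes(checked)
```

Pixel 0 fails because its match falls outside the right image (a border occlusion), and pixel 3 fails because the disparities disagree; both are then filled from valid neighbours.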

  8. Modelling and Analysis of the Excavation Phase by the Theory of Blocks Method of Tunnel 4 Kherrata Gorge, Algeria

    NASA Astrophysics Data System (ADS)

    Boukarm, Riadh; Houam, Abdelkader; Fredj, Mohammed; Boucif, Rima

    2017-12-01

    The aim of our work is to check stability during tunnel excavation in the rock mass of Kherrata, on the route connecting the cities of Bejaia and Setif. Characterization through the Q system (Barton's method) and RMR (Bieniawski classification) allowed us to conclude that the rock mass quality is average in limestone and poor in fractured limestone. Modelling of the excavation phase using the block theory method (UNWEDGE software), with parameters taken from the classification recommendations, then allowed us to check stability and to conclude that the combined use of geomechanical classification and block theory can be considered reliable for preliminary design.

  9. Physico-chemical properties and sensory profile of durum wheat Dittaino PDO (Protected Designation of Origin) bread and quality of re-milled semolina used for its production.

    PubMed

    Giannone, Virgilio; Giarnetti, Mariagrazia; Spina, Alfio; Todaro, Aldo; Pecorino, Biagio; Summo, Carmine; Caponio, Francesco; Paradiso, Vito Michele; Pasqualone, Antonella

    2018-02-15

    To help future quality checks, we characterized the physico-chemical and sensory properties of Dittaino bread, a sourdough-based durum wheat bread recently awarded the Protected Designation of Origin mark, along with the quality features of the re-milled semolina used for its production. Semolina was checked for Falling Number (533-644 s), protein content (12.0-12.3 g/100 g d.m.), gluten content (9.7-10.5 g/100 g d.m.), yellow index (18.0-21.0), water absorption (59.3-62.3 g/100 g), farinograph dough stability (171-327 s), softening index (46-66 B.U.), alveograph W (193×10⁻⁴-223×10⁻⁴ J) and P/L (2.2-2.7). Accordingly, bread crumb was yellow, moderately hard (16.4-27.1 N) and chewy (88.2-109.2 N×mm), with low specific volume (2.28-3.03 mL/g). Bread aroma profile showed ethanol and acetic acid, followed by hexanol, 3-methyl-1-butanol, 2-phenylethanol, 3-methylbutanal, hexanal, benzaldehyde, and furfural. The sensory features were dominated by a thick brown crust, with marked toasted odor, coupled to yellow and consistent crumb, with coarse grain and well-perceivable sour taste and odor. Copyright © 2017 Elsevier Ltd. All rights reserved.
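As an example of the "future quality checks" this characterization is meant to support, measured values for a new batch could be compared against the reported ranges. The ranges below are taken from the abstract; the check itself and the batch data are a sketch:

```python
# Range check against the characterized semolina reference values
# (ranges from the abstract; parameter keys and batch data invented).

REFERENCE_RANGES = {
    "falling_number_s": (533, 644),
    "protein_g_per_100g_dm": (12.0, 12.3),
    "gluten_g_per_100g_dm": (9.7, 10.5),
}

def check_batch(measurements):
    """Return parameters falling outside the characterized range."""
    return [p for p, (lo, hi) in REFERENCE_RANGES.items()
            if not (lo <= measurements.get(p, lo) <= hi)]

batch = {"falling_number_s": 600, "protein_g_per_100g_dm": 11.5,
         "gluten_g_per_100g_dm": 10.0}
out_of_range = check_batch(batch)   # protein is below the characterized range
```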

  10. Poster — Thur Eve — 17: In-phantom and Fluence-based Measurements for Quality Assurance of Volumetric-driven Adaptation of Arc Therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schaly, B; Hoover, D; Mitchell, S

    2014-08-15

    During volumetric modulated arc therapy (VMAT) of head and neck cancer, some patients lose weight, which may result in anatomical deviations from the initial plan. If these deviations are substantial a new treatment plan can be designed for the remainder of treatment (i.e., adaptive planning). Since the adaptive treatment process is resource intensive, one possible approach to streamlining the quality assurance (QA) process is to use the electronic portal imaging device (EPID) to measure the integrated fluence for the adapted plans instead of the currently-used ArcCHECK device (Sun Nuclear). Although ArcCHECK is recognized as the clinical standard for patient-specific VMAT plan QA, it has limited length (20 cm) for most head and neck field apertures and has coarser detector spacing than the EPID (10 mm vs. 0.39 mm). In this work we compared measurement of the integrated fluence using the EPID with corresponding measurements from the ArcCHECK device. In the past year nine patients required an adapted plan. Each of the plans (the original and adapted) is composed of two arcs. Routine clinical QA was performed using the ArcCHECK device, and the same plans were delivered to the EPID (individual arcs) in integrated mode. The dose difference between the initial plan and adapted plan was compared for ArcCHECK and EPID. In most cases, it was found that the EPID is more sensitive in detecting plan differences. Therefore, we conclude that EPID provides a viable alternative for QA of the adapted head and neck plans and should be further explored.
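The comparison described, a dose difference between initial and adapted deliveries measured on each detector, can be sketched as a mean absolute difference over sampled fluence values. The samples below are toy numbers, not measured data:

```python
# Sketch of comparing initial vs. adapted deliveries on two detectors
# via mean absolute per-sample dose difference (all values invented).

def mean_abs_diff(initial, adapted):
    """Mean absolute difference between paired dose samples."""
    return sum(abs(a - b) for a, b in zip(initial, adapted)) / len(initial)

# toy integrated-fluence samples for the same plan pair on each detector
epid_initial, epid_adapted = [1.00, 1.02, 0.98], [0.95, 1.10, 0.90]
arccheck_initial, arccheck_adapted = [1.00, 1.02], [0.98, 1.05]

epid_diff = mean_abs_diff(epid_initial, epid_adapted)
arc_diff = mean_abs_diff(arccheck_initial, arccheck_adapted)
```

In this toy example the EPID samples show the larger mean difference, mirroring the abstract's finding that the finer-resolution detector is more sensitive to plan differences.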

  11. SU-E-T-240: Design and Implement of An Electronic Records Function for Treatment Plan Checked Meeting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, Q

    Purpose: To replace the paper records, we designed an electronic records function for the plan checked meeting in our in-house developed radiotherapy information management system (RTIMS). Methods: Since 2007, the RTIMS has been developed on a database and web service of Apache+PHP+MySQL, and almost all computers and smartphones can access the RTIMS through a web browser to input, search, count, and print the data. In 2012, we also established a radiation therapy case conference multi-media system (RTCCMMS) based on the Windows Remote Desktop feature. Since 2013, we have carried out the treatment plan checked meeting of the physics division every afternoon for about half an hour. In 2014, we designed an electronic records function, which includes a meeting information record and a checked plan record. The meeting record includes the following items: meeting date, name, place, length, status, attendee, content, etc. The plan record includes the following: meeting date, meeting name, patient ID, gender, age, patient name, course, plan, purpose, position, technique, CTsim type, plan type, primary doctor, other doctor, primary physicist, other physicist, difficulty, quality, score, opinion, status, note, etc. Results: In the past year, the electronic meeting records function has been successfully developed and implemented in the division, and it can be accessed from a smartphone. Almost all items have a corresponding pull-down menu selection, and each option tries to intelligently inherit a default value from the former record or another form. Data mining can be performed on the input data according to the items. It is available in both Chinese and English versions. Conclusion: It was demonstrated to be user-friendly and was proven to significantly improve the clinical efficiency and quality of treatment plans. 
Since the RTIMS is an in-house developed system, more functions can be added or modified to further enhance its potential in research and clinical practice. Project supported by the National Natural Science Foundation of China (Grant No. 81101694)

  12. Take the Reins on Model Quality with ModelCHECK and Gatekeeper

    NASA Technical Reports Server (NTRS)

    Jones, Corey

    2012-01-01

    Model quality and consistency has been an issue for us due to the diverse experience level and imaginative modeling techniques of our users. Fortunately, setting up ModelCHECK and Gatekeeper to enforce our best practices has helped greatly, but it wasn't easy. There were many challenges associated with setting up ModelCHECK and Gatekeeper including: limited documentation, restrictions within ModelCHECK, and resistance from end users. However, we consider ours a success story. In this presentation we will describe how we overcame these obstacles and present some of the details of how we configured them to work for us.

  13. Control of crankshaft finish by scattering technique

    NASA Astrophysics Data System (ADS)

    Fontani, Daniela; Francini, Franco; Longobardi, Giuseppe; Sansoni, Paola

    2001-06-01

    The paper describes a new sensor dedicated to measuring and checking the surface quality of mechanical products. The results were obtained by comparing, by means of 16 photodiodes, the light scattered into two different ranges of angles. The device is designed to obtain valid data from curved surfaces such as that of a crankshaft. Experimental measurements show that the ratio between scattered and reflected light intensity increases with surface roughness. The device was developed for off-tolerance detection of mechanical pieces in industrial production. Surface-quality measurements were carried out on crankshafts supplied by Renault.
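The off-tolerance decision implied by the abstract can be sketched as follows: since the ratio of scattered to reflected intensity rises with roughness, a part is accepted when the ratio stays below a calibrated threshold. The photodiode counts and the threshold here are invented:

```python
# Sketch of the scattered/reflected ratio test (all numbers invented).

def roughness_ok(scattered_counts, reflected_counts, threshold):
    """Sum the photodiode readings in each angular range; accept the part
    if the scattered-to-reflected intensity ratio is below the threshold."""
    ratio = sum(scattered_counts) / sum(reflected_counts)
    return ratio <= threshold, ratio

ok, ratio = roughness_ok(
    scattered_counts=[3, 4, 2, 3],      # wide-angle photodiodes
    reflected_counts=[40, 45, 42, 41],  # near-specular photodiodes
    threshold=0.10,
)
```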

  14. STS-34 onboard view of iodine comparator assembly used to check water quality

    NASA Technical Reports Server (NTRS)

    1989-01-01

    STS-34 closeup view taken onboard Atlantis, Orbiter Vehicle (OV) 104, is of the iodine comparator assembly. Potable water quality is checked by comparing the water color to the color chart on the surrounding board.

  15. Simulation-based MDP verification for leading-edge masks

    NASA Astrophysics Data System (ADS)

    Su, Bo; Syrel, Oleg; Pomerantsev, Michael; Hagiwara, Kazuyuki; Pearman, Ryan; Pang, Leo; Fujimara, Aki

    2017-07-01

    For IC design starts below the 20nm technology node, the assist features on photomasks shrink well below 60nm and the printed patterns of those features on masks written by VSB eBeam writers start to show a large deviation from the mask designs. Traditional geometry-based fracturing starts to show large errors for those small features. As a result, other mask data preparation (MDP) methods have become available and adopted, such as rule-based Mask Process Correction (MPC), model-based MPC and eventually model-based MDP. The new MDP methods may place shot edges slightly differently from target to compensate for mask process effects, so that the final patterns on a mask are much closer to the design (which can be viewed as the ideal mask), especially for those assist features. Such an alteration generally produces better masks that are closer to the intended mask design. Traditional XOR-based MDP verification cannot detect problems caused by eBeam effects. Much like model-based OPC verification which became a necessity for OPC a decade ago, we see the same trend in MDP today. Simulation-based MDP verification solution requires a GPU-accelerated computational geometry engine with simulation capabilities. To have a meaningful simulation-based mask check, a good mask process model is needed. The TrueModel® system is a field tested physical mask model developed by D2S. The GPU-accelerated D2S Computational Design Platform (CDP) is used to run simulation-based mask check, as well as model-based MDP. In addition to simulation-based checks such as mask EPE or dose margin, geometry-based rules are also available to detect quality issues such as slivers or CD splits. Dose margin related hotspots can also be detected by setting a correct detection threshold. In this paper, we will demonstrate GPU-acceleration for geometry processing, and give examples of mask check results and performance data. 
GPU-acceleration is necessary to make simulation-based mask MDP verification acceptable.

  16. User-friendly design approach for analog layout design

    NASA Astrophysics Data System (ADS)

    Li, Yongfu; Lee, Zhao Chuan; Tripathi, Vikas; Perez, Valerio; Ong, Yoong Seang; Hui, Chiu Wing

    2017-03-01

    Analog circuits are sensitive to changes in layout environment conditions, manufacturing processes, and variations. This paper presents an analog verification flow with five types of analog-focused layout constraint checks to assist engineers in identifying potential device mismatch and layout drawing mistakes. Compared with other solutions, our approach requires only the layout design, which is sufficient to recognize all matched devices. Our approach simplifies data preparation and allows seamless integration into the layout environment with minimum disruption to the custom layout flow. This user-friendly analog verification flow gives engineers more confidence in the quality of their layouts.

  17. The Boeing 747 fatigue integrity program

    NASA Technical Reports Server (NTRS)

    Spencer, M. M.

    1972-01-01

    The fatigue integrity program which was established to insure economic operations and to provide foundation data for inspection and maintenance is discussed. Significant features of the 747 fatigue integrity program are: (1) fatigue analyses which are continually updated to reflect design changes, fatigue test results, and static and flight load survey measurements; (2) material selection and detail design by using initial fatigue analyses, service experience, and testing; and (3) fatigue testing to check detail design quality and to verify the analyses, culminated by the test of a structurally complete airframe. Fatigue stress analyses were performed with the aid of experimental as well as analytical procedures. Extensive application was made of the stress severity factor, developed at Boeing, for evaluating peak stresses in complex joints. A frame of reference was established by families of structural fatigue performance curves (S-N curves) encompassing the range of materials and fatigue qualities anticipated for the 747 airplane design.

  18. [Cleaning and disinfection in nursing homes. Data on quality of structure, process and outcome in nursing homes in Frankfurt am Main, Germany, 2011].

    PubMed

    Heudorf, U; Gasteyer, S; Samoiski, Y; Voigt, K

    2012-08-01

    Due to the Infectious Disease Prevention Act, public health services in Germany are obliged to check infection prevention in hospitals and other medical facilities as well as in nursing homes. In Frankfurt/Main, Germany, standardized control visits have been performed for many years. In 2011 the focus was laid on cleaning and disinfection of surfaces. All 41 nursing homes were checked according to a standardized checklist covering quality of structure (i.e. staffing, hygiene concept), quality of process (observation of the cleaning processes in the homes) and quality of outcome, which was monitored by checking for the removal of fluorescent marks that had been applied some days before the final check and should have been removed by cleaning in the intervening days. In more than two thirds of the homes, cleaning personnel were salaried; in one third external personnel were hired. Of the homes 85% provided service clothing and all of them offered protective clothing. All homes had established hygiene and cleaning concepts; however, in 15% of the homes concepts for the handling of norovirus and in 30% concepts for the handling of Clostridium difficile were missing. Regarding process quality, only half of the processes observed, i.e. cleaning of hand-contact surfaces such as handrails, washing areas and bins, were correct. Only 44% of the cleaning controls were correct, with enormous differences between the homes (0-100%). The correlation between quality of process and quality of outcome was significant. There was good quality of structure in the homes, but regarding quality of process and outcome there was great need for improvement. This was especially due to faults in communication and coordination between cleaning personnel and nursing personnel. Quality of outcome was associated neither with the number of places for residents nor with staffing. 
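The outcome measure described, the share of fluorescent marks removed by cleaning, reduces to a simple pass-rate calculation. The data below are invented for illustration:

```python
# Sketch of the fluorescent-mark outcome measure: marks applied before
# cleaning should be gone at the final check (data invented).

def cleaning_pass_rate(marks):
    """marks: list of booleans, True = mark was removed by cleaning.
    Returns the percentage of controls that were correct."""
    return 100.0 * sum(marks) / len(marks)

home_a = [True, True, False, True]      # 75% of controls correct
home_b = [False, False, False, False]   # 0%, as seen in the worst homes
rate_a = cleaning_pass_rate(home_a)
rate_b = cleaning_pass_rate(home_b)
```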
Thus, not only quality of structure but also quality of process and outcome should be checked by the public health services.

  19. Drug supply in in-patient nursing care facilities: reasons for irregularities in quality reviews

    PubMed

    Meinck, Matthias; Ernst, Friedemann; Pippel, Kristina; Gehrke, Jörg; Coners, Elise

    2017-01-01

    Background: Quality checks of the independent German Health Insurance Medical Service in in-patient nursing care facilities pursuant to Articles 114 et seqq. SGB XI [11th Book of the Social Code] also comprise the Pflegerische Medikamentenversorgung (PMV) [drug supply by nursing personnel]. Irregularities are described in quality reports in the reviewer’s own words. This investigation was intended to categorise the reasons for the above irregularities. Methods: The basis for the examination is the reports of quality checks of all in-patient nursing care facilities conducted in 2014 (regular quality checks) in Hamburg and Schleswig-Holstein (N = 671), in which the PMV was examined for 5 742 randomly selected residents. Results: With regard to the documentation, inexplicable drug intakes (5.8 %) were found most frequently, followed by missing information on dosages and application provisions (0.8 % each), which were registered as irregularities for the residents. In the documentation of on-demand medication, insufficient indication data (3.2 %), missing daily maximum dosages (0.8 %) and missing single doses (0.6 %) were most commonly ascertained. The most frequent reasons for medication handling irregularities for the residents were incorrect storage (6.0 %), missing or incorrect data on consumption and on when the medical packaging was opened (3.5 %), and medication not administered directly from the blister (0.7 %). Among the subcategories of incorrect storage, incorrect dosages were revealed most often, followed by drugs past their expiry date and by out-of-stock drugs. Systematic patient-related factors with influence on the PMV could not be determined. Conclusions: The extent and type of the irregularities prompt a further increase in the efforts to improve the quality of nursing care facilities. The results can be used as a basis for designing specific initiatives to improve the PMV.

  20. Litho hotspots fixing using model based algorithm

    NASA Astrophysics Data System (ADS)

    Zhang, Meili; Yu, Shirui; Mao, Zhibiao; Shafee, Marwa; Madkour, Kareem; ElManhawy, Wael; Kwan, Joe; Hu, Xinyi; Wan, Qijian; Du, Chunshan

    2017-04-01

    As technology advances, IC designs are getting more sophisticated, thus it becomes more critical and challenging to fix printability issues in the design flow. Running lithography checks before tapeout is now mandatory for designers, which creates a need for more advanced and easy-to-use techniques for fixing hotspots found after lithographic simulation without creating a new design rule checking (DRC) violation or generating a new hotspot. This paper presents a new methodology for fixing hotspots on layouts while using the same engine currently used to detect the hotspots. The fix is achieved by applying minimum movement of edges causing the hotspot, with consideration of DRC constraints. The fix is internally simulated by the lithographic simulation engine to verify that the hotspot is eliminated and that no new hotspot is generated by the new edge locations. Hotspot fix checking is enhanced by adding DRC checks to the litho-friendly design (LFD) rule file to guarantee that any fix options that violate DRC checks are removed from the output hint file. This extra checking eliminates the need to re-run both DRC and LFD checks to ensure the change successfully fixed the hotspot, which saves time and simplifies the designer's workflow. This methodology is demonstrated on industrial designs, where the fixing rate of single and dual layer hotspots is reported.
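The fixing loop described can be illustrated with a 1-D toy model, which is not the paper's engine: try increasing edge displacements and keep the smallest one that clears the simulated hotspot without violating a DRC spacing rule. All geometry, rules, and thresholds here are invented:

```python
# 1-D toy model of minimum-movement hotspot fixing with a DRC constraint.

MIN_SPACE = 4  # toy DRC rule: edges must stay >= 4 units apart

def is_hotspot(edge, neighbor):
    """Toy litho model: edges closer than 6 units print badly."""
    return abs(neighbor - edge) < 6

def fix_edge(edge, neighbor, max_move=5):
    """Try increasing displacements; return the first edge position that
    clears the hotspot and still satisfies the DRC spacing rule."""
    for move in range(1, max_move + 1):
        candidate = edge - move  # move the edge away from its neighbor
        gap = abs(neighbor - candidate)
        if not is_hotspot(candidate, neighbor) and gap >= MIN_SPACE:
            return candidate
    return None  # no legal fix within the allowed movement

fixed = fix_edge(edge=10, neighbor=14)  # gap of 4 is a hotspot; move to gap 6
```

The real flow performs the equivalent search in 2-D with full lithographic simulation of each candidate and re-checks DRC and neighboring hotspots before accepting a move.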

  1. 31 CFR 240.6 - Provisional credit; first examination; declination; final payment.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... alteration without examining the original check or a better quality image of the check and Treasury is on... after the check is presented to a Federal Reserve Processing Center for payment, Treasury will be deemed...

  2. 31 CFR 240.6 - Provisional credit; first examination; declination; final payment.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... alteration without examining the original check or a better quality image of the check and Treasury is on... after the check is presented to a Federal Reserve Processing Center for payment, Treasury will be deemed...

  3. 31 CFR 240.6 - Provisional credit; first examination; declination; final payment.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... alteration without examining the original check or a better quality image of the check and Treasury is on... after the check is presented to a Federal Reserve Processing Center for payment, Treasury will be deemed...

  4. 31 CFR 240.6 - Provisional credit; first examination; declination; final payment.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... alteration without examining the original check or a better quality image of the check and Treasury is on... after the check is presented to a Federal Reserve Processing Center for payment, Treasury will be deemed...

  5. 31 CFR 240.6 - Provisional credit; first examination; declination; final payment.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... alteration without examining the original check or a better quality image of the check and Treasury is on... after the check is presented to a Federal Reserve Processing Center for payment, Treasury will be deemed...

  6. Quality control quantification (QCQ): a tool to measure the value of quality control checks in radiation oncology.

    PubMed

    Ford, Eric C; Terezakis, Stephanie; Souranis, Annette; Harris, Kendra; Gay, Hiram; Mutic, Sasa

    2012-11-01

    To quantify the error-detection effectiveness of commonly used quality control (QC) measures. We analyzed incidents from 2007-2010 logged into voluntary in-house electronic incident learning systems at 2 academic radiation oncology clinics. None of the incidents resulted in patient harm. Each incident was graded for potential severity using the French Nuclear Safety Authority scoring scale; high potential severity incidents (score >3) were considered, along with a subset of 30 randomly chosen low severity incidents. Each report was evaluated to identify which of 15 common QC checks could have detected it. The effectiveness was calculated, defined as the percentage of incidents that each QC measure could detect, both for individual QC checks and for combinations of checks. In total, 4407 incidents were reported, 292 of which had high potential severity. High- and low-severity incidents were detectable by 4.0 ± 2.3 (mean ± SD) and 2.6 ± 1.4 QC checks, respectively (P<.001). All individual checks were less than 50% sensitive with the exception of pretreatment plan review by a physicist (63%). An effectiveness of 97% was achieved with 7 checks used in combination and was not further improved with more checks. The combination of checks with the highest effectiveness includes physics plan review, physician plan review, Electronic Portal Imaging Device-based in vivo portal dosimetry, radiation therapist timeout, weekly physics chart check, the use of checklists, port films, and source-to-skin distance checks. Some commonly used QC checks such as pretreatment intensity modulated radiation therapy QA do not substantially add to the ability to detect errors in these data. The effectiveness of QC measures in radiation oncology depends sensitively on which checks are used and in which combinations. A small percentage of errors cannot be detected by any of the standard formal QC checks currently in broad use, suggesting that further improvements are needed. 
These data require confirmation with a broader incident-reporting database. Copyright © 2012 Elsevier Inc. All rights reserved.
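The effectiveness figure used in this study, the percentage of incidents detectable by at least one check in a given combination, reduces to simple set arithmetic over an incident-to-checks table. A minimal sketch with invented incident data:

```python
def effectiveness(detections, checks):
    """Percent of incidents detectable by at least one check in `checks`.

    detections maps incident id -> set of QC checks able to detect it.
    """
    if not detections:
        return 0.0
    caught = sum(1 for found_by in detections.values() if found_by & set(checks))
    return 100.0 * caught / len(detections)

# Hypothetical incident log: which checks could have caught each incident.
incidents = {
    1: {"physics plan review", "therapist timeout"},
    2: {"physician plan review"},
    3: {"in vivo dosimetry", "physics plan review"},
    4: set(),  # detectable by none of the standard checks
}

single = effectiveness(incidents, ["physics plan review"])           # → 50.0
combo = effectiveness(incidents, ["physics plan review",
                                  "physician plan review"])          # → 75.0
```

Incident 4 illustrates the paper's closing point: some errors escape every formal check, capping the achievable combined effectiveness below 100%.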

  7. The relation of mechanical properties of wood and nosebar pressure in the production of veneer

    Treesearch

    Charles W. McMillin

    1958-01-01

    Observations of checking frequency, depth of check penetration, veneer thickness, and surface quality were made at 20 machining conditions. An inverse relationship between depth of check and frequency of checking was established. The effect of cutting temperature was demonstrated, and strength in compression perpendicular to the grain, tension perpendicular to the...

  8. The Effect of the MassHealth Hospital Pay-for-Performance Program on Quality

    PubMed Central

    Ryan, Andrew M; Blustein, Jan

    2011-01-01

    Objective To test the effect of Massachusetts Medicaid's (MassHealth) hospital-based pay-for-performance (P4P) program, implemented in 2008, on quality of care for pneumonia and surgical infection prevention (SIP). Data Hospital Compare process of care quality data from 2004 to 2009 for acute care hospitals in Massachusetts (N = 62) and other states (N = 3,676) and American Hospital Association data on hospital characteristics from 2005. Study Design Panel data models with hospital fixed effects and hospital-specific trends are estimated to test the effect of P4P on composite quality for pneumonia and SIP. This base model is extended to control for the completeness of measure reporting. Further sensitivity checks include estimation with propensity-score matched control hospitals, excluding hospitals in other P4P programs, varying the time period during which the program was assumed to have an effect, and testing the program effect across hospital characteristics. Principal Findings Estimates from our preferred specification, including hospital fixed effects, trends, and the control for measure completeness, indicate small and nonsignificant program effects for pneumonia (−0.67 percentage points, p>.10) and SIP (−0.12 percentage points, p>.10). Sensitivity checks indicate a similar pattern of findings across specifications. Conclusions Despite offering substantial financial incentives, the MassHealth P4P program did not improve quality in the first years of implementation. PMID:21210796
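The panel estimator described above (hospital fixed effects plus a program indicator) can be illustrated, in simplified form without the hospital-specific trend terms, by a two-way within transformation on a synthetic balanced panel. All numbers below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(42)
n_hosp, n_years, true_effect = 20, 6, 2.0

# Synthetic balanced panel: y = hospital effect + year effect + effect*treated + noise.
hosp_fe = rng.normal(0, 1, n_hosp)[:, None]
year_fe = rng.normal(0, 1, n_years)[None, :]
treat = np.zeros((n_hosp, n_years))
treat[:8, 3:] = 1.0          # first 8 hospitals join the P4P program in year 3
y = hosp_fe + year_fe + true_effect * treat + rng.normal(0, 0.1, (n_hosp, n_years))

def within(a):
    # Two-way within transformation (exact for balanced panels): removes
    # unit means, time means, and adds back the grand mean.
    return a - a.mean(axis=1, keepdims=True) - a.mean(axis=0, keepdims=True) + a.mean()

y_t, x_t = within(y), within(treat)
beta_hat = (x_t * y_t).sum() / (x_t * x_t).sum()   # OLS slope on transformed data
```

Because both fixed effects are additive, the within transformation removes them exactly, and the slope recovers the program effect up to noise.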

  9. Modeling and optimization of joint quality for laser transmission joint of thermoplastic using an artificial neural network and a genetic algorithm

    NASA Astrophysics Data System (ADS)

    Wang, Xiao; Zhang, Cheng; Li, Pin; Wang, Kai; Hu, Yang; Zhang, Peng; Liu, Huixia

    2012-11-01

    A central composite rotatable design (CCRD) was used to design experiments for laser transmission joining of thermoplastic polycarbonate (PC). An artificial neural network was used to establish the relationships between the laser transmission joining process parameters (laser power, velocity, clamp pressure, scanning number) and the joint strength and joint seam width. The developed models are tested with the analysis of variance (ANOVA) method to check their adequacy, and the effects of process parameters on the responses, as well as the interaction effects of key process parameters on joint quality, are analyzed and discussed. Finally, the desirability function coupled with a genetic algorithm is used to optimize the joint strength and joint width. The predicted results of the optimization are in good agreement with the experimental results, so this study provides an effective method to enhance joint quality.
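The optimization step combines a desirability function with a genetic algorithm. A minimal sketch follows; the two surrogate response functions below are invented toy stand-ins for the trained neural network, and the GA operators (truncation selection, blend crossover, Gaussian mutation) are one common choice, not necessarily the paper's:

```python
import random

random.seed(1)

# Toy surrogates standing in for the trained ANN (invented response shapes):
# joint strength (larger is better) and seam width (target ~ 1.0 mm).
def strength(power, speed):
    return 1.0 - (power - 0.6) ** 2 - (speed - 0.4) ** 2

def width(power, speed):
    return 0.5 + power - 0.3 * speed

def desirability(power, speed):
    # Larger-is-better for strength, target-is-best for width; combine by
    # geometric mean so a poor response drags the overall score down.
    d_strength = max(0.0, min(1.0, strength(power, speed)))
    d_width = max(0.0, 1.0 - abs(width(power, speed) - 1.0))
    return (d_strength * d_width) ** 0.5

def genetic_optimize(fitness, generations=40, pop_size=30, mut=0.1):
    pop = [(random.random(), random.random()) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda ind: fitness(*ind), reverse=True)
        parents = pop[: pop_size // 2]            # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            w = random.random()                   # blend crossover + mutation
            child = tuple(min(1.0, max(0.0, w * x + (1 - w) * y +
                                        random.gauss(0, mut)))
                          for x, y in zip(a, b))
            children.append(child)
        pop = parents + children
    return max(pop, key=lambda ind: fitness(*ind))

best = genetic_optimize(desirability)
```

Since parents survive each generation, the best solution found is never lost, and the GA converges towards the parameter region that maximizes overall desirability.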

  10. Seeking more Opportunities of Check Dams' harmony with nearby Circumstances via Design Thinking Process

    NASA Astrophysics Data System (ADS)

    Lin, Huan-Chun; Chen, Su-Chin; Tsai, Chen-Chen

    2014-05-01

    Engineering design encompasses both science and art. The art aspect, however, is discussed too little, which can lead to structures, check dams among them, that clash with their natural surroundings. This study seeks more opportunities for check dams to harmonise with nearby circumstances. Based on a review of the literature in philosophy and cognitive science, we suggest a three-phase thinking process for check dam design. The first phase, conceptualization, lists the critical problems, such as the characteristics of erosion or deposition, and translates them into goal situations. The second phase, transformation, uses cognitive methods such as analogy, association and metaphor to shape an image and prototypes. The third phase, formation, decides the details of the construction, such as the stability and safety analysis of shapes or materials. Taiwan's technical codes and papers on check dam design mostly emphasize the first and third phases and largely lack the second. We emphasize that designers should not ignore any phase of the framework, especially the second one, or they may miss more suitable solutions. The framework is simple to apply, and we consider it a useful tool for designing check dams that harmonise with the surrounding natural landscape. Key Words: check dams, design thinking process, conceptualization, transformation, formation.

  11. Quality Assurance in the Presence of Variability

    NASA Astrophysics Data System (ADS)

    Lauenroth, Kim; Metzger, Andreas; Pohl, Klaus

    Software Product Line Engineering (SPLE) is a reuse-driven development paradigm that has been applied successfully in information system engineering and other domains. Quality assurance of the reusable artifacts of the product line (e.g. requirements, design, and code artifacts) is essential for successful product line engineering. As those artifacts are reused in several products, a defect in a reusable artifact can affect several products of the product line. A central challenge for quality assurance in product line engineering is how to consider product line variability. Since the reusable artifacts contain variability, quality assurance techniques from single-system engineering cannot directly be applied to those artifacts. Therefore, different strategies and techniques have been developed for quality assurance in the presence of variability. In this chapter, we describe those strategies and discuss in more detail one of them, the so-called comprehensive strategy. The comprehensive strategy aims at checking the quality of all possible products of the product line and thus offers the highest benefits, since it is able to uncover defects in all possible products of the product line. However, the central challenge for applying the comprehensive strategy is the complexity that results from the product line variability and the large number of potential products of a product line. In this chapter, we present one concrete technique, developed to implement the comprehensive strategy, that addresses this challenge. The technique is based on model checking technology and allows for a comprehensive verification of domain artifacts against temporal logic properties.
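The comprehensive strategy, checking every product the variability allows, can be illustrated with a toy feature model. The enumeration below stands in for the chapter's model-checking technique, which verifies temporal-logic properties rather than a simple predicate; the variation points and the rule are invented:

```python
from itertools import product as cartesian

# Toy feature model: each variation point lists its alternative features.
variation_points = {
    "payment": ["card", "invoice"],
    "logging": ["none", "audit"],
}

# A quality property every derivable product must satisfy (invented rule):
# products that accept card payment must include audit logging.
def satisfies(config):
    return not (config["payment"] == "card" and config["logging"] == "none")

def check_all_products(points, prop):
    """Comprehensive strategy: verify the property on every possible product."""
    names = list(points)
    failures = []
    for combo in cartesian(*(points[n] for n in names)):
        config = dict(zip(names, combo))
        if not prop(config):
            failures.append(config)
    return failures

bad = check_all_products(variation_points, satisfies)
# → [{'payment': 'card', 'logging': 'none'}]
```

The exponential growth of the product space with the number of variation points is exactly the complexity problem the chapter's technique is designed to tame.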

  12. Non-invasive quality evaluation of confluent cells by image-based orientation heterogeneity analysis.

    PubMed

    Sasaki, Kei; Sasaki, Hiroto; Takahashi, Atsuki; Kang, Siu; Yuasa, Tetsuya; Kato, Ryuji

    2016-02-01

    In recent years, cell and tissue therapies in regenerative medicine have advanced rapidly towards commercialization. However, conventional invasive cell quality assessment is incompatible with direct evaluation of the cells produced for such therapies, especially in the case of regenerative medicine products. Our group has demonstrated the potential of quantitative assessment of cell quality, using information obtained from cell images, for non-invasive real-time evaluation of regenerative medicine products. However, images of cells in the confluent state are often difficult to evaluate, because accurate recognition of cells is technically difficult and the morphological features of confluent cells are non-characteristic. To overcome these challenges, we developed a new image-processing algorithm, heterogeneity of orientation (H-Orient) processing, to describe the heterogeneous density of cells in the confluent state. In this algorithm, we introduced a Hessian calculation that converts pixel intensity data to orientation data and a statistical profiling calculation that evaluates the heterogeneity of orientations within an image, generating novel parameters that yield a quantitative profile of an image. Using such parameters, we tested the algorithm's performance in discriminating different qualities of cellular images with three types of clinically important cell quality check (QC) models: remaining lifespan check (QC1), manipulation error check (QC2), and differentiation potential check (QC3). Our results show that our orientation analysis algorithm could predict with high accuracy the outcomes of all types of cellular quality checks (>84% average accuracy with cross-validation). Copyright © 2015 The Society for Biotechnology, Japan. Published by Elsevier B.V. All rights reserved.
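The paper's exact H-Orient algorithm is not reproducible from the abstract alone, but the general idea (derive a per-pixel orientation from the image Hessian, then quantify how heterogeneous those orientations are) can be sketched as follows. Using the dominant Hessian eigenvector for the orientation and circular variance for the heterogeneity statistic are assumptions of this sketch, not the published method:

```python
import numpy as np

def orientation_heterogeneity(img):
    """Crude stand-in for H-Orient: per-pixel orientation from the image
    Hessian's dominant eigenvector, summarized by circular variance (0..1)."""
    img = img.astype(float)
    gy, gx = np.gradient(img)          # np.gradient returns (d/dy, d/dx)
    _, gxx = np.gradient(gx)           # second derivative along x
    gyy, gxy = np.gradient(gy)         # second derivative along y, mixed term
    # Eigenvalues of the symmetric 2x2 Hessian [[gxx, gxy], [gxy, gyy]].
    half_trace = (gxx + gyy) / 2.0
    root = np.sqrt(((gxx - gyy) / 2.0) ** 2 + gxy ** 2)
    lam1, lam2 = half_trace + root, half_trace - root
    lam = np.where(np.abs(lam1) >= np.abs(lam2), lam1, lam2)
    theta = np.arctan2(gxy, lam - gyy)  # angle of the dominant eigenvector
    z = np.exp(2j * theta)              # double angle: orientations mod pi
    return float(1.0 - np.abs(z.mean()))

# Parallel stripes share one orientation; noise mixes many orientations.
stripes = np.tile(np.sin(np.linspace(0, 4 * np.pi, 64)), (64, 1))
noise = np.random.default_rng(0).random((64, 64))
h_uniform = orientation_heterogeneity(stripes)  # near 0: one dominant orientation
h_mixed = orientation_heterogeneity(noise)      # larger: heterogeneous orientations
```

A confluent monolayer with locally aligned cells would score low on this statistic, while disordered or damaged regions would push it up.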

  13. Financial Record Checking in Surveys: Do Prompts Improve Data Quality?

    ERIC Educational Resources Information Center

    Murphy, Joe; Rosen, Jeffrey; Richards, Ashley; Riley, Sarah; Peytchev, Andy; Lindblad, Mark

    2016-01-01

    Self-reports of financial information in surveys, such as wealth, income, and assets, are particularly prone to inaccuracy. We sought to improve the quality of financial information captured in a survey conducted by phone and in person by encouraging respondents to check records when reporting on income and assets. We investigated whether…

  14. 40 CFR Appendix B to Part 75 - Quality Assurance and Quality Control Procedures

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... Systems 1.2.1 Calibration Error Test and Linearity Check Procedures Keep a written record of the procedures used for daily calibration error tests and linearity checks (e.g., how gases are to be injected..., and when calibration adjustments should be made). Identify any calibration error test and linearity...

  15. 40 CFR Appendix B to Part 75 - Quality Assurance and Quality Control Procedures

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... Systems 1.2.1 Calibration Error Test and Linearity Check Procedures Keep a written record of the procedures used for daily calibration error tests and linearity checks (e.g., how gases are to be injected..., and when calibration adjustments should be made). Identify any calibration error test and linearity...

  16. Analyses of Blood Bank Efficiency, Cost-Effectiveness and Quality

    NASA Astrophysics Data System (ADS)

    Lam, Hwai-Tai Chen

    In view of the increasing costs of hospital care, it is essential to investigate methods to improve the labor efficiency and the cost-effectiveness of the hospital technical core in order to control costs while maintaining the quality of care. This study was conducted to develop indices to measure efficiency, cost-effectiveness, and the quality of blood banks; to identify factors associated with efficiency, cost-effectiveness, and quality; and to generate strategies to improve blood bank labor efficiency and cost-effectiveness. Indices developed in this study for labor efficiency and cost-effectiveness were not affected by patient case mix and illness severity. Factors that were associated with labor efficiency were identified as managerial styles, and organizational designs that balance workload and labor resources. Medical directors' managerial involvement was not associated with labor efficiency, but their continuing education and specialty in blood bank were found to reduce the performance of unnecessary tests. Surprisingly, performing unnecessary tests had no association with labor efficiency. This suggested the existence of labor slack in blood banks. Cost-effectiveness was associated with workers' benefits, wages, and the production of high-end transfusion products by hospital-based donor rooms. Quality indices used in this study included autologous transfusion rates, platelet transfusion rates, and the check points available in an error-control system. Because the autologous transfusion rate was related to patient case mix, severity of illness, and possible inappropriate transfusion, it was not recommended as a quality index. Platelet-pheresis transfusion rates were associated with the transfusion preferences of the blood bank medical directors. The total number of check points in an error-control system was negatively associated with government ownership and workers' experience. 
Recommendations for improving labor efficiency and cost-effectiveness were focused on an incentive system that encourages team effort, and the use of appropriate measurements for laboratory efficiency and operational system designs.

  17. Power Grid Maintenance Scheduling Intelligence Arrangement Supporting System Based on Power Flow Forecasting

    NASA Astrophysics Data System (ADS)

    Xie, Chang; Wen, Jing; Liu, Wenying; Wang, Jiaming

    With the development of intelligent dispatching, the intelligence level of the full range of network control center services urgently needs to be raised. Maintenance scheduling is an important part of the daily work of a network control center, and its intelligent arrangement is essential to the high-quality and safe operation of the power grid. After analyzing the shortcomings of traditional maintenance scheduling software, this paper designs a power grid maintenance scheduling intelligence arrangement supporting system based on power flow forecasting, which applies advanced technologies such as artificial intelligence, online security checking and intelligent visualization to maintenance scheduling. It implements online security checking of maintenance schedules based on power flow forecasting, and power flow adjustment based on visualization, making the maintenance scheduling arrangement more intelligent and visual.

  18. 31 CFR 235.1 - Scope of regulations.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ..., DEPARTMENT OF THE TREASURY FINANCIAL MANAGEMENT SERVICE ISSUANCE OF SETTLEMENT CHECKS FOR FORGED CHECKS DRAWN... checks for checks drawn on designated depositaries of the United States by accountable officers of the...

  19. Methods to achieve high interrater reliability in data collection from primary care medical records.

    PubMed

    Liddy, Clare; Wiens, Miriam; Hogg, William

    2011-01-01

    We assessed interrater reliability (IRR) of chart abstractors within a randomized trial of cardiovascular care in primary care. We report our findings, and outline issues and provide recommendations related to determining sample size, frequency of verification, and minimum thresholds for 2 measures of IRR: the κ statistic and percent agreement. We designed a data quality monitoring procedure having 4 parts: use of standardized protocols and forms, extensive training, continuous monitoring of IRR, and a quality improvement feedback mechanism. Four abstractors checked a 5% sample of charts at 3 time points for a predefined set of indicators of the quality of care. We set our quality threshold for IRR at a κ of 0.75, a percent agreement of 95%, or both. Abstractors reabstracted a sample of charts in 16 of 27 primary care practices, checking a total of 132 charts with 38 indicators per chart. The overall κ across all items was 0.91 (95% confidence interval, 0.90-0.92) and the overall percent agreement was 94.3%, signifying excellent agreement between abstractors. We gave feedback to the abstractors to highlight items that had a κ of less than 0.70 or a percent agreement less than 95%. No practice had to have its charts abstracted again because of poor quality. A 5% sampling of charts for quality control using IRR analysis yielded κ and agreement levels that met or exceeded our quality thresholds. Using 3 time points during the chart audit phase allows for early quality control as well as ongoing quality monitoring. Our results can be used as a guide and benchmark for other medical chart review studies in primary care.
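The two IRR measures used in this study, the κ statistic and percent agreement, are standard; for two abstractors over binary chart indicators they can be computed as below. The ratings are hypothetical:

```python
from collections import Counter

def percent_agreement(r1, r2):
    """Share of items on which the two abstractors agree, as a percentage."""
    agree = sum(a == b for a, b in zip(r1, r2))
    return 100.0 * agree / len(r1)

def cohens_kappa(r1, r2):
    """Chance-corrected agreement (Cohen's kappa) for two raters."""
    n = len(r1)
    p_obs = sum(a == b for a, b in zip(r1, r2)) / n
    c1, c2 = Counter(r1), Counter(r2)
    # Expected agreement if both raters rated independently at their own rates.
    p_exp = sum(c1[k] * c2[k] for k in set(r1) | set(r2)) / (n * n)
    return (p_obs - p_exp) / (1.0 - p_exp)

# Hypothetical re-abstraction of 10 binary chart indicators by two abstractors.
rater_a = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]
rater_b = [1, 1, 0, 1, 0, 1, 0, 0, 1, 1]
```

Here agreement is 90% and κ ≈ 0.78, illustrating why the study tracks both: percent agreement can look high even when κ, which discounts chance agreement, falls below a quality threshold such as 0.75.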

  20. Quality by Design (QbD) Approach for Development of Co-Processed Excipient Pellets (MOMLETS) By Extrusion-Spheronization Technique.

    PubMed

    Patel, Hetal; Patel, Kishan; Tiwari, Sanjay; Pandey, Sonia; Shah, Shailesh; Gohel, Mukesh

    2016-01-01

    Microcrystalline cellulose (MCC) is an excellent excipient for the production of pellets by extrusion spheronization. However, it slows the release of poorly water-soluble drugs from pellets. Co-processed excipients prepared by spray drying (US4744987; US5686107; WO2003051338) and by coprecipitation (WO9517831) have been patented. The objective of the present study was to develop co-processed MCC pellets (MOMLETS) by the extrusion-spheronization technique using the principle of Quality by Design (QbD). Co-processed excipient core pellets (MOMLETS) were developed by the extrusion spheronization technique using a QbD approach. A BCS class II drug (telmisartan) was layered onto them in a fluidized bed processor. The Quality Target Product Profile (QTPP) and Critical Quality Attributes (CQA) for the pellets were identified. Risk assessment was reported using an Ishikawa diagram. A Plackett-Burman design was used to check the effect of seven independent variables (superdisintegrant, extruder speed, ethanol:water ratio, spheronizer speed, extruder screen, pore former and MCC:lactose ratio) on percentage drug release at 30 min. A Pareto chart and a normal probability plot were constructed to identify the significant factors. A Box-Behnken design (BBD) using the three most significant factors (extruder screen size, type of superdisintegrant and type of pore former) was used as an optimization design. A control space was identified in which the desired pellet quality can be obtained. Co-processed excipient core pellets (MOMLETS) were successfully developed by the QbD approach. Versatility, industrial scalability and simplicity are the main features of the proposed research. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.
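The screening step above uses a Plackett-Burman design for seven two-level factors. The classical 8-run design can be generated from a cyclic seed row as sketched below; the mapping of factors to columns is illustrative:

```python
# Classical 8-run Plackett-Burman construction for 7 two-level factors:
# cyclically rotate the seed row, then append a row of all low levels.
seed = [1, 1, 1, -1, 1, -1, -1]        # +1 = high level, -1 = low level
design = [seed[-i:] + seed[:-i] for i in range(7)]   # seed[-0:] is the full list
design.append([-1] * 7)

factors = ["superdisintegrant", "extruder speed", "ethanol:water ratio",
           "spheronizer speed", "extruder screen", "pore former", "MCC:lactose"]
for run, levels in enumerate(design, 1):
    print(run, dict(zip(factors, levels)))
```

Each column is balanced (four high, four low runs) and orthogonal to every other column, which is what lets seven main effects be screened in only eight runs.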

  1. A systematic composite service design modeling method using graph-based theory.

    PubMed

    Elhag, Arafat Abdulgader Mohammed; Mohamad, Radziah; Aziz, Muhammad Waqar; Zeshan, Furkh

    2015-01-01

    The composite service design modeling is an essential process of the service-oriented software development life cycle, where the candidate services, composite services, operations and their dependencies are required to be identified and specified before their design. However, a systematic service-oriented design modeling method for composite services is still in its infancy as most of the existing approaches provide the modeling of atomic services only. For these reasons, a new method (ComSDM) is proposed in this work for modeling the concept of service-oriented design to increase the reusability and decrease the complexity of the system while keeping the service composition considerations in mind. Furthermore, the ComSDM method provides the mathematical representation of the components of service-oriented design using graph-based theory to facilitate the design quality measurement. To demonstrate that the ComSDM method is also suitable for composite service design modeling of distributed embedded real-time systems along with enterprise software development, it is implemented in the case study of a smart home. The results of the case study not only check the applicability of ComSDM, but can also be used to validate the complexity and reusability of ComSDM. This also guides future research towards design quality measurement, such as using the ComSDM method to measure the quality of composite service design in service-oriented software systems.
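ComSDM's formal graph representation is not detailed in the abstract, but the underlying idea of deriving design quality indicators from a service dependency graph can be sketched with simple fan-in and density measures. The services and metrics below are illustrative, not ComSDM's own:

```python
# Composite service design as a directed dependency graph:
# an edge (a, b) means service a invokes service b.
edges = [("Billing", "Auth"), ("Orders", "Auth"),
         ("Orders", "Inventory"), ("Dashboard", "Orders")]

services = {s for e in edges for s in e}
fan_in = {s: sum(1 for _, dst in edges if dst == s) for s in services}

# Simple indicators (illustrative, not ComSDM's exact metrics):
# a service with fan-in > 1 is reused; density grows with coupling.
reused = [s for s, n in fan_in.items() if n > 1]
density = len(edges) / (len(services) * (len(services) - 1))
```

Here `Auth` is reused by two composites, and the low edge density suggests loose coupling; a graph formalism makes such reusability and complexity properties measurable directly from the design model.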

  2. A Systematic Composite Service Design Modeling Method Using Graph-Based Theory

    PubMed Central

    Elhag, Arafat Abdulgader Mohammed; Mohamad, Radziah; Aziz, Muhammad Waqar; Zeshan, Furkh

    2015-01-01

    The composite service design modeling is an essential process of the service-oriented software development life cycle, where the candidate services, composite services, operations and their dependencies are required to be identified and specified before their design. However, a systematic service-oriented design modeling method for composite services is still in its infancy as most of the existing approaches provide the modeling of atomic services only. For these reasons, a new method (ComSDM) is proposed in this work for modeling the concept of service-oriented design to increase the reusability and decrease the complexity of the system while keeping the service composition considerations in mind. Furthermore, the ComSDM method provides the mathematical representation of the components of service-oriented design using graph-based theory to facilitate the design quality measurement. To demonstrate that the ComSDM method is also suitable for composite service design modeling of distributed embedded real-time systems along with enterprise software development, it is implemented in the case study of a smart home. The results of the case study not only check the applicability of ComSDM, but can also be used to validate the complexity and reusability of ComSDM. This also guides future research towards design quality measurement, such as using the ComSDM method to measure the quality of composite service design in service-oriented software systems. PMID:25928358

  3. Poster - Thurs Eve-43: Verification of dose calculation with tissue inhomogeneity using MapCHECK.

    PubMed

    Korol, R; Chen, J; Mosalaei, H; Karnas, S

    2008-07-01

    MapCHECK (Sun Nuclear, Melbourne, FL) with 445 diode detectors has been used widely for routine IMRT quality assurance (QA). However, routine IMRT QA has not included the verification of inhomogeneity effects. The objective of this study is to use MapCHECK and a phantom to verify dose calculation and IMRT delivery with tissue inhomogeneity. A phantom with tissue inhomogeneities was placed on top of MapCHECK to measure the planar dose for an anterior beam with a photon energy of 6 MV or 18 MV. The phantom was composed of a 3.5 cm thick block of lung equivalent material and solid water arranged side by side, with a 0.5 cm slab of solid water on the top of the phantom. The phantom setup including MapCHECK was CT scanned and imported into Pinnacle 8.0d for dose calculation. Absolute dose distributions were compared with gamma criteria of 3% dose difference and 3 mm distance-to-agreement. The measured and calculated planar doses are in good agreement, with an 88% pass rate based on the gamma analysis. The major dose difference was at the lung-water interface. Further investigation will be performed on a custom designed inhomogeneity phantom with inserts of varying densities and effective depth to create various dose gradients at the interface for dose calculation and delivery verification. In conclusion, a phantom with tissue inhomogeneities can be used with MapCHECK for verification of dose calculation and delivery with tissue inhomogeneity. © 2008 American Association of Physicists in Medicine.

  4. Capturing, using, and managing quality assurance knowledge for shuttle post-MECO flight design

    NASA Technical Reports Server (NTRS)

    Peters, H. L.; Fussell, L. R.; Goodwin, M. A.; Schultz, Roger D.

    1991-01-01

    Ascent initialization values used by the Shuttle's onboard computer for nominal and abort mission scenarios are verified by a six degrees of freedom computer simulation. The procedure that the Ascent Post Main Engine Cutoff (Post-MECO) group uses to perform quality assurance (QA) of the simulation is time-consuming. Also, the QA data, checklists and associated rationale, though known by the group members, are not sufficiently documented, hindering transfer of knowledge and problem resolution. A new QA procedure which retains the current high level of integrity while reducing the time required to perform QA is needed to support the increasing Shuttle flight rate. Documenting the knowledge is also needed to increase its availability for training and problem resolution. To meet these needs, a knowledge capture process, embedded into the group activities, was initiated to verify the existing QA checks, define new ones, and document all rationale. The resulting checks were automated in a conventional software program to achieve the desired standardization, integrity, and time reduction. A prototype electronic knowledge base was developed with Macintosh's HyperCard to serve as a knowledge capture tool and data storage.

  5. The Significance of Quality Assurance within Model Intercomparison Projects at the World Data Centre for Climate (WDCC)

    NASA Astrophysics Data System (ADS)

    Toussaint, F.; Hoeck, H.; Stockhause, M.; Lautenschlager, M.

    2014-12-01

    The classical goals of a quality assessment system in the data life cycle are (1) to encourage data creators to improve their quality assessment procedures to reach the next quality level and (2) to enable data consumers to decide whether a dataset has a quality that is sufficient for usage in the target application, i.e. to appraise the data usability for their own purpose. As the data volumes of projects and the interdisciplinarity of data usage grow, the need for homogeneous structure and standardised notation of data and metadata increases. This third aspect is especially valid for data repositories, as they manage data through machine agents. Checks for homogeneity and consistency in early parts of the workflow therefore become essential to cope with today's data volumes. Selected parts of the workflow in the model intercomparison project CMIP5, the archival of the data for the interdisciplinary user community of the IPCC-DDC AR5, and the associated quality checks are reviewed. We compare data and metadata checks and relate different types of checks to their positions in the data life cycle. The project's data citation approach is included in the discussion, with focus on the time necessary to comply with the project's requirements for formal data citations and the demand for the availability of such citations. In order to make the quality assessments of different projects comparable, WDCC developed a generic Quality Assessment System. Based on the self-assessment approach of a maturity matrix, an objective and uniform quality level system for all data at WDCC is derived, consisting of five maturity quality levels.

  6. Fabrication of a grazing incidence telescope by grinding and polishing techniques on aluminum

    NASA Technical Reports Server (NTRS)

    Gallagher, Dennis; Cash, Webster; Green, James

    1991-01-01

    The paper describes the fabrication processes, by grinding and polishing, used in making the mirrors for an f/2.8 Wolter type-I grazing incidence telescope at Boulder (Colorado), together with the testing procedure used to determine the quality of the images. All grinding and polishing is done on a specially designed machine that consists of a horizontal spindle to hold and rotate the mirror and a stroke-arm machine to push the various tools back and forth along the mirror's length. Progress is checked by means of the Ronchi test during all grinding and polishing stages. Current measurements of the telescope's image quality give a FWHM of 44 arcsec, against a goal of 5-10 arcsec.

  7. CheckM: assessing the quality of microbial genomes recovered from isolates, single cells, and metagenomes

    PubMed Central

    Parks, Donovan H.; Imelfort, Michael; Skennerton, Connor T.; Hugenholtz, Philip; Tyson, Gene W.

    2015-01-01

    Large-scale recovery of genomes from isolates, single cells, and metagenomic data has been made possible by advances in computational methods and substantial reductions in sequencing costs. Although this increasing breadth of draft genomes is providing key information regarding the evolutionary and functional diversity of microbial life, it has become impractical to finish all available reference genomes. Making robust biological inferences from draft genomes requires accurate estimates of their completeness and contamination. Current methods for assessing genome quality are ad hoc and generally make use of a limited number of “marker” genes conserved across all bacterial or archaeal genomes. Here we introduce CheckM, an automated method for assessing the quality of a genome using a broader set of marker genes specific to the position of a genome within a reference genome tree and information about the collocation of these genes. We demonstrate the effectiveness of CheckM using synthetic data and a wide range of isolate-, single-cell-, and metagenome-derived genomes. CheckM is shown to provide accurate estimates of genome completeness and contamination and to outperform existing approaches. Using CheckM, we identify a diverse range of errors currently impacting publicly available isolate genomes and demonstrate that genomes obtained from single cells and metagenomic data vary substantially in quality. In order to facilitate the use of draft genomes, we propose an objective measure of genome quality that can be used to select genomes suitable for specific gene- and genome-centric analyses of microbial communities. PMID:25977477

  8. CheckM: assessing the quality of microbial genomes recovered from isolates, single cells, and metagenomes.

    PubMed

    Parks, Donovan H; Imelfort, Michael; Skennerton, Connor T; Hugenholtz, Philip; Tyson, Gene W

    2015-07-01

    Large-scale recovery of genomes from isolates, single cells, and metagenomic data has been made possible by advances in computational methods and substantial reductions in sequencing costs. Although this increasing breadth of draft genomes is providing key information regarding the evolutionary and functional diversity of microbial life, it has become impractical to finish all available reference genomes. Making robust biological inferences from draft genomes requires accurate estimates of their completeness and contamination. Current methods for assessing genome quality are ad hoc and generally make use of a limited number of "marker" genes conserved across all bacterial or archaeal genomes. Here we introduce CheckM, an automated method for assessing the quality of a genome using a broader set of marker genes specific to the position of a genome within a reference genome tree and information about the collocation of these genes. We demonstrate the effectiveness of CheckM using synthetic data and a wide range of isolate-, single-cell-, and metagenome-derived genomes. CheckM is shown to provide accurate estimates of genome completeness and contamination and to outperform existing approaches. Using CheckM, we identify a diverse range of errors currently impacting publicly available isolate genomes and demonstrate that genomes obtained from single cells and metagenomic data vary substantially in quality. In order to facilitate the use of draft genomes, we propose an objective measure of genome quality that can be used to select genomes suitable for specific gene- and genome-centric analyses of microbial communities. © 2015 Parks et al.; Published by Cold Spring Harbor Laboratory Press.
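
    The marker-gene logic described in the two CheckM records above can be sketched roughly as follows. This is a deliberately simplified illustration: the real tool uses lineage-specific, collocated marker sets inferred from a reference genome tree, not a flat gene list, and the gene names here are placeholders.

    ```python
    # Simplified sketch of completeness/contamination from single-copy marker genes:
    # completeness = fraction of expected markers found at least once; contamination
    # = extra copies of those markers; both expressed as percentages.
    def genome_quality(marker_counts, expected_markers):
        present = sum(1 for m in expected_markers if marker_counts.get(m, 0) >= 1)
        extra = sum(max(marker_counts.get(m, 0) - 1, 0) for m in expected_markers)
        completeness = 100.0 * present / len(expected_markers)
        contamination = 100.0 * extra / len(expected_markers)
        return completeness, contamination

    counts = {"rpsC": 1, "rplB": 2, "gyrA": 1}  # gene names are illustrative only
    print(genome_quality(counts, ["rpsC", "rplB", "gyrA", "recA"]))  # → (75.0, 25.0)
    ```

    A duplicated single-copy marker (here `rplB`) raises contamination because a clean draft genome should carry exactly one copy of each such gene.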

  9. Check & Connect: The Importance of Relationships for Promoting Engagement with School

    ERIC Educational Resources Information Center

    Anderson, Amy R.; Christenson, Sandra L.; Sinclair, Mary F.; Lehr, Camilla A.

    2004-01-01

    The purpose of this study was to examine whether the closeness and quality of relationships between intervention staff and students involved in the Check & Connect program were associated with improved student engagement in school. Participants included 80 elementary and middle school students referred to the Check & Connect program for poor…

  10. [Goals, possibilities and limits of quality evaluation of guidelines. A background report on the user manual of the "Methodological Quality of Guidelines" check list].

    PubMed

    Helou, A; Ollenschläger, G

    1998-06-01

    Recently a German appraisal instrument for clinical guidelines was published that can be used by various parties in the formal evaluation of guidelines. A user's guide to the appraisal instrument was designed that contains a detailed explanation of each question, to ensure that the instrument is interpreted consistently. This paper describes the purposes, format and contents of the user's guide and reviews the key factors influencing the validity of guidelines. Taking international experience into account, the purposes, opportunities and methodological limitations of a prospective assessment of clinical practice guidelines are discussed.

  11. Rural-Urban Differences in Medicare Quality Outcomes and the Impact of Risk Adjustment.

    PubMed

    Henning-Smith, Carrie; Kozhimannil, Katy; Casey, Michelle; Prasad, Shailendra; Moscovice, Ira

    2017-09-01

    There has been considerable debate in recent years about whether, and how, to risk-adjust quality measures for sociodemographic characteristics. However, geographic location, especially rurality, has been largely absent from the discussion. This study examined differences by rurality in quality outcomes, and the impact of adjustment for individual- and community-level sociodemographic characteristics on those outcomes. Data came from the 2012 Medicare Current Beneficiary Survey, Access to Care module, combined with the 2012 County Health Rankings; all data used were publicly available, secondary data. We merged the 2012 Medicare Current Beneficiary Survey data with the 2012 County Health Rankings data using county of residence. We compared 6 unadjusted quality of care measures for Medicare beneficiaries (satisfaction with care, blood pressure checked, cholesterol checked, flu shot receipt, change in health status, and all-cause annual readmission) by rurality (rural noncore, micropolitan, and metropolitan). We then ran nested multivariable logistic regression models to assess the impact of adjusting for community- and individual-level sociodemographic characteristics, to determine whether these mediate the rurality difference in quality of care. The relationship between rurality and change in health status was mediated by the inclusion of community-level characteristics; however, adjusting for community- and individual-level characteristics caused differences by rurality to emerge in 2 of the measures: blood pressure checked and cholesterol checked. For all quality scores, model fit improved after adding community and individual characteristics. Quality is multifaceted and is affected by individual- and community-level sociodemographic characteristics, as well as by geographic location. Current debates about risk-adjustment procedures should take rurality into account.

  12. 40 CFR Appendix K to Part 75 - Quality Assurance and Operating Procedures for Sorbent Trap Monitoring Systems

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... until the leak check is passed. Post-test leak check ≤4% of average sampling rate After sampling ** See... the test site. The sorbent media must be obtained from a source that can demonstrate the quality...-traceable calibration gas standards and reagents shall be used for the tests and procedures required under...

  13. 49 CFR 40.235 - What are the requirements for proper use and care of ASDs?

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... ASD on the CPL. Your QAP must specify the methods used for quality control checks, temperatures at which the ASD must be stored and used, the shelf life of the device, and environmental conditions (e.g... the specified quality control checks or that has passed its expiration date. (e) As an employer, with...

  14. 49 CFR 40.235 - What are the requirements for proper use and care of ASDs?

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... ASD on the CPL. Your QAP must specify the methods used for quality control checks, temperatures at which the ASD must be stored and used, the shelf life of the device, and environmental conditions (e.g... the specified quality control checks or that has passed its expiration date. (e) As an employer, with...

  15. 49 CFR 40.235 - What are the requirements for proper use and care of ASDs?

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... ASD on the CPL. Your QAP must specify the methods used for quality control checks, temperatures at which the ASD must be stored and used, the shelf life of the device, and environmental conditions (e.g... the specified quality control checks or that has passed its expiration date. (e) As an employer, with...

  16. 49 CFR 40.235 - What are the requirements for proper use and care of ASDs?

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... ASD on the CPL. Your QAP must specify the methods used for quality control checks, temperatures at which the ASD must be stored and used, the shelf life of the device, and environmental conditions (e.g... the specified quality control checks or that has passed its expiration date. (e) As an employer, with...

  17. 49 CFR 40.235 - What are the requirements for proper use and care of ASDs?

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... ASD on the CPL. Your QAP must specify the methods used for quality control checks, temperatures at which the ASD must be stored and used, the shelf life of the device, and environmental conditions (e.g... the specified quality control checks or that has passed its expiration date. (e) As an employer, with...

  18. FIELD CHECK MANUAL FOR LANGUAGE LABORATORIES, A SERIES OF TESTS WHICH A NON-TECHNICAL PERSON CAN CONDUCT TO VERIFY SPECIFICATIONS.

    ERIC Educational Resources Information Center

    GRITTNER, FRANK; PAVLAT, RUSSELL

    IN ORDER TO ASSIST NON-TECHNICAL PEOPLE IN SCHOOLS TO CONDUCT A FIELD CHECK OF LANGUAGE LABORATORY EQUIPMENT BEFORE THEY MAKE FINAL PAYMENTS, THIS MANUAL OFFERS CRITERIA, TESTS, AND METHODS OF SCORING THE QUALITY OF THE EQUIPMENT. CHECKLISTS ARE PROVIDED FOR EVALUATING CONSOLE FUNCTIONS, TAPE RECORDERS, AMPLIFIERS, SOUND QUALITY (INCLUDING…

  19. Propel: Tools and Methods for Practical Source Code Model Checking

    NASA Technical Reports Server (NTRS)

    Mansouri-Samani, Massoud; Mehlitz, Peter; Markosian, Lawrence; OMalley, Owen; Martin, Dale; Moore, Lantz; Penix, John; Visser, Willem

    2003-01-01

    The work reported here is an overview and snapshot of a project to develop practical model checking tools for in-the-loop verification of NASA's mission-critical, multithreaded programs in Java and C++. Our strategy is to develop and evaluate both a design concept that enables the application of model checking technology to C++ and Java, and a model checking toolset for C++ and Java. The design concept and the associated model checking toolset are called Propel. It builds upon the Java PathFinder (JPF) tool, an explicit state model checker for Java applications developed by the Automated Software Engineering group at NASA Ames Research Center. The design concept that we are developing is Design for Verification (D4V). This is an adaptation of existing best design practices that has the desired side-effect of enhancing verifiability by improving modularity and decreasing accidental complexity. D4V, we believe, enhances the applicability of a variety of V&V approaches; we are developing the concept in the context of model checking. The model checking toolset, Propel, is based on extending JPF to handle C++. Our principal tasks in developing the toolset are to build a translator from C++ to Java, to productize JPF, and to evaluate the toolset in the context of D4V. Throughout these tasks we are testing Propel's capabilities on customer applications.

  20. Footwear used by older people and a history of hyperkeratotic lesions on the foot

    PubMed Central

    Palomo-López, Patricia; Becerro-de-Bengoa-Vallejo, Ricardo; Losa-Iglesias, Marta Elena; Rodríguez-Sanz, David; Calvo-Lobo, César; López-López, Daniel

    2017-01-01

    Inadequate footwear and painful, hyperkeratotic lesions (HL) are extremely common problems amongst older people. Such problems increase the risk of falls, hamper mobility, and reduce quality of life, dignity, and the ability to remain independent. The etiology of painful foot conditions is poorly understood. This study aimed to discover the footwear preferences of older people and whether pain tolerance and the use of inadequate footwear in old age may favor the presence of HL. A sample of 100 participants with a mean age of 74.90 ± 7.01 years attended an outpatient clinic where self-reported demographic data and the frequency with which they checked their feet were recorded, and measurements were taken of foot sensitivity. Additionally, all participants' shoes were allocated into optimal, adequate, and dangerous categories based on design, structural and safety features, and materials. Only 12% of the sample population checked their feet every day, 37% revealed symptoms of neuropathy, 14% used optimal shoes, and 61% presented HL. In a bivariate analysis, no significant differences were observed. HL are associated with inadequate footwear, loss of sensitivity, and low frequency of foot health checks. PMID:28403112

  1. CMM Interim Check Design of Experiments (U)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Montano, Joshua Daniel

    2015-07-29

    Coordinate Measuring Machines (CMM) are widely used in industry, throughout the Nuclear Weapons Complex, and at Los Alamos National Laboratory (LANL) to verify part conformance to design definition. Calibration cycles for CMMs at LANL are predominantly one year in length and include a weekly interim check to reduce risk. The CMM interim check makes use of Renishaw's Machine Checking Gauge, an off-the-shelf product that simulates a large sphere within a CMM's measurement volume and allows for error estimation. As verification of the interim check process, a design-of-experiments investigation was proposed to test two key factors (location and inspector). The results from the two-factor factorial experiment showed that location influenced results more than the inspector or their interaction.

  2. On quality control procedures for solar radiation and meteorological measures, from subhourly to monthly average time periods

    NASA Astrophysics Data System (ADS)

    Espinar, B.; Blanc, P.; Wald, L.; Hoyer-Klick, C.; Schroedter-Homscheidt, M.; Wanderer, T.

    2012-04-01

    Meteorological data measured by ground stations are often a key element in the development and validation of methods exploiting satellite images. These data are considered a reference against which satellite-derived estimates are compared. Long-term radiation and meteorological measurements are available from a large number of measuring stations. However, close examination of the data often reveals a lack of quality, frequently for extended periods of time, which has in many cases led to the rejection of large amounts of available data. Data quality must be checked before use in order to guarantee the inputs for the methods used in modelling, monitoring, forecasting, etc. To control their quality, data should be submitted to several conditions or tests; data not flagged by any of the tests are released as plausible. In this work, a bibliographical survey of quality control tests has been performed for the common meteorological variables (ambient temperature, relative humidity and wind speed) and for the usual solar radiometric variables (horizontal global and diffuse components of the solar radiation and the beam normal component). The different tests have been grouped according to the variable and the averaging time period (sub-hourly, hourly, daily and monthly averages). The quality tests may be classified as follows: • Range checks: tests that verify values are within a specific range. There are two types of range checks, those based on extrema and those based on rare observations. • Step checks: tests aimed at detecting unrealistic jumps or stagnation in the time series. • Consistency checks: tests that verify the relationship between two or more time series. The gathered quality tests are applicable to all latitudes, as they have not been optimized regionally or seasonally, with the aim of being generic.
    They have been applied to ground measurements in several geographic locations, which resulted in the detection of some control tests that are no longer adequate, for various reasons. After modifying some tests based on our experience, a set of quality control tests is now presented, updated according to technological advances, and classified. The presented set of quality tests allows radiation and meteorological data to be checked in order to establish their plausibility for use as inputs in theoretical or empirical methods for scientific research. The research leading to these results has partly received funding from the European Union's Seventh Framework Programme (FP7/2007-2013) under Grant Agreement no. 262892 (ENDORSE project).
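
    The three check families listed in the abstract (range, step, consistency) can be sketched generically. The thresholds and variables below are illustrative assumptions, not the tuned limits from the survey:

    ```python
    # Generic sketch of the three quality-control families applied to an hourly
    # ambient-temperature series (degrees C); all thresholds are illustrative.
    def range_check(values, lo=-60.0, hi=60.0):
        """Flag values outside physically plausible extrema."""
        return [not (lo <= v <= hi) for v in values]

    def step_check(values, max_jump=10.0):
        """Flag unrealistic jumps between consecutive observations."""
        flags = [False]  # the first value has no predecessor to compare against
        for prev, cur in zip(values, values[1:]):
            flags.append(abs(cur - prev) > max_jump)
        return flags

    def consistency_check(t_air, t_dew):
        """Cross-series rule: dew point should not exceed air temperature."""
        return [td > ta for ta, td in zip(t_air, t_dew)]

    def plausible(t_air, t_dew):
        """A value is released as plausible only if no test flags it."""
        r, s, c = range_check(t_air), step_check(t_air), consistency_check(t_air, t_dew)
        return [not (a or b or d) for a, b, d in zip(r, s, c)]

    print(plausible([10.0, 11.0, 35.0, -80.0], [8.0, 12.0, 20.0, -90.0]))
    # → [True, False, False, False]
    ```

    Keeping each family as an independent flag vector mirrors the paper's approach: a datum is only released when it passes every applicable test, and the failing test identifies why it was rejected.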

  3. Public health concerns for anti-obesity medicines imported for personal use through the internet: a cross-sectional study

    PubMed Central

    Tanimoto, Tsuyoshi; Nakanishi, Yoko; Yoshida, Naoko; Tsuboi, Hirohito; Kimura, Kazuko

    2012-01-01

    Objective To explore the circulation of anti-obesity medicines via the internet and their quality. Design Cross-sectional study. Setting Internet pharmacies and pharmaceutical suppliers accessible from Japan. Participants Anti-obesity medicines were purchased using relevant keywords on Japanese Google search engine. Blogs and advertisement-only sites were excluded. Primary and secondary outcome measures The authenticity of the samples was investigated in collaboration with the manufacturers of the samples and medicine regulatory authorities. Quality of the samples was assessed by pharmacopoeial analyses using high-performance liquid chromatography. Results 82 samples were purchased from 36 internet sites. Approximately half of the sites did not mention a physical address, and 45% of the samples did not contain a package insert. A variety of custom declarations were made for the shipments of the samples: personal health items, supplement, medicines, general merchandise, tea and others. Among 82 samples, 52 samples were analysed to check their pharmacopoeial quality. Authenticity responses were received from only five of 20 manufacturing companies. According to the pharmacopoeial analyses and authenticity investigation, three of the samples were identified as counterfeits and did not contain any active ingredients. Two of these samples were confirmed as counterfeits by the manufacturer of the authentic products. The manufacturer of the other sample did not respond to our request for an authenticity check even after several communication attempts. These counterfeit cases have been reported at the rapid alert system of Western Pacific Region of the WHO. Conclusions Many counterfeit and unapproved anti-obesity medicines may be easily bypassing regulatory checks during shipping and are widely circulated through the internet. Regulatory authorities should take measures to prevent these medicines from entering countries to safeguard their citizens. PMID:22581794

  4. Towards a supported common NEAMS software stack

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cormac Garvey

    2012-04-01

    The NEAMS IPSCs are developing multidimensional, multiphysics, multiscale simulation codes based on first principles that will be capable of predicting all aspects of current and future nuclear reactor systems. This new breed of simulation codes will include rigorous verification, validation and uncertainty quantification checks to quantify the accuracy and quality of the simulation results. The resulting NEAMS IPSC simulation codes will be an invaluable tool in designing the next generation of nuclear reactors and will also contribute to a speedier process in the acquisition of licenses from the NRC for new reactor designs. Due to the high resolution of the models, the complexity of the physics, and the added computational resources needed to quantify the accuracy and quality of the results, the NEAMS IPSC codes will require large HPC resources to carry out the production simulation runs.

  5. PDB data curation.

    PubMed

    Wang, Yanchao; Sunderraman, Rajshekhar

    2006-01-01

    In this paper, we propose two architectures for curating PDB data to improve its quality. The first one, the PDB Data Curation System, is developed by adding two parts, a Checking Filter and a Curation Engine, between the User Interface and the Database. This architecture supports basic PDB data curation. The other one, the PDB Data Curation System with XCML, is designed for further curation and adds four more parts, PDB-XML, PDB, OODB, and Protein-OODB, to the previous one. This architecture uses the XCML language to automatically check errors in PDB data, making PDB data more consistent and accurate. These two tools can be used for cleaning existing PDB files and creating new PDB files. We also show some ideas on how to add constraints and assertions with XCML to get better data. In addition, we discuss data provenance, which may affect data accuracy and consistency.
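
    The Checking Filter stage described above can be sketched as a chain of independent checks sitting between the user interface and the database. The record fields and rules here are hypothetical illustrations, not the actual XCML constraints:

    ```python
    # Hypothetical checking-filter sketch: each check inspects one aspect of an
    # incoming record, and a record is accepted only if no check fails.
    # Field names and rules are illustrative, not real PDB/XCML constraints.
    def check_positive_serials(record):
        return all(isinstance(s, int) and s > 0 for s in record["atom_serials"])

    def check_chain_ids(record):
        return all(isinstance(c, str) and len(c) == 1 for c in record["chain_ids"])

    CHECKS = [check_positive_serials, check_chain_ids]

    def checking_filter(record):
        """Return the names of failed checks; an empty list means the record passes."""
        return [chk.__name__ for chk in CHECKS if not chk(record)]

    bad = {"atom_serials": [1, 2, -3], "chain_ids": ["A", "BB"]}
    print(checking_filter(bad))  # → ['check_positive_serials', 'check_chain_ids']
    ```

    Returning the names of the failed checks, rather than a single pass/fail flag, is what lets a downstream curation engine decide whether a record can be repaired or must be rejected.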

  6. Progress on the Journey to Total Quality Management: Using the Myers-Briggs Type Indicator and the Adjective Check List in Management Development.

    ERIC Educational Resources Information Center

    Mani, Bonnie G.

    1995-01-01

    In an Internal Revenue Service office using total quality management (TQM), the management development program uses Myers Briggs Type Indicator and Adjective Check List for manager self-assessment. Because management commitment is essential to TQM, the process is a way of enhancing leadership skills and demonstrating appreciation of diversity. (SK)

  7. The Automation of Nowcast Model Assessment Processes

    DTIC Science & Technology

    2016-09-01

    that will automate real-time WRE-N model simulations, collect and quality-control check weather observations for assimilation and verification, and...domains centered near White Sands Missile Range, New Mexico, where the Meteorological Sensor Array (MSA) will be located. The MSA will provide...observations and performing quality-control checks for the pre-forecast data assimilation period. 2. Run the WRE-N model to generate model forecast data

  8. User's manual for computer program BASEPLOT

    USGS Publications Warehouse

    Sanders, Curtis L.

    2002-01-01

    The checking and reviewing of daily records of streamflow within the U.S. Geological Survey is traditionally accomplished by hand-plotting and mentally collating tables of data. The process is time consuming, difficult to standardize, and subject to errors in computation, data entry, and logic. In addition, the presentation of flow data on the internet requires more timely and accurate computation of daily flow records. BASEPLOT was developed for checking and review of primary streamflow records within the U.S. Geological Survey. Use of BASEPLOT enables users to (1) provide efficiencies during the record checking and review process, (2) improve quality control, (3) achieve uniformity of checking and review techniques of simple stage-discharge relations, and (4) provide a tool for teaching streamflow computation techniques. The BASEPLOT program produces tables of quality control checks and produces plots of rating curves and discharge measurements; variable shift (V-shift) diagrams; and V-shifts converted to stage-discharge plots, using data stored in the U.S. Geological Survey Automatic Data Processing System database. In addition, the program plots unit-value hydrographs that show unit-value stages, shifts, and datum corrections; input shifts, datum corrections, and effective dates; discharge measurements; effective dates for rating tables; and numeric quality control checks. Checklist/tutorial forms are provided for reviewers to ensure completeness of review and standardize the review process. The program was written for the U.S. Geological Survey SUN computer using the Statistical Analysis System (SAS) software produced by SAS Institute, Incorporated.

  9. Process safety improvement--quality and target zero.

    PubMed

    Van Scyoc, Karl

    2008-11-15

    Process safety practitioners have adopted quality management principles in the design of process safety management systems with positive effect, yet achieving safety objectives sometimes remains a distant target. Companies regularly apply tools and methods which have roots in quality and productivity improvement. The "plan, do, check, act" improvement loop, statistical analysis of incidents (non-conformities), and performance trending popularized by Dr. Deming are now commonly used in the context of process safety. Significant advancements in HSE performance are reported after applying methods viewed as fundamental for quality management. In pursuit of continual process safety improvement, the paper examines various quality improvement methods and explores how methods intended for product quality can additionally be applied to the continual improvement of process safety. Methods such as Kaizen, Poka-yoke, and TRIZ, while long established for quality improvement, are quite unfamiliar in the process safety arena. These methods are discussed for application in improving both process safety leadership and field work team performance. Practical ways to advance process safety, based on these methods, are given.

  10. Cycle time reduction by Html report in mask checking flow

    NASA Astrophysics Data System (ADS)

    Chen, Jian-Cheng; Lu, Min-Ying; Fang, Xiang; Shen, Ming-Feng; Ma, Shou-Yuan; Yang, Chuen-Huei; Tsai, Joe; Lee, Rachel; Deng, Erwin; Lin, Ling-Chieh; Liao, Hung-Yueh; Tsai, Jenny; Bowhill, Amanda; Vu, Hien; Russell, Gordon

    2017-07-01

    The Mask Data Correctness Check (MDCC) is a reticle-level, multi-layer DRC-like check evolved from the mask rule check (MRC). The MDCC uses an extended job deck (EJB) to achieve mask composition and to perform a detailed check of the positioning and integrity of each component of the reticle. Different design patterns on the mask are mapped to different layers, so users can review the whole reticle and check the interactions between different designs before the final mask pattern file is available. However, many types of MDCC check results, such as errors from overlapping patterns, usually have very large and complex-shaped highlighted areas covering the boundary of the design. Users have to load the result OASIS file, overlay it on the original database assembled in the MDCC process in a layout viewer, and then search for the details of the check results. We introduce a quick result-reviewing method based on an html-format report generated by Calibre® RVE. In the report generation process, we analyze and extract the essential part of the result OASIS file to a result database (RDB) file by standard verification rule format (SVRF) commands. Calibre® RVE automatically loads the assembled reticle pattern and generates screenshots of these check results. All the processes are triggered automatically as soon as the MDCC process finishes. Users just have to open the html report to get the information they need: for example, the check summary, captured images of results, and their coordinates.

  11. Health Checks in Primary Care for Adults with Intellectual Disabilities: How Extensive Should They Be?

    ERIC Educational Resources Information Center

    Chauhan, U.; Kontopantelis, E.; Campbell, S.; Jarrett, H.; Lester, H.

    2010-01-01

    Background: Routine health checks have gained prominence as a way of detecting unmet need in primary care for adults with intellectual disabilities (ID) and general practitioners are being incentivised in the UK to carry out health checks for many conditions through an incentivisation scheme known as the Quality and Outcomes Framework (QOF).…

  12. Design and Checking Analysis of Injection Mold for a Plastic Cup

    NASA Astrophysics Data System (ADS)

    Li, Xuebing

    2018-03-01

    A special injection mold was designed for the structural characteristics of a plastic cup part. The mold was simulated with Moldflow software and verified by calculating the stripping force, the pulling force and the clamping force of the mold, so as to determine the appropriate injection parameters. The injection mold has proved effective and practical in actual production and meets the quality requirements in use, which solved some problems in the injection molding of this kind of part and can provide a reference for the production of other products in the same industry.

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fakir, H.; Gaede, S.; Mulligan, M.

    Purpose: To design a versatile, nonhomogeneous insert for the dose verification phantom ArcCHECK™ (Sun Nuclear Corp., FL) and to demonstrate its usefulness for the verification of dose distributions in inhomogeneous media. As an example, we demonstrate it can be used clinically for routine quality assurance of two volumetric modulated arc therapy (VMAT) systems for lung stereotactic body radiation therapy (SBRT): SmartArc® (Pinnacle³, Philips Radiation Oncology Systems, Fitchburg, WI) and RapidArc® (Eclipse™, Varian Medical Systems, Palo Alto, CA). Methods: The cylindrical detector array ArcCHECK™ has a retractable homogeneous acrylic insert. In this work, we designed and manufactured a customized heterogeneous insert with densities that simulate soft tissue, lung, bone, and air. The insert offers several possible heterogeneity configurations and multiple locations for point dose measurements. SmartArc® and RapidArc® plans for lung SBRT were generated and copied to ArcCHECK™ for each inhomogeneity configuration. Dose delivery was done on a Varian 2100 ix linac. The evaluation of dose distributions was based on gamma analysis of the diode measurements and point dose measurements at different positions near the inhomogeneities. Results: The insert was successfully manufactured and tested with different measurements of VMAT plans. Dose distributions measured with the homogeneous insert showed gamma passing rates similar to our clinical results (~99%) for both treatment-planning systems. Using nonhomogeneous inserts decreased the passing rates by up to 3.6% in the examples studied. Overall, SmartArc® plans showed better gamma passing rates for nonhomogeneous measurements.
    The discrepancy between calculated and measured point doses increased by up to 6.5% for the nonhomogeneous insert, depending on the inhomogeneity configuration and measurement location. SmartArc® and RapidArc® plans had similar plan quality, but RapidArc® plans had significantly higher monitor units (up to 70%). Conclusions: A versatile, nonhomogeneous insert was developed for ArcCHECK™ for an easy and quick evaluation of dose calculations with nonhomogeneous media and for comparison of different treatment planning systems. The device was tested with SmartArc® and RapidArc® plans for lung SBRT, showing the uncertainties of dose calculations with inhomogeneities. The new insert combines the convenience of the ArcCHECK™ and the possibility of assessing dose distributions in inhomogeneous media.

  14. Sci-Fin: Visual Mining Spatial and Temporal Behavior Features from Social Media

    PubMed Central

    Pu, Jiansu; Teng, Zhiyao; Gong, Rui; Wen, Changjiang; Xu, Yang

    2016-01-01

    Check-in records are usually available in social services, which offer us the opportunity to capture and analyze users' spatial and temporal behaviors. Mining such behavior features is essential to social analysis and business intelligence. However, the complexity and incompleteness of check-in records make such a task challenging. Unlike previous work on social behavior analysis, in this paper we present a visual analytics system, Social Check-in Fingerprinting (Sci-Fin), to facilitate the analysis and visualization of social check-in data. We focus on three major components of user check-in data: location, activity, and profile. Visual fingerprints for location, activity, and profile are designed to intuitively represent the high-dimensional attributes. To visually mine and demonstrate the behavior features, we integrate WorldMapper and Voronoi Treemap into our glyph-like designs. Such visual fingerprint designs offer us the opportunity to summarize the interesting features and patterns from different check-in locations, activities, and users (groups). We demonstrate the effectiveness and usability of our system by conducting extensive case studies on real check-in data collected from a popular microblogging service. Interesting findings are reported and discussed at the end. PMID:27999398

  15. Sci-Fin: Visual Mining Spatial and Temporal Behavior Features from Social Media.

    PubMed

    Pu, Jiansu; Teng, Zhiyao; Gong, Rui; Wen, Changjiang; Xu, Yang

    2016-12-20

    Check-in records are usually available in social services, which offer us the opportunity to capture and analyze users' spatial and temporal behaviors. Mining such behavior features is essential to social analysis and business intelligence. However, the complexity and incompleteness of check-in records make such a task challenging. Unlike previous work on social behavior analysis, in this paper we present a visual analytics system, Social Check-in Fingerprinting (Sci-Fin), to facilitate the analysis and visualization of social check-in data. We focus on three major components of user check-in data: location, activity, and profile. Visual fingerprints for location, activity, and profile are designed to intuitively represent the high-dimensional attributes. To visually mine and demonstrate the behavior features, we integrate WorldMapper and Voronoi Treemap into our glyph-like designs. Such visual fingerprint designs offer us the opportunity to summarize the interesting features and patterns from different check-in locations, activities, and users (groups). We demonstrate the effectiveness and usability of our system by conducting extensive case studies on real check-in data collected from a popular microblogging service. Interesting findings are reported and discussed at the end.

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Covington, E; Younge, K; Chen, X

    Purpose: To evaluate the effectiveness of an automated plan check tool to improve first-time plan quality as well as standardize and document performance of physics plan checks. Methods: The Plan Checker Tool (PCT) uses the Eclipse Scripting API to check and compare data from the treatment planning system (TPS) and treatment management system (TMS). PCT was created to improve first-time plan quality, reduce patient delays, increase efficiency of our electronic workflow, and to standardize and partially automate plan checks in the TPS. A framework was developed which can be configured with different reference values and types of checks. One example is the prescribed dose check, where PCT flags the user when the planned dose and the prescribed dose disagree. PCT includes a comprehensive checklist of automated and manual checks that are documented when performed by the user. A PDF report is created and automatically uploaded into the TMS. Prior to and during PCT development, errors caught during plan checks and also patient delays were tracked in order to prioritize which checks should be automated. The most common and significant errors were determined. Results: Nineteen of 33 checklist items were automated with data extracted with the PCT. These include checks for prescription, reference point, and machine scheduling errors, which are three of the top six causes of patient delays related to physics and dosimetry. Since the clinical roll-out, no delays have been due to errors that are automatically flagged by the PCT. Development continues to automate the remaining checks. Conclusion: With PCT, 57% of the physics plan checklist has been partially or fully automated. Treatment delays have declined since release of the PCT for clinical use. By tracking delays and errors, we have been able to measure the effectiveness of automating checks and are using this information to prioritize future development. This project was supported in part by P01CA059827.
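    The style of check the PCT performs can be sketched as a small framework of named check functions run against TPS and TMS values. All field names, tolerances, and function names below are hypothetical illustrations, not the actual Eclipse Scripting API or the PCT's configuration.

    ```python
    def check_prescribed_dose(tps, tms, tol_cgy=0.1):
        """Flag a disagreement between planned dose (TPS) and prescribed dose (TMS)."""
        return abs(tps["total_dose_cgy"] - tms["total_dose_cgy"]) <= tol_cgy

    def check_fractions(tps, tms):
        """The number of planned fractions must match the scheduled fractions."""
        return tps["fractions"] == tms["fractions"]

    def run_checks(tps, tms, checks):
        """Run each automated check; collect one PASS/FAIL row per check
        (in the real tool these rows would feed the uploaded PDF report)."""
        return {name: ("PASS" if fn(tps, tms) else "FAIL")
                for name, fn in checks.items()}

    CHECKS = {"prescribed_dose": check_prescribed_dose,
              "fractions": check_fractions}
    ```

    The framework idea is that new checks are added as functions plus reference values, so the checklist can grow without changing the runner.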

  17. 31 CFR 235.3 - Settlement of claims.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ..., DEPARTMENT OF THE TREASURY FINANCIAL MANAGEMENT SERVICE ISSUANCE OF SETTLEMENT CHECKS FOR FORGED CHECKS DRAWN... respect to a check drawn on designated depositaries of the United States, in dollars or in foreign...

  18. Edited Synoptic Cloud Reports from Ships and Land Stations Over the Globe, 1982-1991 (NDP-026B)

    DOE Data Explorer

    Hahn, Carole J. [University of Arizona; Warren, Stephen G. [University of Washington; London, Julius [University of Colorado

    1996-01-01

    Surface synoptic weather reports for the entire globe for the 10-year period from December 1981 through November 1991 have been processed, edited, and rewritten to provide a data set designed for use in cloud analyses. The information in these reports relating to clouds, including the present weather information, was extracted and put through a series of quality control checks. Reports not meeting certain quality control standards were rejected, as were reports from buoys and automatic weather stations. Correctable inconsistencies within reports were edited for consistency, so that the "edited cloud report" can be used for cloud analysis without further quality checking. Cases of "sky obscured" were interpreted by reference to the present weather code as to whether they indicated fog, rain or snow and were given appropriate cloud type designations. Nimbostratus clouds, which are not specifically coded for in the standard synoptic code, were also given a special designation. Changes made to an original report are indicated in the edited report so that the original report can be reconstructed if desired. While low cloud amount is normally given directly in the synoptic report, the edited cloud report also includes the amounts, either directly reported or inferred, of middle and high clouds, both the non-overlapped amounts and the "actual" amounts (which may be overlapped). Since illumination from the moon is important for the adequate detection of clouds at night, both the relative lunar illuminance and the solar altitude are given, as well as a parameter that indicates whether our recommended illuminance criterion was satisfied. This data set contains 124 million reports from land stations and 15 million reports from ships. Each report is 56 characters in length. The archive consists of 240 files, one file for each month of data for land and ocean separately. 
With this data set a user can develop a climatology for any particular cloud type or group of types, for any geographical region and any spatial and temporal resolution desired.
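    The editing rules described above (rejecting buoy and automatic-station reports, interpreting "sky obscured" from the present-weather code) can be sketched roughly as follows. The field names are hypothetical, and while the ww ranges follow the standard WMO present-weather groups (40-49 fog, 60-69 rain, 70-79 snow), the actual NDP-026B rules are more detailed.

    ```python
    def edit_cloud_report(report):
        """Return an edited copy of a synoptic report, or None if rejected.
        `report` is a dict with 'station_type', 'ww' (WMO present-weather
        code) and 'sky_obscured' (illustrative field names only)."""
        if report["station_type"] in ("buoy", "automatic"):
            return None  # rejected outright by the quality-control rules
        edited = dict(report)
        if edited.get("sky_obscured"):
            ww = edited["ww"]
            if 40 <= ww <= 49:
                edited["obscuring_phenomenon"] = "fog"
            elif 60 <= ww <= 69:
                edited["obscuring_phenomenon"] = "rain"
            elif 70 <= ww <= 79:
                edited["obscuring_phenomenon"] = "snow"
        return edited
    ```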

  19. Improving Quality of Shoe Soles Product using Six Sigma

    NASA Astrophysics Data System (ADS)

    Jesslyn Wijaya, Athalia; Trusaji, Wildan; Akbar, Muhammad; Ma’ruf, Anas; Irianto, Dradjad

    2018-03-01

    A manufacturer in Bandung produces various rubber-based products, i.e. trim, rice rollers, shoe soles, etc. After penetrating the shoe sole market, the manufacturer encountered customers with tight quality control. Based on past data, the defect level of this product was 18.08%, which caused the manufacturer losses of time and money. A quality improvement effort was carried out using the six sigma method, which comprises the phases define, measure, analyse, improve, and control (DMAIC). In the define phase, the problem and its scope were defined. The Delphi method was also used in this phase to identify critical factors. In the measure phase, the existing process stability and sigma quality level were measured. A fishbone diagram and failure mode and effect analysis (FMEA) were used in the next phase to analyse root causes and determine the priority issues. The improve phase was done by designing alternative improvement strategies using the 5W1H method. Some improvement efforts were identified, i.e. (i) modifying the design of the hanging rack, (ii) creating a Pantone colour book and check sheet, (iii) providing a pedestrian line at the compound department, (iv) buying a stop watch, and (v) modifying the shoe sole dies. Some control strategies for continuous improvement were proposed, such as SOPs and a reward and punishment system.
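    The reported 18.08% defect level can be converted to a sigma quality level directly from the normal quantile, using the conventional 1.5-sigma long-term shift. This is the standard textbook conversion, sketched here for illustration; the paper's own computation may differ in detail.

    ```python
    from statistics import NormalDist

    def dpmo(defect_fraction):
        """Defects per million opportunities."""
        return defect_fraction * 1_000_000

    def sigma_level(defect_fraction, shift=1.5):
        """Sigma quality level: normal quantile of the yield, plus the
        conventional 1.5-sigma shift between short- and long-term capability."""
        return NormalDist().inv_cdf(1.0 - defect_fraction) + shift
    ```

    An 18.08% defect fraction is 180,800 DPMO, i.e. roughly a 2.4 sigma process, which makes the scale of the improvement opportunity concrete.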

  20. Design Considerations for Human Rating of Liquid Rocket Engines

    NASA Technical Reports Server (NTRS)

    Parkinson, Douglas

    2010-01-01

    I. Human-rating is specific to each engine: a. Context of program/project must be understood. b. Engine cannot be discussed independently from vehicle and mission. II. Utilize a logical combination of design, manufacturing, and test approaches. a. Design: 1) It is crucial to know the potential ways a system can fail, and how a failure can propagate; 2) Fault avoidance, fault tolerance, DFMR, and caution and warning all have roles to play. b. Manufacturing and Assembly: 1) As-built vs. as-designed; 2) Review procedures for assembly and maintenance periodically; and 3) Keep personnel trained and certified. c. There is no substitute for test: 1) Analytical tools are constantly advancing, but still need test data for anchoring assumptions; 2) Demonstrate robustness and explore sensitivities; 3) Ideally, flight will be encompassed by ground test experience. III. Consistency and repeatability are key in production. a. Maintain robust processes and procedures for inspection and quality control based upon development and qualification experience; b. Establish methods to "spot check" quality and consistency in parts: 1) Dedicated ground test engines; 2) Random components pulled from the line/lot to go through "enhanced" testing.

  1. The role of principal in optimizing school climate in primary schools

    NASA Astrophysics Data System (ADS)

    Murtedjo; Suharningsih

    2018-01-01

    This article was written based on the case of an elementary school that, once overlooked because of its low quality, became the school of choice of the surrounding community, with many national achievements. The article is based on research data collected in primary schools and focuses on the role of school principals in efforts to optimize school climate. To describe the principal's role in optimizing school climate, a qualitative approach with a multi-site study design was used. Informants were selected using the snowball technique. Data were collected through in-depth interviews, participant observation, and documentation. Data credibility was checked using triangulation techniques, member checks, and peer discussions. Auditability was verified by an auditor. The collected data were analyzed by site analysis and cross-site analysis. The results show that the principal optimizes a conducive school climate by creating pleasant physical and socio-emotional conditions in the school, so that teachers become enthusiastic in implementing the learning process and learners are happy, which ultimately improves learning achievement and school quality.

  2. The role of hospital managers in quality and patient safety: a systematic review

    PubMed Central

    Parand, Anam; Dopson, Sue; Renz, Anna; Vincent, Charles

    2014-01-01

    Objectives To review the empirical literature to identify the activities, time spent and engagement of hospital managers in quality of care. Design A systematic review of the literature. Methods A search was carried out on the databases MEDLINE, PSYCHINFO, EMBASE, HMIC. The search strategy covered three facets: management, quality of care and the hospital setting comprising medical subject headings and key terms. Reviewers screened 15 447 titles/abstracts and 423 full texts were checked against inclusion criteria. Data extraction and quality assessment were performed on 19 included articles. Results The majority of studies were set in the USA and investigated Board/senior level management. The most common research designs were interviews and surveys on the perceptions of managerial quality and safety practices. Managerial activities comprised strategy, culture and data-centred activities, such as driving improvement culture and promotion of quality, strategy/goal setting and providing feedback. Significant positive associations with quality included compensation attached to quality, using quality improvement measures and having a Board quality committee. However, there is an inconsistency and inadequate employment of these conditions and actions across the sample hospitals. Conclusions There is some evidence that managers’ time spent and work can influence quality and safety clinical outcomes, processes and performance. However, there is a dearth of empirical studies, further weakened by a lack of objective outcome measures and little examination of actual actions undertaken. We present a model to summarise the conditions and activities that affect quality performance. PMID:25192876

  3. Intramural Comparison of NIST Laser and Optical Fiber Power Calibrations.

    PubMed

    Lehman, John H; Vayshenker, Igor; Livigni, David J; Hadler, Joshua

    2004-01-01

    The responsivity of two optical detectors was determined by the method of direct substitution in four different NIST measurement facilities. The measurements were intended to demonstrate the determination of absolute responsivity as provided by NIST calibration services at laser and optical-communication wavelengths, nominally 633 nm, 850 nm, 1060 nm, 1310 nm, and 1550 nm. The optical detectors have been designated as check standards for the purpose of routine intramural comparison of our calibration services and to meet requirements of the NIST quality system, based on ISO 17025. The check standards are two optical-trap detectors, one based on silicon and the other on indium gallium arsenide photodiodes. The four measurement services are based on: (1) the laser-optimized cryogenic radiometer (LOCR) and free-field collimated laser light; (2) the C-series isoperibol calorimeter and free-field collimated laser light; (3) the electrically calibrated pyroelectric radiometer and fiber-coupled laser light; (4) the pyroelectric wedge trap detector, which measures light from a lamp source and monochromator. The results indicate that the responsivities of the check standards, as determined independently using the four services, agree to within the published expanded uncertainty, ranging from approximately 0.02% to 1.24%.
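    The direct-substitution method itself reduces to a ratio: the calibrated standard measures the optical power of the beam, and the detector under test is then substituted into the same beam. The sketch below is an idealized illustration (the actual services add wavelength, nonlinearity, and uncertainty corrections, and the variable names are assumptions).

    ```python
    def responsivity_by_substitution(reading_standard, responsivity_standard, reading_dut):
        """Direct substitution: infer the beam power from the calibrated
        standard's reading and known responsivity, then divide the
        device-under-test reading taken in the same beam by that power."""
        power_w = reading_standard / responsivity_standard  # beam power in W
        return reading_dut / power_w                        # DUT responsivity
    ```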

  4. Impact of dose calibrators quality control programme in Argentina

    NASA Astrophysics Data System (ADS)

    Furnari, J. C.; de Cabrejas, M. L.; del C. Rotta, M.; Iglicki, F. A.; Milá, M. I.; Magnavacca, C.; Dima, J. C.; Rodríguez Pasqués, R. H.

    1992-02-01

    The national Quality Control (QC) programme for radionuclide calibrators started 12 years ago. Accuracy and the implementation of a QC programme were evaluated over these years at 95 nuclear medicine laboratories where dose calibrators were in use. During all that time, the Metrology Group of CNEA has distributed 137Cs sealed sources to check stability and has been performing periodic "checking rounds" and postal surveys using unknown samples (external quality control). An account of the results of both methods is presented. At present, more than 65% of the dose calibrators measure activities with an error of less than 10%.

  5. The influence of lathe check depth and orientation on the bond quality of phenol-formaldehyde-bonded birch plywood

    Treesearch

    Anti Rohumaa; Christopher G. Hunt; Mark Hughes; Charles R. Frihart; Janne Logren

    2013-01-01

    During the rotary peeling of veneer for plywood or laminated veneer lumber manufacture, checks are formed in the veneer that are as deep as 70-80% of the veneer thickness. The results of this study show that, during adhesive bond testing, deep lathe checks in birch (Betula pendula Roth.) veneer significantly reduce the shear strength and the...

  6. Quality Control of Meteorological Observations

    NASA Technical Reports Server (NTRS)

    Collins, William; Dee, Dick; Rukhovets, Leonid

    1999-01-01

    The problem of meteorological observation quality control (QC) was first formulated by L. S. Gandin at the Main Geophysical Observatory in the 1970s. Later, in 1988, L. S. Gandin began adapting his ideas on complex quality control (CQC) to the operational environment at the National Centers for Environmental Prediction. The CQC was first applied by L. S. Gandin and his colleagues to the detection and correction of errors in rawinsonde heights and temperatures using a complex of hydrostatic residuals. Later, a full complex of residuals, vertical and horizontal optimal interpolations, and baseline checks were added for the checking and correction of a wide range of meteorological variables. Some other of Gandin's ideas were applied and substantially developed at other meteorological centers. A new statistical QC was recently implemented in the Goddard Data Assimilation System. The central component of any quality control is the buddy check, which is a test of individual suspect observations against available nearby non-suspect observations. A novel feature of this test is that the error variances used for QC decisions are re-estimated on-line. As a result, the allowed tolerances for suspect observations can depend on local atmospheric conditions. The system is then better able to accept extreme values observed in deep cyclones, jet streams, and so on. The basic statements of this adaptive buddy check are described. Some results of the on-line QC, including moisture QC, are presented.
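    A minimal form of the adaptive buddy check described above might look like this. It is a sketch under stated assumptions: the operational version uses optimal interpolation rather than a plain neighbor mean, and full error-covariance statistics rather than the crude on-line variance estimate shown here.

    ```python
    import numpy as np

    def buddy_check(suspect_value, neighbor_values, base_error_var, tol_mult=4.0):
        """Test a suspect observation against nearby non-suspect observations.
        The tolerance adapts to local conditions: the error variance is
        re-estimated on-line from the spread of the neighbors, so extreme
        but locally consistent values (deep cyclones, jet streams) can
        still be accepted."""
        estimate = np.mean(neighbor_values)                    # stand-in for OI estimate
        local_var = base_error_var + np.var(neighbor_values, ddof=1)
        return abs(suspect_value - estimate) <= tol_mult * np.sqrt(local_var)
    ```

    Because the tolerance scales with the neighbors' spread, the same numerical departure can pass in a turbulent region and fail in a quiescent one.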

  7. The Value of clean water: The public's willingness to pay for boatable, fishable, and swimmable quality water

    NASA Astrophysics Data System (ADS)

    Carson, Richard T.; Mitchell, Robert Cameron

    1993-07-01

    This paper presents the findings of a study designed to determine the national benefits of freshwater pollution control. By using data from a national contingent valuation survey, we estimate the aggregate benefits of meeting the goals of the Clean Water Act. A valuation function is estimated which depicts willingness to pay as a function of water quality, income, and other variables. Several validation checks and tests for specific biases are performed, and the benefit estimates are corrected for missing and invalid responses. The two major policy implications from our work are that the benefits and costs of water pollution control efforts are roughly equal and that many of the new policy actions necessary to ensure that all water bodies reach at least a swimmable quality level will not have positive net benefits.

  8. Helping You Choose Quality Ambulatory Care

    MedlinePlus

    Helping you choose: Quality ambulatory care When you need ambulatory care, you should find out some information to help you choose the best ... the center follows rules for patient safety and quality. Go to Quality Check® at www.qualitycheck.org ...

  9. Helping You Choose Quality Hospice Care

    MedlinePlus

    Helping you choose: Quality hospice care When you need hospice care, you should find out some information to help you choose the best ... the service follows rules for patient safety and quality. Go to Quality Check® at www.qualitycheck.org ...

  10. Using a Changing-Criterion Design to Evaluate the Effects of Check-In/Check-Out with Goal Modification

    ERIC Educational Resources Information Center

    McDaniel, Sara C.; Bruhn, Allison L.

    2016-01-01

    Check-in/check-out (CICO) is a Tier 2 behavioral intervention that has demonstrated effectiveness for students with challenging behavior in a variety of educational settings. Existing research has focused primarily on testing the intervention's effectiveness and the role of behavioral function in moderating response to intervention. Only a handful…

  11. VARED: Verification and Analysis of Requirements and Early Designs

    NASA Technical Reports Server (NTRS)

    Badger, Julia; Throop, David; Claunch, Charles

    2014-01-01

    Requirements are a part of every project life cycle; everything going forward in a project depends on them. Good requirements are hard to write, there are few useful tools to test, verify, or check them, and it is difficult to properly marry them to the subsequent design, especially if the requirements are written in natural language. In fact, the inconsistencies and errors in the requirements along with the difficulty in finding these errors contribute greatly to the cost of the testing and verification stage of flight software projects [1]. Large projects tend to have several thousand requirements written at various levels by different groups of people. The design process is distributed and a lack of widely accepted standards for requirements often results in a product that varies widely in style and quality. A simple way to improve this would be to standardize the design process using a set of tools and widely accepted requirements design constraints. The difficulty with this approach is finding the appropriate constraints and tools. Common complaints against the tools available include ease of use, functionality, and available features. Also, although preferable, it is rare that these tools are capable of testing the quality of the requirements.

  12. Technical Note: Development and performance of a software tool for quality assurance of online replanning with a conventional Linac or MR-Linac.

    PubMed

    Chen, Guang-Pei; Ahunbay, Ergun; Li, X Allen

    2016-04-01

    To develop an integrated quality assurance (QA) software tool for online replanning capable of efficiently and automatically checking radiation treatment (RT) planning parameters and gross plan quality, verifying treatment plan data transfer from the treatment planning system (TPS) to the record and verify (R&V) system, performing a secondary monitor unit (MU) calculation with or without the presence of a magnetic field from an MR-Linac, and validating the delivery record consistency with the plan. The software tool, named ArtQA, was developed to obtain and compare plan and treatment parameters from both the TPS and the R&V system database. The TPS data are accessed via direct file reading and the R&V data are retrieved via open database connectivity and structured query language. Plan quality is evaluated with both the logical consistency of planning parameters and the achieved dose-volume histograms. Beams in between the TPS and R&V system are matched based on geometry configurations. To consider the effect of a 1.5 T transverse magnetic field from the MR-Linac in the secondary MU calculation, a method based on a modified Clarkson integration algorithm was developed and tested for a series of clinical situations. ArtQA has been used in the authors' clinic and can quickly detect inconsistencies and deviations in the entire RT planning process. With the use of the ArtQA tool, the efficiency for plan checks including plan quality, data transfer, and delivery checks can be improved by at least 60%. The newly developed independent MU calculation tool for the MR-Linac reduces the difference between the plan and calculated MUs by 10%. The software tool ArtQA can be used to perform a comprehensive QA check from planning to delivery with a conventional Linac or MR-Linac and is an essential tool for online replanning where the QA check needs to be performed rapidly.
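    Matching beams between the TPS and the R&V system by geometric configuration, as ArtQA does, can be sketched as pairing beams whose gantry, collimator, and couch angles agree within a tolerance. The field names and tolerance below are hypothetical, not ArtQA's actual data model.

    ```python
    def match_beams(tps_beams, rv_beams, tol_deg=0.1):
        """Pair TPS beams with R&V beams by geometry; each R&V beam is
        consumed at most once. Returns a list of (tps_id, rv_id) pairs."""
        keys = ("gantry", "collimator", "couch")
        pairs, unmatched_rv = [], list(rv_beams)
        for tb in tps_beams:
            for rb in unmatched_rv:
                if all(abs(tb[k] - rb[k]) <= tol_deg for k in keys):
                    pairs.append((tb["id"], rb["id"]))
                    unmatched_rv.remove(rb)
                    break
        return pairs
    ```

    Beams left unpaired on either side would then be flagged for manual review, since a transfer error is one likely cause.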

  13. Technical Note: Development and performance of a software tool for quality assurance of online replanning with a conventional Linac or MR-Linac

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Guang-Pei, E-mail: gpchen@mcw.edu; Ahunbay, Ergun; Li, X. Allen

    Purpose: To develop an integrated quality assurance (QA) software tool for online replanning capable of efficiently and automatically checking radiation treatment (RT) planning parameters and gross plan quality, verifying treatment plan data transfer from treatment planning system (TPS) to record and verify (R&V) system, performing a secondary monitor unit (MU) calculation with or without a presence of a magnetic field from MR-Linac, and validating the delivery record consistency with the plan. Methods: The software tool, named ArtQA, was developed to obtain and compare plan and treatment parameters from both the TPS and the R&V system database. The TPS data aremore » accessed via direct file reading and the R&V data are retrieved via open database connectivity and structured query language. Plan quality is evaluated with both the logical consistency of planning parameters and the achieved dose–volume histograms. Beams in between the TPS and R&V system are matched based on geometry configurations. To consider the effect of a 1.5 T transverse magnetic field from MR-Linac in the secondary MU calculation, a method based on modified Clarkson integration algorithm was developed and tested for a series of clinical situations. Results: ArtQA has been used in their clinic and can quickly detect inconsistencies and deviations in the entire RT planning process. With the use of the ArtQA tool, the efficiency for plan check including plan quality, data transfer, and delivery check can be improved by at least 60%. The newly developed independent MU calculation tool for MR-Linac reduces the difference between the plan and calculated MUs by 10%. Conclusions: The software tool ArtQA can be used to perform a comprehensive QA check from planning to delivery with conventional Linac or MR-Linac and is an essential tool for online replanning where the QA check needs to be performed rapidly.« less

  14. Use of check lists in assessing the statistical content of medical studies.

    PubMed Central

    Gardner, M J; Machin, D; Campbell, M J

    1986-01-01

    Two check lists are used routinely in the statistical assessment of manuscripts submitted to the "BMJ." One is for papers of a general nature and the other specifically for reports on clinical trials. Each check list includes questions on the design, conduct, analysis, and presentation of studies, and answers to these contribute to the overall statistical evaluation. Only a small proportion of submitted papers are assessed statistically, and these are selected at the refereeing or editorial stage. Examination of the use of the check lists showed that most papers contained statistical failings, many of which could easily be remedied. It is recommended that the check lists should be used by statistical referees, editorial staff, and authors and also during the design stage of studies. PMID:3082452

  15. Variations in Daily Sleep Quality and Type 1 Diabetes Management in Late Adolescents

    PubMed Central

    Queen, Tara L.; Butner, Jonathan; Wiebe, Deborah; Berg, Cynthia A.

    2016-01-01

    Objective To determine how between- and within-person variability in perceived sleep quality were associated with adolescent diabetes management. Methods A total of 236 older adolescents with type 1 diabetes reported daily for 2 weeks on sleep quality, self-regulatory failures, frequency of blood glucose (BG) checks, and BG values. Average, inconsistent, and daily deviations in sleep quality were examined. Results Hierarchical linear models indicated that poorer average and worse daily perceived sleep quality (compared with one’s average) was each associated with more self-regulatory failures. Sleep quality was not associated with frequency of BG checking. Poorer average sleep quality was related to greater risk of high BG. Furthermore, inconsistent and daily deviations in sleep quality interacted to predict higher BG, with more consistent sleepers benefitting more from a night of high-quality sleep. Conclusions Good, consistent sleep quality during late adolescence may benefit diabetes management by reducing self-regulatory failures and risk of high BG. PMID:26994852

  16. Austrian Daily Climate Data Rescue and Quality Control

    NASA Astrophysics Data System (ADS)

    Jurkovic, A.; Lipa, W.; Adler, S.; Albenberger, J.; Lechner, W.; Swietli, R.; Vossberg, I.; Zehetner, S.

    2010-09-01

    Checked climate datasets are a "conditio sine qua non" for all projects relevant to environment and climate. In the framework of climate change studies and analysis, it is essential to work with quality-controlled and trustworthy data. Furthermore, these datasets are used as input for various simulation models. For investigations of extreme events, such as strong precipitation periods and drought periods, we need climate data at high temporal resolution (at least daily resolution). Because of the historical background - during the Second World War the majority of our climate sheets were sent to Berlin, where the historical sheets were destroyed by a bomb attack and important information was lost - only a few climate sheets, mostly duplicates, from before 1939 are available and stored in our climate data archive. In 1970 the Central Institute for Meteorology and Geodynamics in Vienna made a first attempt to digitize climate data by means of punch cards. With the introduction of routine climate data quality control in 1984, we can speak of thoroughly checked daily data (finally checked data, quality flag 6). Our group has been working for 18 years on the digitization and quality control of the historical data for the period 1872 to 1983. Since 2007 it has been possible to intensify this work in the framework of an internal project, Austrian Climate Data Rescue and Quality Control. The aim of this initiative was - and still is - to supply daily data of outstandingly good and uniform quality, so this project is a kind of pre-project for all scientific projects working with daily data. In addition to the routine quality checks (running since 1984) using the commercial Bull software, we are testing our data with additional open-source software, namely ProClim.db. 
By use of this spatial and statistical test procedure, the elements air temperature and precipitation - for several sites in Carinthia - have already been checked, flagged, and corrected. Checking the output (the so-called error list) of ProClim is very time consuming and needs trained staff; in the last instance, however, it is necessary. Following the guideline "Your archive is your business card for quality", the sub-project NEW ARCHIVE was initiated and started at the end of 2009. Our paper archive contains historical, up to 150-year-old climate sheets that are valuable cultural assets. Unfortunately, the storage of these historical and current data treasures turned out to be far from optimal (insufficient protection against dust, dirt, humidity, and light). Because of this, a concept for a new storage system and archive database was generated and has already been partly realized. In a nutshell, this presentation shows on the one hand the importance of recovering historical climate sheets for climate change research - even if it is exhausting and time consuming - and gives on the other hand a general overview of the quality control procedures used at our institute.

  17. Lot quality assurance sampling of sputum acid-fast bacillus smears for assessing sputum smear microscopy centers.

    PubMed

    Selvakumar, N; Murthy, B N; Prabhakaran, E; Sivagamasundari, S; Vasanthan, Samuel; Perumal, M; Govindaraju, R; Chauhan, L S; Wares, Fraser; Santha, T; Narayanan, P R

    2005-02-01

    Twelve microscopy centers in a tuberculosis unit were assessed by blinded checking of eight sputum smears selected using a lot quality assurance sampling (LQAS) method, and by unblinded checking of all positive and five negative slides among the slides examined in a month at each microscopy center. The assessment revealed that the LQAS method can be implemented in the field to monitor the performance of acid-fast bacillus microscopy centers in national tuberculosis control programs.
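
An LQAS decision rule of the kind implied by the abstract can be sketched as follows. The acceptance number and verdict labels are illustrative assumptions, not the programme's actual criteria:

```python
# Sketch of an LQAS-style decision rule for blinded rechecking of smears:
# a controller rechecks a small sample of slides and flags the centre if the
# number of discordant readings exceeds an acceptance number. The acceptance
# number and labels are illustrative assumptions.

def lqas_assess(discordant, acceptance_number=0):
    """discordant: list of bools, True = recheck disagreed with the centre."""
    failures = sum(discordant)
    return "acceptable" if failures <= acceptance_number else "needs supervision"

# Eight slides rechecked, as in the study design:
verdict_ok = lqas_assess([False] * 8)            # "acceptable"
verdict_bad = lqas_assess([False] * 7 + [True])  # "needs supervision"
```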

  18. Lot Quality Assurance Sampling of Sputum Acid-Fast Bacillus Smears for Assessing Sputum Smear Microscopy Centers

    PubMed Central

    Selvakumar, N.; Murthy, B. N.; Prabhakaran, E.; Sivagamasundari, S.; Vasanthan, Samuel; Perumal, M.; Govindaraju, R.; Chauhan, L. S.; Wares, Fraser; Santha, T.; Narayanan, P. R.

    2005-01-01

    Twelve microscopy centers in a tuberculosis unit were assessed by blinded checking of eight sputum smears selected using a lot quality assurance sampling (LQAS) method, and by unblinded checking of all positive and five negative slides among the slides examined in a month at each microscopy center. The assessment revealed that the LQAS method can be implemented in the field to monitor the performance of acid-fast bacillus microscopy centers in national tuberculosis control programs. PMID:15695704

  19. Lake water quality mapping from Landsat

    NASA Technical Reports Server (NTRS)

    Scherz, J. P.

    1977-01-01

    In the project described, remote sensing was used to check the quality of lake waters. The lakes of three Landsat scenes were mapped with the Bendix MDAS multispectral analysis system. From the MDAS color-coded maps, the lake with the worst algae problem was easily located. The lake was closely checked, and 100 cows in the springs which fed the lake were identified as the pollution source. The laboratory and field work involved in the lake classification project is described.

  20. STS-34 onboard view of iodine comparator assembly used to check water quality

    NASA Image and Video Library

    1989-10-23

    STS034-10-014 (18-23 Oct. 1989) --- An onboard 35mm camera provides a closeup view of an STS-34 beverage container doubling as an experiment module for a test involving iodine concentration in onboard water. The examination called for adding starch to a specimen of Atlantis' fuel-cell-produced water. The liquid was then compared against a color chart to determine the degree of iodine content. The experiment was designed by Terry H. Slezak of JSC's Photographic Technology and Television Division.

  1. Real-time simulation model of the HL-20 lifting body

    NASA Technical Reports Server (NTRS)

    Jackson, E. Bruce; Cruz, Christopher I.; Ragsdale, W. A.

    1992-01-01

    A proposed manned spacecraft design, designated the HL-20, has been under investigation at Langley Research Center. Included in that investigation are flight control design and flying qualities studies utilizing a man-in-the-loop real-time simulator. This report documents the current real-time simulation model of the HL-20 lifting body vehicle, known as version 2.0, presently in use at NASA Langley Research Center. Included are data on vehicle aerodynamics, inertias, geometries, guidance and control laws, and cockpit displays and controllers. In addition, trim case and dynamic check case data are provided. The intent of this document is to provide the reader with sufficient information to develop and validate an equivalent simulation of the HL-20 for use in real-time or analytical studies.

  2. Member Checking: A Tool to Enhance Trustworthiness or Merely a Nod to Validation?

    PubMed

    Birt, Linda; Scott, Suzanne; Cavers, Debbie; Campbell, Christine; Walter, Fiona

    2016-06-22

    The trustworthiness of results is the bedrock of high quality qualitative research. Member checking, also known as participant or respondent validation, is a technique for exploring the credibility of results. Data or results are returned to participants to check for accuracy and resonance with their experiences. Member checking is often mentioned as one in a list of validation techniques. This simplistic reporting might not acknowledge the value of using the method, nor its juxtaposition with the interpretative stance of qualitative research. In this commentary, we critique how member checking has been used in published research, before describing and evaluating an innovative in-depth member checking technique, Synthesized Member Checking. The method was used in a study with patients diagnosed with melanoma. Synthesized Member Checking addresses the co-constructed nature of knowledge by providing participants with the opportunity to engage with, and add to, interview and interpreted data, several months after their semi-structured interview. © The Author(s) 2016.

  3. 42 CFR 493.1254 - Standard: Maintenance and function checks.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 5 2010-10-01 2010-10-01 false Standard: Maintenance and function checks. 493.1254 Section 493.1254 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) STANDARDS AND CERTIFICATION LABORATORY REQUIREMENTS Quality System for Nonwaived...

  4. An Adaptive Buddy Check for Observational Quality Control

    NASA Technical Reports Server (NTRS)

    Dee, Dick P.; Rukhovets, Leonid; Todling, Ricardo; DaSilva, Arlindo M.; Larson, Jay W.; Einaudi, Franco (Technical Monitor)

    2000-01-01

    An adaptive buddy check algorithm is presented that adjusts tolerances for outlier observations based on the variability of surrounding data. The algorithm derives from a statistical hypothesis test combined with maximum-likelihood covariance estimation. Its stability is shown to depend on the initial identification of outliers by a simple background check. The adaptive feature ensures that the final quality control decisions are not very sensitive to prescribed statistics of first-guess and observation errors, nor to other approximations introduced into the algorithm. The implementation of the algorithm in a global atmospheric data assimilation system is described. Its performance is contrasted with that of a non-adaptive buddy check for the surface analysis of an extreme storm that took place in Europe on 27 December 1999. The adaptive algorithm allowed the inclusion of many important observations that differed greatly from the first guess and that would have been excluded on the basis of prescribed statistics. The analysis of the storm development was much improved as a result of these additional observations.
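
A toy version of the idea (not the paper's operational algorithm) scales the outlier tolerance by the spread of the surrounding "buddy" observations, so that a large departure is tolerated in a highly variable field but rejected in a calm one:

```python
import statistics

# Toy buddy check: an observation is accepted if its departure from the mean
# of its neighbouring "buddy" observations stays within k standard deviations
# of the local spread, so the tolerance adapts to the surrounding variability.
# Simplified sketch, not the operational algorithm of the paper.

def buddy_check(obs, buddies, k=3.0):
    mean = statistics.fmean(buddies)
    spread = statistics.pstdev(buddies)
    tolerance = k * max(spread, 1e-6)   # guard against a zero tolerance
    return abs(obs - mean) <= tolerance

# The same observation passes against highly variable buddies...
accepted = buddy_check(10.0, [2.0, 14.0, 6.0, 12.0])    # True
# ...but is rejected against a near-uniform neighbourhood.
rejected = not buddy_check(10.0, [5.0, 5.1, 4.9, 5.0])  # True
```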

  5. Systematic Review of the Check-In, Check-Out Intervention for Students at Risk for Emotional and Behavioral Disorders

    ERIC Educational Resources Information Center

    Hawken, Leanne S.; Bundock, Kaitlin; Kladis, Kristin; O'Keeffe, Breda; Barret, Courtenay A.

    2014-01-01

    The purpose of this systematic literature review was to summarize outcomes of the Check-in Check-out (CICO) intervention across elementary and secondary settings. Twenty-eight studies utilizing both single subject and group (experimental and quasi-experimental) designs were included in this review. Median effect sizes across the eight group…

  6. The Effects of Check-In/Check-Out on Problem Behavior and Academic Engagement in Elementary School Students

    ERIC Educational Resources Information Center

    Miller, Leila M.; Dufrene, Brad A.; Sterling, Heather E.; Olmi, D. Joe; Bachmayer, Erica

    2015-01-01

    This study evaluated the effectiveness of Check-in/Check-out (CICO) for improving behavioral performance for three students referred for Tier 2 behavioral supports. An ABAB withdrawal design was used to evaluate CICO and results indicate that intervention was effective for reducing problem behavior as well as increasing academic engagement for all…

  7. Saving Material with Systematic Process Designs

    NASA Astrophysics Data System (ADS)

    Kerausch, M.

    2011-08-01

    Global competition is forcing the stamping industry to further increase quality, shorten time-to-market and reduce total cost. Continuous balancing among these classical time-cost-quality targets throughout the product development cycle is required to ensure future economic success. In today's industrial practice, die layout standards are typically assumed to implicitly ensure the balancing of company-specific time-cost-quality targets. Although die layout standards are a very successful approach, they have two methodical disadvantages. First, the capabilities for tool design have to be continuously adapted to technological innovations, e.g. to take advantage of the full forming capability of new materials. Secondly, the great variety of die design aspects has to be reduced to a generic rule or guideline, e.g. binder shape, draw-in conditions or the use of drawbeads. It is therefore important not to overlook cost or quality opportunities when applying die design standards. This paper describes a systematic workflow with a focus on minimizing material consumption. The starting point of the investigation is a full process plan for a typical structural part, in which all requirements defined by a predefined set of die design standards with industrial relevance are fulfilled. In a first step, binder and addendum geometry is systematically checked for material saving potentials. In a second step, blank shape and draw-in are adjusted to meet thinning, wrinkling and springback targets for a minimum blank solution. Finally, the identified die layout is validated with respect to production robustness against splits, wrinkles and springback. For all three steps the applied methodology is based on finite element simulation combined with stochastic variation of input variables. With the proposed workflow, a well-balanced (time-cost-quality) production process ensuring minimal material consumption can be achieved.
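
The stochastic robustness check in the final step can be sketched roughly as follows. The linear surrogate formula, nominal values, scatter and the 25% thinning limit are entirely hypothetical stand-ins for a finite element simulation:

```python
import random

# Rough sketch of the step-three robustness check: vary noisy process inputs
# around nominal values and count violations of a thinning limit. The linear
# surrogate response, nominal values and the 25% limit are made up; a real
# workflow would call a finite element simulation here.

def max_thinning(thickness, friction):
    # hypothetical surrogate response, NOT a physical model
    return 0.18 + 0.5 * (0.20 - friction) + 0.4 * (1.0 - thickness)

random.seed(42)
runs, failures = 1000, 0
for _ in range(runs):
    t = random.gauss(1.0, 0.02)     # blank thickness, mm
    mu = random.gauss(0.12, 0.01)   # friction coefficient
    if max_thinning(t, mu) > 0.25:  # 25% thinning limit -> predicted split
        failures += 1
failure_rate = failures / runs      # fraction of simulated splits
```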

  8. 31 CFR 515.405 - Exportation of securities, currency, checks, drafts and promissory notes.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ..., checks, drafts and promissory notes. 515.405 Section 515.405 Money and Finance: Treasury Regulations..., drafts and promissory notes. Section 515.201 prohibits the exportation of securities, currency, checks, drafts and promissory notes to a designated foreign country. ...

  9. 31 CFR 515.405 - Exportation of securities, currency, checks, drafts and promissory notes.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ..., checks, drafts and promissory notes. 515.405 Section 515.405 Money and Finance: Treasury Regulations..., drafts and promissory notes. Section 515.201 prohibits the exportation of securities, currency, checks, drafts and promissory notes to a designated foreign country. ...

  10. 31 CFR 515.405 - Exportation of securities, currency, checks, drafts and promissory notes.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ..., checks, drafts and promissory notes. 515.405 Section 515.405 Money and Finance: Treasury Regulations..., drafts and promissory notes. Section 515.201 prohibits the exportation of securities, currency, checks, drafts and promissory notes to a designated foreign country. ...

  11. 31 CFR 515.405 - Exportation of securities, currency, checks, drafts and promissory notes.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ..., checks, drafts and promissory notes. 515.405 Section 515.405 Money and Finance: Treasury Regulations..., drafts and promissory notes. Section 515.201 prohibits the exportation of securities, currency, checks, drafts and promissory notes to a designated foreign country. ...

  12. 31 CFR 515.405 - Exportation of securities, currency, checks, drafts and promissory notes.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ..., checks, drafts and promissory notes. 515.405 Section 515.405 Money and Finance: Treasury Regulations..., drafts and promissory notes. Section 515.201 prohibits the exportation of securities, currency, checks, drafts and promissory notes to a designated foreign country. ...

  13. 40 CFR 51.359 - Quality control.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 2 2014-07-01 2014-07-01 false Quality control. 51.359 Section 51.359 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS REQUIREMENTS FOR... to assure test accuracy. Computer control of quality assurance checks and quality control charts...

  14. [Advanced information technologies for financial services industry]. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    The project scope is to develop an advanced user interface utilizing speech and/or handwriting recognition technology that will improve the accuracy and speed of recording transactions in the dynamic environment of a foreign exchange (FX) trading floor. The project's desired result is to improve the base technology for traders' workstations on FX trading floors. Improved workstation effectiveness will allow vast amounts of complex information and events to be presented and analyzed, thus increasing the volume of money and other assets to be exchanged at an accelerated rate. The project scope is to develop and demonstrate technologies that advance interbank check imaging and paper check truncation. The following describes the tasks to be completed: (1) Identify the economic value case, the legal and regulatory issues, the business practices that are affected, and the effects upon settlement. (2) Familiarization with existing imaging technology. Develop requirements for image quality, security, and interoperability. Adapt existing technologies to meet requirements. (3) Define requirements for the imaging laboratory and design its architecture. Integrate and test technology from task 2 with equipment in the laboratory. (4) Develop and/or integrate and test remaining components, including security, storage, and communications. (5) Build a prototype system and test in a laboratory. Install and run in two or more banks. Develop documentation. Conduct training. The project's desired result is to enable a proof-of-concept trial in which multiple banks will exchange check images, exhibiting operating conditions which a check experiences as it travels through the payments/clearing system. The trial should demonstrate the adequacy of digital check images instead of paper checks.

  15. 77 FR 67344 - Proposed Information Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-09

    ... Criminal History Checks. DATES: Written comments must be submitted to the individual and office listed in... methodology and assumptions used; Enhance the quality, utility, and clarity of the information to be collected... Criminal History Check. CNCS and its grantees must ensure that national service beneficiaries are protected...

  16. Langley Wind Tunnel Data Quality Assurance-Check Standard Results

    NASA Technical Reports Server (NTRS)

    Hemsch, Michael J.; Grubb, John P.; Krieger, William B.; Cler, Daniel L.

    2000-01-01

    A framework for statistical evaluation, control and improvement of wind tunnel measurement processes is presented. The methodology is adapted from elements of the Measurement Assurance Plans developed by the National Bureau of Standards (now the National Institute of Standards and Technology) for standards and calibration laboratories. The present methodology is based on the notions of statistical quality control (SQC) together with check standard testing and a small number of customer repeat-run sets. The results of check standard and customer repeat-run sets are analyzed using the statistical control chart methods of Walter A. Shewhart, long familiar to the SQC community. Control chart results are presented for various measurement processes in five facilities at Langley Research Center. The processes include test section calibration, force and moment measurements with a balance, and instrument calibration.
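
Shewhart-style monitoring of check standard results can be illustrated with a generic individuals chart. The 3-sigma limit convention is textbook SQC practice, and the sample values are invented, not the paper's measurements:

```python
import statistics

# Generic Shewhart individuals chart for check standard results: control
# limits are set at the grand mean +/- 3 standard deviations of past runs,
# and a later run falling outside the limits signals a special cause.
# Limit convention is textbook SQC; the data values are invented.

def control_limits(history, k=3.0):
    centre = statistics.fmean(history)
    sigma = statistics.pstdev(history)
    return centre - k * sigma, centre + k * sigma

history = [0.512, 0.508, 0.511, 0.509, 0.510, 0.511, 0.509]
lcl, ucl = control_limits(history)

# A new check standard run is compared against the limits:
in_control = lcl <= 0.513 <= ucl   # no special cause signalled
```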

  17. Microbiological water methods: quality control measures for Federal Clean Water Act and Safe Drinking Water Act regulatory compliance.

    PubMed

    Root, Patsy; Hunt, Margo; Fjeld, Karla; Kundrat, Laurie

    2014-01-01

    Quality assurance (QA) and quality control (QC) data are required in order to have confidence in the results from analytical tests and the equipment used to produce those results. Some AOAC water methods include specific QA/QC procedures, frequencies, and acceptance criteria, but these are considered to be the minimum controls needed to perform a microbiological method successfully. Some regulatory programs, such as those at Code of Federal Regulations (CFR), Title 40, Part 136.7 for chemistry methods, require additional QA/QC measures beyond those listed in the method, which can also apply to microbiological methods. Essential QA/QC measures include sterility checks, reagent specificity and sensitivity checks, assessment of each analyst's capabilities, analysis of blind check samples, and evaluation of the presence of laboratory contamination and instrument calibration and checks. The details of these procedures, their performance frequency, and expected results are set out in this report as they apply to microbiological methods. The specific regulatory requirements of CFR Title 40 Part 136.7 for the Clean Water Act, the laboratory certification requirements of CFR Title 40 Part 141 for the Safe Drinking Water Act, and the International Organization for Standardization 17025 accreditation requirements under The NELAC Institute are also discussed.

  18. Managed aquifer recharge by a check dam to improve the quality of fluoride-rich groundwater: a case study from southern India.

    PubMed

    Gowrisankar, G; Jagadeshan, G; Elango, L

    2017-04-01

    In many regions around the globe, including India, degradation in the quality of groundwater is of great concern. The objective of this investigation is to determine the effect of recharge from a check dam on the quality of groundwater in a region of Krishnagiri District of Tamil Nadu State, India. For this study, water samples from 15 wells were periodically obtained and analysed for major ions and fluoride concentrations. The amounts of major ions present in groundwater were compared with the drinking water guideline values of the Bureau of Indian Standards. With respect to the sodium and fluoride concentrations, 38% of the groundwater samples collected were not suitable for direct use as drinking water. Suitability of water for agricultural use was determined considering the electrical conductivity, sodium adsorption ratio, sodium percentage, permeability index, and the Wilcox and United States Salinity Laboratory diagrams. The influence of freshwater recharge from the dam is evident, as the groundwater in wells nearer to the check dam was suitable for both irrigation and domestic purposes, whereas the groundwater away from the dam had a high ionic composition. This study demonstrated that in other fluoride-affected areas, fluoride concentrations can be reduced by dilution through the construction of check dams as a measure of managed aquifer recharge.
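
One of the irrigation indices named above, the sodium adsorption ratio (SAR), follows a standard formula with ion concentrations in meq/L; the sample concentrations below are invented for illustration:

```python
import math

# Standard sodium adsorption ratio (SAR) used in USSL-style irrigation
# classification: SAR = Na / sqrt((Ca + Mg) / 2), concentrations in meq/L.
# The sample values are invented.

def sodium_adsorption_ratio(na, ca, mg):
    return na / math.sqrt((ca + mg) / 2.0)

sar = sodium_adsorption_ratio(na=6.0, ca=3.0, mg=1.5)  # -> 4.0
# SAR < 10 falls in the USSL "low sodium hazard" class (S1)
```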

  19. Comparing a Behavioral Check-In/Check-Out (CICO) Intervention to Standard Practice in an Urban Middle School Setting Using an Experimental Group Design

    ERIC Educational Resources Information Center

    Simonsen, Brandi; Myers, Diane; Briere, Donald E., III

    2011-01-01

    Students who continue to demonstrate at-risk behaviors after a school implements schoolwide primary (Tier 1) interventions require targeted-group secondary (Tier 2) interventions. This study was conducted to compare the effectiveness of a targeted-group behavioral check-in/check-out (CICO) intervention with the school's standard practice (SP) with…

  20. A procedure and program to calculate shuttle mask advantage

    NASA Astrophysics Data System (ADS)

    Balasinski, A.; Cetin, J.; Kahng, A.; Xu, X.

    2006-10-01

    A well-known recipe for reducing the mask cost component in product development is to place non-redundant elements of layout databases related to multiple products on one reticle plate [1,2]. Such reticles are known as multi-product, multi-layer, or, in general, multi-IP masks. The composition of the mask set should minimize not only the layout placement cost, but also the cost of the manufacturing process, design flow setup, and product design and introduction to market. An important factor is the quality check, which should be expeditious and enable thorough visual verification to avoid costly modifications once the data are transferred to the mask shop. In this work, in order to enable the layer placement and quality check procedure, we proposed an algorithm in which mask layers are first lined up according to price and field tone [3]. Then, depending on the product die size, expected fab throughput, and scribeline requirements, the subsequent product layers are placed on masks of different grades. The actual reduction of this concept to practice allowed us to understand the tradeoffs between the automation of layer placement and setup-related constraints. For example, the limited options for the number of layers per plate, dictated by the die size and other design feedback, made us consider layer pairing based not only on the final price of the mask set, but also on the cost of mask design and fab-friendliness. We showed that it may be advantageous to introduce manual layer pairing to ensure that, e.g., all interconnect layers would be placed on the same plate, allowing for easy and simultaneous design fixes. Another enhancement was to allow some flexibility in mixing and matching of the layers, such that non-critical layers requiring a low mask grade would be placed in a less restrictive way, reducing the count of orphan layers. 
In summary, we created a program to automatically propose and visualize shuttle mask architectures for design verification, with enhancements resulting from the actual application of the code.
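
The line-up-then-place step can be caricatured as a greedy packing of layers by grade. The layer names, grades and the two-layers-per-plate limit are invented, not the paper's actual parameters:

```python
# Toy sketch of the layer line-up step: sort layers by mask grade/price, then
# greedily pack layers of the same grade onto shared plates. Layer names,
# grades and the per-plate limit are invented.

layers = [
    ("metal1", "high"), ("metal2", "high"), ("via1", "high"),
    ("implant", "low"), ("pad", "low"),
]

def pack_plates(layers, per_plate=2):
    plates = []
    for grade in ("high", "low"):                  # price/grade line-up
        group = [n for n, g in layers if g == grade]
        for i in range(0, len(group), per_plate):  # greedy placement
            plates.append((grade, group[i:i + per_plate]))
    return plates

plates = pack_plates(layers)
# -> [('high', ['metal1', 'metal2']), ('high', ['via1']), ('low', ['implant', 'pad'])]
```

The leftover single-layer plate for `via1` is an "orphan" of the kind the abstract's manual pairing is meant to reduce.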

  1. Quality Work, Quality Control in Technical Services.

    ERIC Educational Resources Information Center

    Horny, Karen L.

    1985-01-01

    Quality in library technical services is explored in light of changes produced by automation. Highlights include a definition of quality; new opportunities and shifting priorities; cataloging (fullness of records, heading consistency, accountability, local standards, automated checking); need for new skills (management, staff); and boons of…

  2. HiVy automated translation of stateflow designs for model checking verification

    NASA Technical Reports Server (NTRS)

    Pingree, Paula

    2003-01-01

    The HiVy tool set enables model checking of finite state machine designs. This is achieved by translating statechart specifications into the input language of the Spin model checker. An abstract syntax of hierarchical sequential automata (HSA) is provided as an intermediate format for the tool set.

  3. Recall intervals for oral health in primary care patients.

    PubMed

    Beirne, P; Forgie, A; Clarkson, Je; Worthington, H V

    2005-04-18

    The frequency with which patients should attend for a dental check-up and the potential effects on oral health of altering recall intervals between check-ups have been the subject of ongoing international debate for almost 3 decades. Although recommendations regarding optimal recall intervals vary between countries and dental healthcare systems, 6-monthly dental check-ups have traditionally been advocated by general dental practitioners in many developed countries. To determine the beneficial and harmful effects of different fixed recall intervals (for example 6 months versus 12 months) for the following different types of dental check-up: a) clinical examination only; b) clinical examination plus scale and polish; c) clinical examination plus preventive advice; d) clinical examination plus preventive advice plus scale and polish. To determine the relative beneficial and harmful effects between any of these different types of dental check-up at the same fixed recall interval. To compare the beneficial and harmful effects of recall intervals based on clinicians' assessment of patients' disease risk with fixed recall intervals. To compare the beneficial and harmful effects of no recall interval/patient driven attendance (which may be symptomatic) with fixed recall intervals. We searched the Cochrane Oral Health Group Trials Register, the Cochrane Central Register of Controlled Trials (CENTRAL), MEDLINE and EMBASE. Reference lists from relevant articles were scanned and the authors of some papers were contacted to identify further trials and obtain additional information. Date of most recent searches: 9th April 2003. 
Trials were selected if they met the following criteria: design - random allocation of participants; participants - all children and adults receiving dental check-ups in primary care settings, irrespective of their level of risk for oral disease; interventions - recall intervals for the following different types of dental check-ups: a) clinical examination only; b) clinical examination plus scale and polish; c) clinical examination plus preventive advice; d) clinical examination plus scale and polish plus preventive advice; e) no recall interval/patient driven attendance (which may be symptomatic); f) clinician risk-based recall intervals; outcomes - clinical status outcomes for dental caries (including, but not limited to, mean dmft/DMFT, dmfs/DMFS scores, caries increment, filled teeth (including replacement restorations), early carious lesions arrested or reversed); periodontal disease (including, but not limited to, plaque, calculus, gingivitis, periodontitis, change in probing depth, attachment level); oral mucosa (presence or absence of mucosal lesions, potentially malignant lesions, cancerous lesions, size and stage of cancerous lesions at diagnosis). In addition the following outcomes were considered where reported: patient-centred outcomes, economic cost outcomes, other outcomes such as improvements in oral health knowledge and attitudes, harms, changes in dietary habits and any other oral health-related behavioural change. Information regarding methods, participants, interventions, outcome measures and results was independently extracted, in duplicate, by two authors. Authors were contacted, where deemed necessary and where possible, for further details regarding study design and for data clarification. A quality assessment of the included trial was carried out. The Cochrane Oral Health Group's statistical guidelines were followed. Only one study (with 188 participants) was included in this review and was assessed as having a high risk of bias. 
This study provided limited data for dental caries outcomes (dmfs/DMFS increment) and economic cost outcomes (reported time taken to provide examinations and treatment). There is insufficient evidence from randomised controlled trials (RCTs) to draw any conclusions regarding the potential beneficial and harmful effects of altering the recall interval between dental check-ups. There is insufficient evidence to support or refute the practice of encouraging patients to attend for dental check-ups at 6-monthly intervals. It is important that high quality RCTs are conducted for the outcomes listed in this review in order to address the objectives of this review.

  4. Summarized Costs, Placement Of Quality Stars, And Other Online Displays Can Help Consumers Select High-Value Health Plans.

    PubMed

    Greene, Jessica; Hibbard, Judith H; Sacks, Rebecca M

    2016-04-01

    Starting in 2017, all state and federal health insurance exchanges will present quality data on health plans in addition to cost information. We analyzed variations in the current design of information on state exchanges to identify presentation approaches that encourage consumers to take quality as well as cost into account when selecting a health plan. Using an online sample of 1,025 adults, we randomly assigned participants to view the same comparative information on health plans, displayed in different ways. We found that consumers were much more likely to select a high-value plan when cost information was summarized instead of detailed, when quality stars were displayed adjacent to cost information, when consumers understood that quality stars signified the quality of medical care, and when high-value plans were highlighted with a check mark or blue ribbon. These approaches, which were equally effective for participants with higher and lower numeracy, can inform the development of future displays of plan information in the exchanges. Project HOPE—The People-to-People Health Foundation, Inc.

  5. Seasonal rationalization of river water quality sampling locations: a comparative study of the modified Sanders and multivariate statistical approaches.

    PubMed

    Varekar, Vikas; Karmakar, Subhankar; Jha, Ramakar

    2016-02-01

    The design of surface water quality sampling locations is a crucial decision-making process for the rationalization of a monitoring network. The quantity, quality, and types of available data (watershed characteristics and water quality data) may affect the selection of an appropriate design methodology. The modified Sanders approach and multivariate statistical techniques [particularly factor analysis (FA)/principal component analysis (PCA)] are well-accepted and widely used techniques for the design of sampling locations. However, their performance may vary significantly with the quantity, quality, and types of available data. In this paper, an attempt has been made to evaluate the performance of these techniques by accounting for the effect of seasonal variation in a situation of limited water quality data but extensive watershed characteristics information, as continuous and consistent river water quality data are usually difficult to obtain, whereas watershed information may be made available through the application of geospatial techniques. A case study of the Kali River, Western Uttar Pradesh, India, is selected for the analysis. Monitoring was carried out at 16 sampling locations. The discrete and diffuse pollution loads at the different sampling sites were estimated and accounted for using the modified Sanders approach, whereas the monitored physical and chemical water quality parameters were utilized as inputs for FA/PCA. The optimum numbers of sampling locations designed for the monsoon and non-monsoon seasons by the modified Sanders approach are eight and seven, while those for FA/PCA are eleven and nine, respectively. Little variation in the number and locations of the designed sampling sites was obtained by the two techniques, which shows the stability of the results. A geospatial analysis has also been carried out to check the significance of the designed sampling locations with respect to river basin characteristics and land use of the study area. 
Both methods are efficient; however, the modified Sanders approach outperforms FA/PCA when water quality data are limited but extensive watershed information is available. The available water quality dataset is limited, and the FA/PCA-based approach fails to identify monitoring locations with higher variation, as these multivariate statistical approaches are data-driven. The priority/hierarchy and number of sampling sites designed by the modified Sanders approach are well justified by the land use practices and observed river basin characteristics of the study area.
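
Since FA/PCA-style rationalization essentially removes locations that carry redundant information, a minimal stand-in for the idea is a pairwise correlation screen. The stations, records and the 0.95 cutoff below are hypothetical:

```python
# Minimal stand-in for the redundancy idea behind FA/PCA-based network
# rationalization: if two candidate locations record nearly identical
# variation of a parameter, one of them adds little information and is a
# candidate for removal. Stations, records and the cutoff are hypothetical.

def pearson(a, b):
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    var_a = sum((x - ma) ** 2 for x in a)
    var_b = sum((y - mb) ** 2 for y in b)
    return cov / (var_a * var_b) ** 0.5

records = {
    "S1": [2.1, 2.4, 2.2, 2.8, 3.0],
    "S2": [2.0, 2.5, 2.3, 2.9, 3.1],   # tracks S1 almost exactly
    "S3": [5.0, 1.2, 4.4, 0.9, 2.2],
}

names = list(records)
redundant = [
    (a, b)
    for i, a in enumerate(names)
    for b in names[i + 1:]
    if pearson(records[a], records[b]) > 0.95
]
# redundant == [("S1", "S2")]: S2 could be dropped from the candidate network
```

A full FA/PCA treatment would instead work from component loadings across many parameters, but the data-driven character noted above is the same.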

  6. The ALL-OUT Library; A Design for Computer-Powered, Multidimensional Services.

    ERIC Educational Resources Information Center

    Sleeth, Jim; LaRue, James

    1983-01-01

    Preliminary description of design of electronic library and home information delivery system highlights potentials of personal computer interface program (applying for service, assuring that users are valid, checking for measures, searching, locating titles) and incorporation of concepts used in other information systems (security checks,…

  7. From ISO 9001:2008 to ISO 9001:2015: Significant changes and their impacts to aspiring organizations

    NASA Astrophysics Data System (ADS)

    Sari, Y.; Wibisono, E.; Wahyudi, R. D.; Lio, Y.

    2017-11-01

    ISO 9001:2015 is the latest version of the ISO Quality Management System standard, recently updated from ISO 9001:2008. All organizations that have implemented and been certified against ISO 9001:2008 need to prepare the transition and upgrade their Quality Management System, because their certification will expire by September 2018. This paper attempts to provide knowledge on the significant changes from ISO 9001:2008 to ISO 9001:2015, what new requirements are added, and how they would impact organizations. An exploratory, applied research approach was chosen, aimed at exploring what transition designs are needed to anticipate the changes as well as their impacts. The research applied the Plan-Do-Check-Act (PDCA) cycle in four organizations, and the results were compared and discussed to derive the transition designs. Qualitative methods such as observation and interviews were used to collect the data. To address the new requirements, three transition designs should be prepared: (i) identifying the needs of interested parties, (ii) analyzing internal and external factors of the organization to formulate relevant strategies and quality objectives, and (iii) registering risks associated with business processes as well as organizational strategies.

  8. Toward improved design of check dam systems: A case study in the Loess Plateau, China

    NASA Astrophysics Data System (ADS)

    Pal, Debasish; Galelli, Stefano; Tang, Honglei; Ran, Qihua

    2018-04-01

    Check dams are one of the most common strategies for controlling sediment transport in erosion-prone areas, along with soil and water conservation measures. However, existing mathematical models that simulate sediment production and delivery are often unable to simulate how the storage capacity of check dams varies with time. To explicitly account for this process, and to support the design of check dam systems, we developed a modelling framework consisting of two components: (1) the spatially distributed Soil Erosion and Sediment Delivery Model (WaTEM/SEDEM), and (2) a network-based model of check dam storage dynamics. The two models are run sequentially, with the second model receiving the initial sediment input to check dams from WaTEM/SEDEM. The framework is first applied to the Shejiagou catchment, a 4.26 km2 area located in the Loess Plateau, China, where we study the effect of the existing check dam system on sediment dynamics. Results show that the deployment of check dams significantly altered the sediment delivery ratio of the catchment. Furthermore, the network-based model reveals a large variability in the life expectancy of check dams and abrupt changes in their filling rates. The application of the framework to six alternative check dam deployment scenarios is then used to illustrate its usefulness for planning purposes, and to derive insights on the effect of key decision variables, such as the number, size, and siting of check dams. Simulation results suggest that better performance, in terms of life expectancy and sediment delivery ratio, could have been achieved with an alternative deployment strategy.

  9. Sub-pixel analysis to support graphic security after scanning at low resolution

    NASA Astrophysics Data System (ADS)

    Haas, Bertrand; Cordery, Robert; Gou, Hongmei; Decker, Steve

    2006-02-01

    Whether in the domain of audio, video or finance, our world tends to become increasingly digital. However, for diverse reasons, the transition from analog to digital is often drawn out in time and proceeds in long steps (and sometimes never completes). One such step is the conversion of information on analog media to digital information. We focus in this paper on the conversion (scanning) of printed documents to digital images. Analog media have the advantage over digital channels that they can harbor much imperceptible information usable for fraud detection and forensic purposes, but this secondary information usually fails to be retrieved during the conversion step. This is particularly relevant since the Check 21 Act (Check Clearing for the 21st Century Act) became effective in 2004 and allows images of checks to be handled by banks like usual paper checks. We use this situation of check scanning as our primary benchmark for graphic security features after scanning. We first present a quick review of the most common graphic security features currently found on checks, with their specific purposes, qualities and disadvantages, and we demonstrate their poor survivability under the average scanning conditions expected from the Check 21 Act. We then present a novel method for measuring distances between, and rotations of, line elements in a scanned image: based on an appropriate print model, we refine direct measurements to an accuracy beyond the size of a scanning pixel, so that we can determine expected distances, periodicity, sharpness and print quality of known characters, symbols and other graphic elements in a document image. Finally we apply our method to fraud detection in documents after gray-scale scanning at 300 dpi resolution.
We show in particular that alterations on legitimate checks or copies of checks can be successfully detected by measuring, with sub-pixel accuracy, the irregularities inherently introduced by the illegitimate process.

  10. Redundancy checking algorithms based on parallel novel extension rule

    NASA Astrophysics Data System (ADS)

    Liu, Lei; Yang, Yang; Li, Guangli; Wang, Qi; Lü, Shuai

    2017-05-01

    Redundancy checking (RC) is a key knowledge reduction technique. The extension rule (ER) is a reasoning method first presented in 2003 and well received by experts at home and abroad. The novel extension rule (NER), presented in 2009, is an improved ER-based reasoning method. In this paper, we first analyse the characteristics of the extension rule, and then present a simple algorithm for redundancy checking based on the extension rule (RCER). In addition, we introduce MIMF, a type of heuristic strategy, and use it to design and implement the RCHER algorithm, which relies on MIMF. Next we design and implement RCNER (redundancy checking based on NER). Parallel computing greatly accelerates the NER algorithm, whose tasks have only weak mutual dependence when executed; accordingly, we present PNER (parallel NER) and apply it to redundancy checking and necessity checking. Furthermore, we design and implement the RCPNER (redundancy checking based on PNER) and NCPPNER (necessary clause partition based on PNER) algorithms. The experimental results show that MIMF significantly accelerates RCER on large-scale, highly redundant formulae. Comparing PNER with NER and RCPNER with RCNER, the average speedup can reach the number of task decompositions. Comparing NCPPNER with the RCNER-based algorithm for separating redundant formulae, the speedup increases steadily as the scale of the formulae increases. Finally, we describe the challenges the extension rule will face and suggest possible solutions.

  11. Real-world heart failure management in 10,910 patients with chronic heart failure in the Netherlands : Design and rationale of the Chronic Heart failure ESC guideline-based Cardiology practice Quality project (CHECK-HF) registry.

    PubMed

    Brugts, J J; Linssen, G C M; Hoes, A W; Brunner-La Rocca, H P

    2018-05-01

    Data from patient registries give insight into the management of patients with heart failure (HF), but actual data from unselected real-world HF patients are scarce. Therefore, we performed a cross-sectional study of current HF care in the period 2013-2016 among more than 10,000 unselected HF patients at HF outpatient clinics in the Netherlands. In 34 participating centres, all 10,910 patients with chronic HF treated at cardiology centres were included in the CHECK-HF registry. Of these, most (96%) were managed at a specific HF outpatient clinic. Heart failure was typically diagnosed according to the ESC guidelines 2012, based on signs, symptoms and structural and/or functional cardiac abnormalities. Information on diagnostics, treatment and co-morbidities was recorded, with a specific focus on drug therapy and devices. In our cohort, the mean age was 73 years (SD 12) and 60% were male. Frequent co-morbidities reported in the patient records were diabetes mellitus (30%), hypertension (43%), COPD (19%), and renal insufficiency (58%). In 47% of the patients, ischaemia was the origin of HF. In our registry, the prevalence of HF with preserved ejection fraction was 21%. The CHECK-HF registry will provide insight into the current, real-world management of patients with chronic HF, including HF with reduced, preserved and mid-range ejection fraction, which will help define ways to improve quality of care. Drug and device therapy and guideline adherence, as well as interactions with age, gender and co-morbidities, will receive specific attention.

  12. Discussion on design and stress checking of cast-in-place bracket

    NASA Astrophysics Data System (ADS)

    Xi, Tang Xian; Yong, He; Hu, Sun Shuan

    2018-04-01

    The cast-in-place bracket is the main support structure in bridge construction. Its strength, stiffness and stability have a direct impact on the quality and safety of bridge construction. The design and calculation of the bracket for a prestressed concrete box girder are analyzed in this paper. Models of the Bailey beams, steel crossbeams and steel columns are established in finite element software, and the strength, stiffness and stability of each model under the most unfavorable load are analyzed with MIDAS Civil. The analysis results verify that the support plan meets the relevant specifications and construction requirements, confirming the feasibility of the support scheme. The paper can provide reference and guidance for similar engineering construction.

  13. Is a quasi-3D dosimeter better than a 2D dosimeter for Tomotherapy delivery quality assurance?

    NASA Astrophysics Data System (ADS)

    Xing, Aitang; Deshpande, Shrikant; Arumugam, Sankar; George, Armia; Holloway, Lois; Vial, Philip; Goozee, Gary

    2015-01-01

    Delivery quality assurance (DQA) has been performed for each Tomotherapy patient using either ArcCHECK or MatriXX Evolution in our clinic since 2012. ArcCHECK is a quasi-3D dosimeter, whereas MatriXX is a 2D detector. A review of DQA results was performed for all patients over the last three years, a total of 221 DQA plans. These DQA plans came from 215 patients with a variety of treatment sites, including head and neck, pelvis, and chest wall. The acceptance criterion in our clinic is a Gamma pass rate over 95%, using a 3 mm distance-to-agreement and 3% of the maximum planned dose, with a 10% dose threshold. The mean Gamma pass rates were 98.2% ± 1.98 (1 SD) for MatriXX and 98.5% ± 1.88 (1 SD) for ArcCHECK. A paired t-test was also performed for the group of patients whose DQA was performed with both the ArcCHECK and the MatriXX; no statistically significant difference in Gamma pass rate was found between ArcCHECK and MatriXX. The considered 3D and 2D dosimeters achieved similar results in routine patient-specific DQA for patients treated on a TomoTherapy unit.
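
    The 3%/3 mm Gamma criterion used above combines a dose-difference test with a distance-to-agreement test. The following is a deliberately simplified 1-D global Gamma sketch (clinical tools work in 2-D/3-D with interpolation); the dose profiles are made up for illustration.

```python
import numpy as np

def gamma_pass_rate(ref, meas, spacing, dd=0.03, dta=3.0, threshold=0.10):
    """Simplified 1-D global Gamma analysis (a sketch, not a clinical tool).
    ref, meas: dose arrays on the same grid; spacing: grid step in mm;
    dd: dose-difference criterion as a fraction of max(ref);
    dta: distance-to-agreement in mm; threshold: low-dose cutoff."""
    dmax = ref.max()
    x = np.arange(len(ref)) * spacing
    evaluated = ref >= threshold * dmax       # skip low-dose points
    gammas = []
    for i in np.where(evaluated)[0]:
        dose_term = (meas - ref[i]) / (dd * dmax)
        dist_term = (x - x[i]) / dta
        # Gamma = minimum combined dose/distance deviation over all points
        gammas.append(np.sqrt(dose_term ** 2 + dist_term ** 2).min())
    gammas = np.array(gammas)
    return 100.0 * (gammas <= 1.0).mean()     # percent of points passing

ref = np.sin(np.linspace(0, np.pi, 50)) * 2.0   # toy reference dose profile
meas = ref * 1.01                               # measurement with 1% offset
rate = gamma_pass_rate(ref, meas, spacing=1.0)
```

    A 1% global offset stays well inside the 3% dose criterion, so every evaluated point passes in this toy case.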

  14. Needs assessment and implementation of an employee assistance program: promoting a healthier work force.

    PubMed

    Monfils, M K

    1995-05-01

    1. The functions of a continuous quality improvement tool used by Deming--the Plan, Do, Check, Act Cycle--can be applied to the assessment, implementation, and ongoing evaluation of an Employee Assistance Program (EAP). 2. Various methods are available to assess the need for an EAP. As much data as possible should be collected to qualify and quantify the need so that management can make an informed decision and develop measures to determine program effectiveness. 3. Once an EAP is implemented, it should be monitored continually against the effectiveness measures initially developed. Using a continuous quality improvement process, the occupational health nurse and the EAP provider can establish a dynamic relationship that allows for growth beyond the original design and increased effectiveness of service to employees.

  15. Texas International Airlines LOFT program

    NASA Technical Reports Server (NTRS)

    Sommerville, J.

    1981-01-01

    A line-oriented flight training program allows the crew to work as a team to solve all problems, abnormal or emergency, within the crew concept. A line-oriented check ride takes place every six months for the pilot as a proficiency check. There are advantages and disadvantages to this program. One disadvantage is that, since it is designed as a check ride, the scenarios must be structured so that the average pilot will complete the check ride without complication. This differs from a proficiency check, which can be stopped at a problem area so that training to proficiency can take place before proceeding with the check.

  16. Check out the Atmospheric Science User Forum

    Atmospheric Science Data Center

    2016-11-16

    Check out the Atmospheric Science User Forum Tuesday, November 15, 2016 The ASDC would like to bring your attention to the Atmospheric Science User Forum. The purpose of this forum is to improve user service, quality, and efficiency of NASA atmospheric science data. The forum intends to provide a quick and easy way to facilitate ...

  17. Beyond Member-Checking: A Dialogic Approach to the Research Interview

    ERIC Educational Resources Information Center

    Harvey, Lou

    2015-01-01

    This article presents a dialogic qualitative interview design for a narrative study of six international UK university students' motivation for learning English. Based on the work of Mikhail Bakhtin, this design was developed in order to address the limitations of member-checking [Lincoln, Y. S., and E. G. Guba. 1985. "Naturalistic…

  18. Revisiting the Procedures for the Vector Data Quality Assurance in Practice

    NASA Astrophysics Data System (ADS)

    Erdoğan, M.; Torun, A.; Boyacı, D.

    2012-07-01

    The immense use of topographic data in spatial data visualization, business GIS (Geographic Information Systems) solutions and applications, and mobile and location-based services has forced topo-data providers to create standard, up-to-date and complete data sets in a sustainable frame. Data quality has been studied and researched for more than two decades: there have been countless references on its semantics, its conceptual and logical representations, and many applications in spatial databases and GIS. However, there is a gap between research and practice in the sense of spatial data quality, which increases the costs and decreases the efficiency of data production. Spatial data quality is well known to academia and industry, but usually in different contexts. Research on spatial data quality has stated several issues of practical use, such as descriptive information, metadata, fulfillment of spatial relationships among data, integrity measures, geometric constraints, etc. Industry and data producers realize them in three stages: pre-, co- and post-data capturing. The pre-data-capturing stage covers semantic modelling, data definition, cataloguing, modelling, data dictionary and schema creation processes. The co-data-capturing stage covers general rules of spatial relationships; data- and model-specific rules such as topologic and model-building relationships; geometric thresholds; data extraction guidelines; and object-object, object-belonging-class, object-non-belonging-class and class-class relationships to be taken into account during data capturing. The post-data-capturing stage covers specified QC (quality check) benchmarks and checking compliance with general and specific rules. The vector data quality criteria differ between the views of producers and users, but these criteria are generally driven by the needs, expectations and feedback of the users. This paper presents a practical method which closes the gap between theory and practice.
Turning spatial data quality concepts into developments and applications requires the conceptual, logical and, most importantly, physical existence of a data model, rules and knowledge of realization in the form of geo-spatial data. The applicable metrics and thresholds are determined on this concrete base. This study discusses the application of geo-spatial data quality issues and QA (quality assurance) and QC procedures in topographic data production. First we introduce the MGCP (Multinational Geospatial Co-production Program) data profile of the NATO (North Atlantic Treaty Organization) DFDD (DGIWG Feature Data Dictionary), the requirements of the data owner, the view of data producers for both data capturing and QC, and finally QA to fulfil user needs. Then our practical new approach, which divides quality into three phases, is introduced. Finally, the implementation of our approach to accomplish the metrics, measures and thresholds of the quality definitions is discussed. In this paper, geometry and semantics quality in particular, and the quality control procedures that can be performed by producers, are discussed. Some applicable best practices that we experienced with techniques of quality control, and regulations that define the objectives and data production procedures, are given in the final remarks. These quality control procedures should include visual checks of the source data, captured vector data and printouts, some automatic checks that can be performed by software, and some semi-automatic checks involving interaction with quality control personnel. Finally, these quality control procedures should ensure the geometric, semantic, attribution and metadata quality of vector data.
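
    Post-capture automatic checks of the kind described above can be as simple as validating geometry closure and mandatory attributes per feature. The snippet below is a toy sketch; the feature layout and schema names are invented for illustration and are not MGCP/DFDD definitions.

```python
def qc_feature(feature, schema):
    """Two toy post-capture checks: ring closure (geometric quality) and
    required attributes (semantic/attribution quality).
    'schema' lists the mandatory attribute names."""
    errors = []
    ring = feature["ring"]
    if ring[0] != ring[-1]:                      # polygon rings must close
        errors.append("polygon ring not closed")
    for attr in schema:                          # mandatory attributes filled?
        if feature["attributes"].get(attr) in (None, ""):
            errors.append(f"missing attribute: {attr}")
    return errors

feature = {"ring": [(0, 0), (1, 0), (1, 1), (0, 0)],
           "attributes": {"FCODE": "AL015", "NAM": ""}}
errs = qc_feature(feature, schema=["FCODE", "NAM"])
```

    A production QC suite would add topology rules (overlaps, gaps, connectivity) and threshold checks; the structure, however, is the same: a rule set applied feature by feature with a report of violations.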

  19. Quality assurance of weather data for agricultural system model input

    USDA-ARS?s Scientific Manuscript database

    It is well known that crop production and hydrologic variation on watersheds is weather related. Rarely, however, is meteorological data quality checks reported for agricultural systems model research. We present quality assurance procedures for agricultural system model weather data input. Problems...

  20. An analysis of the ArcCHECK-MR diode array's performance for ViewRay quality assurance.

    PubMed

    Ellefson, Steven T; Culberson, Wesley S; Bednarz, Bryan P; DeWerd, Larry A; Bayouth, John E

    2017-07-01

    The ArcCHECK-MR diode array utilizes a correction system with a virtual inclinometer to correct the angular response dependencies of the diodes. However, this correction system cannot be applied to measurements on the ViewRay MR-IGRT system because the virtual inclinometer is incompatible with the ViewRay's multiple simultaneous beams. Additionally, the ArcCHECK's current correction factors were determined without taking magnetic field effects into account. In the course of performing ViewRay IMRT quality assurance with the ArcCHECK, measurements were observed to be consistently higher than the ViewRay TPS predictions. The goals of this study were to quantify the observed discrepancies and to test whether applying the current factors improves the ArcCHECK's accuracy for measurements on the ViewRay. Gamma and frequency analyses were performed on 19 ViewRay patient plans. Ion chamber measurements were performed at a subset of diode locations using a PMMA phantom with the same dimensions as the ArcCHECK. A new method for applying directionally dependent factors, utilizing beam information from the ViewRay TPS, was developed in order to analyze the current ArcCHECK correction factors. To test the current factors, nine ViewRay plans were altered to be delivered with only a single simultaneous beam and were measured with the ArcCHECK. The current correction factors were applied using both the new and current methods. The new method was also used to apply corrections to the original 19 ViewRay plans. It was found that the ArcCHECK systematically reports doses higher than those actually delivered by the ViewRay. Application of the current correction factors by either method did not consistently improve measurement accuracy. As dose deposition and diode response have both been shown to change under the influence of a magnetic field, it can be concluded that the current ArcCHECK correction factors are invalid and/or inadequate for correcting measurements on the ViewRay system.
© 2017 The Authors. Journal of Applied Clinical Medical Physics published by Wiley Periodicals, Inc. on behalf of American Association of Physicists in Medicine.

  1. Progress Check Module; Basic Electricity and Electronics Individualized Learning System. Progress Check Booklet.

    ERIC Educational Resources Information Center

    Bureau of Naval Personnel, Washington, DC.

    The Progress Check Booklet is designed to be used by the student working in the programed course to determine if he has mastered the concepts in the course booklets on: electrical current; voltage; resistance; measuring current and voltage in series circuits; relationships of current, voltage, and resistance; parellel circuits; combination…

  2. The development of a decision aid for tinnitus.

    PubMed

    Pryce, Helen; Durand, Marie-Anne; Hall, Amanda; Shaw, Rachel; Culhane, Beth-Anne; Swift, Sarah; Straus, Jean; Marks, Elizabeth; Ward, Melanie; Chilvers, Katie

    2018-05-09

    To develop a decision aid for tinnitus care that would meet international consensus standards for decision aid quality. A mixed-methods design was used, including qualitative in-depth interviews, literature review, focus groups, user testing and readability checking, with patients and clinicians who have clinical experience of tinnitus. A decision aid for tinnitus care was developed. It incorporates key evidence of efficacy for the most frequently used tinnitus care options, together with information derived from patient priorities when deciding which choice to make. The decision aid has the potential to enable shared decision making between clinicians and patients in audiology, and it meets consensus standards.

  3. The case for design and build in piped medical gases.

    PubMed

    Cruddas, I

    1990-10-01

    The proposal is not new or radical in that many small works currently are, and historically have been, carried out implicitly utilising this system. Furthermore, this idea does not suggest that M&E consultants be omitted from the process, only that their role be redefined in terms of approving/checking proposals, installations, commissioning etc. There is an appropriate form of contract already available through JCT '80, so why not utilise it, as is being done with boilers, water treatment, lifts etc.? The recommendation would improve quality, reduce time and cost, directly apportion accountability and involve the knowledgeable professionals within the industry.

  4. High-resolution urban observation network for user-specific meteorological information service in the Seoul Metropolitan Area, South Korea

    NASA Astrophysics Data System (ADS)

    Park, Moon-Soo; Park, Sung-Hwa; Chae, Jung-Hoon; Choi, Min-Hyeok; Song, Yunyoung; Kang, Minsoo; Roh, Joon-Woo

    2017-04-01

    To improve our knowledge of urban meteorology, including those processes applicable to high-resolution meteorological models in the Seoul Metropolitan Area (SMA), the Weather Information Service Engine (WISE) Urban Meteorological Observation System (UMS-Seoul) has been designed and installed. The UMS-Seoul incorporates 14 surface energy balance (EB) systems, 7 surface-based three-dimensional (3-D) meteorological observation systems, applied meteorological (AP) observation systems, and the existing surface-based meteorological observation network. The EB system consists of a radiation balance system, sonic anemometers, infrared CO2/H2O gas analyzers, and many sensors measuring wind speed and direction, temperature and humidity, precipitation, and air pressure. The EB-produced radiation, meteorological, and turbulence data will be used to quantify the surface EB according to land use and to improve the boundary-layer and surface processes in meteorological models. The 3-D system, composed of a wind lidar, a microwave radiometer, and an aerosol lidar or ceilometer, produces the cloud height and vertical profiles of aerosol backscatter, wind speed and direction, temperature, humidity, and liquid water content. It will be used for high-resolution observation-based reanalysis data and for the improvement of the boundary-layer, radiation, and microphysics processes in meteorological models. The AP system includes road weather, mosquito activity, water quality, and agrometeorological observation instruments. The standardized metadata for networks and stations are documented and renewed periodically to provide a detailed record of the observation environment. The UMS-Seoul is designed to support real-time data acquisition and display, with automatic quality checking within 10 min of observation. After the quality check, data can be distributed to relevant users such as researchers and policy makers.
Finally, two case studies demonstrate that the observed data have great potential to deepen understanding of boundary-layer structures, improve the performance of high-resolution meteorological models, and provide information customized to user demands in the SMA.

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tan, J; Yan, Y; Hager, F

    Purpose: Radiation therapy has evolved to become not only more precise and potent, but also more complicated to monitor and deliver. More rigorous and comprehensive quality assurance is needed to safeguard ever-advancing radiation therapy. ICRU standards dictate that an ever-growing set of treatment parameters be manually checked weekly by medical physicists. This “weekly chart check” procedure is laborious and subject to human error. A computer-assisted chart checking process will enable more complete and accurate human review of critical parameters, reduce the risk of medical errors, and improve efficiency. Methods: We developed a web-based software system that enables thorough weekly quality assurance checks. In the backend, the software retrieves all machine parameters from a Treatment Management System (TMS) and compares them against the corresponding ones from the treatment planning system; they are also checked for validity against preset rules. The results are displayed as a web page in the front end for physicists to review. A summary report is then generated and uploaded automatically to the TMS as a record of the weekly chart check. Results: The software system has been deployed on a web server in our department’s intranet and has been tested thoroughly by our clinical physicists. A plan parameter is highlighted when it is off the preset limit. The developed system has changed the way charts are checked, with significantly improved accuracy, efficiency, and completeness. It has been shown to be robust, fast, and easy to use. Conclusion: A computer-assisted system has been developed for efficient, accurate, and comprehensive weekly chart checking. The system has been extensively validated and is being implemented for routine clinical use.
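
    The compare-against-plan-and-rules step described in the Methods can be sketched in a few lines: delivered machine parameters are compared with planned values against preset tolerances. The parameter names and tolerance values below are hypothetical, not those of the deployed system.

```python
# Preset tolerance rules (hypothetical): max allowed |planned - delivered|
TOLERANCES = {"mu": 1.0, "gantry_deg": 0.5, "dose_cgy": 2.0}

def check_chart(planned: dict, delivered: dict) -> list:
    """Return a list of (parameter, planned, delivered) tuples for every
    value outside its preset tolerance; an empty list means the chart passes."""
    flags = []
    for param, tol in TOLERANCES.items():
        if abs(planned[param] - delivered[param]) > tol:
            flags.append((param, planned[param], delivered[param]))
    return flags

planned = {"mu": 150.0, "gantry_deg": 180.0, "dose_cgy": 200.0}
delivered = {"mu": 150.4, "gantry_deg": 181.2, "dose_cgy": 200.5}
flags = check_chart(planned, delivered)   # only gantry_deg is out of tolerance
```

    In the described system this loop runs over every field of every plan, and out-of-tolerance parameters are what get highlighted on the review page.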

  6. Architecture for space habitats. Role of architectural design in planning artificial environment for long time manned space missions

    NASA Astrophysics Data System (ADS)

    Martinez, Vera

    2007-02-01

    The paper discusses concepts about the role of architecture in the design of space habitats and develops general evaluation criteria for the contribution of architectural design. Beyond the existing feasibility studies, general requisites, development studies, and critical design reviews, which are mainly based on the experience of human space missions and the standards of the NASA-STD-3000 manual, and which analyze and evaluate the relation between man and environment and between man and machine mainly in terms of functionality, there is very little material on designing for the comfort and wellbeing of humans in a space habitat. Architecture for a space habitat means the design of an artificial environment with much comfort in an "atmosphere" of wellbeing. These are mainly psychological effects of human factors, which are very important in the case of a long-duration space mission. How can the degree of comfort and the "wellbeing atmosphere" in an artificial environment be measured? How can the quality of the architectural contribution to space design be quantified? The paper proposes: the definition of a criteria catalogue to achieve greater objectivity in architectural design evaluation; the definition of constant parameters, derived from project necessities, to quantify the quality of the design; architectural design analysis through the application and verification of these parameters, overlapping and evaluating the results; and interdisciplinary work between architects, astronautics specialists, engineers, psychologists, etc., all the disciplines needed for planning a high-quality habitat for humans in space, together with an analysis of the principles of a well-designed artificial environment. Good quality design for space architecture is the result of the interaction and interrelation between many different project necessities (technological, environmental, human factors, transportation, costs, etc.). Each of these necessities is interrelated in the design project and cannot be evaluated on its own.
Therefore, the design process needs constant checks to choose, each time, the best solution in relation to the whole. As with the main disciplines around human factors, architectural design for space has to be extensively tested to produce scientific improvement.

  7. Agricultural Baseline (BL0) scenario

    DOE Data Explorer

    Davis, Maggie R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)] (ORCID:0000000181319328); Hellwinckel, Chad M [University of Tennessee] (ORCID:0000000173085058); Eaton, Laurence [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)] (ORCID:0000000312709626); Turhollow, Anthony [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)] (ORCID:0000000228159350); Brandt, Craig [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)] (ORCID:0000000214707379); Langholtz, Matthew H. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)] (ORCID:0000000281537154)

    2016-07-13

    Scientific reason for data generation: to serve as the reference case for the BT16 volume 1 agricultural scenarios. The agricultural baseline runs from 2015 through 2040; a starting year of 2014 is used. Date the data set was last modified: 02/12/2016. How each parameter was produced (methods), format, and relationship to other data in the data set: the simulation was developed without offering a farmgate price to energy crops or residues (i.e., building on both the USDA 2015 baseline and the agricultural census data (USDA NASS 2014)). Data generated are .txt output files by year, simulation identifier, and county code (1-3109). Instruments used: POLYSYS (version POLYS2015_V10_alt_JAN22B) supplied by the University of Tennessee APAC. The quality assurance and quality control that have been applied: • Check for negative planted area, harvested area, production, yield and cost values. • Check if harvested area exceeds planted area for annuals. • Check FIPS codes.
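
    The three QC checks listed above translate directly into code. A minimal sketch follows; the record field names are assumptions for illustration, not the actual POLYSYS output columns.

```python
def qc_record(rec, valid_fips):
    """Apply the three QC checks named above to one county-year record:
    no negative values, harvested area within planted area, known FIPS code."""
    errors = []
    for field in ("planted_area", "harvested_area", "production", "yield", "cost"):
        if rec[field] < 0:
            errors.append(f"negative {field}")
    if rec["harvested_area"] > rec["planted_area"]:
        errors.append("harvested area exceeds planted area")
    if rec["fips"] not in valid_fips:
        errors.append("unknown FIPS code")
    return errors

rec = {"planted_area": 100.0, "harvested_area": 120.0, "production": 500.0,
       "yield": 5.0, "cost": -3.0, "fips": "47001"}
errs = qc_record(rec, valid_fips={"47001", "47003"})
```

    Run over every simulation output file, such a pass flags records for review before the baseline is accepted as a reference case.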

  8. Simple colonoscopy reporting system checking the detection rate of colon polyps.

    PubMed

    Kim, Jae Hyun; Choi, Youn Jung; Kwon, Hye Jung; Park, Seun Ja; Park, Moo In; Moon, Won; Kim, Sung Eun

    2015-08-21

    To present a simple colonoscopy reporting system with which the detection rate of colon polyps can be checked easily. A simple colonoscopy reporting system, Kosin Gastroenterology (KG quality reporting system), was developed. The polyp detection rate (PDR), adenoma detection rate (ADR), serrated polyp detection rate (SDR), and advanced adenoma detection rate (AADR) are easily calculated using this system. In our gastroenterology center, the PDR, ADR, SDR, and AADR results for each gastroenterologist were updated every month. Between June 2014, when the program was started, and December 2014, the overall PDR and ADR in our center were 62.5% and 41.4%, respectively, and the overall SDR and AADR were 7.5% and 12.1%, respectively. We envision that the KG quality reporting system can be applied to develop a comprehensive system for checking colon polyp detection rates in other gastroenterology centers.
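    The four detection rates named above share the same arithmetic: the percentage of colonoscopies in which at least one lesion of the given type was found. A minimal sketch with an assumed per-exam record format (the KG system's actual schema is not described in the abstract):

```python
# Hedged sketch of the detection-rate arithmetic behind a quality
# reporting system. Each exam is a dict of lesion counts; the keys
# ("polyp", "adenoma", ...) are illustrative assumptions.

def detection_rate(exams, finding):
    """Percentage of colonoscopies with at least one lesion of the
    given type (e.g. 'polyp' for PDR, 'adenoma' for ADR)."""
    if not exams:
        return 0.0
    hits = sum(1 for exam in exams if exam.get(finding, 0) > 0)
    return 100.0 * hits / len(exams)
```

    Recomputing these per gastroenterologist each month, as the abstract describes, is then just a matter of grouping the exams before calling the function.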

  9. The method of a joint intraday security check system based on cloud computing

    NASA Astrophysics Data System (ADS)

    Dong, Wei; Feng, Changyou; Zhou, Caiqi; Cai, Zhi; Dan, Xu; Dai, Sai; Zhang, Chuancheng

    2017-01-01

    The intraday security check is the core application in the dispatching control system. Existing security check calculations use only the dispatch center’s local model and data as the functional margin. This paper introduces the design of an all-grid intraday joint security check system based on cloud computing and its implementation. To reduce the effect of subarea bad data on the all-grid security check, a new power flow algorithm based on comparison and adjustment with the inter-provincial tie-line plan is presented. A numerical example illustrates the effectiveness and feasibility of the proposed method.

  10. The Modern Measurement Technology And Checking Of Shaft Parameters

    NASA Astrophysics Data System (ADS)

    Tichá, Šárka; Botek, Jan

    2015-12-01

    This paper focuses on rationalizing the checking of shaft parameters in companies engaged in the production of components for electric motors, wind turbines, and vacuum systems. Customers are constantly increasing their requirements for the overall quality of the product, i.e., the quality of machining, dimensional and shape accuracy, and the overall purity of the delivered products. The aim of this paper is to introduce the use of modern measurement technology in inspecting these components and to compare the results with the existing control methodology. The main objective of this rationalization is to eliminate the mistakes and shortcomings of current inspection methods.

  11. Low-Density Parity-Check Code Design Techniques to Simplify Encoding

    NASA Astrophysics Data System (ADS)

    Perez, J. M.; Andrews, K.

    2007-11-01

    This work describes a method for encoding low-density parity-check (LDPC) codes based on the accumulate-repeat-4-jagged-accumulate (AR4JA) scheme, using the low-density parity-check matrix H instead of the dense generator matrix G. Using the H matrix to encode allows a significant reduction in memory consumption and gives the encoder design great flexibility. Also described are new hardware-efficient codes, based on the same kind of protographs, which require less memory storage and area while also reducing the encoding delay.
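    As background to the parity-check matrix H mentioned above: a word c is a valid codeword exactly when the syndrome H·cᵀ is zero (mod 2), and this same sparse structure is what an H-based encoder exploits. A small sketch using a (7,4) Hamming parity-check matrix as a stand-in (the actual AR4JA matrices are far larger and sparser):

```python
import numpy as np

def syndrome(H, codeword):
    """Syndrome s = H @ c (mod 2); an all-zero syndrome means c satisfies
    every parity check, i.e. is a valid codeword."""
    return tuple(np.mod(H @ codeword, 2))

# (7,4) Hamming parity-check matrix as a small, dense stand-in for a
# sparse LDPC H.
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])
```

    A single flipped bit produces a nonzero syndrome equal to the corresponding column of H, which is the observation both decoders and H-based encoders build on.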

  12. Design and scheduling for periodic concurrent error detection and recovery in processor arrays

    NASA Technical Reports Server (NTRS)

    Wang, Yi-Min; Chung, Pi-Yu; Fuchs, W. Kent

    1992-01-01

    Periodic application of time-redundant error checking provides a trade-off between error detection latency and performance degradation. The goal is to achieve high error coverage while satisfying performance requirements. We derive the optimal scheduling of checking patterns in order to uniformly distribute the available checking capability and maximize the error coverage. Synchronous buffering designs using data forwarding and dynamic reconfiguration are described. Efficient single-cycle diagnosis is implemented by error pattern analysis and a direct-mapped recovery cache. A rollback recovery scheme using start-up control for local recovery is also presented.

  13. The effects of teachers' homework follow-up practices on students' EFL performance: a randomized-group design

    PubMed Central

    Rosário, Pedro; Núñez, José C.; Vallejo, Guillermo; Cunha, Jennifer; Nunes, Tânia; Suárez, Natalia; Fuentes, Sonia; Moreira, Tânia

    2015-01-01

    This study analyzed the effects of five types of homework follow-up practices (i.e., checking homework completion; answering questions about homework; checking homework orally; checking homework on the board; and collecting and grading homework) used in class by 26 teachers of English as a Foreign Language (EFL) using a randomized-group design. Once a week, for 6 weeks, the EFL teachers used a particular type of homework follow-up practice they had previously been assigned to. At the end of the 6 weeks students completed an EFL exam as an outcome measure. The results showed that three types of homework follow-up practices (i.e., checking homework orally; checking homework on the board; and collecting and grading homework) had a positive impact on students' performance, thus highlighting the role of EFL teachers in the homework process. The effect of EFL teachers' homework follow-up practices on students' performance was affected by students' prior knowledge, but not by the number of homework follow-up sessions. PMID:26528204

  14. The effects of teachers' homework follow-up practices on students' EFL performance: a randomized-group design.

    PubMed

    Rosário, Pedro; Núñez, José C; Vallejo, Guillermo; Cunha, Jennifer; Nunes, Tânia; Suárez, Natalia; Fuentes, Sonia; Moreira, Tânia

    2015-01-01

    This study analyzed the effects of five types of homework follow-up practices (i.e., checking homework completion; answering questions about homework; checking homework orally; checking homework on the board; and collecting and grading homework) used in class by 26 teachers of English as a Foreign Language (EFL) using a randomized-group design. Once a week, for 6 weeks, the EFL teachers used a particular type of homework follow-up practice they had previously been assigned to. At the end of the 6 weeks students completed an EFL exam as an outcome measure. The results showed that three types of homework follow-up practices (i.e., checking homework orally; checking homework on the board; and collecting and grading homework) had a positive impact on students' performance, thus highlighting the role of EFL teachers in the homework process. The effect of EFL teachers' homework follow-up practices on students' performance was affected by students' prior knowledge, but not by the number of homework follow-up sessions.

  15. Data Quality Control for Vessel Mounted Acoustic Doppler Current Profiler. Application for the Western Mediterranean Sea

    NASA Technical Reports Server (NTRS)

    Garcia-Gorriz, E.; Front, J.; Candela, J.

    1997-01-01

    A systematic Data Quality Checking Protocol for Vessel Mounted Acoustic Doppler Current Profiler observations is proposed. Conditions prior to acquisition are considered along with those holding during acquisition.

  16. Evaluation of non-destructive technologies for construction quality control of HMA and PCC pavements in Louisiana : [research project capsule].

    DOT National Transportation Integrated Search

    2009-07-01

    Current roadway quality control and quality acceptance (QC/QA) procedures : for Louisiana include coring for thickness, density, and air void checks in hot : mix asphalt (HMA) pavements and thickness and compressive strength for : Portland cement con...

  17. Protecting quantum memories using coherent parity check codes

    NASA Astrophysics Data System (ADS)

    Roffe, Joschka; Headley, David; Chancellor, Nicholas; Horsman, Dominic; Kendon, Viv

    2018-07-01

    Coherent parity check (CPC) codes are a new framework for the construction of quantum error correction codes that encode multiple qubits per logical block. CPC codes have a canonical structure involving successive rounds of bit and phase parity checks, supplemented by cross-checks to fix the code distance. In this paper, we provide a detailed introduction to CPC codes using conventional quantum circuit notation. We demonstrate the implementation of a CPC code on real hardware, by designing a [[4, 2, 2]] code.

  18. SU-D-BRD-01: An Automated Physics Weekly Chart Checking System Supporting ARIA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chang, X; Yang, D

    Purpose: A software tool was developed in this study to perform automatic weekly physics chart checks on patient data in ARIA. The tool accesses electronic patient data directly from the ARIA server, checks the accuracy of treatment deliveries, and generates reports that summarize the delivery history and highlight errors. Methods: The tool has four modules. 1) The database interface is designed to access treatment delivery data directly from the ARIA database and reorganize the data into the patient chart tree (PCT). 2) The PCT is a core data structure designed to store and organize the data in logical hierarchies and to be passed among functions. 3) The treatment data check module analyzes the organized data in the PCT and stores the checking results in the PCT. 4) The report generation module generates reports containing the treatment delivery summary, chart checking results, and plots of daily treatment setup parameters (couch positions, shifts from image guidance). Errors found by the tool are highlighted with colors. Results: The weekly check tool has been implemented in MATLAB and clinically tested at two major cancer centers. JavaScript, cascading style sheets (CSS), and dynamic HTML were employed to create the user-interactive reports. It takes 0.06 seconds to search the delivery records of one beam with the PCT and compare the delivery records with the beam plan. The reports, saved as HTML files on a shared network folder, can be accessed by web browser on computers and mobile devices. Conclusion: The presented weekly check tool is useful for checking electronic patient treatment data in the Varian ARIA system. It can be more efficient and reliable than manual checks by physicists. The work was partially supported by a research grant from Varian Medical Systems.
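    The heart of the treatment data check module, comparing delivery records against the plan, can be sketched as a per-parameter tolerance comparison. The parameter names and tolerance values below are illustrative assumptions, not those used by the actual tool:

```python
# Hedged sketch of a per-fraction chart check: delivered parameters are
# compared against the plan within tolerances. Parameter names and
# tolerances are illustrative, not the tool's real configuration.

TOLERANCES = {"mu": 0.5, "gantry_deg": 1.0, "couch_vrt_cm": 0.5}

def check_fraction(planned, delivered):
    """Return {parameter: (planned, delivered)} for out-of-tolerance values;
    an empty dict means the fraction passes."""
    flagged = {}
    for key, tol in TOLERANCES.items():
        if abs(planned[key] - delivered[key]) > tol:
            flagged[key] = (planned[key], delivered[key])
    return flagged
```

    A report generator would then color any flagged parameters, mirroring the highlighting described in the abstract.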

  19. Building a QC Database of Meteorological Data From NASA KSC and the United States Air Force's Eastern Range

    NASA Technical Reports Server (NTRS)

    Brenton, James C.; Barbre, Robert E.; Orcutt, John M.; Decker, Ryan K.

    2018-01-01

    The National Aeronautics and Space Administration's (NASA) Marshall Space Flight Center (MSFC) Natural Environments Branch (EV44) has provided atmospheric databases and analyses in support of space vehicle design and day-of-launch operations for NASA and commercial launch vehicle programs launching from the NASA Kennedy Space Center (KSC), co-located on the United States Air Force's Eastern Range (ER) at the Cape Canaveral Air Force Station. The ER is one of the most heavily instrumented sites in the United States, measuring various atmospheric parameters on a continuous basis. An inherent challenge with the large databases that EV44 receives from the ER is ensuring that erroneous data are removed from the databases, and thus excluded from launch vehicle design analyses. EV44 has put forth great effort in developing quality control (QC) procedures for individual meteorological instruments; however, no standard QC procedures for all databases currently exist, resulting in QC databases that have inconsistencies in variables, methodologies, and periods of record. The goal of this activity is to use the previous efforts by EV44 to develop a standardized set of QC procedures from which to build flags within the meteorological databases from KSC and the ER, while maintaining open communication with end users from the launch community to develop ways to improve, adapt, and grow the QC database. Details of the QC checks are described. The flagged data points are plotted in a graphical user interface (GUI) as part of a manual confirmation that the flagged data do indeed need to be removed from the archive. As the launch rate increases with additional launch vehicle programs, more emphasis is being placed on continually updating and checking weather databases for data quality before use in launch vehicle design and certification analyses.
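    One of the basic flags a standardized QC procedure of this kind would include is a gross-limit (range) check per variable. A minimal sketch with illustrative limits, not the EV44 values:

```python
# Sketch of a simple standardized QC flag: a gross-limit (range) check
# applied per meteorological variable. The limits are illustrative.

LIMITS = {"temperature_c": (-40.0, 50.0), "wind_speed_ms": (0.0, 75.0)}

def flag_record(record):
    """Return a flag per variable: 0 = pass, 1 = outside gross limits."""
    flags = {}
    for var, (lo, hi) in LIMITS.items():
        value = record[var]
        flags[var] = 0 if lo <= value <= hi else 1
    return flags
```

    Storing the flag alongside the datum, rather than deleting it, supports the manual GUI confirmation step described above.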

  20. A Swiss cheese error detection method for real-time EPID-based quality assurance and error prevention.

    PubMed

    Passarge, Michelle; Fix, Michael K; Manser, Peter; Stampanoni, Marco F M; Siebers, Jeffrey V

    2017-04-01

    To develop a robust and efficient process that detects relevant dose errors (dose errors of ≥5%) in external beam radiation therapy and directly indicates the origin of the error. The process is illustrated in the context of electronic portal imaging device (EPID)-based angle-resolved volumetric-modulated arc therapy (VMAT) quality assurance (QA), particularly as would be implemented in a real-time monitoring program. A Swiss cheese error detection (SCED) method was created as a paradigm for a cine EPID-based during-treatment QA. For VMAT, the method compares a treatment plan-based reference set of EPID images with images acquired over each 2° gantry angle interval. The process utilizes a sequence of independent consecutively executed error detection tests: an aperture check that verifies in-field radiation delivery and ensures no out-of-field radiation; output normalization checks at two different stages; global image alignment check to examine if rotation, scaling, and translation are within tolerances; pixel intensity check containing the standard gamma evaluation (3%, 3 mm) and pixel intensity deviation checks including and excluding high dose gradient regions. Tolerances for each check were determined. To test the SCED method, 12 different types of errors were selected to modify the original plan. A series of angle-resolved predicted EPID images were artificially generated for each test case, resulting in a sequence of precalculated frames for each modified treatment plan. The SCED method was applied multiple times for each test case to assess the ability to detect introduced plan variations. To compare the performance of the SCED process with that of a standard gamma analysis, both error detection methods were applied to the generated test cases with realistic noise variations. Averaged over ten test runs, 95.1% of all plan variations that resulted in relevant patient dose errors were detected within 2° and 100% within 14° (<4% of patient dose delivery). 
Including cases that led to slightly modified but clinically equivalent plans, 89.1% were detected by the SCED method within 2°. Based on the type of check that detected the error, determination of error sources was achieved. With noise ranging from no random noise to four times the established noise value, the averaged relevant dose error detection rate of the SCED method was between 94.0% and 95.8% and that of gamma between 82.8% and 89.8%. An EPID-frame-based error detection process for VMAT deliveries was successfully designed and tested via simulations. The SCED method was inspected for robustness with realistic noise variations, demonstrating that it has the potential to detect a large majority of relevant dose errors. Compared to a typical (3%, 3 mm) gamma analysis, the SCED method produced a higher detection rate for all introduced dose errors, identified errors in an earlier stage, displayed a higher robustness to noise variations, and indicated the error source. © 2017 American Association of Physicists in Medicine.
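    The "Swiss cheese" structure of independent, consecutively executed checks can be sketched as a short pipeline in which the first failing layer names the error source. The check functions, frame fields, and tolerances below are illustrative stand-ins, not the published ones:

```python
# Minimal sketch of a Swiss-cheese-style check pipeline: ordered,
# independent tests run in sequence, and the first failure identifies
# the error origin. Frame fields and tolerances are illustrative.

def run_checks(frame, checks):
    """Run ordered (name, test) pairs; return the first failing check's
    name, or None if every layer passes."""
    for name, test in checks:
        if not test(frame):
            return name  # the failing layer indicates the error source
    return None

# Illustrative stand-ins for the real layers (aperture, output
# normalization, gamma evaluation).
CHECKS = [
    ("aperture", lambda f: f["in_field"]),
    ("output_norm", lambda f: abs(f["norm"] - 1.0) < 0.03),
    ("gamma_3_3", lambda f: f["gamma_pass"] >= 0.95),
]
```

    Running such a pipeline per 2° gantry interval, as the abstract describes, is what localizes an error both in origin (which layer) and in angle (which frame).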

  1. Predictive value of obsessive compulsive symptoms involving the skin on quality of life in patients with acne vulgaris.

    PubMed

    Bez, Yasin; Yesilova, Yavuz; Arı, Mustafa; Kaya, Mehmet Cemal; Alpak, Gokay; Bulut, Mahmut

    2013-11-01

    Acne is one of the most common dermatological diseases, and obsessive compulsive disorder is among the most frequent psychiatric conditions seen in dermatology clinics. Comorbidity of these conditions may therefore be expected. The aim of this study was to measure obsessive compulsive symptoms and quality of life in patients with acne vulgaris, compare them with those of healthy control subjects, and determine whether there is any predictive value of obsessive compulsive symptoms for quality of life in patients with acne. Obsessive compulsive symptoms and quality of life measurements of 146 patients with acne vulgaris and 94 healthy control subjects were made using the Maudsley Obsessive Compulsive Questionnaire and Short Form-36 in a cross-sectional design. Patients with acne vulgaris had lower scores for physical functioning, physical role dysfunction, general health perception, vitality, and emotional role dysfunction. They also had higher scores for checking, slowness, and rumination. The only predictor of physical functioning and vitality dimensions of health-related quality of life in these patients was rumination score. Obsessive compulsive symptoms in patients with acne vulgaris are higher than in controls, and this may correlate with both disease severity and quality of life for patients.

  2. 46 CFR 160.132-9 - Preapproval review.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... inventory of materials; (iii) The method for checking quality of fabrication and joints, including welding... §§ 160.132-19 and 160.132-21 of this subpart; (5) A description of the quality control procedures and... between the independent laboratory and Commandant under 46 CFR subpart 159.010. (d) Plan quality. All...

  3. 46 CFR 160.132-9 - Preapproval review.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... inventory of materials; (iii) The method for checking quality of fabrication and joints, including welding... §§ 160.132-19 and 160.132-21 of this subpart; (5) A description of the quality control procedures and... between the independent laboratory and Commandant under 46 CFR subpart 159.010. (d) Plan quality. All...

  4. 46 CFR 160.132-9 - Preapproval review.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... inventory of materials; (iii) The method for checking quality of fabrication and joints, including welding... §§ 160.132-19 and 160.132-21 of this subpart; (5) A description of the quality control procedures and... between the independent laboratory and Commandant under 46 CFR subpart 159.010. (d) Plan quality. All...

  5. 40 CFR 60.2735 - Is there a minimum amount of monitoring data I must obtain?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... activities including, as applicable, calibration checks and required zero and span adjustments. A monitoring... monitoring system quality assurance or control activities in calculations used to report emissions or...-control periods, and required monitoring system quality assurance or quality control activities including...

  6. 40 CFR 60.2735 - Is there a minimum amount of monitoring data I must obtain?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... activities including, as applicable, calibration checks and required zero and span adjustments. A monitoring... monitoring system quality assurance or control activities in calculations used to report emissions or...-control periods, and required monitoring system quality assurance or quality control activities including...

  7. A School-Based Quality Improvement Program.

    ERIC Educational Resources Information Center

    Rappaport, Lewis A.

    1993-01-01

    As one Brooklyn high school discovered, quality improvement begins with administrator commitment and participants' immersion in the literature. Other key elements include ongoing training of personnel involved in the quality-improvement process, tools such as the Deming Cycle (plan-do-check-act), voluntary and goal-oriented teamwork, and a worthy…

  8. Assessing Educational Processes Using Total-Quality-Management Measurement Tools.

    ERIC Educational Resources Information Center

    Macchia, Peter, Jr.

    1993-01-01

    Discussion of the use of Total Quality Management (TQM) assessment tools in educational settings highlights and gives examples of fishbone diagrams, or cause and effect charts; Pareto diagrams; control charts; histograms and check sheets; scatter diagrams; and flowcharts. Variation and quality are discussed in terms of continuous process…

  9. "Check, Change What You Need to Change and/or Keep What You Want": An Art Therapy Neurobiological-Based Trauma Protocol

    ERIC Educational Resources Information Center

    Hass-Cohen, Noah; Clyde Findlay, Joanna; Carr, Richard; Vanderlan, Jessica

    2014-01-01

    The Check ("Check, Change What You Need To Change and/or Keep What You Want") art therapy protocol is a sequence of directives for treating trauma that is grounded in neurobiological theory and designed to facilitate trauma narrative processing, autobiographical coherency, and the rebalancing of dysregulated responses to psychosocial…

  10. A Low-Complexity Euclidean Orthogonal LDPC Architecture for Low Power Applications.

    PubMed

    Revathy, M; Saravanan, R

    2015-01-01

    Low-density parity-check (LDPC) codes have been implemented in the latest digital video broadcasting, broadband wireless access (WiMAX), and fourth-generation wireless standards. In this paper, we propose a highly efficient LDPC decoder architecture for low power applications. This study also considers the design and analysis of the check node and variable node units and the Euclidean orthogonal generator in the LDPC decoder architecture. The Euclidean orthogonal generator is used to reduce the error rate of the proposed LDPC architecture and can be incorporated between the check node and variable node architecture. The proposed decoder design is synthesized on the Xilinx 9.2i platform and simulated using ModelSim, targeting 45 nm devices. The synthesis report shows that the proposed architecture greatly reduces power consumption and hardware utilization compared with conventional architectures.

  11. Console test report for shuttle task 501 shuttle carrier aircraft transceiver console (SED 36115353-301)

    NASA Technical Reports Server (NTRS)

    Lane, J. H.

    1976-01-01

    Performance tests completed on the Space Shuttle Carrier Aircraft (SCA) transceiver console, verifying its design objectives, were described. These tests included: (1) check of power supply voltages for correct output voltage and energization at the proper point in the turn on sequence, (2) check of cooling system (LRU blower, overload sensors and circuitry, and thermocouple probe), (3) check of control circuits logic, including the provisions for remote control and display, (4) check of the LRU connector for presence of correct voltages and absence of incorrect voltages under both energized and deenergized conditions, and (5) check of the AGC and power output monitor circuits.

  12. Measuring health care workers' perceptions of what constitutes a compassionate organisation culture and working environment: Findings from a quantitative feasibility survey.

    PubMed

    McSherry, Robert; Pearce, Paddy

    2018-03-01

    Health care organisation cultures and working environments are highly complex, dynamic, and constantly evolving settings. They significantly influence both the delivery and outcomes of care. Phase 1 quantitative findings are presented from a larger three-phase feasibility study designed to develop and test a Cultural Health Check toolkit to support health care workers, patients, and organisations in the provision of safe, compassionate, and dignified care. A mixed methods approach was applied. The Cultural Health Check Healthcare Workers Questionnaire was distributed across two National Health Service hospitals in England, UK. Both hospitals allocated two wards comprising older people and surgical specialities. The newly devised Cultural Health Check Staff Rating Scale Version 1 questionnaire was distributed to 223 health care workers. Ninety-eight responses were returned, giving a response rate of 44%. The Cultural Health Check Staff Rating Scale Version 1 has a significant Cronbach alpha of .775; this reliability scaling is reflected in all 16 items of the scale. Exploratory factor analysis identified two significant factors, "Professional Practice and Support" and "Workforce and Service Delivery." According to health care workers, these factors significantly impact the organisation culture and the quality of care delivered by staff. The Cultural Health Check Staff Rating Scale Version 1 questionnaire is a newly validated measurement tool that could be used to gauge health care workers' perceptions of an organisation's level of compassion. Historically, the focus has been on identifying how caring and compassionate nurses, doctors, and related allied health professionals are; this turns the attention to the employers of nurses and other related organisations. The questionnaire can be used to gauge the level of compassion within a health care organisation's culture and working environment. Nurse managers and leaders should focus attention on how these two factors are supported and resourced in the future. © 2017 John Wiley & Sons Ltd.

  13. [Factors Associated with Stress Check Attendance: Possible Effect of Timing of Annual Health Examination].

    PubMed

    Ishimaru, Tomohiro; Hattori, Michihiro; Nagata, Masako; Kuwahara, Keisuke; Watanabe, Seiji; Mori, Koji

    2018-01-01

    The stress check program has been part of annual employees' health screening since 2015. Employees are recommended, but not obliged, to undergo the stress check offered. This study was designed to examine the factors associated with stress check attendance. A total of 31,156 Japanese employees who underwent an annual health examination and a stress check service at an Occupational Health Service Center in 2016 participated in this study. Data from the annual health examination and stress check service included stress check attendance, date of attendance (if attended), gender, age, workplace industry, number of employees at the workplace, and tobacco and alcohol consumption. Data were analyzed using multiple logistic regression. The mean rate of stress check attendance was 90.8%. A higher rate of stress check attendance was associated with a shorter interval since the annual health examination, age ≥30 years, the construction and transport industries, and workplaces with 50-999 employees. A lower rate of stress check attendance was associated with the medical and welfare industry and workplaces with ≥1,000 employees. These findings provide insights into developing strategies for improving the rate of stress check attendance. In particular, stress check attendance may improve if the stress check service and annual health examination are conducted simultaneously.

  14. Data quality in a DRG-based information system.

    PubMed

    Colin, C; Ecochard, R; Delahaye, F; Landrivon, G; Messy, P; Morgon, E; Matillon, Y

    1994-09-01

    The aim of this study, initiated in May 1990, was to evaluate the quality of the medical data collected from the main hospital of the "Hospices Civils de Lyon", Edouard Herriot Hospital. We studied a random sample of 593 discharge abstracts from 12 wards of the hospital. Quality control was performed by checking multi-hospitalized patients' personal data, checking that each discharge abstract was exhaustive, examining the quality of abstracting, studying the coding of diagnoses and medical procedures, and checking data entry. Assessment of personal data showed a 4.4% error rate, mainly accounted for by spelling mistakes in surnames and first names and mistakes in dates of birth. The quality of a discharge abstract was estimated according to the two purposes of the medical information system: description of hospital morbidity per patient and the Diagnosis Related Group case mix. Error rates in discharge abstracts were expressed in two ways: an overall rate for errors of concordance between discharge abstracts and medical records, and a specific rate for errors modifying classification in Diagnosis Related Groups (DRG). For abstracting medical information, these error rates were 11.5% (SE +/- 2.2) and 7.5% (SE +/- 1.9), respectively. For coding diagnoses and procedures, they were 11.4% (SE +/- 1.5) and 1.3% (SE +/- 0.5), respectively. For data entry into the computerized database, the error rates were 2% (SE +/- 0.5) and 0.2% (SE +/- 0.05). Quality control must be performed regularly because it demonstrates the degree of participation of health care teams and the coherence of the database. (ABSTRACT TRUNCATED AT 250 WORDS)

  15. SU-F-T-238: Analyzing the Performance of MapCHECK2 and Delta4 Quality Assurance Phantoms in IMRT and VMAT Plans

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lu, SH; Tsai, YC; Lan, HT

    2016-06-15

    Purpose: Intensity-modulated radiotherapy (IMRT) and volumetric modulated arc therapy (VMAT) have been widely investigated for use in radiotherapy and found to have a highly conformal dose distribution. Delta4 is a novel cylindrical phantom consisting of 1069 p-type diodes with true treatments measured in the 3D target volume. The goal of this study was to compare the performance of a Delta4 diode array for IMRT and VMAT planning with an ion chamber and MapCHECK2. Methods: Fifty-four IMRT (n=9) and VMAT (n=45) plans were imported into the Philips Pinnacle Planning System 9.2 for recalculation with a solid water phantom, MapCHECK2, and the Delta4 phantom. To evaluate the difference between the measured and calculated dose, we used MapCHECK2 and Delta4 for a dose-map comparison and an ion chamber (PTW 31010 Semiflex 0.125 cc) for a point-dose comparison. Results: All 54 plans met the criteria of <3% difference for the point dose (at least two points) by ion chamber. The mean difference was 0.784% with a standard deviation of 1.962%. With criteria of 3 mm/3% in a gamma analysis, the average passing rates were 96.86%±2.19% and 98.42%±1.97% for MapCHECK2 and Delta4, respectively. The Student's t-test p-values for MapCHECK2/Delta4, ion chamber/Delta4, and ion chamber/MapCHECK2 were 0.0008, 0.2944, and 0.0002, respectively. There was no significant difference in passing rates between MapCHECK2 and Delta4 for the IMRT plans (p = 0.25). However, a higher passing rate was observed for Delta4 (98.36%) compared with MapCHECK2 (96.64%, p < 0.0001) for the VMAT plans. Conclusion: The Pinnacle planning system can accurately calculate doses for VMAT and IMRT plans. The Delta4 shows results similar to the ion chamber and MapCHECK2, and is an efficient tool for patient-specific quality assurance, especially for rotational therapy.

  16. Data Quality Control of the French Permanent Broadband Network in the RESIF Framework.

    NASA Astrophysics Data System (ADS)

    Grunberg, M.; Lambotte, S.; Engels, F.

    2014-12-01

    In the framework of the RESIF (Réseau Sismologique et géodésique Français) project, a new information system is being set up to improve the management and distribution of high quality data from the different elements of RESIF. Within this information system, EOST (in Strasbourg) is in charge of collecting real-time permanent broadband seismic waveforms and performing quality control on these data. The real-time and validated data sets are pushed to the French National Distribution Center (ISTerre/Grenoble) to make them publicly available. Furthermore, EOST hosts the BCSF-ReNaSS, which is in charge of the French metropolitan seismic bulletin. This allows EOST to benefit from high-end quality control based on the national and worldwide seismicity. Here we present the real-time seismic data flow from the stations of the French National Broadband Network to EOST, and the data quality control procedures that were recently installed, including some new developments. The data quality control consists in applying a variety of processes to check the consistency of the whole system from the stations to the data center. This allows us to verify that instruments and data transmission are operating correctly. Moreover, time quality is critical for most scientific data applications. To face this challenge and check the consistency of polarities and amplitudes, we deployed several high-end processes, including a noise correlation procedure to check timing accuracy (instrumental time errors result in a time shift of the whole cross-correlation, clearly distinct from shifts due to changes in the medium's physical properties), and a systematic comparison of synthetic and real data for teleseismic earthquakes of magnitude larger than 6.5 to detect timing errors as well as polarity and amplitude problems.
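    The noise-correlation timing check described above rests on a simple property: a constant instrumental clock error shifts the entire cross-correlation by a fixed lag. A minimal sketch of recovering that lag with NumPy (signal lengths and the spike test are illustrative, not the RESIF processing chain):

```python
import numpy as np

def lag_of_max_correlation(a, b):
    """Sample lag of signal a relative to signal b at the maximum of the
    full cross-correlation; a negative value means a leads b."""
    corr = np.correlate(a, b, mode="full")
    return int(np.argmax(corr)) - (len(b) - 1)
```

    A clock error at one station would show up as such a lag applied uniformly across day-long correlation windows, distinguishing it from the smaller, frequency-dependent shifts caused by changes in the propagation medium.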

  17. Statistical Quality Control of Moisture Data in GEOS DAS

    NASA Technical Reports Server (NTRS)

    Dee, D. P.; Rukhovets, L.; Todling, R.

    1999-01-01

    A new statistical quality control algorithm was recently implemented in the Goddard Earth Observing System Data Assimilation System (GEOS DAS). The final step in the algorithm consists of an adaptive buddy check that either accepts or rejects outlier observations based on a local statistical analysis of nearby data. A basic assumption in any such test is that the observed field is spatially coherent, in the sense that nearby data can be expected to confirm each other. However, the buddy check resulted in excessive rejection of moisture data, especially during the Northern Hemisphere summer. The analysis moisture variable in GEOS DAS is water vapor mixing ratio. Observational evidence shows that the distribution of mixing ratio errors is far from normal. Furthermore, spatial correlations among mixing ratio errors are highly anisotropic and difficult to identify. Both factors contribute to the poor performance of the statistical quality control algorithm. To alleviate the problem, we applied the buddy check to relative humidity data instead. This variable explicitly depends on temperature and therefore exhibits a much greater spatial coherence. As a result, reject rates of moisture data are much more reasonable and homogeneous in time and space.
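    The buddy check idea is easy to sketch: an observation is rejected when it disagrees with nearby data beyond a tolerance scaled by the local spread. A toy version with made-up relative-humidity values (the real GEOS DAS algorithm is adaptive and operates on analyzed fields):

```python
import statistics

def buddy_check(values, tolerance=3.0):
    """Flag each observation whose deviation from the mean of its
    'buddies' (all other observations in the local neighbourhood)
    exceeds `tolerance` times their standard deviation.

    A toy sketch of the idea, not the GEOS DAS implementation: it
    relies on the same coherence assumption, that nearby data
    should confirm each other.
    """
    flags = []
    for i, v in enumerate(values):
        buddies = values[:i] + values[i + 1:]
        mean = statistics.fmean(buddies)
        sd = statistics.stdev(buddies)
        flags.append(abs(v - mean) > tolerance * sd)
    return flags

# Hypothetical relative-humidity observations (%): one gross outlier.
obs = [62.0, 64.5, 63.2, 61.8, 95.0, 62.9]
print(buddy_check(obs))  # only the 95.0 observation is rejected
```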

  18. Finite-Time Performance of Local Search Algorithms: Theory and Application

    DTIC Science & Technology

    2010-06-10

    security devices deployed at airport security checkpoints are used to detect prohibited items (e.g., guns, knives, explosives). Each security device...security devices are deployed, the practical issue of determining how to optimally use them can be difficult. For an airport security system design...checked baggage), explosive detection systems (designed to detect explosives in checked baggage), and detailed hand search by an airport security official

  19. DataPlus™ - a revolutionary applications generator for DOS hand-held computers

    Treesearch

    David Dean; Linda Dean

    2000-01-01

    DataPlus allows the user to easily design data collection templates for DOS-based hand-held computers that mimic clipboard data sheets. The user designs and tests the application on the desktop PC and then transfers it to a DOS field computer. Other features include: error checking, missing data checks, and sensor input from RS-232 devices such as bar code wands,...

  20. 39 CFR 233.9 - Expedited release of conveyances being forfeited in a judicial forfeiture proceeding for a drug...

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... is not evidence of a violation of law or has design or other characteristics that particularly suit it for use in illegal activities. This bond must be in the form of a traveler's check, a money order... Service. A bond in the form of a cashier's check will be considered as paid once the check has been...

  1. 39 CFR 233.9 - Expedited release of conveyances being forfeited in a judicial forfeiture proceeding for a drug...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... is not evidence of a violation of law or has design or other characteristics that particularly suit it for use in illegal activities. This bond must be in the form of a traveler's check, a money order... Service. A bond in the form of a cashier's check will be considered as paid once the check has been...

  2. Liquid rocket pressure regulators, relief valves, check valves, burst disks, and explosive valves. [design techniques and practices

    NASA Technical Reports Server (NTRS)

    1973-01-01

    The development of and operational programs for effective use in design are presented for liquid rocket pressure regulators, relief valves, check valves, burst disks, and explosive valves. A review of the total design problem is presented, and design elements are identified which are involved in successful design. Current technology pertaining to these elements is also described. Design criteria are presented which state what rule or standard must be imposed on each essential design element to assure successful design. These criteria serve as a checklist of rules for a project manager to use in guiding a design or in assessing its adequacy. Recommended practices are included which state how to satisfy each of the criteria.

  3. Designing and optimizing a healthcare kiosk for the community.

    PubMed

    Lyu, Yongqiang; Vincent, Christopher James; Chen, Yu; Shi, Yuanchun; Tang, Yida; Wang, Wenyao; Liu, Wei; Zhang, Shuangshuang; Fang, Ke; Ding, Ji

    2015-03-01

    Investigating new ways to deliver care, such as the use of self-service kiosks to collect and monitor signs of wellness, supports healthcare efficiency and inclusivity. Self-service kiosks offer this potential, but there is a need for solutions to meet acceptable standards, e.g. provision of accurate measurements. This study investigates the design and optimization of a prototype healthcare kiosk to collect vital signs measures. The design problem was decomposed, formalized, focused and used to generate multiple solutions. Systematic implementation and evaluation allowed for the optimization of measurement accuracy, first for individuals and then for a population. The optimized solution was tested independently to check the suitability of the methods, and quality of the solution. The process resulted in a reduction of measurement noise and an optimal fit, in terms of the positioning of measurement devices. This guaranteed the accuracy of the solution and provides a general methodology for similar design problems. Copyright © 2014 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  4. PF-WFS Shell Inspection Update December 2016

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vigil, Anthony Eugene; Ledoux, Reina Rebecca; Gonzales, Antonio R.

    Since the last project update in FY16:Q2, PF-WFS personnel have advanced their understanding of shell inspection on Coordinate Measuring Machines (CMMs) and refined the PF-WFS process to the point that it was decided to convert shell inspection from the Sheffield #1 gage to Lietz CMMs. As part of introspection on the quality of this process, many sets of data have been reviewed and analyzed. This analysis included Sheffield-to-CMM comparisons, CMM inspection repeatability, fixturing differences, quality check development, and probing approach changes. This update report will touch on these improvements, which have built the confidence in this process to mainstream it for inspecting shells. In addition to the CMM programming advancements, the continued refinement of inputs and outputs for the CMM program has created an archiving scheme, input spline files, an output metafile, and an inspection report package. This project will continue to mature. Part designs may require program modifications to accommodate "new to this process" part designs. Technology limitations tied to security and performance are requiring possible changes to computer configurations to support an automated process.

  5. The use of self checks and voting in software error detection - An empirical study

    NASA Technical Reports Server (NTRS)

    Leveson, Nancy G.; Cha, Stephen S.; Knight, John C.; Shimeall, Timothy J.

    1990-01-01

    The results of an empirical study of software error detection using self checks and N-version voting are presented. Working independently, each of 24 programmers first prepared a set of self checks using just the requirements specification of an aerospace application, and then each added self checks to an existing implementation of that specification. The modified programs were executed to measure the error-detection performance of the checks and to compare this with error detection using simple voting among multiple versions. The analysis of the checks revealed that there are great differences in the ability of individual programmers to design effective checks. It was found that some checks that might have been effective failed to detect an error because they were badly placed, and there were numerous instances of checks signaling nonexistent errors. In general, specification-based checks alone were not as effective as specification-based checks combined with code-based checks. Self checks made it possible to identify faults that had not been detected previously by voting among 28 versions of the program over a million randomly generated inputs. This appeared to result from the fact that the self checks could examine the internal state of the executing program, whereas voting examines only final results of computations. If internal states had to be identical in N-version voting systems, then there would be no reason to write multiple versions.
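    The two detection mechanisms compared in the study can be sketched side by side: voting compares only final outputs across versions, while a self-check can assert on internal state. A minimal illustration with hypothetical versions, not the study's aerospace application:

```python
from collections import Counter

def majority_vote(results):
    """N-version voting: accept the output produced by a majority
    of independently written versions, if one exists."""
    value, count = Counter(results).most_common(1)[0]
    return value if count > len(results) / 2 else None

def checked_sqrt(x):
    """Version with an embedded self-check: unlike voting, the check
    can examine intermediate state, here the candidate root itself."""
    root = x ** 0.5
    assert abs(root * root - x) < 1e-9 * max(1.0, x), "self-check failed"
    return root

# Three hypothetical versions vote on sqrt(2); one is faulty.
versions = [round(2 ** 0.5, 6), round(2 ** 0.5, 6), 1.5]
print(majority_vote(versions))   # 1.414214 wins 2-to-1
print(checked_sqrt(9.0))         # 3.0 passes its self-check
```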

  6. Assume-Guarantee Verification of Source Code with Design-Level Assumptions

    NASA Technical Reports Server (NTRS)

    Giannakopoulou, Dimitra; Pasareanu, Corina S.; Cobleigh, Jamieson M.

    2004-01-01

    Model checking is an automated technique that can be used to determine whether a system satisfies certain required properties. To address the 'state explosion' problem associated with this technique, we propose to integrate assume-guarantee verification at different phases of system development. During design, developers build abstract behavioral models of the system components and use them to establish key properties of the system. To increase the scalability of model checking at this level, we have developed techniques that automatically decompose the verification task by generating component assumptions for the properties to hold. The design-level artifacts are subsequently used to guide the implementation of the system, but also to enable more efficient reasoning at the source-code level. In particular, we propose to use design-level assumptions to similarly decompose the verification of the actual system implementation. We demonstrate our approach on a significant NASA application, where design-level models were used to identify and correct a safety property violation, and design-level assumptions allowed us to check successfully that the property was preserved by the implementation.

  7. SU-F-T-165: Daily QA Analysis for Spot Scanning Beamline

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Poenisch, F; Gillin, M; Sahoo, N

    2016-06-15

    Purpose: The dosimetric results of our daily quality assurance over the last 8 years for discrete pencil beam scanning proton therapy will be presented. Methods: To perform the dosimetric checks, a multi-ion chamber detector is used, which consists of an array of 5 single parallel plate ion chambers aligned as a cross and separated by 10 cm each. The Tracker is snapped into a jig, which is placed on the tabletop. Different amounts of Solid Water buildup are added to shift the dose distribution. The dosimetric checks consist of 3 parts: position check, range check, and volume dose check. Results: The average deviation of all position-check data was 0.2%±1.3%. For the range check, the average deviation was 0.1%±1.2%, which also corresponds to a range stability of better than 1 mm over all measurements. The volumetric dose output readings were all within ±1%, with the exception of 2 occasions when the cable to the dose monitor was being repaired. Conclusion: Morning QA using the Tracker device gives very stable dosimetric readings but is also sensitive to mechanical and output changes in the proton therapy delivery system.
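    The summary statistics reported above amount to simple bookkeeping over daily deviations; a toy sketch with invented numbers, not the clinic's QA records:

```python
import statistics

def qa_summary(deviations_pct, tolerance_pct=1.0):
    """Summarize daily QA deviations (in %) and flag out-of-tolerance
    readings. A toy sketch of the bookkeeping behind figures such as
    0.2% +/- 1.3%; the numbers below are invented."""
    mean = statistics.fmean(deviations_pct)
    sd = statistics.stdev(deviations_pct)
    flagged = [d for d in deviations_pct if abs(d) > tolerance_pct]
    return round(mean, 2), round(sd, 2), flagged

# Hypothetical week of output deviations (%): one out-of-tolerance day.
output = [0.1, -0.3, 0.4, 0.0, -1.2, 0.2]
print(qa_summary(output))  # the -1.2% reading exceeds the +/-1% tolerance
```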

  8. 46 CFR 160.115-9 - Preapproval review.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... inventory of materials; (iii) The method for checking quality of fabrication and joints, including welding... §§ 160.115-19 and 160.115-21 of this subpart; (5) A description of the quality control procedures and... between the independent laboratory and Commandant under 46 CFR part 159, subpart 159.010. (d) Plan quality...

  9. 46 CFR 160.115-9 - Preapproval review.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... inventory of materials; (iii) The method for checking quality of fabrication and joints, including welding... §§ 160.115-19 and 160.115-21 of this subpart; (5) A description of the quality control procedures and... between the independent laboratory and Commandant under 46 CFR part 159, subpart 159.010. (d) Plan quality...

  10. 46 CFR 160.115-9 - Preapproval review.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... inventory of materials; (iii) The method for checking quality of fabrication and joints, including welding... §§ 160.115-19 and 160.115-21 of this subpart; (5) A description of the quality control procedures and... between the independent laboratory and Commandant under 46 CFR part 159, subpart 159.010. (d) Plan quality...

  11. Wheat Quality Council, Hard Spring Wheat Technical Committee, 2015 Crop

    USDA-ARS?s Scientific Manuscript database

    Nine experimental lines of hard spring wheat were grown at up to five locations in 2015 and evaluated for kernel, milling, and bread baking quality against the check variety Glenn. Wheat samples were submitted through the Wheat Quality Council and processed and milled at the USDA-ARS Hard Red Sprin...

  12. Wheat Quality Council, Hard Spring Wheat Technical Committee, 2017 Crop

    USDA-ARS?s Scientific Manuscript database

    Nine experimental lines of hard spring wheat were grown at up to six locations in 2017 and evaluated for kernel, milling, and bread baking quality against the check variety Glenn. Wheat samples were submitted through the Wheat Quality Council and processed and milled at the USDA-ARS Hard Red Spring...

  13. Wheat Quality Council, Hard Spring Wheat Technical Committee, 2014 Crop

    USDA-ARS?s Scientific Manuscript database

    Eleven experimental lines of hard spring wheat were grown at up to five locations in 2014 and evaluated for kernel, milling, and bread baking quality against the check variety Glenn. Wheat samples were submitted through the Wheat Quality Council and processed and milled at the USDA-ARS Hard Red Spr...

  14. Qualities of Early Childhood Teachers: Reflections from Teachers and Administrators.

    ERIC Educational Resources Information Center

    Weitman, Catheryn J.; Humphries, Janie H.

    Data were collected from elementary school principals and kindergarten teachers in Texas and Louisiana in an effort to identify qualities that are thought to be important for kindergarten teachers. A questionnaire listing 462 qualities of early childhood teachers was compiled from literature reviews. Subjects were asked to check a maximum of 50…

  15. Preventing the threat of credit-card fraud: Factors influencing cashiers' identification-checking behavior.

    PubMed

    Downing, Christopher; Howard, E Henry; Goodwin, Christina; Geller, E Scott

    2016-01-01

    Two studies examined factors influencing cashiers' identification (ID)-checking behavior in order to inform the development of interventions to prevent credit-card fraud. In both studies, research assistants made credit purchases in various stores and noted the cashiers' ID-checking behavior. In the first study, the store type, whether the cashier swiped the credit/debit card, the amount of the purchase, and whether the credit/debit card was signed significantly influenced ID-checking behavior. In the second study, an A-B-A design was used to evaluate the impact of a "Check my ID" prompt placed on the credit/debit card. The prompt increased cashiers' ID-checking behavior from 5.9% at Baseline to 10.3% during the Intervention. When the prompt was removed, the cashiers' ID-checking behavior decreased to 7.2%. Implications for further intervention research to prevent credit-card fraud are discussed.

  16. Development of an expert planning system for OSSA

    NASA Technical Reports Server (NTRS)

    Groundwater, B.; Lembeck, M. F.; Sarsfield, L.; Diaz, Alphonso

    1988-01-01

    This paper presents concepts related to preliminary work for the development of an expert planning system for NASA's Office of Space Science and Applications (OSSA). The expert system will function as a planner's decision aid in preparing mission plans encompassing sets of proposed OSSA space science initiatives. These plans in turn will be checked against budgetary and technical constraints and tested for constraint violations. Appropriate advice will be generated by the system for modifying the plans to bring them in line with the constraints. The OSSA Planning Expert System (OPES) has been designed to function as an integral part of the OSSA mission planning process. It will be able to suggest a best plan, accept and check a user-suggested strawman plan, and provide a quick response to user requests and actions. OPES will be written in the C programming language and will have a transparent user interface running under Windows 386 on a Compaq 386/20 machine. The system's stored knowledge and inference procedures will model the expertise of human planners familiar with the OSSA planning domain. Given mission priorities and budget guidelines, the system first sets the launch dates for each mission. It will check to make sure that planetary launch windows and precursor mission relationships are not violated. Additional levels of constraints will then be considered, checking such things as the availability of a suitable launch vehicle, the total mission launch mass required versus the identified launch mass capability, and the total power required by the payload at its destination versus the actual power available. System output will be in the form of Gantt charts, spreadsheet hardcopy, and other presentation-quality materials detailing the resulting OSSA mission plan.
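    The constraint-checking step can be illustrated with a toy plan checker; the field names, missions, and limits below are invented for illustration, not OPES data structures:

```python
def check_mission_plan(missions, vehicle_capability_kg):
    """Check a mission plan against two of the constraint classes the
    abstract describes: planetary launch windows and total launch mass
    versus vehicle capability. A hypothetical sketch, not OPES."""
    violations = []
    for m in missions:
        if not (m["window_open"] <= m["launch_year"] <= m["window_close"]):
            violations.append((m["name"], "outside launch window"))
        if m["mass_kg"] > vehicle_capability_kg:
            violations.append((m["name"], "exceeds launch mass capability"))
    return violations

# Invented strawman plan: one feasible mission, one doubly infeasible.
plan = [
    {"name": "OrbiterA", "launch_year": 1992, "window_open": 1991,
     "window_close": 1993, "mass_kg": 2500},
    {"name": "ProbeB", "launch_year": 1995, "window_open": 1991,
     "window_close": 1993, "mass_kg": 6000},
]
print(check_mission_plan(plan, vehicle_capability_kg=5000))
```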

  17. Design and validation of an aircraft seat comfort scale using item response theory.

    PubMed

    Menegon, Lizandra da Silva; Vincenzi, Silvana Ligia; de Andrade, Dalton Francisco; Barbetta, Pedro Alberto; Merino, Eugenio Andrés Díaz; Vink, Peter

    2017-07-01

    This article aims to evaluate the psychometric properties of a scale that measures aircraft seat comfort. Factor analysis was used to study data variances. Psychometric quality was checked by using Item Response Theory. The sample consisted of 1500 passengers who completed a questionnaire at a Brazilian airport. Full information factor analysis showed the presence of one dominant factor explaining 34% of data variance. The scale generated covered all levels of comfort data, from 'no comfort' to 'maximum comfort'. The results show that the passengers consider there is comfort, but this is very minimal when these passengers have to perform their desired activities. It tends to increase when aspects of the aircraft seating are improved and positive emotions are elicited. Comfort peaks when pleasure is experienced and passenger expectations are exceeded (maximum comfort). This outcome seems consistent with the literature. Further research is advised to compare the outcome of this questionnaire with other research methods, and to check if the questionnaire is sensitive enough and whether its conclusions are useful in practice. Copyright © 2017. Published by Elsevier Ltd.

  18. 40 CFR 85.2233 - Steady state test equipment calibrations, adjustments, and quality control-EPA 91.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... tolerance range. The pressure in the sample cell must be the same with the calibration gas flowing during... this chapter. The check is done at 30 mph (48 kph), and a power absorption load setting to generate a... in § 85.2225(c)(1) are not met. (2) Leak checks. Each time the sample line integrity is broken, a...

  19. 40 CFR 85.2233 - Steady state test equipment calibrations, adjustments, and quality control-EPA 91.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... tolerance range. The pressure in the sample cell must be the same with the calibration gas flowing during... this chapter. The check is done at 30 mph (48 kph), and a power absorption load setting to generate a... in § 85.2225(c)(1) are not met. (2) Leak checks. Each time the sample line integrity is broken, a...

  20. 40 CFR 85.2233 - Steady state test equipment calibrations, adjustments, and quality control-EPA 91.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... tolerance range. The pressure in the sample cell must be the same with the calibration gas flowing during... this chapter. The check is done at 30 mph (48 kph), and a power absorption load setting to generate a... in § 85.2225(c)(1) are not met. (2) Leak checks. Each time the sample line integrity is broken, a...

  1. A Low-Complexity Euclidean Orthogonal LDPC Architecture for Low Power Applications

    PubMed Central

    Revathy, M.; Saravanan, R.

    2015-01-01

    Low-density parity-check (LDPC) codes have been implemented in the latest digital video broadcasting, broadband wireless access (WiMax), and fourth-generation wireless standards. In this paper, we propose a highly efficient LDPC decoder architecture for low power applications. This study also considers the design and analysis of the check node and variable node units and the Euclidean orthogonal generator in the LDPC decoder architecture. The Euclidean orthogonal generator is used to reduce the error rate of the proposed LDPC architecture, and it can be incorporated between the check and variable node architecture. The proposed decoder design is synthesized on the Xilinx 9.2i platform and simulated using Modelsim, targeted to 45 nm devices. The synthesis report shows that the proposed architecture greatly reduces power consumption and hardware utilization compared with conventional architectures. PMID:26065017
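    The check-node computation at the heart of any LDPC decoder is the parity (syndrome) test H·c = 0 (mod 2). A toy (6,3) parity-check matrix makes the idea concrete; real LDPC matrices are far larger and sparser, and this sketches only the check, not the proposed architecture:

```python
import numpy as np

# Toy parity-check matrix H for a (6,3) code. A codeword c is valid
# iff H @ c == 0 (mod 2); a nonzero syndrome drives the decoder's
# message passing between check nodes and variable nodes.
H = np.array([[1, 1, 0, 1, 0, 0],
              [0, 1, 1, 0, 1, 0],
              [1, 0, 1, 0, 0, 1]])

def syndrome(word):
    return H.dot(word) % 2

codeword = np.array([1, 0, 1, 1, 1, 0])
corrupted = codeword.copy()
corrupted[2] ^= 1  # flip one bit in transit

print(syndrome(codeword))   # all-zero syndrome: valid codeword
print(syndrome(corrupted))  # nonzero syndrome flags the error
```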

  2. Study rationale and design of OPTIMISE, a randomised controlled trial on the effect of benchmarking on quality of care in type 2 diabetes mellitus.

    PubMed

    Nobels, Frank; Debacker, Noëmi; Brotons, Carlos; Elisaf, Moses; Hermans, Michel P; Michel, Georges; Muls, Erik

    2011-09-22

    To investigate the effect of physician- and patient-specific feedback with benchmarking on the quality of care in adults with type 2 diabetes mellitus (T2DM). Study centres in six European countries were randomised to either a benchmarking or control group. Physicians in both groups received feedback on modifiable outcome indicators (glycated haemoglobin [HbA1c], glycaemia, total cholesterol, high density lipoprotein-cholesterol, low density lipoprotein [LDL]-cholesterol and triglycerides) for each patient at 0, 4, 8 and 12 months, based on the four times yearly control visits recommended by international guidelines. The benchmarking group also received comparative results on three critical quality indicators of vascular risk (HbA1c, LDL-cholesterol and systolic blood pressure [SBP]), checked against the results of their colleagues from the same country, and versus pre-set targets. After 12 months of follow up, the percentage of patients achieving the pre-determined targets for the three critical quality indicators will be assessed in the two groups. Recruitment was completed in December 2008 with 3994 evaluable patients. This paper discusses the study rationale and design of OPTIMISE, a randomised controlled study, that will help assess whether benchmarking is a useful clinical tool for improving outcomes in T2DM in primary care. NCT00681850.

  3. Study rationale and design of OPTIMISE, a randomised controlled trial on the effect of benchmarking on quality of care in type 2 diabetes mellitus

    PubMed Central

    2011-01-01

    Background To investigate the effect of physician- and patient-specific feedback with benchmarking on the quality of care in adults with type 2 diabetes mellitus (T2DM). Methods Study centres in six European countries were randomised to either a benchmarking or control group. Physicians in both groups received feedback on modifiable outcome indicators (glycated haemoglobin [HbA1c], glycaemia, total cholesterol, high density lipoprotein-cholesterol, low density lipoprotein [LDL]-cholesterol and triglycerides) for each patient at 0, 4, 8 and 12 months, based on the four times yearly control visits recommended by international guidelines. The benchmarking group also received comparative results on three critical quality indicators of vascular risk (HbA1c, LDL-cholesterol and systolic blood pressure [SBP]), checked against the results of their colleagues from the same country, and versus pre-set targets. After 12 months of follow up, the percentage of patients achieving the pre-determined targets for the three critical quality indicators will be assessed in the two groups. Results Recruitment was completed in December 2008 with 3994 evaluable patients. Conclusions This paper discusses the study rationale and design of OPTIMISE, a randomised controlled study, that will help assess whether benchmarking is a useful clinical tool for improving outcomes in T2DM in primary care. Trial registration NCT00681850 PMID:21939502

  4. A study of applications scribe frame data verifications using design rule check

    NASA Astrophysics Data System (ADS)

    Saito, Shoko; Miyazaki, Masaru; Sakurai, Mitsuo; Itoh, Takahisa; Doi, Kazumasa; Sakurai, Norioko; Okada, Tomoyuki

    2013-06-01

    In semiconductor manufacturing, scribe frame data is generally generated for each LSI product according to its specific process design. Scribe frame data is designed based on definition tables for scanner alignment, wafer inspection, and customer-specified marks. At the end, we check that the scribe frame design conforms to the specifications of the alignment and inspection marks. Recently, in COT (customer owned tooling) business or new technology development, there has been no effective verification method for the scribe frame data, and verification takes a lot of time. Therefore, we tried to establish a new verification method for scribe frame data by applying pattern matching and DRC (Design Rule Check), which is used in device verification. We present the scheme of scribe frame data verification using DRC that we applied. First, verification rules are created based on the specifications of the scanner, inspection, and other tools, and a mark library is also created for pattern matching. Next, DRC verification is performed on the scribe frame data; this DRC verification includes pattern matching using the mark library. As a result, our experiments demonstrated that, by use of pattern matching and DRC verification, our new method can yield speed improvements of more than 12 percent compared to conventional mark checks by visual inspection, and the inspection time can be reduced to less than 5 percent if multi-CPU processing is used. Our method delivers both short processing time and excellent accuracy when checking many marks. It is easy to maintain and provides an easy way for COT customers to use original marks. We believe that our new DRC verification method for scribe frame data is indispensable and mutually beneficial.
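    The pattern-matching-plus-DRC idea can be caricatured in a few lines: each placed mark must match a library entry, and must satisfy a geometric rule. The mark names, sizes, and frame coordinates below are invented for illustration:

```python
def check_scribe_marks(marks, library, frame):
    """Toy stand-in for a DRC-based scribe-frame check: every placed
    mark must match a library entry (pattern matching) and lie inside
    the scribe frame (a simple geometric design rule). Names and
    geometry are illustrative, not from the paper's flow."""
    errors = []
    fx0, fy0, fx1, fy1 = frame
    for name, x, y in marks:
        if name not in library:
            errors.append((name, "unknown mark"))
            continue
        w, h = library[name]
        if not (fx0 <= x and fy0 <= y and x + w <= fx1 and y + h <= fy1):
            errors.append((name, "outside scribe frame"))
    return errors

# Hypothetical mark library (name -> width, height) and placements.
library = {"align_x": (10, 10), "inspect": (20, 5)}
marks = [("align_x", 0, 0), ("inspect", 95, 0), ("custom1", 5, 5)]
print(check_scribe_marks(marks, library, frame=(0, 0, 100, 20)))
```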

  5. Prediction Interval Development for Wind-Tunnel Balance Check-Loading

    NASA Technical Reports Server (NTRS)

    Landman, Drew; Toro, Kenneth G.; Commo, Sean A.; Lynn, Keith C.

    2014-01-01

    Results from the Facility Analysis Verification and Operational Reliability project revealed a critical gap in capability in ground-based aeronautics research applications. Without a standardized process for check-loading the wind-tunnel balance or the model system, the quality of the aerodynamic force data collected varied significantly between facilities. A prediction interval is required in order to confirm a check-loading. The prediction interval provides an expected upper and lower bound on balance load prediction at a given confidence level. A method has been developed which accounts for sources of variability due to calibration and check-load application. The prediction interval method of calculation and a case study demonstrating its use is provided. Validation of the methods is demonstrated for the case study based on the probability of capture of confirmation points.
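    The prediction interval for a check-load confirmation point follows the textbook formula for a new observation under a straight-line calibration fit; the loads and responses below are invented, and this is a generic sketch rather than the project's actual procedure:

```python
import math
import statistics

def prediction_interval(x, y, x0, t_crit=2.447):
    """95% prediction interval for a new observation at x0 from a
    straight-line calibration fit (n - 2 = 6 degrees of freedom here,
    hence t = 2.447). A textbook sketch of the idea in the abstract."""
    n = len(x)
    xbar, ybar = statistics.fmean(x), statistics.fmean(y)
    sxx = sum((xi - xbar) ** 2 for xi in x)
    slope = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) / sxx
    intercept = ybar - slope * xbar
    resid_ss = sum((yi - intercept - slope * xi) ** 2 for xi, yi in zip(x, y))
    s = math.sqrt(resid_ss / (n - 2))
    y0 = intercept + slope * x0
    half = t_crit * s * math.sqrt(1 + 1 / n + (x0 - xbar) ** 2 / sxx)
    return y0 - half, y0 + half

# Hypothetical check-loads (N) versus balance response.
loads = [0, 10, 20, 30, 40, 50, 60, 70]
counts = [0.1, 9.8, 20.3, 29.9, 40.2, 49.7, 60.1, 69.9]
low, high = prediction_interval(loads, counts, x0=35)
print(low < 35.0 < high)  # the confirmation point is captured
```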

  6. Inkjet 3D printed check microvalve

    NASA Astrophysics Data System (ADS)

    Walczak, Rafał; Adamski, Krzysztof; Lizanets, Danylo

    2017-04-01

    3D printing enables fast and relatively easy fabrication of various microfluidic structures including microvalves. A check microvalve is the simplest valve enabling control of the fluid flow in microchannels. Proper operation of the check valve is ensured by a movable element that tightens the valve seat during backward flow and enables free flow for forward pressure. Thus, knowledge of the mechanical properties of the movable element is crucial for optimal design and operation of the valve. In this paper, we present for the first time the results of investigations on basic mechanical properties of the building material used in multijet 3D printing. Specified mechanical properties were used in the design and fabrication of two types of check microvalve—with deflecting or hinge-fixed microflap—with 200 µm and 300 µm thickness. Results of numerical simulation and experimental data of the microflap deflection were obtained and compared. The valves were successfully 3D printed and characterised. Opening/closing characteristics of the microvalve for forward and backward pressures were determined. Thus, proper operation of the check microvalve so developed was confirmed.
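    A first-order feel for the flap mechanics comes from the end-loaded cantilever formula δ = FL³/(3EI); the dimensions and elastic modulus below are assumed values for illustration, not the paper's measured properties:

```python
def flap_deflection_m(force_n, length_m, width_m, thickness_m, e_pa):
    """Tip deflection of the movable flap modeled as a simple
    end-loaded cantilever: delta = F L^3 / (3 E I), I = w t^3 / 12.
    A first-order sketch; the paper's numerical simulations capture
    far more detail, and the material data below are assumptions."""
    inertia = width_m * thickness_m ** 3 / 12
    return force_n * length_m ** 3 / (3 * e_pa * inertia)

# Assumed flap: 2 mm long, 1 mm wide, E = 1.5 GPa (a typical order of
# magnitude for multijet photopolymers, not a measured value).
d200 = flap_deflection_m(1e-3, 2e-3, 1e-3, 200e-6, 1.5e9)
d300 = flap_deflection_m(1e-3, 2e-3, 1e-3, 300e-6, 1.5e9)
print(d200 / d300)  # the 200 um flap deflects (300/200)^3 = 3.375x more
```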

  7. TESPI (Tool for Environmental Sound Product Innovation): a simplified software tool to support environmentally conscious design in SMEs

    NASA Astrophysics Data System (ADS)

    Misceo, Monica; Buonamici, Roberto; Buttol, Patrizia; Naldesi, Luciano; Grimaldi, Filomena; Rinaldi, Caterina

    2004-12-01

    TESPI (Tool for Environmental Sound Product Innovation) is the prototype of a software tool developed within the framework of the "eLCA" project. The project (www.elca.enea.it), financed by the European Commission, is realising "On line green tools and services for Small and Medium sized Enterprises (SMEs)". The implementation by SMEs of environmental product innovation (as fostered by the European Integrated Product Policy, IPP) needs specific adaptation to their economic model, their knowledge of production and management processes, and their relationships with innovation and the environment. In particular, quality and costs are the main driving forces of innovation in European SMEs, and well-known barriers exist to the adoption of an environmental approach in product design. Starting from these considerations, the TESPI tool has been developed to support the first steps of product design taking into account both quality and the environment. Two main issues have been considered: (i) classic Quality Function Deployment (QFD) can hardly be proposed to SMEs; (ii) the environmental aspects of the product life cycle need to be integrated with the quality approach. TESPI is a user-friendly web-based tool, has a training approach, and applies to modular products. Users are guided through the investigation of the quality aspects of their product (fulfilment of customer needs and requirements) and the identification of the key environmental aspects in the product's life cycle. A simplified checklist allows analyzing the environmental performance of the product. Help is available for a better understanding of the analysis criteria. As a result, the significant aspects for the redesign of the product are identified.

  8. Conceptual design of ACB-CP for ITER cryogenic system

    NASA Astrophysics Data System (ADS)

    Jiang, Yongcheng; Xiong, Lianyou; Peng, Nan; Tang, Jiancheng; Liu, Liqiang; Zhang, Liang

    2012-06-01

    The ACB-CP (Auxiliary Cold Box for Cryopumps) supplies the cryopump system with the necessary cryogen in the ITER (International Thermonuclear Experimental Reactor) cryogenic distribution system. The conceptual design of the ACB-CP comprises thermo-hydraulic analysis, 3D structural design and strength checking. Through the thermo-hydraulic analysis, the main specifications of the process valves, pressure safety valves, pipes and heat exchangers can be determined. During the 3D structural design process, vacuum, adiabatic, assembly and maintenance requirements have been considered in arranging the pipes, valves and other components. Strength checking has been performed to verify that the 3D design meets the strength requirements for the ACB-CP.

  9. Quality of Health Care: The Views of Homeless Youth

    PubMed Central

    Ensign, Josephine

    2004-01-01

    Objective To develop homeless-youth-identified process and outcome measures of quality of health care. Data Sources/Study Setting Primary data collection with homeless youth from both street and clinic settings in Seattle, Washington, for calendar year 2002. Study Design The research was a focused ethnography, using key informant and in-depth individual interviews as well as focus groups with a purposeful sample of 47 homeless youth aged 12–23 years. Data Collection/Extraction Methods All interviews and focus groups were tape-recorded, transcribed, and preliminarily coded, with final coding cross-checked and verified with a second researcher. Principal Findings Homeless youth most often stated that cultural and interpersonal aspects of quality of care were important to them. Physical aspects of quality of care reported by the youth were health care sites separate from those for homeless adults, and sites that offered a choice of allopathic and complementary medicine. Outcomes of health care included survival of homelessness, functional and disease-state improvement, and having increased trust and connections with adults and with the wider community. Conclusions Homeless youth identified components of quality of care as well as how quality of care should be measured. Their perspectives will be included in a larger follow-up study to develop quality of care indicators for homeless youth. PMID:15230923

  10. Checking-up of optical graduated rules by laser interferometry

    NASA Astrophysics Data System (ADS)

    Miron, Nicolae P.; Sporea, Dan G.

    1996-05-01

    The main aspects related to the operating principle, design, and implementation of high-productivity equipment for checking-up the graduation accuracy of optical graduated rules used as a length reference in optical measuring instruments for precision machine tools are presented. The graduation error checking-up is done with a Michelson interferometer as a length transducer. The instrument operation is managed by a computer, which controls the equipment, data acquisition, and processing. The evaluation is performed for rule lengths from 100 to 3000 mm, with a checking-up error less than 2 micrometers/m. The checking-up time is about 15 min for a 1000-mm rule, with averaging over four measurements.

  11. Geometric facial comparisons in speed-check photographs.

    PubMed

    Buck, Ursula; Naether, Silvio; Kreutz, Kerstin; Thali, Michael

    2011-11-01

    In many cases, it is not possible to hold motorists to account for considerable speeding offences, because they deny being the driver in the speed-check photograph. An anthropological comparison of facial features using a photo-to-photo comparison can be very difficult, depending on the quality of the photographs. One difficulty of that analysis method is that the comparison photographs of the presumed driver are taken with a different camera or camera lens and from a different angle than the speed-check photo. Taking a comparison photograph with exactly the same camera setup is almost impossible; therefore, only an imprecise comparison of the individual facial features is possible. The geometry and position of each facial feature, for example the distance between the eyes or the positions of the ears, cannot be taken into consideration. We applied a new method using 3D laser scanning, optical surface digitalization, and photogrammetric calculation of the speed-check photo, which enables a geometric comparison. Thus, the influence of the focal length and the distortion of the objective lens are eliminated, and the precise position and viewing direction of the speed-check camera are calculated. Even in cases of low-quality images or when the face of the driver is partly hidden, this method delivers good results. This new method, Geometric Comparison, is evaluated and validated in a dedicated study described in this article.

  12. Methods for Geometric Data Validation of 3d City Models

    NASA Astrophysics Data System (ADS)

    Wagner, D.; Alam, N.; Wewetzer, M.; Pries, M.; Coors, V.

    2015-12-01

    Geometric quality of 3D city models is crucial for data analysis and simulation tasks, which are part of modern applications of the data (e.g. potential heating energy consumption of city quarters, solar potential, etc.). Geometric quality in these contexts is, however, a different concept than it is for 2D maps. In the latter case, aspects such as positional or temporal accuracy and correctness represent typical quality metrics of the data. They are defined in ISO 19157 and should be mentioned as part of the metadata. 3D data have a far wider range of aspects which influence their quality, and the idea of quality itself is application dependent. Thus, concepts for the definition of quality are needed, including methods to validate these definitions. Quality in this sense means internal validation and detection of inconsistent or wrong geometry according to a predefined set of rules. A useful starting point is correct geometry in accordance with ISO 19107. A valid solid should consist of planar faces which touch their neighbours exclusively in defined corner points and edges. No gaps between them are allowed, and the whole feature must be 2-manifold. In this paper, we present methods to validate common geometric requirements for building geometry. Different checks based on several algorithms have been implemented to validate a set of rules derived from the solid definition mentioned above (e.g. water tightness of the solid or planarity of its polygons), as they were developed for the software tool CityDoctor. The method of each check is specified, with a special focus on the discussion of tolerance values where they are necessary. The checks include polygon-level checks to validate the correctness of each polygon, i.e. closedness of the bounding linear ring and planarity. 
On the solid level, which is validated only if the polygons have passed validation, correct polygon orientation is checked after self-intersections outside of defined corner points and edges are detected, among additional criteria. Self-intersection might lead to different results, e.g. intersection points, lines or areas. Depending on the geometric constellation, they might represent gaps between bounding polygons of the solids, overlaps, or violations of the 2-manifoldness. Not least due to the limited precision of floating-point numbers, tolerances must be considered in some algorithms, e.g. planarity and solid self-intersection. Effects of different tolerance values and their handling are discussed; recommendations for suitable values are given. The goal of the paper is to give a clear understanding of geometric validation in the context of 3D city models. This should also enable data holders to gain a better comprehension of the validation results and their consequences for the deployment fields of the validated data set.
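    The polygon-level planarity check described above can be sketched as follows. This is a hedged illustration, not the CityDoctor implementation: it estimates the polygon normal with Newell's method and flags any vertex lying farther than a tolerance from the best-fit plane through the centroid. The function names and the default tolerance value are invented for this example.

```python
import math

def newell_normal(pts):
    """Approximate unit normal of a 3D polygon via Newell's method."""
    nx = ny = nz = 0.0
    for (x1, y1, z1), (x2, y2, z2) in zip(pts, pts[1:] + pts[:1]):
        nx += (y1 - y2) * (z1 + z2)
        ny += (z1 - z2) * (x1 + x2)
        nz += (x1 - x2) * (y1 + y2)
    n = math.sqrt(nx * nx + ny * ny + nz * nz)
    return (nx / n, ny / n, nz / n)

def is_planar(pts, tol=0.01):
    """True if every vertex lies within `tol` (model units, e.g. metres)
    of the plane through the polygon centroid with the Newell normal."""
    n = newell_normal(pts)
    cx = sum(p[0] for p in pts) / len(pts)
    cy = sum(p[1] for p in pts) / len(pts)
    cz = sum(p[2] for p in pts) / len(pts)
    return all(
        abs((x - cx) * n[0] + (y - cy) * n[1] + (z - cz) * n[2]) <= tol
        for x, y, z in pts
    )
```

    The choice of `tol` matters exactly as the paper discusses: too tight a tolerance rejects polygons that are planar up to floating-point noise, too loose a tolerance hides genuine modelling errors.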

  13. Agricultural Baseline (BL0) scenario of the 2016 Billion-Ton Report

    DOE Data Explorer

    Davis, Maggie R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)] (ORCID:0000000181319328); Hellwinkel, Chad [University of Tennessee, APAC] (ORCID:0000000173085058); Eaton, Laurence [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)] (ORCID:0000000312709626); Langholtz, Matthew H [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)] (ORCID:0000000281537154); Turhollow, Anthony [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)] (ORCID:0000000228159350); Brandt, Craig [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)] (ORCID:0000000214707379); Myers, Aaron [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)] (ORCID:0000000320373827)

    2016-07-13

    Scientific reason for data generation: to serve as the reference case for the BT16 volume 1 agricultural scenarios. The agricultural baseline runs from 2015 through 2040; a starting year of 2014 is used. Date the data set was last modified: 02/12/2016. How each parameter was produced (methods), format, and relationship to other data in the data set: the simulation was developed without offering a farmgate price to energy crops or residues (i.e., building on both the USDA 2015 baseline and the agricultural census data (USDA NASS 2014)). Data generated are .txt output files by year, simulation identifier, and county code (1-3109). Instruments used: POLYSYS (version POLYS2015_V10_alt_JAN22B) supplied by the University of Tennessee APAC. The quality assurance and quality control that have been applied: • Check for negative planted area, harvested area, production, yield and cost values. • Check if harvested area exceeds planted area for annuals. • Check FIPS codes.
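    The three QA/QC checks listed above could be sketched as a row-level validation pass over the per-county output. This is an illustrative sketch only: the field names and the FIPS set below are invented stand-ins, not the actual POLYSYS output schema.

```python
# Hypothetical per-row QA for the three checks listed in the record;
# in practice VALID_FIPS would be loaded from a census FIPS code table.
VALID_FIPS = {"47001", "47093"}

def qa_errors(row):
    """Return a list of QA error messages for one county-year row."""
    errors = []
    # 1. No negative planted area, harvested area, production, yield or cost
    for field in ("planted_acres", "harvested_acres", "production", "yield", "cost"):
        if row[field] < 0:
            errors.append(f"negative {field}")
    # 2. Harvested area must not exceed planted area (annual crops)
    if row["harvested_acres"] > row["planted_acres"]:
        errors.append("harvested exceeds planted")
    # 3. County FIPS code must be a known code
    if row["fips"] not in VALID_FIPS:
        errors.append(f"unknown FIPS {row['fips']}")
    return errors
```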

  14. Water system microbial check valve development

    NASA Technical Reports Server (NTRS)

    Colombo, G. V.; Greenley, D. R.; Putnam, D. F.

    1978-01-01

    A residual iodine microbial check valve (RIMCV) assembly was developed and tested. The assembly is designed to be used in the space shuttle potable water system. The RIMCV is based on an anion exchange resin that is supersaturated with an iodine solution. This system causes a residual to be present in the effluent water which provides continuing bactericidal action. A flight prototype design was finalized and five units were manufactured and delivered.

  15. Design of Installing Check Dam Using RAMMS Model in Seorak National Park of South Korea

    NASA Astrophysics Data System (ADS)

    Jun, K.; Tak, W.; JUN, B. H.; Lee, H. J.; KIM, S. D.

    2016-12-01

    As more than 64% of the land in South Korea is mountainous, many regions are exposed to the danger of landslides and debris flow. It is therefore important to understand the behavior of debris flow in mountainous terrain, and various methods and models based on mathematical concepts are being developed for this purpose. The purpose of this study is to investigate regions that experienced debris flow due to the typhoon Ewiniar and to perform numerical modeling for the design and layout of check dams to reduce the damage caused by debris flow. For the numerical modeling, on-site measurement of the research area was conducted, including topographic investigation, a survey of the bridges downstream, and precision LiDAR 3D scanning to compose the basic data for the model. The numerical simulation of this study was performed using the RAMMS (Rapid Mass Movements Simulation) model for the analysis of the debris flow. The model was applied to check dam configurations installed upstream, midstream, and downstream. Considering the reduction effect on the debris flow, the expansion of the debris flow, and the influence on the bridges downstream, the proper location of the check dam was designated. 
The results of the numerical model showed that when the check dam was installed in the downstream section, 50 m above the bridge, the reduction effect on the debris flow was higher than when check dams were installed in other sections. Key words: debris flow, LiDAR, check dam, RAMMS. Acknowledgements: This research was supported by a grant [MPSS-NH-2014-74] through the Disaster and Safety Management Institute funded by the Ministry of Public Safety and Security of the Korean government.

  16. Proceedings of the First NASA Formal Methods Symposium

    NASA Technical Reports Server (NTRS)

    Denney, Ewen (Editor); Giannakopoulou, Dimitra (Editor); Pasareanu, Corina S. (Editor)

    2009-01-01

    Topics covered include: Model Checking - My 27-Year Quest to Overcome the State Explosion Problem; Applying Formal Methods to NASA Projects: Transition from Research to Practice; TLA+: Whence, Wherefore, and Whither; Formal Methods Applications in Air Transportation; Theorem Proving in Intel Hardware Design; Building a Formal Model of a Human-Interactive System: Insights into the Integration of Formal Methods and Human Factors Engineering; Model Checking for Autonomic Systems Specified with ASSL; A Game-Theoretic Approach to Branching Time Abstract-Check-Refine Process; Software Model Checking Without Source Code; Generalized Abstract Symbolic Summaries; A Comparative Study of Randomized Constraint Solvers for Random-Symbolic Testing; Component-Oriented Behavior Extraction for Autonomic System Design; Automated Verification of Design Patterns with LePUS3; A Module Language for Typing by Contracts; From Goal-Oriented Requirements to Event-B Specifications; Introduction of Virtualization Technology to Multi-Process Model Checking; Comparing Techniques for Certified Static Analysis; Towards a Framework for Generating Tests to Satisfy Complex Code Coverage in Java Pathfinder; jFuzz: A Concolic Whitebox Fuzzer for Java; Machine-Checkable Timed CSP; Stochastic Formal Correctness of Numerical Algorithms; Deductive Verification of Cryptographic Software; Coloured Petri Net Refinement Specification and Correctness Proof with Coq; Modeling Guidelines for Code Generation in the Railway Signaling Context; Tactical Synthesis Of Efficient Global Search Algorithms; Towards Co-Engineering Communicating Autonomous Cyber-Physical Systems; and Formal Methods for Automated Diagnosis of Autosub 6000.

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ellefson, S; Department of Human Oncology, University of Wisconsin, Madison, WI; Culberson, W

    Purpose: Discrepancies in absolute dose values have been detected between the ViewRay treatment planning system and ArcCHECK readings when performing delivery quality assurance on the ViewRay system with the ArcCHECK-MR diode array (SunNuclear Corporation). In this work, we investigate whether these discrepancies are due to errors in the ViewRay planning and/or delivery system or due to errors in the ArcCHECK’s readings. Methods: Gamma analysis was performed on 19 ViewRay patient plans using the ArcCHECK. Frequency analysis on the dose differences was performed. To investigate whether discrepancies were due to measurement or delivery error, 10 diodes in low-gradient dose regions were chosen to compare with ion chamber measurements in a PMMA phantom with the same size and shape as the ArcCHECK, provided by SunNuclear. The diodes chosen all had significant discrepancies in absolute dose values compared to the ViewRay TPS. Absolute doses to PMMA were compared between the ViewRay TPS calculations, ArcCHECK measurements, and measurements in the PMMA phantom. Results: Three of the 19 patient plans had 3%/3mm gamma passing rates less than 95%, and ten of the 19 plans had 2%/2mm passing rates less than 95%. Frequency analysis implied a non-random error process. Out of the 10 diode locations measured, ion chamber measurements were all within 2.2% error relative to the TPS and had a mean error of 1.2%. ArcCHECK measurements ranged from 4.5% to over 15% error relative to the TPS and had a mean error of 8.0%. Conclusion: The ArcCHECK performs well for quality assurance on the ViewRay under most circumstances. However, under certain conditions the absolute dose readings are significantly higher compared to the planned doses. As the ion chamber measurements consistently agree with the TPS, it can be concluded that the discrepancies are due to ArcCHECK measurement error and not TPS or delivery system error. 
This work was funded by the Bhudatt Paliwal Professorship and the University of Wisconsin Medical Radiation Research Center.
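    The 3%/3mm gamma analysis mentioned above combines a dose-difference criterion with a distance-to-agreement (DTA) criterion: a measured point passes if some nearby planned point agrees within the combined metric. The following is a minimal 1-D sketch of the idea, not the ArcCHECK vendor implementation, which works on full 3-D geometry with interpolation; the function name and defaults are invented for illustration.

```python
import math

def gamma_pass_rate(measured, planned, dx=1.0, dd=0.03, dta=3.0):
    """Simplified 1-D global gamma analysis (e.g. 3%/3 mm).
    measured/planned: dose arrays on a grid with spacing dx (mm);
    dd: dose-difference criterion as a fraction of the global max;
    dta: distance-to-agreement criterion (mm)."""
    dmax = max(planned)  # global normalisation dose
    passed = 0
    for i, m in enumerate(measured):
        best = math.inf
        for j, p in enumerate(planned):
            dist = abs(i - j) * dx
            if dist > 3 * dta:      # limit the search window
                continue
            g2 = (dist / dta) ** 2 + ((m - p) / (dd * dmax)) ** 2
            best = min(best, g2)
        if best <= 1.0:             # gamma <= 1 counts as a pass
            passed += 1
    return passed / len(measured)
```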

  18. OntoCheck: verifying ontology naming conventions and metadata completeness in Protégé 4.

    PubMed

    Schober, Daniel; Tudose, Ilinca; Svatek, Vojtech; Boeker, Martin

    2012-09-21

    Although policy providers have outlined minimal metadata guidelines and naming conventions, today's ontologies still display inter- and intra-ontology heterogeneities in class labelling schemes and metadata completeness. This is at least partially due to missing or inappropriate tools. Software support can ease this situation and contribute to overall ontology consistency and quality by helping to enforce such conventions. We provide a plugin for the Protégé ontology editor that allows easy checks of compliance with ontology naming conventions and metadata completeness, as well as curation of any violations found. In a requirement analysis, derived from a prior standardization approach carried out within the OBO Foundry, we investigate the capabilities software tools need in order to check, curate and maintain class naming conventions. A Protégé tab plugin was implemented accordingly using the Protégé 4.1 libraries. The plugin was tested on six different ontologies. Based on these test results, the plugin was refined, including the integration of new functionalities. The new Protégé plugin, OntoCheck, allows ontology tests to be carried out on OWL ontologies. In particular, the OntoCheck plugin helps to clean up an ontology with regard to lexical heterogeneity, i.e. enforcing naming conventions and metadata completeness, meeting most of the requirements outlined for such a tool. Detected violations can be corrected to foster consistency in entity naming and meta-annotation within an artefact. Once specified, check constraints like name patterns can be stored and exchanged for later re-use. Here we describe a first version of the software, illustrate its capabilities and use within running ontology development efforts, and briefly outline improvements resulting from its application. Further, we discuss OntoCheck's capabilities in the context of related tools and highlight potential future expansions. 
The OntoCheck plugin facilitates labelling error detection and curation, contributing to lexical quality assurance in OWL ontologies. Ultimately, we hope this Protégé extension will ease ontology alignments as well as lexical post-processing of annotated data and hence can increase overall secondary data usage by humans and computers.

  19. Simple Check Valves for Microfluidic Devices

    NASA Technical Reports Server (NTRS)

    Willis, Peter A.; Greer, Harold F.; Smith, J. Anthony

    2010-01-01

    A simple design concept for check valves has been adopted for microfluidic devices that consist mostly of (1) deformable fluorocarbon polymer membranes sandwiched between (2) borosilicate float glass wafers into which channels, valve seats, and holes have been etched. The first microfluidic devices in which these check valves are intended to be used are micro-capillary electrophoresis (microCE) devices undergoing development for use on Mars in detecting compounds indicative of life. In this application, it will be necessary to store some liquid samples in reservoirs in the devices for subsequent laboratory analysis, and check valves are needed to prevent cross-contamination of the samples. The simple check-valve design concept is also applicable to other microfluidic devices and to fluidic devices in general. These check valves are simplified microscopic versions of conventional rubber-flap check valves that are parts of numerous industrial and consumer products. These check valves are fabricated, not as separate components, but as integral parts of microfluidic devices. A check valve according to this concept consists of suitably shaped portions of a deformable membrane and the two glass wafers between which the membrane is sandwiched (see figure). The valve flap is formed by making an approximately semicircular cut in the membrane. The flap is centered over a hole in the lower glass wafer, through which hole the liquid in question is intended to flow upward into a wider hole, channel, or reservoir in the upper glass wafer. The radius of the cut exceeds the radius of the hole by an amount large enough to prevent settling of the flap into the hole. As in a conventional rubber-flap check valve, back pressure in the liquid pushes the flap against the valve seat (in this case, the valve seat is the adjacent surface of the lower glass wafer), thereby forming a seal that prevents backflow.

  20. pySeismicDQA: open source post experiment data quality assessment and processing

    NASA Astrophysics Data System (ADS)

    Polkowski, Marcin

    2017-04-01

    pySeismicDQA (Seismic Data Quality Assessment) is a Python-based, open-source set of tools dedicated to data processing after passive seismic experiments. The primary goal of this toolset is the unification of data types and formats from different dataloggers, as needed for further processing. This process requires additional checks for data errors, equipment malfunction, format errors, abnormal noise levels, etc. In all such cases the user decides (manually or by an automatic threshold) whether the data are removed from the output dataset. Additionally, the output dataset can be visualized in the form of a website with data-availability charts and waveform visualization with an (external) earthquake catalog. Data processing can be extended with simple STA/LTA event detection. pySeismicDQA was designed and tested for two passive seismic experiments in central Europe: PASSEQ 2006-2008 and "13 BB Star" (2013-2016). The National Science Centre Poland provided financial support for this work via NCN grant DEC-2011/02/A/ST10/00284.
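    The simple STA/LTA event detection mentioned above compares a short-term average (STA) of signal energy with a long-term average (LTA) and triggers where their ratio exceeds a threshold. A minimal sketch, with invented window lengths and threshold (production code would typically use the triggers in obspy.signal.trigger rather than this toy loop):

```python
def sta_lta_triggers(samples, sta_n=50, lta_n=500, threshold=3.0):
    """Return sample indices where STA/LTA of signal energy exceeds
    `threshold`. Window lengths sta_n < lta_n are in samples."""
    energy = [s * s for s in samples]
    triggers = []
    for i in range(lta_n, len(energy)):
        sta = sum(energy[i - sta_n:i]) / sta_n   # short-term energy average
        lta = sum(energy[i - lta_n:i]) / lta_n   # long-term energy average
        if lta > 0 and sta / lta > threshold:
            triggers.append(i)
    return triggers
```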

  1. 75 FR 65975 - Exchange Visitor Program-Secondary School Students

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-27

    ...The Department is revising existing Secondary School Student regulations regarding the screening, selection, school enrollment, orientation, and quality assurance monitoring of exchange students as well as the screening, selection, orientation, and quality assurance monitoring of host families and field staff. Further, the Department is adopting a new requirement regarding training for all organizational representatives who place and/or monitor students with host families. The proposed requirement to conduct FBI fingerprint-based criminal background checks will not be implemented at this time. Rather, it will continue to be examined and a subsequent Final Rule regarding this provision will be forthcoming. These regulations, as revised, govern the Department designated exchange visitor programs under which foreign secondary school students (ages 15-18\\1/2\\) are afforded the opportunity to study in the United States at accredited public or private secondary schools for an academic semester or year while living with American host families or residing at accredited U.S. boarding schools.

  2. A mask quality control tool for the OSIRIS multi-object spectrograph

    NASA Astrophysics Data System (ADS)

    López-Ruiz, J. C.; Vaz Cedillo, Jacinto Javier; Ederoclite, Alessandro; Bongiovanni, Ángel; González Escalera, Víctor

    2012-09-01

    The OSIRIS multi-object spectrograph uses a set of user-customised masks, which are manufactured on demand. The manufacturing process consists of drilling the specified slits into the mask with the required accuracy. Ensuring that slits are in the right place when observing is of vital importance. We present a tool for checking the quality of the mask manufacturing process, based on analyzing instrument images obtained with the manufactured masks in place. The tool extracts the slit information from these images, relates the specifications to the extracted slit information, and finally reports to the operator whether the manufactured mask fulfills the expectations of the mask designer. The proposed tool has been built using scripting languages and standard libraries such as opencv, pyraf and scipy. The software architecture, advantages and limits of this tool in the lifecycle of a multi-object acquisition are presented.

  3. SU-F-T-251: The Quality Assurance for the Heavy Patient Load Department in the Developing Country: The Primary Experience of An Entire Workflow QA Process Management in Radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xie, J; Wang, J; Peng, J

    Purpose: To implement an entire-workflow quality assurance (QA) process in the radiotherapy department and to reduce the error rates of radiotherapy based on entire-workflow management in a developing country. Methods: The entire-workflow QA process starts at patient registration and ends with the last treatment, covering all steps of the radiotherapy process. The error rate from chart checks is used to evaluate the entire-workflow QA process. Two to three qualified senior medical physicists checked the documents before the first treatment fraction of every patient. Random checks of the treatment history during treatment were also performed. A total of around 6000 patients' treatment data before and after implementing the entire-workflow QA process were compared from May 2014 to December 2015. Results: A systematic checklist was established. It mainly includes patient registration, treatment plan QA, information export to the OIS (Oncology Information System), documentation of treatment QA, and QA of the treatment history. The error rate derived from the chart checks decreased from 1.7% to 0.9% after the entire-workflow QA process was introduced. All errors detected before the first treatment fraction were corrected as soon as the oncologist re-confirmed them, and reinforced staff training followed accordingly to prevent those errors. Conclusion: The entire-workflow QA process improved the safety and quality of radiotherapy in our department, and we consider that our QA experience is applicable to heavily loaded radiotherapy departments in developing countries.

  4. Improved near real-time data management procedures for the Mediterranean ocean Forecasting System-Voluntary Observing Ship program

    NASA Astrophysics Data System (ADS)

    Manzella, G. M. R.; Scoccimarro, E.; Pinardi, N.; Tonani, M.

    2003-01-01

    A "ship of opportunity" program was launched as part of the Mediterranean Forecasting System Pilot Project (MFSPP). During the operational period (September 1999 to May 2000), six tracks covered the Mediterranean from the northern to southern boundaries approximately every 15 days, while a long east-west track from Haifa to Gibraltar was covered approximately every month. XBT data were collected, sub-sampled at 15 inflection points and transmitted through a satellite communication system to a regional data centre. It was found that this data transmission system has limitations in terms of the quality of the temperature profiles and the quantity of data successfully transmitted. At the end of the MFSPP operational period, a new strategy for data transmission and management was developed. First, VOS-XBT data are transmitted at full resolution. Second, a new data management system, called Near Real Time Quality Control for XBT (NRT.QC.XBT), was defined to produce a parallel stream of high-quality XBT data for further scientific analysis. The procedure includes: (1) position control; (2) elimination of spikes; (3) re-sampling at a 1 metre vertical interval; (4) filtering; (5) general malfunctioning check; (6) comparison with climatology (and distance from it in terms of standard deviations); (7) visual check; and (8) data consistency check. The first six steps of the new procedure are completely automated; they are also performed using a new climatology developed as part of the project. The visual checks are finally done with freely available software that allows final NRT data assessment.
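    Steps (2), (3) and (6) of the NRT.QC.XBT chain could be sketched as below. This is a hedged illustration of the generic techniques (running-median despiking, linear re-sampling to a 1 m grid, climatology comparison in standard deviations); the function names, window sizes and thresholds are assumptions for this example, not the actual implementation.

```python
def despike(values, window=5, max_dev=0.5):
    """Step (2): replace points deviating more than max_dev (degC) from
    the running median of `window` neighbours with that median."""
    half = window // 2
    out = list(values)
    for i in range(half, len(values) - half):
        med = sorted(values[i - half:i + half + 1])[half]
        if abs(values[i] - med) > max_dev:
            out[i] = med
    return out

def resample_1m(depths, temps, max_depth):
    """Step (3): linear interpolation onto a 1 m vertical grid."""
    grid, out, j = list(range(0, max_depth + 1)), [], 0
    for z in grid:
        while j < len(depths) - 2 and depths[j + 1] < z:
            j += 1
        z0, z1, t0, t1 = depths[j], depths[j + 1], temps[j], temps[j + 1]
        out.append(t0 + (t1 - t0) * (z - z0) / (z1 - z0))
    return grid, out

def climatology_flags(temps, clim_mean, clim_std, n_sigma=3.0):
    """Step (6): flag levels more than n_sigma climatological standard
    deviations from the climatological mean at the same depth."""
    return [abs(t - m) > n_sigma * s
            for t, m, s in zip(temps, clim_mean, clim_std)]
```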

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hardin, M; Harrison, A; Lockamy, V

    Purpose: A desire to improve efficiency and throughput inspired a review of our physics chart check procedures. Departmental policy mandates plan checks pre-treatment, after the first treatment, and weekly every 3–5 days. This study examined the effectiveness of the “after first” check with respect to improving patient safety and clinical efficiency. The type and frequency of variations discovered during this redundant secondary review were examined over seven months. Methods: A community spreadsheet was created to record variations in care discovered during chart review following the first fraction of treatment and before the second fraction (each plan having been reviewed prior to treatment). Entries were recorded from August 2014 through February 2015, amounting to 43 recorded variations out of 906 reviewed charts. The variations were divided into categories and frequencies were assessed month-to-month. Results: Analysis of the recorded variations indicates an overall variation rate of 4.7%. The initial rate was 13.5%; months 2–7 averaged 3.7%. The majority of variations related to discrepancies in documentation at 46.5%, followed by prescription, plan deficiency, and dose-tracking related variations at 25.5%, 12.8%, and 12.8%, respectively. Minor variations (negligible consequence on patient treatment) outweighed major variations 3 to 1. Conclusion: This work indicates that this redundant secondary check is effective. The first-month spike in rates could be due to the Hawthorne/observer effect, but the consistent 4% variation rate suggests the need for periodic re-training on the variations noted as frequent, to improve the awareness and quality of the initial chart review process, which may lead to improved treatment quality, patient safety and increased clinical efficiency. Utilizing these results, a continuous quality improvement process following Deming’s Plan-Do-Study-Act (PDSA) methodology was generated. 
The first iteration of this PDSA was adding a specific dose-tracking checklist item to the pre-treatment plan check assessment, the ramifications of which will be assessed in future data.

  6. Assessing women's sexuality after cancer therapy: checking assumptions with the focus group technique.

    PubMed

    Bruner, D W; Boyd, C P

    1999-12-01

    Cancer and cancer therapies impair sexual health in a multitude of ways. The promotion of sexual health is therefore vital for preserving quality of life and is an integral part of total or holistic cancer management. Nursing, to provide holistic care, requires research that is meaningful to patients as well as the profession in order to develop educational and interventional studies that promote sexual health and coping. To obtain meaningful research data, instruments that are reliable, valid, and pertinent to patients' needs are required. Several sexual functioning instruments were reviewed for this study and found to be lacking in either a conceptual foundation or psychometric validation. Without a defined conceptual framework, the authors of the instruments must have made certain assumptions regarding what women undergoing cancer therapy experience and what they perceive as important. To check these assumptions before assessing women's sexuality after cancer therapies in a larger study, a pilot study was designed to compare, using the focus group technique, what women experience and perceive as important regarding their sexuality with what is assessed in several currently available research instruments. Based on the focus group findings, current sexual functioning questionnaires may be lacking in pertinent areas of concern for women treated for breast or gynecologic malignancies. Better conceptual foundations may help future questionnaire design. Self-regulation theory may provide an acceptable conceptual framework from which to develop a sexual functioning questionnaire.

  7. Automated quality checks on repeat prescribing.

    PubMed Central

    Rogers, Jeremy E; Wroe, Christopher J; Roberts, Angus; Swallow, Angela; Stables, David; Cantrill, Judith A; Rector, Alan L

    2003-01-01

    BACKGROUND: Good clinical practice in primary care includes periodic review of repeat prescriptions. Markers of prescriptions that may need review have been described, but manually checking all repeat prescriptions against the markers would be impractical. AIM: To investigate the feasibility of computerising the application of repeat prescribing quality checks to electronic patient records in United Kingdom (UK) primary care. DESIGN OF STUDY: Software performance test against a benchmark manual analysis of a cross-sectional convenience sample of prescribing documentation. SETTING: Three general practices in Greater Manchester, in the north west of England, during a 4-month period in 2001. METHOD: A machine-readable drug information resource, based on the British National Formulary (BNF) as the 'gold standard' for valid drug indications, was installed in three practices. Software raised alerts for each repeat prescribed item where the electronic patient record contained no valid indication for the medication. Alerts raised by the software in two practices were analysed manually. Clinical reaction to the software was assessed by semi-structured interviews in three practices. RESULTS: There was no valid indication in the electronic medical records for 14.8% of repeat prescribed items. Sixty-two per cent of all alerts generated were incorrect. Forty-three per cent of all incorrect alerts were due to errors in the drug information resource, 44% to locally idiosyncratic clinical coding, 8% to the use of the BNF without adaptation as a gold standard, and 5% to the inability of the system to infer diagnoses that, although unrecorded, would be 'obvious' to a clinician reading the record. The interviewed clinicians supported the goals of the software. 
CONCLUSION: Using electronic records for secondary decision support purposes will benefit from (and may require) both more consistent electronic clinical data collection across multiple sites, and reconciling clinicians' willingness to infer unstated but 'obvious' diagnoses with the machine's inability to do the same. PMID:14702902

  8. 40 CFR Appendix A to Part 58 - Quality Assurance Requirements for SLAMS, SPMs and PSD Air Monitoring

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... monitor. 3.3.4.4Pb Performance Evaluation Program (PEP) Procedures. Each year, one performance evaluation... Information 2. Quality System Requirements 3. Measurement Quality Check Requirements 4. Calculations for Data... 10 of this appendix) and at a national level in references 1, 2, and 3 of this appendix. 1...

  9. 40 CFR Appendix A to Part 58 - Quality Assurance Requirements for SLAMS, SPMs and PSD Air Monitoring

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... monitor. 3.3.4.4Pb Performance Evaluation Program (PEP) Procedures. Each year, one performance evaluation... Information 2. Quality System Requirements 3. Measurement Quality Check Requirements 4. Calculations for Data... 10 of this appendix) and at a national level in references 1, 2, and 3 of this appendix. 1...

  10. Chemical Safety Alert: Shaft Blow-Out Hazard of Check and Butterfly Valves

    EPA Pesticide Factsheets

    Certain types of check and butterfly valves can undergo shaft-disk separation and fail catastrophically, even when operated within their design limits of pressure and temperature, causing toxic/flammable gas releases, fires, and vapor cloud explosions.

  11. Quantitative assessment of locomotive syndrome by the loco-check questionnaire in older Japanese females

    PubMed Central

    Noge, Sachiko; Ohishi, Tatsuo; Yoshida, Takuya; Kumagai, Hiromichi

    2017-01-01

    [Purpose] Locomotive syndrome (LS) is a condition in which older people may require care services because of problems with locomotive organs. This study examined whether the loco-check, a 7-item questionnaire, is useful for quantitatively assessing the severity of LS. [Subjects and Methods] Seventy-one community-dwelling Japanese females aged 64–96 years (81.7 ± 8.0 years) participated in this study. The associations of the loco-check with thigh muscle mass measured by X-ray CT, physical performance, nutritional status, and quality of life (QOL) were investigated. [Results] The results showed that the number of times that “yes” was selected in the loco-check was significantly correlated with thigh muscle mass, major measures of physical performance, nutritional status, and QOL. This number was also significantly larger in the participants who had experienced falls, fractures, and lumbar pain than in those without these episodes. [Conclusion] These results suggest that the loco-check might be useful for quantitatively evaluating LS. PMID:28932003

  12. Check Calibration of the NASA Glenn 10- by 10-Foot Supersonic Wind Tunnel (2014 Test Entry)

    NASA Technical Reports Server (NTRS)

    Johnson, Aaron; Pastor-Barsi, Christine; Arrington, E. Allen

    2016-01-01

    A check calibration of the 10- by 10-Foot Supersonic Wind Tunnel (SWT) was conducted in May/June 2014 using an array of five supersonic wedge probes to verify the 1999 calibration. This check calibration was necessary following a control systems upgrade and an integrated systems test (IST), and was required to verify that the tunnel flow quality was unchanged by the control systems upgrade before the next test customer began their test entry. The previous check calibration of the tunnel occurred in 2007, prior to the Mars Science Laboratory test program. Secondary objectives of this test entry included the validation of the new Cobra data acquisition system (DAS) against the current Escort DAS and the creation of statistical process control (SPC) charts through the collection of a series of repeated test points at certain predetermined tunnel parameters. The SPC chart secondary objective was not completed due to schedule constraints. It is hoped that this effort will be readdressed and completed in the near future.

  13. [The quality of medication orders--can it be improved?].

    PubMed

    Vaknin, Ofra; Wingart-Emerel, Efrat; Stern, Zvi

    2003-07-01

    Medication errors are a common cause of morbidity and mortality among patients. Medication administration in hospitals is a complicated procedure with the possibility of error at each step. Errors are most commonly found at the prescription and transcription stages, although it is known that most errors can easily be avoided through strict adherence to standardized procedure guidelines. In an examination of medication errors reported in the hospital in the year 2000, we found that 38% were reported to have resulted from transcription errors. In 2001, the hospital initiated a program designed to identify faulty processing of orders in an effort to improve the quality and effectiveness of the medication administration process. As part of this program, it was decided to check and evaluate the quality of doctors' written orders and the transcription of those orders by the nursing staff in various hospital units. The study was conducted using a questionnaire that checked compliance with hospital standards for the medication administration process, applied to 6 units over the course of 8 weeks. Results of the survey showed poor compliance with guidelines on the part of doctors and nurses. Only 18% of doctors' orders in the study and 37% of the nurses' transcriptions were written according to standards. The Emergency Department showed even lower compliance, with only 3% of doctors' orders and 25% of nurses' transcriptions complying with standards. As a result of this study, it was decided to initiate an intensive in-service teaching course to refresh the staff's knowledge of medication administration guidelines. In the future it is recommended that hand-written orders be replaced by computerized orders in an effort to limit the chance of error.

  14. Operative blood transfusion quality improvement audit.

    PubMed

    Al Sohaibani, Mazen; Al Malki, Assaf; Pogaku, Venumadhav; Al Dossary, Saad; Al Bernawi, Hanan

    2014-01-01

    To determine how the current anesthesia team handles the identification of the anaesthetized surgical patient (right patient) and the checking of the blood unit before collection and immediately before administration (right blood) in operating rooms, where nurses have minimal duties and responsibility in handling blood for transfusion to anaesthetized patients; and to elicit the degree of anesthesia staff compliance with new policies and procedures for blood transfusion administration to anaesthetized surgical patients. A large tertiary care reference and teaching hospital. A prospective quality improvement audit. Elaboration on the steps for administration of transfusion to anaesthetized patients from policies and procedures, and analysis of the audit forms for conducted transfusions. An audit form was used to capture key performance indicators (KPIs) observed in all procedures involving blood transfusion; each item was ticked as met, partially met, not met or not applicable. Descriptive statistics (numbers and percentages) were compiled in Microsoft Excel 2003, and the central quality improvement committee presented the results as numbers, percentages and graphs. The degree of compliance in performing the phases of blood transfusion by anesthesia staff reached a high percentage, providing assurance that the internal policies and procedures (IPP) are followed in the great majority of transfusions of red cells and other blood products, from the initial request for the blood or blood product to checking the patient in the immediate post-transfusion period. The specific problem area of giving blood transfusion to an anaesthetized patient was audited through KPIs covering the phases of blood transfusion, assuring the investigators of high-quality performance in transfusion procedures.

  15. Clinical implementation of RNA signatures for pharmacogenomic decision-making

    PubMed Central

    Tang, Weihua; Hu, Zhiyuan; Muallem, Hind; Gulley, Margaret L

    2011-01-01

    RNA profiling is increasingly used to predict drug response, dose, or toxicity based on analysis of drug pharmacokinetic or pharmacodynamic pathways. Before implementing multiplexed RNA arrays in clinical practice, validation studies are carried out to demonstrate sufficient evidence of analytic and clinical performance, and to establish an assay protocol with quality assurance measures. Pathologists assure quality by selecting input tissue and by interpreting results in the context of the input tissue as well as the technologies that were used and the clinical setting in which the test was ordered. A strength of RNA profiling is the array-based measurement of tens to thousands of RNAs at once, including redundant tests for critical analytes or pathways to promote confidence in test results. Instrument and reagent manufacturers are crucial for supplying reliable components of the test system. Strategies for quality assurance include careful attention to RNA preservation and quality checks at pertinent steps in the assay protocol, beginning with specimen collection and proceeding through the various phases of transport, processing, storage, analysis, interpretation, and reporting. Specimen quality is checked by probing housekeeping transcripts, while spiked and exogenous controls serve as a check on analytic performance of the test system. Software is required to manipulate abundant array data and present it for interpretation by a laboratory physician who reports results in a manner facilitating therapeutic decision-making. Maintenance of the assay requires periodic documentation of personnel competency and laboratory proficiency. These strategies are shepherding genomic arrays into clinical settings to provide added value to patients and to the larger health care system. PMID:23226056

  16. Biochemia Medica has started using the CrossCheck plagiarism detection software powered by iThenticate

    PubMed Central

    Šupak-Smolčić, Vesna; Šimundić, Ana-Maria

    2013-01-01

    In February 2013, Biochemia Medica joined CrossRef, which enabled us to implement the CrossCheck plagiarism detection service. Therefore, all manuscripts submitted to Biochemia Medica are now first assigned to a Research Integrity Editor (RIE) before being sent for peer review. The RIE submits the text for CrossCheck analysis and is responsible for reviewing the results of the text similarity analysis. Based on the CrossCheck analysis results, the RIE subsequently provides a recommendation to the Editor-in-Chief (EIC) on whether the manuscript should be forwarded to peer review, corrected in the suspect parts prior to peer review, or immediately rejected. The final decision on the manuscript, however, rests with the EIC. We hope that our new policy and manuscript processing algorithm will help us to further increase the overall quality of our Journal. PMID:23894858

  17. Rainfall, Streamflow, and Water-Quality Data During Stormwater Monitoring, Halawa Stream Drainage Basin, Oahu, Hawaii, July 1, 2006 to June 30, 2007

    USGS Publications Warehouse

    Young, Stacie T.M.; Jamison, Marcael T.J.

    2007-01-01

    Storm runoff water-quality samples were collected as part of the State of Hawaii Department of Transportation Stormwater Monitoring Program. This program is designed to assess the effects of highway runoff and urban runoff on Halawa Stream. For this program, rainfall data were collected at two stations, continuous streamflow data at three stations, and water-quality data at five stations, which include the two continuous streamflow stations. This report summarizes rainfall, streamflow, and water-quality data collected between July 1, 2006 and June 30, 2007. A total of 13 samples was collected over two storms during July 1, 2006 to June 30, 2007. The goal was to collect grab samples nearly simultaneously at all five stations and flow-weighted time-composite samples at the three stations equipped with automatic samplers. Samples were analyzed for total suspended solids, total dissolved solids, nutrients, chemical oxygen demand, and selected trace metals (cadmium, chromium, copper, lead, nickel, and zinc). Additionally, grab samples were analyzed for oil and grease, total petroleum hydrocarbons, fecal coliform, and biological oxygen demand. Quality-assurance/quality-control samples were also collected during storms and during routine maintenance to verify analytical procedures and check the effectiveness of equipment-cleaning procedures.

  18. Quality Assurance of Real-Time Oceanographic Data from the Cabled Array of the Ocean Observatories Initiative

    NASA Astrophysics Data System (ADS)

    Kawka, O. E.; Nelson, J. S.; Manalang, D.; Kelley, D. S.

    2016-02-01

    The Cabled Array component of the NSF-funded Ocean Observatories Initiative (OOI) provides access to real-time physical, chemical, geological, and biological data from water column and seafloor platforms/instruments at sites spanning the southern half of the Juan de Fuca Plate. The Quality Assurance (QA) program for OOI data is designed to ensure that data products meet OOI science requirements. This overall data QA plan establishes the guidelines for assuring OOI data quality and summarizes Quality Control (QC) protocols and procedures, based on best practices, which can be utilized to ensure the highest quality data across the OOI program. This presentation will highlight, specifically, the QA/QC approach being utilized for the OOI Cabled Array infrastructure and data and will include a summary of both shipboard and shore-based protocols currently in use. Aspects addressed will be pre-deployment instrument testing and calibration checks, post-deployment and pre-recovery field verification of data, and post-recovery "as-found" testing of instruments. Examples of QA/QC data will be presented and specific cases of cabled data will be discussed in the context of quality assessments and adjustment/correction of OOI datasets overall for inherent sensor drift and/or instrument fouling.

  19. A special ionisation chamber for quality control of diagnostic and mammography X ray equipment.

    PubMed

    Costa, A M; Caldas, L V E

    2003-01-01

    A quality control program for X ray equipment used for conventional radiography and mammography requires the constancy check of the beam qualities in terms of the half-value layers. In this work, a special double-faced parallel-plate ionisation chamber was developed with inner electrodes of different materials, in a tandem system. Its application will be in quality control programs of diagnostic and mammography X ray equipment for confirmation of half-value layers previously determined by the conventional method. Moreover, the chamber also may be utilised for measurements of air kerma values (and air kerma rates) in X radiation fields used for conventional radiography and mammography. The chamber was studied in relation to the characteristics of saturation, ion collection efficiency, polarity effects, leakage current, and short-term stability. The energy dependence in response of each of the two faces of the chamber was determined over the conventional radiography and mammography X ray ranges (unattenuated beams). The different energy response of the two faces of the chamber allowed the formation of a tandem system useful for the constancy check of beam qualities.
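    The tandem principle described above can be sketched numerically: because the two faces respond differently with energy, the ratio of their readings tracks beam quality, and a drift in that ratio beyond a tolerance flags a half-value-layer change. The readings, baseline ratio, and tolerance below are invented for illustration:

```python
def constancy_check(reading_a, reading_b, baseline_ratio, tolerance=0.02):
    """Tandem-chamber beam-quality constancy check (illustrative sketch).

    reading_a / reading_b is the ratio of the two faces' responses. A relative
    deviation from the commissioning baseline beyond `tolerance` suggests the
    half-value layer has changed. All numbers here are hypothetical.
    """
    ratio = reading_a / reading_b
    deviation = abs(ratio - baseline_ratio) / baseline_ratio
    return deviation <= tolerance

# Stable beam: ratio close to baseline
print(constancy_check(1.050, 1.000, baseline_ratio=1.048))  # True
# Drifted beam quality: ratio has shifted noticeably
print(constancy_check(1.150, 1.000, baseline_ratio=1.048))  # False
```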

  20. Improving treatment plan evaluation with automation.

    PubMed

    Covington, Elizabeth L; Chen, Xiaoping; Younge, Kelly C; Lee, Choonik; Matuszak, Martha M; Kessler, Marc L; Keranen, Wayne; Acosta, Eduardo; Dougherty, Ashley M; Filpansick, Stephanie E; Moran, Jean M

    2016-11-08

    The goal of this work is to evaluate the effectiveness of the Plan-Checker Tool (PCT), which was created to improve first-time plan quality, reduce patient delays, increase the efficiency of our electronic workflow, and standardize and automate the physics plan review in the treatment planning system (TPS). PCT uses an application programming interface to check and compare data from the TPS and treatment management system (TMS). PCT includes a comprehensive checklist of automated and manual checks that are documented when performed by the user as part of a plan readiness check for treatment. Prior to and during PCT development, errors identified during the physics review and causes of patient treatment start delays were tracked to prioritize which checks should be automated. Nineteen of 33 checklist items were automated, with data extracted by PCT. There was a 60% reduction in the number of patient delays in the six months after PCT release. PCT was successfully implemented for use on all external beam treatment plans in our clinic. While the number of errors found during the physics check did not decrease, automation of checks increased the visibility of errors during the physics check, which led to decreased patient delays. The methods used here can be applied to any TMS and TPS that allows queries of the database. © 2016 The Authors.
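    A cross-system comparison of the kind PCT automates can be sketched as follows. The field names and values are invented for illustration; the real tool queries the TPS and TMS databases through an application programming interface rather than comparing plain dictionaries:

```python
def compare_plan(tps_record, tms_record, fields):
    """Compare matching fields between a treatment planning system (TPS)
    export and a treatment management system (TMS) record, returning a
    PASS/FAIL checklist entry per field. Field names are hypothetical."""
    return {f: "PASS" if tps_record.get(f) == tms_record.get(f) else "FAIL"
            for f in fields}

tps = {"prescription_dose_cGy": 200, "fractions": 30, "energy": "6MV"}
tms = {"prescription_dose_cGy": 200, "fractions": 30, "energy": "10MV"}

report = compare_plan(tps, tms, ["prescription_dose_cGy", "fractions", "energy"])
print(report)  # the energy mismatch is flagged FAIL
```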

  1. Evaluation of Dosimetry Check software for IMRT patient-specific quality assurance.

    PubMed

    Narayanasamy, Ganesh; Zalman, Travis; Ha, Chul S; Papanikolaou, Niko; Stathakis, Sotirios

    2015-05-08

    The purpose of this study is to evaluate the use of the Dosimetry Check system for patient-specific IMRT QA. Typical QA methods measure the dose in an array dosimeter surrounded by homogeneous medium for which the treatment plan has been recomputed. With the Dosimetry Check system, fluence measurements acquired on a portal dosimeter are applied to the patient's CT scans. Instead of making dose comparisons in a plane, the Dosimetry Check system produces isodose lines and dose-volume histograms based on the planning CT images. By exporting the dose distribution from the treatment planning system into the Dosimetry Check system, one is able to make a direct comparison between the calculated dose and the planned dose. The versatility of the software was evaluated with respect to two IMRT techniques: step-and-shoot and volumetric arc therapy. The system analyzed measurements made using EPID, PTW seven29, and IBA MatriXX detectors, and an intercomparison study was performed. Plans from patients previously treated at our institution, with treatment sites including brain, head & neck, liver, lung, and prostate, were analyzed using the Dosimetry Check system for any anatomical-site dependence. We offer recommendations and possible precautions that may be necessary to ensure proper QA with the Dosimetry Check system.

  2. Prototype data terminal: Multiplexer/demultiplexer

    NASA Technical Reports Server (NTRS)

    Leck, D. E.; Goodwin, J. E.

    1972-01-01

    The design and operation of a quad redundant data terminal and a multiplexer/demultiplexer (MDU) design are described. The most distinctive feature is the design of the quad redundant data terminal; it is one of the few designs in which the unit is fail/op, fail/op, fail/safe. Laboratory tests confirm that the unit will operate satisfactorily with the failure of three out of four channels. Although the design utilizes state-of-the-art technology, the waveform error checks, the voting techniques, and the parity bit checks are believed to be used in unique configurations. The correct-word selection routines are also novel, if not unique. The MDU design, while not redundant, utilizes the latest state-of-the-art advantages of light couplers and integrated circuit amplifiers.

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ford, Eric C., E-mail: eford@uw.edu; Terezakis, Stephanie; Souranis, Annette

    Purpose: To quantify the error-detection effectiveness of commonly used quality control (QC) measures. Methods: We analyzed incidents from 2007-2010 logged into voluntary in-house electronic incident learning systems at 2 academic radiation oncology clinics. None of the incidents resulted in patient harm. Each incident was graded for potential severity using the French Nuclear Safety Authority scoring scale; high potential severity incidents (score >3) were considered, along with a subset of 30 randomly chosen low severity incidents. Each report was evaluated to identify which of 15 common QC checks could have detected it. The effectiveness was calculated, defined as the percentage of incidents that each QC measure could detect, both for individual QC checks and for combinations of checks. Results: In total, 4407 incidents were reported, 292 of which had high potential severity. High- and low-severity incidents were detectable by 4.0 ± 2.3 (mean ± SD) and 2.6 ± 1.4 QC checks, respectively (P<.001). All individual checks were less than 50% sensitive, with the exception of pretreatment plan review by a physicist (63%). An effectiveness of 97% was achieved with 7 checks used in combination and was not further improved with more checks. The combination of checks with the highest effectiveness includes physics plan review, physician plan review, Electronic Portal Imaging Device-based in vivo portal dosimetry, radiation therapist timeout, weekly physics chart check, the use of checklists, port films, and source-to-skin distance checks. Some commonly used QC checks, such as pretreatment intensity modulated radiation therapy QA, do not substantially add to the ability to detect errors in these data. Conclusions: The effectiveness of QC measures in radiation oncology depends sensitively on which checks are used and in which combinations. A small percentage of errors cannot be detected by any of the standard formal QC checks currently in broad use, suggesting that further improvements are needed. These data require confirmation with a broader incident-reporting database.
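    The "effectiveness of combinations" analysis above is essentially a set-cover computation: each incident is detectable by some subset of checks, and a combination's effectiveness is the share of incidents caught by at least one selected check. A toy sketch with invented incidents and check names (the study's actual data are not reproduced here):

```python
# Toy data: each incident maps to the set of QC checks that could detect it.
# Incident labels and detectability sets are invented for illustration.
incidents = {
    "inc1": {"physics_plan_review", "therapist_timeout"},
    "inc2": {"physics_plan_review", "weekly_chart_check"},
    "inc3": {"ssd_check"},
    "inc4": set(),  # detectable by no listed check
}

def effectiveness(selected):
    """Fraction of incidents detected by at least one selected check."""
    detected = sum(1 for checks in incidents.values() if checks & selected)
    return detected / len(incidents)

def greedy_order(all_checks):
    """Greedily add the check that most improves effectiveness, stopping
    when no remaining check detects any additional incident."""
    selected, order = set(), []
    while True:
        remaining = all_checks - selected
        best = max(remaining, key=lambda c: effectiveness(selected | {c}),
                   default=None)
        if best is None or effectiveness(selected | {best}) <= effectiveness(selected):
            break
        selected.add(best)
        order.append(best)
    return order

checks = {"physics_plan_review", "therapist_timeout",
          "weekly_chart_check", "ssd_check"}
order = greedy_order(checks)
print(order, effectiveness(set(order)))  # ['physics_plan_review', 'ssd_check'] 0.75
```

As in the study, effectiveness saturates: once the high-yield checks are chosen, further checks add nothing, and some incidents (here "inc4") are undetectable by any listed check.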

  4. Pill testing or drug checking in Australia: Acceptability of service design features.

    PubMed

    Barratt, Monica J; Bruno, Raimondo; Ezard, Nadine; Ritter, Alison

    2018-02-01

    This study aimed to determine design features of a drug-checking service that would be feasible, attractive and likely to be used by Australian festival and nightlife attendees. Web survey of 851 Australians reporting use of psychostimulants and/or hallucinogens and attendance at licensed venues past midnight and/or festivals in the past year (70% male; median age 23 years). A drug-checking service located at festivals or clubs would be used by 94%; a fixed-site service external to such events by 85%. Most (80%) were willing to wait an hour for their result. Almost all (94%) would not use a service if there was a possibility of arrest, and a majority (64%) would not use a service that did not provide individual feedback of results. Drug-checking results were only slightly more attractive if they provided comprehensive quantitative results compared with qualitative results of key ingredients. Most (93%) were willing to pay up to $5, and 68% up to $10, per test. One-third (33%) reported willingness to donate a whole dose for testing: they were more likely to be male, younger, less experienced, use drugs more frequently and attend venues/festivals less frequently. In this sample, festival- or club-based drug-checking services with low wait times and low cost appear broadly attractive under conditions of legal amnesty and individualised feedback. Quantitative analysis of ecstasy pills requiring surrender of a whole pill may appeal to a minority in Australia where pills are more expensive than elsewhere. [Barratt MJ, Bruno R, Ezard N, Ritter A. Pill testing or drug checking in Australia: Acceptability of service design features. Drug Alcohol Rev 2017;00:000-000]. © 2017 Australasian Professional Society on Alcohol and other Drugs.

  5. [Influence of an observer in the haemolysis produced during the extraction of blood samples in primary care].

    PubMed

    Bel-Peña, N; Mérida-de la Torre, F J

    2015-01-01

    To check whether an intervention based on direct observation and complementary information given to nurses helps reduce haemolysis when drawing blood specimens. A random sampling study in primary care centres in the Serranía de Málaga health management area, using a cross-sectional, longitudinal pre- and post-intervention design. The study period was from August 2012 to January 2015. The level of free haemoglobin was measured by direct spectrophotometry in the specimens extracted. It was then checked whether the intervention influenced the level of haemolysis, and whether this effect was maintained over time. The mean haemolysis rate measured pre-intervention was 17%, and after the intervention it was 6.1%. A year later, under the same transport and analysis conditions, the frequency of haemolysis was measured again in the samples analysed, and the percentage was 9%. This is low compared with the pre-intervention level, but higher than the level obtained immediately after the intervention. An intervention based on direct, informative observation of the process of collecting blood samples contributes significantly to reducing the level of haemolysis, and the effect is maintained over time, although the intervention needs to be repeated to maintain its effectiveness. Audits and continuing education programmes are useful for quality assurance procedures and for maintaining a good quality of care. Copyright © 2015 SECA. Published by Elsevier Espana. All rights reserved.

  6. Early experiences with the multidose drug dispensing system – A matter of trust?

    PubMed Central

    Wekre, Liv Johanne; Melby, Line; Grimsmo, Anders

    2011-01-01

    Objective To study early experiences with multidose drug dispensing (MDD) among different groups of health personnel. Design Qualitative study based on focus-group interviews. Setting Primary health care, Trondheim, Norway. Main outcome The importance of trust in the technology and in collaborating partners is actualized in the early implementation of MDD. Results GPs, home-care nurses, pharmacists, and medical secretaries trusted the new MDD technology. The quality of the GPs’ medication records improved. However, health personnel, including the GPs themselves, would not always trust the medication records of the GPs. Checking the multidose bags arriving from the pharmacy was considered unnecessary in the written routines dealing with MDD. However, home-care nurses experienced errors and continued to manually check the bags. Nurses in the home-care service felt a loss of knowledge with regard to the patients’ medications and in turn experienced reduced ability to give medical information to patients and to observe the effects of the drugs. The home-care services’ routines for drug handling were not always trusted by the other groups of health personnel involved. Conclusion Health personnel faced some challenges during the implementation of the MDD system, but most of them remained confident in the new system. Building trust has to be a process that runs in parallel with the introduction of new technology and the establishment of new routines for improving the quality in handling of medicines and to facilitate better cooperation and communication. PMID:21323496

  7. Comparison of alternative approaches for analysing multi-level RNA-seq data

    PubMed Central

    Mohorianu, Irina; Bretman, Amanda; Smith, Damian T.; Fowler, Emily K.; Dalmay, Tamas

    2017-01-01

    RNA sequencing (RNA-seq) is widely used for RNA quantification in the environmental, biological and medical sciences. It enables the description of genome-wide patterns of expression and the identification of regulatory interactions and networks. The aim of RNA-seq data analyses is to achieve rigorous quantification of genes/transcripts to allow a reliable prediction of differential expression (DE), despite variation in levels of noise and inherent biases in sequencing data. This can be especially challenging for datasets in which gene expression differences are subtle, as in the behavioural transcriptomics test dataset from D. melanogaster that we used here. We investigated the power of existing approaches for quality checking mRNA-seq data and explored additional, quantitative quality checks. To accommodate nested, multi-level experimental designs, we incorporated sample layout into our analyses. We employed a subsampling without replacement-based normalization and an identification of DE that accounted for the hierarchy and amplitude of effect sizes within samples, then evaluated the resulting differential expression calls in comparison to existing approaches. In a final step to test for broader applicability, we applied our approaches to a published set of H. sapiens mRNA-seq samples. The dataset-tailored methods improved sample comparability and delivered a robust prediction of subtle gene expression changes. The proposed approaches have the potential to improve key steps in the analysis of RNA-seq data by incorporating the structure and characteristics of biological experiments. PMID:28792517
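    The subsampling-without-replacement normalization mentioned above equalizes library depth by drawing the same number of reads from each sample. A minimal sketch of the idea (the paper's exact procedure and data structures are not reproduced; the gene names and counts are invented):

```python
import random

def subsample_counts(counts, depth, seed=0):
    """Downsample a per-gene read-count vector to `depth` total reads by
    sampling reads without replacement, so relative abundances are preserved
    in expectation. `counts` maps gene -> integer read count."""
    reads = [gene for gene, c in counts.items() for _ in range(c)]
    if depth > len(reads):
        raise ValueError("requested depth exceeds library size")
    picked = random.Random(seed).sample(reads, depth)  # without replacement
    out = dict.fromkeys(counts, 0)
    for gene in picked:
        out[gene] += 1
    return out

lib = {"geneA": 50, "geneB": 30, "geneC": 20}
norm = subsample_counts(lib, depth=60)
print(sum(norm.values()))  # 60
```

Applying the same target depth to every library makes count vectors directly comparable across samples without the scaling assumptions of ratio-based normalization.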

  8. Improving Software Quality and Management Through Use of Service Level Agreements

    DTIC Science & Technology

    2005-03-01

    many who believe that the quality of the development process is the best predictor of software product quality (Fenton). Repeatable software processes...reduced errors per KLOC for small projects (Fenton), and the quality management metric (QMM) (Machniak, Osmundson). There are also numerous IEEE 14...attention to cosmetic user interface issues and any problems that may arise with the prototype (Sawyer). The validation process is also another check

  9. The effects of group supervision of nurses: a systematic literature review.

    PubMed

    Francke, Anneke L; de Graaff, Fuusje M

    2012-09-01

    To gain insight into the existing scientific evidence on the effects of group supervision for nurses. A systematic literature study of original research publications. Searches were performed in February 2010 in PubMed, CINAHL, Cochrane Library, Embase, ERIC, the NIVEL catalogue, and PsycINFO. No limitations were applied regarding date of publication, language or country. Original research publications were eligible for review when they described group supervision programmes directed at nurses; used a control group or a pre-test post-test design; and gave information about the effects of group supervision on nurse or patient outcomes. The two review authors independently assessed studies for inclusion. The methodological quality of included studies was also independently assessed by the review authors, using a checklist developed by Van Tulder et al. in collaboration with the Dutch Cochrane Centre. Data related to the original publications were extracted by one review author and checked by a second review author. No statistical pooling of outcomes was performed, because there was large heterogeneity of outcomes. A total of 1087 potentially relevant references were found. After screening of the references, eight studies with a control group and nine with a pre-test post-test design were included. Most of the 17 studies included have serious methodological limitations, but four Swedish publications in the field of dementia care had high methodological quality and all point to positive effects on nurses' attitudes and skills and/or nurse-patient interactions. However, in interpreting these positive results, it must be taken into account that these four high-quality publications concern sub-studies of one 'sliced' research project using the same study sample. Moreover, these four publications combined a group supervision intervention with the introduction of individual care planning, which also hampers conclusions about the effectiveness of group supervision alone.
Although there are many indications that group supervision of nurses is effective, evidence on its effects is still scarce. Further methodologically sound research is needed. Copyright © 2011 Elsevier Ltd. All rights reserved.

  10. The IEO Data Center Management System: Tools for quality control, analysis and access marine data

    NASA Astrophysics Data System (ADS)

    Casas, Antonia; Garcia, Maria Jesus; Nikouline, Andrei

    2010-05-01

    Since 1994 the Data Centre of the Spanish Oceanographic Institute has been developing a system for archiving and quality control of oceanographic data. The work started in the frame of the European Marine Science & Technology Programme (MAST), when a consortium of several Mediterranean Data Centres began to work on the MEDATLAS project. Over the years, old software modules for MS DOS were rewritten, improved and migrated to the Windows environment. Oceanographic data quality control now includes not only vertical profiles (mainly CTD and bottle observations) but also time series of currents and sea level observations. New powerful routines for analysis and for graphic visualization were added. Data presented originally in ASCII format were recently organized in an open source MySQL database. Nowadays, the IEO, as part of the SeaDataNet Infrastructure, has designed and developed a new information system, consistent with the ISO 19115 and SeaDataNet standards, in order to manage the large and diverse marine data and information originated in Spain by different sources, and to interoperate with SeaDataNet. The system works with data stored in ASCII files (MEDATLAS, ODV) as well as data stored within the relational database. The components of the system are: 1. MEDATLAS Format and Quality Control - QCDAMAR: Quality Control of Marine Data. The main set of tools for working with data presented as text files. Includes extended quality control (searching for duplicated cruises and profiles; checking date, position, ship velocity, constant profiles, spikes, density inversion, sounding, acceptable data, impossible regional values, ...) and input/output filters. - QCMareas: A set of procedures for the quality control of tide gauge data according to the standards of the international Sea Level Observing System. These procedures include checking for unexpected anomalies in the time series, interpolation, filtering, computation of basic statistics and residuals. 2.
DAMAR: A relational database (MySQL) designed to manage the wide variety of marine information, such as common vocabularies, catalogues (CSR & EDIOS), data and metadata. 3. Other tools for analysis and data management - Import_DB: Script to import data and metadata from the MEDATLAS ASCII files into the database. - SelDamar/Selavi: Interface with the database for local and web access. Allows selective retrievals applying the criteria introduced by the user, such as geographical bounds, data responsible, cruises, platform, time periods, etc. Also includes calculation of statistical reference values, and plotting of original and mean profiles together with vertical interpolation. - ExtractDAMAR: Script to extract data archived in ASCII files that meet the criteria of a user request through the SelDamar interface and export them in ODV format, also performing a unit conversion.
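    One of the extended quality-control routines listed above, the spike check on a vertical profile, can be sketched as follows. This is an illustrative Python version of the neighbour-difference spike test widely used in oceanographic QC, not the IEO's QCDAMAR code; the function name and threshold are assumptions.

```python
def spike_check(values, threshold=2.0):
    """Flag spikes in a vertical profile: a point whose deviation from
    the mean of its two neighbours exceeds `threshold` (in the variable's
    units), after discounting the trend between the neighbours, is marked
    suspect. Returns the list of flagged indices."""
    flags = []
    for i in range(1, len(values) - 1):
        neighbour_mean = (values[i - 1] + values[i + 1]) / 2.0
        trend = abs(values[i - 1] - values[i + 1]) / 2.0
        if abs(values[i] - neighbour_mean) - trend > threshold:
            flags.append(i)
    return flags
```

    A smooth gradient produces no flags, while an isolated excursion is caught even on a sloping profile.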

  11. Quality assurance testing of acoustic doppler current profiler transform matrices

    USGS Publications Warehouse

    Armstrong, Brandy; Fulford, Janice M.; Thibodeaux, Kirk G.

    2015-01-01

    The U.S. Geological Survey (USGS) Hydrologic Instrumentation Facility (HIF) is nationally responsible for the design, testing, evaluation, repair, calibration, warehousing, and distribution of hydrologic instrumentation in use within the USGS Water Mission Area (WMA). The HIF's Hydraulic Laboratory has begun routine quality assurance (QA) testing and documenting the performance of every USGS WMA acoustic Doppler current profiler (ADCP) used for making velocity and discharge measurements. All existing ADCPs are being registered and tracked in a database maintained by the HIF, and recalled for QA checks in the HIF's Hydraulic Laboratory on a 3-year cycle. All new ADCPs purchased directly from the manufacturer, as well as ADCPs sent to the HIF or the manufacturer for repair, are being registered and tracked in the database and QA checked in the laboratory before being placed into service. Meters failing the QA check are sent directly to the manufacturer for repairs and rechecked by the HIF, or removed from service. Although this QA program is specific to the SonTek and Teledyne RD Instruments ADCPs most commonly used within the WMA, it is the intent of the USGS Office of Surface Water and the HIF to expand this program to include all bottom tracking ADCPs as they become available and more widely used throughout the WMA. As part of the HIF QA process, instruments are inspected for physical damage, each instrument must pass the ADCP diagnostic self-check tests, the temperature probe must be within ±2 degrees Celsius of a National Institute of Standards and Technology traceable reference thermometer, and the distance made good over a fixed distance must meet the manufacturer's specifications (±0.25% or ±1% difference). The transform matrix is tested by conducting distance-made-good (DMG) tests comparing the straight-line distance from bottom tracking to the measured tow-track distance.
The DMG test is conducted on each instrument twice in the forward and reverse directions (4 tows) at four orientations (16 total tows): with beam 1 oriented at 0 degrees to the towing direction; turned 45 degrees to the towing direction; turned 90 degrees to the towing direction; and turned 135 degrees to the towing direction. All QA data files and summary results are archived. This paper documents the methodology, participation and preliminary results of WMA ADCP QA testing.
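    The percent-difference acceptance criterion described above (±0.25% or ±1%, per manufacturer specification) amounts to a one-line check. The sketch below is illustrative only; the helper name `dmg_pass` is hypothetical and the tolerance values are those quoted in the abstract.

```python
def dmg_pass(measured_m, reference_m, pct_tolerance=0.25):
    """Distance-made-good acceptance sketch: the bottom-track distance
    (measured_m) must agree with the towed reference distance
    (reference_m) within the stated percent tolerance."""
    pct_diff = 100.0 * abs(measured_m - reference_m) / reference_m
    return pct_diff <= pct_tolerance
```

    In practice such a check would be applied to each of the 16 tows (four orientations, forward and reverse) before the instrument is accepted.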

  12. SARTools: A DESeq2- and EdgeR-Based R Pipeline for Comprehensive Differential Analysis of RNA-Seq Data.

    PubMed

    Varet, Hugo; Brillet-Guéguen, Loraine; Coppée, Jean-Yves; Dillies, Marie-Agnès

    2016-01-01

    Several R packages exist for the detection of differentially expressed genes from RNA-Seq data. The analysis process includes three main steps, namely normalization, dispersion estimation and the test for differential expression. Quality control steps along this process are recommended but not mandatory, and failing to check the characteristics of the dataset may lead to spurious results. In addition, normalization methods and statistical models are not exchangeable across the packages without adequate transformations, of which users are often not aware. Thus, dedicated analysis pipelines are needed to include systematic quality control steps and prevent errors from misusing the proposed methods. SARTools is an R pipeline for differential analysis of RNA-Seq count data. It can handle designs involving two or more conditions of a single biological factor, with or without a blocking factor (such as a batch effect or a sample pairing). It is based on DESeq2 and edgeR and is composed of an R package and two R script templates (for DESeq2 and edgeR, respectively). By tuning a small number of parameters and executing one of the R scripts, users have access to the full results of the analysis, including lists of differentially expressed genes and an HTML report that (i) displays diagnostic plots for quality control and model hypothesis checking and (ii) keeps track of the whole analysis process, parameter values and versions of the R packages used. SARTools provides systematic quality controls of the dataset as well as diagnostic plots that help to tune the model parameters. It gives access to the main parameters of DESeq2 and edgeR and prevents untrained users from misusing some functionalities of both packages. By keeping track of all the parameters of the analysis process, it fits the requirements of reproducible research.
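    The kind of systematic pre-analysis checks the abstract describes can be sketched as follows. SARTools itself is an R package; this Python sketch only mirrors the spirit of such checks, and the helper name `check_count_matrix` and its data model (dicts mapping sample names to count columns and to condition labels) are assumptions.

```python
from collections import Counter

def check_count_matrix(counts, conditions):
    """Minimal pre-analysis checks on an RNA-Seq count dataset
    (hypothetical helper, not part of SARTools): raw counts must be
    non-negative integers, and each level of the single biological
    factor needs at least two replicates. Returns a list of problems;
    an empty list means the dataset passes these checks."""
    problems = []
    for sample, column in counts.items():
        if any(c < 0 or int(c) != c for c in column):
            problems.append(f"{sample}: counts must be non-negative integers")
    replicates = Counter(conditions.values())
    for condition, n in replicates.items():
        if n < 2:
            problems.append(f"condition {condition!r} has {n} replicate(s); at least 2 needed")
    return problems
```

    Running such checks before normalization and dispersion estimation is what prevents the "spurious results" the abstract warns about.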

  13. Navigation Algorithms for the SeaWiFS Mission

    NASA Technical Reports Server (NTRS)

    Hooker, Stanford B. (Editor); Firestone, Elaine R. (Editor); Patt, Frederick S.; McClain, Charles R. (Technical Monitor)

    2002-01-01

    The navigation algorithms for the Sea-viewing Wide Field-of-view Sensor (SeaWiFS) were designed to meet the requirement of 1-pixel accuracy: a standard deviation (sigma) of 2. The objective has been to extract the best possible accuracy from the spacecraft telemetry and avoid the need for costly manual renavigation or geometric rectification. The requirement is addressed by postprocessing of both the Global Positioning System (GPS) receiver and Attitude Control System (ACS) data in the spacecraft telemetry stream. The navigation algorithms described are separated into four areas: orbit processing, attitude sensor processing, attitude determination, and final navigation processing. There has been substantial modification during the mission of the attitude determination and attitude sensor processing algorithms. For the former, the basic approach was completely changed during the first year of the mission, from a single-frame deterministic method to a Kalman smoother. This was done for several reasons: a) to improve the overall accuracy of the attitude determination, particularly near the sub-solar point; b) to reduce discontinuities; c) to support the single-ACS-string spacecraft operation that was started after the first mission year, which causes gaps in attitude sensor coverage; and d) to handle data quality problems (which became evident after launch) in the direct-broadcast data. The changes to the attitude sensor processing algorithms primarily involved the development of a model for the Earth horizon height, also needed for single-string operation; the incorporation of improved sensor calibration data; and improved data quality checking and smoothing to handle the data quality issues. The attitude sensor alignments have also been revised multiple times, generally in conjunction with the other changes. The orbit and final navigation processing algorithms have remained largely unchanged during the mission, aside from refinements to data quality checking.
Although further improvements are certainly possible, future evolution of the algorithms is expected to be limited to refinements of the methods presented here, and no substantial changes are anticipated.
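    The advantage of the Kalman smoother over a single-frame deterministic method, namely that each estimate draws on the whole data span, so gaps and discontinuities are bridged, can be illustrated with a toy scalar example. This is not the SeaWiFS implementation (which estimates full attitude states); the noise variances and the random-walk model are invented for illustration.

```python
def rts_smoother(zs, q=1e-4, r=1e-2, x0=0.0, p0=1.0):
    """Scalar fixed-interval (Rauch-Tung-Striebel) smoother sketch for a
    random-walk state observed with noise. q and r are the process and
    measurement noise variances (illustrative values only)."""
    xs, ps, xp, pp = [], [], [], []
    x, p = x0, p0
    for z in zs:                       # forward Kalman filter pass
        x_pred, p_pred = x, p + q      # random-walk prediction
        k = p_pred / (p_pred + r)      # Kalman gain
        x = x_pred + k * (z - x_pred)  # measurement update
        p = (1 - k) * p_pred
        xp.append(x_pred); pp.append(p_pred); xs.append(x); ps.append(p)
    smoothed = xs[:]                   # backward RTS pass
    for t in range(len(zs) - 2, -1, -1):
        c = ps[t] / pp[t + 1]
        smoothed[t] = xs[t] + c * (smoothed[t + 1] - xp[t + 1])
    return smoothed
```

    The backward pass pulls early estimates toward information gathered later, which is exactly what a forward-only, single-frame method cannot do.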

  14. A System Approach to Navy Medical Education and Training. Appendix 22. Otolaryngology Technician.

    DTIC Science & Technology

    1974-08-31

    PROCEDURES TO PATIENT; 12 EXPLAIN LUMBAR PUNCTURE PROCEDURES TO PATIENT; 13 MEASURE/WEIGH PATIENT OR PERSONNEL; 14 CHECK CENTRAL VENOUS PRESSURE; 15 TAKE BLOOD PRESSURE; 16 CHECK RADIAL (WRIST) PULSE; 17 CHECK FEMORAL PULSE FOR PRESENCE AND QUALITY; 18 DETERMINE APICAL PULSE RATE/RHYTHM WITH STETHOSCOPE; 19 CHECK PATIENT'S TEMPERATURE; 20 CHECK/COUNT RESPIRATIONS; 21 PERFORM CIRCULATION CHECK, E.G. COLOR, PULSE, TEMPERATURE OF SKIN, CAPILLARY RETURN; 22

  15. MoniQA: a general approach to monitor quality assurance

    NASA Astrophysics Data System (ADS)

    Jacobs, J.; Deprez, T.; Marchal, G.; Bosmans, H.

    2006-03-01

    MoniQA ("Monitor Quality Assurance") is a new, non-commercial, independent quality assurance software application developed in our medical physics team. It is a complete Java-based modular environment for the evaluation of radiological viewing devices, and it thus fits in the global quality assurance network of our filmless radiology department. The purpose of the software tool is to guide the medical physicist through an acceptance protocol and the radiologist through a constancy check protocol by presenting the necessary test patterns and by automated data collection. Data are then sent to a central management system for further analysis. At the moment more than 55 patterns have been implemented, which can be grouped in schemes to implement protocols (i.e. AAPM TG18, DIN and EUREF). Some test patterns are dynamically created and 'drawn' on the viewing device with random parameters, as is the case in a recently proposed new pattern for constancy testing. The software is installed on 35 diagnostic stations (70 monitors) in a filmless radiology department. Learning time was very limited. A constancy check with the new pattern, which assesses luminance decrease, resolution problems and geometric distortion, takes only 2 minutes and 28 seconds per monitor. The modular approach of the software allows the evaluation of new or emerging test patterns. We will report on the software and its usability: the practicality of the constancy check tests in our hospital and the results from acceptance tests of viewing stations for digital mammography.

  16. Delivery quality assurance with ArcCHECK

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Neilson, Christopher; Klein, Michael; Barnett, Rob

    2013-04-01

    Radiation therapy requires delivery quality assurance (DQA) to ensure that treatment is accurate and closely follows the plan. We report our experience with the ArcCHECK phantom and investigate its potential optimization for the DQA process. One-hundred seventy DQA plans from 84 patients were studied. Plans were classified into 2 groups: those with the target situated on the diodes of the ArcCHECK (D plans) and those with the target situated at the center (C plans). Gamma pass rates for 8 target sites were examined. The parameters used to analyze the data included 3%/3 mm with the Van Dyk percent difference criteria (VD) on, 3%/3 mm with the VD off, 2%/2 mm with the VD on, and x/3 mm with the VD on and the percentage dosimetric agreement “x” for diode plans adjusted. D plans typically displayed a higher maximum planned dose (MPD) on the cylindrical surface containing the ArcCHECK diodes than C plans, resulting in inflated gamma pass rates. When this was taken into account by adjusting the percentage dosimetric agreement, C plans outperformed D plans by an average of 3.5%. ArcCHECK can streamline the DQA process, consuming less time and resources than radiographic films. It is unnecessary to generate 2 DQA plans for each patient; a single center plan will suffice. Six of 8 target sites consistently displayed pass rates well within our acceptance criteria; the lesser performance of the head and neck and spinal sites can be attributed to marginally lower doses and steeper dose gradients in those plans.
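    The gamma pass-rate criteria used above (e.g. 3%/3 mm) can be illustrated with a one-dimensional gamma-index sketch. Real DQA software evaluates gamma in 3-D with interpolation and global/local normalization options; this simplified version only conveys the combined dose-difference/distance-to-agreement idea, and both function names are hypothetical.

```python
def gamma_1d(ref_dose, eval_dose, positions, dose_tol=0.03, dist_tol=3.0):
    """1-D gamma-index sketch (3%/3 mm style criteria): for each
    reference point, search the evaluated profile for the minimum
    combined dose/distance metric. Doses are normalised to the
    reference maximum; positions are in mm."""
    dmax = max(ref_dose)
    gammas = []
    for xr, dr in zip(positions, ref_dose):
        best = float("inf")
        for xe, de in zip(positions, eval_dose):
            dd = (de - dr) / (dose_tol * dmax)   # dose difference term
            dx = (xe - xr) / dist_tol            # distance term
            best = min(best, (dd * dd + dx * dx) ** 0.5)
        gammas.append(best)
    return gammas

def pass_rate(gammas):
    """Percentage of points with gamma <= 1 (the usual pass criterion)."""
    return 100.0 * sum(g <= 1.0 for g in gammas) / len(gammas)
```

    A point passes when some nearby evaluated dose is close enough in both dose and distance; the pass rate aggregates this over the measured profile.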

  17. Check valve installation in pilot operated relief valve prevents reverse pressurization

    NASA Technical Reports Server (NTRS)

    Oswalt, L.

    1966-01-01

    Two check valves prevent reverse flow through pilot-operated relief valves of differential area piston design. The valves control pressure flow to ensure that the piston dome pressure is always at least as great as the main relief valve discharge pressure.

  18. Poverty and Economic Decision-Making: Evidence from Changes in Financial Resources at Payday

    PubMed Central

    Carvalho, Leandro S.; Meier, Stephan; Wang, Stephanie W.

    2016-01-01

    We study the effect of financial resources on decision-making. Low-income U.S. households are randomly assigned to receive an online survey before or after payday. The survey collects measures of cognitive function and administers risk and intertemporal choice tasks. The study design generates variation in cash, checking and savings balances, and expenditures. Before-payday participants behave as if they are more present-biased when making intertemporal choices about monetary rewards but not when making intertemporal choices about non-monetary real-effort tasks. Nor do we find before-after differences in risk-taking, the quality of decision-making, the performance in cognitive function tasks, or in heuristic judgments. PMID:28003681

  19. ED15-0104-78

    NASA Image and Video Library

    2015-04-09

    The X-56A Multi-Utility Technology Testbed (MUTT) is greeted on an Edwards Air Force Base runway by a U.S. Air Force Research Laboratory (AFRL) team member. NASA’s Armstrong Flight Research Center and the AFRL, along with participants from Langley Research Center and Glenn Research Center, and support from Lockheed Martin, are using the second X-56A (dubbed “Buckeye”) to check out aircraft systems, evaluate handling qualities, characterize and expand the airplane’s performance envelope, and verify pre-flight predictions regarding aircraft behavior. The 20-minute flight marked the beginning of a research effort designed to yield significant advances in aeroservoelastic technology using a low-cost, modular, remotely piloted aerial vehicle.

  20. Long life assurance study for manned spacecraft long life hardware. Volume 3: Long life assurance studies of components

    NASA Technical Reports Server (NTRS)

    1972-01-01

    The guidelines for selecting hardware to be used in manned spacecraft to obtain a five year operational lifetime without maintenance were developed. An analysis was conducted on the design, application, failure mechanisms, manufacturing processes and controls, screen and burn-in techniques, and quality control of hardware items. The equipment considered for evaluation include: (1) electric motors and bearings; (2) accelerometers; (3) gyroscopes and bearings; (4) compressors and pumps; (5) magnetic tape recorders; (6) plumbing components and tubing; (7) check valves; (8) pressure regulators and solenoid valves; (9) thermal control valves; (10) pressure vessels and positive expulsion devices; (11) nickel cadmium batteries; and (12) transducers.

  1. Program Model Checking as a New Trend

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; Visser, Willem; Clancy, Daniel (Technical Monitor)

    2002-01-01

    This paper introduces a special section of STTT (International Journal on Software Tools for Technology Transfer) containing a selection of papers that were presented at the 7th International SPIN workshop, Stanford, August 30 - September 1, 2000. The workshop was named SPIN Model Checking and Software Verification, with an emphasis on model checking of programs. The paper outlines the motivation for stressing software verification, rather than only design and model verification, by presenting the work done in the Automated Software Engineering group at NASA Ames Research Center within the last 5 years. This includes work in software model checking, testing-like technologies, and static analysis.

  2. Low complexity Reed-Solomon-based low-density parity-check design for software defined optical transmission system based on adaptive puncturing decoding algorithm

    NASA Astrophysics Data System (ADS)

    Pan, Xiaolong; Liu, Bo; Zheng, Jianglong; Tian, Qinghua

    2016-08-01

    We propose and demonstrate a low complexity Reed-Solomon-based low-density parity-check (RS-LDPC) code with an adaptive puncturing decoding algorithm for an elastic optical transmission system. Part of the received code and the relevant columns in the parity-check matrix can be punctured to reduce the calculation complexity, via an adaptive parity-check matrix, during the decoding process. The results show that the complexity of the proposed decoding algorithm is reduced by 30% compared with the regular RS-LDPC system. The optimized code rate of the RS-LDPC code can be obtained after five iterations.
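    The column-puncturing idea can be sketched on a toy binary parity-check matrix. This is only a conceptual illustration of shrinking the decoding problem, not the paper's algorithm: a real LDPC decoder operates on the Tanner graph with soft information, and the function name is hypothetical.

```python
def puncture_parity_check(H, punctured_cols):
    """Puncturing sketch: drop the parity-check matrix columns for
    punctured (unreceived) code symbols, then drop any rows left with
    fewer than two participating symbols, since a check touching fewer
    than two remaining symbols carries no usable constraint. The result
    is a smaller matrix, hence a cheaper decoding problem."""
    erased = set(punctured_cols)
    keep_cols = [j for j in range(len(H[0])) if j not in erased]
    Hp = [[row[j] for j in keep_cols] for row in H]
    return [row for row in Hp if sum(row) >= 2]
```

    Each punctured symbol removes a column, and degenerate checks disappear with it, which is one way to see where the reported complexity reduction comes from.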

  3. Sediment depositions upstream of open check dams: new elements from small scale models

    NASA Astrophysics Data System (ADS)

    Piton, Guillaume; Le Guern, Jules; Carbonari, Costanza; Recking, Alain

    2015-04-01

    Torrent hazard mitigation remains a major issue in mountainous regions. In steep slope streams, and especially in their fan part, torrential floods mainly result from abrupt and massive sediment deposits. To curtail such phenomena, soil conservation measures as well as torrent control works have been undertaken for decades. Since the 1950s, open check dams have complemented other structural and non-structural measures in watershed scale mitigation plans [1]. They are often built to trap sediments near the fan apexes. The development of earthmoving machinery after WWII facilitated the dredging operations of open check dams. Hundreds of these structures have thus been built over 60 years. Their design evolved with the improving comprehension of torrential hydraulics and sediment transport; however, this kind of structure has a general tendency to trap most of the sediments supplied by the headwaters. Secondary effects such as channel incision downstream of the traps often followed an open check dam creation. This sediment starvation trend tends to propagate to the main valley rivers and to disrupt past geomorphic equilibriums. Taking this into account, and to diminish useless dredging operations, a better selectivity of sediment trapping must be sought in open check dams, i.e. optimal open check dams would trap sediments during dangerous floods and flush them during normal small floods. An accurate description of the hydraulic and deposition processes that occur in sediment traps is needed to optimize existing structures and to design best-adjusted new structures. A literature review [2] showed that while design criteria exist for the structure itself, little information is available on the dynamics of the sediment depositions upstream of open check dams, i.e.
what geomorphic patterns occur during the deposition? What friction laws and sediment transport formulas best describe massive depositions in sediment traps? What ranges of Froude and Shields numbers do the flows tend to adopt? New small scale model experiments have been undertaken focusing on deposition processes and their related hydraulics. Accurate photogrammetric measurements allowed us to better describe the deposition processes [3]. Large Scale Particle Image Velocimetry (LS-PIV) was performed to determine surface velocity fields in highly active channels with low grain submersion [4]. We will present preliminary results of our experiments showing the new elements we observed in massive deposit dynamics. REFERENCES: 1. Armanini, A., Dellagiacoma, F. & Ferrari, L. From the check dam to the development of functional check dams. Fluvial Hydraulics of Mountain Regions 37, 331-344 (1991). 2. Piton, G. & Recking, A. Design of sediment traps with open check dams: a review, part I: hydraulic and deposition processes. (Accepted by the) Journal of Hydraulic Engineering, 1-23 (2015). 3. Le Guern, J. MS thesis: Modélisation physique des plages de dépôt : analyse de la dynamique de remplissage (2014). 4. Carbonari, C. MS thesis: Small scale experiments of deposition processes occurring in sediment traps, LS-PIV measurements and geomorphological descriptions (in preparation).

  4. Comparison of the temperature accuracy between smart phone based and high-end thermal cameras using a temperature gradient phantom

    NASA Astrophysics Data System (ADS)

    Klaessens, John H.; van der Veen, Albert; Verdaasdonk, Rudolf M.

    2017-03-01

    Recently, low cost smart phone based thermal cameras are being considered for use in a clinical setting for monitoring physiological temperature responses such as body temperature change, local inflammations, perfusion changes or (burn) wound healing. These thermal cameras contain uncooled micro-bolometers with an internal calibration check and have a temperature resolution of 0.1 degree. For clinical applications a fast quality measurement before use is required (absolute temperature check), and quality control (stability, repeatability, absolute temperature, absolute temperature differences) should be performed regularly. Therefore, a calibrated temperature phantom has been developed, based on thermistor heating at both ends of a black coated metal strip, to create a controllable temperature gradient from room temperature (26 °C) up to 100 °C. The absolute temperatures on the strip are determined with five software-controlled PT-1000 sensors using lookup tables. In this study 3 FLIR-ONE cameras and one high-end camera were checked with this temperature phantom. The results show relatively good agreement between both the low-cost and high-end cameras and the phantom temperature gradient, with temperature differences of 1 degree up to 6 degrees between the cameras and the phantom. The measurements were repeated to assess absolute temperature and temperature stability over the sensor area. Both low-cost and high-end thermal cameras measured relative temperature changes with high accuracy and absolute temperatures with constant deviations. Low-cost smart phone based thermal cameras can be a good alternative to high-end thermal cameras for routine clinical measurements, appropriate to the research question, provided regular calibration checks are performed for quality control.
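    The quality check against the gradient phantom reduces to comparing camera readings with the PT-1000 reference points. The sketch below summarises the deviation in the way the abstract reports it (constant offsets, worst-case differences); the helper name `camera_deviation` is an assumption, not software from the study.

```python
def camera_deviation(camera_temps, reference_temps):
    """Quality-check sketch for a thermal camera against a gradient
    phantom: mean and maximum absolute deviation of the camera readings
    from the reference sensor temperatures (values in degrees Celsius)."""
    diffs = [abs(c - r) for c, r in zip(camera_temps, reference_temps)]
    return sum(diffs) / len(diffs), max(diffs)
```

    A constant mean deviation with a small spread corresponds to the "constant deviations" in absolute temperature the study observed.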

  5. A shared computer-based problem-oriented patient record for the primary care team.

    PubMed

    Linnarsson, R; Nordgren, K

    1995-01-01

    1. INTRODUCTION. A computer-based patient record (CPR) system, Swedestar, has been developed for use in primary health care. The principal aim of the system is to support continuous quality improvement through improved information handling, improved decision-making, and improved procedures for quality assurance. The Swedestar system has evolved during a ten-year period beginning in 1984. 2. SYSTEM DESIGN. The design philosophy is based on the following key factors: a shared, problem-oriented patient record; structured data entry based on an extensive controlled vocabulary; advanced search and query functions, where the query language has the most important role; integrated decision support for drug prescribing and care protocols and guidelines; integrated procedures for quality assurance. 3. A SHARED PROBLEM-ORIENTED PATIENT RECORD. The core of the CPR system is the problem-oriented patient record. All problems of one patient, recorded by different members of the care team, are displayed on the problem list. Starting from this list, a problem follow-up can be made, one problem at a time or for several problems simultaneously. Thus, it is possible to get an integrated view, across provider categories, of those problems of one patient that belong together. This shared problem-oriented patient record provides an important basis for the primary care team work. 4. INTEGRATED DECISION SUPPORT. The decision support of the system includes a drug prescribing module and a care protocol module. The drug prescribing module is integrated with the patient records and includes an on-line check of the patient's medication list for potential interactions and data-driven reminders concerning major drug problems. Care protocols have been developed for the most common chronic diseases, such as asthma, diabetes, and hypertension. The patient records can be automatically checked according to the care protocols. 5. PRACTICAL EXPERIENCE. 
The Swedestar system has been implemented in a primary care area with 30,000 inhabitants. It is being used by all the primary care team members: 15 general practitioners, 25 district nurses, and 10 physiotherapists. Several years of practical experience of the CPR system shows that it has a positive impact on quality of care on four levels: 1) improved clinical follow-up of individual patients; 2) facilitated follow-up of aggregated data such as practice activity analysis, annual reports, and clinical indicators; 3) automated medical audit; and 4) concurrent audit. Within that primary care area, quality of care has improved substantially in several aspects due to the use of the CPR system [1].
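    The on-line interaction check described in section 4 can be sketched as a lookup of the patient's medication list against a table of known interacting pairs. This is a conceptual illustration with a hypothetical data model, not the Swedestar implementation; drug names and the pair table are invented examples.

```python
def interaction_alerts(medication_list, new_drug, known_interactions):
    """On-line prescribing check sketch: when a new drug is prescribed,
    scan the patient's current medication list against a table of known
    interacting drug pairs and return the drugs that trigger an alert."""
    pairs = {frozenset(p) for p in known_interactions}
    return [drug for drug in medication_list
            if frozenset((drug, new_drug)) in pairs]
```

    Using unordered pairs (frozensets) means the table does not need to list both orderings of an interaction.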

  6. 46 CFR 160.133-9 - Preapproval review.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... manual as described in §§ 160.133-19 and 160.133-21 of this subpart; (7) A description of the quality... suppliers; (ii) The method for controlling the inventory of materials; (iii) The method for checking quality of fabrication and joints, including welding inspection procedures; and (iv) The inspection...

  7. 46 CFR 160.133-9 - Preapproval review.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... manual as described in §§ 160.133-19 and 160.133-21 of this subpart; (7) A description of the quality... suppliers; (ii) The method for controlling the inventory of materials; (iii) The method for checking quality of fabrication and joints, including welding inspection procedures; and (iv) The inspection...

  8. 46 CFR 160.170-9 - Preapproval review.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... manual as described in §§ 160.170-19 and 160.170-21 of this subpart; (7) A description of the quality... suppliers; (ii) The method for controlling the inventory of materials; (iii) The method for checking quality of fabrication and joints, including welding inspection procedures; and (iv) The inspection...

  9. 46 CFR 160.170-9 - Preapproval review.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... manual as described in §§ 160.170-19 and 160.170-21 of this subpart; (7) A description of the quality... suppliers; (ii) The method for controlling the inventory of materials; (iii) The method for checking quality of fabrication and joints, including welding inspection procedures; and (iv) The inspection...

  10. 46 CFR 160.133-9 - Preapproval review.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... manual as described in §§ 160.133-19 and 160.133-21 of this subpart; (7) A description of the quality... suppliers; (ii) The method for controlling the inventory of materials; (iii) The method for checking quality of fabrication and joints, including welding inspection procedures; and (iv) The inspection...

  11. 46 CFR 160.170-9 - Preapproval review.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... manual as described in §§ 160.170-19 and 160.170-21 of this subpart; (7) A description of the quality... suppliers; (ii) The method for controlling the inventory of materials; (iii) The method for checking quality of fabrication and joints, including welding inspection procedures; and (iv) The inspection...

  12. HISTORICAL EMISSION AND OZONE TRENDS IN THE HOUSTON AREA

    EPA Science Inventory

An analysis of historical trend data for emissions and air quality in Houston for the period 1974-78 is conducted for the purposes of checking the EKMA O3-predicting model and of exploring empirical relations between emission changes and O3 air quality in the Houston area. Results...

  13. 14 CFR 141.83 - Quality of training.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 14 Aeronautics and Space 3 2011-01-01 2011-01-01 false Quality of training. 141.83 Section 141.83 Aeronautics and Space FEDERAL AVIATION ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED) SCHOOLS AND... the FAA to administer any knowledge test, practical test, stage check, or end-of-course test to its...

  14. 14 CFR 141.83 - Quality of training.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 14 Aeronautics and Space 3 2010-01-01 2010-01-01 false Quality of training. 141.83 Section 141.83 Aeronautics and Space FEDERAL AVIATION ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED) SCHOOLS AND... the FAA to administer any knowledge test, practical test, stage check, or end-of-course test to its...

  15. Hard Spring Wheat Technical Committee 2016 Crop

    USDA-ARS?s Scientific Manuscript database

    Seven experimental lines of hard spring wheat were grown at up to five locations in 2016 and evaluated for kernel, milling, and bread baking quality against the check variety Glenn. Wheat samples were submitted through the Wheat Quality Council and processed and milled at the USDA-ARS Hard Red Spri...

  16. 40 CFR 63.6135 - How do I monitor and collect data to demonstrate continuous compliance?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... Combustion Turbines Continuous Compliance Requirements § 63.6135 How do I monitor and collect data to... quality assurance or quality control activities (including, as applicable, calibration checks and required... times the stationary combustion turbine is operating. (b) Do not use data recorded during monitor...

  17. 40 CFR 63.6135 - How do I monitor and collect data to demonstrate continuous compliance?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... Combustion Turbines Continuous Compliance Requirements § 63.6135 How do I monitor and collect data to... quality assurance or quality control activities (including, as applicable, calibration checks and required... times the stationary combustion turbine is operating. (b) Do not use data recorded during monitor...

  18. 40 CFR 63.6135 - How do I monitor and collect data to demonstrate continuous compliance?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... Combustion Turbines Continuous Compliance Requirements § 63.6135 How do I monitor and collect data to... quality assurance or quality control activities (including, as applicable, calibration checks and required... times the stationary combustion turbine is operating. (b) Do not use data recorded during monitor...

  19. Applicability of refractometry for fast routine checking of hospital preparations.

    PubMed

    Hendrickx, Stijn; Verón, Aurora Monteagudo; Van Schepdael, Ann; Adams, Erwin

    2016-04-30

    Quality control of hospital pharmacy formulations is of the utmost importance to ensure constant quality and to avoid potential mistakes before administration to the patient. In this study we investigated the applicability of refractometry as a fast, inexpensive and easy-to-use quality control measurement. Refractive indices (RI) of a multitude of different hospital formulations with varying concentrations of active compound were measured. The samples consisted of a number of binary aqueous solutions (one compound in water), complex aqueous solutions (multiple compounds in water or in a constant matrix), two suspensions and one emulsion. For all these formulations, linear regression analysis was performed, quality control limits determined and accuracy and repeatability were checked. Subsequently, actual hospital pharmacy samples were analyzed to check whether they were within the specified limits. For both binary and complex aqueous formulations, repeatability was good and a linear correlation for all samples could be observed on condition that the concentration of the active compound was sufficiently high. The refractometer was not sensitive enough for solutions of folic acid and levothyroxine, which had too low a concentration of active compound. Due to lack of homogeneity and light scattering, emulsions and suspensions do not seem suitable for quality control by refractometry. A mathematical equation was generated to predict the refractive index of an aqueous solution containing clonidine HCl as active compound. Values calculated from the equation were compared with measured values and deviations of all samples were found to be lower than 1.3%. In order to use refractometry in a hospital pharmacy for quality control of multicomponent samples, additional intermediate measurements would be required, to overcome the fact that refractometry is not compound specific. 
In conclusion, we found that refractometry could potentially be useful for daily, fast quality measurements of relatively concentrated binary and more complex aqueous solutions in the hospital pharmacy. Copyright © 2016 Elsevier B.V. All rights reserved.
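The core of the QC approach described above is a linear calibration of refractive index against concentration, with acceptance limits on the deviation between measured and predicted values. The sketch below illustrates that idea with hypothetical calibration data and the ~1.3% deviation bound mentioned in the abstract; it is an illustration of the method, not the study's actual calibration.

```python
# Minimal sketch of a refractometry QC check: fit a linear calibration of
# refractive index (RI) vs. concentration, then flag samples whose measured
# RI deviates too far from the predicted value. All numbers are hypothetical.

def fit_line(xs, ys):
    """Ordinary least-squares fit y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    b = my - a * mx
    return a, b

# Hypothetical calibration standards: (concentration in % w/v, measured RI)
conc = [1.0, 2.0, 4.0, 8.0]
ri   = [1.3345, 1.3360, 1.3389, 1.3447]
a, b = fit_line(conc, ri)

def check_sample(ri_measured, conc_nominal, rel_tol=0.013):
    """Accept a sample if its measured RI lies within a relative tolerance
    (here 1.3%, as in the abstract's reported deviations) of the RI
    predicted from the nominal concentration."""
    ri_pred = a * conc_nominal + b
    return abs(ri_measured - ri_pred) / ri_pred <= rel_tol

print(check_sample(1.3361, 2.0))  # close to predicted -> True
```

Because refractometry is not compound-specific, a check like this can only confirm that the total dissolved content is plausible, which is why the authors note that multicomponent samples need additional intermediate measurements.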

  20. Healthy participants in phase I clinical trials: the quality of their decision to take part.

    PubMed

    Rabin, Cheryl; Tabak, Nili

    2006-08-01

This study set out to test the quality of the decision-making process of healthy volunteers in clinical trials. Researchers fear that the decision to volunteer for clinical trials is taken inadequately and that the signature on the consent forms, meant to affirm that consent was 'informed', is actually insubstantial. The study design was quasi-experimental, using a convenience quota sample. Over a period of a year, candidates were approached during their screening process for a proposed clinical trial, after concluding the required 'Informed Consent' procedure. In all, 100 participants in phase I trials filled out questionnaires, based ultimately on the Janis and Mann model of vigilant information processing, during their stay in the research centre. Only 35% of the participants reached a 'quality decision'. There is a definite correlation between information processing and quality decision-making. However, many of the healthy research volunteers (58%) neither seek out information nor check alternatives before making a decision. Full disclosure is essential to a valid informed consent procedure but not sufficient; emphasis must be put on having the information understood and assimilated. Research nurses play a central role in achieving this objective.

  1. [Impact of a quality assurance program on the use of neuromuscular monitoring and reversal of muscle relaxants].

    PubMed

    Motamed, C; Bourgain, J-L

    2009-04-01

As part of quality assurance in the anaesthesia department, this study was designed to enhance the rate of neuromuscular blockade monitoring for patients receiving muscle relaxants during anaesthesia. After approval by our local ethics committee, we assessed 200 computerized anaesthesia records in which neuromuscular relaxants were used. The following data were collected: demographic characteristics, durations of anaesthesia and surgery, use of neuromuscular monitoring, reversal agents, and the quality of neuromuscular monitoring. The results were discussed with all anaesthesia providers of the department and an internal guideline was elaborated with the endpoint that all patients receiving muscle relaxants should have quantitative neuromuscular monitoring. Six months later, another assessment of 200 consecutive records collected the same data to check the efficiency of the elaborated guideline. The monitoring rate was 67% at the first assessment and increased to 94% (p<0.05). The reversal rate was 48% at the first assessment and was stable at the second assessment (50%). The rate of patients neither monitored nor reversed decreased from 5 to 2% (p<0.05). This study shows that, as part of a quality assurance program, systematic quantitative monitoring of neuromuscular blockade can be significantly increased.

  2. The Absolute Vector Magnetometers on Board Swarm, Lessons Learned From Two Years in Space.

    NASA Astrophysics Data System (ADS)

    Hulot, G.; Leger, J. M.; Vigneron, P.; Brocco, L.; Olsen, N.; Jager, T.; Bertrand, F.; Fratter, I.; Sirol, O.; Lalanne, X.

    2015-12-01

ESA's Swarm satellites carry 4He absolute magnetometers (ASM), designed by CEA-Léti and developed in partnership with CNES. These instruments are the first-ever spaceborne magnetometers to use a common sensor to simultaneously deliver 1Hz independent absolute scalar and vector readings of the magnetic field. They have provided the very high accuracy scalar field data nominally required by the mission (for both science and calibration purposes, since each satellite also carries a low noise high frequency fluxgate magnetometer designed by DTU), but also very useful experimental absolute vector data. In this presentation, we will report on the status of the instruments, as well as on the various tests and investigations carried out using these experimental data since launch in November 2013. In particular, we will illustrate the advantages of flying ASM instruments on spaceborne magnetic missions for nominal data quality checks, geomagnetic field modeling and science objectives.

  3. Good health checks according to the general public; expectations and criteria: a focus group study.

    PubMed

    Stol, Yrrah H; Asscher, Eva C A; Schermer, Maartje H N

    2018-06-22

    Health checks or health screenings identify (risk factors for) disease in people without a specific medical indication. So far, the perspective of (potential) health check users has remained underexposed in discussions about the ethics and regulation of health checks. In 2017, we conducted a qualitative study with lay people from the Netherlands (four focus groups). We asked what participants consider characteristics of good and bad health checks, and whether they saw a role for the Dutch government. Participants consider a good predictive value the most important characteristic of a good health check. Information before, during and after the test, knowledgeable and reliable providers, tests for treatable (risk factors for) disease, respect for privacy, no unnecessary health risks and accessibility are also mentioned as criteria for good health checks. Participants make many assumptions about health check offers. They assume health checks provide certainty about the presence or absence of disease, that health checks offer opportunities for health benefits and that the privacy of health check data is guaranteed. In their choice for provider and test they tend to rely more on heuristics than information. Participants trust physicians to put the interest of potential health check users first and expect the Dutch government to intervene if providers other than physicians failed to do so by offering tests with a low predictive value, or tests that may harm people, or by infringing the privacy of users. Assumptions of participants are not always justified, but they may influence the choice to participate. This is problematic because choices for checks with a low predictive value that do not provide health benefits may create uncertainty and may cause harm to health; an outcome diametrically opposite to the one intended. Also, this may impair the relationship of trust with physicians and the Dutch government. 
To further and protect autonomous choice and to maintain trust, we recommend the following measures to timely adjust false expectations: advertisements that give an accurate impression of health check offers, and the installation of a quality mark.

  4. The Single Soldier Quality of Life Initiative: Great Expectations of Privacy

    DTIC Science & Technology

    1995-04-01

    without regard to their marital status and to hold them accountable to established standards. 18 To many "old soldiers," some of the ideas contained...Family Housing Office: assign and terminate quarters, conduct check-in and check-out inspections, maintain accountability of SQ furniture, follow up on...integrity is a second priority." 2 6 Further hindering unit integrity is that smoking preference of the soldiers must be taken into account when making

  5. Valid internal standard technique for arson detection based on gas chromatography-mass spectrometry.

    PubMed

    Salgueiro, Pedro A S; Borges, Carlos M F; Bettencourt da Silva, Ricardo J N

    2012-09-28

The most popular procedures for the detection of residues of accelerants in fire debris are the ones published by the American Society for Testing and Materials (ASTM E1412-07 and E1618-10). The most critical stages of these tests are the conservation of fire debris from sampling to the laboratory, the extraction of accelerant residues from the debris to the activated charcoal strips (ACS) and from those to the final solvent, as well as the analysis of the sample extract by gas chromatography-mass spectrometry (GC-MS) and the interpretation of the instrumental signal. This work proposes a strategy for checking the quality of sample conservation, the transfer of accelerant residues to the final solvent, and the GC-MS analysis, using internal standard additions. Internal standards are used, ranging from a highly volatile compound for checking debris conservation to a low-volatility compound for checking GC-MS repeatability. The developed quality control (QC) parameters are not affected by GC-MS sensitivity variation and, specifically, the GC-MS performance control is not affected by ACS adsorption saturation that may mask test performance deviations. The proposed QC procedure proved to be adequate to check GC-MS repeatability, ACS extraction and sample conservation since: (1) the standard additions are affected by negligible uncertainty and (2) the observed dispersion of the QC parameters is fit for its intended use. Copyright © 2012 Elsevier B.V. All rights reserved.
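A key point of the abstract above is that ratios of internal-standard responses, rather than absolute peak areas, make the QC parameters insensitive to overall GC-MS sensitivity drift. The sketch below illustrates that idea; the peak areas, expected ratio, and tolerance are hypothetical, not values from the paper.

```python
# Sketch of a ratio-based internal-standard QC check, loosely following the
# idea in the abstract: standards of different volatility monitor debris
# conservation, ACS extraction and GC-MS repeatability. Using a ratio to a
# low-volatility reference cancels overall sensitivity drift.
# All numeric values below are hypothetical.

def qc_ratio(area_std, area_ref):
    """Ratio of an internal-standard peak area to the low-volatility
    reference standard's peak area."""
    return area_std / area_ref

def passes(ratio, expected, rel_tol=0.25):
    """Accept the QC parameter if it lies within a relative tolerance of
    the expected ratio established during method validation."""
    return abs(ratio - expected) / expected <= rel_tol

# Hypothetical run: high-volatility standard (conservation check) vs.
# low-volatility reference (GC-MS repeatability check).
r = qc_ratio(area_std=8.2e5, area_ref=1.0e6)
print(passes(r, expected=0.80))  # 0.82 vs 0.80 -> within tolerance -> True
```

A loss of the volatile standard (e.g. a poorly sealed container) would drive the ratio well below its expected value and fail the conservation check even when the instrument itself is performing normally.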

  6. 107: REVIEW OF THE QUALITY HOSPITALS WEBSITES IN KHORASAN RAZAVI PROVINCE

    PubMed Central

    Dastani, Meisam; Sattari, Masoume

    2017-01-01

Background and aims The aim of the present study is to present a clear picture of the quality of hospital websites in Khorasan Razavi province in four dimensions: content, performance, management, and usage of the website. Methods This is a survey study. The sample consisted of 49 hospital websites in Khorasan Razavi province. The instrument was a checklist of 21 components covering four criteria (content, performance, management, and usage of the website); its validity and reliability had been established in previous studies. The data were analyzed using descriptive statistics. Results The findings showed that only 59% of hospitals in Khorasan Razavi had an active website. Overall, most websites were rated very weak (51%), followed by moderate (26.5%) and weak (16.3%). With respect to the criteria of content, performance, management, usage, and design, 40.8% of websites were in an unfavorable condition. Of the 16 higher-quality websites selected, only three (Sina Hospital in Mashhad, Javdoll-Aemeh in Mashhad, and Razavi) were in good condition; the other websites were in weak condition. Conclusion The results of this study indicate that most websites do not yet meet minimal medical standards and have not established good relationships with their audiences. In Iran, the quality and performance of websites has not yet been a priority for improving service quality in hospitals. The findings of this study can support the identification and development of quality criteria for hospital websites in terms of design, content, performance, management, and usage.

  7. PFReports: A program for systematic checking of annual peaks in NWISWeb

    USGS Publications Warehouse

    Ryberg, Karen R.

    2008-01-01

    The accuracy, characterization, and completeness of the U.S. Geological Survey (USGS) peak-flow data drive the determination of flood-frequency estimates that are used daily to design water and transportation infrastructure, delineate flood-plain boundaries, and regulate development and utilization of lands throughout the Nation and are essential to understanding the implications of climate change on flooding. Indeed, this high-profile database reflects and highlights the quality of USGS water-data collection programs. Its extension and improvement are essential to efforts to strengthen USGS networks and science leadership and is worthy of the attention of Water Science Center (WSC) hydrographers. This document describes a computer program, PFReports, and its output that facilitates efficient and robust review and correction of data in the USGS Peak Flow File (PFF) hosted as part of NWISWeb (the USGS public Web interface to much of the data stored and managed within the National Water Information System or NWIS). Checks embedded in the program are recommended as part of a more comprehensive assessment of peak flow data that will eventually include examination of possible regional changes, seasonal changes, and decadal variations in magnitude, timing, and frequency. Just as important as the comprehensive assessment, cleaning up the database will increase the likelihood of improved WSC regional flood-frequency equations. As an example of the value of cleaning up the PFF, data for 26,921 sites in the PFF were obtained. Of those sites, 17,542 sites had peak streamflow values and daily values. For the 17,542 sites, 1,097 peaks were identified that were less than the daily value for the day on which the peak occurred. Of the 26,921 sites, 11,643 had peak streamflow values, concurrent daily values, and at least 10 peaks. At the 11,643 sites, 2,205 peaks were identified as potential outliers in a regression of peak streamflows on daily values. 
Previous efforts to identify problems with the PFF were time consuming, laborious, and often ineffective. This new suite of checks represents an effort to automate identification of specific problems without plotting or printing large amounts of data that may not have problems. In addition, the results of the checks of the peak flow files are delivered through the World Wide Web with links to individual reports so that WSCs can focus on specific problems in an organized and standardized fashion. Over the years, technical reviews, regional-flood studies, and user inquiries have identified many minor and some major problems in the PFF. However, the cumbersome nature of the PFF editor and a lack of analytical tools have hampered efforts at quality assurance/quality control (QA/QC) and subsequently to make needed revisions to the database. This document is organized to provide information regarding PFReports, especially those tests involving regression and to provide an overview of the review procedures for utilizing the output. It also may be used as a reference for the data qualification codes and abbreviations for the tests. Results of the checks for all peak flow files (March 2008) are available at http://nd.water.usgs.gov/internal/pfreports/.
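The simplest consistency check described in the PFReports abstract above, that an instantaneous annual peak cannot be lower than the mean daily flow on the day it occurred, can be sketched in a few lines. The data structures below are hypothetical stand-ins for PFF and daily-values records, not the program's actual interface.

```python
# Sketch of the basic PFReports-style consistency check: flag any annual
# peak streamflow that is lower than the mean daily value for the day on
# which the peak occurred (physically impossible, so likely a data error).
# Dates and flows (in cfs) below are illustrative.

def peaks_below_daily(peaks, daily):
    """peaks: {date: instantaneous peak flow}; daily: {date: mean daily flow}.
    Return the dates whose recorded peak falls below the daily mean."""
    return [d for d, p in peaks.items() if d in daily and p < daily[d]]

peaks = {"1997-04-05": 1200.0, "1998-06-12": 640.0}
daily = {"1997-04-05": 950.0, "1998-06-12": 700.0}

print(peaks_below_daily(peaks, daily))  # ['1998-06-12']
```

The abstract's second class of checks, regressing peak streamflows on concurrent daily values to flag outliers, extends the same idea from a hard physical constraint to a statistical screen.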

  8. Remotely adjustable check-valves with an electrochemical release mechanism for implantable biomedical microsystems.

    PubMed

    Pan, Tingrui; Baldi, Antonio; Ziaie, Babak

    2007-06-01

    In this paper, we present two remotely adjustable check-valves with an electrochemical release mechanism for implantable biomedical microsystems. These valves allow one to vary the opening pressure set-point and flow resistance over a period of time. The first design consists of a micromachined check-valve array using a SU-8 polymer structural layer deposited on the top of a gold sacrificial layer. The second design is based on a variable length cantilever beam structure with a gold sacrificial layer. The adjustable cantilever-beam structure is fabricated by gold thermo-compression bond of a thin silicon wafer over a glass substrate. In both designs, the evaporated gold can be electrochemically dissolved using a constant DC current via a telemetry link. In the first design the dissolution simply opens up individual outlets, while in the second design, gold anchors are sequentially dissolved hence increasing the effective length of the cantilever beam (reducing the opening pressure). A current density of 35 mA/cm(2) is used to dissolve the gold sacrificial layers. Both gravity and syringe-pump driven flow are used to characterize the valve performance. A multi-stage fluidic performance (e.g. flow resistance and opening pressure) is clearly demonstrated.

  9. 14 CFR 121.315 - Cockpit check procedure.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 14 Aeronautics and Space 3 2010-01-01 2010-01-01 false Cockpit check procedure. 121.315 Section 121.315 Aeronautics and Space FEDERAL AVIATION ADMINISTRATION, DEPARTMENT OF TRANSPORTATION (CONTINUED... emergencies. The procedures must be designed so that a flight crewmember will not need to rely upon his memory...

  10. REACH. Teacher's Guide Volume II. Check Points.

    ERIC Educational Resources Information Center

    Georgia Univ., Athens. Div. of Vocational Education.

    Designed for use with individualized instructional units (CE 026 345-347, CE 026 349-351) in the REACH (Refrigeration, Electro-Mechanical, Air-Conditioning, Heating) electromechanical cluster, this second volume of the postsecondary teacher guide contains the check points which the instructor may want to refer to when the unit sheet directs the…

  11. Using computer models to design gully erosion control structures for humid northern Ethiopia

    USDA-ARS?s Scientific Manuscript database

    Classic gully erosion control measures such as check dams have been unsuccessful in halting gully formation and growth in the humid northern Ethiopian highlands. Gullies are typically formed in vertisols and flow often bypasses the check dams as elevated groundwater tables make gully banks unstable....

  12. AUTOMOTIVE DIESEL MAINTENANCE 2. UNIT XXV, MICHIGAN/CLARK TRANSMISSION--TROUBLESHOOTING.

    ERIC Educational Resources Information Center

    Minnesota State Dept. of Education, St. Paul. Div. of Vocational and Technical Education.

    THIS MODULE OF A 25-MODULE COURSE IS DESIGNED TO DEVELOP AN UNDERSTANDING OF TROUBLESHOOTING PROCEDURES FOR A SPECIFIC TRANSMISSION USED ON DIESEL POWERED EQUIPMENT. TOPICS ARE (1) PRELIMINARY CHECKS, (2) PRESSURE AND OIL FLOW CHECKS, (3) TROUBLESHOOTING TABLES, (4) TROUBLESHOOTING VEHICLES UNDER FIELD CONDITIONS, AND (5) ANALYZING UNACCEPTABLE…

  13. OntoCheck: verifying ontology naming conventions and metadata completeness in Protégé 4

    PubMed Central

    2012-01-01

Background Although policy providers have outlined minimal metadata guidelines and naming conventions, ontologies of today still display inter- and intra-ontology heterogeneities in class labelling schemes and metadata completeness. This fact is at least partially due to missing or inappropriate tools. Software support can ease this situation and contribute to overall ontology consistency and quality by helping to enforce such conventions. Objective We provide a plugin for the Protégé Ontology editor to allow for easy checks on compliance with ontology naming conventions and metadata completeness, as well as curation where violations are found. Implementation In a requirement analysis, derived from a prior standardization approach carried out within the OBO Foundry, we investigate the needed capabilities for software tools to check, curate and maintain class naming conventions. A Protégé tab plugin was implemented accordingly using the Protégé 4.1 libraries. The plugin was tested on six different ontologies. Based on these test results, the plugin could be refined, also by the integration of new functionalities. Results The new Protégé plugin, OntoCheck, allows for ontology tests to be carried out on OWL ontologies. In particular the OntoCheck plugin helps to clean up an ontology with regard to lexical heterogeneity, i.e. enforcing naming conventions and metadata completeness, meeting most of the requirements outlined for such a tool. Found test violations can be corrected to foster consistency in entity naming and meta-annotation within an artefact. Once specified, check constraints like name patterns can be stored and exchanged for later re-use. Here we describe a first version of the software, illustrate its capabilities and use within running ontology development efforts and briefly outline improvements resulting from its application. Further, we discuss OntoCheck's capabilities in the context of related tools and highlight potential future expansions.
Conclusions The OntoCheck plugin facilitates labelling error detection and curation, contributing to lexical quality assurance in OWL ontologies. Ultimately, we hope this Protégé extension will ease ontology alignments as well as lexical post-processing of annotated data and hence can increase overall secondary data usage by humans and computers. PMID:23046606
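The kind of label-pattern and metadata-completeness checks OntoCheck performs can be sketched in miniature. The real plugin operates on OWL ontologies through the Protégé 4.1 libraries; the toy data model, label pattern, and class records below are assumptions made for illustration only.

```python
# Toy model of OntoCheck-style tests: verify that class labels follow a
# lowercase-with-spaces convention and that required metadata (here just a
# textual definition) is present. Pattern and data are hypothetical.
import re

LABEL_PATTERN = re.compile(r"^[a-z][a-z0-9 -]*$")

def check_classes(classes):
    """classes: list of {'label': str, 'definition': str or None}.
    Return a list of human-readable convention violations."""
    violations = []
    for c in classes:
        if not LABEL_PATTERN.match(c["label"]):
            violations.append(f"bad label: {c['label']!r}")
        if not c.get("definition"):
            violations.append(f"missing definition: {c['label']!r}")
    return violations

classes = [
    {"label": "heart valve", "definition": "a valve of the heart"},
    {"label": "HeartChamber", "definition": None},  # violates both checks
]
print(check_classes(classes))
```

As in OntoCheck, the value of such a test lies in reporting violations for curation rather than silently rewriting labels, and in the pattern being a stored, exchangeable constraint.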

  14. Slit-check dams for the control of debris flow

    NASA Astrophysics Data System (ADS)

    Armanini, Aronne; Larcher, Michele

    2017-04-01

Debris flows are paroxysmal events that mobilize huge quantities of sediment, alongside water, in a very short time, producing solid and liquid discharges so large that they may exceed the capacity of existing torrent-restoration works. In this respect, climate change forcing cannot be ignored. In the majority of urbanized areas, which are generally the most vulnerable, there is often not enough space to create channelling works able to let the volumes pass through without overflowing. The simplest, least expensive and most sustainable solution consists in reducing the peak solid discharge by creating storage areas upstream of the settlements, typically upstream of the alluvial fans, allowing for reduced works of canalization that are compatible with the constraints imposed by urbanization. The general idea consists in storing part of the flowing solids during the peak of the hydrograph and releasing it in a later phase or during minor floods. For this purpose, and in order to optimize the reduction of the solid peak discharge, the deposition basins must be controlled by properly designed open check dams, capable of inducing significant sedimentation of the solid discharge only when it exceeds a design threshold value. A correct design of the check dam is crucial in order to induce sedimentation in the right amount and at the right moment: too early a sedimentation might fill the volume before the peak, as with closed check dams, while too weak a sedimentation might not use the whole available volume. In both cases, the channelling works might not be sufficient to let all the flow pass through, compromising the safety of the settlement. To avoid this, we propose the use of slit-check dams, whose efficiency has already been proved for bed load. Check dams are often designed only on the basis of the designer's experience.
Moreover, even today it is often believed that the filtering effect of open check dams is exerted through mechanical sieving, whereas it has been shown that the retention of solid material is instead due to a hydrodynamic effect induced by the narrowing of the section. In the case of debris flow as well, proper balances of liquid and solid mass and energy yield a rational criterion for designing the width of the slit so as to obtain a sediment deposit of the desired elevation for a given design discharge. In this way the use of the retention basin can be optimized so as to maximize the reduction of the debris-flow peak discharge. Flume experiments carried out in steady conditions at the University of Trento confirmed the predictions of the theory with good agreement. As in the case of ordinary sediment transport, clogging induced by vegetal material represents the major problem for the operational reliability of these systems and therefore needs to be investigated further.

  15. A new dataset validation system for the Planetary Science Archive

    NASA Astrophysics Data System (ADS)

    Manaud, N.; Zender, J.; Heather, D.; Martinez, S.

    2007-08-01

The Planetary Science Archive is the official archive for the Mars Express mission. It received its first data at the end of 2004. These data are delivered by the PI teams to the PSA team as datasets formatted in conformance with the Planetary Data System (PDS). The PI teams are responsible for analyzing and calibrating the instrument data as well as for producing the reduced and calibrated data. They are also responsible for the scientific validation of these data. ESA is responsible for long-term data archiving and distribution to the scientific community and must ensure, in this regard, that all archived products meet quality standards. To do so, an archive peer review is used to control the quality of the Mars Express science data archiving process; however, a full validation of its content is missing. An independent review board recently recommended that the completeness of the archive as well as the consistency of the delivered data be validated following well-defined procedures. A new validation software tool is being developed to complete the overall data quality control system functionality. This new tool aims to improve the quality of data and services provided to the scientific community through the PSA, and shall make it possible to track anomalies in datasets and to control their completeness. It shall ensure that PSA end-users: (1) can rely on the results of their queries, (2) will get data products that are suitable for scientific analysis, and (3) can find all science data acquired during a mission. We define dataset validation as the verification and assessment process that checks the dataset content against pre-defined top-level criteria representing the general characteristics of good-quality datasets. The dataset content that is checked includes the data and all types of information that are essential in the process of deriving scientific results, as well as those interfacing with the PSA database.
The validation software tool is a multi-mission tool that has been designed to provide the user with the flexibility of defining and implementing various types of validation criteria, to iteratively and incrementally validate datasets, and to generate validation reports.
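The top-level criteria described above (completeness of the archive, consistency between what a dataset declares and what it contains) lend themselves to simple rule-based checks. The sketch below illustrates two such checks on a toy dataset model; the component names and the criteria are hypothetical examples, not the actual PDS or PSA validation rules.

```python
# Sketch of rule-based dataset validation in the spirit of the PSA tool:
# each criterion is a small check that appends anomalies to a report,
# so new criteria can be added iteratively. Names below are illustrative.

def validate_dataset(files, labels):
    """files: set of component names present in the dataset.
    labels: {data_file_name: declared_row_count} parsed from labels.
    Return a list of anomalies against two example criteria:
    (1) required components exist, (2) labels refer to existing files."""
    anomalies = []
    for required in ("CATALOG", "INDEX", "DOCUMENT"):
        if required not in files:
            anomalies.append(f"missing required component: {required}")
    for fname in labels:
        if fname not in files:
            anomalies.append(f"label refers to missing file: {fname}")
    return anomalies

files = {"CATALOG", "INDEX", "DATA.TAB"}
labels = {"DATA.TAB": 100, "IMG001.IMG": 10}
print(validate_dataset(files, labels))
```

Keeping each criterion independent matches the tool's stated design goal of defining and implementing validation criteria flexibly and incrementally.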

  16. Strength evaluation of prosthetic check sockets, copolymer sockets, and definitive laminated sockets.

    PubMed

    Gerschutz, Maria J; Haynes, Michael L; Nixon, Derek; Colvin, James M

    2012-01-01

    A prosthesis encounters loading through forces and torques exerted by the person with amputation. International Organization for Standardization (ISO) standard 10328 was designed to test most lower-limb prosthetic components. However, this standard does not include prosthetic sockets. We measured static failure loads of prosthetic sockets using a modified ISO 10328 and then compared them with the criteria set by this standard for other components. Check socket (CS) strengths were influenced by thickness, material choice, and fabrication method. Copolymer socket (CP) strengths depended on thickness and fabrication methods. A majority of the CSs and all of the CPs failed to pass the ISO 10328 ductile loading criterion. In contrast, the strengths of definitive laminated sockets (DLs) were influenced more by construction material and technique. A majority of the DLs failed to pass the ISO 10328 brittle loading criterion. Analyzing prosthetic sockets from a variety of facilities demonstrated that socket performance varies considerably between and within facilities. The results from this article provide a foundation for understanding the quality of prosthetic sockets, some insight into possible routes for improving the current care delivered to patients, and a comparative basis for future technology.

  17. Improving treatment plan evaluation with automation

    PubMed Central

    Covington, Elizabeth L.; Chen, Xiaoping; Younge, Kelly C.; Lee, Choonik; Matuszak, Martha M.; Kessler, Marc L.; Keranen, Wayne; Acosta, Eduardo; Dougherty, Ashley M.; Filpansick, Stephanie E.

    2016-01-01

    The goal of this work is to evaluate the effectiveness of the Plan‐Checker Tool (PCT), which was created to improve first‐time plan quality, reduce patient delays, increase the efficiency of our electronic workflow, and standardize and automate the physics plan review in the treatment planning system (TPS). PCT uses an application programming interface to check and compare data from the TPS and the treatment management system (TMS). PCT includes a comprehensive checklist of automated and manual checks that are documented when performed by the user as part of a plan-readiness check for treatment. Prior to and during PCT development, errors identified during the physics review and causes of patient treatment start delays were tracked to prioritize which checks should be automated. Nineteen of 33 checklist items were automated, with data extracted by PCT. There was a 60% reduction in the number of patient delays in the six months after the PCT release. PCT was successfully implemented for use on all external beam treatment plans in our clinic. While the number of errors found during the physics check did not decrease, automation of checks increased the visibility of errors during the physics check, which led to decreased patient delays. The methods used here can be applied to any TMS and TPS that allows queries of the database. PACS number(s): 87.55.‐x, 87.55.N‐, 87.55.Qr, 87.55.tm, 89.20.Bb PMID:27929478
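The mixed checklist of automated and manual checks can be sketched as below. This is a minimal, hypothetical illustration, not PCT's actual implementation or API: the check names and the TPS/TMS field names are invented, and a real tool would query both systems programmatically rather than take values as arguments.

```python
# Hypothetical sketch of a plan-readiness checklist: automated checks compare
# values exported from the planning system (TPS) and the management system
# (TMS); manual checks are documented when signed off by the user.

class Checklist:
    def __init__(self):
        self.results = {}

    def automated(self, name, tps_value, tms_value):
        """An automated check passes only if the TPS and TMS values agree."""
        self.results[name] = {"type": "automated", "passed": tps_value == tms_value}

    def manual(self, name, performed_by):
        """A manual check is recorded as documented when the user signs it off."""
        self.results[name] = {"type": "manual", "passed": True, "by": performed_by}

    def ready_for_treatment(self):
        """The plan is ready only when every checklist item has passed."""
        return all(r["passed"] for r in self.results.values())

# Illustrative field names and values, not a real TPS/TMS export:
tps = {"prescription_dose_cGy": 7800, "fractions": 39}
tms = {"prescription_dose_cGy": 7800, "fractions": 39}

cl = Checklist()
cl.automated("prescription dose matches", tps["prescription_dose_cGy"], tms["prescription_dose_cGy"])
cl.automated("fraction count matches", tps["fractions"], tms["fractions"])
cl.manual("plan reviewed against directive", performed_by="physicist")
```

The design point is that a failed automated comparison blocks plan readiness immediately, making discrepancies visible before the patient's first treatment rather than during it.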

  18. GPCRs from fusarium graminearum detection, modeling and virtual screening - the search for new routes to control head blight disease.

    PubMed

    Bresso, Emmanuel; Togawa, Roberto; Hammond-Kosack, Kim; Urban, Martin; Maigret, Bernard; Martins, Natalia Florencio

    2016-12-15

    Fusarium graminearum (FG) is one of the major cereal-infecting pathogens, causing high economic losses worldwide and adverse effects on human and animal health. The development of new fungicides against FG is therefore an important issue for reducing cereal infection and economic impact. In the strategy for developing new fungicides, a critical step is the identification of new targets against which innovative chemical weapons can be designed. As several G-protein coupled receptors (GPCRs) are implicated in signaling pathways critical for fungal development and survival, such proteins could be valuable targets for reducing Fusarium growth and therefore preventing food contamination. In this study, GPCRs were predicted in the FG proteome using a manually curated pipeline dedicated to the identification of GPCRs. Based on several successive filters, the most appropriate GPCR candidate target for developing new fungicides was selected. Searching for new compounds blocking this target requires knowledge of its 3D structure. As no experimental X-ray structure of the selected protein was available, a 3D model was built by homology modeling. The model quality and stability were checked by 100 ns of molecular dynamics simulations prior to the virtual screening step. Two stable conformations representative of the conformational families of the protein were extracted from the 100 ns simulation and used for an ensemble docking campaign, in which a chemical library of 11,000 compounds was docked to the GPCR model. Among these compounds, we selected the ten top-ranked nontoxic molecules to be experimentally tested to validate the in silico simulation.
This study provides an integrated process merging genomics, structural bioinformatics, and drug design to propose innovative solutions to a worldwide threat to grain producers and consumers.

  19. First experiences with the LHC BLM sanity checks

    NASA Astrophysics Data System (ADS)

    Emery, J.; Dehning, B.; Effinger, E.; Nordt, A.; Sapinski, M. G.; Zamantzas, C.

    2010-12-01

    Reliability concerns have driven the design of the Large Hadron Collider (LHC) Beam Loss Monitoring (BLM) system from the early study stage up to the present commissioning and the latest development of diagnostic tools. To protect the system against non-conformities, new automatic checks have been developed and implemented. These checks are regularly and systematically executed by the LHC operation team to ensure that the system status is "as good as new" after each test. The sanity checks are part of this strategy. They test the electrical part of the detectors (ionisation chamber or secondary emission detector), their cable connections to the front-end electronics, the further connections to the back-end electronics, and their ability to request a beam abort. During installation and the early commissioning phase, these checks also proved able to find non-conformities caused by unexpected failure scenarios. In everyday operation, a non-conformity discovered by this check inhibits any further injections into the LHC until the check confirms the absence of non-conformities.

  20. Microspheres as resistive elements in a check valve for low pressure and low flow rate conditions.

    PubMed

    Ou, Kevin; Jackson, John; Burt, Helen; Chiao, Mu

    2012-11-07

    In this paper we describe a microsphere-based check valve integrated with a micropump. The check valve uses Ø20 μm polystyrene microspheres to rectify flow in low pressure and low flow rate applications (Re < 1). The microspheres form a porous medium in the check valve increasing fluidic resistance based on the direction of flow. Three check valve designs were fabricated and characterized to study the microspheres' effectiveness as resistive elements. A maximum diodicity (ratio of flow in the forward and reverse direction) of 18 was achieved. The pumping system can deliver a minimum flow volume of 0.25 μL and a maximum flow volume of 1.26 μL under an applied pressure of 0.2 kPa and 1 kPa, respectively. A proof-of-concept study was conducted using a pharmaceutical agent, docetaxel (DTX), as a sample drug showing the microsphere check valve's ability to limit diffusion from the micropump. The proposed check valve and pumping concept shows strong potential for implantable drug delivery applications with low flow rate requirements.
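The key figure of merit above, diodicity, is simply the ratio of forward to reverse flow under the same applied pressure. A small helper makes the arithmetic explicit; the flow volumes below are illustrative, not the paper's measured data.

```python
# Diodicity of a check valve: forward flow divided by reverse flow under the
# same driving pressure. Higher values mean better one-way (check) behaviour.

def diodicity(forward_flow_uL, reverse_flow_uL):
    """Ratio of forward to reverse flow volume (dimensionless)."""
    if reverse_flow_uL <= 0:
        raise ValueError("reverse flow must be positive to form a ratio")
    return forward_flow_uL / reverse_flow_uL

# Illustrative (not measured) volumes: a valve passing 1.26 uL forward but only
# 0.07 uL in reverse would have a diodicity of about 18.
print(diodicity(1.26, 0.07))
```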

  1. Space shuttle prototype check valve development

    NASA Technical Reports Server (NTRS)

    Tellier, G. F.

    1976-01-01

    Contaminant-resistant seal designs and a dynamically stable prototype check valve for the orbital maneuvering and reaction control helium pressurization systems of the space shuttle were developed. Polymer and carbide seal models were designed and tested. Perfluoroelastomers compatible with N2O4 and N2H4 types were evaluated and compared with Teflon in flat and captive seal models. Low load sealing and contamination resistance tests demonstrated cutter seal superiority over polymer seals. Ceramic and carbide materials were evaluated for N2O4 service using exposure to RFNA as a worst case screen; chemically vapor deposited tungsten carbide was shown to be impervious to the acid after 6 months immersion. A unique carbide shell poppet/cutter seat check valve was designed and tested to demonstrate low cracking pressure ( 2.0 psid), dynamic stability under all test bench flow conditions, contamination resistance (0.001 inch CRES wires cut with 1.5 pound seat load) and long life of 100,000 cycles (leakage 1.0 scc/hr helium from 0.1 to 400 psig).

  2. UTP and Temporal Logic Model Checking

    NASA Astrophysics Data System (ADS)

    Anderson, Hugh; Ciobanu, Gabriel; Freitas, Leo

    In this paper we give an additional perspective on the formal verification of programs through temporal logic model checking, using Hoare and He's Unifying Theories of Programming (UTP). Our perspective emphasizes the use of UTP designs, an alphabetised relational calculus expressed as a pre/postcondition pair of relations, to verify state or temporal assertions about programs. The temporal model checking relation is derived from a satisfaction relation between the model and its properties. The contribution of this paper is that it shows a UTP perspective on temporal logic model checking. The approach includes the notion of efficiency found in traditional model checkers, which reduce the state explosion problem through the use of efficient data structures.
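The satisfaction relation between a model and its properties can be illustrated with a toy explicit-state checker. This is a generic illustration of model checking a safety property (an invariant over all reachable states), not the UTP construction of the paper: the model is reduced to an initial-state set plus a successor function, and a violation is reported with a counterexample path.

```python
# Toy explicit-state model checker: breadth-first search of the reachable
# state space, checking that an invariant (safety property) holds everywhere.

from collections import deque

def check_invariant(initial_states, successors, invariant):
    """Return (True, None) if the invariant holds in every reachable state,
    or (False, path) where path is a counterexample to a violating state."""
    visited = set(initial_states)
    queue = deque((s, [s]) for s in initial_states)
    while queue:
        state, path = queue.popleft()
        if not invariant(state):
            return False, path            # property violated: report the trace
        for nxt in successors(state):
            if nxt not in visited:        # visited set keeps the search finite
                visited.add(nxt)
                queue.append((nxt, path + [nxt]))
    return True, None

# A counter that wraps at 4 never reaches 7 (invariant holds) ...
ok, _ = check_invariant([0], lambda s: [(s + 1) % 4], lambda s: s != 7)
# ... but a counter that runs to 10 violates "state < 5", with a witness path.
bad, trace = check_invariant([0], lambda s: [s + 1] if s < 10 else [], lambda s: s < 5)
```

The `visited` set is the crude analogue of the efficient state-space representations mentioned above: without it, even this tiny search could revisit states indefinitely.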

  3. Network design and quality checks in automatic orientation of close-range photogrammetric blocks.

    PubMed

    Dall'Asta, Elisa; Thoeni, Klaus; Santise, Marina; Forlani, Gianfranco; Giacomini, Anna; Roncella, Riccardo

    2015-04-03

    Due to recent improvements in automatic measurement procedures in photogrammetry, multi-view 3D reconstruction technologies are becoming a favourite survey tool. A rapidly widening range of structure-from-motion (SfM) software packages offers significantly easier image processing workflows than traditional photogrammetry packages. However, while most orientation and surface reconstruction strategies will almost always succeed in any given task, estimating the quality of the result is, to some extent, still an open issue. An assessment of the precision and reliability of block orientation is necessary and should be included in every processing pipeline. Such a need was clearly felt in the results of close-range photogrammetric surveys of in situ full-scale and laboratory-scale experiments. In order to study the impact of block control and camera network design on block orientation accuracy, a series of Monte Carlo simulations was performed. Two image block configurations were investigated: a single pseudo-normal strip and a circular, highly convergent block. The influence of surveying and data processing choices, such as the number and accuracy of the ground control points, autofocus, and camera calibration, was investigated. The research highlights the most significant aspects and processes to be taken into account for adequate in situ and laboratory surveys when modern SfM software packages are used, and evaluates their effect on the quality of the surface reconstruction results.
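The Monte Carlo approach above can be sketched in miniature. This is not the authors' simulation: the "block" is reduced to a 1D least-squares line fit, and the point is only the general pattern, in which control measurements are perturbed with random noise many times, the unknowns are re-estimated each time, and the spread of the estimates is taken as the precision of the solution.

```python
# Minimal Monte Carlo precision estimate: perturb observations with Gaussian
# noise, re-estimate a parameter (here, a line's slope) per trial, and report
# the standard deviation of the estimates as the parameter's precision.

import random
import statistics

def monte_carlo_precision(noise_sigma, n_trials=2000, n_points=10):
    xs = list(range(n_points))
    true_slope, true_intercept = 0.5, 2.0      # arbitrary ground truth
    slopes = []
    for _ in range(n_trials):
        ys = [true_slope * x + true_intercept + random.gauss(0, noise_sigma) for x in xs]
        # closed-form least-squares slope estimate
        mx, my = statistics.mean(xs), statistics.mean(ys)
        num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        den = sum((x - mx) ** 2 for x in xs)
        slopes.append(num / den)
    return statistics.stdev(slopes)

# Doubling the measurement noise should roughly double the spread of the
# estimates, mirroring how ground-control accuracy propagates into the block.
```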

  4. An audit of level two and level three checks of anaesthesia delivery systems performed at three hospitals in South Australia.

    PubMed

    Sweeney, N; Owen, H; Fronsko, R; Hurlow, E

    2012-11-01

    Anaesthetists may subject patients to unnecessary risk by not checking anaesthetic equipment thoroughly before use. Numerous adverse events have been associated with failure to check equipment. The Australian and New Zealand College of Anaesthetists and anaesthetic delivery system manufacturers have made recommendations on how anaesthetic equipment should be maintained and checked before use, and on the training required for staff who use such equipment. These recommendations are made to minimise the risk to patients undergoing anaesthesia. This prospective audit investigated the adherence of anaesthetic practitioners to a selection of those recommendations. Covert observations of anaesthetic practitioners were made while they were checking their designated anaesthetic machine, either at the beginning of a day's list or between cases. Structured interviews with staff who check the anaesthetic machine were carried out to determine the training they had received. The results indicated poor compliance with the recommendations: significantly, the backup oxygen cylinders' pressure/contents were not checked in 45% of observations; the emergency ventilation device was not checked in 67% of observations; the breathing circuit was not tested between patients in 79% of observations; the checks performed were not documented in any case; and no assessment or accreditation of the staff who performed these checks was carried out. It was concluded that the poor compliance was a system failing and that patient safety might be increased by training and accrediting staff responsible for checking equipment, documenting the checks performed, and formulating and using a checklist.

  5. 40 CFR 75.59 - Certification, quality assurance, and quality control record provisions.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... and the run average); (B) The raw data and results for all required pre-test, post-test, pre-run and...-day calibration error tests, all daily system integrity checks (Hg monitors, only), and all off-line calibration demonstrations, including any follow-up tests after corrective action: (i) Component-system...

  6. Planning and Implementing Total Quality Management in the Royal Australian Air Force: A Multiple Case Study Analysis

    DTIC Science & Technology

    1990-09-01

    change barriers, and necessary checks and balances built into processes. Furthermore, this assessment should address management system variables which...organisation’s immediate product and their worklife. Focus must be maintained on improving RAAF processes. In addition to a quality committee structure as

  7. Evaluation and use of the Materials and Test (MATT) Data System for quality of construction and management review : final report.

    DOT National Transportation Integrated Search

    1985-12-01

    This report documents the review of the MATerials and Test (MATT) Data System to check the validity of data within the system. A computer program to generate the quality level of a construction material was developed. Programs were also developed to ...

  8. Radiation Safety and Quality Assurance in North American Dental Schools.

    ERIC Educational Resources Information Center

    Farman, Allan G.; Hines, Vickie G.

    1986-01-01

    A survey of dental schools that revealed processing quality control and routine maintenance checks on x-ray generators are being carried out in a timely manner is discussed. However, methods for reducing patient exposure to radiation are not being fully implemented, and some dental students are being exposed to x-rays. (Author/MLW)

  9. Software Quality Control at Belle II

    NASA Astrophysics Data System (ADS)

    Ritter, M.; Kuhr, T.; Hauth, T.; Gebard, T.; Kristof, M.; Pulvermacher, C.; Belle II Software Group

    2017-10-01

    Over the last seven years, the software stack of the next-generation B factory experiment Belle II has grown to over one million lines of C++ and Python code, counting only the part included in offline software releases. There are several thousand commits to the central repository by about 100 individual developers per year. Keeping a coherent software stack of high quality, so that it can be sustained and used efficiently for data acquisition, simulation, reconstruction, and analysis over the lifetime of the Belle II experiment, is a challenge. A set of tools is employed to monitor the quality of the software and provide fast feedback to the developers. They are integrated in a machinery that is controlled by a buildbot master and automates the quality checks. The tools include different compilers, cppcheck, the clang static analyzer, valgrind memcheck, doxygen, a geometry overlap checker, a check for missing or extra library links, unit tests, steering-file-level tests, a sophisticated high-level validation suite, and an issue tracker. The technological development infrastructure is complemented by organizational means of coordinating the development.

  10. Evaluation of the user seal check on gross leakage detection of 3 different designs of N95 filtering facepiece respirators.

    PubMed

    Lam, Simon C; Lui, Andrew K F; Lee, Linda Y K; Lee, Joseph K L; Wong, K F; Lee, Cathy N Y

    2016-05-01

    The use of N95 respirators prevents the spread of respiratory infectious agents, but leakage hampers their protection. Manufacturers recommend a user seal check to identify on-site gross leakage, but no empirical evidence for its validity has been provided. This study therefore examines the validity of the user seal check for gross leakage detection in commonly used types of N95 respirators. A convenience sample of 638 nursing students was recruited. On the wearing of 3 different designs of N95 respirators, namely 3M-1860s, 3M-1862, and Kimberly-Clark 46827, the standardized user seal check procedure was carried out to identify gross leakage. Leakage testing was then repeated with a quantitative fit testing (QNFT) device while performing normal breathing and deep breathing exercises. Sensitivity, specificity, predictive values, and likelihood ratios were calculated accordingly. As indicated by QNFT, the prevalence of actual gross leakage was 31.0%-39.2% with the 3M respirators and 65.4%-65.8% with the Kimberly-Clark respirator. Sensitivity and specificity of the user seal check for identifying actual gross leakage were approximately 27.7% and 75.5% for 3M-1860s, 22.1% and 80.5% for 3M-1862, and 26.9% and 80.2% for Kimberly-Clark 46827, respectively. Likelihood ratios were close to 1 (range, 0.89-1.51) for all types of respirators. The results do not support the user seal check's ability to detect actual gross leakage in the donning of N95 respirators. However, such a check might alert health care workers that donning a tight-fitting respirator should be performed carefully. Copyright © 2016 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Elsevier Inc. All rights reserved.
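The figures above follow from a standard 2×2 diagnostic table comparing the seal check against the QNFT reference. The helper below makes the definitions explicit; the counts are made up (chosen only to roughly reproduce the 3M-1860s percentages), not the study's actual data.

```python
# Diagnostic accuracy from a 2x2 table: the user seal check is the index test,
# QNFT is the reference standard for actual gross leakage.

def diagnostic_stats(tp, fp, fn, tn):
    sens = tp / (tp + fn)        # seal check flags a leak when QNFT shows one
    spec = tn / (tn + fp)        # seal check passes when QNFT shows no leak
    lr_pos = sens / (1 - spec)   # likelihood ratio of a positive seal check
    lr_neg = (1 - sens) / spec   # likelihood ratio of a negative seal check
    return {"sensitivity": sens, "specificity": spec, "LR+": lr_pos, "LR-": lr_neg}

# Illustrative counts only (approximating the 3M-1860s row of the study):
stats = diagnostic_stats(tp=28, fp=25, fn=73, tn=77)
```

Likelihood ratios near 1, as reported in the study, mean a positive or negative seal check barely changes the odds that a leak is actually present, which is the article's central finding.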

  11. SU-F-T-272: Patient Specific Quality Assurance of Prostate VMAT Plans with Portal Dosimetry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Darko, J; Osei, E; University of Waterloo, Waterloo, ON

    Purpose: To evaluate the effectiveness of using the Portal Dosimetry (PD) method for patient-specific quality assurance of prostate VMAT plans. Methods: As per institutional protocol, all VMAT plans were measured using the Varian Portal Dosimetry (PD) method. A gamma evaluation criterion of 3%-3mm with a minimum area gamma pass rate (gamma <1) of 95% is used clinically for all plans. We retrospectively evaluated the portal dosimetry results for 170 prostate patients treated with the VMAT technique. Three sets of criteria were adopted for re-evaluating the measurements: 3%-3mm, 2%-2mm, and 1%-1mm. For all criteria, two areas, Field+1cm and MLC-CIAO, were analysed. To ascertain the effectiveness of the portal dosimetry technique in determining the delivery accuracy of prostate VMAT plans, 10 patients previously measured with portal dosimetry were randomly selected and their measurements repeated using the ArcCHECK method. The same criteria used in the analysis of PD were used for the ArcCHECK measurements. Results: All patient plans reviewed met the institutional criteria for area gamma pass rate. Overall, the gamma pass rate (gamma <1) decreases from the 3%-3mm to the 2%-2mm and 1%-1mm criteria. For each criterion, the pass rate was significantly reduced when MLC-CIAO was used instead of Field+1cm. There was a noticeable change in sensitivity for MLC-CIAO with the 2%-2mm criterion and a much more significant reduction at 1%-1mm. Comparable results were obtained for the ArcCHECK measurements. Although differences were observed between the clockwise and counter-clockwise plans in both the PD and ArcCHECK measurements, these were not deemed statistically significant. Conclusion: This work demonstrates that the Portal Dosimetry technique can be effectively used for quality assurance of VMAT plans. Results obtained show similar sensitivity compared to ArcCHECK.
To reveal certain delivery inaccuracies, the use of a combination of criteria may provide an effective way of improving the overall sensitivity of PD. Funding provided in part by the Prostate Ride for Dad, Kitchener-Waterloo, Canada.
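The gamma criterion used above combines a dose-difference tolerance with a distance-to-agreement tolerance. The sketch below is a simplified 1D global-gamma illustration, not the clinical 2D portal-dosimetry implementation: for each reference point, gamma is the minimum over evaluated points of the combined metric, and a point passes when gamma is below 1.

```python
# Simplified 1D gamma evaluation: a reference point passes if some evaluated
# point is close enough in both dose (percent of global max) and distance (mm).

import math

def gamma_pass_rate(ref, evl, spacing_mm, dose_tol_pct, dist_tol_mm):
    d_max = max(ref)                           # global dose normalisation
    dose_tol = dose_tol_pct / 100.0 * d_max
    passed = 0
    for i, dr in enumerate(ref):
        best = math.inf
        for j, de in enumerate(evl):
            dose_term = (de - dr) / dose_tol
            dist_term = (j - i) * spacing_mm / dist_tol_mm
            best = min(best, math.hypot(dose_term, dist_term))
        passed += best < 1.0                   # gamma < 1 counts as a pass
    return 100.0 * passed / len(ref)

# Identical profiles pass everywhere at 3%/3 mm; a 10% dose error does not.
profile = [0.0, 0.2, 0.6, 1.0, 0.6, 0.2, 0.0]
rate = gamma_pass_rate(profile, profile, spacing_mm=1.0, dose_tol_pct=3, dist_tol_mm=3)
```

Tightening `dose_tol_pct` and `dist_tol_mm` from 3%-3mm to 1%-1mm shrinks the acceptance ellipse per point, which is why the pass rates in the study fall as the criteria tighten.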

  12. Using the Benford's Law as a First Step to Assess the Quality of the Cancer Registry Data.

    PubMed

    Crocetti, Emanuele; Randi, Giorgia

    2016-01-01

    Benford's law states that the distribution of the first digit different from 0 [the first significant digit (FSD)] in many collections of numbers is not uniform. The aim of this study is to evaluate whether population-based cancer incidence rates follow Benford's law, and whether this can be used in their data quality check process. We sampled 43 population-based cancer registry populations (CRPs) from Cancer Incidence in Five Continents, volume X (CI5-X). The distribution of cancer incidence rate FSDs was evaluated overall, by sex, and by CRP. Several statistics, including Pearson's coefficient of correlation and distance measures, were applied to check adherence to Benford's law. In the whole dataset (146,590 incidence rates) and for each sex (70,722 male and 75,868 female incidence rates), the FSD distributions were Benford-like. The coefficient of correlation between the observed and expected FSD distributions was extremely high (0.999), and the distance measures were low. Considering single CRPs (from 933 to 7,222 incidence rates each), the results were in agreement with Benford's law, and only a few CRPs showed possible discrepancies from it. This study demonstrates for the first time that cancer incidence rates follow Benford's law; this characteristic can be used as a new, simple, and objective tool in data quality evaluation. The analyzed data had already been checked for publication in CI5-X, so their quality was expected to be good. In fact, only for a few CRPs were several statistics consistent with possible violations.
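Benford's law predicts P(d) = log₁₀(1 + 1/d) for the first significant digit d in 1..9, and the study's correlation check can be sketched directly from that formula. This is a minimal illustration of the method, not the authors' code; the FSD-extraction trick via scientific notation is one convenient choice among several.

```python
# First-significant-digit (FSD) check against Benford's law: extract the FSD
# of each value and correlate the observed digit distribution with the
# expected probabilities log10(1 + 1/d).

import math
from collections import Counter

BENFORD = {d: math.log10(1 + 1 / d) for d in range(1, 10)}

def first_significant_digit(x):
    """First digit different from 0, e.g. 0.00482 -> 4."""
    s = f"{abs(x):.15e}"          # scientific notation puts the FSD first
    return int(s[0])

def benford_correlation(values):
    """Pearson correlation between observed and expected FSD frequencies."""
    counts = Counter(first_significant_digit(v) for v in values if v != 0)
    n = sum(counts.values())
    obs = [counts.get(d, 0) / n for d in range(1, 10)]
    exp = [BENFORD[d] for d in range(1, 10)]
    mo, me = sum(obs) / 9, sum(exp) / 9
    num = sum((o - mo) * (e - me) for o, e in zip(obs, exp))
    den = math.sqrt(sum((o - mo) ** 2 for o in obs) * sum((e - me) ** 2 for e in exp))
    return num / den
```

A correlation near 1, as the study found for the pooled incidence rates, indicates a Benford-like FSD distribution; markedly lower values would flag a registry for closer inspection.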

  13. The effect of on/off indicator design on state confusion, preference, and response time performance, executive summary

    NASA Technical Reports Server (NTRS)

    Donner, Kimberly A.; Holden, Kritina L.; Manahan, Meera K.

    1991-01-01

    Five designs of software-based ON/OFF indicators were investigated in a hypothetical Space Station Power System monitoring task. The hardware equivalent of the indicators used in the present study is the traditional indicator light that illuminates an ON label or an OFF label. The coding methods used to represent the active state were reverse video, color, frame, check, or reverse video with check. Display background color was also varied. Subjects made judgments concerning the state of indicators, resulting in very low error rates and high percentages of agreement across indicator designs. Response time measures for the five indicator designs did not differ significantly, although subjects reported that color was the best communicator. The impact of these results on indicator design is discussed.

  14. Whole grain cereals for the primary or secondary prevention of cardiovascular disease.

    PubMed

    Kelly, Sarah Am; Hartley, Louise; Loveman, Emma; Colquitt, Jill L; Jones, Helen M; Al-Khudairy, Lena; Clar, Christine; Germanò, Roberta; Lunn, Hannah R; Frost, Gary; Rees, Karen

    2017-08-24

    There is evidence from observational studies that whole grains can have a beneficial effect on risk for cardiovascular disease (CVD). Earlier versions of this review found mainly short-term intervention studies. There are now longer-term randomised controlled trials (RCTs) available. This is an update and expansion of the original review conducted in 2007. The aim of this systematic review was to assess the effect of whole grain foods or diets on total mortality, cardiovascular events, and cardiovascular risk factors (blood lipids, blood pressure) in healthy people or people who have established cardiovascular disease or related risk factors, using all eligible RCTs. We searched CENTRAL (Issue 8, 2016) in the Cochrane Library, MEDLINE (1946 to 31 August 2016), Embase (1980 to week 35 2016), and CINAHL Plus (1937 to 31 August 2016) on 31 August 2016. We also searched ClinicalTrials.gov on 5 July 2017 and the World Health Organization International Clinical Trials Registry Platform (WHO ICTRP) on 6 July 2017. We checked reference lists of relevant articles and applied no language restrictions. We selected RCTs assessing the effects of whole grain foods or diets containing whole grains compared to foods or diets with a similar composition, over a minimum of 12 weeks, on cardiovascular disease and related risk factors. Eligible for inclusion were healthy adults, those at increased risk of CVD, or those previously diagnosed with CVD. Two review authors independently selected studies. Data were extracted and quality-checked by one review author and checked by a second review author. A second review author checked the analyses. We assessed treatment effect using mean difference in a fixed-effect model and heterogeneity using the I² statistic and the Chi² test of heterogeneity. We assessed the overall quality of evidence using GRADE with GRADEpro software.
We included nine RCTs randomising a total of 1414 participants (age range 24 to 70; mean age 45 to 59, where reported) to whole grain versus lower whole grain or refined grain control groups. We found no studies that reported the effect of whole grain diets on total cardiovascular mortality or cardiovascular events (total myocardial infarction, unstable angina, coronary artery bypass graft surgery, percutaneous transluminal coronary angioplasty, total stroke). All included studies reported the effect of whole grain diets on risk factors for cardiovascular disease including blood lipids and blood pressure. All studies were in primary prevention populations and had an unclear or high risk of bias, and no studies had an intervention duration greater than 16 weeks. Overall, we found no difference between whole grain and control groups for total cholesterol (mean difference 0.07, 95% confidence interval -0.07 to 0.21; 6 studies (7 comparisons); 722 participants; low-quality evidence). Using GRADE, we assessed the overall quality of the available evidence on cholesterol as low. Four studies were funded by independent national and government funding bodies, while the remaining studies reported funding or partial funding by organisations with commercial interests in cereals. There is insufficient evidence from RCTs of an effect of whole grain diets on cardiovascular outcomes or on major CVD risk factors such as blood lipids and blood pressure. Trials were at unclear or high risk of bias with small sample sizes and relatively short-term interventions, and the overall quality of the evidence was low. There is a need for well-designed, adequately powered RCTs with longer durations assessing cardiovascular events as well as cardiovascular risk factors.
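The fixed-effect pooling and heterogeneity statistics used above can be sketched compactly. This is a generic inverse-variance sketch, not the review's GRADEpro analysis, and the study-level mean differences and standard errors below are illustrative, not the review's data.

```python
# Fixed-effect meta-analysis of mean differences: inverse-variance weighting,
# a 95% confidence interval, Cochran's Q, and the I-squared statistic.

import math

def fixed_effect_md(mds, ses):
    """Pool study mean differences (mds) with their standard errors (ses)."""
    ws = [1 / se**2 for se in ses]                       # inverse-variance weights
    pooled = sum(w * md for w, md in zip(ws, mds)) / sum(ws)
    se_pooled = math.sqrt(1 / sum(ws))
    q = sum(w * (md - pooled) ** 2 for w, md in zip(ws, mds))   # Cochran's Q
    df = len(mds) - 1
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0  # I-squared in percent
    return {
        "MD": pooled,
        "CI95": (pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled),
        "Q": q,
        "I2": i2,
    }

# Illustrative inputs only, not the review's studies:
result = fixed_effect_md(mds=[0.05, 0.10, 0.02], ses=[0.08, 0.10, 0.12])
```

A pooled confidence interval that straddles zero, as in the review's total-cholesterol result, is read as no demonstrated difference between the groups.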

  15. Enhancement of the Automated Quality Control Procedures for the International Soil Moisture Network

    NASA Astrophysics Data System (ADS)

    Heer, Elsa; Xaver, Angelika; Dorigo, Wouter; Messner, Romina

    2017-04-01

    In-situ soil moisture observations are still trusted as the most reliable data for validating remotely sensed soil moisture products. Thus, the quality of in-situ soil moisture observations is of high importance. The International Soil Moisture Network (ISMN; http://ismn.geo.tuwien.ac.at/) provides in-situ soil moisture data from all around the world. The data are collected from individual networks and data providers, measured by different sensors at various depths. The data sets, which are delivered in different units, time zones, and data formats, are then transformed into homogeneous data sets. Erroneous behavior of soil moisture data is very difficult to detect, due to annual and daily cycles and, most significantly, the strong influence of precipitation and snow melting processes. Only a few of the network providers have a quality assessment for their data sets. Therefore, advanced quality control procedures have been developed for the ISMN (Dorigo et al. 2013). Three categories of quality checks were introduced: exceeding boundary values, geophysical consistency checks, and a spectrum-based approach. The spectrum-based quality control algorithms aim to detect erroneous measurements that occur within plausible geophysical ranges, e.g. a sudden drop in soil moisture caused by a sensor malfunction. By defining several conditions that have to be met by the original soil moisture time series and their first and second derivatives, such error types can be detected. Since the development of these sophisticated methods, many more data providers have shared their data with the ISMN, and new types of erroneous measurements have been identified. Thus, an enhancement of the automated quality control procedures became necessary. In the present work, we introduce enhancements of the existing quality control algorithms. Additionally, six completely new quality checks have been developed, e.g.
detection of suspicious values before or after NaN values, of constant values, and of values that lie in a spectrum where a high majority of the values before and after are flagged, so that a sensor malfunction is certain. For the evaluation of the enhanced automated quality control system, many test data sets were chosen and manually validated, to be compared with the existing quality control procedures and the new algorithms. Improvements will be shown that assure an appropriate assessment of the ISMN data sets, which are used for validation of soil moisture data retrieved from satellites and are the foundation of many other scientific publications.
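A derivative-based check of the kind described can be sketched as follows. This is a hypothetical illustration, not the ISMN implementation: the thresholds are arbitrary placeholders, and the conditions on the first and second derivatives merely echo the general idea of flagging abrupt breaks that fall within plausible geophysical ranges.

```python
# Hypothetical spectrum-based check: flag a sudden drop in a soil-moisture
# series when the first derivative is strongly negative and the second
# derivative confirms an abrupt break rather than a smooth dry-down.

def flag_sudden_drops(series, d1_thresh=-0.1, d2_thresh=0.05):
    d1 = [b - a for a, b in zip(series, series[1:])]   # first derivative
    d2 = [b - a for a, b in zip(d1, d1[1:])]           # second derivative
    flags = []
    for i in range(1, len(d1)):
        if d1[i] < d1_thresh and abs(d2[i - 1]) > d2_thresh:
            flags.append(i + 1)                        # index of the suspicious value
    return flags

# A sensor glitch at index 4 is flagged; a smooth seasonal dry-down is not.
sm = [0.30, 0.29, 0.29, 0.28, 0.05, 0.05, 0.27, 0.26]
```

Combining the two derivatives is what separates a malfunction from a legitimate signal: rainfall and dry-downs change the first derivative too, but far more gradually than a sensor failure does.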

  16. 30 CFR 77.804 - High-voltage trailing cables; minimum design requirements.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... equipped with metallic shields around each power conductor with one or more ground conductors having a total cross-sectional area of not less than one-half the power conductor, and with an insulated conductor for the ground continuity check circuit. External ground check conductors may be used if they are...

  17. 76 FR 53348 - Airworthiness Directives; BAE SYSTEMS (Operations) Limited Model BAe 146 Airplanes and Model Avro...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-26

    ... Maintenance Manual (AMM) includes chapters 05-10 ``Time Limits'', 05-15 ``Critical Design Configuration... 05, ``Time Limits/Maintenance Checks,'' of BAe 146 Series/AVRO 146-RJ Series Aircraft Maintenance... Chapter 05, ``Time Limits/ Maintenance Checks,'' of the BAE SYSTEMS (Operations) Limited BAe 146 Series...

  18. 12 CFR Appendix E to Part 229 - Commentary

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... by another entity. The Board believes that the statutory proximity test was designed to apply to.... The EFA Act defines a certified check as one to which a bank has certified that the drawer's signature... by regulations.” The Board has defined check processing region as the territory served by one of the...

  19. 12 CFR Appendix E to Part 229 - Commentary

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... by another entity. The Board believes that the statutory proximity test was designed to apply to.... The EFA Act defines a certified check as one to which a bank has certified that the drawer's signature... by regulations.” The Board has defined check processing region as the territory served by one of the...

  20. 12 CFR Appendix E to Part 229 - Commentary

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... by another entity. The Board believes that the statutory proximity test was designed to apply to.... The EFA Act defines a certified check as one to which a bank has certified that the drawer's signature... by regulations.” The Board has defined check processing region as the territory served by one of the...

  1. 12 CFR Appendix E to Part 229 - Commentary

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... by another entity. The Board believes that the statutory proximity test was designed to apply to.... The EFA Act defines a certified check as one to which a bank has certified that the drawer's signature... by regulations.” The Board has defined check processing region as the territory served by one of the...

  2. "Check Your Smile", Prototype of a Collaborative LSP Website for Technical Vocabulary

    ERIC Educational Resources Information Center

    Yassine-Diab, Nadia; Alazard-Guiu, Charlotte; Loiseau, Mathieu; Sorin, Laurent; Orliac, Charlotte

    2016-01-01

    In a design-based research approach (Barab & Squire, 2004), we are currently developing the first prototype of a collaborative Language for Specific Purposes (LSP) website. It focuses on technical vocabulary to help students master any field of LSP better. "Check Your Smile" is a platform aggregating various types of gameplays for…

  3. Utilization of Computer Technology To Facilitate Money Management by Individuals with Mental Retardation.

    ERIC Educational Resources Information Center

    Davies, Daniel K.; Stock, Steven E.; Wehmeyer, Michael L.

    2003-01-01

    This report describes results of an initial investigation of the utility of a specially designed money management software program for improving management of personal checking accounts for individuals with mental retardation. Use with 19 adults with mental retardation indicated the software resulted in significant reduction in check writing and…

  4. 49 CFR 1562.23 - Aircraft operator and passenger requirements.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... designated by an aircraft operator under paragraph (a) of this section: (1) Must undergo a fingerprint-based... compliance with the fingerprint-based criminal history records check requirements of §§ 1542.209, 1544.229... a fingerprint-based criminal history records check that does not disclose that he or she has a...

  5. 49 CFR 1562.23 - Aircraft operator and passenger requirements.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... designated by an aircraft operator under paragraph (a) of this section: (1) Must undergo a fingerprint-based... compliance with the fingerprint-based criminal history records check requirements of §§ 1542.209, 1544.229... a fingerprint-based criminal history records check that does not disclose that he or she has a...

  6. Ice hockey shoulder pad design and the effect on head response during shoulder-to-head impacts.

    PubMed

    Richards, Darrin; Ivarsson, B Johan; Scher, Irving; Hoover, Ryan; Rodowicz, Kathleen; Cripton, Peter

    2016-11-01

    Ice hockey body checks involving direct shoulder-to-head contact frequently result in head injury. In the current study, we examined the effect of shoulder pad style on the likelihood of head injury from a shoulder-to-head check. Shoulder-to-head body checks were simulated by swinging a modified Hybrid-III anthropomorphic test device (ATD) with and without shoulder pads into a stationary Hybrid-III ATD at 21 km/h. Tests were conducted with three different styles of shoulder pads (traditional, integrated and tethered) and without shoulder pads for the purpose of control. Head response kinematics for the stationary ATD were measured. Compared to the case of no shoulder pads, the three different pad styles significantly (p < 0.05) reduced peak resultant linear head accelerations of the stationary ATD by 35-56%. The integrated shoulder pads reduced linear head accelerations by an additional 18-21% beyond the other two styles of shoulder pads. The data presented here suggest that shoulder pads can be designed to help protect the head of the struck player in a shoulder-to-head check.

  7. A national analytical quality assurance program: Developing guidelines and analytical tools for the forest inventory and analysis program

    Treesearch

    Phyllis C. Adams; Glenn A. Christensen

    2012-01-01

    A rigorous quality assurance (QA) process assures that the data and information provided by the Forest Inventory and Analysis (FIA) program meet the highest possible standards of precision, completeness, representativeness, comparability, and accuracy. FIA relies on its analysts to check the final data quality prior to release of a State’s data to the national FIA...

  8. Trust, but verify - Accuracy of clinical commercial radiation Treatment Planning Systems

    NASA Astrophysics Data System (ADS)

    Lehmann, J.; Kenny, J.; Lye, J.; Dunn, L.; Williams, I.

    2014-03-01

Computer based Treatment Planning Systems (TPS) are used worldwide to design and calculate treatment plans for radiation therapy patients. TPS are generally well designed and thoroughly tested by their developers and local physicists prior to clinical use. However, the wide-reaching impact of their accuracy warrants ongoing vigilance. This work reviews the findings of the Australian national audit system and provides recommendations for checks of TPS. The Australian Clinical Dosimetry Service (ACDS) has designed and implemented a national system of audits, currently in a three-year test phase. The Level III audits verify the accuracy of a beam model of a facility's TPS through a comparison of measurements with calculations at selected points in an anthropomorphic phantom. The plans are prescribed by the ACDS, and all measurement equipment is brought in for independent onsite measurements. In this first version of the audits, plans are comparatively simple, involving asymmetric fields, wedges and inhomogeneities. The ACDS has performed 14 Level III audits to date. Six audits returned at least one measurement at Action Level, indicating that the measured dose differed by more than 3.3% (but less than 5%) from the planned dose. Two audits failed (difference >5%). One fail was caused by a data transmission error coupled with quality assurance (QA) not being performed. The second fail was investigated and downgraded to Action Level after the onsite audit team found that the phantom setup at treatment was a contributing factor. The Action Level results are attributed to small dose calculation deviations within the TPS, which are investigated and corrected by the facilities. Small deviations in clinical TPS can accumulate and combine with output variations to produce unacceptable errors. Ongoing checks and independent audits are recommended.
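The tolerance scheme quoted in this audit summary (deviations above 3.3% at Action Level, above 5% a fail) amounts to a simple classifier. A minimal sketch, with the function name and interface assumed for illustration (this is not ACDS software):

```python
# Sketch of the audit tolerance levels described above (not ACDS code):
# deviations over 5% fail, over 3.3% are at Action Level, otherwise pass.
def audit_level(measured_dose, planned_dose):
    """Classify one measurement point by its percent deviation from plan."""
    deviation = abs(measured_dose - planned_dose) / planned_dose * 100.0
    if deviation > 5.0:
        return "fail"
    if deviation > 3.3:
        return "action level"
    return "pass"
```

Applied per measurement point, a single "fail" or "action level" result would flag the whole audit, matching how the findings above are reported.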

  9. In pursuit of quality by viable quality assurance system: the controllers' perceptions.

    PubMed

    Aziz, Anwar

    2011-01-01

Patients, families and communities expect safe, competent and compassionate nursing care, which has always been a core value of nursing. To meet these expectations, a valid and reliable quality assurance (QA) system is crucial to ensure that nurse graduates are competent, confident and fit to practice. Since the QA approach is seen as fundamental to quality improvement, it is appropriate to consider its influence on nursing education in Pakistan, where no such system currently exists to assure quality. The data are drawn from a qualitative case study conducted in 2004, in which a purposive sample of 71 nurses, including a group of Controllers, was interviewed on a one-to-one basis. Interviews were audiotaped to reduce the risk of misinterpretation and to capture responses exactly as spoken. A non-directive, semi-structured, open-ended questionnaire was used to collect the data. Thematic analysis of verbatim transcripts of the interviews was performed. The study findings reveal a unanimous desire among the nurses to gauge the quality of nursing education through an efficient and effective quality assurance system. A crucial need is felt to develop a viable quality assurance system that ensures an approved level of quality in nursing education and delivers the right care to the right patient at the right time, every time. A continuous quality assurance and improvement (CQAI) framework based on the Deming quality cycle (Plan, Do, Check, Act) could facilitate the design and development of such a mechanism.

  10. Rainfall, Streamflow, and Water-Quality Data During Stormwater Monitoring, Halawa Stream Drainage Basin, Oahu, Hawaii, July 1, 2003 to June 30, 2004

    USGS Publications Warehouse

    Young, Stacie T.M.; Ball, Marcael T.J.

    2004-01-01

    Storm runoff water-quality samples were collected as part of the State of Hawaii Department of Transportation Stormwater Monitoring Program. This program is designed to assess the effects of highway runoff and urban runoff on Halawa Stream. For this program, rainfall data were collected at two sites, continuous streamflow data at three sites, and water-quality data at five sites, which include the three streamflow sites. This report summarizes rainfall, streamflow, and water-quality data collected between July 1, 2003 and June 30, 2004. A total of 30 samples was collected over four storms during July 1, 2003 to June 30, 2004. In general, an attempt was made to collect grab samples nearly simultaneously at all five sites, and flow-weighted time-composite samples were collected at the three sites equipped with automatic samplers. However, all four storms were partially sampled because either not all stations were sampled or only grab samples were collected. Samples were analyzed for total suspended solids, total dissolved solids, nutrients, chemical oxygen demand, and selected trace metals (cadmium, copper, lead, and zinc). Grab samples were additionally analyzed for oil and grease, total petroleum hydrocarbons, fecal coliform, and biological oxygen demand. Quality-assurance/quality-control samples, collected during storms and during routine maintenance, were also collected to verify analytical procedures and check the effectiveness of equipment-cleaning procedures.

  11. A quality score for coronary artery tree extraction results

    NASA Astrophysics Data System (ADS)

    Cao, Qing; Broersen, Alexander; Kitslaar, Pieter H.; Lelieveldt, Boudewijn P. F.; Dijkstra, Jouke

    2018-02-01

Coronary artery trees (CATs) are often extracted to aid the fully automatic analysis of coronary artery disease on coronary computed tomography angiography (CCTA) images. Automatically extracted CATs often miss some arteries or include wrong extractions, which require manual corrections before performing successive steps. For analyzing a large number of datasets, a manual quality check of the extraction results is time-consuming. This paper presents a method to automatically calculate quality scores for extracted CATs in terms of the clinical significance of the extracted arteries and the completeness of the extracted CAT. Both right dominant (RD) and left dominant (LD) anatomical statistical models are generated and exploited in developing the quality score. To automatically determine which model should be used, a dominance type detection method is also designed. Experiments are performed on the automatically extracted and manually refined CATs from 42 datasets to evaluate the proposed quality score. In 39 (92.9%) cases, the proposed method scores the manually refined CATs higher than the automatically extracted CATs. On a 100-point scale, the average scores for automatically extracted and manually refined CATs are 82.0 (+/-15.8) and 88.9 (+/-5.4), respectively. The proposed quality score will assist the automatic processing of CAT extractions for large cohorts which contain both RD and LD cases. To the best of our knowledge, this is the first time that a general quality score for an extracted CAT has been presented.

  12. Knowledge-based critiquing of graphical user interfaces with CHIMES

    NASA Technical Reports Server (NTRS)

    Jiang, Jianping; Murphy, Elizabeth D.; Carter, Leslie E.; Truszkowski, Walter F.

    1994-01-01

CHIMES is a critiquing tool that automates the process of checking graphical user interface (GUI) designs for compliance with human factors design guidelines and toolkit style guides. The current prototype identifies instances of non-compliance and presents problem statements, advice, and tips to the GUI designer. Changes requested by the designer are made automatically, and the revised GUI is re-evaluated. A case study conducted at NASA-Goddard showed that CHIMES has the potential to dramatically reduce the time formerly spent in hands-on consistency checking. Capabilities recently added to CHIMES include exception handling and rule building. CHIMES is intended for use prior to usability testing, for example as a means of catching and correcting syntactic inconsistencies in a large user interface.
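The critiquing loop this abstract describes (apply guideline rules to a GUI design, collect problem statements) can be sketched as follows. The rule, the widget representation, and the message text are hypothetical, since the abstract does not give CHIMES' internals:

```python
# Illustrative sketch of rule-based GUI critiquing (not CHIMES code):
# each rule inspects one widget description and may return a problem statement.
def rule_missing_label(widget):
    if not widget.get("label"):
        return f"{widget['id']}: widget has no label (guideline: label all controls)"
    return None

def critique(widgets, rules):
    """Run every rule over every widget and collect problem statements."""
    problems = []
    for w in widgets:
        for rule in rules:
            finding = rule(w)
            if finding:
                problems.append(finding)
    return problems

gui = [{"id": "ok_button", "label": "OK"}, {"id": "mystery_field", "label": ""}]
report = critique(gui, [rule_missing_label])
```

After the designer's requested fixes are applied, re-running `critique` on the revised widget list gives the re-evaluation step the abstract mentions.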

  13. Improving the automated optimization of profile extrusion dies by applying appropriate optimization areas and strategies

    NASA Astrophysics Data System (ADS)

    Hopmann, Ch.; Windeck, C.; Kurth, K.; Behr, M.; Siegbert, R.; Elgeti, S.

    2014-05-01

The rheological design of profile extrusion dies is one of the most challenging tasks in die design. As no analytical solution is available, the quality of and the development time for a new design depend highly on the empirical knowledge of the die manufacturer. Usually, before production can start, several time-consuming, iterative running-in trials need to be performed to check the profile accuracy, and the die geometry is reworked accordingly. An alternative is numerical flow simulation, which calculates the melt flow through a die so that the quality of the flow distribution can be analyzed. The objective of a current research project is to improve the automated optimization of profile extrusion dies. Special emphasis is put on choosing a convenient starting geometry and parameterization that allow for the necessary deformations. In this work, three commonly used design features are examined with regard to their influence on the optimization results. Based on the results, a strategy is derived to select the most relevant areas of the flow channels for the optimization. For these characteristic areas, recommendations are given concerning an efficient parameterization setup that still enables adequate deformations of the flow channel geometry. As an example, this approach is applied to an L-shaped profile with different wall thicknesses. The die is optimized automatically and simulation results are qualitatively compared with experimental results. Furthermore, the strategy is applied to a complex extrusion die for a floor skirting profile to demonstrate its general applicability.

  14. The importance of reference materials in doping-control analysis.

    PubMed

    Mackay, Lindsey G; Kazlauskas, Rymantas

    2011-08-01

    Currently a large range of pure substance reference materials are available for calibration of doping-control methods. These materials enable traceability to the International System of Units (SI) for the results generated by World Anti-Doping Agency (WADA)-accredited laboratories. Only a small number of prohibited substances have threshold limits for which quantification is highly important. For these analytes only the highest quality reference materials that are available should be used. Many prohibited substances have no threshold limits and reference materials provide essential identity confirmation. For these reference materials the correct identity is critical and the methods used to assess identity in these cases should be critically evaluated. There is still a lack of certified matrix reference materials to support many aspects of doping analysis. However, in key areas a range of urine matrix materials have been produced for substances with threshold limits, for example 19-norandrosterone and testosterone/epitestosterone (T/E) ratio. These matrix-certified reference materials (CRMs) are an excellent independent means of checking method recovery and bias and will typically be used in method validation and then regularly as quality-control checks. They can be particularly important in the analysis of samples close to threshold limits, in which measurement accuracy becomes critical. Some reference materials for isotope ratio mass spectrometry (IRMS) analysis are available and a matrix material certified for steroid delta values is currently under production. In other new areas, for example the Athlete Biological Passport, peptide hormone testing, designer steroids, and gene doping, reference material needs still need to be thoroughly assessed and prioritised.

  15. Design and analysis of surface plasmon resonance (SPR) sensor to check the quality of food from adulteration

    NASA Astrophysics Data System (ADS)

    Kumar, Manish; Raghuwanshi, Sanjeev Kumar

    2018-02-01

In recent years, food safety issues caused by contamination with chemical substances or microbial species have become a major concern. Conventional chromatography-based methods for detecting chemicals rely on human observation and are too slow for real-time monitoring. Surface plasmon resonance (SPR) sensors offer the capability of detecting very low concentrations of adulterant chemical and biological agents in real time. An adulterant in food changes the refractive index of the pure food, resulting in a corresponding phase change. These changes can be detected at the output and related to the concentration of the chemical species present.

  16. [Data validation methods and discussion on Chinese materia medica resource survey].

    PubMed

    Zhang, Yue; Ma, Wei-Feng; Zhang, Xiao-Bo; Zhu, Shou-Dong; Guo, Lan-Ping; Wang, Xing-Xing

    2013-07-01

Since the beginning of the fourth national survey of Chinese materia medica resources, 22 provinces have conducted pilots. The survey teams have reported an immense volume of data, which places very high demands on the construction of the database system. To ensure quality, it is necessary to check and validate the data in the database system. Data validation is an important method of ensuring the validity, integrity and accuracy of census data. This paper comprehensively introduces the data validation system of the fourth national survey of Chinese materia medica resources database system, and further improves the design ideas and programs of data validation. The purpose of this study is to help the survey work proceed smoothly.

  17. 7 CFR 58.243 - Checking quality.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    .... Periodically samples of product and environmental material shall be tested for salmonella. Test results shall be negative when samples are tested for salmonella. Line samples should be taken periodically as an...

  18. 7 CFR 58.243 - Checking quality.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    .... Periodically samples of product and environmental material shall be tested for salmonella. Test results shall be negative when samples are tested for salmonella. Line samples should be taken periodically as an...

  19. 7 CFR 58.243 - Checking quality.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    .... Periodically samples of product and environmental material shall be tested for salmonella. Test results shall be negative when samples are tested for salmonella. Line samples should be taken periodically as an...

  20. 7 CFR 58.243 - Checking quality.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    .... Periodically samples of product and environmental material shall be tested for salmonella. Test results shall be negative when samples are tested for salmonella. Line samples should be taken periodically as an...

  1. Harvesting river water through small dams promote positive environmental impact.

    PubMed

    Agoramoorthy, Govindasamy; Chaudhary, Sunita; Chinnasamy, Pennan; Hsu, Minna J

    2016-11-01

While deliberations on the negative environmental consequences of large dams continue to dominate world attention, the positive benefits provided by small dams, also known as check dams, go unnoticed. Moreover, little is known about the potential of check dams to mitigate global warming impacts, because few data are available. Small dams are usually commissioned to private contractors who have no clear mandate from their employers to post their work online for public scrutiny. As a result, statistics on the design, cost, and materials used to build check dams are not available in the public domain. However, this review paper presents data, for the first time, on the often-ignored potential of check dams to mitigate climate-induced hydrological threats. We hope that the scientific analysis presented here will promote further research on check dams worldwide to better understand their environmental significance to society.

  2. Design, decoding and optimized implementation of SECDED codes over GF(q)

    DOEpatents

    Ward, H Lee; Ganti, Anand; Resnick, David R

    2014-06-17

    A plurality of columns for a check matrix that implements a distance d linear error correcting code are populated by providing a set of vectors from which to populate the columns, and applying to the set of vectors a filter operation that reduces the set by eliminating therefrom all vectors that would, if used to populate the columns, prevent the check matrix from satisfying a column-wise linear independence requirement associated with check matrices of distance d linear codes. One of the vectors from the reduced set may then be selected to populate one of the columns. The filtering and selecting repeats iteratively until either all of the columns are populated or the number of currently unpopulated columns exceeds the number of vectors in the reduced set. Columns for the check matrix may be processed to reduce the amount of logic needed to implement the check matrix in circuit logic.
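The populate-and-filter procedure this patent abstract describes can be illustrated with a small sketch. For distance d = 3 over GF(2), the column-wise independence requirement reduces to columns being nonzero and pairwise distinct; the code below is an illustrative reconstruction under that assumption, not the patented implementation:

```python
from itertools import product

def build_check_matrix(r, n):
    """Greedily populate up to n columns of an r-row GF(2) check matrix
    for a distance-3 code: columns must be nonzero and pairwise distinct,
    which is the column-wise linear independence requirement for d = 3."""
    # Candidate set: all nonzero length-r binary vectors.
    candidates = [v for v in product((0, 1), repeat=r) if any(v)]
    columns = []
    while len(columns) < n:
        # Filter step: drop vectors that would duplicate an existing column.
        candidates = [v for v in candidates if v not in columns]
        if not candidates:
            break  # more unpopulated columns remain than usable vectors
        columns.append(candidates.pop(0))  # select one vector from the reduced set
    return [[col[i] for col in columns] for i in range(r)]  # r x n matrix

# With r = 3 rows there are exactly 7 usable columns: the [7,4] Hamming shape.
H = build_check_matrix(3, 7)
```

For d > 3 the filter would instead reject any vector lying in the span of fewer than d - 1 existing columns; the iterate-filter-select loop is otherwise unchanged.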

  3. An Integrated Environment for Efficient Formal Design and Verification

    NASA Technical Reports Server (NTRS)

    1998-01-01

    The general goal of this project was to improve the practicality of formal methods by combining techniques from model checking and theorem proving. At the time the project was proposed, the model checking and theorem proving communities were applying different tools to similar problems, but there was not much cross-fertilization. This project involved a group from SRI that had substantial experience in the development and application of theorem-proving technology, and a group at Stanford that specialized in model checking techniques. Now, over five years after the proposal was submitted, there are many research groups working on combining theorem-proving and model checking techniques, and much more communication between the model checking and theorem proving research communities. This project contributed significantly to this research trend. The research work under this project covered a variety of topics: new theory and algorithms; prototype tools; verification methodology; and applications to problems in particular domains.

  4. [Practical implementation of a quality management system in a radiological department].

    PubMed

    Huber, S; Zech, C J

    2011-10-01

This article describes the architecture of a project aiming to implement a DIN EN ISO 9001 quality management system in a radiological department. It is intended as a practical guide, demonstrating each step of the project leading to certification of the system. In the planning phase, resources for implementing the project have to be identified and a quality management (QM) group has to be formed as the core team. In the first project phase, all available documents have to be checked and compiled in the QM manual. Moreover, all relevant processes of the department have to be described in so-called process descriptions. In a second step, responsibilities for the project are identified. Customer and employee surveys have to be carried out and a nonconformity management system has to be implemented. In this phase, internal audits are also needed to check the new QM system, which is finally tested in the external certification audit with reference to its conformity with the standards.

  5. Respiratory Protection Toolkit: Providing Guidance Without Changing Requirements-Can We Make an Impact?

    PubMed

    Bien, Elizabeth Ann; Gillespie, Gordon Lee; Betcher, Cynthia Ann; Thrasher, Terri L; Mingerink, Donna R

    2016-12-01

    International travel and infectious respiratory illnesses worldwide place health care workers (HCWs) at increasing risk of respiratory exposures. To ensure the highest quality safety initiatives, one health care system used a quality improvement model of Plan-Do-Study-Act and guidance from Occupational Safety and Health Administration's (OSHA) May 2015 Hospital Respiratory Protection Program (RPP) Toolkit to assess a current program. The toolkit aided in identification of opportunities for improvement within their well-designed RPP. One opportunity was requiring respirator use during aerosol-generating procedures for specific infectious illnesses. Observation data demonstrated opportunities to mitigate controllable risks including strap placement, user seal check, and reuse of disposable N95 filtering facepiece respirators. Subsequent interdisciplinary collaboration resulted in other ideas to decrease risks and increase protection from potentially infectious respiratory illnesses. The toolkit's comprehensive document to evaluate the program showed that while the OSHA standards have not changed, the addition of the toolkit can better protect HCWs. © 2016 The Author(s).

  6. The dopamine D2/D3 receptor agonist quinpirole increases checking-like behaviour in an operant observing response task with uncertain reinforcement: A novel possible model of OCD?

    PubMed Central

    Eagle, Dawn M.; Noschang, Cristie; d’Angelo, Laure-Sophie Camilla; Noble, Christie A.; Day, Jacob O.; Dongelmans, Marie Louise; Theobald, David E.; Mar, Adam C.; Urcelay, Gonzalo P.; Morein-Zamir, Sharon; Robbins, Trevor W.

    2014-01-01

    Excessive checking is a common, debilitating symptom of obsessive-compulsive disorder (OCD). In an established rodent model of OCD checking behaviour, quinpirole (dopamine D2/3-receptor agonist) increased checking in open-field tests, indicating dopaminergic modulation of checking-like behaviours. We designed a novel operant paradigm for rats (observing response task (ORT)) to further examine cognitive processes underpinning checking behaviour and clarify how and why checking develops. We investigated i) how quinpirole increases checking, ii) dependence of these effects on D2/3 receptor function (following treatment with D2/3 receptor antagonist sulpiride) and iii) effects of reward uncertainty. In the ORT, rats pressed an ‘observing’ lever for information about the location of an ‘active’ lever that provided food reinforcement. High- and low-checkers (defined from baseline observing) received quinpirole (0.5 mg/kg, 10 treatments) or vehicle. Parametric task manipulations assessed observing/checking under increasing task demands relating to reinforcement uncertainty (variable response requirement and active-lever location switching). Treatment with sulpiride further probed the pharmacological basis of long-term behavioural changes. Quinpirole selectively increased checking, both functional observing lever presses (OLPs) and non-functional extra OLPs (EOLPs). The increase in OLPs and EOLPs was long-lasting, without further quinpirole administration. Quinpirole did not affect the immediate ability to use information from checking. Vehicle and quinpirole-treated rats (VEH and QNP respectively) were selectively sensitive to different forms of uncertainty. Sulpiride reduced non-functional EOLPs in QNP rats but had no effect on functional OLPs. These data have implications for treatment of compulsive checking in OCD, particularly for serotonin-reuptake-inhibitor treatment-refractory cases, where supplementation with dopamine receptor antagonists may be beneficial. 
PMID:24406720

  7. Quality of surgical randomized controlled trials for acute cholecystitis: assessment based on CONSORT and additional check items.

    PubMed

    Shikata, Satoru; Nakayama, Takeo; Yamagishi, Hisakazu

    2008-01-01

In this study, we conducted a limited survey of reports of surgical randomized controlled trials, using the Consolidated Standards of Reporting Trials (CONSORT) statement and additional check items, to clarify problems in the evaluation of surgical reports. A total of 13 randomized trials were selected from the two latest review articles on biliary surgery. Each randomized trial was evaluated according to 28 quality measures comprising items from the CONSORT statement plus additional items. Analysis focused on relationships between the quality of each study and the estimated effect gap (the pooled estimate in the meta-analysis minus the estimated effect of each study). No definite relationships were found between individual study quality and the estimated effect gap. The following items could have been described but were missing from almost all the surgical RCT reports: "clearly defined outcomes"; "details of randomization"; "participant flow charts"; "intention-to-treat analysis"; "ancillary analyses"; and "financial conflicts of interest". The item "participation of a trial methodologist in the study" was not found in any of the reports. Although the quality of trial reporting is not always related to a biased estimate of the treatment effect, the items used as quality measures must be described to enable readers to evaluate the quality and applicability of the report. Further development of an assessment tool is needed for items specific to surgical randomized controlled trials.

  8. Development of an algorithm to provide awareness in choosing study designs for inclusion in systematic reviews of healthcare interventions: a method study

    PubMed Central

    Peinemann, Frank; Kleijnen, Jos

    2015-01-01

    Objectives To develop an algorithm that aims to provide guidance and awareness for choosing multiple study designs in systematic reviews of healthcare interventions. Design Method study: (1) To summarise the literature base on the topic. (2) To apply the integration of various study types in systematic reviews. (3) To devise decision points and outline a pragmatic decision tree. (4) To check the plausibility of the algorithm by backtracking its pathways in four systematic reviews. Results (1) The results of our systematic review of the published literature have already been published. (2) We recaptured the experience from our four previously conducted systematic reviews that required the integration of various study types. (3) We chose length of follow-up (long, short), frequency of events (rare, frequent) and types of outcome as decision points (death, disease, discomfort, disability, dissatisfaction) and aligned the study design labels according to the Cochrane Handbook. We also considered practical or ethical concerns, and the problem of unavailable high-quality evidence. While applying the algorithm, disease-specific circumstances and aims of interventions should be considered. (4) We confirmed the plausibility of the pathways of the algorithm. Conclusions We propose that the algorithm can assist to bring seminal features of a systematic review with multiple study designs to the attention of anyone who is planning to conduct a systematic review. It aims to increase awareness and we think that it may reduce the time burden on review authors and may contribute to the production of a higher quality review. PMID:26289450

  9. Innovating to enhance clinical data management using non-commercial and open source solutions across a multi-center network supporting inpatient pediatric care and research in Kenya

    PubMed Central

    Tuti, Timothy; Bitok, Michael; Paton, Chris; Makone, Boniface; Malla, Lucas; Muinga, Naomi; Gathara, David; English, Mike

    2016-01-01

Objective To share approaches and innovations adopted to deliver a relatively inexpensive clinical data management (CDM) framework within a low-income setting that aims to deliver quality pediatric data useful for supporting research, strengthening the information culture and informing improvement efforts in local clinical practice. Materials and methods The authors implemented a CDM framework to support a Clinical Information Network (CIN) using Research Electronic Data Capture (REDCap), a noncommercial software solution designed for rapid development and deployment of electronic data capture tools. It was used for collection of standardized data from case records of multiple hospitals' pediatric wards. R, an open-source statistical language, was used for data quality enhancement, analysis, and report generation for the hospitals. Results In the first year of CIN, the authors have developed innovative solutions to support the implementation of a secure, rapid pediatric data collection system spanning 14 hospital sites with stringent data quality checks. Data have been collated on over 37 000 admission episodes, with considerable improvement in clinical documentation of admissions observed. Using meta-programming techniques in R, coupled with branching logic, randomization, data lookup, and Application Programming Interface (API) features offered by REDCap, CDM tasks were configured and automated to ensure quality data was delivered for clinical improvement and research use. Conclusion A low-cost, clinically focused but geographically dispersed quality CDM framework can be achieved and sustained in a long-term, multi-site, real-world context, and its challenges can be overcome through thoughtful design and implementation of open-source tools for handling data and supporting research. PMID:26063746

  10. Innovating to enhance clinical data management using non-commercial and open source solutions across a multi-center network supporting inpatient pediatric care and research in Kenya.

    PubMed

    Tuti, Timothy; Bitok, Michael; Paton, Chris; Makone, Boniface; Malla, Lucas; Muinga, Naomi; Gathara, David; English, Mike

    2016-01-01

To share approaches and innovations adopted to deliver a relatively inexpensive clinical data management (CDM) framework within a low-income setting that aims to deliver quality pediatric data useful for supporting research, strengthening the information culture and informing improvement efforts in local clinical practice. The authors implemented a CDM framework to support a Clinical Information Network (CIN) using Research Electronic Data Capture (REDCap), a noncommercial software solution designed for rapid development and deployment of electronic data capture tools. It was used for collection of standardized data from case records of multiple hospitals' pediatric wards. R, an open-source statistical language, was used for data quality enhancement, analysis, and report generation for the hospitals. In the first year of CIN, the authors developed innovative solutions to support the implementation of a secure, rapid pediatric data collection system spanning 14 hospital sites with stringent data quality checks. Data have been collated on over 37 000 admission episodes, with considerable improvement in clinical documentation of admissions observed. Using meta-programming techniques in R, coupled with branching logic, randomization, data lookup, and Application Programming Interface (API) features offered by REDCap, CDM tasks were configured and automated to ensure quality data were delivered for clinical improvement and research use. A low-cost, clinically focused, but geographically dispersed quality CDM framework in a long-term, multi-site, and real-world context can be achieved and sustained, and its challenges overcome, through thoughtful design and implementation of open-source tools for handling data and supporting research. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association.
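The stringent data quality checks described in this record are not shown in code. A minimal sketch of the kind of automated range check involved, in Python rather than the R used by the authors; the field names and limits are hypothetical, not taken from the paper:

```python
def check_record(record, rules):
    """Apply simple range checks to one admission record.

    `rules` maps a field name to an (allowed_min, allowed_max) pair;
    returns a list of (field, value) pairs that are missing or out of range.
    """
    errors = []
    for field, (lo, hi) in rules.items():
        value = record.get(field)
        if value is None or not (lo <= value <= hi):
            errors.append((field, value))
    return errors

# Hypothetical pediatric admission rules, for illustration only.
RULES = {"age_months": (0, 216), "weight_kg": (0.5, 150.0), "temp_c": (30.0, 45.0)}

clean = {"age_months": 18, "weight_kg": 10.4, "temp_c": 37.2}
bad = {"age_months": 18, "weight_kg": 10.4, "temp_c": 58.0}
```

In a CDM pipeline such checks would run on every submitted form, with the error list fed back to the site as a data query.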

  11. The evidence for Shiatsu: a systematic review of Shiatsu and acupressure

    PubMed Central

    2011-01-01

    Background Shiatsu, similar to acupressure, uses finger pressure, manipulations and stretches along Traditional Chinese Medicine meridians. Shiatsu is popular in Europe but lacks reviews of its evidence base. Methods Acupressure and Shiatsu clinical trials were identified using the MeSH term 'acupressure' in: EBM reviews; AMED; BNI; CINAHL; EMBASE; MEDLINE; PsycARTICLES; Science Direct; Blackwell Synergy; Ingenta Select; Wiley Interscience; Index to Theses and ZETOC. References of articles were checked. Inclusion criteria were Shiatsu or acupressure administered manually/bodily, published after January 1990. Two reviewers performed independent study selection and evaluation of study design and reporting, using standardised checklists (CONSORT, TREND, CASP and STRICTA). Results Searches identified 1714 publications. Final inclusions were 9 Shiatsu and 71 acupressure studies. A quarter were graded A (highest quality). Shiatsu studies comprised 1 RCT, 3 controlled non-randomised, 1 within-subjects, 1 observational and 3 uncontrolled studies investigating mental and physical health issues. Evidence was of insufficient quantity and quality. Acupressure studies included 2 meta-analyses, 6 systematic reviews and 39 RCTs. The strongest evidence was for pain (particularly dysmenorrhoea, lower back and labour) and post-operative nausea and vomiting. Additionally, quality evidence found improvements in sleep in the institutionalised elderly. Variable/poor quality evidence existed for renal disease symptoms, dementia, stress, anxiety and respiratory conditions. Appraisal tools may be inappropriate for some study designs. Potential biases included a focus on UK/USA databases, limited grey literature, and exclusion of qualitative and pre-1989 studies. Conclusions Evidence is improving in quantity, quality and reporting, but more research is needed, particularly for Shiatsu, where evidence is poor. Acupressure may be beneficial for pain, nausea and vomiting, and sleep. PMID:21982157

  12. Preparation for a Changing World: Quality Education Program Study. Booklet 10-A (Needs Assessment).

    ERIC Educational Resources Information Center

    Bucks County Public Schools, Doylestown, PA.

    The general needs assessment instrument can provide the means for a school district to assess its needs relative to the Ten Goals of Quality Education. It is comprised of behavior statements taken from the category schemes. The student must check the appropriate number for each statement representing "always" through "never".…

  13. MOM: A meteorological data checking expert system in CLIPS

    NASA Technical Reports Server (NTRS)

    Odonnell, Richard

    1990-01-01

    Meteorologists have long faced the problem of verifying the data they use. Experience shows that there is a sizable number of errors in the data reported by meteorological observers. This is unacceptable for computer forecast models, which depend on accurate data for accurate results. Most errors that occur in meteorological data are obvious to the meteorologist, but time constraints prevent hand-checking. For this reason, it is necessary to have a 'front end' to the computer model to ensure the accuracy of input. Various approaches to automatic data quality control have been developed by several groups. MOM is a rule-based system implemented in CLIPS and utilizing 'consistency checks' and 'range checks'. The system is generic in the sense that it knows some meteorological principles, regardless of specific station characteristics. Specific constraints kept as CLIPS facts in a separate file provide for system flexibility. Preliminary results show that the expert system has detected some inconsistencies not noticed by a local expert.
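The MOM rule base itself is written in CLIPS and is not reproduced in the abstract. A Python sketch of the two check types it names, 'range checks' and 'consistency checks'; the limits and the dew-point rule are standard meteorological examples, not MOM's actual rules:

```python
def range_check(name, value, limits):
    """A value passes if it lies within the station-specific limits."""
    lo, hi = limits[name]
    return lo <= value <= hi

def consistency_check(obs):
    """Dew point cannot exceed air temperature -- a classic
    cross-field consistency rule (illustrative, not from MOM)."""
    return obs["dewpoint_c"] <= obs["temp_c"]

# Illustrative station constraints, analogous to MOM's separate facts file.
LIMITS = {"pressure_hpa": (870.0, 1085.0)}

obs_ok = {"temp_c": 21.0, "dewpoint_c": 14.0, "pressure_hpa": 1013.2}
obs_bad = {"temp_c": 21.0, "dewpoint_c": 25.0, "pressure_hpa": 1013.2}
```

Keeping the limits in data rather than code mirrors MOM's design choice of storing station-specific constraints as CLIPS facts in a separate file.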

  14. Preliminary Retrospective Analysis of Daily Tomotherapy Output Constancy Checks Using Statistical Process Control.

    PubMed

    Mezzenga, Emilio; D'Errico, Vincenzo; Sarnelli, Anna; Strigari, Lidia; Menghi, Enrico; Marcocci, Francesco; Bianchini, David; Benassi, Marcello

    2016-01-01

    The purpose of this study was to retrospectively evaluate the results from a Helical TomoTherapy Hi-Art treatment system relating to quality controls based on daily static and dynamic output checks using statistical process control methods. Individual value X-charts, exponentially weighted moving average charts, and process capability and acceptability indices were used to monitor the treatment system performance. Daily output values measured from January 2014 to January 2015 were considered. The results obtained showed that, although the process was in control, there was an out-of-control situation in the principal maintenance intervention for the treatment system. In particular, process capability indices showed a decreasing percentage of points in control which was, however, acceptable according to AAPM TG148 guidelines. Our findings underline the importance of restricting the acceptable range of daily output checks and suggest a future line of investigation for a detailed process control of daily output checks for the Helical TomoTherapy Hi-Art treatment system.
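The abstract names exponentially weighted moving average (EWMA) charts among the SPC tools used. A minimal Python sketch of an EWMA chart with time-varying control limits; the λ = 0.2 and L = 3 values are conventional defaults, not necessarily those used in the study:

```python
import math

def ewma_chart(data, mean, sigma, lam=0.2, L=3.0):
    """EWMA control chart: z_i = lam*x_i + (1-lam)*z_{i-1},
    with limits mean +/- L*sigma*sqrt(lam/(2-lam)*(1-(1-lam)**(2i))).
    Returns one (ewma, lower, upper, in_control) tuple per observation."""
    z = mean
    out = []
    for i, x in enumerate(data, start=1):
        z = lam * x + (1 - lam) * z
        width = L * sigma * math.sqrt(lam / (2 - lam) * (1 - (1 - lam) ** (2 * i)))
        out.append((z, mean - width, mean + width, mean - width <= z <= mean + width))
    return out
```

Because the EWMA pools recent history, it flags small sustained drifts in daily output earlier than an individual-value X-chart would.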

  15. Modelling spoilage of fresh turbot and evaluation of a time-temperature integrator (TTI) label under fluctuating temperature.

    PubMed

    Nuin, Maider; Alfaro, Begoña; Cruz, Ziortza; Argarate, Nerea; George, Susie; Le Marc, Yvan; Olley, June; Pin, Carmen

    2008-10-31

    Kinetic models were developed to predict the microbial spoilage and the sensory quality of fresh fish and to evaluate the efficiency of a commercial time-temperature integrator (TTI) label, Fresh Check®, to monitor shelf life. Farmed turbot (Psetta maxima) samples were packaged in PVC film and stored at 0, 5, 10 and 15 °C. Microbial growth and sensory attributes were monitored at regular time intervals. The response of the Fresh Check device was measured at the same temperatures during the storage period. The sensory perception was quantified according to a global sensory indicator obtained by principal component analysis as well as to the Quality Index Method, QIM, as described by Rahman and Olley [Rahman, H.A., Olley, J., 1984. Assessment of sensory techniques for quality assessment of Australian fish. CSIRO Tasmanian Regional Laboratory. Occasional paper n. 8. Available from the Australian Maritime College library. Newnham. Tasmania]. Both methods were found equally valid to monitor the loss of sensory quality. The maximum specific growth rate of spoilage bacteria, the rate of change of the sensory indicators and the rate of change of the colour measurements of the TTI label were modelled as a function of temperature. The temperature had a similar effect on the bacteria, sensory and Fresh Check kinetics. At the time of sensory rejection, the bacterial load was ca. 10^5-10^6 cfu/g. The end of shelf life indicated by the Fresh Check label was close to the sensory rejection time. The performance of the models was validated under fluctuating temperature conditions by comparing the predicted and measured values for all microbial, sensory and TTI responses. The models have been implemented in a Visual Basic add-in for Excel called "Fish Shelf Life Prediction (FSLP)". This program predicts sensory acceptability and growth of spoilage bacteria in fish and the response of the TTI at constant and fluctuating temperature conditions. 
The program is freely available at http://www.azti.es/muestracontenido.asp?idcontenido=980&content=15&nodo1=30&nodo2=0.
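The abstract says the growth and change rates were modelled as functions of temperature but does not give the model form. A common choice for spoilage bacteria is the Ratkowsky square-root secondary model, sketched here with illustrative (not fitted) parameters:

```python
def sqrt_model_mu(T, b, T_min):
    """Ratkowsky square-root secondary model:
    sqrt(mu_max) = b * (T - T_min), so mu_max = (b * (T - T_min))**2.
    Below the notional minimum growth temperature T_min, growth is zero.
    The parameters b and T_min here are illustrative, not values from the paper."""
    if T <= T_min:
        return 0.0
    return (b * (T - T_min)) ** 2
```

Under such a model the predicted specific growth rate rises smoothly over the 0-15 °C storage range used in the turbot trials, which is what lets a single temperature-integrating label track both bacterial and sensory kinetics.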

  16. 78 FR 16854 - Agency Information Collection Activities: Submission for OMB Review; Comment Request Re National...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-19

    ... institutions to bring those individuals and families who have rarely, if ever, held a checking account, a savings account or other type of transaction or check cashing account at an insured depository institution... size and worth of the "unbanked" market in the United States." The Household Survey is designed to...

  17. 10 CFR 36.81 - Records and retention periods.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... required by § 36.55 until the Commission terminates the license. (f) Records of radiation surveys required by § 36.57 for 3 years from the date of the survey. (g) Records of radiation survey meter...) Records on the design checks required by § 36.39 and the construction control checks as required by § 36...

  18. 10 CFR 36.81 - Records and retention periods.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... required by § 36.55 until the Commission terminates the license. (f) Records of radiation surveys required by § 36.57 for 3 years from the date of the survey. (g) Records of radiation survey meter...) Records on the design checks required by § 36.39 and the construction control checks as required by § 36...

  19. 10 CFR 36.81 - Records and retention periods.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... required by § 36.55 until the Commission terminates the license. (f) Records of radiation surveys required by § 36.57 for 3 years from the date of the survey. (g) Records of radiation survey meter...) Records on the design checks required by § 36.39 and the construction control checks as required by § 36...

  20. 10 CFR 36.81 - Records and retention periods.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... required by § 36.55 until the Commission terminates the license. (f) Records of radiation surveys required by § 36.57 for 3 years from the date of the survey. (g) Records of radiation survey meter...) Records on the design checks required by § 36.39 and the construction control checks as required by § 36...

  1. 14 CFR 61.58 - Pilot-in-command proficiency check: Operation of an aircraft that requires more than one pilot...

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ...-powered; (5) For a pilot authorized by the Administrator to operate an experimental turbojet-powered aircraft that possesses, by original design or through modification, more than a single seat, the required proficiency check for all of the experimental turbojet-powered aircraft for which the pilot holds an...

  2. 14 CFR 61.58 - Pilot-in-command proficiency check: Operation of an aircraft that requires more than one pilot...

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ...-powered; (5) For a pilot authorized by the Administrator to operate an experimental turbojet-powered aircraft that possesses, by original design or through modification, more than a single seat, the required proficiency check for all of the experimental turbojet-powered aircraft for which the pilot holds an...

  3. Minimum Check List for Mechanical and Electrical Plans & Specifications.

    ERIC Educational Resources Information Center

    North Carolina State Dept. of Public Instruction, Raleigh. Div. of School Facility Services.

    This is the fifth revision of the Minimum Check List since its origin in 1960 by North Carolina's School Planning. The checklist was developed to serve as a means of communication between school agencies and design professionals and has been widely used in the development and review of mechanical and electrical plans and specifications by…

  4. Effects of Check and Connect on Attendance, Behavior, and Academics: A Randomized Effectiveness Trial

    ERIC Educational Resources Information Center

    Maynard, Brandy R.; Kjellstrand, Elizabeth K.; Thompson, Aaron M.

    2014-01-01

    Objectives: This study examined the effects of Check & Connect (C&C) on the attendance, behavior, and academic outcomes of at-risk youth in a field-based effectiveness trial. Method: A multisite randomized block design was used, wherein 260 primarily Hispanic (89%) and economically disadvantaged (74%) students were randomized to treatment…

  5. 28 CFR 8.14 - Disposition of property before forfeiture.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... violation of law, is not contraband, and has no design or other characteristics that particularly suit it for use in illegal activities. This payment must be in the form of a money order, an official bank check, or a cashier's check made payable to the United States Marshals Service. A bond in the form of a...

  6. 28 CFR 8.14 - Disposition of property before forfeiture.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... violation of law, is not contraband, and has no design or other characteristics that particularly suit it for use in illegal activities. This payment must be in the form of a money order, an official bank check, or a cashier's check made payable to the United States Marshals Service. A bond in the form of a...

  7. Importance of Survey Design for Studying the Epidemiology of Emerging Tobacco Product Use Among Youth.

    PubMed

    Delnevo, Cristine D; Gundersen, Daniel A; Manderski, Michelle T B; Giovenco, Daniel P; Giovino, Gary A

    2017-08-15

    Accurate surveillance is critical for monitoring the epidemiology of emerging tobacco products in the United States, and survey science suggests that survey response format can impact prevalence estimates. We utilized data from the 2014 New Jersey Youth Tobacco Survey (n = 3,909) to compare estimates of the prevalence of 4 behaviors (ever hookah use, current hookah use, ever e-cigarette use, and current e-cigarette use) among New Jersey high school students, as assessed using "check-all-that-apply" questions, with estimates measured by means of "forced-choice" questions. Measurement discrepancies were apparent for all 4 outcomes, with the forced-choice questions yielding prevalence estimates approximately twice those of the check-all-that-apply questions, and agreement was fair to moderate. The sensitivity of the check-all-that-apply questions, treating the forced-choice format as the "gold standard," ranged from 38.1% (current hookah use) to 58.3% (ever e-cigarette use), indicating substantial false-negative rates. These findings highlight the impact of question response format on prevalence estimates of emerging tobacco products among youth and suggest that estimates generated by means of check-all-that-apply questions may be biased downward. Alternative survey designs should be considered to avoid check-all-that-apply response formats, and researchers should use caution when interpreting tobacco use data obtained from check-all-that-apply formats. © The Author(s) 2017. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
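The sensitivity figures quoted treat the forced-choice format as the gold standard; the computation is simply true positives over gold-standard positives. A sketch with made-up response pairs:

```python
def sensitivity(pairs):
    """pairs: one (check_all_positive, forced_choice_positive) boolean
    pair per respondent, with the forced-choice answer taken as truth.
    Sensitivity = true positives / forced-choice positives."""
    tp = sum(1 for c, f in pairs if c and f)
    positives = sum(1 for _, f in pairs if f)
    return tp / positives

# Illustrative data: 5 forced-choice users, only 2 also ticked the
# check-all-that-apply item -- a 40% sensitivity of the same flavour
# as the 38-58% range reported in the study.
responses = [(True, True), (True, True), (False, True),
             (False, True), (False, True), (False, False)]
```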

  8. Detection and Analysis of the Quality of Ibuprofen Granules

    NASA Astrophysics Data System (ADS)

    Yu-bin, Ji; Xin, LI; Guo-song, Xin; Qin-bing, Xue

    2017-12-01

    Comprehensive quality testing of Ibuprofen Granules was carried out to ensure compliance with the provisions of the Chinese Pharmacopoeia. With reference to the Chinese Pharmacopoeia, the granules were tested by UV spectrophotometry and HPLC, covering grain size, volume deviation, loss on drying, dissolution rate, and overall quality evaluation. Results indicated that the Ibuprofen Granules conform to the standards; the product is qualified and suitable for marketing.

  9. Methods for increasing upper airway muscle tonus in treating obstructive sleep apnea: systematic review.

    PubMed

    Valbuza, Juliana Spelta; de Oliveira, Márcio Moysés; Conti, Cristiane Fiquene; Prado, Lucila Bizari F; de Carvalho, Luciane Bizari Coin; do Prado, Gilmar Fernandes

    2010-12-01

    Treatment of obstructive sleep apnea (OSA) using methods for increasing upper airway muscle tonus has been controversial and poorly reported. Thus, a review of the evidence is needed to evaluate the effectiveness of these methods. The design used was a systematic review of randomized controlled trials. Data sources are from the Cochrane Library, Medline, Embase and Scielo, registries of ongoing trials, theses indexed at Biblioteca Regional de Medicina/Pan-American Health Organization of the World Health Organization and the reference lists of all the trials retrieved. This was a review of randomized or quasi-randomized double-blind trials on OSA. Two reviewers independently applied eligibility criteria. One reviewer assessed study quality and extracted data, and these processes were checked by a second reviewer. The primary outcome was a decrease in the apnea/hypopnea index (AHI) of below five episodes per hour. Other outcomes were subjective sleep quality, sleep quality measured by night polysomnography, quality of life measured subjectively and adverse events associated with the treatments. Three eligible trials were included. Two studies showed improvements through the objective and subjective analyses, and one study showed improvement of snoring, but not of AHI while the subjective analyses showed no improvement. The adverse events were reported and they were not significant. There is no accepted scientific evidence that methods aiming to increase muscle tonus of the stomatognathic system are effective in reducing AHI to below five events per hour. Well-designed randomized controlled trials are needed to assess the efficacy of such methods.

  10. Health Effects of Ozone Pollution

    EPA Pesticide Factsheets

    Inhaling ozone can cause coughing, shortness of breath, worse asthma or bronchitis symptoms, and irritation and damage to airways. You can reduce your exposure to ozone pollution by checking air quality where you live.

  11. 46 CFR 160.176-13 - Approval Tests.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... thread count must be at least 400 N (90 lb.). (v) [Reserved] (w) Visual examination. One complete... check the quality of incoming lifejacket components and the production process. Test samples must come...

  12. 46 CFR 160.176-13 - Approval Tests.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... thread count must be at least 400 N (90 lb.). (v) [Reserved] (w) Visual examination. One complete... check the quality of incoming lifejacket components and the production process. Test samples must come...

  13. 46 CFR 160.176-13 - Approval Tests.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... thread count must be at least 400 N (90 lb.). (v) [Reserved] (w) Visual examination. One complete... check the quality of incoming lifejacket components and the production process. Test samples must come...

  14. 46 CFR 160.176-13 - Approval Tests.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... thread count must be at least 400 N (90 lb.). (v) [Reserved] (w) Visual examination. One complete... check the quality of incoming lifejacket components and the production process. Test samples must come...

  15. 77 FR 38273 - Science Advisory Board; Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-27

    ... Administration (NOAA) science programs are of the highest quality and provide optimal support to resource... Environmental Laboratory, 7600 Sand Point Way NE., Seattle, Washington 98115. Please check the SAB Web site http...

  16. Quality assessment of clinical education services in teaching hospitals located in Kerman, Iran.

    PubMed

    Yazdi-Feyzabadi, Vahid; Gozashti, Mohammad Hossein; Komsari, Samane; Mohammadtaghizadeh, Sedigheh; Amiresmaili, Mohammadreza

    2015-11-01

    Clinical education is one of the most important components of the resource generation function of health systems, and it has a very important role in graduates' competency with respect to effective, practical education. This study aimed to assess the quality of clinical education services in Kerman's teaching hospitals located in southeastern Iran. This cross-sectional study was conducted in 2011 on 303 medical students at different levels of medical education at Kerman's teaching hospitals. A modified SERVQUAL instrument was used to collect the data after its validity and reliability were checked. The data were analyzed by SPSS 18.0 using the paired t-test, Kruskal-Wallis, and post hoc tests, when appropriate. In all five dimensions of quality, gaps were observed between students' perceptions and expectations as follows: Assurance (mean = -1.18), Responsiveness (-1.56), Empathy (-1.4), Reliability (-1.27), and Tangibles (-1.21). There was a significant difference between the quality perceptions and expectations of the medical students (p < 0.001). A significant difference was observed between the three educational levels, including externships, internships, and assistantships, regarding the dimensions of the quality gaps (p < 0.001). The clinical services provided by the teaching hospitals in the study did not meet the students' expectations at any of the three educational levels. As we precisely assessed the dimensions and items with the largest quality gaps, it was apparent that, for the most part, clinical education officials could improve quality by designing interventions, which would not be very difficult to do.
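The gap scores reported follow the standard SERVQUAL definition: mean perception minus mean expectation per dimension, with negative values meaning expectations exceed perceived quality. A one-line sketch with made-up dimension means:

```python
def servqual_gaps(perceptions, expectations):
    """Gap score per quality dimension: mean perception minus mean
    expectation. Negative gaps indicate unmet expectations."""
    return {d: perceptions[d] - expectations[d] for d in perceptions}

# Hypothetical dimension means chosen to reproduce the Assurance gap of -1.18.
gaps = servqual_gaps({"Assurance": 3.0}, {"Assurance": 4.18})
```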

  17. Code development for ships -- A demonstration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ayyub, B.; Mansour, A.E.; White, G.

    1996-12-31

    A demonstration summary of a reliability-based structural design code for ships is presented for two ship types, a cruiser and a tanker. For both ship types, code requirements cover four failure modes: hull girder buckling, unstiffened plate yielding and buckling, stiffened plate buckling, and fatigue of critical detail. Both serviceability and ultimate limit states are considered. Because of limitations on length, only hull girder modes are presented in this paper. Code requirements for other modes will be presented in a future publication. A specific provision of the code is a safety check expression. The design variables are to be taken at their nominal values, typically values on the safe side of the respective distributions. Other safety check expressions for hull girder failure that include load combination factors, as well as consequence-of-failure factors, are considered. This paper provides a summary of safety check expressions for the hull girder modes.

  18. Use of satellite remote sensing as a monitoring tool for land and water resources development activities in an Indian tropical site.

    PubMed

    Behera, M D; Gupta, A K; Barik, S K; Das, P; Panda, R M

    2018-06-15

    With the availability of satellite data from free data domain, remote sensing has increasingly become a fast-hand tool for monitoring of land and water resources development activities with minimal cost and time. Here, we verified construction of check dams and implementation of plantation activities in two districts of Tripura state using Landsat and Sentinel-2 images for the years 2008 and 2016-2017, respectively. We applied spectral reflectance curves and index-based proxies to quantify these activities for two time periods. A subset of the total check dams and plantation sites was chosen on the basis of site condition, nature of check dams, and planted species for identification on satellite images, and another subset was randomly chosen to validate identification procedure. The normalized difference water index (NDWI) derived from Landsat and Senitnel-2 were used to quantify water area evolved, qualify the water quality, and influence of associated tree shadows. Three types of check dams were observed, i.e., full, partial, and fully soil exposed on the basis of the presence of grass or scrub on the check dams. Based on the nature of check dam and site characteristics, we classified the water bodies under 11-categories using six interpretation keys (size, shape, water depth, quality, shadow of associated trees, catchment area). The check dams constructed on existing narrow gullies totally covered by branches or associated plants were not identified without field verification. Further, use of EVI enabled us to approve the plantation activities and adjudge the corresponding increase in vegetation vigor. The plantation activities were established based on the presence and absence of existing vegetation. Clearing on the plantation sites for plantation shows differential increase in EVI values during the initial years. The 403 plantation sites were categorized into 12 major groups on the basis of presence of dominant species and site conditions. 
The dominant species were Areca catechu, Musa paradisiaca, Ananas comosus, Bambusa sp., and mixed plantations of A. catechu and M. paradisiaca. However, the highest increase in average EVI was observed for the pineapple plantation sites (0.11), followed by Bambusa sp. (0.10). These sites were fully covered with plantation without any exposed soil. The present study successfully demonstrates a satellite-based survey, supplemented with ground information, evaluating the changes in vegetation profile due to plantation activities, the locations of check dams, the extent of water bodies, downstream irrigation, and the catchment areas of water bodies.
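The NDWI and EVI indices used in this record have standard band-ratio definitions; whether the study used the McFeeters NDWI variant shown here is an assumption. With surface reflectances in the 0-1 range:

```python
def ndwi(green, nir):
    """McFeeters NDWI = (Green - NIR) / (Green + NIR);
    positive values typically indicate open water."""
    return (green - nir) / (green + nir)

def evi(nir, red, blue):
    """MODIS-style Enhanced Vegetation Index:
    EVI = 2.5 * (NIR - Red) / (NIR + 6*Red - 7.5*Blue + 1)."""
    return 2.5 * (nir - red) / (nir + 6.0 * red - 7.5 * blue + 1.0)
```

Water behind a check dam absorbs strongly in the near-infrared, pushing NDWI positive, while a maturing plantation raises NIR reflectance and hence EVI, which is why the two indices separate the two monitoring tasks cleanly.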

  19. Performance characteristics of an automated gas chromatograph-ion trap mass spectrometer system used for the 1995 Southern Oxidants Study field investigation in Nashville, Tennessee

    NASA Astrophysics Data System (ADS)

    Daughtrey, E. Hunter; Adams, Jeffrey R.; Oliver, Karen D.; Kronmiller, Keith G.; McClenny, William A.

    1998-09-01

    A trailer-deployed automated gas chromatograph-mass spectrometer (autoGC-MS) system capable of making continuous hourly measurements was used to determine volatile organic compounds (VOCs) in ambient air at New Hendersonville, Tennessee, and Research Triangle Park, North Carolina, in 1995. The system configuration, including the autoGC-MS, trailer and transfer line, siting, and sampling plan and schedule, is described. The autoGC-MS system employs a pair of matched sorbent traps to allow simultaneous sampling and desorption. Desorption is followed by Stirling engine cryofocusing and subsequent GC separation and mass spectral identification and quantification. Quality control measurements described include evaluating precision and accuracy of replicate analyses of independently supplied audit and round-robin canisters and determining the completeness of the data sets taken in Tennessee. Data quality objectives for precision (±10%) and accuracy (±20%) of 10- to 20-ppbv audit canisters and a completeness of >75% data capture were met. Quality assurance measures used in reviewing the data set include retention time stability, calibration checks, frequency distribution checks, and checks of the mass spectra. Special procedures and tests were used to minimize sorbent trap artifacts, to verify the quality of a standard prepared in our laboratory, and to prove the integrity of the insulated, heated transfer line. A rigorous determination of total system blank concentration levels using humidified scientific air spiked with ozone allowed estimation of method detection limits, ranging from 0.01 to 1.0 ppb C, for most of the 100 target compounds, which were a composite list of the target compounds for the Photochemical Assessment Monitoring Station network, those for Environmental Protection Agency method TO-14, and selected oxygenated VOCs.
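The stated data quality objectives (precision within ±10%, accuracy within ±20% on audit canisters) can be checked mechanically. A sketch with made-up replicate concentrations; the acceptance logic is an illustration, not the study's exact QC procedure:

```python
def percent_diff(measured, reference):
    """Signed percent difference of a measurement from a reference value."""
    return 100.0 * (measured - reference) / reference

def meets_dqo(replicates, audit_value, precision_pct=10.0, accuracy_pct=20.0):
    """Check replicate analyses of an audit canister against data quality
    objectives: every replicate within precision_pct of the replicate mean,
    and the mean within accuracy_pct of the certified audit value."""
    mean = sum(replicates) / len(replicates)
    precision_ok = all(abs(percent_diff(r, mean)) <= precision_pct for r in replicates)
    accuracy_ok = abs(percent_diff(mean, audit_value)) <= accuracy_pct
    return precision_ok and accuracy_ok
```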

  20. Exploring Antarctic Land Surface Temperature Extremes Using Condensed Anomaly Databases

    NASA Astrophysics Data System (ADS)

    Grant, Glenn Edwin

    Satellite observations have revolutionized the Earth Sciences and climate studies. However, data and imagery continue to accumulate at an accelerating rate, and efficient tools for data discovery, analysis, and quality checking lag behind. In particular, studies of long-term, continental-scale processes at high spatiotemporal resolutions are especially problematic. The traditional technique of downloading an entire dataset and using customized analysis code is often impractical or consumes too many resources. The Condensate Database Project was envisioned as an alternative method for data exploration and quality checking. The project's premise was that much of the data in any satellite dataset is unneeded and can be eliminated, compacting massive datasets into more manageable sizes. Dataset sizes are further reduced by retaining only anomalous data of high interest. Hosting the resulting "condensed" datasets in high-speed databases enables immediate availability for queries and exploration. Proof of the project's success relied on demonstrating that the anomaly database methods can enhance and accelerate scientific investigations. The hypothesis of this dissertation is that the condensed datasets are effective tools for exploring many scientific questions, spurring further investigations and revealing important information that might otherwise remain undetected. This dissertation uses condensed databases containing 17 years of Antarctic land surface temperature anomalies as its primary data. The study demonstrates the utility of the condensate database methods by discovering new information. In particular, the process revealed critical quality problems in the source satellite data. The results are used as the starting point for four case studies, investigating Antarctic temperature extremes, cloud detection errors, and the teleconnections between Antarctic temperature anomalies and climate indices. 
The results confirm the hypothesis that the condensate databases are a highly useful tool for Earth Science analyses. Moreover, the quality checking capabilities provide an important method for independent evaluation of dataset veracity.
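The condensation idea of this dissertation, discarding unremarkable data and retaining only anomalies of high interest, can be illustrated in a few lines; the threshold and the (index, value) storage format are assumptions for illustration, not the project's actual schema:

```python
def condense(values, mean, sigma, k=3.0):
    """Retain only anomalous observations, i.e. values more than
    k standard deviations from the reference mean, as (index, value)
    pairs -- a toy version of the condensed-database approach."""
    return [(i, v) for i, v in enumerate(values) if abs(v - mean) > k * sigma]
```

Because most satellite observations fall near climatology, such filtering can shrink a dataset by orders of magnitude while keeping exactly the extremes (and data-quality artifacts) that analyses like these case studies target.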

  1. Low-cost and high-speed optical mark reader based on an intelligent line camera

    NASA Astrophysics Data System (ADS)

    Hussmann, Stephan; Chan, Leona; Fung, Celine; Albrecht, Martin

    2003-08-01

Optical Mark Recognition (OMR) is thoroughly reliable and highly efficient provided that high standards are maintained at both the planning and implementation stages. It is necessary to ensure that OMR forms are designed with due attention to data integrity checks, that the best use is made of features built into the OMR system, and that data integrity and validity are checked before the data are processed. This paper describes the design and implementation of an OMR prototype system for marking multiple-choice tests automatically. Parameter testing was carried out before the platform and the multiple-choice answer sheet were designed. Position recognition and position verification methods have been developed and implemented in an intelligent line scan camera. The position recognition process is implemented in a Field Programmable Gate Array (FPGA), whereas the verification process is implemented in a micro-controller. The verified results are then sent to the Graphical User Interface (GUI) for answer checking and statistical analysis. At the end of the paper, the proposed OMR system is compared with commercially available systems.
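The core decision in marking a scanned answer sheet, judging whether each bubble is filled, can be illustrated with a simple darkness-threshold sketch. This is a hedged illustration only: the paper's position recognition runs in an FPGA and its verification in a micro-controller, and none of the names or thresholds below come from the paper.

```python
import numpy as np

def read_marks(sheet, bubble_boxes, fill_threshold=0.5):
    """Decide which answer bubbles are filled on a scanned sheet.

    sheet: 2-D grayscale array, 0 = black, 255 = white.
    bubble_boxes: {(question, choice): (row0, row1, col0, col1)}.
    A bubble counts as marked when the mean darkness of its box exceeds
    fill_threshold. Illustrative sketch of the decision step only.
    """
    marks = {}
    for key, (r0, r1, c0, c1) in bubble_boxes.items():
        # Mean darkness in the box: 0.0 = blank paper, 1.0 = solid black.
        darkness = 1.0 - sheet[r0:r1, c0:c1].mean() / 255.0
        marks[key] = bool(darkness > fill_threshold)
    return marks

# Tiny synthetic sheet: question 1, choice A filled in, choice B left blank.
sheet = np.full((10, 20), 255, dtype=np.uint8)
sheet[2:6, 2:6] = 10                      # dark pencil marks for bubble A
boxes = {(1, "A"): (2, 6, 2, 6), (1, "B"): (2, 6, 12, 16)}
result = read_marks(sheet, boxes)
```

In a real system the box coordinates come from the position recognition stage, and the verification stage would cross-check them before any answer is scored.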

  2. The rate of cis-trans conformation errors is increasing in low-resolution crystal structures.

    PubMed

    Croll, Tristan Ian

    2015-03-01

    Cis-peptide bonds (with the exception of X-Pro) are exceedingly rare in native protein structures, yet a check for these is not currently included in the standard workflow for some common crystallography packages nor in the automated quality checks that are applied during submission to the Protein Data Bank. This appears to be leading to a growing rate of inclusion of spurious cis-peptide bonds in low-resolution structures both in absolute terms and as a fraction of solved residues. Most concerningly, it is possible for structures to contain very large numbers (>1%) of spurious cis-peptide bonds while still achieving excellent quality reports from MolProbity, leading to concerns that ignoring such errors is allowing software to overfit maps without producing telltale errors in, for example, the Ramachandran plot.
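A basic check for the errors described above is to compute the omega torsion angle of each peptide bond: values near 0° indicate cis, values near ±180° indicate trans. A minimal sketch, assuming atomic coordinates are available as NumPy arrays; the 30° cutoff is an illustrative choice, not a value from the paper.

```python
import numpy as np

def dihedral(p0, p1, p2, p3):
    """Torsion angle in degrees defined by four points (standard formula)."""
    b0, b1, b2 = p0 - p1, p2 - p1, p3 - p2
    b1 = b1 / np.linalg.norm(b1)
    v = b0 - np.dot(b0, b1) * b1          # component of b0 normal to b1
    w = b2 - np.dot(b2, b1) * b1          # component of b2 normal to b1
    return np.degrees(np.arctan2(np.dot(np.cross(b1, v), w), np.dot(v, w)))

def is_cis(ca_i, c_i, n_next, ca_next, tol=30.0):
    """True when the omega torsion CA(i)-C(i)-N(i+1)-CA(i+1) is near 0 deg
    (cis); trans bonds sit near +/-180 deg. tol is illustrative."""
    return abs(dihedral(ca_i, c_i, n_next, ca_next)) < tol

# Idealized coordinates: eclipsed end points (cis-like) vs opposed (trans).
def pts(*ps):
    return [np.array(p, dtype=float) for p in ps]

cis_example = is_cis(*pts((0, 1, 0), (0, 0, 0), (1, 0, 0), (1, 1, 0)))
trans_example = is_cis(*pts((0, 1, 0), (0, 0, 0), (1, 0, 0), (1, -1, 0)))
```

Run over every non-proline residue pair, a check like this would flag exactly the rare cis bonds the paper argues should be scrutinized before deposition.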

  3. Indicator methods to evaluate the hygienic performance of industrial scale operating Biowaste Composting Plants.

    PubMed

    Martens, Jürgen

    2005-01-01

The hygienic performance of biowaste composting plants in ensuring the quality of compost is of high importance. Existing compost quality assurance systems reflect this importance through intensive testing of hygienic parameters. In many countries, compost quality assurance systems are under development, and it is necessary to check and optimize the methods used to assess the hygienic performance of composting plants. A set of indicator methods to evaluate the hygienic performance of normally operating biowaste composting plants was developed. The indicator methods were developed by investigating temperature measurements from indirect process tests at 23 composting plants belonging to 11 design types of the Hygiene Design Type Testing System of the German Compost Quality Association (BGK e.V.). The presented indicator methods are the grade of hygienization, the basic curve shape, and the hygienic risk area. The temperature courses of single plants are not normally distributed, but cluster analysis grouped them into normally distributed subgroups, a precondition for developing these indicator methods. For each plant, the grade of hygienization was calculated through transformation into the standard normal distribution; it gives the percentage of the entire data set that meets the legal temperature requirements. The hygienization grade differs widely within the design types and falls below 50% for about one fourth of the plants. The subgroups are divided visually into basic curve shapes, which represent different process courses. For each plant, the composition of the entire data set from the various basic curve shapes can be used as an indicator of the basic process conditions. Some basic curve shapes indicate abnormal process courses that can be remedied through process optimization. A hygienic risk area concept using the 90% range of variation of the normal temperature courses was introduced. Comparing the design type range of variation with the legal temperature requirements revealed hygienic risk areas over the temperature courses that could be minimized through process optimization. The hygienic risk areas of four design types show suboptimal hygienic performance.
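The grade of hygienization, i.e. the share of the data meeting a legal minimum temperature after transformation into the standard normal distribution, can be sketched as follows. The legal minimum of 55 °C and the sample readings are invented for illustration; actual legal defaults vary by regulation.

```python
from statistics import NormalDist

def hygienization_grade(temps, legal_min=55.0):
    """Percentage of the temperature data expected to meet a legal minimum,
    assuming the readings of one subgroup are approximately normal (the
    paper's cluster analysis step is what justifies this assumption).
    legal_min = 55 degrees C is an illustrative value only."""
    n = len(temps)
    mu = sum(temps) / n
    sigma = (sum((t - mu) ** 2 for t in temps) / (n - 1)) ** 0.5
    # Transformation into the standard normal distribution:
    # P(T >= legal_min) = 1 - Phi((legal_min - mu) / sigma)
    return 100.0 * (1.0 - NormalDist(mu, sigma).cdf(legal_min))

# Invented example readings from a single, well-running subgroup.
grade = hygienization_grade([58.0, 61.5, 60.2, 57.1, 63.0, 59.4])
```

A grade close to 100% indicates a subgroup whose temperature course almost always meets the requirement; a grade below 50%, as reported for about a fourth of the plants, signals a process needing optimization.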

  4. Who benefit from school doctors' health checks: a prospective study of a screening method.

    PubMed

    Nikander, Kirsi; Kosola, Silja; Kaila, Minna; Hermanson, Elina

    2018-06-27

School health services provide an excellent opportunity for the detection and treatment of children at risk of later health problems. However, the optimal use of school doctors' skills and expertise remains unknown. Furthermore, no validated method for screening children for school doctors' assessments exists. The aims of the study are 1) to evaluate the benefits or harms of school doctors' routine health checks in primary school grades 1 and 5 (at ages 7 and 11) and 2) to explore whether some of the school doctors' routine health checks can be omitted using study questionnaires. This is a prospective, multicenter observational study conducted in four urban municipalities in Southern Finland, comparing the need for a school doctor's assessment with the benefit gained from it. We will recruit a random sample of 1050 children from 21 schools in primary school grades 1 and 5. Before the school doctor's health check, parents, nurses, and teachers fill in a study questionnaire to identify any potential concerns about each child. Doctors, blinded to the questionnaire responses, complete an electronic report after the appointment, including the instructions given and follow-up plans. The child, parent, doctor, and researchers assess the benefit of the health check. The researchers compare the need for a doctor's appointment with the benefit gained from it. One year after the health check, we will analyze the implementation of the doctors' interventions and follow-up plans. The study will increase our knowledge of the benefits of school doctors' routine health checks and assess the developed screening method. We hypothesize that targeting the health checks to the children in greatest need would increase the quality of school health services. ClinicalTrials.gov Identifier: NCT03178331; date of registration June 6th, 2017.

  5. EC97-44354-2

    NASA Image and Video Library

    1997-12-16

An image of the F-16XL #1 during its functional flight check of the Digital Flight Control System (DFCS) on December 16, 1997. The mission was flown by NASA research pilot Dana Purifoy and lasted 1 hour and 25 minutes. The tests included pilot familiarization, functional checks, and handling qualities evaluation maneuvers to a speed of Mach 0.6 and 300 knots. Purifoy completed all the briefed data points with no problems and reported that the DFCS handled as well as, if not better than, the analog computer system it replaced.

  6. A novel method for routine quality assurance of volumetric-modulated arc therapy.

    PubMed

    Wang, Qingxin; Dai, Jianrong; Zhang, Ke

    2013-10-01

Volumetric-modulated arc therapy (VMAT) is delivered through synchronized variation of gantry angle, dose rate, and multileaf collimator (MLC) leaf positions. The dynamic nature of the delivery challenges the parameter-setting accuracy of the linac control system. The purpose of this study was to develop a novel method for routine quality assurance (QA) of VMAT linacs. ArcCheck is a detector array with diodes distributed in a spiral pattern on a cylindrical surface. Utilizing its features, a QA plan was designed to strictly test all varying parameters during VMAT delivery on an Elekta Synergy linac. In this plan, there are 24 control points. The gantry rotates clockwise from 181° to 179°. The dose rate, gantry speed, and MLC positions cover the ranges commonly used in the clinic. The two borders of the MLC-shaped field sit over two columns of diodes of the ArcCheck when the gantry rotates to the angle specified by each control point. The ratio of dose rate between each of these diodes and the diode closest to the field center has a definite value and is sensitive to the MLC positioning error of the leaf crossing the diode. Consequently, the positioning error can be determined from the ratio with the help of a relationship curve. The time when the gantry reaches the angle specified by each control point can be acquired from the virtual inclinometer, a feature of the ArcCheck. The gantry speed between two consecutive control points is then calculated. The aforementioned dose rate is calculated from an acm file that is generated during ArcCheck measurements. This file stores the data measured by each detector in 50 ms updates, with each update in a separate row. A computer program was written in MATLAB to process the data. The program output includes the MLC positioning errors and the dose rate at each control point, as well as the gantry speed between control points. To evaluate this method, the plan was delivered for four consecutive weeks.
The actual dose rate and gantry speed were compared with those specified in the QA plan. Additionally, leaf positioning errors were intentionally introduced to investigate the sensitivity of the method. Relationship curves were established for detecting MLC positioning errors during VMAT delivery. Over the four consecutive weeks measured, 98.4%, 94.9%, 89.2%, and 91.0% of the leaf positioning errors were within ±0.5 mm, respectively. For intentionally introduced systematic leaf positioning errors of -0.5 and +1 mm, the detected positioning errors of Y1 leaf 20 were -0.48 ± 0.14 and 1.02 ± 0.26 mm, respectively. The actual gantry speed and dose rate closely followed the values specified in the VMAT QA plan. This method can simultaneously assess the accuracy of MLC positions and the dose rate at each control point, as well as the gantry speed between control points. It is efficient and suitable for routine quality assurance of VMAT.
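Two of the quantities the QA analysis derives, the gantry speed between control points and the leaf positioning error looked up from a relationship curve, can be sketched as follows (in Python rather than the authors' MATLAB). The relationship-curve values below are invented; a real curve must be measured for the specific linac and ArcCheck setup.

```python
import numpy as np

def gantry_speeds(angles_deg, times_s):
    """Mean gantry speed (deg/s) between consecutive control points, from
    the planned angles and the arrival times read off the virtual
    inclinometer. (Wrap-around at 360 deg is ignored in this sketch.)"""
    return np.diff(angles_deg) / np.diff(times_s)

def mlc_error_from_ratio(ratio, curve_ratios, curve_errors_mm):
    """Leaf positioning error looked up from a measured dose-rate ratio by
    linear interpolation along an established relationship curve."""
    return np.interp(ratio, curve_ratios, curve_errors_mm)

# Hypothetical relationship curve: border-diode to centre-diode dose-rate
# ratio versus leaf position error. Real curves must be measured.
curve_r = np.array([0.30, 0.40, 0.50, 0.60, 0.70])
curve_e = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])

speeds = gantry_speeds([181.0, 211.0, 241.0], [0.0, 10.0, 20.0])
err = mlc_error_from_ratio(0.55, curve_r, curve_e)
```

Comparing `speeds` and `err` against the plan-specified values at every control point is exactly the weekly consistency check the abstract describes.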

  7. DNA origami-based standards for quantitative fluorescence microscopy.

    PubMed

    Schmied, Jürgen J; Raab, Mario; Forthmann, Carsten; Pibiri, Enrico; Wünsch, Bettina; Dammeyer, Thorben; Tinnefeld, Philip

    2014-01-01

    Validating and testing a fluorescence microscope or a microscopy method requires defined samples that can be used as standards. DNA origami is a new tool that provides a framework to place defined numbers of small molecules such as fluorescent dyes or proteins in a programmed geometry with nanometer precision. The flexibility and versatility in the design of DNA origami microscopy standards makes them ideally suited for the broad variety of emerging super-resolution microscopy methods. As DNA origami structures are durable and portable, they can become a universally available specimen to check the everyday functionality of a microscope. The standards are immobilized on a glass slide, and they can be imaged without further preparation and can be stored for up to 6 months. We describe a detailed protocol for the design, production and use of DNA origami microscopy standards, and we introduce a DNA origami rectangle, bundles and a nanopillar as fluorescent nanoscopic rulers. The protocol provides procedures for the design and realization of fluorescent marks on DNA origami structures, their production and purification, quality control, handling, immobilization, measurement and data analysis. The procedure can be completed in 1-2 d.

  8. Consumer Mobile Apps for Potential Drug-Drug Interaction Check: Systematic Review and Content Analysis Using the Mobile App Rating Scale (MARS).

    PubMed

    Kim, Ben Yb; Sharafoddini, Anis; Tran, Nam; Wen, Emily Y; Lee, Joon

    2018-03-28

General consumers can now easily access drug information and quickly check for potential drug-drug interactions (PDDIs) through mobile health (mHealth) apps. With an aging population in Canada, more people have chronic diseases and comorbidities, leading to increasing numbers of medications. The use of mHealth apps for checking PDDIs can be helpful in ensuring patient safety and empowerment. The aim of this study was to review the characteristics and quality of publicly available mHealth apps that check for PDDIs. Apple App Store and Google Play were searched to identify apps with PDDI functionality. The apps' general and feature characteristics were extracted. The Mobile App Rating Scale (MARS) was used to assess their quality. A total of 23 apps were included in the review: 12 from Apple App Store and 11 from Google Play. Only 5 of these were paid apps, with an average price of $7.19 CAD. The mean MARS score was 3.23 out of 5 (interquartile range 1.34). The mean MARS scores for the apps from Google Play and Apple App Store were not statistically different (P=.84). The information dimension was associated with the highest score (3.63), whereas the engagement dimension resulted in the lowest score (2.75). The total number of features per app, average rating, and price were significantly associated with the total MARS score. Some apps provided accurate and comprehensive information about potential adverse drug effects from PDDIs. Given the potentially severe consequences of incorrect drug information, there is a need for oversight to eliminate low-quality and potentially harmful apps. Because managing PDDIs is complex in the absence of complete information, secondary features such as medication reminders, refill reminders, medication history tracking, and pill identification could help enhance the effectiveness of PDDI apps. ©Ben YB Kim, Anis Sharafoddini, Nam Tran, Emily Y Wen, Joon Lee.
Originally published in JMIR Mhealth and Uhealth (http://mhealth.jmir.org), 28.03.2018.

  9. Consumer Mobile Apps for Potential Drug-Drug Interaction Check: Systematic Review and Content Analysis Using the Mobile App Rating Scale (MARS)

    PubMed Central

    Kim, Ben YB; Sharafoddini, Anis; Tran, Nam; Wen, Emily Y

    2018-01-01

Background General consumers can now easily access drug information and quickly check for potential drug-drug interactions (PDDIs) through mobile health (mHealth) apps. With an aging population in Canada, more people have chronic diseases and comorbidities, leading to increasing numbers of medications. The use of mHealth apps for checking PDDIs can be helpful in ensuring patient safety and empowerment. Objective The aim of this study was to review the characteristics and quality of publicly available mHealth apps that check for PDDIs. Methods Apple App Store and Google Play were searched to identify apps with PDDI functionality. The apps’ general and feature characteristics were extracted. The Mobile App Rating Scale (MARS) was used to assess the quality. Results A total of 23 apps were included for the review—12 from Apple App Store and 11 from Google Play. Only 5 of these were paid apps, with an average price of $7.19 CAD. The mean MARS score was 3.23 out of 5 (interquartile range 1.34). The mean MARS scores for the apps from Google Play and Apple App Store were not statistically different (P=.84). The information dimension was associated with the highest score (3.63), whereas the engagement dimension resulted in the lowest score (2.75). The total number of features per app, average rating, and price were significantly associated with the total MARS score. Conclusions Some apps provided accurate and comprehensive information about potential adverse drug effects from PDDIs. Given the potentially severe consequences of incorrect drug information, there is a need for oversight to eliminate low quality and potentially harmful apps. Because managing PDDIs is complex in the absence of complete information, secondary features such as medication reminder, refill reminder, medication history tracking, and pill identification could help enhance the effectiveness of PDDI apps. PMID:29592848

  10. Utilisation of preventative health check-ups in the UK: findings from individual-level repeated cross-sectional data from 1992 to 2008

    PubMed Central

    Labeit, Alexander; Peinemann, Frank; Baker, Richard

    2013-01-01

    Objectives To analyse and compare the determinants of screening uptake for different National Health Service (NHS) health check-ups in the UK. Design Individual-level analysis of repeated cross-sectional surveys with balanced panel data. Setting The UK. Participants Individuals taking part in the British Household Panel Survey (BHPS), 1992–2008. Outcome measure Uptake of NHS health check-ups for cervical cancer screening, breast cancer screening, blood pressure checks, cholesterol tests, dental screening and eyesight tests. Methods Dynamic panel data models (random effects panel probit with initial conditions). Results Having had a health check-up 1 year before, and previously in accordance with the recommended schedule, was associated with higher uptake of health check-ups. Individuals who visited a general practitioner (GP) had a significantly higher uptake in 5 of the 6 health check-ups. Uptake was highest in the recommended age group for breast and cervical cancer screening. For all health check-ups, age had a non-linear relationship. Lower self-rated health status was associated with increased uptake of blood pressure checks and cholesterol tests; smoking was associated with decreased uptake of 4 health check-ups. The effects of socioeconomic variables differed for the different health check-ups. Ethnicity did not have a significant influence on any health check-up. Permanent household income had an influence only on eyesight tests and dental screening. Conclusions Common determinants for having health check-ups are age, screening history and a GP visit. Policy interventions to increase uptake should consider the central role of the GP in promoting screening examinations and in preserving a high level of uptake. Possible economic barriers to access for prevention exist for dental screening and eyesight tests, and could be a target for policy intervention. Trial registration This observational study was not registered. PMID:24366576

  11. Data services providing by the Ukrainian NODC (MHI NASU)

    NASA Astrophysics Data System (ADS)

    Eremeev, V.; Godin, E.; Khaliulin, A.; Ingerov, A.; Zhuk, E.

    2009-04-01

At the modern stage of World Ocean study, information support of investigations based on advanced computer technologies becomes of particular importance. These abstracts present several data services developed in the Ukrainian NODC on the basis of the Marine Environmental and Information Technologies Department of MHI NASU. The Data Quality Control Service: Using experience of international collaboration in the field of data collection and quality check, we have developed quality control (QC) software providing both preliminary (automatic) and expert (manual) data quality check procedures. The current version of the QC software works for the Mediterranean and Black seas and includes climatic arrays for hydrological and a few hydrochemical parameters based on such products as MEDAR/MEDATLAS II, Physical Oceanography of the Black Sea, and the Climatic Atlas of Oxygen and Hydrogen Sulfide in the Black Sea. The data quality check procedure includes metadata control and hydrological and hydrochemical data control. Metadata control provides checking of duplicate cruises and profiles, date and chronology, ship velocity, station location, sea depth, and observation depth. The data QC procedure includes climatic (or range, for parameters with a small number of observations) data QC, a density inversion check for hydrological data, and searching for spikes. Using climatic fields and profiles prepared by regional oceanography experts leads to more reliable results of the data quality check procedure. The Data Access Services: The Ukrainian NODC provides two products for data access, on-line software and a data access module for the MHI NASU local net. The on-line software allows selecting data by rectangular area, date, month, and cruise. The result of a query is metadata, presented in a table together with a visual presentation of stations on a map. It is possible to see both metadata and data; for this purpose it is necessary to select a station in the table of metadata or on the map. There is also an opportunity to export data in ODV format. The product is available at http://www.ocean.nodc.org.ua/DataAccess.php The local net version provides access to the oceanological database of the MHI NASU. The current version allows selecting data by spatial and temporal limits, depth, values of parameters, and quality flags, and works for the Mediterranean and Black seas. It provides visualization of metadata and data, statistics of data selection, and data export into several data formats. The Operational Data Management Services: The collaborators of the MHI Experimental Branch developed a system for obtaining information on water pressure and temperature, as well as on atmospheric pressure. Sea level observations are also conducted. The obtained data are transferred online. An interface for operational data access was developed. It allows selecting parameters (sea level, water temperature, atmospheric pressure, wind, and water pressure) and a time interval to see parameter graphics. The product is available at http://www.ocean.nodc.org.ua/Katsively.php . The Climatic Products: The current version of the Climatic Atlas includes maps of such parameters as temperature, salinity, density, heat storage, dynamic heights, the upper boundary of hydrogen sulfide, and the lower boundary of oxygen for the Black Sea basin. Maps of temperature, salinity, and density were calculated on 19 standard depths and averaged monthly for depths of 0-300 m and annually for lower depths. The climatic maps of the upper boundary of hydrogen sulfide and the lower boundary of oxygen were averaged by decade, from the 1920s to the 1990s, and by season. Two versions of the climatic atlas viewer, on-line and desktop, were developed for presentation of the climatic maps. They provide similar functions for selecting and viewing maps by parameter, month, and depth, and for saving maps in various formats. The on-line version of the atlas is available at http://www.ocean.nodc.org.ua/Main_Atlas.php .
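Automatic QC checks of the kind listed above (range/climatic tests, spike detection, and a density inversion check) can be sketched as follows. The flag convention, thresholds, and valid ranges are illustrative assumptions, not the NODC's actual values.

```python
def qc_temperature_profile(temps, valid_range=(-2.0, 35.0), spike_limit=3.0):
    """Range test plus a simple spike test on a temperature profile.
    Flags: 1 = good, 4 = bad. Thresholds are illustrative assumptions."""
    flags = [1] * len(temps)
    lo, hi = valid_range
    for i, t in enumerate(temps):
        if not lo <= t <= hi:
            flags[i] = 4                  # outside the plausible range
    for i in range(1, len(temps) - 1):
        # Spike: large deviation from the mean of the two neighbours.
        if abs(temps[i] - (temps[i - 1] + temps[i + 1]) / 2.0) > spike_limit:
            flags[i] = 4
    return flags

def qc_density_inversion(densities):
    """Density should not decrease with depth; flag points where it does."""
    flags = [1] * len(densities)
    for i in range(1, len(densities)):
        if densities[i] < densities[i - 1]:
            flags[i] = 4
    return flags

# Invented profiles: one temperature spike, one density inversion.
temp_flags = qc_temperature_profile([14.2, 13.9, 18.6, 13.3, 13.0])
dens_flags = qc_density_inversion([1025.0, 1025.4, 1025.2, 1026.0])
```

In the service described above, points flagged this way would then be passed to the expert (manual) QC step rather than silently discarded.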

  12. 40 CFR Appendix B to Part 75 - Quality Assurance and Quality Control Procedures

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... in section 2.3 of this appendix and the Hg emission tests described in §§ 75.81(c) and 75.81(d)(4). 1.2Specific Requirements for Continuous Emissions Monitoring Systems 1.2.1Calibration Error Test and Linearity Check Procedures Keep a written record of the procedures used for daily calibration error tests and...

  13. Basal Area Growth Estimators for Survivor Component: A Quality Control Application

    Treesearch

    Charles E. Thomas; Francis A. Roesch

    1990-01-01

    Several possible estimators are available for basal area growth of survivor trees, when horizontal prism (or point) plots (HPP) are remeasured. This study's comparison of three estimators not only provides a check for the estimate of basal area growth but suggests that they can provide a quality control indicator for yield procedures. An example is derived from...

  14. Methods developed to elucidate nursing related adverse events in Japan.

    PubMed

    Yamagishi, Manaho; Kanda, Katsuya; Takemura, Yukie

    2003-05-01

Financial resources for quality assurance in Japanese hospitals are limited, and few hospitals have systems for monitoring the quality of their nursing services. Recently, however, their necessity has been recognized. This study cost-effectively used adverse event occurrence rates as indicators of the quality of nursing service, and audited methods of collecting data on adverse events to elucidate their approximate true numbers. Data collection was conducted in July, August, and November 2000 at a hospital in Tokyo that administered both primary and secondary health care services (281 beds, six wards, average length of stay 23 days). We collected adverse events through incident reports, logs, checklists, nurse interviews, medication error questionnaires, urine leucocyte tests, patient interviews, and medical records. Adverse events included unplanned removals of invasive lines, medication errors, falls, pressure sores, skin deficiencies, physical restraints, and nosocomial infections. After evaluating the time required and the useful outcomes of each source, it became clear that adverse events could be elucidated most consistently and cost-effectively through incident reports, checklists, nurse interviews, urine leucocyte tests, and medication error questionnaires. This study suggests that many hospitals in Japan could monitor the quality of their nursing services using these sources.

  15. IEC 61511 and the capital project process--a protective management system approach.

    PubMed

    Summers, Angela E

    2006-03-17

This year, the process industry reached an important milestone in process safety: the acceptance of an internationally recognized standard for safety instrumented systems (SIS). This standard, IEC 61511, documents good engineering practice for the assessment, design, operation, maintenance, and management of SISs. The foundation of the standard is established by several requirements in Part 1, Clauses 5-7, which cover the development of a management system aimed at ensuring that functional safety is achieved. The management system includes a quality assurance process for the entire SIS lifecycle, requiring the development of procedures, identification of resources, and acquisition of tools. For maximum benefit, the deliverables and quality control checks required by the standard should be integrated into the capital project process, addressing safety, environmental, plant productivity, and asset protection. Industry has become inundated with a multitude of programs focusing on safety, quality, and cost performance. This paper introduces a protective management system, which builds upon the work process identified in IEC 61511. Typical capital project phases are integrated with the management system to yield one comprehensive program for efficiently managing process risk. Finally, the paper highlights areas where internal practices or guidelines should be developed to improve program performance and cost effectiveness.

  16. Neurofeedback Training as a New Method in Treatment of Crystal Methamphetamine Dependent Patients: A Preliminary Study.

    PubMed

    Rostami, R; Dehghani-Arani, F

    2015-09-01

This study aimed to compare the effectiveness of neurofeedback (NFB) plus pharmacotherapy with pharmacotherapy alone on addiction severity, mental health, and quality of life in crystal methamphetamine-dependent (CMD) patients. The study included 100 CMD patients undergoing medical treatment who volunteered for this randomized controlled trial. After being evaluated with a battery of questionnaires that included the Addiction Severity Index, the Symptom Checklist-90, and the World Health Organization Quality of Life questionnaire, the participants were randomly assigned to an experimental or a control group. The experimental group received thirty 50-min sessions of NFB in addition to their usual medication over a 2-month period; meanwhile, the control group received only their usual medication. In accordance with the study's pre-test-post-test design, both groups were evaluated again after completing their respective treatment regimens. Multivariate analysis of covariance showed the experimental group to have lower severity of addiction, better psychological health, and better quality of life than the control group. The differences between the two groups were statistically significant. These findings suggest that NFB can be used to improve treatment results in CMD patients.

  17. Improving Quality and Reducing Waste in Allied Health Workplace Education Programs: A Pragmatic Operational Education Framework Approach.

    PubMed

    Golder, Janet; Farlie, Melanie K; Sevenhuysen, Samantha

    2016-01-01

    Efficient utilisation of education resources is required for the delivery of effective learning opportunities for allied health professionals. This study aimed to develop an education framework to support delivery of high-quality education within existing education resources. This study was conducted in a large metropolitan health service. Homogenous and purposive sampling methods were utilised in Phase 1 (n=43) and 2 (n=14) consultation stages. Participants included 25 allied health professionals, 22 managers, 1 educator, and 3 executives. Field notes taken during 43 semi-structured interviews and 4 focus groups were member-checked, and semantic thematic analysis methods were utilised. Framework design was informed by existing published framework development guides. The framework model contains governance, planning, delivery, and evaluation and research elements and identifies performance indicators, practice examples, and support tools for a range of stakeholders. Themes integrated into framework content include improving quality of education and training provided and delivery efficiency, greater understanding of education role requirements, and workforce support for education-specific knowledge and skill development. This framework supports efficient delivery of allied health workforce education and training to the highest standard, whilst pragmatically considering current allied health education workforce demands.

  18. An efficient visualization method for analyzing biometric data

    NASA Astrophysics Data System (ADS)

    Rahmes, Mark; McGonagle, Mike; Yates, J. Harlan; Henning, Ronda; Hackett, Jay

    2013-05-01

We introduce a novel application for biometric data analysis. This technology can be used as part of a unique and systematic approach designed to augment existing processing chains. Our system provides image quality control and analysis capabilities. We show how analysis and efficient visualization are used as part of an automated process. The goal of this system is to provide a unified platform for the analysis of biometric images that reduces manual effort and increases the likelihood of a match being brought to an examiner's attention from either a manual or lights-out application. We discuss the functionality of FeatureSCOPE™, which provides an efficient tool for feature analysis and quality control of extracted biometric features. Biometric databases must be checked for accuracy across a large volume of data attributes. Our solution accelerates the review of features by a factor of up to 100. Qualitative results and cost reduction are demonstrated by using efficient parallel visual review for quality control. Our process automatically sorts and filters features for examination and packs them into a condensed view. An analyst can then rapidly page through screens of features, flagging and annotating outliers as necessary.

  19. Locomotor dysfunction and risk of cardiovascular disease, quality of life, and medical costs: design of the Locomotive Syndrome and Health Outcome in Aizu Cohort Study (LOHAS) and baseline characteristics of the study population.

    PubMed

    Otani, Koji; Takegami, Misa; Fukumori, Norio; Sekiguchi, Miho; Onishi, Yoshihiro; Yamazaki, Shin; Ono, Rei; Otoshi, Kenichi; Hayashino, Yasuaki; Fukuhara, Shunichi; Kikuchi, Shin-Ichi; Konno, Shin-Ichi

    2012-05-01

    There is little evidence regarding long-term outcomes of locomotor dysfunction such as cardiovascular events, quality of life, and death. We are conducting a prospective cohort study to evaluate risk of cardiovascular disease, quality of life, medical costs, and mortality attributable to locomotor dysfunction. The present study determined baseline characteristics of participants in the Locomotive Syndrome and Health Outcome in Aizu Cohort Study (LOHAS). Cohort participants were recruited from residents between 40 and 80 years old who received regular health check-ups conducted by local government each year between 2008 and 2010 in Minami-Aizu Town and Tadami Town in Fukushima Prefecture, Japan. Musculoskeletal examination included physical examination of the cervical and lumbar spine and the upper and lower extremities, and assessment of physical function, such as grip strength, one-leg standing time, and time for the 3-m timed up-and-go test. Cardiovascular risk factors, including blood pressure and biological parameters, were measured at annual health check-ups. We also conducted a self-administered questionnaire survey. LOHAS participants comprised 1,289 men (mean age 65.7 years) and 1,954 women (mean age 66.2 years) in the first year. The proportion of obese individuals (body mass index ≥25.0 kg/m²) was 31.9% in men and 34.3% in women, and 41.0% of participants reported being followed up for hypertension, 7.0% for diabetes, and 43.6% for hypercholesterolemia. Prevalence of lumbar spinal stenosis was 10.7% in men and 12.9% in women, while prevalence of low back pain was 15.8% in men and 17.6% in women. The LOHAS is a novel population-based prospective cohort study that will provide an opportunity to estimate the risk of cardiovascular disease, quality of life, medical costs, and mortality attributable to locomotor dysfunction, and to provide the epidemiological information required to develop policies for detection of locomotor dysfunction.

  20. A New Quality Control Method Based on IRMCD for Wind Profiler Observation towards Future Assimilation Application

    NASA Astrophysics Data System (ADS)

    Chen, Min; Zhang, Yu

    2017-04-01

    A wind profiler network with a total of 65 profiling radars was operated by the MOC/CMA in China until July 2015. In this study, a quality control procedure is constructed to incorporate the profiler data from the wind-profiling network into the local data assimilation and forecasting system (BJRUC). The procedure applies a blacklisting check that removes stations with gross errors and an outlier check that rejects data with large deviations from the background. Instead of the bi-weighting method, which has been commonly implemented in outlier elimination for one-dimensional scalar observations, an outlier elimination method is developed based on the iterated reweighted minimum covariance determinant (IRMCD) for multivariate observations such as wind profiler data. Quality control experiments are performed separately on subsets of profiler data tagged with and without rain flags at every 00 UTC and 12 UTC from 20 June to 30 September 2015. From the results, we find that with the quality control, the frequency distributions of the differences between the observations and the model background become more Gaussian-like and meet the Gaussian-distribution requirements of data assimilation. Further intensive assessment of each quality control step reveals that the stations rejected by blacklisting produce poor-quality data, and that the IRMCD rejects outliers in a robust and physically reasonable manner.
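    The iterated rejection idea behind such multivariate outlier checks can be sketched as follows (a simplified illustration using classical mean/covariance estimates rather than the high-breakdown MCD estimator of the full IRMCD; all names are hypothetical):

```python
import math

def mean_cov(points):
    # Sample mean and 2x2 covariance (a, b, b, d) of (u, v) wind components.
    n = len(points)
    mu = (sum(p[0] for p in points) / n, sum(p[1] for p in points) / n)
    a = sum((p[0] - mu[0]) ** 2 for p in points) / (n - 1)
    d = sum((p[1] - mu[1]) ** 2 for p in points) / (n - 1)
    b = sum((p[0] - mu[0]) * (p[1] - mu[1]) for p in points) / (n - 1)
    return mu, (a, b, b, d)

def mahalanobis_sq(pt, mu, cov):
    # Squared Mahalanobis distance via the explicit 2x2 inverse.
    a, b, _, d = cov
    det = a * d - b * b
    du, dv = pt[0] - mu[0], pt[1] - mu[1]
    return (d * du * du - 2 * b * du * dv + a * dv * dv) / det

def reject_outliers(points, p=0.999, max_iter=20):
    # Iteratively drop points whose squared distance exceeds the
    # chi-square quantile; for 2 d.o.f. the quantile is exactly -2*ln(1-p).
    cutoff = -2.0 * math.log(1.0 - p)
    keep = list(points)
    for _ in range(max_iter):
        mu, cov = mean_cov(keep)
        kept = [pt for pt in keep if mahalanobis_sq(pt, mu, cov) <= cutoff]
        if len(kept) == len(keep):
            break
        keep = kept
    return keep
```

    Iterating matters because a gross outlier inflates the covariance on the first pass, masking smaller outliers that only become detectable once the estimate tightens.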

  1. Symbolic LTL Compilation for Model Checking: Extended Abstract

    NASA Technical Reports Server (NTRS)

    Rozier, Kristin Y.; Vardi, Moshe Y.

    2007-01-01

    In Linear Temporal Logic (LTL) model checking, we check LTL formulas representing desired behaviors against a formal model of the system designed to exhibit these behaviors. To accomplish this task, the LTL formulas must be translated into automata [21]. We focus on LTL compilation by investigating LTL satisfiability checking via a reduction to model checking. Having shown that symbolic LTL compilation algorithms are superior to explicit automata construction algorithms for this task [16], we concentrate here on seeking a better symbolic algorithm. We present experimental data comparing algorithmic variations such as normal forms, encoding methods, and variable ordering and examine their effects on performance metrics including processing time and scalability. Safety-critical systems, such as air traffic control, life support systems, hazardous environment controls, and automotive control systems, pervade our daily lives, yet testing and simulation alone cannot adequately verify their reliability [3]. Model checking is a promising approach to formal verification for safety-critical systems which involves creating a formal mathematical model of the system and translating desired safety properties into a formal specification for this model. The complement of the specification is then checked against the system model. When the model does not satisfy the specification, model-checking tools accompany this negative answer with a counterexample, which points to an inconsistency between the system and the desired behaviors and aids debugging efforts.
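    For safety properties, the model-checking task described above reduces to reachability: search the state graph for a reachable state that violates the property, and return the path to it as the counterexample. A minimal explicit-state sketch (illustrative only; the paper's subject is symbolic compilation, which represents state sets symbolically rather than enumerating them):

```python
from collections import deque

def check_invariant(init_states, successors, invariant):
    """Explicit-state breadth-first check of a safety invariant.
    Returns None if the invariant holds in every reachable state,
    otherwise a shortest counterexample path from an initial state."""
    parent = {s: None for s in init_states}
    queue = deque(init_states)
    while queue:
        s = queue.popleft()
        if not invariant(s):
            path = []           # reconstruct the path back to an initial state
            while s is not None:
                path.append(s)
                s = parent[s]
            return list(reversed(path))
        for t in successors(s):
            if t not in parent:
                parent[t] = s
                queue.append(t)
    return None
```

    For example, on a counter that cycles through 0..4, checking the invariant "state is never 3" yields the counterexample path [0, 1, 2, 3].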

  2. What do patients know about their low back pain? An analysis of the quality of information available on the Internet.

    PubMed

    Galbusera, Fabio; Brayda-Bruno, Marco; Freutel, Maren; Seitz, Andreas; Steiner, Malte; Wehrle, Esther; Wilke, Hans-Joachim

    2012-01-01

    Previous surveys showed a poor quality of the web sites providing health information about low back pain. However, the rapid and continuous evolution of Internet content may call into question the current validity of those investigations. The present study aims to quantitatively assess the quality of Internet information about low back pain retrieved with the most commonly employed search engines. An Internet search with the keywords "low back pain" was performed with Google, Yahoo!® and Bing™ in the English language. The top 30 hits obtained with each search engine were evaluated by five independent raters, and the ratings were averaged, following criteria derived from previous works. All search results were categorized by whether they declared compliance with a quality standard for health information (e.g. HONCode) and by web site type (Institutional, Free informative, Commercial, News, Social Network, Unknown). The quality of the hits retrieved by the three search engines was extremely similar. The web sites had a clear purpose and were easy to navigate, but mostly lacked validity and quality of the provided links. Conformity to a quality standard was correlated with markedly greater quality of the web sites in all respects. Institutional web sites had the best validity and ease of use. Free informative web sites had good quality but markedly lower validity compared to Institutional web sites. Commercial web sites provided more biased information. News web sites were well designed and easy to use, but lacked validity. The average quality of the hits retrieved by the most commonly employed search engines could be defined as satisfactory and compares favorably with previous investigations. User awareness of the need to check the quality of the information remains a concern.

  3. National Contaminant Occurrence Database (NCOD)

    EPA Pesticide Factsheets

    This site describes water sample analytical data that EPA is currently using and has used in the past for analysis, rulemaking, and rule evaluation. The data have been checked for data quality and analyzed for national representativeness.

  4. WIM data analyst's manual

    DOT National Transportation Integrated Search

    2010-06-01

    This manual provides information and recommended procedures to be utilized by an agency's Weigh-in-Motion (WIM) Office Data Analyst to perform validation and quality control (QC) checks of WIM traffic data. This manual focuses on data generated by ...

  5. Distress | Springboard Beyond Cancer

    Cancer.gov

    Distress may affect your ability to cope with a cancer diagnosis or treatment. It may cause you to miss check-ups or delay treatment. Even mild distress can affect the quality of life for you and your caregivers.

  6. [ISO 9001:2015 Certification in Quality Management].

    PubMed

    Enders, Christian; Lang, Gabriele E; Lang, Gerhard K; Werner, Jens Ulrich

    2017-07-01

    Quality management improves the structures, processes and results of organizations of all kinds. Many practices and clinics have their existing quality management system certified according to ISO 9001 (e.g., to check their own quality management system or to obtain a testimonial against third parties). The latest version, ISO 9001:2015, contains some changes, both structurally and in terms of content. These changes can be met with reasonable effort. An ISO 9001:2015 certification represents a value for your organization, but its advantages are often not directly measurable. Georg Thieme Verlag KG Stuttgart · New York.

  7. Extremely accurate sequential verification of RELAP5-3D

    DOE PAGES

    Mesina, George L.; Aumiller, David L.; Buschman, Francis X.

    2015-11-19

    Large computer programs like RELAP5-3D solve complex systems of governing, closure and special process equations to model the underlying physics of nuclear power plants. Further, these programs incorporate many other features for physics, input, output, data management, user-interaction, and post-processing. For software quality assurance, the code must be verified and validated before being released to users. For RELAP5-3D, verification and validation are restricted to nuclear power plant applications. Verification means ensuring that the program is built right by checking that it meets its design specifications, comparing coding to algorithms and equations, and comparing calculations against analytical solutions and the method of manufactured solutions. Sequential verification performs these comparisons initially, but thereafter only compares code calculations between consecutive code versions to demonstrate that no unintended changes have been introduced. Recently, an automated, highly accurate sequential verification method has been developed for RELAP5-3D. The method also tests that no unintended consequences result from code development in the following code capabilities: repeating a timestep advancement, continuing a run from a restart file, multiple cases in a single code execution, and modes of coupled/uncoupled operation. In conclusion, mathematical analyses of the adequacy of the checks used in the comparisons are provided.
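    The core comparison step of sequential verification, diffing calculations between consecutive code versions against a tight tolerance, can be sketched generically (a hedged illustration; variable names and data layout are assumptions, not the RELAP5-3D implementation):

```python
def sequential_verify(old_run, new_run, rel_tol=1e-12):
    """Compare per-variable time histories from two consecutive code
    versions; return a list of (variable, detail) discrepancies."""
    diffs = []
    for name in sorted(set(old_run) | set(new_run)):
        old = old_run.get(name)
        new = new_run.get(name)
        if old is None or new is None or len(old) != len(new):
            diffs.append((name, "missing variable or length mismatch"))
            continue
        for step, (a, b) in enumerate(zip(old, new)):
            # Relative tolerance with an absolute floor of 1.0 near zero.
            if abs(a - b) > rel_tol * max(abs(a), abs(b), 1.0):
                diffs.append((name, f"first difference at step {step}"))
                break
    return diffs
```

    An empty result means the new version reproduces the old calculations to within the tolerance; any entry flags an unintended change for investigation.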

  8. A performance improvement case study in aircraft maintenance and its implications for hazard identification.

    PubMed

    Ward, Marie; McDonald, Nick; Morrison, Rabea; Gaynor, Des; Nugent, Tony

    2010-02-01

    Aircraft maintenance is a highly regulated, safety critical, complex and competitive industry. There is a need to develop innovative solutions to address process efficiency without compromising safety and quality. This paper presents the case that in order to improve a highly complex system such as aircraft maintenance, it is necessary to develop a comprehensive and ecologically valid model of the operational system, which represents not just what is meant to happen, but what normally happens. This model then provides the backdrop against which to change or improve the system. A performance report, the Blocker Report, specific to aircraft maintenance and related to the model was developed gathering data on anything that 'blocks' task or check performance. A Blocker Resolution Process was designed to resolve blockers and improve the current check system. Significant results were obtained for the company in the first trial and implications for safety management systems and hazard identification are discussed. Statement of Relevance: Aircraft maintenance is a safety critical, complex, competitive industry with a need to develop innovative solutions to address process and safety efficiency. This research addresses this through the development of a comprehensive and ecologically valid model of the system linked with a performance reporting and resolution system.

  9. Data Quality Control of the French Permanent Broadband Network in the RESIF Framework

    NASA Astrophysics Data System (ADS)

    Grunberg, Marc; Lambotte, Sophie; Engels, Fabien; Dretzen, Remi; Hernandez, Alain

    2014-05-01

    In the framework of the RESIF (Réseau Sismologique et géodésique Français) project, a new information system is being set up to improve the management and distribution of high-quality data from the different elements of RESIF and the associated networks. Within this information system, EOST (in Strasbourg) is in charge of collecting real-time permanent broadband seismic waveforms and performing Quality Control on these data. The real-time and validated data sets are pushed to the French National Distribution Center (Isterre/Grenoble) in order to make them publicly available. Furthermore, EOST hosts the BCSF-ReNaSS, in charge of the French metropolitan seismic bulletin. This makes it possible to benefit from high-end quality control based on the national and worldwide seismicity. Here we present first the real-time seismic data flow from the stations of the French National Broadband Network to EOST, and then the data Quality Control procedures that were recently installed, including some new developments. The data Quality Control consists of applying a variety of subprocesses to check the consistency of the whole system and process, from the stations to the data center. This allows us to verify that instruments and data transmission are operating correctly. Moreover, analysis of the ambient noise helps to characterize the intrinsic seismic quality of the stations and to identify other kinds of disturbances. The deployed Quality Control consists of a pipeline that starts with low-level procedures: checking the real-time miniSEED data files (file naming convention, data integrity), checking for inconsistencies between waveforms and metadata (channel name, sample rate, etc.), and computing waveform statistics (data availability, gaps/overlaps, mean, rms, time quality, spikes).
It is followed by high-level procedures such as power spectral density (PSD) computation, STA/LTA computation to be correlated with the seismicity, phase picking, and station magnitude discrepancies. The results of quality control are visualized through a web interface. The latter gathers data from different information systems to provide a global view of recent events that could impact the data (such as interventions on site or seismic events). This work is still an ongoing project. We intend to add more sophisticated procedures to enhance our data Quality Control. Among them, we will deploy a seismic moment tensor inversion tool for amplitude, time and polarity control, and a noise correlation procedure for time-drift detection.
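    The low-level waveform statistics listed above (availability, gaps/overlaps, mean, rms, spikes) can be sketched for one channel as follows (a simplified illustration, not EOST's implementation; the segment bookkeeping and the 6-sigma spike threshold are assumptions):

```python
import math

def waveform_stats(samples, segments, expected_span, spike_sigma=6.0):
    """Low-level QC statistics for one channel: availability, gap and
    overlap counts from contiguous-data segments, mean, rms, spike count.
    segments is a list of (start, end) times in seconds, assumed sorted."""
    covered = sum(end - start for start, end in segments)
    # A gap is a hole between consecutive segments; an overlap, a collision.
    gaps = sum(1 for (_, e0), (s1, _) in zip(segments, segments[1:]) if s1 > e0)
    overlaps = sum(1 for (_, e0), (s1, _) in zip(segments, segments[1:]) if s1 < e0)
    n = len(samples)
    mean = sum(samples) / n
    rms = math.sqrt(sum(x * x for x in samples) / n)
    std = math.sqrt(sum((x - mean) ** 2 for x in samples) / n)
    spikes = sum(1 for x in samples
                 if std > 0 and abs(x - mean) > spike_sigma * std)
    return {"availability": covered / expected_span, "gaps": gaps,
            "overlaps": overlaps, "mean": mean, "rms": rms, "spikes": spikes}
```

    Such per-channel summaries are cheap to compute daily and make silent telemetry problems (dropouts, duplicated packets, dead sensors) visible at a glance.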

  10. Quality Assurance Specifications for Planetary Protection Assays

    NASA Astrophysics Data System (ADS)

    Baker, Amy

    As the European Space Agency planetary protection (PP) activities move forward to support the ExoMars and other planetary missions, it will become necessary to increase staffing of laboratories that provide analyses for these programs. Standardization of procedures, a comprehensive quality assurance program, and unilateral training of personnel will be necessary to ensure that the planetary protection goals and schedules are met. The PP Quality Assurance/Quality Control (QAQC) program is designed to regulate and monitor procedures performed by laboratory personnel to ensure that all work meets data quality objectives through the assembly and launch process. Because personnel time is at a premium and sampling schedules are often dependent on engineering schedules, it is necessary to have flexible staffing to support all sampling requirements. The most productive approach to having a competent and flexible work force is to establish well defined laboratory procedures and training programs that clearly address the needs of the program and the work force. The quality assurance specification for planetary protection assays has to ensure that laboratories and associated personnel can demonstrate the competence to perform assays according to the applicable standard AD4. Detailed subjects included in the presentation are as follows: • field and laboratory control criteria • data reporting • personnel training requirements and certification • laboratory audit criteria. Based upon RD2 for primary and secondary validation and RD3 for data quality objectives, the QAQC will provide traceable quality assurance safeguards by providing structured laboratory requirements for guidelines and oversight including training and technical updates, standardized documentation, standardized QA/QC checks, data review and data archiving.

  11. Compositional schedulability analysis of real-time actor-based systems.

    PubMed

    Jaghoori, Mohammad Mahdi; de Boer, Frank; Longuet, Delphine; Chothia, Tom; Sirjani, Marjan

    2017-01-01

    We present an extension of the actor model with real-time, including deadlines associated with messages, and explicit application-level scheduling policies, e.g., "earliest deadline first", which can be associated with individual actors. Schedulability analysis in this setting amounts to checking whether, given a scheduling policy for each actor, every task is processed within its designated deadline. To check schedulability, we introduce a compositional automata-theoretic approach, based on maximal use of model checking combined with testing. Behavioral interfaces define what an actor expects from the environment, and the deadlines for messages given these assumptions. We use model checking to verify that actors match their behavioral interfaces. We extend timed automata refinement with the notion of deadlines and use it to define compatibility of actor environments with the behavioral interfaces. Model checking of compatibility is computationally hard, so we propose a special testing process. We show that the analyses are decidable and automate the process using the Uppaal model checker.
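    As a toy illustration of the schedulability question (does every task finish by its deadline under a given policy?), one can simulate single-processor preemptive EDF directly. This sketch is a simulation-based check under assumed job parameters, not the paper's compositional automata-theoretic analysis:

```python
import heapq

def edf_feasible(jobs):
    """jobs: list of (release, wcet, absolute_deadline) tuples.
    Simulate preemptive earliest-deadline-first on one processor and
    report whether every job completes by its deadline."""
    jobs = sorted(jobs)                 # by release time
    ready = []                          # heap of [deadline, remaining_wcet]
    t, i = 0.0, 0
    while i < len(jobs) or ready:
        if not ready:                   # idle until the next release
            t = max(t, jobs[i][0])
        while i < len(jobs) and jobs[i][0] <= t:
            heapq.heappush(ready, [jobs[i][2], jobs[i][1]])
            i += 1
        deadline, remaining = heapq.heappop(ready)
        # Run the earliest-deadline job until it finishes or a new release.
        horizon = jobs[i][0] if i < len(jobs) else t + remaining
        run = min(remaining, horizon - t)
        t += run
        remaining -= run
        if remaining > 1e-12:           # preempted: back into the queue
            heapq.heappush(ready, [deadline, remaining])
        elif t > deadline + 1e-12:      # completed past its deadline
            return False
    return True
```

    Simulation only witnesses one concrete scenario, which is why the paper resorts to model checking: the automata-theoretic analysis covers all possible interleavings rather than a single trace.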

  12. Building a Quality Controlled Database of Meteorological Data from NASA Kennedy Space Center and the United States Air Force's Eastern Range

    NASA Technical Reports Server (NTRS)

    Brenton, James C.; Barbre, Robert E., Jr.; Decker, Ryan K.; Orcutt, John M.

    2018-01-01

    The National Aeronautics and Space Administration's (NASA) Marshall Space Flight Center (MSFC) Natural Environments Branch (EV44) has provided atmospheric databases and analysis in support of space vehicle design and day-of-launch operations for NASA and commercial launch vehicle programs launching from the NASA Kennedy Space Center (KSC), co-located on the United States Air Force's Eastern Range (ER) at the Cape Canaveral Air Force Station. The ER complex is one of the most heavily instrumented sites in the United States, with over 31 towers measuring various atmospheric parameters on a continuous basis. An inherent challenge with large sets of data is ensuring that erroneous data are removed from databases and thus excluded from launch vehicle design analyses. EV44 has put forth great effort in developing quality control (QC) procedures for individual meteorological instruments; however, no standard QC procedures currently exist for all databases, resulting in QC databases that have inconsistencies in variables, methodologies, and periods of record. The goal of this activity is to use the previous efforts by EV44 to develop a standardized set of QC procedures from which to build meteorological databases from KSC and the ER, while maintaining open communication with end users from the launch community to develop ways to improve, adapt and grow the QC database. Details of the QC procedures will be described. As the rate of launches increases with additional launch vehicle programs, it is becoming more important that weather databases are continually updated and checked for data quality before use in launch vehicle design and certification analyses.
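    A standardized QC pass of the sort described, applied uniformly across databases, typically combines range, step (spike), and persistence checks. A minimal sketch (illustrative only; the function name and thresholds are hypothetical, not EV44's procedures):

```python
def qc_flags(series, valid_min, valid_max, max_step, persist_n):
    """Flag each sample in a tower time series with the first failed
    check: 'range', 'step', 'persistence', or 'ok'. Runs of identical
    values are tracked only over samples passing the earlier checks."""
    flags = []
    run = 1  # length of the current run of identical values
    for k, x in enumerate(series):
        if not (valid_min <= x <= valid_max):
            flags.append("range")        # physically implausible value
            continue
        if k > 0 and abs(x - series[k - 1]) > max_step:
            flags.append("step")         # implausible jump between samples
            continue
        run = run + 1 if k > 0 and x == series[k - 1] else 1
        flags.append("persistence" if run >= persist_n else "ok")
    return flags
```

    For example, with a valid temperature range of -40 to 50, a maximum step of 5, and a persistence limit of 3 identical readings, a stuck or spiking sensor is flagged while normal variation passes.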

  13. Evaluation of a simulation procedure designed to recognize shape and contour of suspicious masses in mammography

    NASA Astrophysics Data System (ADS)

    Sousa, Maria A. Z.; Siqueira, Paula N.; Schiabel, Homero

    2015-03-01

    A large number of breast phantoms have been developed for conducting quality tests, characterization of imaging systems and computer-aided diagnosis schemes, dosimetry, and image perception. The realism of these phantoms is important for ensuring the accuracy of results and a greater range of applications. In this work, a phantom is developed that uses PVC films to simulate nodules inserted in the breast parenchyma, designed for classification of signals as malignant or benign according to the BI-RADS® standard. The investigation includes analysis of radiographic density, mass shape, and the corresponding contour outlined by experienced radiologists. The material was cut based on lesion margins found in 44 clinical cases, which were divided between circumscribed and spiculated structures. Tests were performed to check the ability of the specialists to distinguish the contours compared to actual cases, while the shape accuracy was determined quantitatively by evaluation metrics. Results showed the applicability of the chosen material, creating radiological image patterns very similar to the actual ones.

  14. Payload Processing for Mice Drawer System

    NASA Technical Reports Server (NTRS)

    Brown, Judy

    2007-01-01

    Experimental payloads flown to the International Space Station provide us with valuable research conducted in a microgravity environment not attainable on earth. The Mice Drawer System is an experiment designed by Thales Alenia Space Italia to study the effects of microgravity on mice. It is designed to fly to orbit on the Space Shuttle Utilization Logistics Flight 2 in October 2008, remain onboard the International Space Station for approximately 100 days and then return to earth on a following Shuttle flight. The experiment apparatus will be housed inside a Double Payload Carrier. An engineering model of the Double Payload Carrier was sent to Kennedy Space Center for a fit check inside both Shuttles, and the rack that it will be installed in aboard the International Space Station. The Double Payload Carrier showed a good fit quality inside each vehicle, and Thales Alenia Space Italia will now construct the actual flight model and continue to prepare the Mice Drawer System experiment for launch.

  15. Common Sense or Gun Control? Political Communication and News Media Framing of Firearm Sale Background Checks after Newtown.

    PubMed

    McGinty, Emma E; Wolfson, Julia A; Sell, Tara Kirk; Webster, Daniel W

    2016-02-01

    Gun violence is a critical public health problem in the United States, but it is rarely at the top of the public policy agenda. The 2012 mass shooting in Newtown, Connecticut, opened a rare window of opportunity to strengthen firearm policies in the United States. In this study, we examine the American public's exposure to competing arguments for and against federal- and state-level universal background check laws, which would require a background check prior to every firearm sale, in a large sample of national and regional news stories (n = 486) published in the year following the Newtown shooting. Competing messages about background check laws could influence the outcome of policy debates by shifting support and political engagement among key constituencies such as gun owners and conservatives. We found that news media messages in support of universal background checks were fact-based and used rational arguments, and opposing messages often used rights-based frames designed to activate the core values of politically engaged gun owners. Reframing supportive messages about background check policies to align with gun owners' and conservatives' core values could be a promising strategy to increase these groups' willingness to vocalize their support for expanding background checks for firearm sales. Copyright © 2016 by Duke University Press.

  16. The Relationship Between Focused Attention Meditation Practice Habits, Psychological Symptoms, and Quality of Life.

    PubMed

    Bilican, F Isil

    2016-12-01

    This study examined the relationship between focused attention meditation practice habits, psychological symptoms, and quality of life. The participants were 30 adults from New York, NY, practicing Ananda Marga spirituality. They were administered the Symptom Check List-90-R and the Quality of Life Index. The findings indicated that, while Ananda Marga meditation practice habits were not associated with improvements in psychological symptoms, more years of meditation practice were associated with improvements in overall, social, and psychological/spiritual quality of life. Longer periods of meditation practice per session were related to lower levels of overall quality of life and economic quality of life.

  17. Rainfall, Streamflow, and Water-Quality Data During Stormwater Monitoring, Halawa Stream Drainage Basin, Oahu, Hawaii, July 1, 2004 to June 30, 2005

    USGS Publications Warehouse

    Young, Stacie T.M.; Ball, Marcael T.J.

    2005-01-01

    Storm runoff water-quality samples were collected as part of the State of Hawaii Department of Transportation Stormwater Monitoring Program. This program is designed to assess the effects of highway runoff and urban runoff on Halawa Stream. For this program, rainfall data were collected at two stations, continuous streamflow data at two stations, and water-quality data at five stations, which include the two continuous streamflow stations. This report summarizes rainfall, streamflow, and water-quality data collected between July 1, 2004 and June 30, 2005. A total of 15 samples was collected over three storms during July 1, 2004 to June 30, 2005. In general, an attempt was made to collect grab samples nearly simultaneously at all five stations and flow-weighted time-composite samples at the three stations equipped with automatic samplers. However, all three storms were partially sampled because either not all stations were sampled or not all composite samples were collected. Samples were analyzed for total suspended solids, total dissolved solids, nutrients, chemical oxygen demand, and selected trace metals (cadmium, chromium, copper, lead, nickel, and zinc). Chromium and nickel were added to the analysis starting October 1, 2004. Grab samples were additionally analyzed for oil and grease, total petroleum hydrocarbons, fecal coliform, and biological oxygen demand. Quality-assurance/quality-control samples were also collected during storms and during routine maintenance to verify analytical procedures and check the effectiveness of equipment-cleaning procedures.

  18. Characterizations of a Quality Certified Athletic Trainer

    PubMed Central

    Raab, Scot; Wolfe, Brent D.; Gould, Trenton E.; Piland, Scott G.

    2011-01-01

    Context: Didactic proficiency does not ensure clinical aptitude. Quality athletic health care requires clinical knowledge and affective traits. Objective: To develop a grounded theory explaining the constructs of a quality certified athletic trainer (AT). Design: Delphi study. Setting: Interviews in conference rooms or business offices and by telephone. Patients or Other Participants: Thirteen ATs (men = 8, women = 5) stratified across the largest employment settings (high school, college, clinical) in the 4 largest districts of the National Athletic Trainers' Association (2, 3, 4, 9). Data Collection and Analysis: Open-ended interview questions were audio recorded, transcribed, and reviewed before condensing. Two member checks ensured trustworthiness. Open coding reduced text to descriptive adjectives. Results: We grouped adjectives into 5 constructs (care, communication, commitment, integrity, knowledge) and grouped these constructs into 2 higher-order constructs (affective traits, effective traits). Conclusions: According to participants, ATs who demonstrate the ability to care, show commitment and integrity, value professional knowledge, and communicate effectively with others can be identified as quality ATs. These abilities facilitate the creation of positive relationships. These relationships allow the quality AT to interact with patients and other health care professionals on a knowledgeable basis that ultimately improves health care delivery. Our resulting theory supported the examination of characteristics not traditionally assessed in an athletic training education program. If researchers can show that these characteristics develop ATs into quality ATs (eg, those who work better with others, relate meaningfully with patients, and improve the standard of health care), they must be cultivated in the educational setting. PMID:22488194

  19. Quality indicators for initial licensure and discipline in nursing laws in South Korea and North Carolina.

    PubMed

    Kim, K K; Kjervik, D K; Foster, B

    2014-03-01

    The Korean regulatory framework of nursing licensure reflects that of the USA, but its content differs in some of the powers related to quality assurance. This article compares regulatory quality indicators and describes core standards in nursing regulations that are related to both initial licensure and discipline for three groups: the National Council of State Boards of Nursing, North Carolina, and South Korea. A descriptive, comparative law design is used to examine the differences and similarities in the quality indicators and core standards found in three documents: the National Council of State Boards of Nursing Model Act, the North Carolina Nursing Practice Act, and the Korean Medical Service Act for registered nurses. The findings indicate that ten quality indicators and two standards appear in the studied documents. Although most of the quality indicators are common to all documents, some differences are found in terms of the scope of criminal background checks and the range of grounds for disciplinary action. These findings cannot be generalized in the USA because, although the North Carolina nursing act was selected as an example of US nursing laws, nursing laws differ somewhat across states. This comparative study shows a clear opportunity to develop indicators that acknowledge the important areas of competence and good moral character and how they can improve patient safety in Korea. This study provides recommendations for Korean nursing legislative redesign and pointers for other jurisdictions to consider. © 2013 International Council of Nurses.

  20. Design criteria monograph for pressure regulators, relief valves, check valves, burst disks, and explosive valves

    NASA Technical Reports Server (NTRS)

    1974-01-01

    Monograph reviews and assesses current design practices, and from them establishes firm guidance for achieving greater consistency in design, increased reliability in end product, and greater efficiency in design effort. Five devices are treated separately. Guides to aid in configuration selection are outlined.

  1. Tracing the Rationale Behind UML Model Change Through Argumentation

    NASA Astrophysics Data System (ADS)

    Jureta, Ivan J.; Faulkner, Stéphane

    Neglecting traceability—i.e., the ability to describe and follow the life of a requirement—is known to entail misunderstanding and miscommunication, leading to the engineering of poor quality systems. Following the simple principles that (a) changes to UML model instances ought to be justified to the stakeholders, (b) justification should proceed in a structured manner to ensure rigor in discussions, critique, and revisions of model instances, and (c) the concept of argument instantiated in a justification process ought to be well defined and understood, the present paper introduces the UML Traceability through Argumentation Method (UML-TAM) to enable the traceability of design rationale in UML while allowing the appropriateness of model changes to be checked by analysis of the structure of the arguments provided to justify such changes.

  2. Preventing youth access to alcohol: outcomes from a multi-community time-series trial*.

    PubMed

    Wagenaar, Alexander C; Toomey, Traci L; Erickson, Darin J

    2005-03-01

    AIMS/INTERVENTION: The Complying with the Minimum Drinking Age project (CMDA) is a community trial designed to test effects of two interventions designed to reduce alcohol sales to minors: (1) training for management of retail alcohol establishments and (2) enforcement checks of alcohol establishments. CMDA is a multi-community time-series quasi-experimental trial with a nested cohort design. CMDA was implemented in 20 cities in four geographic areas in the US Midwest. The core outcome, propensity for alcohol sales to minors, was directly tested with research staff who attempted to purchase alcohol without showing age identification using a standardized protocol in 602 on-premise and 340 off-premise alcohol establishments. Data were collected every other week in all communities over 4 years. Mixed-model regression and Box-Jenkins time-series analyses were used to assess short- and long-term establishment-specific and general community-level effects of the two interventions. Effects of the training intervention were mixed. Specific deterrent effects were observed for enforcement checks, with an immediate 17% reduction in likelihood of sales to minors. These effects decayed entirely within 3 months in off-premise establishments and to an 8.2% reduction in on-premise establishments. Enforcement checks prevent alcohol sales to minors. At the intensity levels tested, enforcement primarily affected specific establishments checked, with limited diffusion to the whole community. Finally, most of the enforcement effect decayed within 3 months, suggesting that a regular schedule of enforcement is necessary to maintain deterrence.

  3. Design and Performance Checks of the NPL Axial Heat Flow Apparatus

    NASA Astrophysics Data System (ADS)

    Wu, J.; Clark, J.; Stacey, C.; Salmon, D.

    2015-03-01

    This paper describes the design and performance checks of the axial heat flow apparatus developed at the National Physical Laboratory (NPL) for measurement of thermal conductivity. The apparatus is based on an absolute steady-state technique and is suitable for measuring specimens over a range of thermal conductivities and temperatures. A uniform heat flow is induced in a cylindrical bar-shaped specimen that is firmly clamped between a guarded heater unit at the top and a water-cooled base. Heat is supplied at a known rate at the top end of the specimen by the heater unit and constrained to flow axially through the specimen by a surrounding edge-guard system, which is closely matched to the temperature gradient within the test specimen. The performance of this apparatus has been checked against existing NPL thermal-conductivity reference materials NPL 2S89 (based on Stainless Steel 310) and BSC Pure Iron (pure iron supplied by the British Steel Corporation with 99.96 % purity). Over the measured temperature range, the data produced by the newly designed apparatus agree with the reference data for NPL 2S89 within 2 % and with those for BSC Pure Iron within 3 %. The apparatus is being used to provide accurate measurements to industrial and academic organizations and has also been used to develop a new range of NPL reference materials for checking other experimental techniques and procedures for thermal-conductivity measurements.
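    The measurement principle described above reduces to Fourier's law for one-dimensional steady-state conduction: with the heater power, specimen cross-section and axial temperature gradient known, the thermal conductivity follows directly. The sketch below illustrates that relation only; the specimen dimensions, power and temperatures are invented for illustration and are not NPL data.

```python
# Sketch of the absolute steady-state principle behind an axial heat flow
# apparatus: k = Q * dx / (A * dT), from Fourier's law for 1-D conduction.
# All numeric values below are illustrative assumptions, not NPL data.
import math

def axial_thermal_conductivity(power_w, diameter_m, dT_k, dx_m):
    """Thermal conductivity in W/(m*K) from heater power, specimen
    diameter, temperature drop dT, and thermocouple separation dx."""
    area = math.pi * (diameter_m / 2) ** 2  # specimen cross-section
    return power_w * dx_m / (area * dT_k)

# Hypothetical specimen: 25 mm diameter bar, 10 W heater power,
# 20 K drop between thermocouples 50 mm apart.
k = axial_thermal_conductivity(power_w=10.0, diameter_m=0.025, dT_k=20.0, dx_m=0.05)
print(f"thermal conductivity ~ {k:.1f} W/(m*K)")
```

    In a real guarded apparatus the edge-guard system exists precisely so that this one-dimensional assumption holds, by suppressing radial heat loss from the specimen.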

  4. Sediment trapping efficiency of adjustable check dam in laboratory and field experiment

    NASA Astrophysics Data System (ADS)

    Wang, Chiang; Chen, Su-Chin; Lu, Sheng-Jui

    2014-05-01

    Check dams are constructed in mountain areas to block debris flows, but they fill after several events and lose their trapping function. For this reason, the main facility in our research is an adjustable steel slit check dam, which has the advantages of fast construction and of being easy to remove or adjust: transverse beams can be removed to drain sediment off and keep the channel continuous. We constructed an adjustable steel slit check dam on the Landow torrent at the Huisun Experimental Forest Station as the prototype for comparison with a laboratory model. In the laboratory experiments, Froude number similarity was used to design the dam model. The comparisons focused on the modes of sediment trapping and removal, the sediment discharge, and the trapping rate of the slit check dam. Different ways of removing the transverse beams produced different modes of sediment removal and differences in the sediment-removal rate and particle-size distribution. The sediment discharge of the check dam with beams is about 40%-80% of that of the check dam without beams; furthermore, the spacing of the beams is a considerable factor in the sediment discharge. In the field experiment, this research used time-lapse photography to record the adjustable steel slit check dam on the Landow torrent. Typhoon Soulik delivered rainfall of 600 mm in eight hours and induced a debris flow in the Landow torrent. The time-lapse images demonstrated that after several sediment-transport events the adjustable steel slit check dam was buried by the debris flow. The results of the laboratory and field experiments are: (1) the adjustable check dam can trap boulders, stop woody debris flows and flush out fine sediment to supply the needs of the downstream river; (2) the efficiency of sediment trapping in the adjustable check dam with transverse beams is significantly improved; (3) the check dam without transverse beams can remove the sediment and maintain ecosystem continuity.
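    The trapping rate compared in the abstract can be expressed as the fraction of inflowing sediment retained behind the dam. The sketch below shows only that definition; the sediment volumes are invented for illustration, not measurements from the study.

```python
# Minimal sketch of a sediment trapping-rate metric: the fraction of
# sediment inflow retained behind the check dam. Volumes are invented.

def trapping_efficiency(inflow_m3, outflow_m3):
    """Fraction of sediment inflow retained by the dam."""
    return (inflow_m3 - outflow_m3) / inflow_m3

# The abstract reports that discharge with beams is roughly 40%-80% of
# the no-beam case; e.g. 100 m3 entering, 60 m3 passing the slit:
print(trapping_efficiency(100.0, 60.0))
```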

  5. The dopamine D2/D3 receptor agonist quinpirole increases checking-like behaviour in an operant observing response task with uncertain reinforcement: a novel possible model of OCD.

    PubMed

    Eagle, Dawn M; Noschang, Cristie; d'Angelo, Laure-Sophie Camilla; Noble, Christie A; Day, Jacob O; Dongelmans, Marie Louise; Theobald, David E; Mar, Adam C; Urcelay, Gonzalo P; Morein-Zamir, Sharon; Robbins, Trevor W

    2014-05-01

    Excessive checking is a common, debilitating symptom of obsessive-compulsive disorder (OCD). In an established rodent model of OCD checking behaviour, quinpirole (a dopamine D2/3-receptor agonist) increased checking in open-field tests, indicating dopaminergic modulation of checking-like behaviours. We designed a novel operant paradigm for rats, the observing response task (ORT), to further examine the cognitive processes underpinning checking behaviour and to clarify how and why checking develops. We investigated (i) how quinpirole increases checking, (ii) the dependence of these effects on D2/3 receptor function (following treatment with the D2/3 receptor antagonist sulpiride) and (iii) the effects of reward uncertainty. In the ORT, rats pressed an 'observing' lever for information about the location of an 'active' lever that provided food reinforcement. High- and low-checkers (defined from baseline observing) received quinpirole (0.5 mg/kg, 10 treatments) or vehicle. Parametric task manipulations assessed observing/checking under increasing task demands relating to reinforcement uncertainty (variable response requirement and active-lever location switching). Treatment with sulpiride further probed the pharmacological basis of the long-term behavioural changes. Quinpirole selectively increased checking, both functional observing lever presses (OLPs) and non-functional extra OLPs (EOLPs). The increase in OLPs and EOLPs was long-lasting, without further quinpirole administration. Quinpirole did not affect the immediate ability to use information from checking. Vehicle- and quinpirole-treated rats (VEH and QNP, respectively) were selectively sensitive to different forms of uncertainty. Sulpiride reduced non-functional EOLPs in QNP rats but had no effect on functional OLPs. These data have implications for the treatment of compulsive checking in OCD, particularly for serotonin-reuptake-inhibitor treatment-refractory cases, where supplementation with dopamine receptor antagonists may be beneficial. 
Copyright © 2014 The Authors. Published by Elsevier B.V. All rights reserved.

  6. Music festival attendees' illicit drug use, knowledge and practices regarding drug content and purity: a cross-sectional survey.

    PubMed

    Day, Niamh; Criss, Joshua; Griffiths, Benjamin; Gujral, Shireen Kaur; John-Leader, Franklin; Johnston, Jennifer; Pit, Sabrina

    2018-01-05

    Drug checking is a harm reduction strategy which allows users to check the content and purity of illicit drugs. Although drug checking has been trialled internationally, with demonstrated value as a harm reduction and health promotion strategy, the use of such services in Australia remains a contentious issue. This study aimed to investigate the proportion and patterns of illicit drug use among young people, their attitudes towards drug checking at festivals and the potential impact of drug checking on intended drug use behaviour. The survey was conducted at a major Australian music festival in 2016. Data were collected from a sample of festival attendees (n = 642) aged between 18 and 30 years. A descriptive analysis of the data was performed. Nearly three-quarters (73.4%) of participants reported that they had used illicit drugs in the past 12 months, most commonly cannabis (63.9%) and ecstasy (59.8%). A large proportion of participants believed 'somewhat' or 'a lot' that drug checking services could help users seek help to reduce harm (86.5%) and that drug checking services should be combined with harm reduction advice (84.9%). However, two-thirds of the participants agreed 'somewhat' or 'a lot' that drug sellers may use this service as a quality control mechanism (68.6%). Approximately half (54.4%) indicated they would be highly likely and a third (32.7%) would be somewhat likely to utilise free drug checking services should they be available at music festivals. When asked whether the results of drug checking would influence their drug use behaviour, participants reported that they would not take substances shown to contain methamphetamine (65.1%), ketamine (57.5%) or para-methoxyamphetamine (PMA) (58.4%). The majority of festival attendees aged 18-30 participating in this study reported a history of illicit drug use and were in favour of the provision of free drug checking at festivals. 
A considerable proportion reported that the results of drug checking would influence their drug use behaviour. The findings of this study can contribute to the debate regarding whether drug checking services could potentially play a major role in harm reduction and health promotion programming for young people attending festivals.

  7. STS-93 MS Tognini checks the BRIC experiment petri dishes on the middeck

    NASA Image and Video Library

    2013-11-18

    STS093-350-008 (22-27 July 1999) --- Astronaut Michel Tognini, mission specialist representing France’s Centre National d’Etudes Spatiales (CNES), checks the Biological Research in Canisters (BRIC) payload petri dishes on the mid deck of the Space Shuttle Columbia. BRIC was designed to investigate the effects of space flight on small arthropod animals and plant specimens.

  8. STS_135_Russia

    NASA Image and Video Library

    2011-03-28

    Space suit designer Oleg Gerasimenko shares some tips on the Sokol suit with NASA astronaut Rex Walheim during a fit check at the Zvezda facility on Monday, March 28, 2011, in Moscow. The crew of the final shuttle mission traveled to Moscow for a suit fit check of their Russian Soyuz suits that will be required in the event of an emergency. (NASA Photo / Houston Chronicle, Smiley N. Pool)

  9. Real-World Use and Self-Reported Health Outcomes of a Patient-Designed Do-it-Yourself Mobile Technology System for Diabetes: Lessons for Mobile Health.

    PubMed

    Lee, Joyce M; Newman, Mark W; Gebremariam, Achamyeleh; Choi, Preciosa; Lewis, Dana; Nordgren, Weston; Costik, John; Wedding, James; West, Benjamin; Gilby, Nancy Benovich; Hannemann, Christopher; Pasek, Josh; Garrity, Ashley; Hirschfeld, Emily

    2017-04-01

    The aim of this study is to compare demographic/disease characteristics of users versus nonusers of a do-it-yourself (DIY) mobile technology system for diabetes (Nightscout), to describe its uses and personalization, and to evaluate associated changes in health behaviors and outcomes. A cross-sectional, household-level online survey was used. Of 1268 household respondents who were members of the CGM in the Cloud Facebook group, there were 1157 individuals with diabetes who provided information about Nightscout use. The majority of individuals with diabetes in the household sample were 6-12 years old (followed by 18 years and above, and 13-17 years), non-Hispanic whites (90.2%), with type 1 diabetes (99.4%). The majority used an insulin pump (85.6%) and CGM (97.0%) and had private health insurance (83.8%). Nightscout use was more prevalent among children compared with adolescents and adults. Children used Nightscout for nighttime, school, sporting events, and travel; adults used it for nighttime, work, travel, and sporting events. Whereas the majority of adults viewed their own data without assistance from others, among pediatric users, a median of three individuals (range: 0-8) viewed Nightscout, with a median of three devices per viewer (range: 0-7). Individuals reported that after Nightscout adoption, they checked blood glucose values with a meter less often; bolused more frequently; gave more boluses without checking first with a blood glucose meter; and experienced significant improvements in HbA1c and quality of life. The Nightscout Project is a patient-driven mobile technology for health and may have beneficial effects on glycemic control and quality of life.

  10. Clinical implementation and failure mode and effects analysis of HDR skin brachytherapy using Valencia and Leipzig surface applicators.

    PubMed

    Sayler, Elaine; Eldredge-Hindy, Harriet; Dinome, Jessie; Lockamy, Virginia; Harrison, Amy S

    2015-01-01

    The planning procedure for Valencia and Leipzig surface applicators (VLSAs) (Nucletron, Veenendaal, The Netherlands) differs substantially from CT-based planning; the unfamiliarity could lead to significant errors. This study applies failure modes and effects analysis (FMEA) to high-dose-rate (HDR) skin brachytherapy using VLSAs to ensure safety and quality. A multidisciplinary team created a protocol for HDR VLSA skin treatments and applied FMEA. Failure modes were identified and scored by severity, occurrence, and detectability. The clinical procedure was then revised to address high-scoring process nodes. Several key components were added to the protocol to minimize risk probability numbers. (1) Diagnosis, prescription, applicator selection, and setup are reviewed at weekly quality assurance rounds. Peer review reduces the likelihood of an inappropriate treatment regime. (2) A template for HDR skin treatments was established in the clinic's electronic medical record system to standardize treatment instructions. This reduces the chances of miscommunication between the physician and planner as well as increases the detectability of an error. (3) A screen check was implemented during the second check to increase detectability of an error. (4) To reduce error probability, the treatment plan worksheet was designed to display plan parameters in a format visually similar to the treatment console display, facilitating data entry and verification. (5) VLSAs are color coded and labeled to match the electronic medical record prescriptions, simplifying in-room selection and verification. Multidisciplinary planning and FMEA increased detectability and reduced error probability during VLSA HDR brachytherapy. This clinical model may be useful to institutions implementing similar procedures. Copyright © 2015 American Brachytherapy Society. Published by Elsevier Inc. All rights reserved.
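    The FMEA scoring described above rates each failure mode for severity, occurrence, and detectability, and ranks them by the product of the three (the risk priority number, RPN), so the highest-scoring process nodes are addressed first. The failure modes and scores in this sketch are invented for illustration, not the study's actual scoring.

```python
# Sketch of FMEA risk priority numbers: RPN = severity * occurrence *
# detectability, with failure modes ranked highest-risk first.
# Failure modes and scores below are invented examples.

failure_modes = {
    "wrong applicator selected":     {"severity": 8, "occurrence": 3, "detectability": 4},
    "prescription miscommunication": {"severity": 7, "occurrence": 4, "detectability": 5},
    "data-entry error at console":   {"severity": 9, "occurrence": 2, "detectability": 3},
}

def rpn(scores):
    """Risk priority number for one failure mode."""
    return scores["severity"] * scores["occurrence"] * scores["detectability"]

# Rank failure modes so the highest-risk process nodes are revised first.
for name, scores in sorted(failure_modes.items(), key=lambda kv: -rpn(kv[1])):
    print(f"{name}: RPN = {rpn(scores)}")
```

    Interventions such as the screen check and the standardized template in the abstract act on the detectability and occurrence factors, which is why they lower the RPN of the affected nodes.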

  11. Determination of MLC model parameters for Monaco using commercial diode arrays.

    PubMed

    Kinsella, Paul; Shields, Laura; McCavana, Patrick; McClean, Brendan; Langan, Brian

    2016-07-08

    Multileaf collimators (MLCs) need to be characterized accurately in treatment planning systems to facilitate accurate intensity-modulated radiation therapy (IMRT) and volumetric-modulated arc therapy (VMAT). The aim of this study was to examine the use of MapCHECK 2 and ArcCHECK diode arrays for optimizing MLC parameters in Monaco X-ray voxel Monte Carlo (XVMC) dose calculation algorithm. A series of radiation test beams designed to evaluate MLC model parameters were delivered to MapCHECK 2, ArcCHECK, and EBT3 Gafchromic film for comparison. Initial comparison of the calculated and ArcCHECK-measured dose distributions revealed it was unclear how to change the MLC parameters to gain agreement. This ambiguity arose due to an insufficient sampling of the test field dose distributions and unexpected discrepancies in the open parts of some test fields. Consequently, the XVMC MLC parameters were optimized based on MapCHECK 2 measurements. Gafchromic EBT3 film was used to verify the accuracy of MapCHECK 2 measured dose distributions. It was found that adjustment of the MLC parameters from their default values resulted in improved global gamma analysis pass rates for MapCHECK 2 measurements versus calculated dose. The lowest pass rate of any MLC-modulated test beam improved from 68.5% to 93.5% with 3% and 2 mm gamma criteria. Given the close agreement of the optimized model to both MapCHECK 2 and film, the optimized model was used as a benchmark to highlight the relatively large discrepancies in some of the test field dose distributions found with ArcCHECK. Comparison between the optimized model-calculated dose and ArcCHECK-measured dose resulted in global gamma pass rates which ranged from 70.0%-97.9% for gamma criteria of 3% and 2 mm. The simple square fields yielded high pass rates. The lower gamma pass rates were attributed to the ArcCHECK overestimating the dose in-field for the rectangular test fields whose long axis was parallel to the long axis of the ArcCHECK. 
Considering ArcCHECK measurement issues and the lower gamma pass rates for the MLC-modulated test beams, it was concluded that MapCHECK 2 was a more suitable detector than ArcCHECK for the optimization process. © 2016 The Authors
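    The gamma comparison used above (e.g. 3% dose difference, 2 mm distance-to-agreement) can be sketched in one dimension: each measured point passes if some nearby calculated point is close enough in both dose and distance. The profiles below are invented, and this simplified global-gamma implementation is only illustrative of the metric, not of the commercial QA software.

```python
# Simplified 1-D global gamma analysis: for each measured point, take the
# minimum over calculated points of the combined dose/distance metric;
# a point passes when gamma <= 1. Dose profiles below are invented.
import math

def gamma_index(pos_mm, meas, calc, dd_frac=0.03, dta_mm=2.0):
    """Per-point gamma values for measured vs. calculated dose."""
    ref_max = max(calc)  # global normalization to the calculated maximum
    gammas = []
    for xm, dm in zip(pos_mm, meas):
        g = min(
            math.sqrt(((xm - xc) / dta_mm) ** 2
                      + ((dm - dc) / (dd_frac * ref_max)) ** 2)
            for xc, dc in zip(pos_mm, calc)
        )
        gammas.append(g)
    return gammas

pos  = [0.0, 1.0, 2.0, 3.0, 4.0]          # detector positions (mm)
calc = [1.00, 0.98, 0.95, 0.60, 0.20]     # calculated dose (normalized)
meas = [1.01, 0.99, 0.93, 0.62, 0.21]     # measured dose (normalized)
g = gamma_index(pos, meas, calc)
pass_rate = 100.0 * sum(gi <= 1.0 for gi in g) / len(g)
print(f"gamma pass rate: {pass_rate:.1f}%")
```

    Real analyses also interpolate the calculated distribution between points and work in two or three dimensions, which is essential at steep dose gradients.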

  12. Application of reiteration of Hankel singular value decomposition in quality control

    NASA Astrophysics Data System (ADS)

    Staniszewski, Michał; Skorupa, Agnieszka; Boguszewicz, Łukasz; Michalczuk, Agnieszka; Wereszczyński, Kamil; Wicher, Magdalena; Konopka, Marek; Sokół, Maria; Polański, Andrzej

    2017-07-01

    Medical centres are obliged to store past medical records, including the results of quality assurance (QA) tests of the medical equipment, which is especially useful in checking the reproducibility of medical devices and procedures. Analysis of multivariate time series is an important part of quality control of NMR data. In this work we propose an anomaly detection tool based on the Reiteration of Hankel Singular Value Decomposition method. The presented method was compared with external software, and the authors obtained comparable results.
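    The general idea behind Hankel-SVD anomaly detection can be sketched as follows: embed the QA time series in a Hankel matrix and use the energy not captured by the leading singular values as an anomaly score, since a reproducible signal is well approximated at low rank while drifts and anomalies leave residual structure. The series, window, rank, and scoring rule below are illustrative assumptions; the published method (reiterated Hankel SVD) is more elaborate.

```python
# Minimal sketch of Hankel-SVD-style anomaly scoring for a QA time series:
# residual energy outside the leading singular values flags structure
# (drift, anomaly) beyond the low-rank "reproducible" component.
# Series, window, and rank are invented for illustration.
import numpy as np

def hankel_anomaly_score(series, window=4, rank=1):
    """Fraction of signal energy not captured by the top `rank` singular values."""
    n = len(series) - window + 1
    H = np.array([series[i:i + window] for i in range(n)])  # Hankel-style embedding
    s = np.linalg.svd(H, compute_uv=False)
    return 1.0 - (s[:rank] ** 2).sum() / (s ** 2).sum()

stable   = [1.0, 1.01, 0.99, 1.0, 1.01, 0.99, 1.0, 1.01]  # reproducible QA signal
drifting = [1.0, 1.01, 0.99, 1.0, 1.2, 1.5, 1.9, 2.4]     # drifting QA signal
print(hankel_anomaly_score(stable))    # near zero
print(hankel_anomaly_score(drifting))  # noticeably larger
```

    Thresholding such a score over successive QA sessions is one simple way to turn the decomposition into an automated reproducibility check.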

  13. [Quality control in herbal supplements].

    PubMed

    Oelker, Luisa

    2005-01-01

    Quality and safety of food and herbal supplements result from a combination of different elements, such as good manufacturing practice and process control. The process control must be active and able to identify and correct all possible hazards. The main and most widely used instrument is the hazard analysis and critical control point (HACCP) system, the correct application of which can guarantee the safety of the product. Herbal supplements need, in addition to standard quality control, a set of checks to assure the harmlessness and safety of the plants used.

  14. The Method of Manufacturing Nonmetallic Test-Blocks on Different Sensitivity Classes

    NASA Astrophysics Data System (ADS)

    Kalinichenko, N. P.; Kalinichenko, A. N.; Lobanova, I. S.; Zaitseva, A. A.; Loboda, E. L.

    2016-01-01

    Quality control of parts made from nonmetallic materials is a vital question today, owing to their widespread use. Nondestructive penetrant testing is effective and, in some cases, the only possible method of accident prevention at high-risk sites. A brief review is given of the check samples necessary for evaluating the quality of penetrant materials. A method is offered for manufacturing test blocks for checking penetrant-material quality according to the different sensitivity classes of liquid penetrant testing.

  15. EC02-0264-19

    NASA Image and Video Library

    2002-11-15

    How differential deflection of the inboard and outboard leading-edge flaps affected the handling qualities of this modified F/A-18A was evaluated during the first check flight in the Active Aeroelastic Wing program at NASA's Dryden Flight Research Center.

  16. A complex intervention to improve implementation of World Health Organization guidelines for diagnosis of severe illness in low-income settings: a quasi-experimental study from Uganda.

    PubMed

    Cummings, Matthew J; Goldberg, Elijah; Mwaka, Savio; Kabajaasi, Olive; Vittinghoff, Eric; Cattamanchi, Adithya; Katamba, Achilles; Kenya-Mugisha, Nathan; Jacob, Shevin T; Davis, J Lucian

    2017-11-06

    To improve management of severely ill hospitalized patients in low-income settings, the World Health Organization (WHO) established a triage tool called "Quick Check" to provide clinicians with a rapid, standardized approach to identify patients with severe illness based on recognition of abnormal vital signs. Despite the availability of these guidelines, recognition of severe illness remains a challenge in low-income settings, largely as a result of infrequent vital sign monitoring. We conducted a staggered, pre-post quasi-experimental study at four inpatient health facilities in western Uganda to assess the impact of a multi-modal intervention for improving quality of care following formal training on WHO "Quick Check" guidelines for diagnosis of severe illness in low-income settings. Intervention components were developed using the COM-B ("capability," "opportunity," and "motivation" determine "behavior") model and included clinical mentoring by an expert in severe illness care, collaborative improvement meetings with external support supervision, and continuous audits of clinical performance with structured feedback. There were 5759 patients hospitalized from August 2014 to May 2015: 1633 were admitted before and 4126 during the intervention period. Designed to occur twice monthly, collaborative improvement meetings occurred every 2-4 weeks at each site. Clinical mentoring sessions, designed to occur monthly, occurred every 4-6 months at each site. Audit and feedback reports were implemented weekly as designed. During the intervention period, there were significant increases in the site-adjusted likelihood of initial assessment of temperature, heart rate, blood pressure, respiratory rate, mental status, and pulse oximetry. Patients admitted during the intervention period were significantly more likely to be diagnosed with sepsis (4.3 vs. 0.4%, risk ratio 10.1, 95% CI 3.0-31.0, p < 0.001) and severe respiratory distress (3.9 vs. 
0.9%, risk ratio 4.5, 95% CI 1.8-10.9, p = 0.001). Theory-informed quality improvement programs can improve vital sign collection and diagnosis of severe illness in low-income settings. Further implementation, evaluation, and scale-up of such interventions are needed to enhance hospital-based triage and severe illness management in these settings. Severe illness management system (SIMS) intervention development, ISRCTN46976783.
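    The crude calculation behind comparisons such as "4.3% vs. 0.4%, risk ratio 10.1" is simply the ratio of the two proportions; the ratios reported in the abstract are site-adjusted, so the crude value differs slightly. The event counts below are back-calculated approximations for illustration, not the study data.

```python
# Sketch of a crude (unadjusted) risk ratio: the proportion diagnosed in
# the intervention period divided by the proportion pre-intervention.
# Event counts are illustrative back-calculations, not the study data.

def risk_ratio(events_a, n_a, events_b, n_b):
    """Risk in group A divided by risk in group B."""
    return (events_a / n_a) / (events_b / n_b)

# ~4.3% of 4126 intervention-period patients vs ~0.4% of 1633
# pre-intervention patients diagnosed with sepsis:
rr = risk_ratio(177, 4126, 7, 1633)
print(f"crude risk ratio ~ {rr:.1f}")
```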

  17. [Quality control of laser imagers].

    PubMed

    Winkelbauer, F; Ammann, M; Gerstner, N; Imhof, H

    1992-11-01

    Multiformat imagers based on laser systems are used for documentation in an increasing number of investigations. The specific problems of quality control are explained, and the stability of film processing is investigated in imager systems of different configuration, with (Machine 1: 3M Laser Imager Plus M952 with connected 3M film processor, 3M IRB film, 3M XPM X-ray chemical mixer, 3M developer and fixer) or without (Machine 2: 3M Laser Imager Plus M952 with separate DuPont Cronex film processor, Kodak IR film, Kodak automixer, Kodak developer and fixer) a connected film-processing unit. In our checks based on DIN 6868 and ONORM S 5240, the stability of film processing in the system with the directly connected film-processing unit met the DIN and ONORM requirements. The checking of film-processing stability demanded by DIN 6868 could therefore be performed at longer intervals on such equipment. Systems with conventional darkroom processing show clearly increased fluctuation by comparison, and hence the demanded daily control is essential to guarantee appropriate reaction and constant documentation quality.

  18. System-level change in mental health services in North Wales: An observational study using systems thinking.

    PubMed

    Evans, S; Huxley, P J; Maxwell, N; Huxley, K L S

    2014-06-01

    To describe changes to mental health services using systems thinking. Structured standardized quality of life assessment (Manchester Short Quality of Life Assessment: MANSA) was used to establish service user priorities for changes to service provision (part of a process known as check in systems thinking). Current service performance in these priority areas was identified, and changes to service arrangements were planned, implemented and monitored by task and finish (T&F) groups (making use of a process known as flow in systems thinking). 81 MANSA assessments were completed at the check stage (by NM). Work, finances and leisure activities emerged as service user priority areas for change, and T&F groups were established with representation of all sectors and service users. Ways to make improvements were observed, planned and implemented by T&F groups (the flow stage). The systems approach reveals how services and quality of life have been changed for patients in Wrexham. Further generalizable research is needed into the potential benefits of using systems thinking in mental health service evaluation. © The Author(s) 2013.

  19. Self-monitoring as a viable fading option in check-in/check-out.

    PubMed

    Miller, Leila M; Dufrene, Brad A; Joe Olmi, D; Tingstrom, Daniel; Filce, Hollie

    2015-04-01

    This study systematically replaced the teacher completed Daily Behavior Report Card (DBRC) and feedback component of check-in/check-out (CICO) with self-monitoring for four elementary students referred for Tier 2 behavioral supports within School-Wide Positive Behavior Interventions and Supports (SWPBIS). An ABAB withdrawal design was used to test the effectiveness of CICO. Then, following the second B phase, teacher completion of the DBRC and corresponding feedback to students was replaced with self-monitoring. For all four participants, CICO was associated with increases in academic engagement and reductions in disruptive behavior. Moreover, students' behavioral gains were maintained when teacher completion of the DBRC was replaced with self-monitoring. Results are discussed in terms of CICO research and practice. Copyright © 2014 Society for the Study of School Psychology. Published by Elsevier Ltd. All rights reserved.

  20. A Semiautomated Journal Check-In and Binding System; or Variations on a Common Theme

    PubMed Central

    Livingston, Frances G.

    1967-01-01

    The journal check-in project described here, though based on a computerized system, uses only unit-record equipment and is designed for the medium-sized library. The frequency codes used are based on the date printed on the journal rather than on the expected date of receipt, which allows for more stability in the coding scheme. The journal's volume number and issue number, which in other systems are usually predetermined by a computer, are inserted at the time of check-in. Routine claiming of overdue issues and a systematic binding schedule have also been developed as by-products. PMID:6041836
