Sample records for tooling quality control

  1. Patient-specific dosimetric endpoints based treatment plan quality control in radiotherapy.

    PubMed

    Song, Ting; Staub, David; Chen, Mingli; Lu, Weiguo; Tian, Zhen; Jia, Xun; Li, Yongbao; Zhou, Linghong; Jiang, Steve B; Gu, Xuejun

    2015-11-07

    In intensity modulated radiotherapy (IMRT), the optimal plan for each patient is specific to that patient's unique anatomy. To achieve such a plan, patient-specific dosimetric goals reflecting each patient's unique anatomy should be defined and adopted in the treatment planning procedure for plan quality control. The aim of this study was to develop such a personalized treatment plan quality control tool by predicting patient-specific dosimetric endpoints (DEs). The incorporation of patient-specific DEs is realized by a multi-OAR geometry-dosimetry model, capable of predicting optimal DEs based on the individual patient's geometry. The overall quality of a treatment plan is then judged with a numerical treatment plan quality indicator and characterized as optimal or suboptimal. Taking advantage of clinically available prostate volumetric modulated arc therapy (VMAT) treatment plans, we built and evaluated our proposed plan quality control tool. Using the developed tool, six of twenty evaluated plans were identified as suboptimal. After re-optimization, these suboptimal plans achieved better OAR dose sparing without sacrificing PTV coverage, and the dosimetric endpoints of the re-optimized plans agreed well with the model-predicted values, which validates the predictive ability of the proposed tool. In conclusion, the developed tool is able to accurately predict the optimally achievable DEs of multiple OARs, identify suboptimal plans, and guide plan optimization. It is a useful tool for achieving patient-specific treatment plan quality control.
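
    As a concrete illustration of the dosimetric endpoints (DEs) discussed above, the sketch below computes two endpoints of the kind such a QC tool would predict and check, D95 for a target and V40 for an organ at risk, from per-voxel dose arrays. This is a minimal Python illustration, not the authors' tool; the dose distributions and thresholds are invented.

      # Minimal sketch: dosimetric endpoints from per-voxel dose arrays.
      # All dose values and thresholds below are hypothetical.
      import numpy as np

      def d_percent(dose, volume_fraction):
          """Dose received by at least `volume_fraction` of the structure (e.g., D95)."""
          return np.percentile(dose, 100 * (1 - volume_fraction))

      def v_dose(dose, threshold):
          """Fraction of the structure receiving at least `threshold` Gy (e.g., V40)."""
          return np.mean(dose >= threshold)

      rng = np.random.default_rng(0)
      ptv_dose = rng.normal(78.0, 1.5, size=5000)  # hypothetical PTV voxel doses (Gy)
      oar_dose = rng.normal(35.0, 8.0, size=8000)  # hypothetical OAR voxel doses (Gy)

      print(f"PTV D95 = {d_percent(ptv_dose, 0.95):.1f} Gy")
      print(f"OAR V40 = {100 * v_dose(oar_dose, 40.0):.1f} %")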

  2. Quality control troubleshooting tools for the mill floor

    Treesearch

    John Dramm

    2000-01-01

    Statistical Process Control (SPC) provides effective tools for improving process quality in the forest products industry resulting in reduced costs and improved productivity. Implementing SPC helps identify and locate problems that occur in wood products manufacturing. SPC tools achieve their real value when applied on the mill floor for monitoring and troubleshooting...

  3. Quality Dashboards: Technical and Architectural Considerations of an Actionable Reporting Tool for Population Management

    PubMed Central

    Olsha-Yehiav, Maya; Einbinder, Jonathan S.; Jung, Eunice; Linder, Jeffrey A.; Greim, Julie; Li, Qi; Schnipper, Jeffrey L.; Middleton, Blackford

    2006-01-01

    Quality Dashboards (QD) is a condition-specific, actionable web-based application for quality reporting and population management that is integrated into the Electronic Health Record (EHR). Using server-based graphic web controls in a .Net environment to construct Quality Dashboards allows customization of the reporting tool without the need to rely on commercial business intelligence tools. Quality Dashboards will improve patient care and quality outcomes as clinicians utilize the reporting tool for population management. PMID:17238671

  4. An Annotated Reading List for Concurrent Engineering

    DTIC Science & Technology

    1989-07-01

    The seven tools are sometimes referred to as the seven old tools. Ishikawa, Kaoru, What is Total Quality Control? The Japanese Way, Prentice-Hall... some solutions. Ishikawa (1982) presents a practical guide (with easy-to-use tools) for implementing quality control at the working level... study of ... engineering for the last two years. Ishikawa, Kaoru, Guide to Quality Control, Kraus International Publications, White Plains, NY, 1982.

  5. A case study: application of statistical process control tool for determining process capability and sigma level.

    PubMed

    Chopra, Vikram; Bairagi, Mukesh; Trivedi, P; Nagar, Mona

    2012-01-01

    Statistical process control is the application of statistical methods to the measurement and analysis of variation in a process. Various regulatory guidelines, such as the Validation Guidance for Industry (2011), International Conference on Harmonisation ICH Q10 (2009), the Health Canada guidelines (2009), the Health Science Authority, Singapore: Guidance for Product Quality Review (2008), and International Organization for Standardization ISO-9000:2005, provide regulatory support for the application of statistical process control for better process control and understanding. In this study, risk assessments, normal probability distributions, control charts, and capability charts are employed for selection of critical quality attributes, determination of normal probability distribution, statistical stability, and capability of production processes, respectively. The objective of this study is to determine tablet production process quality in the form of sigma process capability. By interpreting data and graph trends, forecasting of critical quality attributes, sigma process capability, and stability of the process were studied. The overall study contributes to an assessment of the process at the sigma level with respect to out-of-specification attributes produced. Finally, the study points to areas where the application of quality improvement and quality risk assessment principles makes achievement of six sigma-capable processes possible. Statistical process control is the most advantageous tool for determining the quality of any production process. This tool is new for the pharmaceutical tablet production process. In the case of pharmaceutical tablet production processes, the quality control parameters act as quality assessment parameters. Application of risk assessment provides selection of critical quality attributes among quality control parameters. Sequential application of normality distributions, control charts, and capability analyses provides a valid statistical process control study of the process. Interpretation of such a study provides information about stability, process variability, changing trends, and quantification of process ability against defective production. Comparative evaluation of critical quality attributes by Pareto charts identifies the least capable and most variable process as the one liable for improvement. Statistical process control thus proves to be an important tool for six sigma-capable process development and continuous quality improvement.
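
    The abstract's core computation, process capability and a sigma level estimated from measurements of a critical quality attribute, can be sketched as below. This is a generic illustration with invented data and invented specification limits, not the study's own analysis; note that some six sigma conventions add a 1.5-sigma shift, which is omitted here.

      # Capability indices and sigma level for a critical quality attribute.
      import numpy as np
      from scipy import stats

      data = np.random.default_rng(1).normal(100.2, 1.1, size=200)  # e.g., tablet weight (mg), invented
      lsl, usl = 97.0, 103.0                                        # hypothetical specification limits

      mu, sigma = data.mean(), data.std(ddof=1)
      cp = (usl - lsl) / (6 * sigma)                # potential capability
      cpk = min(usl - mu, mu - lsl) / (3 * sigma)   # capability allowing for centering

      # Expected out-of-specification fraction under a normal model,
      # and the corresponding sigma level (z-score, no 1.5-sigma shift).
      p_oos = stats.norm.cdf(lsl, mu, sigma) + stats.norm.sf(usl, mu, sigma)
      sigma_level = stats.norm.isf(p_oos)

      print(f"Cp={cp:.2f}  Cpk={cpk:.2f}  OOS fraction={p_oos:.2e}  sigma level={sigma_level:.2f}")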

  6. Total Quality Management (TQM), an Overview

    DTIC Science & Technology

    1991-09-01

    Quality Management (TQM). It discusses the reasons TQM is a current growth industry, what it is, and how one implements it. It describes the basic analytical tools, statistical process control, some advanced analytical tools, tools used by process improvement teams to enhance their own operations, and action plans for making improvements. The final sections discuss assessing quality efforts and measuring the quality to knowledge

  7. GenePRIMP: A software quality control tool

    ScienceCinema

    Amrita Pati

    2017-12-09

    Amrita Pati of the DOE Joint Genome Institute's Genome Biology group describes the software tool GenePRIMP and how it fits into the quality control pipeline for microbial genomics. Further details regarding GenePRIMP appear in a paper published online May 2, 2010 in Nature Methods.

  8. An Automated Treatment Plan Quality Control Tool for Intensity-Modulated Radiation Therapy Using a Voxel-Weighting Factor-Based Re-Optimization Algorithm.

    PubMed

    Song, Ting; Li, Nan; Zarepisheh, Masoud; Li, Yongbao; Gautier, Quentin; Zhou, Linghong; Mell, Loren; Jiang, Steve; Cerviño, Laura

    2016-01-01

    Intensity-modulated radiation therapy (IMRT) currently plays an important role in radiotherapy, but its treatment plan quality can vary significantly among institutions and planners. Treatment plan quality control (QC) is a necessary component for individual clinics to ensure that patients receive treatments with high therapeutic gain ratios. The voxel-weighting factor-based plan re-optimization mechanism has been proven able to explore a larger Pareto surface (solution domain) and therefore increase the possibility of finding an optimal treatment plan. In this study, we incorporated additional modules into an in-house developed voxel-weighting factor-based re-optimization algorithm, enhancing it into a highly automated and accurate IMRT plan QC tool (TPS-QC tool). After importing an under-assessment plan, the TPS-QC tool was able to generate a QC report within 2 minutes. This QC report contains the plan quality determination as well as information supporting the determination. Finally, IMRT plan quality can be controlled by approving quality-passed plans and replacing quality-failed plans using the TPS-QC tool. The feasibility and accuracy of the proposed TPS-QC tool were evaluated using 25 clinically approved cervical cancer patient IMRT plans and 5 manually created poor-quality IMRT plans. The results showed high consistency between the QC report quality determinations and the actual plan quality. In the 25 clinically approved cases that the TPS-QC tool identified as passed, a greater difference could be observed for dosimetric endpoints for organs at risk (OAR) than for the planning target volume (PTV), implying that better dose sparing could be achieved in OAR than in PTV. In addition, the dose-volume histogram (DVH) curves of the TPS-QC tool re-optimized plans satisfied the dosimetric criteria more frequently than did the under-assessment plans. Moreover, the criteria for unsatisfied dosimetric endpoints in the 5 poor-quality plans could typically be satisfied when the TPS-QC tool generated re-optimized plans, without sacrificing other dosimetric endpoints. In addition to its feasibility and accuracy, the proposed TPS-QC tool is user-friendly and easy to operate, both necessary characteristics for clinical use.
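
    The voxel-weighting re-optimization mechanism named above can be caricatured in a few lines: voxels that miss their dose goal get progressively larger weights, and the weighted problem is re-solved. The toy below assumes a linear dose model (dose = A @ x) and a crude multiplicative weight update; it illustrates the general idea only and is not the authors' algorithm.

      # Toy voxel-weighting re-optimization loop (illustrative assumptions only).
      import numpy as np

      rng = np.random.default_rng(4)
      A = rng.uniform(0.0, 1.0, size=(200, 30))  # hypothetical dose-influence matrix
      goal = np.full(200, 1.0)                   # per-voxel dose goals (arbitrary units)
      w = np.ones(200)                           # voxel weights

      for _ in range(20):
          # Weighted least squares: minimize sum_i w_i * (A @ x - goal)_i ** 2
          sw = np.sqrt(w)
          x, *_ = np.linalg.lstsq(sw[:, None] * A, sw * goal, rcond=None)
          x = np.clip(x, 0.0, None)              # fluence must be non-negative
          dose = A @ x
          w *= np.where(dose < goal, 1.1, 1.0)   # up-weight under-dosed voxels

      print(f"voxels within 5% of goal: {np.mean(np.abs(dose - goal) < 0.05):.0%}")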

  9. HPTLC Fingerprint Analysis: A Quality Control for Authentication of Herbal Phytochemicals

    NASA Astrophysics Data System (ADS)

    Ram, Mauji; Abdin, M. Z.; Khan, M. A.; Jha, Prabhakar

    Authentication and consistent quality are the basic requirements for Indian traditional medicine (TIM), Chinese traditional herbal medicine (TCHM), and their commercial products, regardless of the kind of research conducted to modernize TIM and TCHM. The complexities of TIM and TCHM challenge the current official quality control mode, in which only a few biochemical markers are selected for identification and quantitative assay. Given the many unknown factors in TIM and TCHM, it is impossible and unnecessary to pinpoint qualitatively and quantitatively every single component contained in the herbal drug. The chromatographic fingerprint is a rational option to meet the need for more effective and powerful quality assessment of TIM and TCHM. The optimized chromatographic fingerprint is not only an alternative analytical tool for authentication, but also an approach to express the various patterns of chemical ingredient distribution in herbal drugs and to preserve such a "database" for further multifaceted sustainable studies. Analytical separation techniques, for example, high-performance liquid chromatography (HPLC), gas chromatography (GC), and mass spectrometry (MS), are among the most popular methods of choice for quality control of raw material and finished herbal product. The fingerprint analysis approach using high-performance thin-layer chromatography (HPTLC) has become a most potent tool for quality control of herbal medicines because of its simplicity and reliability. It can serve as a tool for identification, authentication, and quality control of herbal drugs. In this chapter, attempts are made to expand the use of HPTLC and at the same time create interest among prospective researchers in herbal analysis. The developed method can be used as a quality control tool for rapid authentication of a wide variety of herbal samples. Some examples demonstrate the role of fingerprinting in quality control and assessment.

  10. Making Quality Sense: A Guide to Quality, Tools and Techniques, Awards and the Thinking Behind Them.

    ERIC Educational Resources Information Center

    Owen, Jane

    This document is intended to guide further education colleges and work-based learning providers through some of the commonly used tools, techniques, and theories of quality management. The following are among the topics discussed: (1) various ways of defining quality; methods used by organizations to achieve quality (quality control, quality…

  11. A Model of Risk Analysis in Analytical Methodology for Biopharmaceutical Quality Control.

    PubMed

    Andrade, Cleyton Lage; Herrera, Miguel Angel De La O; Lemes, Elezer Monte Blanco

    2018-01-01

    One key quality control parameter for biopharmaceutical products is the analysis of residual cellular DNA. To determine small amounts of DNA (around 100 pg) that may be in a biologically derived drug substance, an analytical method should be sensitive, robust, reliable, and accurate. In principle, three techniques have the ability to measure residual cellular DNA: radioactive dot-blot, a type of hybridization; threshold analysis; and quantitative polymerase chain reaction. Quality risk management is a systematic process for evaluating, controlling, and reporting risks that may affect method capabilities, and it supports a scientific and practical approach to decision making. This paper evaluates, by quality risk management, an alternative approach to assessing the performance risks associated with quality control methods used with biopharmaceuticals, using the hazard analysis and critical control points (HACCP) tool. This tool makes it possible to find the steps in an analytical procedure with the highest impact on method performance. By applying these principles to DNA analysis methods, we conclude that the radioactive dot-blot assay has the largest number of critical control points, followed by quantitative polymerase chain reaction and threshold analysis. From the analysis of hazards (i.e., points of method failure) and the associated method procedure critical control points, we conclude that the analytical methodology with the lowest risk of performance failure for residual cellular DNA testing is quantitative polymerase chain reaction. LAY ABSTRACT: In order to mitigate the risk of adverse events from residual cellular DNA that is not completely cleared from downstream production processes, regulatory agencies have required the industry to guarantee a very low level of DNA in biologically derived pharmaceutical products. The technique historically used was radioactive blot hybridization. However, this technique is challenging to implement in a quality control laboratory: it is laborious, time consuming, semi-quantitative, and requires a radioisotope. Along with dot-blot hybridization, two alternative techniques were evaluated: threshold analysis and quantitative polymerase chain reaction. Quality risk management tools were applied to compare the techniques, taking into account the uncertainties, the possibility of circumstances or future events, and their effects upon method performance. By illustrating the application of these tools with DNA methods, we provide an example of how they can be used to support a scientific and practical approach to decision making and to assess and manage method performance risk. This paper discusses, under the principles of quality risk management, an additional approach to the development and selection of analytical quality control methods using the risk analysis tool hazard analysis and critical control points. This tool makes it possible to find the method procedural steps with the highest impact on method reliability (called critical control points). Our model concluded that the radioactive dot-blot assay has the largest number of critical control points, followed by quantitative polymerase chain reaction and threshold analysis. Quantitative polymerase chain reaction is shown to be the best alternative analytical methodology for residual cellular DNA analysis. © PDA, Inc. 2018.

  12. An Automated Treatment Plan Quality Control Tool for Intensity-Modulated Radiation Therapy Using a Voxel-Weighting Factor-Based Re-Optimization Algorithm

    PubMed Central

    Song, Ting; Li, Nan; Zarepisheh, Masoud; Li, Yongbao; Gautier, Quentin; Zhou, Linghong; Mell, Loren; Jiang, Steve; Cerviño, Laura

    2016-01-01

    Intensity-modulated radiation therapy (IMRT) currently plays an important role in radiotherapy, but its treatment plan quality can vary significantly among institutions and planners. Treatment plan quality control (QC) is a necessary component for individual clinics to ensure that patients receive treatments with high therapeutic gain ratios. The voxel-weighting factor-based plan re-optimization mechanism has been proven able to explore a larger Pareto surface (solution domain) and therefore increase the possibility of finding an optimal treatment plan. In this study, we incorporated additional modules into an in-house developed voxel-weighting factor-based re-optimization algorithm, enhancing it into a highly automated and accurate IMRT plan QC tool (TPS-QC tool). After importing an under-assessment plan, the TPS-QC tool was able to generate a QC report within 2 minutes. This QC report contains the plan quality determination as well as information supporting the determination. Finally, IMRT plan quality can be controlled by approving quality-passed plans and replacing quality-failed plans using the TPS-QC tool. The feasibility and accuracy of the proposed TPS-QC tool were evaluated using 25 clinically approved cervical cancer patient IMRT plans and 5 manually created poor-quality IMRT plans. The results showed high consistency between the QC report quality determinations and the actual plan quality. In the 25 clinically approved cases that the TPS-QC tool identified as passed, a greater difference could be observed for dosimetric endpoints for organs at risk (OAR) than for the planning target volume (PTV), implying that better dose sparing could be achieved in OAR than in PTV. In addition, the dose-volume histogram (DVH) curves of the TPS-QC tool re-optimized plans satisfied the dosimetric criteria more frequently than did the under-assessment plans. Moreover, the criteria for unsatisfied dosimetric endpoints in the 5 poor-quality plans could typically be satisfied when the TPS-QC tool generated re-optimized plans, without sacrificing other dosimetric endpoints. In addition to its feasibility and accuracy, the proposed TPS-QC tool is user-friendly and easy to operate, both necessary characteristics for clinical use. PMID:26930204

  13. Comparison of quality control software tools for diffusion tensor imaging.

    PubMed

    Liu, Bilan; Zhu, Tong; Zhong, Jianhui

    2015-04-01

    Image quality in diffusion tensor imaging (DTI) is critical for image interpretation, diagnostic accuracy, and efficiency. However, DTI is susceptible to numerous detrimental artifacts that may impair the reliability and validity of the obtained data. Although many quality control (QC) software tools have been developed and are widely used, each with its own tradeoffs, there is still no general agreement on an image quality control routine for DTI, and the practical impact of these tradeoffs is not well studied. An objective comparison that identifies the pros and cons of each QC tool will help users make the best choice among tools for specific DTI applications. This study aims to quantitatively compare the effectiveness of three popular QC tools: DTI Studio (Johns Hopkins University), DTIPrep (University of North Carolina at Chapel Hill, University of Iowa, and University of Utah), and TORTOISE (National Institutes of Health). Both synthetic and in vivo human brain data were used to quantify the adverse effects of major DTI artifacts on tensor calculation, as well as the effectiveness of the different QC tools in identifying and correcting these artifacts. The technical basis of each tool was discussed, and the ways in which particular techniques affect the output of each tool were analyzed. The different functions and I/O formats that the three QC tools provide for building a general DTI processing pipeline and for integration with other popular image processing tools were also discussed. Copyright © 2015 Elsevier Inc. All rights reserved.

  14. Control by quality: proposition of a typology.

    PubMed

    Pujo, P; Pillet, M

    The application of Quality tools and methods in industrial management has always had a fundamental impact on the control of production. It influences the behavior of the actors concerned, while introducing the necessary notions and formalizations, especially for production systems with little or no automation, which constitute a large part of industrial activity. Several quality approaches are applied in the workshop and are implemented at the level of the control. In this paper, the authors present a typology of the various approaches that have successively influenced control, such as statistical process control, quality assurance, and continuous improvement. First, the authors present a parallel between production control and quality organizational structure. They note the duality between control, which is aimed at increasing productivity, and quality, which aims to satisfy the needs of the customer. They also note the hierarchical organizational structure of these two systems of management with, at each level, the notion of a feedback loop. This notion is fundamental to any kind of decision making. The paper is organized around the operational, tactical, and strategic levels, describing for each level the main methods and tools for control by quality. The overview of these tools and methods starts at the operational level, with Statistical Process Control, the Taguchi technique, and the "six sigma" approach. On the tactical level, we find a quality system approach, with a documented description of the procedures introduced in the firm. The management system can refer here to Quality Assurance, Total Productive Maintenance, or Management by Total Quality. The formalization through procedures of the rules of decision governing process control enhances the validity of these rules. This leads to the enhancement of their reliability and to their consolidation. All this counterbalances the intrinsically fluctuating behavior of the human control operators. Strategic control by quality is then detailed, and the two main approaches, the continuous improvement approach and the proactive improvement approach, are introduced. Finally, the authors observe that at each of the three levels, continuous process improvement, which is a component of Total Quality, becomes an essential preoccupation for the control. Ultimately, the recursive utilization of the Deming cycle remains the best practice for control by quality.

  15. Assessing Educational Processes Using Total-Quality-Management Measurement Tools.

    ERIC Educational Resources Information Center

    Macchia, Peter, Jr.

    1993-01-01

    Discussion of the use of Total Quality Management (TQM) assessment tools in educational settings highlights and gives examples of fishbone diagrams, or cause and effect charts; Pareto diagrams; control charts; histograms and check sheets; scatter diagrams; and flowcharts. Variation and quality are discussed in terms of continuous process…

  16. Inconsistency in the items included in tools used in general health research and physical therapy to evaluate the methodological quality of randomized controlled trials: a descriptive analysis

    PubMed Central

    2013-01-01

    Background Assessing the risk of bias of randomized controlled trials (RCTs) is crucial to understand how biases affect treatment effect estimates. A number of tools have been developed to evaluate risk of bias of RCTs; however, it is unknown how these tools compare to each other in the items included. The main objective of this study was to describe which individual items are included in RCT quality tools used in general health and physical therapy (PT) research, and how these items compare to those of the Cochrane Risk of Bias (RoB) tool. Methods We used comprehensive literature searches and a systematic approach to identify tools that evaluated the methodological quality or risk of bias of RCTs in general health and PT research. We extracted individual items from all quality tools. We calculated the frequency of quality items used across tools and compared them to those in the RoB tool. Comparisons were made between general health and PT quality tools using Chi-squared tests. Results In addition to the RoB tool, 26 quality tools were identified, with 19 being used in general health and seven in PT research. The total number of quality items included in general health research tools was 130, compared with 48 items across PT tools and seven items in the RoB tool. The most frequently included items in general health research tools (14/19, 74%) were inclusion and exclusion criteria, and appropriate statistical analysis. In contrast, the most frequent items included in PT tools (86%, 6/7) were: baseline comparability, blinding of investigator/assessor, and use of intention-to-treat analysis. Key items of the RoB tool (sequence generation and allocation concealment) were included in 71% (5/7) of PT tools, and 63% (12/19) and 37% (7/19) of general health research tools, respectively. Conclusions There is extensive item variation across tools that evaluate the risk of bias of RCTs in health research. Results call for an in-depth analysis of items that should be used to assess risk of bias of RCTs. Further empirical evidence on the use of individual items and the psychometric properties of risk of bias tools is needed. PMID:24044807

  17. Inconsistency in the items included in tools used in general health research and physical therapy to evaluate the methodological quality of randomized controlled trials: a descriptive analysis.

    PubMed

    Armijo-Olivo, Susan; Fuentes, Jorge; Ospina, Maria; Saltaji, Humam; Hartling, Lisa

    2013-09-17

    Assessing the risk of bias of randomized controlled trials (RCTs) is crucial to understand how biases affect treatment effect estimates. A number of tools have been developed to evaluate risk of bias of RCTs; however, it is unknown how these tools compare to each other in the items included. The main objective of this study was to describe which individual items are included in RCT quality tools used in general health and physical therapy (PT) research, and how these items compare to those of the Cochrane Risk of Bias (RoB) tool. We used comprehensive literature searches and a systematic approach to identify tools that evaluated the methodological quality or risk of bias of RCTs in general health and PT research. We extracted individual items from all quality tools. We calculated the frequency of quality items used across tools and compared them to those in the RoB tool. Comparisons were made between general health and PT quality tools using Chi-squared tests. In addition to the RoB tool, 26 quality tools were identified, with 19 being used in general health and seven in PT research. The total number of quality items included in general health research tools was 130, compared with 48 items across PT tools and seven items in the RoB tool. The most frequently included items in general health research tools (14/19, 74%) were inclusion and exclusion criteria, and appropriate statistical analysis. In contrast, the most frequent items included in PT tools (86%, 6/7) were: baseline comparability, blinding of investigator/assessor, and use of intention-to-treat analysis. Key items of the RoB tool (sequence generation and allocation concealment) were included in 71% (5/7) of PT tools, and 63% (12/19) and 37% (7/19) of general health research tools, respectively. There is extensive item variation across tools that evaluate the risk of bias of RCTs in health research. Results call for an in-depth analysis of items that should be used to assess risk of bias of RCTs. Further empirical evidence on the use of individual items and the psychometric properties of risk of bias tools is needed.
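
    The chi-squared comparisons reported above reduce to contingency tables of item inclusion. The sketch below reruns one such comparison using counts quoted in the abstract (allocation concealment: 5/7 PT tools vs 7/19 general health tools); the test choice mirrors the abstract, though with cells this small Fisher's exact test would often be preferred.

      # Chi-squared test: does item inclusion differ between tool groups?
      from scipy.stats import chi2_contingency

      #        includes item, omits item
      table = [[5, 2],    # physical therapy tools (n=7)
               [7, 12]]   # general health tools (n=19)

      chi2, p, dof, expected = chi2_contingency(table)
      print(f"chi2={chi2:.2f}, dof={dof}, p={p:.3f}")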

  18. Terms of Productivity, Including the Relationship Between Productivity, Effectiveness and Efficiency.

    DTIC Science & Technology

    1989-04-01

    for Awareness Juran on Planning for Quality, 1988, J.M. Juran What is Total Quality Control? The Japanese Way, 1985, Kaoru Ishikawa Guide to Quality...Control, 1982, Kaoru Ishikawa Andrews, M. (1985). Statistical Process Control: Mandatory Management Tool. Production April 1985. Bushe, G. (1988

  19. Assessing Lymphatic Filariasis Data Quality in Endemic Communities in Ghana, Using the Neglected Tropical Diseases Data Quality Assessment Tool for Preventive Chemotherapy.

    PubMed

    de Souza, Dziedzom K; Yirenkyi, Eric; Otchere, Joseph; Biritwum, Nana-Kwadwo; Ameme, Donne K; Sackey, Samuel; Ahorlu, Collins; Wilson, Michael D

    2016-03-01

    The activities of the Global Programme for the Elimination of Lymphatic Filariasis have been in operation since the year 2000, with Mass Drug Administration (MDA) undertaken yearly in disease-endemic communities. Information collected during MDA, such as population demographics, age, sex, drugs used and remaining, and therapeutic and geographic coverage, can be used to assess the quality of the data reported. To assist country programmes in evaluating the information reported, the WHO, in collaboration with NTD partners, including ENVISION/RTI, developed an NTD Data Quality Assessment (DQA) tool for use by programmes. This study was undertaken to evaluate the tool and assess the quality of data reported in some endemic communities in Ghana. A cross-sectional study, involving review of data registers and interviews with drug distributors, disease control officers, and health information officers using the NTD DQA tool, was carried out in selected communities in three LF-endemic districts in Ghana. Data registers for service delivery points were obtained from the district health offices for assessment. The assessment verified reported results against recounted values for five indicators: number of tablets received, number of tablets used, number of tablets remaining, MDA coverage, and population treated. Furthermore, drug distributors, disease control officers, and health information officers (at the first data aggregation level) were interviewed, using the DQA tool, to determine the performance of the functional areas of the data management system. The results showed that over 60% of the data reported were inaccurate, exposing the challenges and limitations of the data management system. The DQA tool is a very useful monitoring and evaluation (M&E) tool that can be used to elucidate and address data quality issues in various NTD control programmes.
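
    The DQA's central check is arithmetic: recount an indicator from source registers and compare it with the value reported upward. A minimal sketch of that verification step follows; the 10% tolerance and the example records are assumptions for illustration, not values from the WHO tool.

      # Verification factor = recounted / reported, flagged against a tolerance.
      def verification_factor(recounted, reported):
          return recounted / reported if reported else float("nan")

      records = [  # (indicator, reported, recounted) -- invented example values
          ("tablets received", 12000, 12000),
          ("tablets used", 9500, 8700),
          ("population treated", 4750, 4300),
      ]
      for name, reported, recounted in records:
          vf = verification_factor(recounted, reported)
          flag = "OK" if abs(vf - 1.0) <= 0.10 else "INACCURATE"  # assumed 10% tolerance
          print(f"{name:20s} VF={vf:.2f}  {flag}")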

  20. Handling Qualities Evaluation of Pilot Tools for Spacecraft Docking in Earth Orbit

    NASA Technical Reports Server (NTRS)

    Bilimoria, Karl D.; Mueller, Eric; Frost, Chad

    2009-01-01

    A new generation of spacecraft is now under development by NASA to replace the Space Shuttle and return astronauts to the Moon. These spacecraft will have a manual control capability for several mission tasks, and the ease and precision with which pilots can execute these tasks will have an important effect on mission risk and training costs. This paper focuses on the handling qualities of a spacecraft based on dynamics similar to those of the Crew Exploration Vehicle, during the last segment of the docking task with a space station in low Earth orbit. A previous study established that handling qualities for this task degrade significantly as the level of translation-into-rotation coupling increases. The goal of this study is to evaluate the efficacy of various pilot aids designed to mitigate the handling qualities degradation caused by this coupling. Four pilot tools were evaluated: dead-band box/indicator, flight-path marker, translation guidance cues, and feed-forward control. Each of these pilot tools improved handling qualities, generally with greater improvements resulting from using these tools in combination. A key result of this study is that feed-forward control effectively counteracts coupling effects, providing solid Level 1 handling qualities for the spacecraft configuration evaluated.

  1. Assessments of the quality of randomized controlled trials published in International Journal of Urology from 1994 to 2011.

    PubMed

    Cho, Hee Ju; Chung, Jae Hoon; Jo, Jung Ki; Kang, Dong Hyuk; Cho, Jeong Man; Yoo, Tag Keun; Lee, Seung Wook

    2013-12-01

    Randomized controlled trials are one of the most reliable resources for assessing the effectiveness and safety of medical treatments. Low-quality randomized controlled trials carry a large bias that can ultimately impair the reliability of their conclusions. The present study aimed to evaluate the quality of randomized controlled trials published in International Journal of Urology by using multiple quality assessment tools. Randomized controlled trial articles published in International Journal of Urology were found using the PubMed MEDLINE database, and qualitative analysis was carried out with three distinct assessment tools: the Jadad scale, the van Tulder scale and the Cochrane Collaboration Risk of Bias Tool. The quality of randomized controlled trials was analyzed by publication year, type of subjects, intervention, presence of funding and whether an institutional review board reviewed the study. A total of 68 randomized controlled trial articles were published among 1399 original articles in International Journal of Urology. Among these randomized controlled trials, 10 (2.70%) were from 1994 to 1999, 23 (4.10%) were from 2000 to 2005 and 35 (4.00%) were from 2006 to 2011 (P = 0.494). On assessment with the Jadad and van Tulder scales, the numbers and percentages of high-quality randomized controlled trials increased over time. Studies that had institutional review board review, funding resources or that were carried out in multiple institutions had an increased percentage of high-quality articles. The numbers and percentages of high-quality randomized controlled trials published in International Journal of Urology have increased over time. Furthermore, randomized controlled trials with funding resources, institutional review board review or those carried out in multiple institutions have been found to be of higher quality compared with others not presenting these features. © 2013 The Japanese Urological Association.

  2. Chemometrics-based process analytical technology (PAT) tools: applications and adaptation in pharmaceutical and biopharmaceutical industries.

    PubMed

    Challa, Shruthi; Potumarthi, Ravichandra

    2013-01-01

    Process analytical technology (PAT) is used to monitor and control critical process parameters in raw materials and in-process products to maintain critical quality attributes and build quality into the product. Process analytical technology can be successfully implemented in pharmaceutical and biopharmaceutical industries not only to impart quality into the products but also to prevent out-of-specification results and improve productivity. PAT implementation eliminates the drawbacks of traditional methods, which involve excessive sampling, and facilitates rapid testing through direct sampling without destruction of the sample. However, to successfully adapt PAT tools to pharmaceutical and biopharmaceutical environments, a thorough understanding of the process is needed, along with mathematical and statistical tools to analyze the large multidimensional spectral data generated by PAT tools. Chemometrics is a chemical discipline that incorporates both statistical and mathematical methods to obtain and analyze relevant information from PAT spectral tools. Applications of commonly used PAT tools in combination with appropriate chemometric methods, along with their advantages and working principles, are discussed. Finally, the systematic application of PAT tools in a biopharmaceutical environment to control critical process parameters for achieving product quality is diagrammatically represented.
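
    A typical chemometric pairing with a PAT sensor is a partial least squares (PLS) model relating spectra to a critical quality attribute. The sketch below is a generic, hedged example with synthetic "spectra"; it shows the modeling pattern, not any specific method from the review.

      # PLS regression from spectra to a quality attribute, with cross-validation.
      import numpy as np
      from sklearn.cross_decomposition import PLSRegression
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(2)
      X = rng.normal(size=(60, 200))  # 60 synthetic spectra, 200 wavelengths
      y = 0.8 * X[:, 50] + 0.5 * X[:, 120] + rng.normal(scale=0.1, size=60)  # synthetic attribute

      pls = PLSRegression(n_components=3)
      r2 = cross_val_score(pls, X, y, cv=5, scoring="r2")
      print(f"cross-validated R^2: {r2.mean():.2f} +/- {r2.std():.2f}")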

  3. Artificial Intelligence Approach to Support Statistical Quality Control Teaching

    ERIC Educational Resources Information Center

    Reis, Marcelo Menezes; Paladini, Edson Pacheco; Khator, Suresh; Sommer, Willy Arno

    2006-01-01

    Statistical quality control--SQC (consisting of Statistical Process Control, Process Capability Studies, Acceptance Sampling and Design of Experiments) is a very important tool to obtain, maintain and improve the Quality level of goods and services produced by an organization. Despite its importance, and the fact that it is taught in technical and…

  4. Developing a consumer evaluation tool of weight control strategy advertisements on the Internet.

    PubMed

    Luevorasirikul, Kanokrat; Gray, Nicola J; Anderson, Claire W

    2008-06-01

    To develop two evaluation tools for weight loss and weight gain advertisements on the Internet in order to help consumers to evaluate the quality of information within these advertisements. One hundred websites identified by Internet search engines for weight loss and weight gain strategies (50 websites each) were evaluated using two specific scoring instruments, developed by adapting questions from the 'DISCERN' tool and reviewing all related weight control guidelines and advertising regulations. The validity and reliability of the adapted tools were tested. Our evaluation tools rated the information from most websites as poor quality (70%). In the case of weight loss strategies, statements about rapid (18%) and permanent (28%) weight loss caused concern as well as lack of sensible advice about dieting and a lack of product warnings (84%). Safety concerns relating to weight gain products were the lack of warnings about side effects in products containing steroids and creatine (92%). The adapted tools exhibited acceptable validity and reliability. Quality of information within weight control advertisements on the Internet was generally poor. Problems of false claims, little advice on healthy ways to modify weight and few warnings on side effects have been highlighted in this study.

  5. Implementation of "Quality by Design (QbD)" Approach for the Development of 5-Fluorouracil Loaded Thermosensitive Hydrogel.

    PubMed

    Dalwadi, Chintan; Patel, Gayatri

    2016-01-01

    The purpose of this study was to investigate the Quality by Design (QbD) principle for the preparation of hydrogel products, to prove both the practicability and utility of applying the QbD concept to hydrogel-based controlled release systems. Product and process understanding helps decrease the variability of critical material and process parameters, which yields a quality product and reduces risk. This study includes the identification of the Quality Target Product Profile (QTPP) and Critical Quality Attributes (CQAs) from the literature or preliminary studies. To identify and control the variability in process and material attributes, two QbD tools were utilized: Quality Risk Management (QRM) and experimental design. Further, these help to identify the effect of these attributes on CQAs. Potential risk factors were identified from a fishbone diagram, screened by risk assessment, and optimized by a 3-level, 2-factor experimental design with center points in triplicate, to analyze the precision of the target process. The optimized formulation was further characterized by gelling time, gelling temperature, rheological parameters, in-vitro biodegradation, and in-vitro drug release. A design space was created using the experimental design tool; working within this control space keeps all failure modes below the risk level. In conclusion, the QbD approach with the QRM tool provides a potent and effective framework for building quality into the hydrogel.
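
    The design named above, a 3-level, 2-factor full factorial with center points in triplicate, enumerates easily. The sketch below generates the run list; the factor names are hypothetical stand-ins, since the study's actual factors are not given in the abstract.

      # 3-level, 2-factor full factorial with the center point run in triplicate.
      from itertools import product

      levels = [-1, 0, 1]                       # coded low / center / high
      factors = ["polymer_conc", "drug_load"]   # hypothetical factor names

      runs = [dict(zip(factors, combo)) for combo in product(levels, repeat=2)]
      runs += [dict(zip(factors, (0, 0))) for _ in range(2)]  # center point -> 3 total

      for i, run in enumerate(runs, 1):
          print(i, run)
      print(f"total runs: {len(runs)}")  # 9 factorial points + 2 extra center points = 11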

  6. Standardisation of DNA quantitation by image analysis: quality control of instrumentation.

    PubMed

    Puech, M; Giroud, F

    1999-05-01

    DNA image analysis is frequently performed in clinical practice as a prognostic tool and to improve diagnosis. The precision of prognosis and diagnosis depends on the accuracy of analysis and particularly on the quality of image analysis systems. It has been reported that image analysis systems used for DNA quantification differ widely in their characteristics (Thunissen et al.: Cytometry 27: 21-25, 1997). This induces inter-laboratory variations when the same sample is analysed in different laboratories. In microscopic image analysis, the principal instrumentation errors arise from the optical and electronic parts of systems. They bring about problems of instability, non-linearity, and shading and glare phenomena. The aim of this study is to establish tools and standardised quality control procedures for microscopic image analysis systems. Specific reference standard slides have been developed to control instability, non-linearity, shading and glare phenomena, and segmentation efficiency. Several systems have been checked with these tools and quality control procedures. Interpretation criteria and accuracy limits for these quality control procedures are proposed according to the conclusions of a European project called PRESS (Prototype Reference Standard Slide). Beyond these limits, tested image analysis systems are not qualified to perform precise DNA analysis. The procedures presented in this work determine whether an image analysis system is qualified to deliver sufficiently precise DNA measurements for cancer case analysis. If a controlled system falls outside the defined limits, recommendations are given to find a solution to the problem.
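
    One instrumentation check of the kind standardized here, linearity of the measured signal against a reference slide, is simple to express. The sketch below regresses measured values on reference values and reports the fit; the readings and the R^2 tolerance are invented for illustration.

      # Linearity check: regress measured readings against reference values.
      import numpy as np

      reference = np.array([1.0, 2.0, 3.0, 4.0, 5.0])      # known amounts (arbitrary units)
      measured = np.array([1.02, 2.05, 2.95, 4.20, 4.90])  # hypothetical system readings

      slope, intercept = np.polyfit(reference, measured, 1)
      r2 = np.corrcoef(reference, measured)[0, 1] ** 2
      print(f"slope={slope:.3f}  intercept={intercept:.3f}  R^2={r2:.4f}")
      # A tolerance such as R^2 >= 0.99 (assumed here) would flag a non-linear system.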

  7. Human Connectome Project Informatics: quality control, database services, and data visualization

    PubMed Central

    Marcus, Daniel S.; Harms, Michael P.; Snyder, Abraham Z.; Jenkinson, Mark; Wilson, J Anthony; Glasser, Matthew F.; Barch, Deanna M.; Archie, Kevin A.; Burgess, Gregory C.; Ramaratnam, Mohana; Hodge, Michael; Horton, William; Herrick, Rick; Olsen, Timothy; McKay, Michael; House, Matthew; Hileman, Michael; Reid, Erin; Harwell, John; Coalson, Timothy; Schindler, Jon; Elam, Jennifer S.; Curtiss, Sandra W.; Van Essen, David C.

    2013-01-01

    The Human Connectome Project (HCP) has developed protocols, standard operating and quality control procedures, and a suite of informatics tools to enable high throughput data collection, data sharing, automated data processing and analysis, and data mining and visualization. Quality control procedures include methods to maintain data collection consistency over time, to measure head motion, and to establish quantitative modality-specific overall quality assessments. Database services developed as customizations of the XNAT imaging informatics platform support both internal daily operations and open access data sharing. The Connectome Workbench visualization environment enables user interaction with HCP data and is increasingly integrated with the HCP's database services. Here we describe the current state of these procedures and tools and their application in the ongoing HCP study. PMID:23707591

  8. Dynamic Analyses of Result Quality in Energy-Aware Approximate Programs

    NASA Astrophysics Data System (ADS)

    Ringenburg, Michael F.

    Energy efficiency is a key concern in the design of modern computer systems. One promising approach to energy-efficient computation, approximate computing, trades off output precision for energy efficiency. However, this tradeoff can have unexpected effects on computation quality. This thesis presents dynamic analysis tools to study, debug, and monitor the quality and energy efficiency of approximate computations. We propose three styles of tools: prototyping tools that allow developers to experiment with approximation in their applications, online tools that instrument code to determine the key sources of error, and online tools that monitor the quality of deployed applications in real time. Our prototyping tool is based on an extension to the functional language OCaml. We add approximation constructs to the language, an approximation simulator to the runtime, and profiling and auto-tuning tools for studying and experimenting with energy-quality tradeoffs. We also present two online debugging tools and three online monitoring tools. The first online tool identifies correlations between output quality and the total number of executions of, and errors in, individual approximate operations. The second tracks the number of approximate operations that flow into a particular value. Our online tools comprise three low-cost approaches to dynamic quality monitoring. They are designed to monitor quality in deployed applications without spending more energy than is saved by approximation. Online monitors can be used to perform real time adjustments to energy usage in order to meet specific quality goals. We present prototype implementations of all of these tools and describe their usage with several applications. Our prototyping, profiling, and autotuning tools allow us to experiment with approximation strategies and identify new strategies, our online tools succeed in providing new insights into the effects of approximation on output quality, and our monitors succeed in controlling output quality while still maintaining significant energy efficiency gains.
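
    The low-cost online monitoring idea described above can be sketched generically: re-run a small random sample of inputs through the precise computation and compare against the approximate output, so monitoring costs stay below the energy saved. Everything below (the toy kernel, the 5% sampling rate, the error metric) is an assumption for illustration, not the thesis's implementation.

      # Sampling-based quality monitor for an approximate computation.
      import random

      def precise(xs):
          return sum(v * v for v in xs)

      def approximate(xs):
          return 2 * sum(v * v for v in xs[::2])  # crude subsampled estimate

      random.seed(0)
      errors = []
      for _ in range(1000):
          data = [random.random() for _ in range(100)]
          out = approximate(data)
          if random.random() < 0.05:              # monitor only a 5% sample of runs
              ref = precise(data)
              errors.append(abs(out - ref) / ref)

      print(f"monitored runs: {len(errors)}, mean relative error: "
            f"{sum(errors) / len(errors):.3f}")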

  9. Statistical quality control through overall vibration analysis

    NASA Astrophysics Data System (ADS)

    Carnero, M. Carmen; González-Palma, Rafael; Almorza, David; Mayorga, Pedro; López-Escobar, Carlos

    2010-05-01

    The present study introduces the concept of statistical quality control in automotive wheel bearing manufacturing processes. Defects in the products under analysis can have a direct influence on passengers' safety and comfort. At present, the use of vibration analysis on machine tools for quality control purposes is not very extensive in manufacturing facilities. Noise and vibration are common quality problems in bearings. These failure modes likely occur under certain operating conditions and do not require high vibration amplitudes, but relate to certain vibration frequencies. The vibration frequencies are affected by the type of surface problems (chattering) of ball races that are generated through grinding processes. The purpose of this paper is to identify grinding process variables that affect the quality of bearings by using statistical principles in the field of machine tools. In addition, an evaluation of the quality results of the finished parts under different combinations of process variables is assessed. This paper intends to establish the foundations to predict the quality of the products through the analysis of self-induced vibrations during the contact between the grinding wheel and the parts. To achieve this goal, the overall self-induced vibration readings under different combinations of process variables are analysed using statistical tools. The analysis of data and design of experiments follows a classical approach, considering all potential interactions between variables. The analysis of data is conducted through analysis of variance (ANOVA) for data sets that meet normality and homoscedasticity criteria. This paper utilizes different statistical tools to support the conclusions, such as the chi-squared, Shapiro-Wilk, symmetry, kurtosis, Cochran, Bartlett, Hartley, and Kruskal-Wallis tests. The analysis presented is the starting point for extending the use of predictive techniques (vibration analysis) for quality control. This paper demonstrates the existence of predictive variables (high-frequency vibration displacements) that are sensitive to the process setup and the quality of the products obtained. Based on the results of this overall vibration analysis, a second paper will analyse self-induced vibration spectra in order to define limit vibration bands, controllable every cycle or connected to permanent vibration-monitoring systems able to adjust sensitive process variables identified by ANOVA once the vibration readings exceed established quality limits.
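
    The statistical workflow described, assumption checks followed by ANOVA (with a non-parametric fallback), looks roughly like the sketch below. The vibration readings are synthetic and the 0.05 thresholds are conventional assumptions, not values from the paper.

      # Normality and homoscedasticity checks, then ANOVA or Kruskal-Wallis.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(3)
      groups = [rng.normal(mu, 0.4, size=30) for mu in (2.0, 2.1, 2.6)]  # 3 process setups

      normal = all(stats.shapiro(g).pvalue > 0.05 for g in groups)
      equal_var = stats.bartlett(*groups).pvalue > 0.05

      if normal and equal_var:
          stat, p = stats.f_oneway(*groups)    # one-way ANOVA
          test = "ANOVA"
      else:
          stat, p = stats.kruskal(*groups)     # non-parametric fallback
          test = "Kruskal-Wallis"
      print(f"{test}: stat={stat:.2f}, p={p:.4f}")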

  10. 10 CFR 71.125 - Control of measuring and test equipment.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... MATERIAL Quality Assurance § 71.125 Control of measuring and test equipment. The licensee, certificate holder, and applicant for a CoC shall establish measures to assure that tools, gauges, instruments, and other measuring and testing devices used in activities affecting quality are properly controlled...

  11. Developing and using a rubric for evaluating evidence-based medicine point-of-care tools.

    PubMed

    Shurtz, Suzanne; Foster, Margaret J

    2011-07-01

    The research sought to establish a rubric for evaluating evidence-based medicine (EBM) point-of-care tools in a health sciences library. The authors searched the literature for EBM tool evaluations and found that most previous reviews were designed to evaluate the ability of an EBM tool to answer a clinical question. The researchers' goal was to develop and complete rubrics for assessing these tools based on criteria for a general evaluation of tools (reviewing content, search options, quality control, and grading) and criteria for an evaluation of clinical summaries (searching tools for treatments of common diagnoses and evaluating summaries for quality control). Differences between EBM tools' options, content coverage, and usability were minimal. However, the products' methods for locating and grading evidence varied widely in transparency and process. As EBM tools are constantly updating and evolving, evaluation of these tools needs to be conducted frequently. Standards for evaluating EBM tools need to be established, with one method being the use of objective rubrics. In addition, EBM tools need to provide more information about authorship, reviewers, methods for evidence collection, and grading system employed.

  12. Quality Control System using Simple Implementation of Seven Tools for Batik Textile Manufacturing

    NASA Astrophysics Data System (ADS)

    Ragil Suryoputro, Muhammad; Sugarindra, Muchamad; Erfaisalsyah, Hendy

    2017-06-01

    In order to produce better products and mitigate defects, every company must implement a quality control system, and will look for one that is capable and reliable. One such method is a simple implementation of the seven tools for controlling defects. The case studied in this research was the defect level of xyz grey fabric on shuttle loom 2 at a batik manufacturing company. The seven tools comprise: flowchart, check sheet, histogram, scatter diagram, control charts, Pareto diagram, and fishbone (cause-and-effect) diagram. Check sheet results identified the types of defects in the woven xyz grey fabric as warp defects, double warp, broken warp, empty warp, loose warp, ugly edges, thick warp, and rust. The control chart analysis indicates that the process is out of control; many data points still fall outside the control limits. The scatter diagram shows a positive correlation between the percentage of defects and the production volume. Based on the Pareto diagram, repair priority goes to the dominant defect type, warp (44%); double warp also shows the highest histogram value, at 23635.11 m. In addition, the fishbone diagram traces double warp and the other defect types to causes in materials, methods, machines, measurements, man, and environment. With this information the company can act to prevent and minimize defects and improve product quality.
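
    Two of the seven tools used in the study, the Pareto ranking of defect types and 3-sigma control limits for a fraction-defective (p) chart, are easy to reproduce in outline. The defect counts and sample size below are invented, not the study's data.

      # Pareto ranking of defect types and p-chart control limits.
      import numpy as np

      defects = {"double warp": 236, "broken warp": 120, "empty warp": 55,
                 "ugly edges": 40, "thick warp": 22, "rust": 9}  # invented counts

      total = sum(defects.values())
      cum = 0
      for name, count in sorted(defects.items(), key=lambda kv: -kv[1]):
          cum += count
          print(f"{name:12s} {count:4d}  cumulative {100 * cum / total:5.1f}%")

      # p-chart limits: p_bar +/- 3 * sqrt(p_bar * (1 - p_bar) / n)
      p_bar, n = 0.04, 500  # assumed average defect fraction and sample size
      half_width = 3 * np.sqrt(p_bar * (1 - p_bar) / n)
      print(f"LCL={max(0.0, p_bar - half_width):.4f}, UCL={p_bar + half_width:.4f}")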

  13. The evolution of diagnosis-related groups (DRGs): from its beginnings in case-mix and resource use theory, to its implementation for payment and now for its current utilization for quality within and outside the hospital.

    PubMed

    Goldfield, Norbert

    2010-01-01

    Policymakers are searching for ways to control health care costs and improve quality. Diagnosis-related groups (DRGs) are by far the most important cost control and quality improvement tool that governments and private payers have implemented. This article reviews why DRGs have had this singular success, both in the hospital sector and, over the past 10 years, in ambulatory and managed care settings. Last, the author reviews current trends in the development and implementation of tools that have the key ingredients of DRG success: a categorical clinical model, separation of the clinical model from payment weights, separate payment adjustments for nonclinical factors, and outlier payments. Virtually all current tools used to manage health care costs and improve quality lack these characteristics. This shortcoming explains a key reason for the failure, for example, of the Medicare Advantage program to control health care costs. The article concludes with a discussion of future developments for DRG-type models outside the hospital sector.

  14. Manufacturing Execution Systems: Examples of Performance Indicator and Operational Robustness Tools.

    PubMed

    Gendre, Yannick; Waridel, Gérard; Guyon, Myrtille; Demuth, Jean-François; Guelpa, Hervé; Humbert, Thierry

    Manufacturing Execution Systems (MES) are computerized systems used to measure production performance in terms of productivity, yield, and quality. The first part describes performance indicators such as overall equipment effectiveness (OEE), process robustness tools, and statistical process control. The second part details some tools that help operators maintain process robustness and control by preventing deviations from target control charts. The MES was developed by Syngenta together with CIMO for automation.
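
    The OEE indicator mentioned above has a standard decomposition, OEE = availability x performance x quality, sketched below with invented shift numbers (the abstract does not give the company's data).

      # Overall equipment effectiveness from one production shift (invented data).
      planned_time = 480.0    # minutes in the shift
      downtime = 45.0         # minutes lost to stops
      ideal_cycle_time = 0.8  # minutes per unit at rated speed
      units_produced = 450
      good_units = 430

      availability = (planned_time - downtime) / planned_time
      performance = ideal_cycle_time * units_produced / (planned_time - downtime)
      quality = good_units / units_produced

      oee = availability * performance * quality
      print(f"A={availability:.1%}  P={performance:.1%}  Q={quality:.1%}  OEE={oee:.1%}")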

  15. Information quality-control model

    NASA Technical Reports Server (NTRS)

    Vincent, D. A.

    1971-01-01

    Model serves as graphic tool for estimating complete product objectives from limited input information, and is applied to cost estimations, product-quality evaluations, and effectiveness measurements for manpower resources allocation. Six product quality levels are defined.

  16. Controlling Pollutants and Sources: Indoor Air Quality Design Tools for Schools

    EPA Pesticide Factsheets

    To protect indoor environmental quality, the designer should understand indoor air quality problems and seek to eliminate potential sources of contamination that originate outdoors as well as indoors.

  17. Quality control and in-service inspection technology for hybrid-composite girder bridges.

    DOT National Transportation Integrated Search

    2014-08-01

    This report describes efforts to develop quality control tools and in-service inspection technologies for the fabrication and construction of Hybrid Composite Beams (HCBs). HCBs are a new bridge technology currently being evaluated by the Missouri De...

  18. Innovative tools for quality assessment: integrated quality criteria for review of multiple study designs (ICROMS).

    PubMed

    Zingg, W; Castro-Sanchez, E; Secci, F V; Edwards, R; Drumright, L N; Sevdalis, N; Holmes, A H

    2016-04-01

    With the aim of facilitating a more comprehensive review process in public health, including patient safety, we established a tool we have termed ICROMS (Integrated quality Criteria for the Review Of Multiple Study designs), which unifies, integrates and refines current quality criteria for a large range of study designs, including qualitative research. Review, pilot testing and expert consensus. The tool is the result of an iterative four-phase process over two years: 1) gathering of established criteria for assessing controlled, non-controlled and qualitative study designs; 2) pilot testing of a first version in two systematic reviews on behavioural change in infection prevention and control and in antibiotic prescribing; 3) further refinement and the addition of more study designs in the context of the European Centre for Disease Prevention and Control funded project 'Systematic review and evidence-based guidance on organisation of hospital infection control programmes' (SIGHT); 4) scrutiny by the pan-European expert panel of the SIGHT project, which had the objective of ensuring robustness of the systematic review. ICROMS includes established quality criteria for randomised studies, controlled before-and-after studies and interrupted time series, and incorporates criteria for non-controlled before-and-after studies, cohort studies and qualitative studies. The tool consists of two parts: 1) a list of quality criteria specific to each study design, as well as criteria applicable across all study designs, combined through a scoring system; 2) a 'decision matrix', which specifies the robustness of the study by identifying minimum requirements according to the study type and the relevance of the study to the review question. The decision matrix directly determines inclusion or exclusion of a study in the review. ICROMS was applied to a series of systematic reviews to test its feasibility and usefulness in the appraisal of multiple study designs. The tool was applicable across a wide range of study designs and outcome measures. ICROMS enables a comprehensive yet feasible appraisal of a large range of study designs for inclusion in systematic reviews addressing behaviour change studies in patient safety and public health. The tool is sufficiently flexible to be applied to a variety of other domains in health-related research. Beyond its application to systematic reviews, we envisage that ICROMS can have a positive effect on researchers to be more rigorous in their study design and more diligent in their reporting. Copyright © 2015 The Royal Society for Public Health. Published by Elsevier Ltd. All rights reserved.
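
    ICROMS's two-part logic, per-criterion scores plus a study-type "decision matrix" of minimum requirements, can be expressed compactly. The criteria names, scores, and thresholds below are hypothetical placeholders; the published tool defines the real ones.

      # Hedged sketch of an ICROMS-style include/exclude decision.
      MIN_SCORE = {"RCT": 22, "CBA": 18, "NCBA": 22, "qualitative": 16}  # hypothetical minima
      MANDATORY = {"RCT": ["sequence_generation"], "NCBA": ["baseline_measurement"]}

      def icroms_decision(study_type, scores):
          """scores: dict criterion -> rating (0/1/2). Returns True to include."""
          total = sum(scores.values())
          mandatory_met = all(scores.get(c, 0) > 0 for c in MANDATORY.get(study_type, []))
          return total >= MIN_SCORE[study_type] and mandatory_met

      print(icroms_decision("NCBA", {"baseline_measurement": 2, "blinding": 1,
                                     "reporting": 2}))  # False: total below the minimum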

  19. ChiLin: a comprehensive ChIP-seq and DNase-seq quality control and analysis pipeline.

    PubMed

    Qin, Qian; Mei, Shenglin; Wu, Qiu; Sun, Hanfei; Li, Lewyn; Taing, Len; Chen, Sujun; Li, Fugen; Liu, Tao; Zang, Chongzhi; Xu, Han; Chen, Yiwen; Meyer, Clifford A; Zhang, Yong; Brown, Myles; Long, Henry W; Liu, X Shirley

    2016-10-03

    Transcription factor binding, histone modification, and chromatin accessibility studies are important approaches to understanding the biology of gene regulation. ChIP-seq and DNase-seq have become the standard techniques for studying protein-DNA interactions and chromatin accessibility respectively, and comprehensive quality control (QC) and analysis tools are critical to extracting the most value from these assay types. Although many analysis and QC tools have been reported, few combine ChIP-seq and DNase-seq data analysis and quality control in a unified framework with a comprehensive and unbiased reference of data quality metrics. ChiLin is a computational pipeline that automates the quality control and data analyses of ChIP-seq and DNase-seq data. It is developed using a flexible and modular software framework that can be easily extended and modified. ChiLin is ideal for batch processing of many datasets and is well suited for large collaborative projects involving ChIP-seq and DNase-seq from different designs. ChiLin generates comprehensive quality control reports that include comparisons with historical data derived from over 23,677 public ChIP-seq and DNase-seq samples (11,265 datasets) from eight literature-based classified categories. To the best of our knowledge, this atlas represents the most comprehensive ChIP-seq and DNase-seq related quality metric resource currently available. These historical metrics provide useful heuristic quality references for experiments across all commonly used assay types. Using representative datasets, we demonstrate the versatility of the pipeline by applying it to different assay types of ChIP-seq data. The pipeline software is available open source at https://github.com/cfce/chilin. ChiLin is a scalable and powerful tool to process large batches of ChIP-seq and DNase-seq datasets. The analysis output and quality metrics have been structured into user-friendly directories and reports. We have successfully compiled 23,677 profiles into a comprehensive quality atlas with fine classification for users.
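
    As a rough illustration of how such historical metrics can serve as quality references, the sketch below ranks a new sample's FRiP score (fraction of reads in peaks, one of the metrics ChiLin-style pipelines report) against a reference distribution. The distribution and score here are synthetic stand-ins, not ChiLin output.

      import numpy as np

      def percentile_rank(value, historical):
          """Percentile of `value` within a historical metric distribution."""
          return 100.0 * np.mean(np.asarray(historical) <= value)

      # Synthetic reference distribution standing in for metrics harvested
      # from thousands of public samples.
      historical_frip = np.random.default_rng(0).beta(2, 8, size=5000)
      new_sample_frip = 0.25
      print(f"New sample FRiP sits at the "
            f"{percentile_rank(new_sample_frip, historical_frip):.1f}th percentile")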

  20. GEOSPATIAL QUALITY COUNCIL

    EPA Science Inventory

    Geospatial Science is increasingly becoming an important tool in making Agency decisions. Quality Control and Quality Assurance are required to be integrated during the planning, implementation and assessment of geospatial databases, processes and products. In order to ensure Age...

  1. Quality and Efficiency Improvement Tools for Every Radiologist.

    PubMed

    Kudla, Alexei U; Brook, Olga R

    2018-06-01

    In an era of value-based medicine, data-driven quality improvement is more important than ever to ensure safe and efficient imaging services. Familiarity with high-value tools enables all radiologists to successfully engage in quality and efficiency improvement. In this article, we review the model for improvement, strategies for measurement, and common practical tools with real-life examples that include Run chart, Control chart (Shewhart chart), Fishbone (Cause-and-Effect or Ishikawa) diagram, Pareto chart, 5 Whys, and Root Cause Analysis. Copyright © 2018 The Association of University Radiologists. Published by Elsevier Inc. All rights reserved.
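
    For readers who want to try one of these tools on their own data, here is a minimal sketch of a Shewhart individuals chart, with invented turnaround times; the 2.66 factor is the standard constant that converts the average moving range into 3-sigma limits.

      import numpy as np

      def individuals_chart_limits(x):
          """Centre line and 3-sigma limits for a Shewhart individuals chart,
          estimated from the average moving range (2.66 = 3/d2, d2 = 1.128)."""
          x = np.asarray(x, dtype=float)
          mr_bar = np.mean(np.abs(np.diff(x)))   # average moving range
          centre = x.mean()
          return centre, centre - 2.66 * mr_bar, centre + 2.66 * mr_bar

      # Hypothetical weekly report-turnaround times (hours)
      times = [26, 31, 28, 35, 30, 27, 33, 29, 41, 30]
      cl, lcl, ucl = individuals_chart_limits(times)
      out = [t for t in times if not lcl <= t <= ucl]
      print(f"CL={cl:.1f}, LCL={lcl:.1f}, UCL={ucl:.1f}, out-of-control points: {out}")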

  2. CUSTOMER/SUPPLIER ACCOUNTABILITY AND PROGRAM IMPLEMENTATION

    EPA Science Inventory

    Quality assurance (QA) and quality control (QC) are the basic components of a QA program, which is a fundamental quality management tool. The quality of outputs and services strongly depends on the caliber of the communications between the "customer" and the "supplier." Clear under...

  3. Developing and using a rubric for evaluating evidence-based medicine point-of-care tools

    PubMed Central

    Foster, Margaret J

    2011-01-01

    Objective: The research sought to establish a rubric for evaluating evidence-based medicine (EBM) point-of-care tools in a health sciences library. Methods: The authors searched the literature for EBM tool evaluations and found that most previous reviews were designed to evaluate the ability of an EBM tool to answer a clinical question. The researchers' goal was to develop and complete rubrics for assessing these tools based on criteria for a general evaluation of tools (reviewing content, search options, quality control, and grading) and criteria for an evaluation of clinical summaries (searching tools for treatments of common diagnoses and evaluating summaries for quality control). Results: Differences between EBM tools' options, content coverage, and usability were minimal. However, the products' methods for locating and grading evidence varied widely in transparency and process. Conclusions: As EBM tools are constantly updating and evolving, evaluation of these tools needs to be conducted frequently. Standards for evaluating EBM tools need to be established, with one method being the use of objective rubrics. In addition, EBM tools need to provide more information about authorship, reviewers, methods for evidence collection, and grading system employed. PMID:21753917

  4. qcML: An Exchange Format for Quality Control Metrics from Mass Spectrometry Experiments*

    PubMed Central

    Walzer, Mathias; Pernas, Lucia Espona; Nasso, Sara; Bittremieux, Wout; Nahnsen, Sven; Kelchtermans, Pieter; Pichler, Peter; van den Toorn, Henk W. P.; Staes, An; Vandenbussche, Jonathan; Mazanek, Michael; Taus, Thomas; Scheltema, Richard A.; Kelstrup, Christian D.; Gatto, Laurent; van Breukelen, Bas; Aiche, Stephan; Valkenborg, Dirk; Laukens, Kris; Lilley, Kathryn S.; Olsen, Jesper V.; Heck, Albert J. R.; Mechtler, Karl; Aebersold, Ruedi; Gevaert, Kris; Vizcaíno, Juan Antonio; Hermjakob, Henning; Kohlbacher, Oliver; Martens, Lennart

    2014-01-01

    Quality control is increasingly recognized as a crucial aspect of mass spectrometry based proteomics. Several recent papers discuss relevant parameters for quality control and present applications to extract these from the instrumental raw data. What has been missing, however, is a standard data exchange format for reporting these performance metrics. We therefore developed the qcML format, an XML-based standard that follows the design principles of the related mzML, mzIdentML, mzQuantML, and TraML standards from the HUPO-PSI (Proteomics Standards Initiative). In addition to the XML format, we also provide tools for the calculation of a wide range of quality metrics as well as a database format and interconversion tools, so that existing LIMS systems can easily add relational storage of the quality control data to their existing schema. We here describe the qcML specification, along with possible use cases and an illustrative example of the subsequent analysis possibilities. All information about qcML is available at http://code.google.com/p/qcml. PMID:24760958
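
    To give a flavour of what an XML-based QC metric report looks like, the snippet below emits a skeletal qcML-like document. The element names, version string and accession are illustrative placeholders; consult the published qcML specification for the normative schema and controlled vocabulary.

      import xml.etree.ElementTree as ET

      # Placeholder structure only -- see the qcML specification for the
      # normative element names, attributes and CV accessions.
      root = ET.Element("qcML", version="0.0.8")
      run = ET.SubElement(root, "runQuality", ID="run_1")
      ET.SubElement(run, "qualityParameter",
                    name="MS1 spectra count", value="34212",
                    accession="QC:0000006")
      print(ET.tostring(root, encoding="unicode"))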

  5. qcML: an exchange format for quality control metrics from mass spectrometry experiments.

    PubMed

    Walzer, Mathias; Pernas, Lucia Espona; Nasso, Sara; Bittremieux, Wout; Nahnsen, Sven; Kelchtermans, Pieter; Pichler, Peter; van den Toorn, Henk W P; Staes, An; Vandenbussche, Jonathan; Mazanek, Michael; Taus, Thomas; Scheltema, Richard A; Kelstrup, Christian D; Gatto, Laurent; van Breukelen, Bas; Aiche, Stephan; Valkenborg, Dirk; Laukens, Kris; Lilley, Kathryn S; Olsen, Jesper V; Heck, Albert J R; Mechtler, Karl; Aebersold, Ruedi; Gevaert, Kris; Vizcaíno, Juan Antonio; Hermjakob, Henning; Kohlbacher, Oliver; Martens, Lennart

    2014-08-01

    Quality control is increasingly recognized as a crucial aspect of mass spectrometry based proteomics. Several recent papers discuss relevant parameters for quality control and present applications to extract these from the instrumental raw data. What has been missing, however, is a standard data exchange format for reporting these performance metrics. We therefore developed the qcML format, an XML-based standard that follows the design principles of the related mzML, mzIdentML, mzQuantML, and TraML standards from the HUPO-PSI (Proteomics Standards Initiative). In addition to the XML format, we also provide tools for the calculation of a wide range of quality metrics as well as a database format and interconversion tools, so that existing LIMS systems can easily add relational storage of the quality control data to their existing schema. We here describe the qcML specification, along with possible use cases and an illustrative example of the subsequent analysis possibilities. All information about qcML is available at http://code.google.com/p/qcml. © 2014 by The American Society for Biochemistry and Molecular Biology, Inc.

  6. Total Quality Management: A Recipe for Success

    DTIC Science & Technology

    1990-04-02

    Total Quality Management (TQM) is a high-level Department of Defense (DOD) initiative that is being touted as the primary management tool to force...to create a DOD-wide organizational climate that will stimulate and perpetuate individual productivity-enhancing contributions. Keywords: Quality control; Quality management; TQM.

  7. FQC Dashboard: integrates FastQC results into a web-based, interactive, and extensible FASTQ quality control tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Joseph; Pirrung, Meg; McCue, Lee Ann

    FQC is software that facilitates large-scale quality control of FASTQ files by carrying out a QC protocol, parsing results, and aggregating quality metrics within and across experiments into an interactive dashboard. The dashboard utilizes human-readable configuration files to manipulate the pages and tabs, and is extensible with CSV data.
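
    The aggregation step can be approximated in a few lines: the sketch below walks a set of FastQC summary.txt files (three tab-separated columns: status, module, file) and flattens them into one CSV of the kind a dashboard could consume. The qc_output/ directory layout is an assumption for illustration, not FQC's actual input contract.

      import csv, glob, os

      rows = []
      for summary in glob.glob("qc_output/*/summary.txt"):
          sample = os.path.basename(os.path.dirname(summary))
          with open(summary) as fh:
              # FastQC summary.txt: status <TAB> module <TAB> filename
              for status, module, _ in csv.reader(fh, delimiter="\t"):
                  rows.append({"sample": sample, "module": module, "status": status})

      with open("qc_aggregate.csv", "w", newline="") as out:
          writer = csv.DictWriter(out, fieldnames=["sample", "module", "status"])
          writer.writeheader()
          writer.writerows(rows)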

  8. Ten tools of continuous quality improvement: a review and case example of hospital discharge.

    PubMed

    Ziegenfuss, J T; McKenna, C K

    1995-01-01

    Concepts and methods of continuous quality improvement have been endorsed by quality specialists in American health care, and their use has convinced CEOs that industrial methods can make a contribution to health and medical care. For all the quality improvement publications, there are still few that offer a clear, concise definition and an explanation of the primary tools for teaching purposes. This report reviews ten continuous quality improvement methods: the problem-solving cycle, affinity diagrams, cause-and-effect diagrams, Pareto diagrams, histograms, bar charts, control charts, scatter diagrams, checklists, and the process decision program chart. These do not represent an exhaustive list, but a set of commonly used tools. They are applied to a case study of bed utilization in a university hospital.
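
    Of the ten tools, the Pareto diagram is perhaps the easiest to reproduce programmatically: rank the causes and report each one's cumulative share to expose the "vital few". The discharge-delay categories and counts below are invented for illustration.

      from collections import Counter

      delays = Counter({"pharmacy orders": 42, "transport wait": 31,
                        "paperwork": 18, "bed cleaning": 7, "other": 5})
      total = sum(delays.values())
      cumulative = 0
      for cause, count in delays.most_common():   # descending frequency
          cumulative += count
          print(f"{cause:16s} {count:3d}  cumulative {100*cumulative/total:5.1f}%")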

  9. Median of patient results as a tool for assessment of analytical stability.

    PubMed

    Jørgensen, Lars Mønster; Hansen, Steen Ingemann; Petersen, Per Hyltoft; Sölétormos, György

    2015-06-15

    In spite of the well-established external quality assessment and proficiency testing surveys of analytical quality performance in laboratory medicine, a simple tool to monitor the long-term analytical stability as a supplement to the internal control procedures is often needed. Patient data from daily internal control schemes were used for monthly appraisal of the analytical stability. This was accomplished by using the monthly medians of patient results to disclose deviations from analytical stability, and by comparing divergences with the quality specifications for allowable analytical bias based on biological variation. Seventy-five percent of the twenty analytes analysed on two COBAS INTEGRA 800 instruments performed in accordance with the optimum and with the desirable specifications for bias. Patient results applied in analytical quality performance control procedures are the most reliable sources of material, as they represent the genuine substance of the measurements and therefore circumvent the problems associated with non-commutable materials in external assessment. Patient medians in the monthly monitoring of analytical stability in laboratory medicine are an inexpensive, simple and reliable tool to monitor the steadiness of the analytical practice. Copyright © 2015 Elsevier B.V. All rights reserved.
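
    A minimal sketch of the monthly check, with invented numbers: compute the month's patient-result median, express its deviation from the long-term target as a percentage, and compare it with the allowable bias derived from biological variation.

      import statistics

      target_median = 4.9        # long-term patient median (invented units)
      allowable_bias_pct = 3.0   # desirable specification for bias

      monthly_results = [5.1, 4.8, 4.7, 5.0, 4.9, 5.3, 4.6, 5.0, 4.8, 5.2]
      month_median = statistics.median(monthly_results)
      bias_pct = 100 * (month_median - target_median) / target_median
      status = "stable" if abs(bias_pct) <= allowable_bias_pct else "investigate"
      print(f"median={month_median:.2f}, bias={bias_pct:+.1f}% -> {status}")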

  10. Consequent use of IT tools as a driver for cost reduction and quality improvements

    NASA Astrophysics Data System (ADS)

    Hein, Stefan; Rapp, Roberto; Feustel, Andreas

    2013-10-01

    The semiconductor industry devotes considerable effort to cost reduction and quality improvement. The consistent use of IT tools is one way to support these goals. With the extension of its 150 mm fab to 200 mm, Robert Bosch increased the systematic use of data analysis and Advanced Process Control (APC).

  11. Quality control in diagnostic immunohistochemistry: integrated on-slide positive controls.

    PubMed

    Bragoni, A; Gambella, A; Pigozzi, S; Grigolini, M; Fiocca, R; Mastracci, L; Grillo, F

    2017-11-01

    Standardization in immunohistochemistry is a priority in modern pathology and requires strict quality control. Cost containment has also become fundamental, and auditing of all procedures must take both these principles into account. Positive controls must be routinely performed so that their positivity guarantees the appropriateness of the immunohistochemical procedure. The aim of this study is to develop a low-cost procedure, based on a punch biopsy (PB) tool, to construct positive controls that can be integrated into the patient's tissue slide. Sixteen frequently used control blocks were selected and multiple cylindrical samples were obtained using a 5-mm diameter punch biopsy tool, separately re-embedding them in single blocks. For each diagnostic immunoreaction requiring a positive control, an integrated PB-control section (cut from the appropriate PB-control block) was added to the top right corner of the diagnostic slide before immunostaining. This integrated control technique permitted a saving of 4.75% in total direct lab costs and proved to be technically feasible and reliable. Our proposal is easy to perform and within the reach of all pathology labs, requires easily available tools, costs less to apply than using external paired controls, and ensures that a specific control for each slide is always available.

  12. The methodological quality assessment tools for preclinical and clinical studies, systematic review and meta-analysis, and clinical practice guideline: a systematic review.

    PubMed

    Zeng, Xiantao; Zhang, Yonggang; Kwong, Joey S W; Zhang, Chao; Li, Sheng; Sun, Feng; Niu, Yuming; Du, Liang

    2015-02-01

    To systematically review the methodological assessment tools for pre-clinical and clinical studies, systematic reviews and meta-analyses, and clinical practice guidelines. We searched PubMed, the Cochrane Handbook for Systematic Reviews of Interventions, the Joanna Briggs Institute (JBI) Reviewers Manual, the Centre for Reviews and Dissemination, the Critical Appraisal Skills Programme (CASP), the Scottish Intercollegiate Guidelines Network (SIGN), and the National Institute for Clinical Excellence (NICE) up to May 20th, 2014. Two authors selected studies and extracted data; quantitative analysis was performed to summarize the characteristics of the included tools. We included a total of 21 assessment tools for analysis. A number of tools were developed by academic organizations, and some were developed by only a small group of researchers. The JBI developed the highest number of methodological assessment tools, with CASP coming second. Tools for assessing the methodological quality of randomized controlled studies were most abundant. The Cochrane Collaboration's tool for assessing risk of bias is the best available tool for assessing RCTs. For cohort and case-control studies, we recommend the use of the Newcastle-Ottawa Scale. The Methodological Index for Non-Randomized Studies (MINORS) is an excellent tool for assessing non-randomized interventional studies, and the Agency for Healthcare Research and Quality (AHRQ) methodology checklist is applicable for cross-sectional studies. For diagnostic accuracy test studies, the Quality Assessment of Diagnostic Accuracy Studies-2 (QUADAS-2) tool is recommended; the SYstematic Review Centre for Laboratory animal Experimentation (SYRCLE) risk of bias tool is available for assessing animal studies; Assessment of Multiple Systematic Reviews (AMSTAR) is a measurement tool for systematic reviews/meta-analyses; an 18-item tool has been developed for appraising case series studies, and the Appraisal of Guidelines, Research and Evaluation (AGREE)-II instrument is widely used to evaluate clinical practice guidelines. We have successfully identified a variety of methodological assessment tools for different types of study design. However, further efforts in the development of critical appraisal tools are warranted, since there is currently a lack of such tools for other fields, e.g. genetic studies, and some existing tools (those for nested case-control studies and case reports, for example) need updating to be in line with current research practice and rigor. In addition, it is important to bear in mind that all critical appraisal tools remain subjective to some degree, and performance bias must be effectively avoided. © 2015 Chinese Cochrane Center, West China Hospital of Sichuan University and Wiley Publishing Asia Pty Ltd.

  13. HVAC SYSTEMS AS A TOOL IN CONTROLLING INDOOR AIR QUALITY: A LITERATURE REVIEW

    EPA Science Inventory

    The report gives results of a review of literature on the use of heating, ventilating, and air-conditioning (HVAC) systems to control indoor air quality (IAQ). Although significant progress has been made in reducing the energy consumption of HVAC systems, their effect on indoor a...

  14. Innovative Training for Occupational Health and Infection Control Workplace Assessment in Health Care

    ERIC Educational Resources Information Center

    O'Hara, Lyndsay; Bryce, Elizabeth Ann; Scharf, Sydney; Yassi, Annalee

    2012-01-01

    A user-friendly, high-quality workplace assessment field guide and an accompanying worksheet are invaluable tools for recognizing hazards in the hospital environment. These tools ensure that both front-line workers and health and safety and infection control professionals can systematically evaluate hazards and formulate recommendations.…

  15. Improving transcriptome construction in non-model organisms: integrating manual and automated gene definition in Emiliania huxleyi.

    PubMed

    Feldmesser, Ester; Rosenwasser, Shilo; Vardi, Assaf; Ben-Dor, Shifra

    2014-02-22

    The advent of Next Generation Sequencing technologies and corresponding bioinformatics tools allows the definition of transcriptomes in non-model organisms. Non-model organisms are of great ecological and biotechnological significance, and consequently the understanding of their unique metabolic pathways is essential. Several methods that integrate de novo assembly with genome-based assembly have been proposed. Yet, there are many open challenges in defining genes, particularly where genomes are not available or incomplete. Despite the large number of transcriptome assemblies that have been performed, quality control of the transcript building process, particularly on the protein level, is rarely, if ever, performed. To test and improve the quality of the automated transcriptome reconstruction, we used manually defined and curated genes, several of them experimentally validated. Several approaches to transcript construction were utilized, based on the available data: a draft genome, high quality RNAseq reads, and ESTs. In order to maximize the contribution of the various data sources, we integrated methods including de novo and genome-based assembly, as well as EST clustering. After each step, a set of manually curated genes was used for quality assessment of the transcripts. The interplay between the automated pipeline and the quality control indicated which additional processes were required to improve the transcriptome reconstruction. We discovered that E. huxleyi has a very high percentage of non-canonical splice junctions, and relatively high rates of intron retention, which caused unique issues with the currently available tools. While individual tools missed genes and artificially joined overlapping transcripts, combining the results of several tools improved the completeness and quality considerably. The final collection, created from the integration of several quality control and improvement rounds, was compared to the manually defined set at both the DNA and protein levels, and resulted in an improvement of 20% over any of the read-based approaches alone. To the best of our knowledge, this is the first time that an automated transcript definition has been subjected to quality control using manually defined and curated genes and the process thereafter improved. We recommend using a set of manually curated genes to troubleshoot transcriptome reconstruction.

  16. A YEAR-LONG MM5 EVALUATION USING A MODEL EVALUATION TOOLKIT

    EPA Science Inventory

    Air quality modeling has expanded in both sophistication and application over the past decade. Meteorological and air quality modeling tools are being used for research, forecasting, and regulatory related emission control strategies. Results from air quality simulations have far...

  17. Some aspects of precise laser machining - Part 2: Experimental

    NASA Astrophysics Data System (ADS)

    Grabowski, Marcin; Wyszynski, Dominik; Ostrowski, Robert

    2018-05-01

    The paper describes the role of laser beam polarization in the quality of a laser-beam-machined cutting tool edge. In micromachining, the preparation of the cutting tool plays a key role in dimensional accuracy, sharpness and the quality of the cutting edges. In order to assure quality and dimensional accuracy of the cutting tool edge, it is necessary to apply laser polarization control. A diode-pumped 532 nm pulsed Nd:YAG laser was used in the research. The laser beam polarization used was linear (horizontal or vertical). The goal of the research was to describe the impact of laser beam polarization on the efficiency of the cutting process and the quality of the machined parts (edge, surface), made of polycrystalline diamond (PCD) and cubic boron nitride (cBN). The application of precise cutting tools in micromachining has a significant impact on the minimum uncut chip thickness and the quality of the parts. The research was carried out within the INNOLOT program funded by the National Centre for Research and Development.

  18. Operations management tools to be applied for textile

    NASA Astrophysics Data System (ADS)

    Maralcan, A.; Ilhan, I.

    2017-10-01

    In this paper, basic concepts of process analysis such as flow time, inventory, bottleneck, labour cost and utilization are illustrated first. The effect of bottlenecks on business results is especially emphasized. In the next section, tools for productivity measurement are introduced and exemplified: the KPI (Key Performance Indicator) tree, OEE (Overall Equipment Effectiveness) and takt time. A KPI tree is a diagram on which we can visualize all the variables of an operation that drive financial results through cost and profit. OEE is a tool to measure the potential extra capacity of a piece of equipment or an employee. Takt time is a tool to determine the process flow rate according to customer demand. The KPI tree is studied across the whole process, while OEE is exemplified for a stenter frame machine, which is the most important machine (and usually the bottleneck) and the most expensive investment in a finishing plant. Takt time is exemplified for the quality control department. Finally, the quality tools six sigma, control charts and jidoka are introduced. Six sigma is a tool to measure process capability and thereby the probability of a defect. A control chart is a powerful tool to monitor the process. The idea of jidoka (detect, stop and alert) is to alert people that there is a problem in the process.
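
    The two headline calculations are easy to make concrete. Here is a back-of-envelope sketch with invented stenter-frame numbers: OEE is the product of availability, performance and quality, and takt time is available working time divided by customer demand.

      planned_min, downtime_min = 480, 45
      ideal_rate = 60.0                        # metres per minute at ideal speed
      actual_output, defective = 22000, 400    # metres produced / metres defective

      availability = (planned_min - downtime_min) / planned_min
      performance = actual_output / (ideal_rate * (planned_min - downtime_min))
      quality = (actual_output - defective) / actual_output
      oee = availability * performance * quality
      print(f"OEE = {availability:.2f} x {performance:.2f} x {quality:.2f} = {oee:.1%}")

      daily_demand = 180                         # pieces per day to inspect
      takt_minutes = planned_min / daily_demand  # minutes available per piece
      print(f"Takt time = {takt_minutes:.1f} min per piece")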

  19. Honeywell Technical Order Transfer Tests.

    DTIC Science & Technology

    1987-06-12

    of simple corrections, a reasonable reproduction of the original could be generated. The quality was not good enough for a production environment. Lack of automated quality control (AQC) tools could account for the errors.

  20. GEOSPATIAL QA

    EPA Science Inventory

    Geospatial Science is increasingly becoming an important tool in making Agency decisions. Quality Control and Quality Assurance are required to be integrated during the planning, implementation and assessment of geospatial databases, processes and products. In order to ensure Age...

  1. Maturity method demonstration : final report.

    DOT National Transportation Integrated Search

    2003-07-01

    The concrete maturity method is a quality control/quality assurance tool that can be used to assist contractors and transportation officials in producing cost-efficient, durable concrete structures. This report documents the findings of an investigat...
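
    The calculation behind the method is compact: under the classic Nurse-Saul formulation, maturity is the temperature-time factor M = sum((Ta - T0) * dt), with the datum temperature T0 commonly taken as -10 degrees C. The sensor readings below are invented.

      T0 = -10.0   # datum temperature, deg C
      readings = [(20.5, 1.0), (22.0, 1.0), (25.5, 1.0), (24.0, 1.0)]  # (deg C, hours)

      # Accumulate (temperature above datum) x (time interval)
      maturity = sum((temp - T0) * hours for temp, hours in readings)
      print(f"Nurse-Saul maturity index: {maturity:.0f} deg C-hours")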

  2. Tools for Schools: Filtration for Improved Air Quality. Technical Services Bulletin.

    ERIC Educational Resources Information Center

    2001

    This product bulletin addresses air pollution control in educational facilities to enhance educational performance, provides air quality recommendations for schools, and examines the filtration needs of various school areas. The types of air particles typically present are highlighted, and the use of proper filtration to control gases and vapors…

  3. The National Shipbuilding Research Program, Analytical Quality Circles

    DTIC Science & Technology

    1986-09-01

    standard tools for quality control, in English, see "Guide to Quality Control" by Dr. Kaoru Ishikawa, Asian Productivity Organization, Aoyama Dai-ichi...factors affect work evaluation is shown schematically by Characteristic-Factor Diagrams (also called Fishbone or Ishikawa Diagrams), see Figure 2-5

  4. Perceptual tools for quality-aware video networks

    NASA Astrophysics Data System (ADS)

    Bovik, A. C.

    2014-01-01

    Monitoring and controlling the quality of the viewing experience of videos transmitted over increasingly congested networks (especially wireless networks) is a pressing problem owing to rapid advances in video-centric mobile communication and display devices that are straining the capacity of the network infrastructure. New developments in automatic perceptual video quality models offer tools that have the potential to be used to perceptually optimize wireless video, leading to more efficient video data delivery and better received quality. In this talk I will review key perceptual principles that are, or could be, used to create effective video quality prediction models, and leading quality prediction models that utilize these principles. The goal is to be able to monitor and perceptually optimize video networks by making them "quality-aware."

  5. A simple tool for neuroimaging data sharing

    PubMed Central

    Haselgrove, Christian; Poline, Jean-Baptiste; Kennedy, David N.

    2014-01-01

    Data sharing is becoming increasingly common, but despite encouragement and facilitation by funding agencies, journals, and some research efforts, most neuroimaging data acquired today are still not shared, owing to remaining political, financial, social, and technical barriers. In particular, few technical solutions exist for researchers who are not part of larger efforts with dedicated sharing infrastructures, and social barriers such as the time commitment required to share can keep data from becoming publicly available. We present a system for sharing neuroimaging data, designed to be simple to use and to provide benefit to the data provider. The system consists of a server at the International Neuroinformatics Coordinating Facility (INCF) and user tools for uploading data to the server. The primary design principle for the user tools is ease of use: the user identifies a directory containing Digital Imaging and Communications in Medicine (DICOM) data, provides their INCF Portal authentication, and provides identifiers for the subject and imaging session. The user tool anonymizes the data and sends it to the server. The server then runs quality control routines on the data, and the data and the quality control reports are made public. The user retains control of the data and may change the sharing policy as they need. The result is that in a few minutes of the user's time, DICOM data can be anonymized and made publicly available, and an initial quality control assessment can be performed on the data. The system is currently functional, and user tools and access to the public image database are available at http://xnat.incf.org/. PMID:24904398
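
    The anonymization step the user tool performs can be pictured with pydicom; the sketch below is an illustrative stand-in under stated assumptions (the field list and file names are invented), not the INCF tool itself.

      import pydicom

      def anonymize(in_path, out_path, subject_id):
          """Overwrite direct identifiers before a DICOM file leaves the site."""
          ds = pydicom.dcmread(in_path)
          ds.PatientName = subject_id
          ds.PatientID = subject_id
          for keyword in ("PatientBirthDate", "PatientAddress", "OtherPatientIDs"):
              if keyword in ds:            # drop optional identifying elements
                  delattr(ds, keyword)
          ds.save_as(out_path)

      # Example usage (requires a real file):
      # anonymize("session001.dcm", "session001_anon.dcm", "sub-0001")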

  6. Advancement in modern approaches to mineral production quality control

    NASA Astrophysics Data System (ADS)

    Freidina, EV; Botvinnik, AA; Dvornikova, AN

    2017-02-01

    The natural resource potential of mineral deposits is represented by three categories: upside, attainable and investment. A modern methodology for production quality control is proposed in this paper, and its tools, aimed at ensuring agreement between product quality and market requirements, are described. Definitions of the costs of product quality compliance and non-compliance with consumer requirements are introduced; the latter is suggested for use in evaluating the resource potential of mineral deposits at a certain degree of probability.

  7. Recommendations for the Use of Automated Gray Matter Segmentation Tools: Evidence from Huntington’s Disease

    PubMed Central

    Johnson, Eileanoir B.; Gregory, Sarah; Johnson, Hans J.; Durr, Alexandra; Leavitt, Blair R.; Roos, Raymund A.; Rees, Geraint; Tabrizi, Sarah J.; Scahill, Rachael I.

    2017-01-01

    The selection of an appropriate segmentation tool is a challenge facing any researcher aiming to measure gray matter (GM) volume. Many tools have been compared, yet there is currently no method that can be recommended above all others; in particular, there is a lack of validation in disease cohorts. This work utilizes a clinical dataset to conduct an extensive comparison of segmentation tools. Our results confirm that all tools have advantages and disadvantages, and we present a series of considerations that may be of use when selecting a GM segmentation method, rather than a ranking of these tools. Seven segmentation tools were compared using 3 T MRI data from 20 controls, 40 premanifest Huntington’s disease (HD), and 40 early HD participants. Segmented volumes underwent detailed visual quality control. Reliability and repeatability of total, cortical, and lobular GM were investigated in repeated baseline scans. The relationship between each tool was also examined. Longitudinal within-group change over 3 years was assessed via generalized least squares regression to determine sensitivity of each tool to disease effects. Visual quality control and raw volumes highlighted large variability between tools, especially in occipital and temporal regions. Most tools showed reliable performance and the volumes were generally correlated. Results for longitudinal within-group change varied between tools, especially within lobular regions. These differences highlight the need for careful selection of segmentation methods in clinical neuroimaging studies. This guide acts as a primer aimed at the novice or non-technical imaging scientist providing recommendations for the selection of cohort-appropriate GM segmentation software. PMID:29066997

  8. Recommendations for the Use of Automated Gray Matter Segmentation Tools: Evidence from Huntington's Disease.

    PubMed

    Johnson, Eileanoir B; Gregory, Sarah; Johnson, Hans J; Durr, Alexandra; Leavitt, Blair R; Roos, Raymund A; Rees, Geraint; Tabrizi, Sarah J; Scahill, Rachael I

    2017-01-01

    The selection of an appropriate segmentation tool is a challenge facing any researcher aiming to measure gray matter (GM) volume. Many tools have been compared, yet there is currently no method that can be recommended above all others; in particular, there is a lack of validation in disease cohorts. This work utilizes a clinical dataset to conduct an extensive comparison of segmentation tools. Our results confirm that all tools have advantages and disadvantages, and we present a series of considerations that may be of use when selecting a GM segmentation method, rather than a ranking of these tools. Seven segmentation tools were compared using 3 T MRI data from 20 controls, 40 premanifest Huntington's disease (HD), and 40 early HD participants. Segmented volumes underwent detailed visual quality control. Reliability and repeatability of total, cortical, and lobular GM were investigated in repeated baseline scans. The relationship between each tool was also examined. Longitudinal within-group change over 3 years was assessed via generalized least squares regression to determine sensitivity of each tool to disease effects. Visual quality control and raw volumes highlighted large variability between tools, especially in occipital and temporal regions. Most tools showed reliable performance and the volumes were generally correlated. Results for longitudinal within-group change varied between tools, especially within lobular regions. These differences highlight the need for careful selection of segmentation methods in clinical neuroimaging studies. This guide acts as a primer aimed at the novice or non-technical imaging scientist providing recommendations for the selection of cohort-appropriate GM segmentation software.

  9. Master Pump Shutdown MPS Software Quality Assurance Plan (SQAP)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    BEVINS, R.R.

    2000-09-20

    The MPS Software Quality Assurance Plan (SQAP) describes the tools and strategy used in the development of the MPS software. The document also describes the methodology for controlling and managing changes to the software.

  10. Quality Management and Building Government Information Services.

    ERIC Educational Resources Information Center

    Farrell, Maggie

    1998-01-01

    Discusses serving library-patron needs in terms of customer service and quality control. Highlights include tools for measuring the quality of service (e.g., the SERVQUAL survey), advisory boards or focus groups, library "service statements," changing patron needs, new information formats, and justifying depository library services. (JAK)

  11. Relationship between influence function accuracy and polishing quality in magnetorheological finishing

    NASA Astrophysics Data System (ADS)

    Schinhaerl, Markus; Schneider, Florian; Rascher, Rolf; Vogt, Christian; Sperber, Peter

    2010-10-01

    Magnetorheological finishing is a typical commercial application of a computer-controlled polishing process in the manufacturing of precision optical surfaces. Precise knowledge of the material removal characteristic of the polishing tool (influence function) is essential for controlling the material removal on the workpiece surface by the dwell time method. Results from testing series with magnetorheological finishing have shown that a deviation of only 5% between the actual material removal characteristic of the polishing tool and that represented by the influence function caused a considerable reduction in polishing quality. The paper discusses reasons for inaccuracies in the influence function and their effects on polishing quality. The generic results of this research support the development of improved polishing strategies, and may be used in alternative applications of computer-controlled polishing processes that quantify the material removal characteristic by influence functions.
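
    A one-dimensional toy of the dwell-time principle shows why influence-function accuracy matters: predicted removal is the dwell-time map convolved with the influence function, so a 5% scaling error in the influence function propagates directly into the predicted surface. All numbers below are invented.

      import numpy as np

      influence = np.array([0.05, 0.20, 0.50, 0.20, 0.05])  # removal per unit dwell
      dwell = np.linspace(1.0, 2.0, 50)                     # dwell along a scan line

      removal_true = np.convolve(dwell, influence, mode="same")
      removal_model = np.convolve(dwell, 0.95 * influence, mode="same")  # 5% error
      worst = np.max(np.abs(removal_true - removal_model))
      print(f"Worst-case residual from a 5% influence-function error: {worst:.3f}")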

  12. A Review and Analysis of Remote Sensing Capability for Air Quality Measurements as a Potential Decision Support Tool Conducted by the NASA DEVELOP Program

    NASA Technical Reports Server (NTRS)

    Ross, A.; Richards, A.; Keith, K.; Frew, C.; Boseck, J.; Sutton, S.; Watts, C.; Rickman, D.

    2007-01-01

    This project focused on a comprehensive utilization of air quality model products as decision support tools (DST) needed for public health applications. A review of past and future air quality measurement methods and their uncertainty, along with the relationship of air quality to national and global public health, is vital. This project described current and future NASA satellite remote sensing and ground sensing capabilities and the potential for using these sensors to enhance the prediction, prevention, and control of public health effects that result from poor air quality. The qualitative uncertainty of current satellite remotely sensed air quality, ground-based remotely sensed air quality, the air quality/public health model, and the decision-making process is evaluated in this study. Current peer-reviewed literature suggests that remotely sensed air quality parameters correlate well with ground-based sensor data. A complement of satellite and ground-based sensor data is needed to enhance the models and tools used by policy makers to protect national and global public health communities.

  13. Quality evaluation and control of end cap welds in PHWR fuel elements by ultrasonic examination

    NASA Astrophysics Data System (ADS)

    Choi, M. S.; Yang, M. S.

    1991-02-01

    The current quality control procedure for nuclear fuel end cap welds is mainly dependent on destructive metallographic examination. A nondestructive examination technique, i.e., ultrasonic examination, has been developed to identify and evaluate weld discontinuities. A few interesting results of weld quality evaluation, obtained by applying the developed ultrasonic examination technique to PHWR fuel welds, are presented. In addition, the feasibility of weld quality control by ultrasonic examination is discussed. This study shows that ultrasonic examination is an effective and reliable method for detecting abnormal weld contours and weld discontinuities such as micro-fissures, cracks, upset splits and expulsions, and can be used as a quality control tool for the end cap welding process.

  14. FQC Dashboard: integrates FastQC results into a web-based, interactive, and extensible FASTQ quality control tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Joseph; Pirrung, Meg; McCue, Lee Ann

    FQC is software that facilitates quality control of FASTQ files by carrying out a QC protocol using FastQC, parsing results, and aggregating quality metrics into an interactive dashboard designed to richly summarize individual sequencing runs. The dashboard groups samples in dropdowns for navigation among the data sets, utilizes human-readable configuration files to manipulate the pages and tabs, and is extensible with CSV data.

  15. FQC Dashboard: integrates FastQC results into a web-based, interactive, and extensible FASTQ quality control tool

    DOE PAGES

    Brown, Joseph; Pirrung, Meg; McCue, Lee Ann

    2017-06-09

    FQC is software that facilitates quality control of FASTQ files by carrying out a QC protocol using FastQC, parsing results, and aggregating quality metrics into an interactive dashboard designed to richly summarize individual sequencing runs. The dashboard groups samples in dropdowns for navigation among the data sets, utilizes human-readable configuration files to manipulate the pages and tabs, and is extensible with CSV data.

  16. Optical Method For Monitoring Tool Control For Green Burnishing With Using Of Algorithms With Adaptive Settings

    NASA Astrophysics Data System (ADS)

    Lukyanov, A. A.; Grigoriev, S. N.; Bobrovskij, I. N.; Melnikov, P. A.; Bobrovskij, N. M.

    2017-05-01

    As new technology becomes more complex and its reliability requirements increase, the labour involved in control operations in industrial quality control systems grows significantly. Quality management control is important because it promotes the correct use of production conditions and compliance with the relevant requirements. Digital image processing allows production to reach a new technological level. Automated interpretation of information, the most complicated step, is the basis for decision-making in the management of production processes. This is even more complicated in the case of surface analysis of tools used for processing with metalworking fluids (MWF). The authors suggest a new algorithm for optical inspection of the wear of the cylindrical burnishing tool, which is used in surface plastic deformation without MWF. The main advantage of the proposed algorithm is the possibility of automatic recognition of images of the burnishing tool, with subsequent detection of its boundaries, location of the working surface, and automatic identification of defects and the wear area. Software implementing the algorithm was developed by the authors in the Matlab programming environment, but it can be implemented using other programming languages.
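
    The boundary-finding step can be caricatured in a few lines of OpenCV rather than Matlab: threshold the tool image and keep the largest contour as the tool outline. The synthetic circle below stands in for a real burnisher image; this is a generic sketch, not the authors' algorithm.

      import numpy as np
      import cv2

      img = np.zeros((200, 200), np.uint8)
      cv2.circle(img, (100, 100), 60, 255, -1)   # synthetic stand-in for the tool
      _, binary = cv2.threshold(img, 127, 255, cv2.THRESH_BINARY)
      contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                     cv2.CHAIN_APPROX_SIMPLE)
      tool = max(contours, key=cv2.contourArea)  # largest contour = tool boundary
      print(f"Tool boundary area: {cv2.contourArea(tool):.0f} px^2")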

  17. Study of the time and effort signal in cutting operations

    NASA Astrophysics Data System (ADS)

    Grosset, E.; Maillard, A.; Bouhelier, C.; Gasnier, J.

    1990-02-01

    The acquisition and computer processing of a cutting-force signal are discussed. An automatic control system used to measure the wear of machine tools and to carry out quality control throughout the cutting process is described. The testing system is used to evaluate the performance of tools that have been vacuum plated. The system is used as part of the BRITE study, the goal of which is to develop an expert system for measuring the wear of tools used during drilling and perforation operations.

  18. HPLC for quality control of polyimides

    NASA Technical Reports Server (NTRS)

    Young, P. R.; Sykes, G. F.

    1979-01-01

    High Pressure Liquid Chromatography (HPLC) as a quality control tool for polyimide resins and prepregs is presented. A database to help establish accept/reject criteria for these materials was developed. This work is intended to supplement, not replace, standard quality control tests normally conducted on incoming resins and prepregs. To help achieve these objectives, the HPLC separation of LARC-160 polyimide precursor resin was characterized. Room temperature resin aging effects were studied. Graphite-reinforced composites made from fresh and aged resin were fabricated and tested to determine whether the changes observed by HPLC were significant.

  19. Controlled dehydration improves the diffraction quality of two RNA crystals.

    PubMed

    Park, HaJeung; Tran, Tuan; Lee, Jun Hyuck; Park, Hyun; Disney, Matthew D

    2016-11-03

    Post-crystallization dehydration methods, applying either vapor diffusion or humidity control devices, have been widely used to improve the diffraction quality of protein crystals. Despite the fact that RNA crystals tend to diffract poorly, there is a dearth of reports on the application of dehydration methods to improve the diffraction quality of RNA crystals. We used dehydration techniques with a Free Mounting System (FMS, a humidity control device) to rescue the poor diffraction quality of RNA crystals. These approaches were applied to RNA constructs that model various RNA-mediated repeat expansion disorders. The method we describe herein could serve as a general tool to improve the diffraction quality of RNA crystals and facilitate structure determinations.

  20. A multiparametric automatic method to monitor long-term reproducibility in digital mammography: results from a regional screening programme.

    PubMed

    Gennaro, G; Ballaminut, A; Contento, G

    2017-09-01

    This study aims to illustrate a multiparametric automatic method for monitoring the long-term reproducibility of digital mammography systems, and its application on a large scale. Twenty-five digital mammography systems employed within a regional screening programme were controlled weekly using the same type of phantom, whose images were analysed by an automatic software tool. To assess system reproducibility levels, 15 image quality indices (IQIs) were extracted and compared with the corresponding indices previously determined by a baseline procedure. The coefficients of variation (COVs) of the IQIs were used to assess the overall variability. A total of 2553 phantom images were collected from the 25 digital mammography systems from March 2013 to December 2014. Most of the systems showed excellent image quality reproducibility over the surveillance interval, with mean variability below 5%. The variability of each IQI was below 5%, with the exception of one index, associated with the smallest phantom objects (0.25 mm), which was below 10%. The method applied for the reproducibility tests (multi-detail phantoms, a cloud-based automatic software tool measuring multiple image quality indices, and statistical process control) proved effective, applicable on a large scale and suitable for any type of digital mammography system. • Reproducibility of mammography image quality should be monitored by appropriate quality controls. • Use of automatic software tools allows image quality evaluation by multiple indices. • System reproducibility can be assessed by comparing current index values with baseline data. • Overall system reproducibility of modern digital mammography systems is excellent. • The method proposed and applied is cost-effective and easily scalable.
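
    The core of the reproducibility check reduces to a coefficient of variation per index: the sketch below compares each IQI's weekly COV with a 5% threshold. Index names and values are invented, not data from the study.

      import numpy as np

      weekly_iqi = {
          "contrast_0.25mm": [0.82, 0.79, 0.85, 0.80, 0.78, 0.84],
          "snr_large_detail": [41.2, 40.8, 41.9, 41.5, 40.6, 41.1],
      }
      for name, values in weekly_iqi.items():
          v = np.asarray(values)
          cov = 100 * v.std(ddof=1) / v.mean()   # coefficient of variation, %
          print(f"{name:18s} COV = {cov:4.1f}% -> {'OK' if cov < 5 else 'check'}")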

  1. Improving guideline adherence through intensive quality improvement and the use of a National Quality Register in Sweden for acute myocardial infarction.

    PubMed

    Peterson, Anette; Carlhed, Rickard; Lindahl, Bertil; Lindström, Gunilla; Aberg, Christina; Andersson-Gäre, Boel; Bojestig, Mats

    2007-01-01

    Data from the Swedish National Register in Cardiac Care have shown over the last 10 years an enduring gap between optimal treatment of acute myocardial infarction (AMI) according to current guidelines and the treatment actually given. We performed a controlled, prospective study in order to evaluate the effects of applying a multidisciplinary team-based improvement methodology to the use of evidence-based treatments in AMI, together with the use of a modified National Quality Register. The project engaged 25% of the Swedish hospitals. Multidisciplinary teams from 20 hospitals participating in the National Register in Cardiac Care, ranging from small to large hospitals, were trained in continuous quality improvement methodology. Twenty matched hospitals served as controls. Our efforts were focused on finding and applying tools and methods to increase adherence to the national guidelines for 5 different treatments for AMI. For measurement, specially designed quality control charts were made available in the National Register for Cardiac Care. To close the gap, an important issue for the teams was to get all 5 treatments in place. Ten of the hospitals in the study group reduced the gap in 5 of 5 treatments by 50%, while none of the control hospitals did so. This first, controlled prospective study of a registry supported by multidisciplinary team-based improvement methodology showed that this approach led to rapidly improved adherence to AMI guidelines in a broad spectrum of hospitals and that National Quality Registers can be helpful tools.

  2. Severe postpartum haemorrhage after vaginal delivery: a statistical process control chart to report seven years of continuous quality improvement.

    PubMed

    Dupont, Corinne; Occelli, Pauline; Deneux-Tharaux, Catherine; Touzet, Sandrine; Duclos, Antoine; Bouvier-Colle, Marie-Hélène; Rudigoz, René-Charles; Huissoud, Cyril

    2014-07-01

    To use statistical process control charts to describe trends in the prevalence of severe postpartum haemorrhage after vaginal delivery. This assessment was performed 7 years after we initiated a continuous quality improvement programme that began with regular criteria-based audits. Observational descriptive study in a French maternity unit in the Rhône-Alpes region. Quarterly clinical audit meetings were held to analyse all cases of severe postpartum haemorrhage after vaginal delivery and provide feedback on quality of care with statistical process control tools. The primary outcomes were the prevalence of severe PPH after vaginal delivery and its quarterly monitoring with a control chart. The secondary outcomes included the global quality of care for women with severe postpartum haemorrhage, including the performance rate of each recommended procedure. Differences in these variables between 2005 and 2012 were tested. From 2005 to 2012, the prevalence of severe postpartum haemorrhage declined significantly, from 1.2% to 0.6% of vaginal deliveries (p<0.001). Since 2010, the quarterly rate of severe PPH has not exceeded the upper control limits, that is, it has remained in statistical control. The proportion of cases managed consistently with the guidelines increased for all of their main components. Implementation of continuous quality improvement efforts began seven years ago and used, among other tools, statistical process control charts. During this period, the prevalence of severe postpartum haemorrhage after vaginal delivery was reduced by 50%. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
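
    Monitoring a quarterly event rate of this kind is classically done with a p-chart, whose limits are recomputed per quarter because the number of deliveries varies. A minimal sketch with invented counts:

      import math

      quarters = [(14, 1600), (9, 1550), (11, 1700), (7, 1650)]  # (cases, deliveries)
      pbar = sum(c for c, _ in quarters) / sum(n for _, n in quarters)

      for i, (cases, n) in enumerate(quarters, 1):
          p = cases / n
          ucl = pbar + 3 * math.sqrt(pbar * (1 - pbar) / n)  # 3-sigma upper limit
          flag = "out of control" if p > ucl else "in control"
          print(f"Q{i}: p={p:.4f} (UCL={ucl:.4f}) -> {flag}")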

  3. [Statistical process control applied to intensity modulated radiotherapy pretreatment controls with portal dosimetry].

    PubMed

    Villani, N; Gérard, K; Marchesi, V; Huger, S; François, P; Noël, A

    2010-06-01

    The first purpose of this study was to illustrate the contribution of statistical process control to better security in intensity modulated radiotherapy (IMRT) treatments. This improvement is possible by controlling the dose delivery process, characterized by pretreatment quality control results, so it is necessary to put portal dosimetry measurements under control (the ionisation chamber measurements were already monitored with statistical process control tools). The second objective was to determine whether portal dosimetry can be substituted for the ionisation chamber in order to optimize the time devoted to pretreatment quality control. At the Alexis-Vautrin center, pretreatment quality controls in IMRT for prostate and head and neck treatments were performed for each beam of each patient. These controls were made with an ionisation chamber, which is the reference detector for absolute dose measurement, and with portal dosimetry for the verification of the dose distribution. Statistical process control is a statistical analysis method, originating in industry, used to control and improve the quality of the studied process. It uses graphic tools such as control charts to follow up the process, warning the operator in case of failure, and quantitative tools to evaluate the ability of the process to respect guidelines: this is the capability study. The study was performed on 450 head and neck beams and on 100 prostate beams. Control charts of the mean and standard deviation were established; they revealed drifts, both slow and weak and strong and fast, and showed a special cause that had been introduced (a manual shift of the leaf gap of the multileaf collimator). The correlation between the dose measured at one point with the EPID and with the ionisation chamber was evaluated at more than 97%, and cases of disagreement between the two measurements were identified. The study demonstrated the feasibility of reducing the time devoted to pretreatment controls by substituting EPID measurements for the ionisation chamber's measurements, and showed that statistical process control monitoring of the data brings a guarantee of security. 2010 Société française de radiothérapie oncologique (SFRO). Published by Elsevier SAS. All rights reserved.
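
    The capability study mentioned above can be sketched with the usual Cp/Cpk indices; the dose-deviation values and the +/-4% tolerance below are invented for illustration, not the study's data.

      import numpy as np

      deviations = np.array([0.5, -1.2, 0.8, 1.5, -0.3, 0.9, -0.7, 1.1, 0.2, -1.0])
      lsl, usl = -4.0, 4.0                  # tolerance on dose deviation, percent
      mu, sigma = deviations.mean(), deviations.std(ddof=1)

      cp = (usl - lsl) / (6 * sigma)               # spread-only capability
      cpk = min(usl - mu, mu - lsl) / (3 * sigma)  # also accounts for centring
      print(f"Cp={cp:.2f}, Cpk={cpk:.2f} (>= 1.33 is a common target)")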

  4. Communication Tools for End-of-Life Decision-Making in Ambulatory Care Settings: A Systematic Review and Meta-Analysis.

    PubMed

    Oczkowski, Simon J; Chung, Han-Oh; Hanvey, Louise; Mbuagbaw, Lawrence; You, John J

    2016-01-01

    Patients with serious illness, and their families, state that better communication and decision-making with healthcare providers is a high priority to improve the quality of end-of-life care. Numerous communication tools to assist patients, family members, and clinicians in end-of-life decision-making have been published, but their effectiveness remains unclear. To determine, amongst adults in ambulatory care settings, the effect of structured communication tools for end-of-life decision-making on completion of advance care planning. We searched for relevant randomized controlled trials (RCTs) or non-randomized intervention studies in MEDLINE, EMBASE, CINAHL, ERIC, and the Cochrane Database of Randomized Controlled Trials from database inception until July 2014. Two reviewers independently screened articles for eligibility, extracted data, and assessed risk of bias. Grading of Recommendations Assessment, Development, and Evaluation (GRADE) was used to evaluate the quality of evidence for each of the primary and secondary outcomes. Sixty-seven studies, including 46 RCTs, were found. The majority evaluated communication tools in older patients (age >50) with no specific medical condition, but many specifically evaluated populations with cancer, lung, heart, neurologic, or renal disease. Most studies compared the use of communication tools against usual care, but several compared the tools to less-intensive advance care planning tools. The use of structured communication tools increased: the frequency of advance care planning discussions/discussions about advance directives (RR 2.31, 95% CI 1.25-4.26, p = 0.007, low quality evidence) and the completion of advance directives (ADs) (RR 1.92, 95% CI 1.43-2.59, p<0.001, low quality evidence); concordance between AD preferences and subsequent medical orders for use or non-use of life supporting treatment (RR 1.19, 95% CI 1.01-1.39, p = 0.028, very low quality evidence, 1 observational study); and concordance between the care desired and care received by patients (RR 1.17, 95% CI 1.05-1.30, p = 0.004, low quality evidence, 2 RCTs). The use of structured communication tools may increase the frequency of discussions about and completion of advance directives, and concordance between the care desired and the care received by patients. The use of structured communication tools rather than an ad-hoc approach to end-of-life decision-making should be considered, and the selection and implementation of such tools should be tailored to address local needs and context. PROSPERO CRD42014012913.

  5. Communication Tools for End-of-Life Decision-Making in Ambulatory Care Settings: A Systematic Review and Meta-Analysis

    PubMed Central

    Chung, Han-Oh; Hanvey, Louise; Mbuagbaw, Lawrence; You, John J.

    2016-01-01

    Background Patients with serious illness, and their families, state that better communication and decision-making with healthcare providers is a high priority to improve the quality of end-of-life care. Numerous communication tools to assist patients, family members, and clinicians in end-of-life decision-making have been published, but their effectiveness remains unclear. Objectives To determine, amongst adults in ambulatory care settings, the effect of structured communication tools for end-of-life decision-making on completion of advance care planning. Methods We searched for relevant randomized controlled trials (RCTs) or non-randomized intervention studies in MEDLINE, EMBASE, CINAHL, ERIC, and the Cochrane Database of Randomized Controlled Trials from database inception until July 2014. Two reviewers independently screened articles for eligibility, extracted data, and assessed risk of bias. Grading of Recommendations Assessment, Development, and Evaluation (GRADE) was used to evaluate the quality of evidence for each of the primary and secondary outcomes. Results Sixty-seven studies, including 46 RCTs, were found. The majority evaluated communication tools in older patients (age >50) with no specific medical condition, but many specifically evaluated populations with cancer, lung, heart, neurologic, or renal disease. Most studies compared the use of communication tools against usual care, but several compared the tools to less-intensive advance care planning tools. The use of structured communication tools increased: the frequency of advance care planning discussions/discussions about advance directives (RR 2.31, 95% CI 1.25–4.26, p = 0.007, low quality evidence) and the completion of advance directives (ADs) (RR 1.92, 95% CI 1.43–2.59, p<0.001, low quality evidence); concordance between AD preferences and subsequent medical orders for use or non-use of life supporting treatment (RR 1.19, 95% CI 1.01–1.39, p = 0.028, very low quality evidence, 1 observational study); and concordance between the care desired and care received by patients (RR 1.17, 95% CI 1.05–1.30, p = 0.004, low quality evidence, 2 RCTs). Conclusions The use of structured communication tools may increase the frequency of discussions about and completion of advance directives, and concordance between the care desired and the care received by patients. The use of structured communication tools rather than an ad-hoc approach to end-of-life decision-making should be considered, and the selection and implementation of such tools should be tailored to address local needs and context. Registration PROSPERO CRD42014012913 PMID:27119571

  6. Satisfaction monitoring for quality control in campground management

    Treesearch

    Wilbur F. LaPage; Malcolm I. Bevins

    1981-01-01

    A 4-year study of camper satisfaction indicates that satisfaction monitoring is a useful tool for campground managers to assess their performance and achieve a high level of quality control in their service to the public. An indication of camper satisfaction with campground management is gained from a report card on which a small sample of visitors rates 14 elements of...

  7. Variables Control Charts: A Measurement Tool to Detect Process Problems within Housing

    ERIC Educational Resources Information Center

    Luna, Andrew

    1999-01-01

    The purpose of this study was to use quality improvement tools to determine if the current process of supplying hot water to a high-rise residence hall for women at a southeastern Doctoral I granting institution was in control. After a series of focus groups among the residents in the hall, it was determined that they were mostly concerned about…

  8. QCScreen: a software tool for data quality control in LC-HRMS based metabolomics.

    PubMed

    Simader, Alexandra Maria; Kluger, Bernhard; Neumann, Nora Katharina Nicole; Bueschl, Christoph; Lemmens, Marc; Lirk, Gerald; Krska, Rudolf; Schuhmacher, Rainer

    2015-10-24

    Metabolomics experiments often comprise large numbers of biological samples resulting in huge amounts of data. These data need to be inspected for plausibility before evaluation to detect putative sources of error, e.g. retention time or mass accuracy shifts. Especially in liquid chromatography-high resolution mass spectrometry (LC-HRMS) based metabolomics research, proper quality control checks (e.g. for precision, signal drifts or offsets) are crucial prerequisites to achieve reliable and comparable results within and across experimental measurement sequences. Software tools can support this process. The software tool QCScreen was developed to offer a quick and easy data quality check of LC-HRMS derived data. It allows a flexible investigation and comparison of basic quality-related parameters within user-defined target features, and it can automatically evaluate multiple sample types within or across different measurement sequences in a short time. It offers a user-friendly interface that allows an easy selection of processing steps and parameter settings. The generated results include a coloured overview plot of data quality across all analysed samples and targets and, in addition, detailed illustrations of the stability and precision of the chromatographic separation, the mass accuracy and the detector sensitivity. The use of QCScreen is demonstrated with experimental data from metabolomics experiments using selected standard compounds in pure solvent. The application of the software identified problematic features, samples and analytical parameters and suggested which data files or compounds required closer manual inspection. QCScreen is an open source software tool which provides a useful basis for assessing the suitability of LC-HRMS data prior to time-consuming, detailed data processing and subsequent statistical analysis. It accepts the generic mzXML format and thus can be used with many different LC-HRMS platforms to process both multiple quality control sample types as well as experimental samples in one or more measurement sequences.
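
    To make the kind of check QCScreen automates concrete, the following is a minimal Python sketch of per-target QC statistics (retention-time drift, mass accuracy, intensity precision) compared against user-defined tolerances. It is not QCScreen's implementation; the column names and threshold defaults are illustrative assumptions.

    ```python
    # Sketch: flag target features whose QC-injection statistics exceed
    # user-defined tolerances. Columns and defaults are assumptions.
    import pandas as pd

    def qc_summary(df, rt_tol=0.1, ppm_tol=5.0, cv_tol=20.0):
        """df: one row per (target, injection) with columns
        'target', 'rt', 'mz_expected', 'mz_observed', 'intensity'."""
        rows = []
        for target, g in df.groupby("target"):
            rt_drift = g["rt"].max() - g["rt"].min()                  # minutes
            ppm = ((g["mz_observed"] - g["mz_expected"]).abs()
                   / g["mz_expected"] * 1e6).max()                    # ppm
            cv = 100 * g["intensity"].std() / g["intensity"].mean()   # percent
            rows.append({"target": target, "rt_drift": rt_drift,
                         "max_ppm": ppm, "intensity_cv": cv,
                         "ok": rt_drift <= rt_tol and ppm <= ppm_tol
                               and cv <= cv_tol})
        return pd.DataFrame(rows)
    ```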

  9. Scater: pre-processing, quality control, normalization and visualization of single-cell RNA-seq data in R.

    PubMed

    McCarthy, Davis J; Campbell, Kieran R; Lun, Aaron T L; Wills, Quin F

    2017-04-15

    Single-cell RNA sequencing (scRNA-seq) is increasingly used to study gene expression at the level of individual cells. However, preparing raw sequence data for further analysis is not a straightforward process. Biases, artifacts and other sources of unwanted variation are present in the data, requiring substantial time and effort to be spent on pre-processing, quality control (QC) and normalization. We have developed the R/Bioconductor package scater to facilitate rigorous pre-processing, quality control, normalization and visualization of scRNA-seq data. The package provides a convenient, flexible workflow to process raw sequencing reads into a high-quality expression dataset ready for downstream analysis. scater provides a rich suite of plotting tools for single-cell data and a flexible data structure that is compatible with existing tools and can be used as infrastructure for future software development. The open-source code, along with installation instructions, vignettes and case studies, is available through Bioconductor at http://bioconductor.org/packages/scater . davis@ebi.ac.uk. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press.
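
    scater itself is an R/Bioconductor package; to keep the code examples in this collection in a single language, here is a hedged Python sketch of the basic per-cell filters such a QC workflow applies. The fixed thresholds and the mitochondrial gene prefix are illustrative assumptions; scater derives data-driven outlier cut-offs rather than fixed ones.

    ```python
    # Sketch: keep cells with adequate library size and gene detection
    # and a plausible mitochondrial fraction. Thresholds are assumptions.
    import numpy as np

    def cell_qc(counts, gene_names, mito_prefix="MT-",
                min_counts=1000, min_genes=500, max_mito_frac=0.2):
        """counts: cells x genes matrix of raw counts."""
        counts = np.asarray(counts)
        lib_size = counts.sum(axis=1)
        n_genes = (counts > 0).sum(axis=1)
        is_mito = np.array([g.startswith(mito_prefix) for g in gene_names])
        mito_frac = counts[:, is_mito].sum(axis=1) / np.maximum(lib_size, 1)
        return ((lib_size >= min_counts) & (n_genes >= min_genes)
                & (mito_frac <= max_mito_frac))   # boolean mask of cells to keep
    ```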

  10. Effectiveness of the Assessment of Burden of COPD (ABC) tool on health-related quality of life in patients with COPD: a cluster randomised controlled trial in primary and hospital care

    PubMed Central

    Slok, Annerika H M; Kotz, Daniel; van Breukelen, Gerard; Chavannes, Niels H; Rutten-van Mölken, Maureen P M H; Kerstjens, Huib A M; van der Molen, Thys; Asijee, Guus M; Dekhuijzen, P N Richard; Holverda, Sebastiaan; Salomé, Philippe L; Goossens, Lucas M A; Twellaar, Mascha; in ‘t Veen, Johannes C C M; van Schayck, Onno C P

    2016-01-01

    Objective Assessing the effectiveness of the Assessment of Burden of COPD (ABC) tool on disease-specific quality of life in patients with chronic obstructive pulmonary disease (COPD) measured with the St. George's Respiratory Questionnaire (SGRQ), compared with usual care. Methods A pragmatic cluster randomised controlled trial, in 39 Dutch primary care practices and 17 hospitals, with 357 patients with COPD (postbronchodilator FEV1/FVC ratio <0.7) aged ≥40 years, who could understand and read the Dutch language. Healthcare providers were randomly assigned to the intervention or control group. The intervention group applied the ABC tool, which consists of a short validated questionnaire assessing the experienced burden of COPD, objective COPD parameter (eg, lung function) and a treatment algorithm including a visual display and treatment advice. The control group provided usual care. Researchers were blinded to group allocation during analyses. Primary outcome was the number of patients with a clinically relevant improvement in SGRQ score between baseline and 18-month follow-up. Secondary outcomes were the COPD Assessment Test (CAT) and the Patient Assessment of Chronic Illness Care (PACIC; a measurement of perceived quality of care). Results At 18-month follow-up, 34% of the 146 patients from 27 healthcare providers in the intervention group showed a clinically relevant improvement in the SGRQ, compared with 22% of the 148 patients from 29 healthcare providers in the control group (OR 1.85, 95% CI 1.08 to 3.16). No difference was found on the CAT (−0.26 points (scores ranging from 0 to 40); 95% CI −1.52 to 0.99). The PACIC showed a higher improvement in the intervention group (0.32 points (scores ranging from 1 to 5); 95% CI 0.14 to 0.50). Conclusions This study showed that use of the ABC tool may increase quality of life and perceived quality of care. Trial registration number NTR3788; Results. PMID:27401361

  11. Developing a tool for the preparation of GMP audit of pharmaceutical contract manufacturer.

    PubMed

    Linna, Anu; Korhonen, Mirka; Mannermaa, Jukka-Pekka; Airaksinen, Marja; Juppo, Anne Mari

    2008-06-01

    Outsourcing is rapidly growing in the pharmaceutical industry. When manufacturing activities are outsourced, control of the product's quality has to be maintained. One way to confirm a contract manufacturer's GMP (Good Manufacturing Practice) compliance is auditing. Audits can be supported, for instance, by using GMP questionnaires. The objective of this study was to develop a tool for the audit preparation of pharmaceutical contract manufacturers and to validate its contents by using the Delphi method. At this phase of the study the tool was developed for non-sterile finished product contract manufacturers. A modified Delphi method was used with an expert panel consisting of 14 experts from the pharmaceutical industry, authorities and a university. The content validity of the developed tool was assessed by a Delphi questionnaire round. The response rate in the Delphi questionnaire round was 86%. The tool consisted of 103 quality items, of which 90 (87%) achieved the pre-defined agreement rate level (75%). Thirteen quality items which did not achieve the pre-defined agreement rate were excluded from the tool. The expert panel suggested only minor changes to the tool. The results show that the content validity of the developed audit preparation tool was good.

  12. A Lean Six Sigma approach to the improvement of the selenium analysis method.

    PubMed

    Cloete, Bronwyn C; Bester, André

    2012-11-02

    Reliable results are the ultimate measure of an analytical laboratory's quality, and variability is therefore considered a critical quality problem associated with the selenium analysis method executed at the Western Cape Provincial Veterinary Laboratory (WCPVL). The elimination and control of variability is undoubtedly of significant importance because of the narrow margin of safety between toxic and deficient doses of the trace element for good animal health. A quality methodology known as Lean Six Sigma was believed to present the most feasible solution for overcoming the adverse effect of variation, through steps towards analytical process improvement. Lean Six Sigma is a form of scientific method: empirical, inductive, deductive and systematic, relying on data and facts. The Lean Six Sigma methodology comprises five macro-phases, namely Define, Measure, Analyse, Improve and Control (DMAIC). Both qualitative and quantitative laboratory data were collected in terms of these phases. Qualitative data were collected by using quality tools, namely an Ishikawa diagram, a Pareto chart, Kaizen analysis and a Failure Mode Effect Analysis tool. Quantitative laboratory data, based on the analytical chemistry test method, were collected through a controlled experiment. The controlled experiment entailed 13 replicated runs of the selenium test method, whereby 11 samples were repetitively analysed, whilst Certified Reference Material (CRM) was also included in 6 of the runs. Laboratory results obtained from the controlled experiment were analysed by using statistical methods commonly associated with quality validation of chemistry procedures. Analysis of both sets of data yielded an improved selenium analysis method, believed to provide greater reliability of results, in addition to a greatly reduced cycle time and superior control features. Lean Six Sigma may therefore be regarded as a valuable tool in any laboratory, and represents both a management discipline and a standardised approach to problem solving and process optimisation.
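
    Among the quality tools named above, the Pareto chart has a particularly simple core: rank failure causes by frequency and report each cause's cumulative share, so that improvement effort can target the vital few. A minimal sketch, with invented cause labels:

    ```python
    # Sketch of a Pareto analysis: sorted cause frequencies with
    # cumulative percentages, the numbers behind a Pareto chart.
    from collections import Counter

    def pareto_table(causes):
        counts = Counter(causes).most_common()
        total = sum(c for _, c in counts)
        cum = 0
        for cause, c in counts:
            cum += c
            print(f"{cause:<15} {c:>3}  {100 * cum / total:5.1f}% cumulative")

    pareto_table(["pipetting", "calibration", "pipetting", "reagent lot",
                  "pipetting", "calibration", "transcription"])
    ```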

  13. Quality indexing with computer-aided lexicography

    NASA Technical Reports Server (NTRS)

    Buchan, Ronald L.

    1992-01-01

    Indexing with computers is a far cry from indexing with the first indexing tool, the manual card sorter. With the aid of computer-aided lexicography, both indexing and indexing tools can provide standardization, consistency, and accuracy, resulting in greater quality control than ever before. A brief survey of computer activity in indexing is presented with detailed illustrations from NASA activity. Applications from techniques mentioned, such as Retrospective Indexing (RI), can be made to many indexing systems. In addition to improving the quality of indexing with computers, the improved efficiency with which certain tasks can be done is demonstrated.

  14. Field correlation of PQI gauge with nuclear density gauge: phase 1.

    DOT National Transportation Integrated Search

    2006-12-01

    Traditionally, the Oklahoma Department of Transportation (ODOT) uses a nuclear density gauge as a quality control (QC) and quality assurance (QA) tool for in-place density. The nuclear-based devices, however, tend to have problems associated with lic...

  15. Six Sigma Quality Management System and Design of Risk-based Statistical Quality Control.

    PubMed

    Westgard, James O; Westgard, Sten A

    2017-03-01

    Six sigma concepts provide a quality management system (QMS) with many useful tools for managing quality in medical laboratories. This Six Sigma QMS is driven by the quality required for the intended use of a test. The most useful form for this quality requirement is the allowable total error. Calculation of a sigma-metric provides the best predictor of risk for an analytical examination process, as well as a design parameter for selecting the statistical quality control (SQC) procedure necessary to detect medically important errors. Simple point estimates of sigma at medical decision concentrations are sufficient for laboratory applications. Copyright © 2016 Elsevier Inc. All rights reserved.
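
    The sigma-metric mentioned here is conventionally computed from the allowable total error (TEa), the observed bias and the imprecision (CV) at a medical decision concentration, all expressed in percent. A minimal sketch with illustrative numbers, not values from the article:

    ```python
    def sigma_metric(tea_pct, bias_pct, cv_pct):
        """Sigma-metric at a medical decision concentration:
        sigma = (allowable total error - |bias|) / CV, all in percent."""
        return (tea_pct - abs(bias_pct)) / cv_pct

    # Example: TEa 10%, bias 1.5%, CV 1.7%  ->  sigma = 5.0
    print(sigma_metric(10.0, 1.5, 1.7))
    ```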

  16. Novel gene expression tools for rice biotechnology

    USDA-ARS?s Scientific Manuscript database

    Biotechnology is an effective and important method of improving both quality and agronomic traits in rice. We are developing novel molecular tools for genetic engineering, with a focus on developing novel transgene expression control elements (i.e. promoters) for rice. A suite of monocot grass promo...

  17. Application of reiteration of Hankel singular value decomposition in quality control

    NASA Astrophysics Data System (ADS)

    Staniszewski, Michał; Skorupa, Agnieszka; Boguszewicz, Łukasz; Michalczuk, Agnieszka; Wereszczyński, Kamil; Wicher, Magdalena; Konopka, Marek; Sokół, Maria; Polański, Andrzej

    2017-07-01

    Medical centres are obliged to store past medical records, including the results of quality assurance (QA) tests of the medical equipment, which is especially useful in checking the reproducibility of medical devices and procedures. Analysis of multivariate time series is an important part of quality control of NMR data. In this work we propose an anomaly detection tool based on the Reiteration of Hankel Singular Value Decomposition method. The presented method was compared with external software, and the authors obtained comparable results.
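
    The core idea of a Hankel-SVD signature can be sketched briefly: embed each QA time series in a Hankel (trajectory) matrix and compare singular-value spectra between a reference series and a new one. The numpy sketch below omits the paper's reiteration scheme; the window size and distance measure are illustrative assumptions.

    ```python
    import numpy as np

    def hankel_singular_values(x, window):
        """Embed a 1-D series in a Hankel (trajectory) matrix and return
        its singular values as a compact signature of the series."""
        x = np.asarray(x, dtype=float)
        n = len(x) - window + 1
        H = np.stack([x[i:i + window] for i in range(n)])  # n x window
        return np.linalg.svd(H, compute_uv=False)

    def anomaly_score(x_ref, x_new, window=20):
        """Relative distance between singular-value spectra of a
        reference QA series and a new one; large values flag anomalies."""
        s_ref = hankel_singular_values(x_ref, window)
        s_new = hankel_singular_values(x_new, window)
        m = min(len(s_ref), len(s_new))
        return np.linalg.norm(s_ref[:m] - s_new[:m]) / np.linalg.norm(s_ref[:m])
    ```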

  18. Quality of the parent-child interaction in young children with type 1 diabetes mellitus: study protocol.

    PubMed

    Nieuwesteeg, Anke M; Pouwer, Frans; van Bakel, Hedwig Ja; Emons, Wilco Hm; Aanstoot, Henk-Jan; Odink, Roelof; Hartman, Esther E

    2011-04-14

    In young children with type 1 diabetes mellitus (T1DM), parents have full responsibility for the diabetes-management of their child (e.g. blood glucose monitoring and administering insulin). Behavioral tasks in childhood, such as developing autonomy, and oppositional behavior (e.g. refusing food) may interfere with the diabetes-management needed to achieve optimal blood glucose control. Furthermore, higher blood glucose levels are related to more behavioral problems. Parents might therefore need to negotiate with their child on the diabetes-management to avoid this direct negative effect. This interference, the negotiations, and the parent's responsibility for diabetes may negatively affect the quality of parent-child interaction. Nevertheless, there is little knowledge about the quality of interaction between parents and young children with T1DM, and the possible impact this may have on glycemic control and psychosocial functioning of the child. While widely used global parent-child interaction observational methods are available, there is a need for an observational tool specifically tailored to the interaction patterns of parents and children with T1DM. The main aim of this study is to construct a disease-specific observational method to assess diabetes-specific parent-child interaction. An additional aim is to explore whether the quality of parent-child interactions is associated with glycemic control and psychosocial functioning (resilience, behavioral problems, and quality of life). First, we will examine which situations are most suitable for observing diabetes-specific interactions. Then, these situations will be video-taped in a pilot study (N = 15). Observed behaviors are described in rating scales, with each scale describing characteristics of parent-child interactional behaviors. Next, we apply the observational tool on a larger scale for further evaluation of the instrument (N = 120). The parents are asked twice (with two years in between) to fill out questionnaires about the psychosocial functioning of their child with T1DM. Furthermore, glycemic control (HbA1c) will be obtained from their medical records. A disease-specific observational tool will enable the detailed assessment of the quality of diabetes-specific parent-child interactions. The availability of such a tool will facilitate future (intervention) studies that will yield more knowledge about the impact of parent-child interactions on psychosocial functioning and glycemic control of children with T1DM.

  19. Quality of the parent-child interaction in young children with type 1 diabetes mellitus: study protocol

    PubMed Central

    2011-01-01

    Background In young children with type 1 diabetes mellitus (T1DM), parents have full responsibility for the diabetes-management of their child (e.g. blood glucose monitoring and administering insulin). Behavioral tasks in childhood, such as developing autonomy, and oppositional behavior (e.g. refusing food) may interfere with the diabetes-management needed to achieve optimal blood glucose control. Furthermore, higher blood glucose levels are related to more behavioral problems. Parents might therefore need to negotiate with their child on the diabetes-management to avoid this direct negative effect. This interference, the negotiations, and the parent's responsibility for diabetes may negatively affect the quality of parent-child interaction. Nevertheless, there is little knowledge about the quality of interaction between parents and young children with T1DM, and the possible impact this may have on glycemic control and psychosocial functioning of the child. While widely used global parent-child interaction observational methods are available, there is a need for an observational tool specifically tailored to the interaction patterns of parents and children with T1DM. The main aim of this study is to construct a disease-specific observational method to assess diabetes-specific parent-child interaction. An additional aim is to explore whether the quality of parent-child interactions is associated with glycemic control and psychosocial functioning (resilience, behavioral problems, and quality of life). Methods/Design First, we will examine which situations are most suitable for observing diabetes-specific interactions. Then, these situations will be video-taped in a pilot study (N = 15). Observed behaviors are described in rating scales, with each scale describing characteristics of parent-child interactional behaviors. Next, we apply the observational tool on a larger scale for further evaluation of the instrument (N = 120). The parents are asked twice (with two years in between) to fill out questionnaires about the psychosocial functioning of their child with T1DM. Furthermore, glycemic control (HbA1c) will be obtained from their medical records. Discussion A disease-specific observational tool will enable the detailed assessment of the quality of diabetes-specific parent-child interactions. The availability of such a tool will facilitate future (intervention) studies that will yield more knowledge about the impact of parent-child interactions on psychosocial functioning and glycemic control of children with T1DM. PMID:21492413

  20. Quality controls for gamma cameras and PET cameras: development of a free open-source ImageJ program

    NASA Astrophysics Data System (ADS)

    Carlier, Thomas; Ferrer, Ludovic; Berruchon, Jean B.; Cuissard, Regis; Martineau, Adeline; Loonis, Pierre; Couturier, Olivier

    2005-04-01

    Acquisition data and treatments for quality controls of gamma cameras and Positron Emission Tomography (PET) cameras are commonly performed with dedicated program packages, which run only on the manufacturers' computers and differ from each other, depending on the camera company and program version. The aim of this work was to develop a free open-source program (written in the JAVA language) to analyze data for quality control of gamma cameras and PET cameras. The program is based on the free application software ImageJ and can be easily loaded on any computer operating system (OS) and thus on any type of computer in every nuclear medicine department. Based on standard quality control parameters, this program includes 1) for gamma cameras: a rotation center control (extracted from the American Association of Physicists in Medicine, AAPM, norms) and two uniformity controls (extracted from the Institute of Physics and Engineering in Medicine, IPEM, and National Electrical Manufacturers Association, NEMA, norms); 2) for PET systems: three quality controls recently defined by the French Medical Physicist Society (SFPM), i.e. spatial resolution and uniformity in a reconstructed slice, and scatter fraction. The determination of spatial resolution (via a Point Spread Function, PSF, acquisition) allows computation of the Modulation Transfer Function (MTF) for both camera modalities. All the control functions are included in a toolbox distributed as a free ImageJ plugin that will soon be downloadable from the Internet. In addition, this program offers the possibility to save the uniformity quality control results in HTML format, and a warning can be set to automatically inform users of abnormal results. The architecture of the program allows users to easily add any other specific quality control routine. Finally, this toolkit is an easy and robust tool to perform quality control on gamma cameras and PET cameras based on standard computation parameters; it is free, runs on any type of computer, and will soon be downloadable from the net (http://rsb.info.nih.gov/ij/plugins or http://nucleartoolkit.free.fr).
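
    Of the listed checks, flood-field uniformity has a standard closed-form core. Below is a minimal sketch of a NEMA-style integral uniformity calculation; note that the norms additionally prescribe pixel rebinning, smoothing and specific useful/central fields of view, all omitted here.

    ```python
    import numpy as np

    def integral_uniformity(flood):
        """NEMA-style integral uniformity of a flood-field image:
        IU = 100 * (max - min) / (max + min), computed here over the
        whole array for simplicity."""
        flood = np.asarray(flood, dtype=float)
        return 100.0 * (flood.max() - flood.min()) / (flood.max() + flood.min())
    ```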

  1. Utilization of Light Detection and Ranging for Quality Control and Quality Assurance of Pavement Grades

    DOT National Transportation Integrated Search

    2018-02-01

    Light Detection and Ranging (Lidar) technology is a useful tool that can assist transportation agencies during the design, construction, and maintenance phases of transportation projects. To demonstrate the utility of Lidar, this report discusses how...

  2. Using the scanning electron microscope on the production line to assure quality semiconductors

    NASA Technical Reports Server (NTRS)

    Adolphsen, J. W.; Anstead, R. J.

    1972-01-01

    The use of the scanning electron microscope to detect metallization defects introduced during batch processing of semiconductor devices is discussed. A method of determining metallization integrity was developed which culminates in a procurement specification using the scanning microscope on the production line as a quality control tool. Batch process control of the metallization operation is monitored early in the manufacturing cycle.

  3. Evaluating online diagnostic decision support tools for the clinical setting.

    PubMed

    Pryor, Marie; White, David; Potter, Bronwyn; Traill, Roger

    2012-01-01

    Clinical decision support tools available at the point of care are an effective adjunct to support clinicians to make clinical decisions and improve patient outcomes. We developed a methodology and applied it to evaluate commercially available online clinical diagnostic decision support (DDS) tools for use at the point of care. We identified 11 commercially available DDS tools and assessed these against an evaluation instrument that included six categories: general information, content, quality control, search, clinical results and other features. We developed diagnostically challenging clinical case scenarios based on real patient experience that were commonly missed by junior medical staff. The evaluation was divided into 2 phases; an initial evaluation of all identified and accessible DDS tools conducted by the Clinical Information Access Portal (CIAP) team and a second phase that further assessed the top 3 tools identified in the initial evaluation phase. An evaluation panel consisting of senior and junior medical clinicians from NSW Health conducted the second phase. Of the eleven tools that were assessed against the evaluation instrument only 4 tools completely met the DDS definition that was adopted for this evaluation and were able to produce a differential diagnosis. From the initial phase of the evaluation 4 DDS tools scored 70% or more (maximum score 96%) for the content category, 8 tools scored 65% or more (maximum 100%) for the quality control category, 5 tools scored 65% or more (maximum 94%) for the search category, and 4 tools scored 70% or more (maximum 81%) for the clinical results category. The second phase of the evaluation was focused on assessing diagnostic accuracy for the top 3 tools identified in the initial phase. Best Practice ranked highest overall against the 6 clinical case scenarios used. Overall the differentiating factor between the top 3 DDS tools was determined by diagnostic accuracy ranking, ease of use and the confidence and credibility of the clinical information. The evaluation methodology used here to assess the quality and comprehensiveness of clinical DDS tools was effective in identifying the most appropriate tool for the clinical setting. The use of clinical case scenarios is fundamental in determining the diagnostic accuracy and usability of the tools.

  4. Tools for quality control of fingerprint databases

    NASA Astrophysics Data System (ADS)

    Swann, B. Scott; Libert, John M.; Lepley, Margaret A.

    2010-04-01

    Integrity of fingerprint data is essential to biometric and forensic applications. Accordingly, the FBI's Criminal Justice Information Services (CJIS) Division has sponsored development of software tools to facilitate quality control functions relative to maintaining its fingerprint data assets inherent to the Integrated Automated Fingerprint Identification System (IAFIS) and Next Generation Identification (NGI). This paper provides an introduction to two such tools. The first FBI-sponsored tool was developed by the National Institute of Standards and Technology (NIST) and examines and detects the spectral signature of the ridge-flow structure characteristic of friction ridge skin. The Spectral Image Validation/Verification (SIVV) utility differentiates fingerprints from non-fingerprints, including blank frames or segmentation failures erroneously included in data; provides a "first look" at image quality; and can identify anomalies in sample rates of scanned images. The SIVV utility might detect errors in individual 10-print fingerprints inaccurately segmented from the flat, multi-finger image acquired by one of the automated collection systems increasing in availability and usage. In such cases, the lost fingerprint can be recovered by re-segmentation from the now compressed multi-finger image record. The second FBI-sponsored tool, CropCoeff, was developed by MITRE and thoroughly tested by NIST. CropCoeff enables cropping of the replacement single print directly from the compressed data file, thus avoiding decompression and recompression of images that might degrade fingerprint features necessary for matching.
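
    The SIVV principle, detecting the spectral signature of ridge flow, can be caricatured as measuring how much image power falls in a mid-frequency band where friction-ridge periodicity appears at typical scan resolutions. The sketch below is only a crude analogue of that idea; the band edges are illustrative assumptions, not SIVV's calibrated values.

    ```python
    import numpy as np

    def ridge_band_energy(img, low=0.04, high=0.20):
        """Fraction of (non-DC) spectral power in a mid-frequency band
        (in cycles/pixel); fingerprints concentrate power where the
        ridge spacing sits, blank or non-fingerprint frames do not."""
        img = np.asarray(img, dtype=float)
        F = np.abs(np.fft.fftshift(np.fft.fft2(img - img.mean()))) ** 2
        ny, nx = img.shape
        fy = np.fft.fftshift(np.fft.fftfreq(ny))[:, None]
        fx = np.fft.fftshift(np.fft.fftfreq(nx))[None, :]
        r = np.hypot(fy, fx)                      # radial frequency
        band = (r >= low) & (r <= high)
        return F[band].sum() / F.sum()
    ```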

  5. Use of the MATRIXx Integrated Toolkit on the Microwave Anisotropy Probe Attitude Control System

    NASA Technical Reports Server (NTRS)

    Ward, David K.; Andrews, Stephen F.; McComas, David C.; ODonnell, James R., Jr.

    1999-01-01

    Recent advances in analytical software tools allow the analysis, simulation, flight code, and documentation of an algorithm to be generated from a single source, all within one integrated analytical design package. NASA's Microwave Anisotropy Probe project has used one such package, Integrated Systems' MATRIXx suite, in the design of the spacecraft's Attitude Control System. The project's experience with the linear analysis, simulation, code generation, and documentation tools will be presented and compared with more traditional development tools. In particular, the quality of the flight software generated will be examined in detail. Finally, lessons learned on each of the tools will be shared.

  6. SU-E-T-103: Development and Implementation of Web Based Quality Control Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Studinski, R; Taylor, R; Angers, C

    Purpose: Historically many radiation medicine programs have maintained their Quality Control (QC) test results in paper records or Microsoft Excel worksheets. Both these approaches represent significant logistical challenges, and are not predisposed to data review and approval. It has been our group's aim to develop and implement web based software designed not just to record and store QC data in a centralized database, but to provide scheduling and data review tools to help manage a radiation therapy clinic's equipment quality control program. Methods: The software was written in the Python programming language using the Django web framework. In order to promote collaboration and validation from other centres the code was made open source and is freely available to the public via an online source code repository. The code was written to provide a common user interface for data entry, formalize the review and approval process, and offer automated data trending and process control analysis of test results. Results: As of February 2014, our installation of QATrack+ has 180 tests defined in its database and has collected ∼22 000 test results, all of which have been reviewed and approved by a physicist via QATrack+'s review tools. These results include records for quality control of Elekta accelerators, CT simulators, our brachytherapy programme, TomoTherapy and CyberKnife units. Currently at least 5 other centres are known to be running QATrack+ clinically, forming the start of an international user community. Conclusion: QATrack+ has proven to be an effective tool for collecting radiation therapy QC data, allowing for rapid review and trending of data for a wide variety of treatment units. As free and open source software, all source code, documentation and a bug tracker are available to the public at https://bitbucket.org/tohccmedphys/qatrackplus/.

  7. Alining Large Cylinders for Welding

    NASA Technical Reports Server (NTRS)

    Ehl, J. H.

    1985-01-01

    Special tooling alines and holds internally-stiffened large-diameter cylindrical parts for welding. Alinement brackets attached to strengthening fins on insides of cylindrical tank sections. Jackscrews on brackets raised or lowered to eliminate mismatches between adjacent sections. Tooling substantially reduces costs while allowing more precise control and improved quality.

  8. A Statistical Project Control Tool for Engineering Managers

    NASA Technical Reports Server (NTRS)

    Bauch, Garland T.

    2001-01-01

    This slide presentation reviews the use of a Statistical Project Control Tool (SPCT) for managing engineering projects. A literature review pointed to a definition of project success (i.e., a project is successful when the cost, schedule, technical performance, and quality satisfy the customer), as well as to project success factors, traditional project control tools, and performance measures, all detailed in the report. The essential problem is that, with resources becoming more limited and the number of projects increasing, project failure is increasing; existing methods are limited, and systematic methods are required. The objective of the work is to provide a new statistical project control tool for project managers. Graphs plotting the results of 3 successful and 3 failed projects using the SPCT method are reviewed, with success and failure being defined by the owner.

  9. Mycobacterial biomaterials and resources for researchers.

    PubMed

    Hazbón, Manzour Hernando; Rigouts, Leen; Schito, Marco; Ezewudo, Matthew; Kudo, Takuji; Itoh, Takashi; Ohkuma, Moriya; Kiss, Katalin; Wu, Linhuan; Ma, Juncai; Hamada, Moriyuki; Strong, Michael; Salfinger, Max; Daley, Charles L; Nick, Jerry A; Lee, Jung-Sook; Rastogi, Nalin; Couvin, David; Hurtado-Ortiz, Raquel; Bizet, Chantal; Suresh, Anita; Rodwell, Timothy; Albertini, Audrey; Lacourciere, Karen A; Deheer-Graham, Ana; Alexander, Sarah; Russell, Julie E; Bradford, Rebecca; Riojas, Marco A

    2018-06-01

    There are many resources available to mycobacterial researchers, including culture collections around the world that distribute biomaterials to the general scientific community, genomic and clinical databases, and powerful bioinformatics tools. However, many of these resources may be unknown to the research community. This review article aims to summarize and publicize many of these resources, thus strengthening the quality and reproducibility of mycobacterial research by providing the scientific community access to authenticated and quality-controlled biomaterials and a wealth of information, analytical tools and research opportunities.

  10. Expediting the Quest for Quality: The Role of IQAC in Academic Audit

    ERIC Educational Resources Information Center

    Nitonde, Rohidas

    2016-01-01

    Academic Audit is an important tool to control and maintain standards in the academic sector. It has been found highly relevant by experts across the world. Academic audit helps institutions to introspect and improve their quality. The present paper intends to probe into the possible role of the Internal Quality Assurance Cell (IQAC) in Academic Audit…

  11. Data quality control and tools in passive seismic experiments exemplified on the Czech broadband seismic pool MOBNET in the AlpArray collaborative project

    NASA Astrophysics Data System (ADS)

    Vecsey, Luděk; Plomerová, Jaroslava; Jedlička, Petr; Munzarová, Helena; Babuška, Vladislav; AlpArray Working Group

    2017-12-01

    This paper focuses on major issues related to the data reliability and network performance of 20 broadband (BB) stations of the Czech (CZ) MOBNET (MOBile NETwork) seismic pool within the AlpArray seismic experiments. Currently used high-resolution seismological applications require high-quality data recorded for a sufficiently long time interval at seismological observatories and during the entire time of operation of the temporary stations. In this paper we present new hardware and software tools we have been developing during the last two decades while analysing data from several international passive experiments. The new tools help to assure the high-quality standard of broadband seismic data and eliminate potential errors before supplying data to seismological centres. Special attention is paid to crucial issues like the detection of sensor misorientation, timing problems, interchange of record components and/or their polarity reversal, sensor mass centring, or anomalous channel amplitudes due to, for example, imperfect gain. Thorough data quality control should represent an integral constituent of seismic data recording, preprocessing, and archiving, especially for data from temporary stations in passive seismic experiments. Large international seismic experiments require enormous efforts from scientists from different countries and institutions to gather hundreds of stations to be deployed in the field during a limited time period. In this paper, we demonstrate the beneficial effects of the procedures we have developed for acquiring a reliable large set of high-quality data from each group participating in field experiments. The presented tools can be applied manually or automatically on data from any seismic network.

  12. Effects of interactive patient smartphone support app on drug adherence and lifestyle changes in myocardial infarction patients: A randomized study.

    PubMed

    Johnston, Nina; Bodegard, Johan; Jerström, Susanna; Åkesson, Johanna; Brorsson, Hilja; Alfredsson, Joakim; Albertsson, Per A; Karlsson, Jan-Erik; Varenhorst, Christoph

    2016-08-01

    Patients with myocardial infarction (MI) seldom reach recommended targets for secondary prevention. This study evaluated a smartphone application ("app") aimed at improving treatment adherence and cardiovascular lifestyle in MI patients. Multicenter, randomized trial. A total of 174 ticagrelor-treated MI patients were randomized to either an interactive patient support tool (active group) or a simplified tool (control group) in addition to usual post-MI care. Primary end point was a composite nonadherence score measuring patient-registered ticagrelor adherence, defined as a combination of adherence failure events (2 missed doses registered in 7-day cycles) and treatment gaps (4 consecutive missed doses). Secondary end points included change in cardiovascular risk factors, quality of life (European Quality of Life-5 Dimensions), and patient device satisfaction (System Usability Scale). Patient mean age was 58 years, 81% were men, and 21% were current smokers. At 6 months, greater patient-registered drug adherence was achieved in the active vs the control group (nonadherence score: 16.6 vs 22.8 [P = .025]). Numerically, the active group was associated with a higher degree of smoking cessation, increased physical activity, and change in quality of life; however, this did not reach statistical significance. Patient satisfaction was significantly higher in the active vs the control group (system usability score: 87.3 vs 78.1 [P = .001]). In MI patients, use of an interactive patient support tool improved patient self-reported drug adherence and may be associated with a trend toward improved cardiovascular lifestyle changes and quality of life. Use of a disease-specific interactive patient support tool may be an appreciated, simple, and promising complement to standard secondary prevention. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  13. Information systems as a quality management tool in clinical laboratories

    NASA Astrophysics Data System (ADS)

    Schmitz, Vanessa; Rosecler Bez el Boukhari, Marta

    2007-11-01

    This article describes information systems as a quality management tool in clinical laboratories. The quality of laboratory analyses is of fundamental importance for health professionals in aiding appropriate diagnosis and treatment. Information systems allow the automation of internal quality management processes, using standard sample tests, Levey-Jennings charts and Westgard multirule analysis. This simplifies evaluation and interpretation of quality tests and reduces the possibility of human error. This study proposes the development of an information system with appropriate functions and costs for the automation of internal quality control in small and medium-sized clinical laboratories. To this end, it evaluates the functions and usability of two commercial software products designed for this purpose, identifying the positive features of each, so that these can be taken into account during the development of the proposed system.
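
    As a concrete illustration of what such a system automates, here is a minimal sketch of two common Westgard rules (1_3s and 2_2s) applied to a control series with a known target mean and standard deviation; production implementations cover the full multirule set.

    ```python
    def westgard_flags(values, mean, sd):
        """Flag 1_3s violations (one point beyond ±3 SD) and 2_2s
        violations (two consecutive points beyond the same ±2 SD limit)."""
        z = [(v - mean) / sd for v in values]
        flags = []
        for i, zi in enumerate(z):
            if abs(zi) > 3:
                flags.append((i, "1_3s"))
            if i > 0 and ((z[i - 1] > 2 and zi > 2)
                          or (z[i - 1] < -2 and zi < -2)):
                flags.append((i, "2_2s"))
        return flags

    # Example: the last two points both exceed +2 SD -> a 2_2s flag.
    print(westgard_flags([100, 101, 99, 105, 105.5], mean=100, sd=2))
    ```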

  14. RNA-SeQC: RNA-seq metrics for quality control and process optimization.

    PubMed

    DeLuca, David S; Levin, Joshua Z; Sivachenko, Andrey; Fennell, Timothy; Nazaire, Marc-Danie; Williams, Chris; Reich, Michael; Winckler, Wendy; Getz, Gad

    2012-06-01

    RNA-seq, the application of next-generation sequencing to RNA, provides transcriptome-wide characterization of cellular activity. Assessment of sequencing performance and library quality is critical to the interpretation of RNA-seq data, yet few tools exist to address this issue. We introduce RNA-SeQC, a program which provides key measures of data quality. These metrics include yield, alignment and duplication rates; GC bias, rRNA content, regions of alignment (exon, intron and intragenic), continuity of coverage, 3'/5' bias and count of detectable transcripts, among others. The software provides multi-sample evaluation of library construction protocols, input materials and other experimental parameters. The modularity of the software enables pipeline integration and the routine monitoring of key measures of data quality such as the number of alignable reads, duplication rates and rRNA contamination. RNA-SeQC allows investigators to make informed decisions about sample inclusion in downstream analysis. In summary, RNA-SeQC provides quality control measures critical to experiment design, process optimization and downstream computational analysis. See www.genepattern.org to run online, or www.broadinstitute.org/rna-seqc/ for a command line tool.
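
    Several of the rates RNA-SeQC reports are simple ratios of read counts that would normally be extracted from an aligned BAM file. A minimal sketch with illustrative counts; the function and its inputs are assumptions for illustration, not RNA-SeQC's API:

    ```python
    def read_metrics(total_reads, aligned_reads, duplicate_reads, rrna_reads):
        """A few of the summary rates an RNA-seq QC report typically
        contains, computed from raw read counts."""
        return {
            "alignment_rate": aligned_reads / total_reads,
            "duplication_rate": duplicate_reads / aligned_reads,
            "rRNA_rate": rrna_reads / aligned_reads,
        }

    print(read_metrics(50_000_000, 46_000_000, 9_200_000, 1_380_000))
    ```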

  15. Tools for Measuring and Improving Performance.

    ERIC Educational Resources Information Center

    Jurow, Susan

    1993-01-01

    Explains the need for meaningful performance measures in libraries and the Total Quality Management (TQM) approach to data collection. Five tools representing different stages of a TQM inquiry are covered (i.e., the Shewhart Cycle, flowcharts, cause-and-effect diagrams, Pareto charts, and control charts), and benchmarking is addressed. (Contains…

  16. A comprehensive quality control workflow for paired tumor-normal NGS experiments.

    PubMed

    Schroeder, Christopher M; Hilke, Franz J; Löffler, Markus W; Bitzer, Michael; Lenz, Florian; Sturm, Marc

    2017-06-01

    Quality control (QC) is an important part of all NGS data analysis stages. Many available tools calculate QC metrics from different analysis steps of single-sample experiments (raw reads, mapped reads and variant lists). Multi-sample experiments, such as sequencing of tumor-normal pairs, require additional QC metrics to ensure the validity of results. These multi-sample QC metrics still lack standardization. We therefore suggest a new workflow for QC of DNA sequencing of tumor-normal pairs. With this workflow, well-known single-sample QC metrics and additional metrics specific to tumor-normal pairs can be calculated. The segmentation into different tools offers high flexibility and allows reuse for other purposes. All tools produce qcML, a generic XML format for QC of -omics experiments. qcML uses quality metrics defined in an ontology, which was adapted for NGS. All QC tools are implemented in C++ and run both under Linux and Windows. Plotting requires python 2.7 and matplotlib. The software is available under the 'GNU General Public License version 2' as part of the ngs-bits project: https://github.com/imgag/ngs-bits. christopher.schroeder@med.uni-tuebingen.de. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
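
    One example of a tumor-normal-specific metric is genotype concordance between the pair, used to detect sample swaps. The sketch below is a minimal illustration of the idea, not the ngs-bits implementation; the input structures are assumptions.

    ```python
    def genotype_concordance(tumor_gt, normal_gt):
        """Fraction of shared loci with identical genotype calls between
        a tumor and its putative matched normal; an unexpectedly low
        value suggests a sample swap. Inputs: dicts of locus -> genotype."""
        shared = set(tumor_gt) & set(normal_gt)
        if not shared:
            return float("nan")
        same = sum(tumor_gt[k] == normal_gt[k] for k in shared)
        return same / len(shared)
    ```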

  17. User-friendly solutions for microarray quality control and pre-processing on ArrayAnalysis.org

    PubMed Central

    Eijssen, Lars M. T.; Jaillard, Magali; Adriaens, Michiel E.; Gaj, Stan; de Groot, Philip J.; Müller, Michael; Evelo, Chris T.

    2013-01-01

    Quality control (QC) is crucial for any scientific method producing data. Applying adequate QC introduces new challenges in the genomics field where large amounts of data are produced with complex technologies. For DNA microarrays, specific algorithms for QC and pre-processing including normalization have been developed by the scientific community, especially for expression chips of the Affymetrix platform. Many of these have been implemented in the statistical scripting language R and are available from the Bioconductor repository. However, application is hampered by lack of integrative tools that can be used by users of any experience level. To fill this gap, we developed a freely available tool for QC and pre-processing of Affymetrix gene expression results, extending, integrating and harmonizing functionality of Bioconductor packages. The tool can be easily accessed through a wizard-like web portal at http://www.arrayanalysis.org or downloaded for local use in R. The portal provides extensive documentation, including user guides, interpretation help with real output illustrations and detailed technical documentation. It assists newcomers to the field in performing state-of-the-art QC and pre-processing while offering data analysts an integral open-source package. Providing the scientific community with this easily accessible tool will allow improving data quality and reuse and adoption of standards. PMID:23620278

  18. Application of machine learning and expert systems to Statistical Process Control (SPC) chart interpretation

    NASA Technical Reports Server (NTRS)

    Shewhart, Mark

    1991-01-01

    Statistical Process Control (SPC) charts are one of several tools used in quality control. Other tools include flow charts, histograms, cause and effect diagrams, check sheets, Pareto diagrams, graphs, and scatter diagrams. A control chart is simply a graph which indicates process variation over time. The purpose of drawing a control chart is to detect any changes in the process signalled by abnormal points or patterns on the graph. The Artificial Intelligence Support Center (AISC) of the Acquisition Logistics Division has developed a hybrid machine learning expert system prototype which automates the process of constructing and interpreting control charts.
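
    Two of the classic signals such an interpreter must recognize can be stated compactly: a point beyond three sigma, and a run of eight consecutive points on one side of the center line. A minimal detection sketch follows; the rule thresholds are common SPC practice, and the AISC system's actual rule set is not described in the abstract.

    ```python
    def out_of_control_signals(points, center, sd):
        """Detect points beyond 3 sigma and runs of 8 consecutive points
        on one side of the center line (flagged from the eighth point on)."""
        signals = []
        side_run, last_side = 0, 0
        for i, p in enumerate(points):
            if abs(p - center) > 3 * sd:
                signals.append((i, "beyond_3sigma"))
            side = 1 if p > center else -1 if p < center else 0
            side_run = side_run + 1 if side == last_side and side != 0 else 1
            last_side = side
            if side_run >= 8:
                signals.append((i, "run_of_8"))
        return signals
    ```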

  19. Benefits of a comprehensive quality program for cryopreserved PBMC covering 28 clinical trials sites utilizing an integrated, analytical web-based portal

    PubMed Central

    Ducar, Constance; Smith, Donna; Pinzon, Cris; Stirewalt, Michael; Cooper, Cristine; McElrath, M. Juliana; Hural, John

    2014-01-01

    The HIV Vaccine Trials Network (HVTN) is a global network of 28 clinical trial sites dedicated to identifying an effective HIV vaccine. Cryopreservation of high-quality peripheral blood mononuclear cells (PBMC) is critical for the assessment of vaccine-induced cellular immune functions. The HVTN PBMC Quality Management Program is designed to ensure viable PBMC are processed, stored and shipped for clinical trial assays from all HVTN clinical trial sites. The program has evolved by developing and incorporating best practices for laboratory and specimen quality and implementing automated, web-based tools. These tools allow the site-affiliated processing laboratories and the central Laboratory Operations Unit to rapidly collect, analyze and report PBMC quality data. The HVTN PBMC Quality Management Program includes five key components: 1) Laboratory Assessment, 2) PBMC Training and Certification, 3) Internal Quality Control, 4) External Quality Control (EQC), and 5) Assay Specimen Quality Control. Fresh PBMC processing data is uploaded from each clinical site processing laboratory to a central HVTN Statistical and Data Management Center database for access and analysis on a web portal. Samples are thawed at a central laboratory for assay or specimen quality control and sample quality data is uploaded directly to the database by the central laboratory. Four-year cumulative data covering 23,477 blood draws reveals an average fresh PBMC yield of 1.45×10⁶ ±0.48 cells per milliliter of useable whole blood. 95% of samples were within the acceptable range for fresh cell yield of 0.8–3.2×10⁶ cells/ml of usable blood. Prior to full implementation of the HVTN PBMC Quality Management Program, the 2007 EQC evaluations from 10 international sites showed a mean day 2 thawed viability of 83.1% and recovery of 67.5%. Since then, four-year cumulative data covering 3338 specimens used in immunologic assays shows that 99.88% had acceptable viabilities (>66%) for use in cellular assays (mean, 91.46% ±4.5%), and 96.2% had acceptable recoveries (50%–130%) with a mean recovery of 85.8% ±19.12% of the originally cryopreserved cells. EQC testing revealed that since August 2009, failed recoveries dropped from 4.1% to 1.6% and failed viabilities dropped from 1.0% to 0.3%. The HVTN PBMC quality program provides for laboratory assessment, training and tools for identifying problems, implementing corrective action and monitoring for improvements. These data support the benefits of implementing a comprehensive, web-based PBMC quality program for large clinical trials networks. PMID:24709391
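
    The acceptance windows quoted above (post-thaw viability above 66%, recovery between 50% and 130% of the originally cryopreserved cells) reduce to a one-line check. A minimal sketch:

    ```python
    def pbmc_qc_pass(viability_pct, recovery_pct):
        """Acceptance windows quoted in the report: viability > 66%,
        recovery between 50% and 130% of cryopreserved cells."""
        return viability_pct > 66 and 50 <= recovery_pct <= 130

    print(pbmc_qc_pass(91.5, 85.8))  # True
    print(pbmc_qc_pass(60.0, 85.8))  # False: viability too low
    ```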

  20. Measuring and improving the quality of postoperative epidural analgesia for major abdominal surgery using statistical process control charts.

    PubMed

    Duncan, Fiona; Haigh, Carol

    2013-10-01

    To explore and improve the quality of continuous epidural analgesia for pain relief using Statistical Process Control tools. Measuring the quality of pain management interventions is complex. Intermittent audits do not accurately capture the results of quality improvement initiatives. The failure rate for one intervention, epidural analgesia, is approximately 30% in everyday practice, so it is an important area for improvement. Continuous measurement and analysis are required to understand the multiple factors involved in providing effective pain relief. The design was process control and quality improvement. Routine, prospectively acquired data collection started in 2006. Patients were asked about their pain and side effects of treatment. Statistical Process Control methods were applied for continuous data analysis. A multidisciplinary group worked together to identify reasons for variation in the data and instigated ideas for improvement. The key measure for improvement was a reduction in the percentage of patients with an epidural in severe pain. The baseline control charts illustrated the recorded variation in the rate of several processes and outcomes for 293 surgical patients. The mean visual analogue pain score (VNRS) was four. There was no special cause variation when data were stratified by surgeons, clinical area or patients who had experienced pain before surgery. Fifty-seven per cent of patients were hypotensive on the first day after surgery. We were able to demonstrate a significant improvement in the failure rate of epidurals as the project continued with quality improvement interventions. Statistical Process Control is a useful tool for measuring and improving the quality of pain management. The applications of Statistical Process Control methods offer the potential to learn more about the process of change and outcomes in an Acute Pain Service both locally and nationally. We have been able to develop measures for improvement and benchmarking in routine care that has led to the establishment of a national pain registry. © 2013 Blackwell Publishing Ltd.
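
    For an outcome expressed as "percentage of patients with an epidural in severe pain", the natural SPC chart is a p-chart, whose three-sigma limits follow from the binomial standard error. A minimal sketch with illustrative numbers; the article does not give its subgroup sizes:

    ```python
    import math

    def p_chart_limits(p_bar, n):
        """3-sigma control limits for a p-chart monitoring a proportion,
        with subgroup size n; limits are clipped to [0, 1]."""
        se = math.sqrt(p_bar * (1 - p_bar) / n)
        return max(0.0, p_bar - 3 * se), min(1.0, p_bar + 3 * se)

    # e.g. a long-run failure rate of 30% audited in groups of 25 patients
    print(p_chart_limits(0.30, 25))  # approx (0.025, 0.575)
    ```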

  1. Quality control management and communication between radiologists and technologists.

    PubMed

    Nagy, Paul G; Pierce, Benjamin; Otto, Misty; Safdar, Nabile M

    2008-06-01

    The greatest barrier to quality control (QC) in the digital imaging environment is the lack of communication and documentation between those who interpret images and those who acquire them. Paper-based QC methods are insufficient in a digital image management system. Problem work flow must be incorporated into reengineering efforts when migrating to a digital practice. The authors implemented a Web-based QC feedback tool to document and facilitate the communication of issues identified by radiologists. The goal was to promote a responsive and constructive tool that contributes to a culture of quality. The hypothesis was that by making it easier for radiologists to submit quality issues, the number of QC issues submitted would increase. The authors integrated their Web-based quality tracking system with a clinical picture archiving and communication system so that radiologists could report quality issues without disrupting clinical work flow. Graphical dashboarding techniques aid supervisors in using this database to identify the root causes of different types of issues. Over the initial 12-month rollout period, starting in the general section, the authors recorded 20 times more QC issues submitted by radiologists, accompanied by a rise in technologists' responsiveness to QC issues. For technologists with high numbers of QC issues, the incorporation of data from this tracking system proved useful in performance appraisals and in driving individual improvement. This tool is an example of the types of information technology innovations that can be leveraged to support QC in the digital imaging environment. Initial data suggest that the result is not only an improvement in quality but higher levels of satisfaction for both radiologists and technologists.

  2. Implementing self sustained quality control procedures in a clinical laboratory.

    PubMed

    Khatri, Roshan; K C, Sanjay; Shrestha, Prabodh; Sinha, J N

    2013-01-01

    Quality control is an essential component in every clinical laboratory which maintains the excellence of laboratory standards, supplementing proper disease diagnosis and patient care and resulting in overall strengthening of the health care system. Numerous quality control schemes are available, with combinations of procedures, most of which are tedious, time-consuming and can be "too technical", whereas commercially available quality control materials can be expensive, especially for laboratories in developing nations like Nepal. Here, we present a procedure performed at our centre with self-prepared control serum and the use of simple statistical tools for quality assurance. The pooled serum was prepared as per guidelines for the preparation of stabilized liquid quality control serum from human sera. Internal Quality Assessment was performed on this sample on a daily basis, which included measurement of 12 routine biochemical parameters. The results were plotted on Levey-Jennings charts and analysed with quality control rules, for a period of one month. The mean levels of biochemical analytes in the self-prepared control serum were within the normal physiological range. This serum was evaluated every day along with patients' samples. The results obtained were plotted on control charts and analysed using common quality control rules to identify possible systematic and random errors. Immediate mitigation measures were taken and the dispatch of erroneous reports was avoided. In this study we highlight a simple internal quality control procedure which can be performed by laboratories with minimal technology, expenditure and expertise, to improve the reliability and validity of test reports.

  3. Process and control systems for composites manufacturing

    NASA Technical Reports Server (NTRS)

    Tsiang, T. H.; Wanamaker, John L.

    1992-01-01

    A precise control of composite material processing would not only improve part quality, but it would also directly reduce the overall manufacturing cost. The development and incorporation of sensors will help to generate real-time information for material processing relationships and equipment characteristics. In the present work, thermocouple, pressure transducer, and dielectrometer technologies were investigated. The monitoring sensors were integrated with the computerized control system in three non-autoclave fabrication techniques: hot-press, self-contained tool (self-heating and pressurizing), and pressure vessel. The sensors were implemented in the parts and tools.

  4. Simulation techniques in hyperthermia treatment planning

    PubMed Central

    Paulides, MM; Stauffer, PR; Neufeld, E; Maccarini, P; Kyriakou, A; Canters, RAM; Diederich, C; Bakker, JF; Van Rhoon, GC

    2013-01-01

    Clinical trials have shown that hyperthermia (HT), i.e. an increase of tissue temperature to 39-44°C, significantly enhances radiotherapy and chemotherapy effectiveness (1). Driven by the developments in computational techniques and computing power, personalized hyperthermia treatment planning (HTP) has matured and has become a powerful tool for optimizing treatment quality. Electromagnetic, ultrasound, and thermal simulations using realistic clinical setups are now being performed to achieve patient-specific treatment optimization. In addition, extensive studies aimed at properly implementing novel HT tools and techniques, and at assessing the quality of HT, are becoming more common. In this paper, we review the simulation tools and techniques developed for clinical hyperthermia, and evaluate their current status on the path from “model” to “clinic”. In addition, we illustrate the major techniques employed for validation and optimization. HTP has become an essential tool for improvement, control, and assessment of HT treatment quality. As such, it plays a pivotal role in the quest to establish HT as an efficacious addition to multi-modality treatment of cancer. PMID:23672453

  5. Some Inspection Methods for Quality Control and In-service Inspection of GLARE

    NASA Astrophysics Data System (ADS)

    Sinke, J.

    2003-07-01

    Quality control of materials and structures is an important issue for GLARE as well. During the manufacturing stage, the processes and materials should be monitored and checked frequently in order to obtain a qualified product. During the operation of the aircraft, frequent monitoring and inspections are performed to maintain the quality at a prescribed level; therefore, in-service inspection methods are applied and, when necessary, repair activities are conducted. For the quality control of GLARE panels and components during manufacturing, the C-scan method proves to be an effective tool. For in-service inspection, the eddy current method is one of the suitable options. In this paper a brief overview is presented of both methods and their application to GLARE products.

  6. Many roads may lead to Rome: Selected features of quality control within environmental assessment systems in the US, NL, CA, and UK

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Günther, Markus, E-mail: markus.guenther@tu-berlin.de; Geißler, Gesa; Köppel, Johann

    As there is no one-and-only concept on how to precisely define and establish quality control (QC) or quality assurance (QA) in the making of environmental assessments (EA), this paper presents selected features of international approaches that address quality in EA systems in the USA, the Netherlands, Canada, and the United Kingdom. Based on explanative case studies, we highlight the embedding of specific quality control features within the EA systems, the objectives and processes, and relevant transparency challenges. Such features of QC/QA approaches can be considered in cases where substantial quality control and assurance efforts are still missing. Yet further research needs to be conducted on the efficacy of these approaches, which remains beyond the scope of this study. - Highlights: • We present four tools for quality control and assurance from different EA systems. • Approaches vary in institutional setting, objectives, procedures, and transparency. • Highlighted features might provide guidance in cases where QC/QA is still lacking.

  7. Machine Learning: A Crucial Tool for Sensor Design

    PubMed Central

    Zhao, Weixiang; Bhushan, Abhinav; Santamaria, Anthony D.; Simon, Melinda G.; Davis, Cristina E.

    2009-01-01

    Sensors have been widely used for disease diagnosis, environmental quality monitoring, food quality control, industrial process analysis and control, and other related fields. As a key tool for sensor data analysis, machine learning is becoming a core part of novel sensor design. Dividing a complete machine learning process into three steps (data pre-treatment; feature extraction and dimension reduction; and system modeling), this paper provides a review of the methods widely used for each step. For each method, the principles and the key issues that affect modeling results are discussed. After reviewing the potential problems in machine learning processes, this paper gives a summary of current algorithms in this field and provides some feasible directions for future studies. PMID:20191110
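
    A minimal sketch of that three-step structure in scikit-learn terms; the scaler, PCA and SVM choices (and all parameters) are illustrative assumptions, not methods prescribed by the review:

```python
# A sketch of the three-step process named above; every choice here
# (scaler, PCA, SVM, parameters) is an illustrative assumption.
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler   # step 1: data pre-treatment
from sklearn.decomposition import PCA              # step 2: feature extraction / dimension reduction
from sklearn.svm import SVC                        # step 3: system modeling

model = Pipeline([
    ("pretreat", StandardScaler()),
    ("reduce", PCA(n_components=5)),
    ("classify", SVC(kernel="rbf")),
])
# Usage (X: sensor readings, y: class labels):
# model.fit(X_train, y_train); y_pred = model.predict(X_test)
```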

  8. AfterQC: automatic filtering, trimming, error removing and quality control for fastq data.

    PubMed

    Chen, Shifu; Huang, Tanxiao; Zhou, Yanqing; Han, Yue; Xu, Mingyan; Gu, Jia

    2017-03-14

    Some applications, especially clinical applications requiring high accuracy of sequencing data, have to contend with unavoidable sequencing errors. Several tools have been proposed to profile sequencing quality, but few of them can quantify or correct sequencing errors. This unmet requirement motivated us to develop AfterQC, a tool that profiles sequencing errors and corrects most of them, plus highly automated quality control and data filtering features. Unlike most tools, AfterQC analyses the overlapping of paired sequences for pair-end sequencing data. Based on overlapping analysis, AfterQC can detect and cut adapters, and furthermore it provides a novel function to correct wrong bases in the overlapping regions. Another new feature is the detection and visualisation of sequencing bubbles, which are commonly found on flowcell lanes and may cause sequencing errors. Besides normal per-cycle quality and base content plotting, AfterQC also provides features like polyX (a long sub-sequence of a same base X) filtering, automatic trimming and K-MER-based strand bias profiling. For each single FastQ file or pair of FastQ files, AfterQC filters out bad reads, detects and eliminates sequencer bubble effects, trims reads at front and tail, detects sequencing errors and corrects part of them, and finally outputs clean data and generates HTML reports with interactive figures. AfterQC can run in batch mode with multiprocess support; it can run with a single FastQ file, a single pair of FastQ files (for pair-end sequencing), or a folder in which all included FastQ files are processed automatically. Based on overlapping analysis, AfterQC can estimate the sequencing error rate and profile the error transform distribution. The results of our error profiling tests show that the error distribution is highly platform dependent. Much more than just another quality control (QC) tool, AfterQC performs quality control, data filtering, error profiling and base correction automatically. Experimental results show that AfterQC helps to eliminate sequencing errors in pair-end sequencing data to provide much cleaner outputs, and consequently helps to reduce false-positive variants, especially for low-frequency somatic mutations. While providing rich configurable options, AfterQC can detect and set all options automatically and requires no arguments in most cases.
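
    For illustration only, here is a minimal sketch of the core overlapping-analysis idea (not AfterQC's actual implementation): slide the reverse complement of read 2 along read 1 and accept the first low-mismatch alignment; in a full pipeline, disagreeing bases in the overlap would then be resolved in favour of the higher-quality call.

```python
# Illustrative only: locate the overlap of read 1 with the reverse
# complement of read 2 by scanning offsets and counting mismatches.
COMP = str.maketrans("ACGT", "TGCA")

def revcomp(seq):
    return seq.translate(COMP)[::-1]

def find_overlap(r1, r2, min_len=6, max_mismatch=2):
    """Return (offset, length) of the first low-mismatch overlap, or None."""
    rc2 = revcomp(r2)
    for offset in range(len(r1) - min_len + 1):
        region = min(len(r1) - offset, len(rc2))
        pairs = zip(r1[offset:offset + region], rc2[:region])
        if sum(a != b for a, b in pairs) <= max_mismatch:
            return offset, region
    return None

# read 2 reverse-complements to "GGCCTT", which sits at offset 20 of read 1
print(find_overlap("ACGTACGTACGTACGTACGTGGCCTT", "AAGGCC"))  # -> (20, 6)
```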

  9. Statistical process control: separating signal from noise in emergency department operations.

    PubMed

    Pimentel, Laura; Barrueto, Fermin

    2015-05-01

    Statistical process control (SPC) is a visually appealing and statistically rigorous methodology very suitable to the analysis of emergency department (ED) operations. We demonstrate that the control chart is the primary tool of SPC; it is constructed by plotting data measuring the key quality indicators of operational processes in rationally ordered subgroups such as units of time. Control limits are calculated using formulas reflecting the variation in the data points from one another and from the mean. SPC allows managers to determine whether operational processes are controlled and predictable. We review why the moving range chart is most appropriate for use in the complex ED milieu, how to apply SPC to ED operations, and how to determine when performance improvement is needed. SPC is an excellent tool for operational analysis and quality improvement for these reasons: 1) control charts make large data sets intuitively coherent by integrating statistical and visual descriptions; 2) SPC provides analysis of process stability and capability rather than simple comparison with a benchmark; 3) SPC allows distinction between special cause variation (signal), indicating an unstable process requiring action, and common cause variation (noise), reflecting a stable process; and 4) SPC keeps the focus of quality improvement on process rather than individual performance. Because data have no meaning apart from their context, and every process generates information that can be used to improve it, we contend that SPC should be seriously considered for driving quality improvement in emergency medicine. Copyright © 2015 Elsevier Inc. All rights reserved.
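
    As a concrete illustration of the control-limit calculation described above, here is a minimal sketch of an individuals/moving-range (XmR) computation; the daily ED metric values are invented, and 2.66 is the standard XmR constant:

```python
# XmR control limits for a hypothetical daily ED metric (minutes);
# 2.66 is the standard individuals-chart constant.
import numpy as np

times = np.array([42, 38, 51, 47, 40, 55, 44, 39, 61, 45], dtype=float)
mr = np.abs(np.diff(times))              # moving ranges of consecutive points
center = times.mean()
ucl = center + 2.66 * mr.mean()          # upper control limit
lcl = center - 2.66 * mr.mean()          # lower control limit
signals = (times > ucl) | (times < lcl)  # special-cause variation ("signal")
print(f"CL={center:.1f}  UCL={ucl:.1f}  LCL={lcl:.1f}  signals={np.where(signals)[0]}")
```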

  10. A mask quality control tool for the OSIRIS multi-object spectrograph

    NASA Astrophysics Data System (ADS)

    López-Ruiz, J. C.; Vaz Cedillo, Jacinto Javier; Ederoclite, Alessandro; Bongiovanni, Ángel; González Escalera, Víctor

    2012-09-01

    The OSIRIS multi-object spectrograph uses a set of user-customised masks, which are manufactured on demand. The manufacturing process consists of drilling the specified slits in the mask with the required accuracy, and ensuring that the slits are in the right place when observing is of vital importance. We present a tool for checking the quality of the mask manufacturing process, based on analyzing instrument images obtained with the manufactured masks in place. The tool extracts the slit information from these images, relates the specifications to the extracted slit information, and finally reports to the operator whether the manufactured mask fulfills the expectations of the mask designer. The proposed tool has been built using scripting languages and standard libraries such as opencv, pyraf and scipy. The software architecture, advantages and limits of this tool in the lifecycle of a multi-object acquisition are presented.
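
    A minimal sketch of the image-analysis step, assuming OpenCV-style thresholding and contour extraction (the record names opencv among its libraries, but the actual pipeline is not published here); the synthetic image and the tolerance check are placeholders:

```python
# A stand-in exposure image: dark background with two bright "slits".
import cv2
import numpy as np

img = np.zeros((200, 300), dtype=np.uint8)
cv2.rectangle(img, (40, 60), (45, 140), 255, -1)
cv2.rectangle(img, (120, 30), (125, 110), 255, -1)

_, binary = cv2.threshold(img, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
measured = sorted(cv2.boundingRect(c) for c in contours)  # (x, y, w, h) per slit
print(measured)
# A real tool would compare each rectangle against the mask design file
# and report slits whose position or size is out of tolerance.
```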

  11. GCE Data Toolbox for MATLAB - a software framework for automating environmental data processing, quality control and documentation

    NASA Astrophysics Data System (ADS)

    Sheldon, W.; Chamblee, J.; Cary, R. H.

    2013-12-01

    Environmental scientists are under increasing pressure from funding agencies and journal publishers to release quality-controlled data in a timely manner, as well as to produce comprehensive metadata for submitting data to long-term archives (e.g. DataONE, Dryad and BCO-DMO). At the same time, the volume of digital data that researchers collect and manage is increasing rapidly due to advances in high frequency electronic data collection from flux towers, instrumented moorings and sensor networks. However, few pre-built software tools are available to meet these data management needs, and those tools that do exist typically focus on part of the data management lifecycle or one class of data. The GCE Data Toolbox has proven to be both a generalized and effective software solution for environmental data management in the Long Term Ecological Research Network (LTER). This open source MATLAB software library, developed by the Georgia Coastal Ecosystems LTER program, integrates metadata capture, creation and management with data processing, quality control and analysis to support the entire data lifecycle. Raw data can be imported directly from common data logger formats (e.g. SeaBird, Campbell Scientific, YSI, Hobo), as well as delimited text files, MATLAB files and relational database queries. Basic metadata are derived from the data source itself (e.g. parsed from file headers) and by value inspection, and then augmented using editable metadata templates containing boilerplate documentation, attribute descriptors, code definitions and quality control rules. Data and metadata content, quality control rules and qualifier flags are then managed together in a robust data structure that supports database functionality and ensures data validity throughout processing. A growing suite of metadata-aware editing, quality control, analysis and synthesis tools are provided with the software to support managing data using graphical forms and command-line functions, as well as developing automated workflows for unattended processing. Finalized data and structured metadata can be exported in a wide variety of text and MATLAB formats or uploaded to a relational database for long-term archiving and distribution. The GCE Data Toolbox can be used as a complete, light-weight solution for environmental data and metadata management, but it can also be used in conjunction with other cyber infrastructure to provide a more comprehensive solution. For example, newly acquired data can be retrieved from a Data Turbine or Campbell LoggerNet Database server for quality control and processing, then transformed to CUAHSI Observations Data Model format and uploaded to a HydroServer for distribution through the CUAHSI Hydrologic Information System. The GCE Data Toolbox can also be leveraged in analytical workflows developed using Kepler or other systems that support MATLAB integration or tool chaining. This software can therefore be leveraged in many ways to help researchers manage, analyze and distribute the data they collect.
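
    The toolbox itself is MATLAB, but the metadata-driven QC idea it describes (rules stored with the data that assign qualifier flags) can be sketched in Python; the column names, limits and flag codes here are invented:

```python
# Invented rules: each maps a column to (predicate, qualifier flag).
import pandas as pd

rules = {
    "salinity": [
        (lambda s: (s < 0) | (s > 40), "I"),   # "I": impossible value
        (lambda s: s.diff().abs() > 5, "Q"),   # "Q": suspect spike
    ],
}

def apply_qc(df):
    flags = pd.DataFrame("", index=df.index, columns=df.columns)
    for col, col_rules in rules.items():
        for predicate, flag in col_rules:
            flags.loc[predicate(df[col]), col] += flag
    return flags

df = pd.DataFrame({"salinity": [30.1, 30.3, 55.0, 30.2]})
print(apply_qc(df))   # row 2 flagged "IQ", row 3 flagged "Q"
```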

  12. Computer applications in scientific balloon quality control

    NASA Astrophysics Data System (ADS)

    Seely, Loren G.; Smith, Michael S.

    Seal defects and seal tensile strength are primary determinants of product quality in scientific balloon manufacturing; they therefore require a unit of quality measure. The availability of inexpensive and powerful data-processing tools can serve as the basis of an analysis that discerns quality trends in products. The results of one such analysis are presented in graphic form for use on the production floor. Software descriptions and their sample outputs are presented, together with a summary of the overall and long-term effects of these methods on product quality.

  13. Computerized Decision Aids for Shared Decision Making in Serious Illness: Systematic Review.

    PubMed

    Staszewska, Anna; Zaki, Pearl; Lee, Joon

    2017-10-06

    Shared decision making (SDM) is important in achieving patient-centered care. SDM tools such as decision aids are intended to inform the patient. When used to assist in decision making between treatments, decision aids have been shown to reduce decisional conflict, increase ease of decision making, and increase modification of previous decisions. The purpose of this systematic review is to assess the impact of computerized decision aids on patient-centered outcomes related to SDM for seriously ill patients. PubMed and Scopus databases were searched to identify randomized controlled trials (RCTs) that assessed the impact of computerized decision aids on patient-centered outcomes and SDM in serious illness. Six RCTs were identified and data were extracted on study population, design, and results. Risk of bias was assessed by a modified Cochrane Risk of Bias Tool for Quality Assessment of Randomized Controlled Trials. Six RCTs tested decision tools in varying serious illnesses. Three studies compared different computerized decision aids against each other and a control. All but one study demonstrated improvement in at least one patient-centered outcome. Computerized decision tools may reduce unnecessary treatment in patients with low disease severity in comparison with informational pamphlets. Additionally, electronic health record (EHR) portals may provide the opportunity to manage care from the home for individuals affected by illness. The quality of decision aids is of great importance. Furthermore, satisfaction with the use of tools is associated with increased patient satisfaction and reduced decisional conflict. Finally, patients may benefit from computerized decision tools without the need for increased physician involvement. Most computerized decision aids improved at least one patient-centered outcome. All RCTs identified were at a High Risk of Bias or Unclear Risk of Bias. Effort should be made to improve the quality of RCTs testing SDM aids in serious illness. ©Anna Staszewska, Pearl Zaki, Joon Lee. Originally published in JMIR Medical Informatics (http://medinform.jmir.org), 06.10.2017.

  14. Spatially-controlled illumination with rescan confocal microscopy enhances image quality, resolution and reduces photodamage

    NASA Astrophysics Data System (ADS)

    Krishnaswami, Venkataraman; De Luca, Giulia M. R.; Breedijk, Ronald M. P.; Van Noorden, Cornelis J. F.; Manders, Erik M. M.; Hoebe, Ron A.

    2017-02-01

    Fluorescence microscopy is an important tool in biomedical imaging. An inherent trade-off lies between image quality and photodamage. Recently, we have introduced rescan confocal microscopy (RCM) that improves the lateral resolution of a confocal microscope down to 170 nm. Previously, we have demonstrated that with controlled-light exposure microscopy, spatial control of illumination reduces photodamage without compromising image quality. Here, we show that the combination of these two techniques leads to high resolution imaging with reduced photodamage without compromising image quality. Implementation of spatially-controlled illumination was carried out in RCM using a line scanning-based approach. Illumination is spatially-controlled for every line during imaging with the help of a prediction algorithm that estimates the spatial profile of the fluorescent specimen. The estimation is based on the information available from previously acquired line images. As a proof-of-principle, we show images of N1E-115 neuroblastoma cells, obtained by this new setup with reduced illumination dose, improved resolution and without compromising image quality.
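
    A minimal sketch of the line-wise prediction idea, under our own simplifying assumption (not the authors' published algorithm) that the next line's profile is estimated from an average of recent lines and the dose is scaled down where little signal is expected:

```python
# Our reading of the idea, not the authors' algorithm: estimate the next
# line's profile from recent lines and dim illumination where signal is low.
import numpy as np

def illumination_for_next_line(previous_lines, full_dose=1.0, floor=0.1):
    """previous_lines: 2-D array (lines, pixels) of acquired intensities."""
    predicted = previous_lines[-3:].mean(axis=0)      # crude profile estimate
    weight = predicted / max(predicted.max(), 1e-9)   # normalized expected signal
    return np.clip(weight, floor, 1.0) * full_dose    # never switch fully off

lines = np.random.rand(10, 512)                       # stand-in acquired lines
dose = illumination_for_next_line(lines)              # per-pixel dose for line 11
```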

  15. Colorimetry as Quality Control Tool for Individual Inkjet-Printed Pediatric Formulations.

    PubMed

    Wickström, Henrika; Nyman, Johan O; Indola, Mathias; Sundelin, Heidi; Kronberg, Leif; Preis, Maren; Rantanen, Jukka; Sandler, Niklas

    2017-02-01

    Printing technologies were recently introduced to the pharmaceutical field for the manufacturing of drug delivery systems. Printing allows on-demand manufacturing of flexible pharmaceutical doses in a personalized manner, which is critical for the successful and safe treatment of patient populations with specific needs, such as children, the elderly, and patients facing multimorbidity. Printing of pharmaceuticals as a technique places new demands on quality control procedures. For example, rapid quality control is needed because printing can be done on demand and at the point of care. This study evaluated the potential use of a handheld colorimetry device for quality control of printed doses of B vitamins on edible rice and sugar substrates. The structural features of the substrates with and without ink were also compared. A multicomponent ink formulation with vitamins B1, B2, B3, and B6 was developed. Doses (4 cm²) were prepared by applying 1-10 layers of yellow ink onto the white substrates using thermal inkjet technology. The colorimetric method was viable in detecting doses up to the 5th and 6th printed layers, at which point saturation of the yellow color parameter (b*) was observed on the substrates. Liquid chromatography mass spectrometry was used as a reference method for the colorimetry measurements plotted against the number of printed layers. It was concluded that colorimetry could be used as a quality control tool for the detection of different doses. However, the color addition needs to be optimized to avoid color saturation within the planned dose interval.
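
    A minimal sketch of such a calibration, assuming RGB readings are converted to CIELAB so that the b* channel can be regressed against the number of printed layers; the RGB values below are invented:

```python
# Invented RGB readings (one averaged pixel per printed-layer count).
import numpy as np
from skimage.color import rgb2lab

layers = np.arange(1, 6)
readings = [(0.98, 0.97, 0.80), (0.97, 0.95, 0.65), (0.96, 0.93, 0.52),
            (0.95, 0.91, 0.41), (0.94, 0.90, 0.33)]
b_star = [rgb2lab(np.array(r).reshape(1, 1, 3))[0, 0, 2] for r in readings]

slope, intercept = np.polyfit(layers, b_star, 1)      # linear calibration
print(f"b* = {slope:.2f} x layers + {intercept:.2f}")
```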

  16. Inpatient preanalytic process improvements.

    PubMed

    Wagar, Elizabeth A; Phipps, Ron; Del Guidice, Robert; Middleton, Lavinia P; Bingham, John; Prejean, Cheryl; Johnson-Hamilton, Martha; Philip, Pheba; Le, Ngoc Han; Muses, Waheed

    2013-12-01

    Phlebotomy services are a common target for preanalytic improvements. Many new quality-engineering tools have recently been applied in clinical laboratories; however, data on relatively few projects have been published. This example describes a complete application of current quality-engineering tools to improve preanalytic phlebotomy services. The goals were to decrease the response time in the preanalytic inpatient laboratory by 25%, to reduce the number of incident reports related to preanalytic phlebotomy, and to make systematic process changes that satisfied the stakeholders. The Department of Laboratory Medicine, General Services Section, at the University of Texas MD Anderson Cancer Center (Houston) is responsible for inpatient phlebotomy in a 24-hour operation serving 689 inpatient beds. The study director was project director of the Division of Pathology and Laboratory Medicine's Quality Improvement Section and was assisted by 2 quality technologists and an industrial engineer from the MD Anderson Office of Performance Improvement. After implementing each solution, using well-recognized quality tools and metrics, the response time for blood collection decreased by 23%, close to the original responsiveness goal of 25%. The response time between collection and arrival in the laboratory decreased by 8%. Applicable laboratory-related incident reports were reduced by 43%. Comprehensive application of quality tools, such as statistical control charts, Pareto diagrams, value-stream maps, process failure modes and effects analyses, fishbone diagrams, solution prioritization matrices, and customer satisfaction surveys, can significantly improve preset goals for inpatient phlebotomy.

  17. Use Cases for Combining Web Services with ArcPython Tools for Enabling Quality Control of Land Remote Sensing Data Products.

    NASA Astrophysics Data System (ADS)

    Krehbiel, C.; Maiersperger, T.; Friesz, A.; Harriman, L.; Quenzer, R.; Impecoven, K.

    2016-12-01

    Three major obstacles facing big Earth data users are data storage, management, and analysis. As the amount of satellite remote sensing data increases, so does the need for better data storage and management strategies to exploit the plethora of data now available. Standard GIS tools can help big Earth data users who interact with and analyze increasingly large and diverse datasets. In this presentation we highlight how NASA's Land Processes Distributed Active Archive Center (LP DAAC) is tackling these big Earth data challenges. We provide a real-life use case to describe three tools and services provided by the LP DAAC to more efficiently exploit big Earth data in a GIS environment. First, we describe the Open-source Project for a Network Data Access Protocol (OPeNDAP), which serves requests for specific subsets of the data, minimizing the amount of data a user must download and improving the efficiency of downloading and processing. Next, we cover the LP DAAC's Application for Extracting and Exploring Analysis Ready Samples (AppEEARS), a web application interface for extracting and analyzing land remote sensing data. From there, we review an ArcPython toolbox that was developed to provide quality control services for land remote sensing data products. Locating and extracting specific subsets of larger big Earth datasets improves data storage and management efficiency for the end user, and quality control services provide a straightforward interpretation of big Earth data. These tools and services benefit the GIS user community by standardizing workflows and improving data storage, management, and analysis tactics.
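
    A minimal sketch of the OPeNDAP usage pattern described above, using xarray's lazy remote access; the URL, variable name and coordinate ranges are placeholders, not a real LP DAAC endpoint:

```python
# The URL, variable name and coordinate ranges are placeholders.
import xarray as xr

url = "https://example.gov/opendap/MOD13Q1_subset"  # hypothetical endpoint
ds = xr.open_dataset(url)                           # lazy: no bulk download yet
subset = ds["NDVI"].sel(lat=slice(44, 43), lon=slice(-104, -103))
data = subset.load()                                # only this subset is fetched
```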

  18. MRIQC: Advancing the automatic prediction of image quality in MRI from unseen sites

    PubMed Central

    2017-01-01

    Quality control of MRI is essential for excluding problematic acquisitions and avoiding bias in subsequent image processing and analysis. Visual inspection is subjective and impractical for large-scale datasets. Although automated quality assessments have been demonstrated on single-site datasets, it is unclear whether solutions can generalize to unseen data acquired at new sites. Here, we introduce the MRI Quality Control tool (MRIQC), a tool for extracting quality measures and fitting a binary (accept/exclude) classifier. Our tool can be run both locally and as a free online service via the OpenNeuro.org portal. The classifier is trained on a publicly available, multi-site dataset (17 sites, N = 1102). We perform model selection evaluating different normalization and feature exclusion approaches aimed at maximizing across-site generalization, and estimate an accuracy of 76%±13% on new sites using leave-one-site-out cross-validation. We confirm that result on a held-out dataset (2 sites, N = 265), also obtaining 76% accuracy. Even though the performance of the trained classifier is statistically above chance, we show that it is susceptible to site effects and unable to account for artifacts specific to new sites. MRIQC performs with high accuracy in intra-site prediction, but performance on unseen sites leaves room for improvement, which might require more labeled data and new approaches to between-site variability. Overcoming these limitations is crucial for more objective quality assessment of neuroimaging data, and for enabling the analysis of extremely large and multi-site samples. PMID:28945803
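
    A minimal sketch of leave-one-site-out cross-validation, using scikit-learn's LeaveOneGroupOut with invented stand-ins for the quality measures, labels and site assignments (MRIQC's own feature set and classifier are not reproduced here):

```python
# Stand-ins for MRIQC's quality measures, labels and site assignments.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 64))         # 64 image-quality metrics per scan
y = rng.integers(0, 2, size=300)       # 0 = accept, 1 = exclude
sites = rng.integers(0, 17, size=300)  # acquisition site of each scan

scores = cross_val_score(RandomForestClassifier(), X, y,
                         groups=sites, cv=LeaveOneGroupOut())
print(f"accuracy on held-out sites: {scores.mean():.2f} +/- {scores.std():.2f}")
```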

  19. Evaluation of stormwater micropollutant source control and end-of-pipe control strategies using an uncertainty-calibrated integrated dynamic simulation model.

    PubMed

    Vezzaro, L; Sharma, A K; Ledin, A; Mikkelsen, P S

    2015-03-15

    The estimation of micropollutant (MP) fluxes in stormwater systems is a fundamental prerequisite when preparing strategies to reduce stormwater MP discharges to natural waters. Dynamic integrated models can be important tools in this step, as they can be used to integrate the limited data provided by monitoring campaigns and to evaluate the performance of different strategies based on model simulation results. This study presents an example in which six different control strategies, including both source control and end-of-pipe treatment, were compared. The comparison focused on fluxes of heavy metals (copper, zinc) and organic compounds (fluoranthene). MP fluxes were estimated by using an integrated dynamic model in combination with stormwater quality measurements. MP sources were identified by using GIS land usage data, runoff quality was simulated by using a conceptual accumulation/washoff model, and a stormwater retention pond was simulated by using a dynamic treatment model based on MP inherent properties. Uncertainty in the results was estimated with a pseudo-Bayesian method. Despite the great uncertainty in the MP fluxes estimated by the runoff quality model, it was possible to compare the six scenarios in terms of discharged MP fluxes, compliance with water quality criteria, and sediment accumulation. Source-control strategies obtained better results in terms of reduction of MP emissions, but all the simulated strategies failed to fulfill the criteria based on emission limit values. The results presented in this study show how the efficiency of MP pollution control strategies can be quantified by combining advanced modeling tools (an integrated stormwater quality model with uncertainty calibration). Copyright © 2014 Elsevier Ltd. All rights reserved.

  20. Implementation of the trauma registry as a tool for quality improvement in trauma care in a brazilian hospital: the first 12 months.

    PubMed

    Parreira, José Gustavo; de Campos, Tércio; Perlingeiro, Jacqueline A Gianinni; Soldá, Silvia C; Assef, José Cesar; Gonçalves, Augusto Canton; Zuffo, Bruno Malteze; Floriano, Caio Gomes; de Oliveira, Erik Haruk; de Oliveira, Renato Vieira Rodrigues; Oliveira, Amanda Lima; de Melo, Caio Gullo; Below, Cristiano; Miranda, Dino R Pérez; Santos, Gabriella Colasuonno; de Almeida, Gabriele Madeira; Brianti, Isabela Campos; Votto, Karina Baruel de Camargo; Schues, Patrick Alexander Sauer; dos Santos, Rafael Gomes; de Figueredo, Sérgio Mazzola Poli; de Araujo, Tatiani Gonçalves; Santos, Bruna do Nascimento; Ferreira, Laura Cardoso Manduca; Tanaka, Giuliana Olivi; Matos, Thiara; da Sousa, Maria Daiana; Augusto, Samara de Souza

    2015-01-01

    To analyze the implementation of a trauma registry in a university teaching hospital delivering care under the unified health system (SUS), and its ability to identify points for improvement in the quality of care provided. The data collection group comprised students from medicine and nursing courses, with or without FAPESP scholarships (technical training 1), overseen by the coordinators of the project. The itreg (ECO Sistemas-RJ/SBAIT) software was used as the database tool. Several quality "filters" were proposed to select cases for review in the quality control process. Data for 1344 trauma patients were entered into the itreg database between March and November 2014. Around 87.0% of cases were blunt trauma patients, 59.6% had RTS>7.0 and 67% had ISS<9. Full records were available for 292 cases, which were selected for review in the quality program. The auditing filters most frequently registered were laparotomy performed over four hours after admission and drainage of acute subdural hematomas over four hours after admission. Several points for improvement were flagged, such as control of overtriage of patients, the need to reduce the number of negative imaging exams, the development of protocols for achieving central venous access, and management of major TBI. The trauma registry provides a clear picture of the points to be improved in trauma patient care; however, there are specific peculiarities to implementing this tool in the Brazilian milieu.

  1. Improved quality management to enhance the efficacy of the sterile insect technique for lepidopteran pests

    USDA-ARS?s Scientific Manuscript database

    Lepidoptera are among the most severe pests of food and fibre crops in the world and are mainly controlled using broad spectrum insecticides. This does not lead to sustainable control and farmers are demanding alternative control tools which are both effective and friendly to the environment. The st...

  2. Adaptive Data Processing Technique for Lidar-Assisted Control to Bridge the Gap between Lidar Systems and Wind Turbines: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schlipf, David; Raach, Steffen; Haizmann, Florian

    2015-12-14

    This paper presents first steps toward an adaptive lidar data processing technique crucial for lidar-assisted control in wind turbines. The prediction time and the quality of the wind preview from lidar measurements depend on several factors and are not constant. If the data processing is not continually adjusted, the benefit of lidar-assisted control cannot be fully exploited, or harmful control action can even result. An online analysis of the lidar and turbine data is necessary to continually reassess the prediction time and lidar data quality. In this work, a structured process to develop an analysis tool for the prediction time and a new hardware setup for lidar-assisted control are presented. The tool consists of an online estimation of the rotor effective wind speed from lidar and turbine data and the implementation of an online cross-correlation to determine the time shift between both signals. Further, initial results are presented from an ongoing campaign in which this system was employed to provide lidar preview for feed-forward pitch control.
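
    A minimal sketch of the cross-correlation step, with synthetic signals standing in for the lidar-based and turbine-based rotor-effective wind speed estimates; the sample time and lag are invented:

```python
# Synthetic signals: the lidar "sees" the wind 2.5 s before the rotor.
import numpy as np

dt = 0.1                                   # sample time in seconds (assumed)
t = np.arange(0, 60, dt)
rotor = np.sin(0.2 * t) + 0.1 * np.random.randn(t.size)
lidar = np.roll(rotor, -25)                # leads the rotor signal by 25 samples

a = lidar - lidar.mean()
b = rotor - rotor.mean()
lag = np.argmax(np.correlate(a, b, mode="full")) - (t.size - 1)  # negative: lidar leads
print(f"estimated preview time: {-lag * dt:.1f} s")
```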

  3. QUALITY ASSURANCE AND QUALITY CONTROL IN THE DEVELOPMENT AND APPLICATION OF THE AUTOMATED GEOSPATIAL WATERSHED ASSESSMENT (AGWA) TOOL

    EPA Science Inventory

    Planning and assessment in land and water resource management are evolving from simple, local-scale problems toward complex, spatially explicit regional ones. Such problems have to be addressed with distributed models that can compute runoff and erosion at different spatial and t...

  4. Software Quality Control at Belle II

    NASA Astrophysics Data System (ADS)

    Ritter, M.; Kuhr, T.; Hauth, T.; Gebard, T.; Kristof, M.; Pulvermacher, C.; Belle Software Group, II

    2017-10-01

    Over the last seven years the software stack of the next-generation B factory experiment Belle II has grown to over one million lines of C++ and Python code, counting only the part included in offline software releases. There are several thousand commits to the central repository by about 100 individual developers per year. Keeping a coherent software stack of sufficiently high quality that it can be sustained and used efficiently for data acquisition, simulation, reconstruction, and analysis over the lifetime of the Belle II experiment is a challenge. A set of tools is employed to monitor the quality of the software and provide fast feedback to the developers. They are integrated in a machinery that is controlled by a buildbot master and automates the quality checks. The tools include different compilers, cppcheck, the clang static analyzer, valgrind memcheck, doxygen, a geometry overlap checker, a check for missing or extra library links, unit tests, steering file level tests, a sophisticated high-level validation suite, and an issue tracker. The technological development infrastructure is complemented by organizational means to coordinate the development.

  5. [Development of quality assurance/quality control web system in radiotherapy].

    PubMed

    Okamoto, Hiroyuki; Mochizuki, Toshihiko; Yokoyama, Kazutoshi; Wakita, Akihisa; Nakamura, Satoshi; Ueki, Heihachi; Shiozawa, Keiko; Sasaki, Koji; Fuse, Masashi; Abe, Yoshihisa; Itami, Jun

    2013-12-01

    Our purpose is to develop a QA/QC (quality assurance/quality control) web system built with HTML (HyperText Markup Language) and the server-side scripting language PHP (Hypertext Preprocessor), which can be a useful tool for sharing information about QA/QC in radiotherapy. The system proposed in this study can easily be built in one's own institute, because HTML can be easily handled. There are two desired functions in a QA/QC web system: (i) to review the results of QA/QC for a radiotherapy machine, along with the manuals and reports necessary for routinely performing radiotherapy; by disclosing the results, transparency can be maintained; (ii) to present the institute's QA/QC protocol using pictures and movies for simplicity's sake, which can also serve as an educational tool for junior radiation technologists and medical physicists. By using this system, not only administrators but all staff involved in radiotherapy can obtain information about the condition and accuracy of treatment machines through the QA/QC web system.

  6. Discussion of the quality control and performance testing of ultrasound diagnostic equipment

    NASA Astrophysics Data System (ADS)

    Jiang, Junjie

    2018-03-01

    In recent years, with the rapid development of ultrasonography and the application and popularization of new technology in ultrasound equipment, the level of diagnostic information available to doctors has improved continuously, and ultrasound has become an indispensable diagnostic tool for medical institutions. The performance of the equipment is directly related to the doctor's diagnosis and the patient's health; therefore, it is very important to choose a good method for quality control and performance testing.

  7. Are you good enough for your patients? The European certification model in laparoscopic surgery

    PubMed Central

    Campo, R.; Molinas, C.R.; De Wilde, R.L.; Brolmann, H.; Brucker, S.; Mencaglia, L.; Odonovan, P.; Wallwiener, D.; Wattiez, A.

    2012-01-01

    Quality control, training and education in gynaecological surgery are being challenged, and urgent measures are needed. The implementation of a structured and validated program for training and quality control seems the most urgent measure to be taken. The European Academy of Gynaecological Surgery has made a first attempt to do so. Through a system of practical and theoretical tests, the skills of an individual surgeon are measured, and the conditions for entering the different levels of expertise are clearly defined. This certification system, based on the best available level of scientific evidence, provides a first practical, universally implementable tool for sound quality control and a structured training program in gynaecological laparoscopic surgery. PMID:24753896

  8. Water Network Tool for Resilience v. 1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2015-12-09

    WNTR is a python package designed to simulate and analyze resilience of water distribution networks. The software includes: - Pressure driven and demand driven hydraulic simulation - Water quality simulation to track concentration, trace, and water age - Conditional controls to simulate power outages - Models to simulate pipe breaks - A wide range of resilience metrics - Analysis and visualization tools
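
    A minimal usage sketch consistent with the feature list above; the .inp file name is a placeholder for a real EPANET network model:

```python
import wntr

wn = wntr.network.WaterNetworkModel("network.inp")  # placeholder EPANET model
sim = wntr.sim.WNTRSimulator(wn)                    # WNTR's own hydraulic solver
results = sim.run_sim()
pressure = results.node["pressure"]                 # pandas DataFrame: time x node
print(pressure.head())
```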

  9. Benefits of a comprehensive quality program for cryopreserved PBMC covering 28 clinical trials sites utilizing an integrated, analytical web-based portal.

    PubMed

    Ducar, Constance; Smith, Donna; Pinzon, Cris; Stirewalt, Michael; Cooper, Cristine; McElrath, M Juliana; Hural, John

    2014-07-01

    The HIV Vaccine Trials Network (HVTN) is a global network of 28 clinical trial sites dedicated to identifying an effective HIV vaccine. Cryopreservation of high-quality peripheral blood mononuclear cells (PBMC) is critical for the assessment of vaccine-induced cellular immune functions. The HVTN PBMC Quality Management Program is designed to ensure that viable PBMC are processed, stored and shipped for clinical trial assays from all HVTN clinical trial sites. The program has evolved by developing and incorporating best practices for laboratory and specimen quality and implementing automated, web-based tools. These tools allow the site-affiliated processing laboratories and the central Laboratory Operations Unit to rapidly collect, analyze and report PBMC quality data. The HVTN PBMC Quality Management Program includes five key components: 1) Laboratory Assessment, 2) PBMC Training and Certification, 3) Internal Quality Control, 4) External Quality Control (EQC), and 5) Assay Specimen Quality Control. Fresh PBMC processing data is uploaded from each clinical site processing laboratory to a central HVTN Statistical and Data Management Center database for access and analysis on a web portal. Samples are thawed at a central laboratory for assay or specimen quality control and sample quality data is uploaded directly to the database by the central laboratory. Four year cumulative data covering 23,477 blood draws reveals an average fresh PBMC yield of 1.45×10(6)±0.48 cells per milliliter of useable whole blood. 95% of samples were within the acceptable range for fresh cell yield of 0.8-3.2×10(6) cells/ml of usable blood. Prior to full implementation of the HVTN PBMC Quality Management Program, the 2007 EQC evaluations from 10 international sites showed a mean day 2 thawed viability of 83.1% and a recovery of 67.5%. Since then, four year cumulative data covering 3338 specimens used in immunologic assays shows that 99.88% had acceptable viabilities (>66%) for use in cellular assays (mean, 91.46% ±4.5%), and 96.2% had acceptable recoveries (50%-130%) with a mean of recovery of 85.8% ±19.12% of the originally cryopreserved cells. EQC testing revealed that since August 2009, failed recoveries dropped from 4.1% to 1.6% and failed viabilities dropped from 1.0% to 0.3%. The HVTN PBMC quality program provides for laboratory assessment, training and tools for identifying problems, implementing corrective action and monitoring for improvements. These data support the benefits of implementing a comprehensive, web-based PBMC quality program for large clinical trials networks. Copyright © 2014 Elsevier B.V. All rights reserved.

  10. Sigma Metrics Across the Total Testing Process.

    PubMed

    Charuruks, Navapun

    2017-03-01

    Laboratory quality control has been developed over several decades to ensure patient safety, expanding from a statistical quality control focus on the analytical phase to the total laboratory process. The sigma concept provides a convenient way to quantify the number of errors in the extra-analytical and analytical phases through the defects-per-million count and the sigma metric equation. Participation in a sigma verification program can be a convenient way to monitor analytical performance for continuous quality improvement. Improvement of sigma-scale performance has been shown from our data. New tools and techniques for integration are needed. Copyright © 2016 Elsevier Inc. All rights reserved.
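
    For reference, the sigma metric referred to above is conventionally computed from the allowable total error, the observed bias and the imprecision; a standard formulation (the article may use a variant) is:

```latex
% Standard sigma-metric formulation: TEa = allowable total error,
% bias = observed systematic error, CV = imprecision, all in percent.
\[
  \text{Sigma} = \frac{\mathrm{TE_a} - \lvert \mathrm{bias} \rvert}{\mathrm{CV}}
\]
```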

  11. Development and implementation of an audit tool for quality control of parenteral nutrition.

    PubMed

    García-Rodicio, Sonsoles; Abajo, Celia; Godoy, Mercedes; Catalá, Miguel Angel

    2009-01-01

    The aim of this article is to describe the development of a quality control methodology applied to patients receiving parenteral nutrition (PN) and to present the results obtained over the past 10 years. Development of the audit tool: In 1995, a total of 13 PN quality criteria and their standards were defined based on literature and past experiences. They were applied during 5 different 6-month audits carried out in subsequent years. According to the results of each audit, the criteria with lower validity were eliminated, while others were optimized and new criteria were introduced to complete the monitoring of other areas not previously examined. Currently, the quality control process includes 22 quality criteria and their standards that examine the following 4 different areas: (1) indication and duration of PN; (2) nutrition assessment, adequacy of the nutrition support, and monitoring; (3) metabolic and infectious complications; and (4) global efficacy of the nutrition support regimen. The authors describe the current definition of each criterion and present the results obtained in the 5 audits performed. In the past year, 9 of the 22 criteria reached the predefined standards. The areas detected for further improvements were: indication for PN, nutrition assessment, and management of catheter infections. The definition of quality criteria and their standards is an efficient method of providing a qualitative and quantitative analysis of the clinical care of patients receiving PN. It detects areas for improvement and assists in developing a methodology to work efficiently.

  12. Total Quality Management in Construction

    DTIC Science & Technology

    1993-08-01

    1989. Ishikawa, Kaoru. Guide to Quality Control. White Plains, New York: Asian Productivity Organization, 1982. Kelso, Frank B. II, Admiral... category are explored to determine the most significant causes which produce the effect. Figure 24 shows a typical cause-and-effect diagram. Kaoru Ishikawa describes cause-and-effect diagrams as an effective tool "to clearly illustrate the various causes affecting quality by sorting out and relating

  13. Intelligent Chemistry Management System (ICMS)--A new approach to steam generator chemistry control

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barto, R.J.; Farrell, D.M.; Noto, F.A.

    1986-04-01

    The Intelligent Chemistry Management System (ICMS) is a new tool which assists in steam generator chemistry control. Utilizing diagnostic capabilities, the ICMS will provide utility and industrial boiler operators, system chemists, and plant engineers with a tool for monitoring, diagnosing, and controlling steam generator system chemistry. By reducing the number of forced outages through early identification of potentially detrimental conditions, suggestion of possible causes, and execution of corrective actions, improvements in unit availability and reliability will result. The system monitors water and steam quality at a number of critical locations in the plant.

  14. An experimental study of cutting performances in machining of nimonic super alloy GH2312

    NASA Astrophysics Data System (ADS)

    Du, Jinfu; Wang, Xi; Xu, Min; Mao, Jin; Zhao, Xinglong

    2018-05-01

    Nimonic super alloys are extensively used in the aerospace industry because of their unique properties. As they are quite costly and difficult to machine, the machining tool wears easily. To address this problem, an experiment was carried out on a numerically controlled automatic slitting lathe to analyze the tool wear conditions and the surface quality of parts made of nimonic super alloy GH2132 under different cutters. Suitable cutters, reasonable cutting data and cutting speeds were selected, and some conclusions are drawn. An excellent coated tool, compared with other hard-alloy cutters, together with suitable cutting data, greatly improves production efficiency and product quality, and can fully meet the machining requirements of nimonic super alloy GH2312.

  15. Machine tools and fixtures: A compilation

    NASA Technical Reports Server (NTRS)

    1971-01-01

    As part of NASA's Technology Utilization Program, a compilation was made of technological developments regarding machine tools, jigs, and fixtures that have been produced, modified, or adapted to meet requirements of the aerospace program. The compilation is divided into three sections that include: (1) a variety of machine tool applications that offer easier and more efficient production techniques; (2) methods, techniques, and hardware that aid in the setup, alignment, and control of machines and machine tools to further quality assurance in finished products; and (3) jigs, fixtures, and adapters that are ancillary to basic machine tools and aid in realizing their greatest potential.

  16. Total quality management in American industry.

    PubMed

    Widtfeldt, A K; Widtfeldt, J R

    1992-07-01

    The definition of total quality management is conformance to customer requirements and specifications, fitness for use, buyer satisfaction, and value at an affordable price. The three individuals who have developed the total quality management concepts in the United States are W.E. Deming, J.M. Juran, and Philip Crosby. The universal principles of total quality management are (a) a customer focus, (b) management commitment, (c) training, (d) process capability and control, and (e) measurement through quality improvement tools. Results from the National Demonstration Project on Quality Improvement in Health Care showed the principles of total quality management could be applied to healthcare.

  17. Application of miniaturized near-infrared spectroscopy for quality control of extemporaneous orodispersible films.

    PubMed

    Foo, Wen Chin; Widjaja, Effendi; Khong, Yuet Mei; Gokhale, Rajeev; Chan, Sui Yung

    2018-02-20

    Extemporaneous oral preparations are routinely compounded in the pharmacy due to a lack of suitable formulations for special populations. Such small-scale pharmacy preparations also present an avenue for individualized pharmacotherapy. Orodispersible films (ODF) have increasingly been evaluated as a suitable dosage form for extemporaneous oral preparations. Nevertheless, as with all other extemporaneous preparations, safety and quality remain a concern. Although the United States Pharmacopeia (USP) recommends analytical testing of compounded preparations for quality assurance, pharmaceutical assays are typically not performed routinely for such non-sterile pharmacy preparations, due to the complexity and high cost of conventional assay methods such as high performance liquid chromatography (HPLC). Spectroscopic methods including Raman, infrared and near-infrared spectroscopy have been successfully applied as quality control tools in industry. The state-of-the-art benchtop spectrometers used in those studies have the advantage of superior resolution and performance, but are not suitable for use in a small-scale pharmacy setting. In this study, we investigated the application of a miniaturized near-infrared (NIR) spectrometer as a quality control tool for identification and quantification of drug content in extemporaneous ODFs. Miniaturized NIR spectroscopy is suitable for small-scale pharmacy applications in view of its small size, portability, simple user interface, rapid measurement and real-time prediction results. Nevertheless, the challenge with miniaturized NIR spectroscopy is its lower resolution compared to state-of-the-art benchtop equipment. We successfully developed NIR spectroscopy calibration models for identification of ODFs containing five different drugs, and for quantification of drug content in ODFs containing 2-10 mg ondansetron (OND). The qualitative model for drug identification produced 100% prediction accuracy. The quantitative model to predict OND drug content in ODFs was divided into two calibrations for improved accuracy: Calibration I covered the 2-4 mg range and Calibration II the 4-10 mg range. Validation was performed for method accuracy, linearity and precision. In conclusion, this study demonstrates the feasibility of miniaturized NIR spectroscopy as a quality control tool for small-scale pharmacy preparations. Due to its non-destructive nature, every dosage unit can be tested, thus affording a positive impact on patient safety. Copyright © 2017 Elsevier B.V. All rights reserved.
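
    The quantitative models above are multivariate calibrations; a minimal sketch of the general idea, partial least squares regression mapping spectra to drug content, using synthetic spectra and hypothetical dimensions rather than the study's data, could look like:

```python
# Synthetic spectra with hypothetical dimensions; not the study's data.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(1)
content = rng.uniform(4, 10, size=40)          # mg OND per film (4-10 mg range)
channels = 125                                 # assumed number of NIR channels
spectra = (np.outer(content, rng.normal(size=channels))
           + 0.05 * rng.normal(size=(40, channels)))

pls = PLSRegression(n_components=3)
pred = cross_val_predict(pls, spectra, content, cv=5).ravel()
rmse = float(np.sqrt(np.mean((pred - content) ** 2)))
print(f"cross-validated RMSE: {rmse:.2f} mg")
```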

  18. Development of a practical approach to expert elicitation for randomised controlled trials with missing health outcomes: Application to the IMPROVE trial.

    PubMed

    Mason, Alexina J; Gomes, Manuel; Grieve, Richard; Ulug, Pinar; Powell, Janet T; Carpenter, James

    2017-08-01

    The analyses of randomised controlled trials with missing data typically assume that, after conditioning on the observed data, the probability of missing data does not depend on the patient's outcome, and so the data are 'missing at random' . This assumption is usually implausible, for example, because patients in relatively poor health may be more likely to drop out. Methodological guidelines recommend that trials require sensitivity analysis, which is best informed by elicited expert opinion, to assess whether conclusions are robust to alternative assumptions about the missing data. A major barrier to implementing these methods in practice is the lack of relevant practical tools for eliciting expert opinion. We develop a new practical tool for eliciting expert opinion and demonstrate its use for randomised controlled trials with missing data. We develop and illustrate our approach for eliciting expert opinion with the IMPROVE trial (ISRCTN 48334791), an ongoing multi-centre randomised controlled trial which compares an emergency endovascular strategy versus open repair for patients with ruptured abdominal aortic aneurysm. In the IMPROVE trial at 3 months post-randomisation, 21% of surviving patients did not complete health-related quality of life questionnaires (assessed by EQ-5D-3L). We address this problem by developing a web-based tool that provides a practical approach for eliciting expert opinion about quality of life differences between patients with missing versus complete data. We show how this expert opinion can define informative priors within a fully Bayesian framework to perform sensitivity analyses that allow the missing data to depend upon unobserved patient characteristics. A total of 26 experts, of 46 asked to participate, completed the elicitation exercise. The elicited quality of life scores were lower on average for the patients with missing versus complete data, but there was considerable uncertainty in these elicited values. The missing at random analysis found that patients randomised to the emergency endovascular strategy versus open repair had higher average (95% credible interval) quality of life scores of 0.062 (-0.005 to 0.130). Our sensitivity analysis that used the elicited expert information as pooled priors found that the gain in average quality of life for the emergency endovascular strategy versus open repair was 0.076 (-0.054 to 0.198). We provide and exemplify a practical tool for eliciting the expert opinion required by recommended approaches to the sensitivity analyses of randomised controlled trials. We show how this approach allows the trial analysis to fully recognise the uncertainty that arises from making alternative, plausible assumptions about the reasons for missing data. This tool can be widely used in the design, analysis and interpretation of future trials, and to facilitate this, materials are available for download.

  19. Practical solution for control of the pre-analytical phase in decentralized clinical laboratories for meeting the requirements of the medical laboratory accreditation standard DIN EN ISO 15189.

    PubMed

    Vacata, Vladimir; Jahns-Streubel, Gerlinde; Baldus, Mirjana; Wood, William Graham

    2007-01-01

    This report was written in response to the article by Wood published recently in this journal. It describes a practical solution to the problems of controlling the pre-analytical phase in the clinical diagnostic laboratory. As an indicator of quality in the pre-analytical phase of sample processing, a target analyte was chosen which is sensitive to delay in centrifugation and/or analysis. The results of analyses of the samples sent by satellite medical practitioners were compared with those from an on-site hospital laboratory with a controllable optimized pre-analytical phase. The aim of the comparison was: (a) to identify those medical practices whose mean/median sample values significantly deviate from those of the control situation in the hospital laboratory due to the possible problems in the pre-analytical phase; (b) to aid these laboratories in the process of rectifying these problems. A Microsoft Excel-based Pre-Analytical Survey tool (PAS tool) has been developed which addresses the above mentioned problems. It has been tested on serum potassium which is known to be sensitive to delay and/or irregularities in sample treatment. The PAS tool has been shown to be one possibility for improving the quality of the analyses by identifying the sources of problems within the pre-analytical phase, thus allowing them to be rectified. Additionally, the PAS tool has an educational value and can also be adopted for use in other decentralized laboratories.
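
    A minimal sketch of the comparison logic described above, flagging practices whose mean serum potassium deviates upward from the on-site laboratory's distribution (potassium rises with delayed centrifugation); all values are invented:

```python
# All values invented; potassium in mmol/L.
import numpy as np
from scipy import stats

hospital = np.array([4.1, 4.0, 4.3, 3.9, 4.2, 4.0, 4.1, 4.2])   # control lab
practices = {"practice_A": np.array([4.0, 4.2, 4.1, 4.3]),
             "practice_B": np.array([4.9, 5.1, 4.8, 5.2])}       # suspicious

for name, values in practices.items():
    t, p = stats.ttest_ind(values, hospital)
    suspect = p < 0.05 and values.mean() > hospital.mean()
    verdict = "check pre-analytics" if suspect else "ok"
    print(f"{name}: mean {values.mean():.2f} mmol/L, p={p:.3f} -> {verdict}")
```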

  20. Quality assurance, an administrative means to a managerial end: Part I. A historical overview.

    PubMed

    Clark, G B

    1990-01-01

    Quality has become the hallmark of industrial excellence. Many diverse factors have heightened national concern about managing quality control throughout the health-care industry, including laboratory services. Industry-wide focus on quality control has created a need for an administrative program to evaluate its effectiveness. That program is medical quality assurance. Because of national and industry-wide concern, development of quality assurance theory has gained increasing importance in medical accreditation and management circles. Scrutiny of the application of quality assurance has become particularly prominent during accreditation inspections. Implementing quality assurance programs now demands more of already finite resources. The professional laboratory manager should understand how quality assurance has developed in the United States during the past 150 years. The well-informed manager should recognize why the health-care industry only recently began to develop its own expertise in quality assurance. It is also worthwhile to understand how heavily health care has relied on the lessons learned in the non-health-care sector. This three-part series will present information that will help in applying quality assurance more effectively as a management tool in the medical laboratory. This first part outlines the early industrial, socioeconomic, and medicolegal background of quality assurance. Terminology is defined with some distinction made between the terms management and administration. The second part will address current accreditation requirements. Special emphasis will be placed on the practical application of accreditation guidelines, providing a template for quality assurance methods in the medical laboratory. The third part will provide an overview of quality assurance as a total management tool with some suggestions for developing and implementing a quality assurance program.

  1. Design and implementation of a control structure for quality products in a crude oil atmospheric distillation column.

    PubMed

    Sotelo, David; Favela-Contreras, Antonio; Sotelo, Carlos; Jiménez, Guillermo; Gallegos-Canales, Luis

    2017-11-01

    In recent years, interest in petrochemical processes has been increasing, especially in the refining area. However, the high variability of the dynamic characteristics present in the atmospheric distillation column poses a challenge for obtaining quality products. To improve distillate quality in spite of changes in the input crude oil composition, this paper details a new control strategy for a conventional crude oil distillation plant, designed using formal interaction analysis tools. The process dynamics and control are simulated in the Aspen HYSYS® dynamic environment under realistic operating conditions. The simulation results are compared against a typical control strategy commonly used in crude oil atmospheric distillation columns. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.

  2. A geographic information system screening tool to tackle diffuse pollution through the use of sustainable drainage systems.

    PubMed

    Todorovic, Zorica; Breton, Neil P

    2014-01-01

    Sustainable drainage systems (SUDS) offer many benefits that traditional solutions do not. Traditional approaches are unable to offer a solution to the combined problems of flood management and water quality. Holistic consideration of the wide range of benefits from SUDS can result in advantages such as improved flood resilience and water quality enhancement through consideration of diffuse pollution sources. Using a geographical information system (GIS) approach, diffuse pollutant sources and opportunities for SUDS are easily identified. Consideration of potential SUDS locations results in source, site and regional controls, leading to improved water quality (to meet Water Framework Directive targets). The paper discusses two different applications of the tool. The first is where the pollutant of interest is known; in this case the outputs of the tool highlight and isolate the areas contributing the pollutants and suggest adequate SUDS measures to meet the required criteria. The second is where the tool identifies likely pollutants at a receiving location, and SUDS measures are proposed to reduce pollution with assessed efficiencies.

  3. Selection and application of microbial source tracking tools for water-quality investigations

    USGS Publications Warehouse

    Stoeckel, Donald M.

    2005-01-01

    Microbial source tracking (MST) is a complex process that includes many decision-making steps. Once a contamination problem has been defined, the potential user of MST tools must thoroughly consider study objectives before deciding upon a source identifier, a detection method, and an analytical approach to apply to the problem. Regardless of which MST protocol is chosen, underlying assumptions can affect the results and interpretation. It is crucial to incorporate tests of those assumptions in the study quality-control plan to help validate results and facilitate interpretation. Detailed descriptions of MST objectives, protocols, and assumptions are provided in this report to assist in selection and application of MST tools for water-quality investigations. Several case studies illustrate real-world applications of MST protocols over a range of settings, spatial scales, and types of contamination. Technical details of many available source identifiers and detection methods are included as appendixes. By use of this information, researchers should be able to formulate realistic expectations for the information that MST tools can provide and, where possible, successfully execute investigations to characterize sources of fecal contamination to resource waters.

  4. Modal control theory and application to aircraft lateral handling qualities design

    NASA Technical Reports Server (NTRS)

    Srinathkumar, S.

    1978-01-01

    A multivariable synthesis procedure based on eigenvalue/eigenvector assignment is reviewed and employed to develop a systematic design procedure to meet the lateral handling qualities design objectives of a fighter aircraft over a wide range of flight conditions. The closed-loop modal characterization developed provides significant insight into the design process and plays a pivotal role in the synthesis of robust feedback systems. The simplicity of the synthesis algorithm yields an efficient computer-aided interactive design tool for flight control system synthesis.
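
    A minimal sketch of the eigenvalue-assignment step, assuming full-state feedback of a lateral-directional model; the matrices and desired pole locations are hypothetical, and SciPy's place_poles stands in for the paper's synthesis algorithm:

```python
import numpy as np
from scipy.signal import place_poles

# Hypothetical lateral-directional model: states are sideslip, roll rate,
# yaw rate, bank angle; inputs are aileron and rudder deflections.
A = np.array([[-0.10,  0.00, -1.00, 0.10],
              [-4.00, -1.20,  0.80, 0.00],
              [ 2.00, -0.10, -0.30, 0.00],
              [ 0.00,  1.00,  0.00, 0.00]])
B = np.array([[0.00,  0.05],
              [8.00,  1.50],
              [0.40, -3.00],
              [0.00,  0.00]])

# Desired closed-loop modes: a well-damped dutch-roll pair plus faster
# roll and slow spiral modes (handling-qualities-driven choices).
poles = [-1.0 + 1.5j, -1.0 - 1.5j, -2.5, -0.5]

fsf = place_poles(A, B, poles)       # with 2 inputs, eigenvectors are shaped too
K = fsf.gain_matrix                  # feedback law: u = -K @ x
print(np.round(np.linalg.eigvals(A - B @ K), 3))   # verify placement
```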

  5. Cognitive screening tools for identification of dementia in illiterate and low-educated older adults, a systematic review and meta-analysis.

    PubMed

    Paddick, Stella-Maria; Gray, William K; McGuire, Jackie; Richardson, Jenny; Dotchin, Catherine; Walker, Richard W

    2017-06-01

    The majority of older adults with dementia live in low- and middle-income countries (LMICs). Illiteracy and low educational background are common in older LMIC populations, particularly in rural areas, and cognitive screening tools developed for this setting must reflect this. This study aimed to review published validation studies of cognitive screening tools for dementia in low-literacy settings in order to determine the most appropriate tools for use. A systematic search of major databases was conducted according to PRISMA guidelines. Validation studies of brief cognitive screening tests including illiterate participants or those with elementary education were eligible. Studies were quality assessed using the QUADAS-2 tool. Good or fair quality studies were included in a bivariate random-effects meta-analysis and a hierarchical summary receiver operating characteristic (HSROC) curve constructed. Forty-five eligible studies were quality assessed. A significant proportion utilized a case-control design, resulting in spectrum bias. The area under the ROC (AUROC) curve was 0.937 for community/low prevalence studies, 0.881 for clinic based/higher prevalence studies, and 0.869 for illiterate populations. For the Mini-Mental State Examination (MMSE) (and adaptations), the AUROC curve was 0.853. Numerous tools for assessment of cognitive impairment in low-literacy settings have been developed, and tools developed for use in high-income countries have also been validated in low-literacy settings. Most tools have been inadequately validated, with only the MMSE, the Cognitive Abilities Screening Instrument (CASI), Eurotest, and Fototest having more than one published good or fair quality study in an illiterate or low-literate setting. At present no screening test can be recommended.

  6. Multicriteria Gain Tuning for Rotorcraft Flight Controls (also entitled The Development of the Conduit Advanced Control System Design and Evaluation Interface with a Case Study Application Fly by Wire Helicopter Design)

    NASA Technical Reports Server (NTRS)

    Biezad, Daniel

    1997-01-01

    Handling qualities analysis and control law design would seem to be naturally complementary components of aircraft flight control system design; however, these two closely coupled disciplines are often not well integrated in practice. Handling qualities engineers and control system engineers may work in separate groups within an aircraft company. Flight control system engineers and handling quality specialists may come from different backgrounds and schooling and are often not aware of the other group's research. Thus, while the handling qualities specifications represent desired aircraft response characteristics, these are rarely incorporated directly in the control system design process. Instead, modern control system design techniques are based on servo-loop robustness specifications and simple representations of the desired control response. Comprehensive handling qualities analysis is often left until the end of the design cycle and performed as a check of the completed design for satisfactory performance. This can lead to costly redesign or less than satisfactory aircraft handling qualities when the flight testing phase is reached. The desire to integrate the fields of handling qualities and flight control systems led to the development of the CONDUIT system. This tool facilitates control system designs that achieve desired handling quality requirements and servo-loop specifications in a single design process. With CONDUIT, the control system engineer is now able to directly design control systems to meet the complete handling qualities specifications. CONDUIT allows the designer to retain a preferred control law structure, but then tunes the system parameters to meet the handling quality requirements.

  7. Continuous processing and the applications of online tools in pharmaceutical product manufacture: developments and examples.

    PubMed

    Ooi, Shing Ming; Sarkar, Srimanta; van Varenbergh, Griet; Schoeters, Kris; Heng, Paul Wan Sia

    2013-04-01

    Continuous processing and production in pharmaceutical manufacturing has received increased attention in recent years, mainly due to the industry's pressing need for more efficient, cost-effective processes and production, as well as regulatory facilitation. To achieve optimum product quality, the traditional trial-and-error method for the optimization of different process and formulation parameters is expensive and time-consuming. Real-time evaluation and control of product quality using an online process analyzer in continuous processing can provide high-quality production with very high throughput at low unit cost. This review focuses on continuous processing and the application of different real-time monitoring tools used in the pharmaceutical industry for continuous processing from powder to tablets.

  8. A new dataset validation system for the Planetary Science Archive

    NASA Astrophysics Data System (ADS)

    Manaud, N.; Zender, J.; Heather, D.; Martinez, S.

    2007-08-01

    The Planetary Science Archive (PSA) is the official archive for the Mars Express mission. It received its first data by the end of 2004. These data are delivered by the PI teams to the PSA team as datasets formatted in conformance with the Planetary Data System (PDS) standard. The PI teams are responsible for analyzing and calibrating the instrument data, for producing reduced and calibrated data, and for the scientific validation of these data. ESA is responsible for the long-term data archiving and distribution to the scientific community and must ensure, in this regard, that all archived products meet quality standards. To do so, an archive peer review is used to control the quality of the Mars Express science data archiving process; however, a full validation of the archive's content has been missing. An independent review board recently recommended that the completeness of the archive as well as the consistency of the delivered data be validated following well-defined procedures. A new validation software tool is being developed to complete the overall data quality control system functionality. This new tool aims to improve the quality of data and services provided to the scientific community through the PSA, and shall make it possible to track anomalies in, and control the completeness of, datasets. It shall ensure that PSA end-users: (1) can rely on the results of their queries, (2) will get data products that are suitable for scientific analysis, and (3) can find all science data acquired during a mission. We define dataset validation as the verification and assessment process that checks the dataset content against pre-defined top-level criteria, which represent the general characteristics of good-quality datasets. The dataset content that is checked includes the data and all types of information that are essential in the process of deriving scientific results, as well as those interfacing with the PSA database. The validation software tool is a multi-mission tool designed to give the user the flexibility to define and implement various types of validation criteria, to validate datasets iteratively and incrementally, and to generate validation reports.
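
    A minimal sketch of criterion-based dataset validation in this spirit; the criteria, keyword names, and dataset layout below are invented for illustration and are not the PSA tool's actual rules:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Criterion:
    """A named top-level check applied to a dataset (here a plain dict)."""
    name: str
    check: Callable[[dict], bool]

criteria = [
    Criterion("required PDS-style keywords present",
              lambda ds: {"MISSION_NAME", "TARGET_NAME"} <= ds.keys()),
    Criterion("every product referenced in the index exists",
              lambda ds: set(ds.get("index", [])) <= set(ds.get("products", []))),
]

def validate(dataset: dict) -> list:
    """Return a report listing every failed criterion (empty list = pass)."""
    return [f"FAIL: {c.name}" for c in criteria if not c.check(dataset)]

report = validate({"MISSION_NAME": "MARS EXPRESS", "TARGET_NAME": "MARS",
                   "index": ["p1", "p2"], "products": ["p1"]})
print("\n".join(report) if report else "dataset passes all criteria")
```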

  9. Eelgrass indicator deployment system (EIDS): A low tech tool for short-term evaluation of eelgrass response to water quality

    EPA Science Inventory

    Eelgrass is often considered a sentinel species that can be used as an indicator of water clarity and quality. I used the Eelgrass Indicator Deployment System (EIDS) in a series of short term experiments to evaluate eelgrass growth and survival at a decline and a control site in...

  10. Investigation of priorities in water quality management based on correlations and variations.

    PubMed

    Boyacıoğlu, Hülya; Gündogdu, Vildan; Boyacıoğlu, Hayal

    2013-04-15

    The development of water quality assessment strategies investigating spatial and temporal changes caused by natural and anthropogenic phenomena is an important tool in management practices. This paper used cluster analysis, the water quality index method, sensitivity analysis and canonical correlation analysis to investigate priorities in pollution control activities. Data sets representing 22 surface water quality parameters were subject to analysis. Results revealed that organic pollution was a serious threat to overall water quality in the region. In addition, oil and grease, lead and mercury were the critical variables violating the standard. In contrast to inorganic variables, organic and physical-inorganic chemical parameters were influenced by variations in physical conditions (discharge, temperature). This study showed that information produced from the variations and correlations in water quality data sets can be helpful for investigating priorities in water management activities. Moreover, statistical techniques and index methods are useful tools in the data-to-information transformation process. Copyright © 2013 Elsevier Ltd. All rights reserved.
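
    A minimal sketch of the water-quality-index idea used alongside the multivariate methods; the parameters, standards, and weights below are illustrative, not the study's:

```python
# Weighted-arithmetic water quality index: each parameter contributes a
# sub-index equal to the percentage of its permissible limit that the
# measured value consumes. All numbers below are made up.
measured = {"BOD": 6.0, "Pb": 0.02, "Hg": 0.0005, "oil_grease": 12.0}
standard = {"BOD": 3.0, "Pb": 0.01, "Hg": 0.001, "oil_grease": 10.0}
weight   = {"BOD": 0.4, "Pb": 0.25, "Hg": 0.25, "oil_grease": 0.1}

sub_index = {p: 100.0 * measured[p] / standard[p] for p in measured}
wqi = sum(weight[p] * sub_index[p] for p in measured) / sum(weight.values())

violations = sorted(p for p, q in sub_index.items() if q > 100.0)
print(f"WQI = {wqi:.0f}; parameters over standard: {violations}")
```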

  11. Sigma metrics as a tool for evaluating the performance of internal quality control in a clinical chemistry laboratory.

    PubMed

    Kumar, B Vinodh; Mohan, Thuthi

    2018-01-01

    Six Sigma is one of the most popular quality management system tools employed for process improvement. The Six Sigma methods are usually applied when the outcome of a process can be measured. This study was done to assess the performance of individual biochemical parameters on a sigma scale by calculating the sigma metrics for individual parameters, and to follow the Westgard guidelines for the appropriate Westgard rules and levels of internal quality control (IQC) that need to be processed to improve target analyte performance based on the sigma metrics. This is a retrospective study, and the data required for the study were extracted between July 2015 and June 2016 from a secondary care government hospital, Chennai. The data obtained for the study are the IQC coefficient of variation percentage (CV%) and the External Quality Assurance Scheme (EQAS) bias% for 16 biochemical parameters. For the level 1 IQC, four analytes (alkaline phosphatase, magnesium, triglyceride, and high-density lipoprotein cholesterol) showed an ideal performance of ≥6 sigma, and five analytes (urea, total bilirubin, albumin, cholesterol, and potassium) showed an average performance of <3 sigma; for the level 2 IQC, the same four analytes of level 1 showed a performance of ≥6 sigma, and four analytes (urea, albumin, cholesterol, and potassium) showed an average performance of <3 sigma. For all analytes below 6 sigma, the quality goal index (QGI) was <0.8, indicating that the area requiring improvement was imprecision, except for cholesterol, whose QGI >1.2 indicated inaccuracy. This study shows that sigma metrics is a good quality tool to assess the analytical performance of a clinical chemistry laboratory. Thus, sigma metric analysis provides a benchmark for the laboratory to design a protocol for IQC, address poor assay performance, and assess the efficiency of existing laboratory processes.
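
    The two formulas behind the study are compact enough to sketch: sigma = (TEa - |bias|) / CV and QGI = |bias| / (1.5 * CV), with QGI read as imprecision below 0.8 and inaccuracy above 1.2. The allowable-total-error (TEa), bias, and CV figures below are illustrative, not the study's data:

```python
# analyte: (TEa%, bias%, CV%) -- illustrative values only.
analytes = {
    "potassium":   (5.8,  2.0, 2.5),
    "cholesterol": (10.0, 4.0, 2.2),
    "ALP":         (30.0, 3.0, 4.0),
}

for name, (tea, bias, cv) in analytes.items():
    sigma = (tea - abs(bias)) / cv          # sigma metric
    qgi = abs(bias) / (1.5 * cv)            # quality goal index
    if sigma >= 6:
        action = "world class; minimal QC rules suffice"
    else:
        fault = ("imprecision" if qgi < 0.8 else
                 "inaccuracy" if qgi > 1.2 else "both")
        action = f"below 6 sigma; improve {fault}"
    print(f"{name}: sigma={sigma:.1f}, QGI={qgi:.2f} -> {action}")
```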

  12. Effects of light quality on the accumulation of phytochemicals in vegetables produced in controlled environments: a review.

    PubMed

    Bian, Zhong Hua; Yang, Qi Chang; Liu, Wen Ke

    2015-03-30

    Phytochemicals in vegetables are important for human health, and their biosynthesis, metabolism and accumulation are affected by environmental factors. Light condition (light quality, light intensity and photoperiod) is one of the most important environmental variables in regulating vegetable growth, development and phytochemical accumulation, particularly for vegetables produced in controlled environments. With the development of light-emitting diode (LED) technology, the regulation of light environments has become increasingly feasible for the provision of ideal light quality, intensity and photoperiod for protected facilities. In this review, the effects of light quality regulation on phytochemical accumulation in vegetables produced in controlled environments are identified, highlighting the research progress and advantages of LED technology as a light environment regulation tool for modifying phytochemical accumulation in vegetables. © 2014 Society of Chemical Industry.

  13. Tools for Local and Distributed Climate Data Access

    NASA Astrophysics Data System (ADS)

    Schweitzer, R.; O'Brien, K.; Burger, E. F.; Smith, K. M.; Manke, A. B.; Radhakrishnan, A.; Balaji, V.

    2017-12-01

    Last year we reported on our efforts to adapt existing tools to facilitate model development. During the lifecycle of a Climate Model Intercomparison Project (CMIP), data must be quality controlled before they can be published and studied. Like previous efforts, CMIP6 will produce an unprecedented volume of data. For an institution, modelling group or modeller, this volume of data is unmanageable without tools that organize and automate as many processes as possible. Even if a modelling group has tools for data and metadata management, it often falls on individuals to do the initial quality assessment for a model run with bespoke tools. Using individually crafted tools can lead to interruptions when project personnel change and may result in inconsistencies and duplication of effort across groups. This talk will expand on our experiences using available tools (Ferret/PyFerret, the Live Access Server, the GFDL Curator, the GFDL Model Development Database Interface and the THREDDS Data Server) to seamlessly automate the data assembly process and give users "one-click" access to a rich suite of web-based analysis and comparison tools. On the surface, this collection of tools appears well suited to the task, but our experience of the last year taught us that the data volume and distributed storage add a number of challenges in adapting the tools. Quality control and initial evaluation add their own set of challenges. We will discuss how we addressed the needs of QC researchers by expanding standard tools to include specialized plots, and how we leveraged the configurability of the tools to add specific user-defined analysis operations so they are available to everyone using the system. We also report on our efforts to overcome some of the technical barriers to wide adoption of the tools by providing pre-built containers that are easily deployed in virtual machine and cloud environments. Finally, we will offer some suggestions for added features, configuration options and improved robustness that can make future implementations of similar systems operate faster and more reliably. Solving these challenges for data sets distributed narrowly across networks and storage systems points the way to solving similar problems associated with sharing data distributed across institutions and continents.

  14. Electronic Nose for Quality Control of Colombian Coffee through the Detection of Defects in “Cup Tests”

    PubMed Central

    Rodríguez, Juan; Durán, Cristhian; Reyes, Adriana

    2010-01-01

    Electronic noses (ENs) are used for many applications, but we must emphasize the importance of their application to foodstuffs like coffee. This paper presents a research study on the analysis of Colombian coffee samples for the detection and classification of defects (i.e., using “Cup Tests”), which was conducted at the Almacafé quality control laboratory in Cúcuta, Colombia. The results obtained show that the application of an electronic nose called “A-NOSE” may be useful in the coffee industry for cupping tests. The results show that e-nose technology can be a useful tool for quality control to evaluate the excellence of the Colombian coffee produced by the National Federation of Coffee Growers. PMID:22315525

  16. The IEO Data Center Management System: Tools for quality control, analysis and access marine data

    NASA Astrophysics Data System (ADS)

    Casas, Antonia; Garcia, Maria Jesus; Nikouline, Andrei

    2010-05-01

    Since 1994, the Data Centre of the Spanish Oceanographic Institute (IEO) has been developing a system for archiving and quality control of oceanographic data. The work started within the European Marine Science & Technology Programme (MAST), when a consortium of several Mediterranean data centres began to work on the MEDATLAS project. Over the years, old software modules for MS-DOS were rewritten, improved and migrated to the Windows environment. Oceanographic data quality control now covers not only vertical profiles (mainly CTD and bottle observations) but also time series of currents and sea level observations. New powerful routines for analysis and graphic visualization were added. Data originally held in ASCII format were recently organized in an open-source MySQL database. Nowadays, the IEO, as part of the SeaDataNet infrastructure, has designed and developed a new information system, consistent with the ISO 19115 and SeaDataNet standards, in order to manage the large and diverse marine data and information originated in Spain by different sources, and to interoperate with SeaDataNet. The system works with data stored in ASCII files (MEDATLAS, ODV) as well as data stored in the relational database. The components of the system are: (1) MEDATLAS format and quality control, comprising QCDAMAR (Quality Control of Marine Data, the main set of tools for working with data presented as text files, which includes extended quality control (searching for duplicated cruises and profiles; checking date, position, ship velocity, constant profiles, spikes, density inversion, sounding, acceptable data and impossible regional values) as well as input/output filters) and QCMareas (a set of procedures for the quality control of tide gauge data according to the standard international sea level observing system, including checks for unexpected anomalies in the time series, interpolation, filtering, and computation of basic statistics and residuals); (2) DAMAR, a relational database (MySQL) designed to manage the wide variety of marine information, such as common vocabularies, catalogues (CSR & EDIOS), data and metadata; and (3) other tools for analysis and data management: Import_DB, a script to import data and metadata from the MEDATLAS ASCII files into the database; SelDamar/Selavi, an interface to the database for local and web access that allows selective retrievals applying user-defined criteria (geographical bounds, data responsible, cruises, platform, time periods, etc.) and also includes calculation of statistical reference values and plotting of original and mean profiles together with vertical interpolation; and ExtractDAMAR, a script to extract data archived in ASCII files that meet the criteria of a user request through the SelDamar interface and export them in ODV format, also performing unit conversion.

  17. Innovative Stormwater Quality Tools by SARA for Holistic Watershed Master Planning

    NASA Astrophysics Data System (ADS)

    Thomas, S. M.; Su, Y. C.; Hummel, P. R.

    2016-12-01

    Stormwater management strategies such as Best Management Practices (BMP) and Low-Impact Development (LID) have increasingly gained attention in urban runoff control and have become vital to holistic watershed master plans. These strategies can help address existing water quality impairments and support regulatory compliance, as well as guide planning and management of future development when substantial population growth and urbanization are projected to occur. However, past efforts have been limited to qualitative planning due to the lack of suitable tools for quantitative assessment. The San Antonio River Authority (SARA), with the assistance of Lockwood, Andrews & Newnam, Inc. (LAN) and AQUA TERRA Consultants (a division of RESPEC), developed comprehensive hydrodynamic and water quality models using the Hydrological Simulation Program-FORTRAN (HSPF) for several urban watersheds in the San Antonio River Basin. These models enabled watershed managers to look at water quality issues on a more refined temporal and spatial scale than the limited monitoring data allow. They also provided a means to locate and quantify potential water quality impairments and evaluate the effects of mitigation measures. To support the models, a suite of software tools was developed, including: 1) the SARA Timeseries Utility Tool, for managing and processing large model timeseries files; 2) the SARA Load Reduction Tool, to determine the load reductions needed to achieve screening levels for each modeled constituent on a sub-basin basis; and 3) the SARA Enhanced BMP Tool, to determine the optimal combination of BMP types and units needed to achieve the required load reductions. Using these SARA models and tools, water quality agencies and stormwater professionals can determine the optimal combinations of BMP/LID to accomplish their goals and save substantial stormwater infrastructure and management costs. The tools can also help regulators and permittees evaluate the feasibility of achieving compliance using BMP/LID. The project has gained national attention, being showcased in multiple newsletters, professional magazines, and conference presentations. The project also won the Texas American Council of Engineering Companies (ACEC) Gold Medal Award and the ACEC National Recognition Award in 2016.

  18. 19 CFR 10.877 - Direct costs of processing operations.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ..., supervisory, quality control, and similar personnel; (2) Tools, dies, molds, and other indirect materials, and... are not limited to: (1) Profit; and (2) General expenses of doing business that are either not...

  19. 19 CFR 10.814 - Direct costs of processing operations.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ..., supervisory, quality control, and similar personnel; (2) Tools, dies, molds, and other indirect materials, and... are not limited to: (1) Profit; and (2) General expenses of doing business that are either not...

  20. 19 CFR 10.774 - Direct costs of processing operations.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ..., supervisory, quality control, and similar personnel; (2) Tools, dies, molds, and other indirect materials, and... are not limited to: (1) Profit; and (2) General expenses of doing business that are either not...

  1. 19 CFR 10.814 - Direct costs of processing operations.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ..., supervisory, quality control, and similar personnel; (2) Tools, dies, molds, and other indirect materials, and... are not limited to: (1) Profit; and (2) General expenses of doing business that are either not...

  2. 19 CFR 10.774 - Direct costs of processing operations.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ..., supervisory, quality control, and similar personnel; (2) Tools, dies, molds, and other indirect materials, and... are not limited to: (1) Profit; and (2) General expenses of doing business that are either not...

  3. Reporting Quality Assessment of Randomized Controlled Trials Published in Nephrology Urology Monthly Journal.

    PubMed

    Mehrazmay, Alireza; Karambakhsh, Alireza; Salesi, Mahmood

    2015-07-01

    Randomized controlled trials (RCTs) are important tools for evidence-based health care decisions. It is, therefore, important that they be conducted and reported to the highest possible standards. The aim of this study was to evaluate the reporting quality of the RCTs published in Nephrology Urology Monthly Journal and to examine whether there was a change over time in reporting quality. The quality of each report was assessed using the Consolidated Standards of Reporting Trials (CONSORT) 2010 Statement checklist and a 5-point quality assessment instrument, the Jadad scale. Eighteen (14 Iranian and 4 non-Iranian) RCTs were published from 2012 to 2014 on topics including renal stones (16.6%), hemodialysis and transplantation (38.8%), and prostate conditions (11.1%). Interventions comprised surgery, drugs, and a teaching method in 7 (38%), 10 (55%), and 1 (5%) of the trials, respectively. According to the CONSORT checklist, the most weakly reported items were the registration number, identification as a randomized trial in the title, and the settings and locations where the data were collected. The mean Jadad score of the reports was 2.72 ± 1.36 (54% of the maximum possible total score). According to the Jadad and CONSORT scales, there was an increase in the quality of reporting from 2012 to 2014. This assessment shows low reporting quality scores. Training courses for researchers, use of standard reporting tools (e.g. the CONSORT 2010 Statement checklist), and consultation with methodologists could improve the quality of published RCTs.

  4. Handling Qualities Optimization for Rotorcraft Conceptual Design

    NASA Technical Reports Server (NTRS)

    Lawrence, Ben; Theodore, Colin R.; Berger, Tom

    2016-01-01

    Over the past decade, NASA, under a succession of rotary-wing programs, has been moving towards coupling multiple discipline analyses in a rigorous, consistent manner to evaluate rotorcraft conceptual designs. Handling qualities is one of the component analyses to be included in a future NASA Multidisciplinary Analysis and Optimization framework for conceptual design of VTOL aircraft. Similarly, the future vision for the capability of the Concept Design and Assessment Technology Area (CD&A-TA) of the U.S. Army Aviation Development Directorate also includes a handling qualities component. SIMPLI-FLYD is a tool jointly developed by NASA and the U.S. Army to perform modeling and analysis for the assessment of the flight dynamics and control aspects of the handling qualities of rotorcraft conceptual designs. An exploration of handling qualities analysis has been carried out using SIMPLI-FLYD in illustrative scenarios of a tiltrotor in forward flight and a single-main-rotor helicopter in hover. Using SIMPLI-FLYD and the conceptual design tool NDARC integrated into a single process, the effects of variations of design parameters such as tail or rotor size were evaluated in the form of margins to fixed- and rotary-wing handling qualities metrics as well as vehicle empty weight. The handling qualities design margins are shown to vary across the flight envelope due to both changing flight dynamic and control characteristics and changing handling qualities specification requirements. The current SIMPLI-FLYD capability and future developments are discussed in the context of an overall rotorcraft conceptual design process.

  5. DASHBOARDS & CONTROL CHARTS EXPERIENCES IN IMPROVING SAFETY AT HANFORD WASHINGTON

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    PREVETTE, S.S.

    2006-02-27

    The aim of this paper is to demonstrate the integration of safety methodology, quality tools, leadership, and teamwork at Hanford and their significant positive impact on the safe performance of work. Dashboards, leading indicators, control charts, Pareto charts, Dr. W. Edwards Deming's Red Bead Experiment, and Dr. Deming's System of Profound Knowledge have been the principal tools and theory of an integrated management system. Coupled with involved leadership and teamwork, they have led to significant improvements in worker safety and protection, and environmental restoration at one of the nation's largest nuclear cleanup sites.
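
    A minimal sketch of the individuals (XmR) control chart at the heart of such dashboards, using the standard moving-range limits; the monthly event counts are invented:

```python
import statistics

# Hypothetical count of recordable safety events per month.
data = [4, 6, 5, 7, 5, 4, 6, 5, 3, 6, 5, 4, 12, 5, 4]

# Short-term variation from the mean moving range; 1.128 is the d2
# bias-correction constant for subgroups of size 2 (so 3/d2 = 2.66).
mr = [abs(b - a) for a, b in zip(data, data[1:])]
center = statistics.mean(data)
sigma_hat = statistics.mean(mr) / 1.128
ucl = center + 3 * sigma_hat
lcl = max(0.0, center - 3 * sigma_hat)

for month, x in enumerate(data, start=1):
    flag = "  <-- special cause, investigate" if not (lcl <= x <= ucl) else ""
    print(f"month {month:2d}: {x}{flag}")
print(f"CL={center:.2f}  UCL={ucl:.2f}  LCL={lcl:.2f}")
```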

  6. Characterizing SWCNT Dispersion in Polymer Composites

    NASA Technical Reports Server (NTRS)

    Lillehei, Peter T.; Kim, Jae-Woo; Gibbons, Luke; Park, Cheol

    2007-01-01

    The new wave of single wall carbon nanotube (SWCNT) infused composites will yield structurally sound multifunctional nanomaterials. The SWCNT network requires thorough dispersion within the polymer matrix in order to maximize the benefits of the nanomaterial. However, before any nanomaterials can be used in aerospace applications, a means of quality assurance and quality control must be certified. Quality control certification requires a means of quantification; the measurement protocol, however, first mandates a method of seeing the dispersion. We describe here the new tools that we have developed and implemented, first, to be able to see carbon nanotubes in polymers and, second, to measure or quantify the dispersion of the nanotubes.

  7. [Methodological quality and reporting quality evaluation of randomized controlled trials published in China Journal of Chinese Materia Medica].

    PubMed

    Yu, Dan-Dan; Xie, Yan-Ming; Liao, Xing; Zhi, Ying-Jie; Jiang, Jun-Jie; Chen, Wei

    2018-02-01

    To evaluate the methodological quality and reporting quality of randomized controlled trials (RCTs) published in China Journal of Chinese Materia Medica, we searched CNKI and the journal's webpage to collect RCTs published since the establishment of the journal. The Cochrane risk of bias assessment tool was used to evaluate the methodological quality of the RCTs, and the CONSORT 2010 checklist was adopted as the reporting quality evaluation tool. Finally, 184 RCTs were included and evaluated methodologically, of which 97 RCTs were also evaluated for reporting quality. For the methodological evaluation, 62 trials (33.70%) reported random sequence generation; 9 (4.89%) reported allocation concealment; 25 (13.59%) adopted a method of blinding; 30 (16.30%) reported the number of patients withdrawing, dropping out, or lost to follow-up; 2 trials (1.09%) reported trial registration; none reported a trial protocol; and only 8 (4.35%) reported the sample size estimation in detail. For the reporting quality appraisal, 3 of the 25 reporting items were evaluated as high quality (abstract, participant eligibility criteria, and statistical methods); 4 items were of medium quality (purpose, intervention, random sequence method, and data collection sites and locations); 9 items were of low quality (title, background, random sequence type, allocation concealment, blinding, recruitment of subjects, baseline data, harms, and funding); and the remaining items were of extremely low quality (compliance rate of the reporting item <10%). On the whole, the methodological and reporting quality of RCTs published in the journal is generally low. Further improvement in both the methodological and reporting quality of RCTs of traditional Chinese medicine is warranted. It is recommended that international standards and procedures for RCT design be strictly followed to conduct high-quality trials. At the same time, to improve reporting quality, the CONSORT standards should be adopted in the preparation of research reports and submissions. Copyright© by the Chinese Pharmaceutical Association.

  8. Single-case synthesis tools I: Comparing tools to evaluate SCD quality and rigor.

    PubMed

    Zimmerman, Kathleen N; Ledford, Jennifer R; Severini, Katherine E; Pustejovsky, James E; Barton, Erin E; Lloyd, Blair P

    2018-03-03

    Tools for evaluating the quality and rigor of single case research designs (SCD) are often used when conducting SCD syntheses. Preferred components include evaluations of design features related to the internal validity of SCD to obtain quality and/or rigor ratings. Three tools for evaluating the quality and rigor of SCD (Council for Exceptional Children, What Works Clearinghouse, and Single-Case Analysis and Design Framework) were compared to determine if conclusions regarding the effectiveness of antecedent sensory-based interventions for young children changed based on choice of quality evaluation tool. Evaluation of SCD quality differed across tools, suggesting selection of quality evaluation tools impacts evaluation findings. Suggestions for selecting an appropriate quality and rigor assessment tool are provided and across-tool conclusions are drawn regarding the quality and rigor of studies. Finally, authors provide guidance for using quality evaluations in conjunction with outcome analyses when conducting syntheses of interventions evaluated in the context of SCD. Copyright © 2018 Elsevier Ltd. All rights reserved.

  9. Decision support tool for diagnosing the source of variation

    NASA Astrophysics Data System (ADS)

    Masood, Ibrahim; Azrul Azhad Haizan, Mohamad; Norbaya Jumali, Siti; Ghazali, Farah Najihah Mohd; Razali, Hazlin Syafinaz Md; Shahir Yahya, Mohd; Azlan, Mohd Azwir bin

    2017-08-01

    Identifying the source of unnatural variation (SOV) in a manufacturing process is essential for quality control. Shewhart control chart patterns (CCPs) are commonly used to monitor SOV. However, proper interpretation of CCPs and their associated SOV requires a highly skilled industrial practitioner, and a lack of process engineering knowledge can lead to erroneous corrective action. The objective of this study is to design the operating procedures of a computerized decision support tool (DST) for process diagnosis. The DST is embedded in a CCP recognition scheme. The design methodology involves analysis of the relationships among geometrical features, the manufacturing process and CCPs. The DST contains information about CCPs and their possible root-cause errors, together with descriptions of SOV phenomena such as process deterioration due to tool bluntness, tool offset, loading error, and changes in material hardness. The DST will be useful to industrial practitioners for effective troubleshooting.
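
    A minimal sketch of one rule such a DST could embed: a run of consecutive increases on the chart maps to a trend pattern and a candidate diagnosis. The run length, measurements, and diagnosis text are illustrative:

```python
def diagnose(points, run=7):
    """Map a sustained trend in the last `run` points to a candidate SOV."""
    recent = points[-run:]
    steps = [b - a for a, b in zip(recent, recent[1:])]
    if all(s > 0 for s in steps):
        return "upward trend -> check tool bluntness / tool offset"
    if all(s < 0 for s in steps):
        return "downward trend -> check material hardness change / loading error"
    return f"no trend pattern in last {run} points"

# Hypothetical measured dimension (mm) of consecutive machined parts.
dimension = [10.01, 9.99, 10.00, 10.02, 10.03, 10.05, 10.06, 10.08, 10.10]
print(diagnose(dimension))
```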

  10. Is soil quality monitoring an effective tool in protecting consumers of agricultural crops from cadmium soil contamination? A case from the Silesia region (Poland).

    PubMed

    Piekut, Agata; Baranowska, Renata; Marchwińska-Wyrwał, Ewa; Ćwieląg-Drabek, Małgorzata; Hajok, Ilona; Dziubanek, Grzegorz; Grochowska-Niedworok, Elżbieta

    2017-12-16

    The monitoring of soil quality should be a control tool used to reduce the adverse health effects arising from exposure to toxic chemicals in soil through their absorption by cultivated crops. The aim of the study was to evaluate the effectiveness of the monitoring and control system for soil quality in Poland, in terms of consumer safety, for agricultural plants cultivated in areas with known serious cadmium contamination, such as Silesia Province. To achieve this objective, the cadmium contents of soils and vegetables in the Silesia administrative area were examined. The obtained results were compared with the results of soil contamination from the quality monitoring of arable soils in Poland. The studies show significant exceedances of the permissible values of cadmium in soil samples and in the vegetables cultivated on that soil. The threat to consumer health is a valid concern, although this threat was not indicated by the results of the national monitoring of soil quality. The results also indicated an unequal distribution of risk to consumers from contaminated soil. Moreover, monitoring systems should be designed at the local or regional scale to guarantee the safety of consumers of edible plants cultivated in areas contaminated with cadmium.

  11. The PEDro scale had acceptably high convergent validity, construct validity, and interrater reliability in evaluating methodological quality of pharmaceutical trials.

    PubMed

    Yamato, Tie Parma; Maher, Chris; Koes, Bart; Moseley, Anne

    2017-06-01

    The Physiotherapy Evidence Database (PEDro) scale has been widely used to investigate methodological quality in physiotherapy randomized controlled trials; however, its validity has not been tested for pharmaceutical trials. The aim of this study was to investigate the validity and interrater reliability of the PEDro scale for pharmaceutical trials. The reliability was also examined for the Cochrane Back and Neck (CBN) Group risk of bias tool. This is a secondary analysis of data from a previous study. We considered randomized placebo-controlled trials evaluating any pain medication for chronic spinal pain or osteoarthritis. Convergent validity was evaluated by correlating the PEDro score with the summary score of the CBN risk of bias tool. Construct validity was tested using a linear regression analysis to determine the degree to which the total PEDro score is associated with treatment effect sizes, journal impact factor, and the summary score of the CBN risk of bias tool. Interrater reliability was estimated using the prevalence- and bias-adjusted kappa (PABAK) coefficient and 95% confidence interval (CI) for the PEDro scale and the CBN risk of bias tool. Fifty-three trials were included, with 91 treatment effect sizes included in the analyses. The correlation between the PEDro scale and the CBN risk of bias tool was 0.83 (95% CI 0.76-0.88) after adjusting for reliability, indicating strong convergence. The PEDro score was inversely associated with effect sizes, significantly associated with the summary score of the CBN risk of bias tool, and not associated with journal impact factor. The interrater reliability was at least substantial for most items of both the PEDro scale and the CBN risk of bias tool (>0.60). The intraclass correlation coefficient for the PEDro score was 0.80 (95% CI 0.68-0.88), and for the CBN risk of bias tool, 0.81 (95% CI 0.69-0.88). There was evidence of convergent and construct validity for the PEDro scale when used to evaluate the methodological quality of pharmacological trials. Both risk of bias tools have acceptably high interrater reliability. Copyright © 2017 Elsevier Inc. All rights reserved.
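
    The prevalence- and bias-adjusted kappa used for the item-level reliability reduces, for two raters and a binary item, to 2*Po - 1, where Po is the observed agreement. A minimal sketch with invented ratings:

```python
# Two raters scoring the same 10 trials on one binary quality item
# (1 = criterion satisfied); the ratings are made up.
rater1 = [1, 1, 0, 1, 1, 0, 1, 1, 1, 0]
rater2 = [1, 1, 0, 1, 0, 0, 1, 1, 1, 1]

po = sum(a == b for a, b in zip(rater1, rater2)) / len(rater1)
pabak = 2 * po - 1   # general form (k*Po - 1)/(k - 1) with k = 2 categories
print(f"observed agreement = {po:.2f}, PABAK = {pabak:.2f}")  # 0.80, 0.60
```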

  12. Voice over Internet Protocol (VoIP) Technology as a Global Learning Tool: Information Systems Success and Control Belief Perspectives

    ERIC Educational Resources Information Center

    Chen, Charlie C.; Vannoy, Sandra

    2013-01-01

    Voice over Internet Protocol (VoIP)-enabled online learning service providers struggle with high attrition rates and low customer loyalty despite VoIP's high degree of system fit for online global learning applications. Effective solutions to this prevalent problem rely on an understanding of system quality, information quality, and…

  13. Using COPE To Improve Quality of Care: The Experience of the Family Planning Association of Kenya.

    ERIC Educational Resources Information Center

    Bradley, Janet

    1998-01-01

    COPE (Client-Oriented, Provider-Efficient) methodology, a self-assessment tool that has been used in 35 countries around the world, was used to improve the quality of care in family planning clinics in Kenya. COPE involves a process that legitimately invests power with providers and clinic-level staff. It gives providers more control over their…

  14. [Video-based self-control in surgical teaching. A new tool in a new concept].

    PubMed

    Dahmen, U; Sänger, C; Wurst, C; Arlt, J; Wei, W; Dondorf, F; Richter, B; Settmacher, U; Dirsch, O

    2013-10-01

    Image- and video-based result and process control are essential tools of a new teaching concept for conveying surgical skills. The new teaching concept integrates approved teaching principles and new media. Every performance of an exercise is videotaped and the result photographically recorded, making the quality of both process and result accessible to analysis by teacher and learner. The learners are instructed to perform a criteria-based self-analysis of their own video and image material. The new learning concept has so far been successfully applied in seven rounds of the newly designed modular class "Intensivkurs Chirurgische Techniken" (intensive training in surgical techniques). Result documentation and analysis via digital images was completed by almost every student, and the quality of the results was high. Interestingly, the result quality did not correlate with the time needed for the exercise. The training success had a lasting effect. The new and elaborate concept improves the quality of teaching. In the long run, training students according to this concept before they perform tasks in the operating theater should free up resources for patient care; these resources should be allocated to further refining innovative teaching concepts.

  15. Advanced manufacturing development of a composite empennage component for L-1011 aircraft

    NASA Technical Reports Server (NTRS)

    Alva, T.; Henkel, J.; Johnson, R.; Carll, B.; Jackson, A.; Mosesian, B.; Brozovic, R.; Obrien, R.; Eudaily, R.

    1982-01-01

    This is the final report of technical work conducted during the fourth phase of a multiphase program with the objective of the design, development and flight evaluation of an advanced composite empennage component manufactured in a production environment at a cost competitive with that of its metal counterpart, and at a weight savings of at least 20 percent. The empennage component selected for this program is the vertical fin box of the L-1011 aircraft. The box structure extends from the fuselage production joint to the tip rib and includes front and rear spars. During Phase 4 of the program, production-quality tooling was designed and manufactured to produce three sets of covers, ribs, spars, miscellaneous parts, and subassemblies to assemble three complete ACVF units. Recurring and nonrecurring cost data were compiled and documented in the updated producibility/design-to-cost plan. Nondestructive inspections, quality control tests, and quality acceptance tests were performed in accordance with the quality assurance plan and the structural integrity control plan. Records were maintained to provide traceability of material and parts throughout the manufacturing development phase. It was also determined that additional tooling would not be required to support the current and projected L-1011 production rate.

  16. Numerical Implementation of Indicators and Statistical Control Tools in Monitoring and Evaluating CACEI-ISO Indicators of Study Program in Industrial Process by Systematization

    ERIC Educational Resources Information Center

    Ayala, Gabriela Cota; Real, Francia Angélica Karlos; Ivan, Ramirez Alvarado Edqar

    2016-01-01

    The research was conducted to determine whether the industrial processes study program of the Technological University of Chihuahua, one year after it was certified by CACEI, continues to achieve the established indicators and ISO 9001:2008 requirements by implementing quality tools; monitoring of essential indicators is determined, flow charts are…

  17. Measuring the quality of infection control in Dutch nursing homes using a standardized method; the Infection prevention RIsk Scan (IRIS)

    PubMed Central

    2014-01-01

    Background We developed a standardised method to assess the quality of infection control in Dutch nursing homes (NHs), based on a cross-sectional survey that visualises the results. The method was called the Infection prevention RIsk Scan (IRIS). We tested the applicability of this new tool in a multicentre surveillance executed in June and July 2012. Methods The IRIS includes two patient outcome variables, i.e. the prevalence of healthcare-associated infections (HAI) and rectal carriage of extended-spectrum beta-lactamase-producing Enterobacteriaceae (ESBL-E); two patient-related risk factors, i.e. use of medical devices and antimicrobial therapy; and three ward-related risk factors, i.e. environmental contamination, availability of local guidelines, and shortcomings in infection prevention preconditions. Results were categorised as low, intermediate or high risk and presented in an easy-to-read graphic risk spider plot, which was given as feedback to the management and healthcare workers of the NHs. Results Large differences were found in most of the variables across the different NHs. Common shortcomings were the availability of infection control guidelines and the level of environmental cleaning. The most striking differences were observed in the prevalence of ESBL carriage, which ranged from zero to 20.6% (p < 0.001). Conclusions The IRIS provided a rapid and easy-to-understand assessment of the infection control situation of the participating NHs. The results can be used to improve the quality of infection control based on the specific needs of an NH, but the method needs further validation in future studies. Repeated measurement can determine the effectiveness of interventions. This makes the IRIS a useful tool for quality systems. PMID:25243067

  18. The Cluster-Randomized BRIGHT Trial: Proactive Case Finding for Community-Dwelling Older Adults

    PubMed Central

    Kerse, Ngaire; McLean, Chris; Moyes, Simon A.; Peri, Kathy; Ng, Terence; Wilkinson-Meyers, Laura; Brown, Paul; Latham, Nancy; Connolly, Martin

    2014-01-01

    PURPOSE People are now living longer, but disability may affect the quality of those additional years of life. We undertook a trial to assess whether case finding reduces disability among older primary care patients. METHODS We conducted a cluster-randomized trial of the Brief Risk Identification Geriatric Health Tool (BRIGHT) among 60 primary care practices in New Zealand, assigning them to an intervention or control group. Intervention practices sent a BRIGHT screening tool to older adults every birthday; those with a score of 3 or higher were referred to regional geriatric services for assessment and, if needed, service provision. Control practices provided usual care. Main outcomes, assessed in blinded fashion, were residential care placement and hospitalization, and secondary outcomes were disability, assessed with Nottingham Extended Activities of Daily Living Scale (NEADL), and quality of life, assessed with the World Health Organization Quality of Life scale, abbreviated version (WHOQOL-BREF). RESULTS All 8,308 community-dwelling patients aged 75 years and older were approached; 3,893 (47%) participated, of whom 3,010 (77%) completed the trial. Their mean age was 80.3 (SD 4.5) years, and 55% were women. Overall, 88% of the intervention group returned a BRIGHT tool; 549 patients were referred. After 36 months, patients in the intervention group were more likely than those in the control group to have been placed in residential care: 8.4% vs 6.2% (hazard ratio = 1.32; 95% CI, 1.04–1.68; P = .02). Intervention patients had smaller declines in mean scores for physical health-related quality of life (1.6 vs 2.9 points, P = .007) and psychological health-related quality of life (1.1 vs 2.4 points, P = .005). Hospitalization, disability, and use of services did not differ between groups, however. CONCLUSIONS Our case-finding strategy was effective in increasing identification of older adults with disability, but there was little evidence of improved outcomes. Further research could trial stronger primary care integration strategies. PMID:25384813

  19. Selected Aspects Of The Risk In The Supply Chain In Context Of The Supplier Quality Management

    NASA Astrophysics Data System (ADS)

    Koblen, Ivan; Lestyánszka Škůrková, Katarína

    2015-06-01

    The introductory part of the paper underlines the importance of "risk-based thinking" in the quality management system (QMS) and of risk in the supply chain as a principal part of the QMS. After introducing the key terms, the authors focus on the principal part of the article: an explanation of the external and internal supply chain risks and the main factors concerning supply risks, demand risks and environmental risks (the cardinal types of external supply chain risks), as well as manufacturing and process risks and network/planning and control risks (the most important types of internal supply chain risks). The authors describe selected supply chain risk management tools, especially those linked to the appropriate use of quality management tools.

  20. User’s manual for the Automated Data Assurance and Management application developed for quality control of Everglades Depth Estimation Network water-level data

    USGS Publications Warehouse

    Petkewich, Matthew D.; Daamen, Ruby C.; Roehl, Edwin A.; Conrads, Paul

    2016-09-29

    The generation of Everglades Depth Estimation Network (EDEN) daily water-level and water-depth maps is dependent on high-quality real-time data from over 240 water-level stations. To increase the accuracy of the daily water-surface maps, the Automated Data Assurance and Management (ADAM) tool was created by the U.S. Geological Survey as part of Greater Everglades Priority Ecosystems Science. The ADAM tool is used to provide accurate quality-assurance review of the real-time data from the EDEN network and allows estimation or replacement of missing or erroneous data. This user’s manual describes how to install and operate the ADAM software; the file structure and operation of the software are explained using examples.

  1. A low-cost sensing system for cooperative air quality monitoring in urban areas.

    PubMed

    Brienza, Simone; Galli, Andrea; Anastasi, Giuseppe; Bruschi, Paolo

    2015-05-26

    Air quality in urban areas is a very important topic, as it closely affects the health of citizens. Recent studies highlight that exposure to polluted air can increase the incidence of diseases and deteriorate the quality of life. Hence, it is necessary to develop tools for real-time air quality monitoring, so as to allow appropriate and timely decisions. In this paper, we present uSense, a low-cost cooperative monitoring tool that provides real-time knowledge of the concentrations of polluting gases in various areas of a city. Specifically, users monitor the areas of their interest by deploying low-cost and low-power sensor nodes, and they can share the collected data following a social networking approach. uSense has been tested through in-field experimentation performed in different areas of a city. The obtained results are in line with those provided by the local environmental control authority and show that uSense can be profitably used for air quality monitoring.

  2. Managing Epilepsy Well: Emerging e-Tools for epilepsy self-management.

    PubMed

    Shegog, Ross; Bamps, Yvan A; Patel, Archna; Kakacek, Jody; Escoffery, Cam; Johnson, Erica K; Ilozumba, Ukwuoma O

    2013-10-01

    The Managing Epilepsy Well (MEW) Network was established in 2007 by the Centers for Disease Control and Prevention Epilepsy Program to expand epilepsy self-management research. The network has employed collaborative research strategies to develop, test, and disseminate evidence-based, community-based, and e-Health interventions (e-Tools) for epilepsy self-management for people with epilepsy, caregivers, and health-care providers. Since its inception, MEW Network collaborators have conducted formative studies (n=7) investigating the potential of e-Health to support epilepsy self-management and intervention studies evaluating e-Tools (n=5). The MEW e-Tools (the MEW website, WebEase, UPLIFT, MINDSET, and PEARLS online training) and affiliated e-Tools (Texting 4 Control) are designed to complement self-management practices in each phase of the epilepsy care continuum. These tools exemplify a concerted research agenda, shared methodological principles and models for epilepsy self-management, and a communal knowledge base for implementing e-Health to improve quality of life for people with epilepsy. © 2013.

  3. The use of a quartz crystal microbalance as an analytical tool to monitor particle/surface and particle/particle interactions under dry ambient and pressurized conditions: a study using common inhaler components.

    PubMed

    Turner, N W; Bloxham, M; Piletsky, S A; Whitcombe, M J; Chianella, I

    2016-12-19

    Metered dose inhalers (MDIs) and multidose powder inhalers (MDPIs) are commonly used for the treatment of chronic obstructive pulmonary disease and asthma. Currently, analytical tools to monitor particle/particle and particle/surface interactions within MDIs and MDPIs at the macro scale do not exist. A simple tool capable of measuring such interactions would ultimately enable quality control of MDIs and MDPIs, producing remarkable benefits for the pharmaceutical industry and the users of inhalers. In this paper, we have investigated whether a quartz crystal microbalance (QCM) could become such a tool. A QCM was used to measure particle/particle and particle/surface interactions on the macro scale by adding small amounts of MDPI components, in powder form, into a gas stream; the subsequent interactions with materials on the surface of the QCM sensor were analyzed. Following this, the sensor was used to measure fluticasone propionate, a typical MDI active ingredient, in a pressurized gas system to assess its interactions with different surfaces under conditions mimicking the manufacturing process. In both types of experiments the QCM was capable of discriminating the interactions of different components and surfaces. The results demonstrate that the QCM is a suitable platform for monitoring macro-scale interactions and could become a tool for quality control of inhalers.
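
    The record does not give the calibration used, but the Sauerbrey relation is the standard way a QCM converts a frequency shift into deposited mass; a minimal sketch with standard AT-cut quartz constants and an invented frequency shift:

```python
import math

f0 = 5.0e6      # fundamental resonant frequency (Hz)
area = 0.4      # active electrode area (cm^2), illustrative
rho_q = 2.648   # quartz density (g/cm^3)
mu_q = 2.947e11 # AT-cut quartz shear modulus (g cm^-1 s^-2)

def mass_per_area(delta_f):
    """Sauerbrey: mass added per unit area (g/cm^2) for a shift delta_f (Hz)."""
    return -delta_f * math.sqrt(rho_q * mu_q) / (2.0 * f0 ** 2)

# Invented example: a 25 Hz drop after dosing powder into the gas stream.
mass_ng = mass_per_area(-25.0) * area * 1e9
print(f"adsorbed mass ~ {mass_ng:.0f} ng")   # ~177 ng
```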

  4. Electronic tools to support medication reconciliation: a systematic review.

    PubMed

    Marien, Sophie; Krug, Bruno; Spinewine, Anne

    2017-01-01

    Medication reconciliation (MedRec) is essential for reducing patient harm caused by medication discrepancies across care transitions. Electronic support has been described as a promising approach to moving MedRec forward. We systematically reviewed the evidence about electronic tools that support MedRec, by (a) identifying tools; (b) summarizing their characteristics with regard to context, tool, implementation, and evaluation; and (c) summarizing key messages for successful development and implementation. We searched PubMed, the Cumulative Index to Nursing and Allied Health Literature, Embase, PsycINFO, and the Cochrane Library, and identified additional reports from reference lists, reviews, and patent databases. Reports were included if the electronic tool supported medication history taking and the identification and resolution of medication discrepancies. Two researchers independently selected studies, evaluated the quality of reporting, and extracted data. Eighteen reports covering 11 tools were included. There were eight quality improvement projects, five observational effectiveness studies, three randomized controlled trials (RCTs) or RCT protocols (ie, descriptions of RCTs in progress), and two patents. All tools were developed in academic environments in North America. Most used electronic data from multiple sources and partially implemented functionalities considered to be important. Relevant information on functionalities and implementation features was frequently missing. Evaluations mainly focused on usability, adherence, and user satisfaction. One RCT evaluated the effect on potential adverse drug events. Successful implementation of electronic tools to support MedRec requires favorable context, properly designed tools, and attention to implementation features. Future research is needed to evaluate the effect of these tools on the quality and safety of healthcare. © The Author 2016. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  5. Feasibility of personalised remote long-term follow-up of people with cochlear implants: a randomised controlled trial.

    PubMed

    Cullington, Helen; Kitterick, Padraig; Weal, Mark; Margol-Gromada, Magdalena

    2018-04-20

    Introduction: Substantial resources are required to provide lifelong postoperative care to people with cochlear implants. Most patients visit the clinic annually. We introduced a person-centred remote follow-up pathway, giving patients telemedicine tools to use at home so they would only visit the centre when intervention was required. Objectives: To assess the feasibility of comparing a remote care pathway with the standard pathway in adults using cochlear implants. Design: Two-arm randomised controlled trial. Randomisation used a minimisation approach, controlling for potential confounding factors. Participant blinding was not possible, but baseline measures occurred before allocation. Setting: University of Southampton Auditory Implant Service: provider of National Health Service care. Participants: 60 adults who had used cochlear implants for at least 6 months. Interventions: Control group (n=30) followed usual care pathway. Remote care group (n=30) received care remotely for 6 months incorporating: home hearing in noise test, online support tool and self-adjustment of device (only 10 had compatible equipment). Main outcome measures: Primary: change in patient activation, measured using the Patient Activation Measure. Secondary: change in hearing and quality of life; qualitative feedback from patients and clinicians. Results: One participant in the remote care group dropped out. The remote care group showed a greater increase in patient activation than the control group. Changes in hearing differed between the groups. The remote care group improved on the Triple Digit Test hearing test; the control group perceived their hearing was worse on the Speech, Spatial and Qualities of Hearing Scale questionnaire. Quality of life remained unchanged in both groups. Patients and clinicians were generally positive about remote care tools and wanted to continue. Conclusions: Adults with cochlear implants were willing to be randomised and complied with the protocol. Personalised remote care for long-term follow-up is feasible and acceptable, leading to more empowered patients. Trial registration number: ISRCTN14644286. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  6. Feasibility of personalised remote long-term follow-up of people with cochlear implants: a randomised controlled trial

    PubMed Central

    Kitterick, Padraig; Weal, Mark; Margol-Gromada, Magdalena

    2018-01-01

    Introduction Substantial resources are required to provide lifelong postoperative care to people with cochlear implants. Most patients visit the clinic annually. We introduced a person-centred remote follow-up pathway, giving patients telemedicine tools to use at home so they would only visit the centre when intervention was required. Objectives To assess the feasibility of comparing a remote care pathway with the standard pathway in adults using cochlear implants. Design Two-arm randomised controlled trial. Randomisation used a minimisation approach, controlling for potential confounding factors. Participant blinding was not possible, but baseline measures occurred before allocation. Setting University of Southampton Auditory Implant Service: provider of National Health Service care. Participants 60 adults who had used cochlear implants for at least 6 months. Interventions Control group (n=30) followed usual care pathway. Remote care group (n=30) received care remotely for 6 months incorporating: home hearing in noise test, online support tool and self-adjustment of device (only 10 had compatible equipment). Main outcome measures Primary: change in patient activation; measured using the Patient Activation Measure. Secondary: change in hearing and quality of life; qualitative feedback from patients and clinicians. Results One participant in the remote care group dropped out. The remote care group showed a greater increase in patient activation than the control group. Changes in hearing differed between the groups. The remote care group improved on the Triple Digit Test hearing test; the control group perceived their hearing was worse on the Speech, Spatial and Qualities of Hearing Scale questionnaire. Quality of life remained unchanged in both groups. Patients and clinicians were generally positive about remote care tools and wanted to continue. Conclusions Adults with cochlear implants were willing to be randomised and complied with the protocol. Personalised remote care for long-term follow-up is feasible and acceptable, leading to more empowered patients. Trial registration number ISRCTN14644286. PMID:29678970

  7. Process monitoring and visualization solutions for hot-melt extrusion: a review.

    PubMed

    Saerens, Lien; Vervaet, Chris; Remon, Jean Paul; De Beer, Thomas

    2014-02-01

    Hot-melt extrusion (HME) is applied as a continuous pharmaceutical manufacturing process for the production of a variety of dosage forms and formulations. To ensure the continuity of this process, the quality of the extrudates must be assessed continuously during manufacturing. The objective of this review is to provide an overview and evaluation of the available process analytical techniques which can be applied in hot-melt extrusion. Pharmaceutical extruders are equipped with traditional (univariate) process monitoring tools, observing barrel and die temperatures, throughput, screw speed, torque, drive amperage, melt pressure and melt temperature. The relevance of several spectroscopic process analytical techniques for monitoring and control of pharmaceutical HME has been explored recently. Nevertheless, many other sensors visualizing HME and measuring diverse critical product and process parameters with potential use in pharmaceutical extrusion are available, and were thoroughly studied in polymer extrusion. The implementation of process analytical tools in HME serves two purposes: (1) improving process understanding by monitoring and visualizing the material behaviour and (2) monitoring and analysing critical product and process parameters for process control, allowing a desired process state to be maintained and guaranteeing the quality of the end product. This review is the first to provide an evaluation of the process analytical tools applied for pharmaceutical HME monitoring and control, and discusses techniques that have been used in polymer extrusion that have potential for monitoring and control of pharmaceutical HME. © 2013 Royal Pharmaceutical Society.

  8. Development of a practical approach to expert elicitation for randomised controlled trials with missing health outcomes: Application to the IMPROVE trial

    PubMed Central

    Mason, Alexina J; Gomes, Manuel; Grieve, Richard; Ulug, Pinar; Powell, Janet T; Carpenter, James

    2017-01-01

    Background/aims: The analyses of randomised controlled trials with missing data typically assume that, after conditioning on the observed data, the probability of missing data does not depend on the patient’s outcome, and so the data are ‘missing at random’. This assumption is usually implausible, for example, because patients in relatively poor health may be more likely to drop out. Methodological guidelines recommend that trials require sensitivity analysis, which is best informed by elicited expert opinion, to assess whether conclusions are robust to alternative assumptions about the missing data. A major barrier to implementing these methods in practice is the lack of relevant practical tools for eliciting expert opinion. We develop a new practical tool for eliciting expert opinion and demonstrate its use for randomised controlled trials with missing data. Methods: We develop and illustrate our approach for eliciting expert opinion with the IMPROVE trial (ISRCTN 48334791), an ongoing multi-centre randomised controlled trial which compares an emergency endovascular strategy versus open repair for patients with ruptured abdominal aortic aneurysm. In the IMPROVE trial at 3 months post-randomisation, 21% of surviving patients did not complete health-related quality of life questionnaires (assessed by EQ-5D-3L). We address this problem by developing a web-based tool that provides a practical approach for eliciting expert opinion about quality of life differences between patients with missing versus complete data. We show how this expert opinion can define informative priors within a fully Bayesian framework to perform sensitivity analyses that allow the missing data to depend upon unobserved patient characteristics. Results: A total of 26 experts, of 46 asked to participate, completed the elicitation exercise. The elicited quality of life scores were lower on average for the patients with missing versus complete data, but there was considerable uncertainty in these elicited values. The missing at random analysis found that patients randomised to the emergency endovascular strategy versus open repair had higher average (95% credible interval) quality of life scores of 0.062 (−0.005 to 0.130). Our sensitivity analysis that used the elicited expert information as pooled priors found that the gain in average quality of life for the emergency endovascular strategy versus open repair was 0.076 (−0.054 to 0.198). Conclusion: We provide and exemplify a practical tool for eliciting the expert opinion required by recommended approaches to the sensitivity analyses of randomised controlled trials. We show how this approach allows the trial analysis to fully recognise the uncertainty that arises from making alternative, plausible assumptions about the reasons for missing data. This tool can be widely used in the design, analysis and interpretation of future trials, and to facilitate this, materials are available for download. PMID:28675302
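
    As a hedged illustration of the elicitation-to-prior step described above (not the IMPROVE analysis itself, which was fully Bayesian and more elaborate), the sketch below pools invented expert (mean, sd) judgments into a moment-matched normal prior and performs a conjugate normal-normal update against a hypothetical data estimate.

        import numpy as np

        # Each tuple is one expert's elicited (mean, sd) for the quality-of-life
        # difference between missing and complete cases. Values are illustrative.
        elicited = [(-0.10, 0.08), (-0.05, 0.12), (-0.15, 0.10)]
        m = np.array([e[0] for e in elicited])
        s = np.array([e[1] for e in elicited])

        # Linear opinion pool, approximated by a moment-matched normal prior.
        prior_mean = m.mean()
        prior_var = (s**2 + m**2).mean() - prior_mean**2

        # Conjugate normal-normal update with a data estimate and its std. error.
        data_mean, data_se = -0.02, 0.05
        post_var = 1 / (1 / prior_var + 1 / data_se**2)
        post_mean = post_var * (prior_mean / prior_var + data_mean / data_se**2)
        print(f"posterior: {post_mean:.3f} (sd {post_var**0.5:.3f})")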

  9. A scoping review of patient discharge from intensive care: opportunities and tools to improve care.

    PubMed

    Stelfox, Henry T; Lane, Dan; Boyd, Jamie M; Taylor, Simon; Perrier, Laure; Straus, Sharon; Zygun, David; Zuege, Danny J

    2015-02-01

    We conducted a scoping review to systematically review the literature reporting patient discharge from ICUs, identify facilitators and barriers to high-quality care, and describe tools developed to improve care. We searched Medline, Embase, CINAHL, and the Cochrane Central Register of Controlled Trials. Data were extracted on the article type, study details for research articles, patient population, phase of care during discharge, and dimensions of health-care quality. From 8,154 unique publications we included 224 articles. Of these, 131 articles (58%) were original research, predominantly case series (23%) and cohort (16%) studies; 12% were narrative reviews; and 11% were guidelines/policies. Common themes included patient and family needs/experiences (29% of articles) and the importance of complete and accurate information (26%). Facilitators of high-quality care included provider-patient communication (30%), provider-provider communication (25%), and the use of guidelines/policies (29%). Patient and family anxiety (21%) and limited availability of ICU and ward resources (26%) were reported barriers to high-quality care. A total of 47 tools to facilitate patient discharge from the ICU were identified and focused on patient evaluation for discharge (29%), discharge planning and teaching (47%), and optimized discharge summaries (23%). Common themes, facilitators and barriers related to patient and family needs/experiences, communication, and the use of guidelines/policies to standardize patient discharge from ICU transcend the literature. Candidate tools to improve care are available; comparative evaluation is needed prior to broad implementation and could be tested through local quality-improvement programs.

  10. Internal quality control: planning and implementation strategies.

    PubMed

    Westgard, James O

    2003-11-01

    The first essential in setting up internal quality control (IQC) of a test procedure in the clinical laboratory is to select the proper IQC procedure to implement, i.e. choosing the statistical criteria or control rules, and the number of control measurements, according to the quality required for the test and the observed performance of the method. Then the right IQC procedure must be properly implemented. This review focuses on strategies for planning and implementing IQC procedures in order to improve the quality of the IQC. A quantitative planning process is described that can be implemented with graphical tools such as power function or critical-error graphs and charts of operating specifications. Finally, a total QC strategy is formulated to minimize cost and maximize quality. A general strategy for IQC implementation is recommended that employs a three-stage design in which the first stage provides high error detection, the second stage low false rejection and the third stage prescribes the length of the analytical run, making use of an algorithm involving the average of normal patients' data.
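
    The statistical control rules referred to above are typically the Westgard multirules. The sketch below is a minimal illustration of the rule logic on control results expressed as z-scores; an actual IQC design would, as the review describes, select the rules and the number of control measurements from power function or critical-error graphs.

        # Illustrative check of common Westgard multirules on control results
        # expressed as z-scores (deviations from the target in SD units).

        def westgard_flags(z):
            flags = []
            if any(abs(v) > 3 for v in z):
                flags.append("1_3s")                # one point beyond 3 SD
            for a, b in zip(z, z[1:]):
                if (a > 2 and b > 2) or (a < -2 and b < -2):
                    flags.append("2_2s")            # two consecutive beyond 2 SD, same side
                    break
            for a, b in zip(z, z[1:]):
                if abs(a - b) > 4:
                    flags.append("R_4s")            # range of two consecutive exceeds 4 SD
                    break
            for i in range(len(z) - 3):
                w = z[i:i + 4]
                if all(v > 1 for v in w) or all(v < -1 for v in w):
                    flags.append("4_1s")            # four consecutive beyond 1 SD, same side
                    break
            return flags or ["in control"]

        print(westgard_flags([0.4, 2.3, 2.6, -0.8, 1.2]))  # -> ['2_2s']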

  11. 48 CFR 9904.403-60 - Illustrations.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... personnel, labor hours, payroll, number of hires. 2. Manufacturing policies (quality control, industrial engineering, production, scheduling, tooling, inspection and testing, etc.) 2. Manufacturing cost input, manufacturing direct labor. 3. Engineering policies 3. Total engineering costs, engineering direct labor, number...

  12. 48 CFR 9904.403-60 - Illustrations.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... personnel, labor hours, payroll, number of hires. 2. Manufacturing policies (quality control, industrial engineering, production, scheduling, tooling, inspection and testing, etc.) 2. Manufacturing cost input, manufacturing direct labor. 3. Engineering policies 3. Total engineering costs, engineering direct labor, number...

  13. 48 CFR 9904.403-60 - Illustrations.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... personnel, labor hours, payroll, number of hires. 2. Manufacturing policies (quality control, industrial engineering, production, scheduling, tooling, inspection and testing, etc.) 2. Manufacturing cost input, manufacturing direct labor. 3. Engineering policies 3. Total engineering costs, engineering direct labor, number...

  14. QC-ART: A tool for real-time quality control assessment of mass spectrometry-based proteomics data.

    PubMed

    Stanfill, Bryan A; Nakayasu, Ernesto S; Bramer, Lisa M; Thompson, Allison M; Ansong, Charles K; Clauss, Therese; Gritsenko, Marina A; Monroe, Matthew E; Moore, Ronald J; Orton, Daniel J; Piehowski, Paul D; Schepmoes, Athena A; Smith, Richard D; Webb-Robertson, Bobbie-Jo; Metz, Thomas O

    2018-04-17

    Liquid chromatography-mass spectrometry (LC-MS)-based proteomics studies of large sample cohorts can easily require from months to years to complete. Acquiring consistent, high-quality data in such large-scale studies is challenging because of normal variations in instrumentation performance over time, as well as artifacts introduced by the samples themselves, such as those due to collection, storage and processing. Existing quality control methods for proteomics data primarily focus on post-hoc analysis to remove low-quality data that would degrade downstream statistics; they are not designed to evaluate the data in near real-time, which would allow for interventions as soon as deviations in data quality are detected. In addition to flagging analyses that demonstrate outlier behavior, evaluating how the data structure changes over time can aid in understanding typical instrument performance or identify issues such as a degradation in data quality due to the need for instrument cleaning and/or re-calibration. To address this gap for proteomics, we developed Quality Control Analysis in Real-Time (QC-ART), a tool for evaluating data as they are acquired in order to dynamically flag potential issues with instrument performance or sample quality. QC-ART has accuracy similar to that of standard post-hoc analysis methods, with the additional benefit of real-time analysis. We demonstrate the utility and performance of QC-ART in identifying deviations in data quality due to both instrument and sample issues in near real-time for LC-MS-based plasma proteomics analyses of a sample subset of The Environmental Determinants of Diabetes in the Young cohort. We also present a case where QC-ART facilitated the identification of oxidative modifications, which are often underappreciated in proteomic experiments. Published under license by The American Society for Biochemistry and Molecular Biology, Inc.
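
    QC-ART's statistics and dashboard are more sophisticated than this, but a minimal sketch conveys the real-time flagging idea: score each incoming run's QC metric against a robust baseline of recent runs and flag deviations immediately. The metric, window, and threshold below are illustrative assumptions, not QC-ART's.

        import numpy as np

        def realtime_flag(history, new_value, window=20, threshold=3.5):
            """Robust (MAD-based) z-score of new_value vs. the last `window` runs."""
            recent = np.asarray(history[-window:])
            med = np.median(recent)
            mad = np.median(np.abs(recent - med)) or 1e-9   # avoid divide-by-zero
            z = 0.6745 * (new_value - med) / mad            # 0.6745 scales MAD to SD
            return z, abs(z) > threshold

        # e.g., median chromatographic peak width (s) for the last eight runs:
        history = [32.1, 31.8, 32.4, 31.9, 32.0, 32.2, 31.7, 32.3]
        z, flagged = realtime_flag(history, 36.5)
        print(f"z = {z:.1f}, flag = {flagged}")   # large z: intervene before the next run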

  15. Donated chemical probes for open science

    PubMed Central

    Ackloo, Suzanne; Arrowsmith, Cheryl H; Bauser, Marcus; Baryza, Jeremy L; Blagg, Julian; Böttcher, Jark; Bountra, Chas; Brown, Peter J; Bunnage, Mark E; Carter, Adrian J; Damerell, David; Dötsch, Volker; Drewry, David H; Edwards, Aled M; Edwards, James; Elkins, Jon M; Fischer, Christian; Frye, Stephen V; Gollner, Andreas; Grimshaw, Charles E; IJzerman, Adriaan; Hanke, Thomas; Hartung, Ingo V; Hitchcock, Steve; Howe, Trevor; Hughes, Terry V; Laufer, Stefan; Li, Volkhart MJ; Liras, Spiros; Marsden, Brian D; Matsui, Hisanori; Mathias, John; O'Hagan, Ronan C; Owen, Dafydd R; Pande, Vineet; Rauh, Daniel; Rosenberg, Saul H; Roth, Bryan L; Schneider, Natalie S; Scholten, Cora; Singh Saikatendu, Kumar; Simeonov, Anton; Takizawa, Masayuki; Tse, Chris; Thompson, Paul R; Treiber, Daniel K; Viana, Amélia YI; Wells, Carrow I; Willson, Timothy M; Zuercher, William J; Knapp, Stefan

    2018-01-01

    Potent, selective and broadly characterized small molecule modulators of protein function (chemical probes) are powerful research reagents. The pharmaceutical industry has generated many high-quality chemical probes and several of these have been made available to academia. However, probe-associated data and control compounds, such as inactive structurally related molecules and their associated data, are generally not accessible. The lack of data and guidance makes it difficult for researchers to decide which chemical tools to choose. Several pharmaceutical companies (AbbVie, Bayer, Boehringer Ingelheim, Janssen, MSD, Pfizer, and Takeda) have therefore entered into a pre-competitive collaboration to make available a large number of innovative high-quality probes, including all probe-associated data, control compounds and recommendations on use (https://openscienceprobes.sgc-frankfurt.de/). Here we describe the chemical tools and target-related knowledge that have been made available, and encourage others to join the project. PMID:29676732

  16. Real-Time Performance of Mechatronic PZT Module Using Active Vibration Feedback Control.

    PubMed

    Aggogeri, Francesco; Borboni, Alberto; Merlo, Angelo; Pellegrini, Nicola; Ricatto, Raffaele

    2016-09-25

    This paper proposes an innovative mechatronic piezo-actuated module to control vibrations in modern machine tools. Vibrations represent one of the main issues that seriously compromise the quality of the workpiece. The active vibration control (AVC) device is composed of a host part integrated with sensors and actuators synchronized by a regulator; it is able to make a self-assessment and adjust to alterations in the environment. In particular, an innovative smart actuator has been designed and developed to satisfy machining requirements during active vibration control. This study presents the mechatronic model based on the kinematic and dynamic analysis of the AVC device. To ensure real-time performance, an H2-LQG controller has been developed and validated by simulations involving machine tool, PZT actuator and controller models. The Hardware in the Loop (HIL) architecture is adopted to control and attenuate the vibrations. A set of experimental tests has been performed to validate the AVC module on a commercial machine tool. The feasibility of real-time vibration damping is demonstrated and the simulation accuracy is evaluated.
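
    The paper's H2-LQG design is specific to its mechatronic model; as a hedged stand-in, the sketch below computes the state-feedback (LQR) half of an LQG controller for a single-mode mass-spring-damper approximation of the tool-tip dynamics. All numerical values are invented for the example.

        import numpy as np
        from scipy.linalg import solve_continuous_are

        m, c, k = 1.0, 0.5, 1.0e4                      # mass, damping, stiffness
        A = np.array([[0.0, 1.0], [-k / m, -c / m]])   # states: [displacement, velocity]
        B = np.array([[0.0], [1.0 / m]])               # input: actuator force

        Q = np.diag([1.0e6, 1.0])                      # penalize displacement heavily
        R = np.array([[1.0e-3]])                       # cheap control effort

        P = solve_continuous_are(A, B, Q, R)           # Riccati solution
        K = np.linalg.solve(R, B.T @ P)                # optimal gain, u = -K x
        print("LQR gain K =", K)
        print("closed-loop poles:", np.linalg.eigvals(A - B @ K))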

  17. Real-Time Performance of Mechatronic PZT Module Using Active Vibration Feedback Control

    PubMed Central

    Aggogeri, Francesco; Borboni, Alberto; Merlo, Angelo; Pellegrini, Nicola; Ricatto, Raffaele

    2016-01-01

    This paper proposes an innovative mechatronic piezo-actuated module to control vibrations in modern machine tools. Vibrations represent one of the main issues that seriously compromise the quality of the workpiece. The active vibration control (AVC) device is composed of a host part integrated with sensors and actuators synchronized by a regulator; it is able to make a self-assessment and adjust to alterations in the environment. In particular, an innovative smart actuator has been designed and developed to satisfy machining requirements during active vibration control. This study presents the mechatronic model based on the kinematic and dynamic analysis of the AVC device. To ensure real-time performance, an H2-LQG controller has been developed and validated by simulations involving machine tool, PZT actuator and controller models. The Hardware in the Loop (HIL) architecture is adopted to control and attenuate the vibrations. A set of experimental tests has been performed to validate the AVC module on a commercial machine tool. The feasibility of real-time vibration damping is demonstrated and the simulation accuracy is evaluated. PMID:27681732

  18. Implementation of statistical process control for proteomic experiments via LC MS/MS.

    PubMed

    Bereman, Michael S; Johnson, Richard; Bollinger, James; Boss, Yuval; Shulman, Nick; MacLean, Brendan; Hoofnagle, Andrew N; MacCoss, Michael J

    2014-04-01

    Statistical process control (SPC) is a robust set of tools that aids in the visualization, detection, and identification of assignable causes of variation in any process that creates products, services, or information. A tool termed Statistical Process Control in Proteomics (SProCoP) has been developed which implements aspects of SPC (e.g., control charts and Pareto analysis) in the Skyline proteomics software. It monitors five quality control metrics in a shotgun or targeted proteomic workflow. None of these metrics require peptide identification. The source code, written in the R statistical language, runs directly from the Skyline interface, which supports the use of raw data files from several of the mass spectrometry vendors. It provides real time evaluation of the chromatographic performance (e.g., retention time reproducibility, peak asymmetry, and resolution), and mass spectrometric performance (targeted peptide ion intensity and mass measurement accuracy for high resolving power instruments) via control charts. Thresholds are experiment- and instrument-specific and are determined empirically from user-defined quality control standards that enable the separation of random noise and systematic error. Finally, Pareto analysis provides a summary of performance metrics and guides the user to metrics with high variance. The utility of these charts to evaluate proteomic experiments is illustrated in two case studies.
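
    SProCoP itself runs as R code inside Skyline; the sketch below is only a language-neutral illustration of its control-charting premise: derive limits empirically from user-defined QC standards, then judge subsequent runs against them. The retention-time numbers are invented.

        import numpy as np

        # Baseline QC-standard injections define the empirical chart limits.
        baseline = np.array([11.02, 10.87, 11.15, 10.95, 11.08, 10.91, 11.00, 10.97])
        center, sd = baseline.mean(), baseline.std(ddof=1)   # retention time (min)
        ucl, lcl = center + 3 * sd, center - 3 * sd          # Shewhart 3-sigma limits

        # New runs are compared against the limits as they are acquired.
        for run, rt in enumerate([11.05, 10.89, 11.44], start=1):
            status = "OK" if lcl <= rt <= ucl else "OUT OF CONTROL"
            print(f"run {run}: RT {rt:.2f} min -> {status}")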

  19. Effectiveness of a Technology-Based Intervention to Teach Evidence-Based Practice: The EBR Tool.

    PubMed

    Long, JoAnn D; Gannaway, Paula; Ford, Cindy; Doumit, Rita; Zeeni, Nadine; Sukkarieh-Haraty, Ola; Milane, Aline; Byers, Beverly; Harrison, LaNell; Hatch, Daniel; Brown, Justin; Proper, Sharlan; White, Patricia; Song, Huaxin

    2016-02-01

    As the world becomes increasingly digital, advances in technology have changed how students access evidence-based information. Research suggests that students overestimate their ability to locate quality online research and lack the skills needed to evaluate the scientific literature. Clinical nurses report relying on personal experience to answer clinical questions rather than searching evidence-based sources. To address the problem, a web-based, evidence-based research (EBR) tool that is usable from a computer, smartphone, or iPad was developed and tested. The purpose of the EBR tool is to guide students through the basic steps needed to locate and critically appraise the online scientific literature while linking users to quality electronic resources to support evidence-based practice (EBP). Testing of the tool took place in a mixed-method, quasi-experimental, and two-population randomized controlled trial (RCT) design at universities in the United States and the Middle East. A statistically significant improvement in overall research skills was supported in the quasi-experimental nursing student group and the RCT nutrition student group using the EBR tool. A statistically significant proportional difference was supported in the RCT nutrition and PharmD intervention groups in participants' ability to distinguish the credibility of online source materials compared with controls. The majority of participants could correctly apply PICOTS to a case study when using the tool. The data from this preliminary study suggest that the EBR tool enhanced students' overall research skills and selected EBP skills while generating data for assessment of learning outcomes. The EBR tool places evidence-based resources at the fingertips of users by addressing some of the most commonly cited barriers to research utilization while exposing users to information and online literacy standards of practice, meeting a growing need within nursing curricula. © 2016 Sigma Theta Tau International.

  20. MO-E-9A-01: Risk Based Quality Management: TG100 In Action

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huq, M; Palta, J; Dunscombe, P

    2014-06-15

    One of the goals of quality management in radiation therapy is to gain high confidence that patients will receive the prescribed treatment correctly. To accomplish this goal, professional societies such as the American Association of Physicists in Medicine (AAPM) have published many quality assurance (QA), quality control (QC), and quality management (QM) guidance documents. In general, the recommendations provided in these documents have emphasized performing device-specific QA at the expense of process flow and protection of the patient against catastrophic errors. Analyses of radiation therapy incidents find that they are more often caused by flaws in the overall therapy process, from initial consult through final treatment, than by isolated hardware or computer failures detectable by traditional physics QA. This challenge is shared by many intrinsically hazardous industries. Risk assessment tools and analysis techniques have been developed to define, identify, and eliminate known and/or potential failures, problems, or errors from a system, process and/or service before they reach the customer. These include, but are not limited to, process mapping, failure modes and effects analysis (FMEA), fault tree analysis (FTA), and establishment of a quality management program that best avoids the faults and risks that have been identified in the overall process. These tools can be easily adapted to radiation therapy practices because of their simplicity and effectiveness, and they provide efficient ways to enhance the safety and quality of treatment processes. Task Group 100 (TG100) of the AAPM has developed a risk-based quality management program that uses these tools. This session will be devoted to a discussion of these tools and how they can be used in a given radiotherapy clinic to develop a risk-based QM program. Learning Objectives: (1) Learn how to design a process map for a radiotherapy process. (2) Learn how to perform an FMEA analysis for a given process. (3) Learn what fault tree analysis is about. (4) Learn how to design a quality management program based upon the information obtained from process mapping, FMEA and FTA.
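
    Of the tools listed, FMEA is the most mechanical to illustrate: each failure mode is scored for occurrence (O), severity (S), and detectability (D), and the risk priority number RPN = O x S x D ranks where quality-management effort should go first. The failure modes and scores below are invented for illustration, not TG100's.

        # Minimal FMEA ranking sketch: (failure mode, O, S, D) on 1-10 scales.
        failure_modes = [
            ("wrong CT dataset imported",  2, 9, 4),
            ("MLC calibration drift",      4, 6, 3),
            ("manual shift entered twice", 3, 8, 5),
        ]

        # Sort by descending RPN so the riskiest failure modes surface first.
        for name, o, s, d in sorted(failure_modes, key=lambda f: -(f[1] * f[2] * f[3])):
            print(f"RPN {o * s * d:4d}  {name}")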

  1. European Guidelines for AP/PA chest X-rays: routinely satisfiable in a paediatric radiology division?

    PubMed

    Tschauner, Sebastian; Marterer, Robert; Gübitz, Michael; Kalmar, Peter I; Talakic, Emina; Weissensteiner, Sabine; Sorantin, Erich

    2016-02-01

    Accurate collimation helps to reduce unnecessary irradiation and improves radiographic image quality, which is especially important in the radiosensitive paediatric population. For AP/PA chest radiographs in children, a minimal field size (MinFS) from "just above the lung apices" to "T12/L1" with age-dependent tolerance is suggested by the 1996 European Commission (EC) guidelines, which were examined qualitatively and quantitatively at a paediatric radiology division. Five hundred ninety-eight unprocessed chest X-rays (45% boys, 55% girls; mean age 3.9 years, range 0-18 years) were analysed with a self-developed tool. Qualitative standards were assessed based on the EC guidelines, as well as the overexposed field size and needlessly irradiated tissue compared to the MinFS. While qualitative guideline recommendations were satisfied, mean overexposure of +45.1 ± 18.9% (range +10.2% to +107.9%) and tissue overexposure of +33.3 ± 13.3% were found. Only 4% (26/598) of the examined X-rays completely fulfilled the EC guidelines. This study presents a new chest radiography quality control tool which allows assessment of field sizes, distances, overexposures and quality parameters based on the EC guidelines. Utilising this tool, we detected inadequate field sizes, inspiration depths, and patient positioning. Furthermore, some debatable EC guideline aspects were revealed. • European Guidelines on X-ray quality recommend exposed field sizes for common examinations. • The major failing in paediatric radiographic imaging techniques is inappropriate field size. • Optimal handling of radiographic units can reduce radiation exposure to paediatric patients. • Constant quality control helps ensure optimal chest radiographic image acquisition in children.

  2. From field notes to data portal - An operational QA/QC framework for tower networks

    NASA Astrophysics Data System (ADS)

    Sturtevant, C.; Hackley, S.; Meehan, T.; Roberti, J. A.; Holling, G.; Bonarrigo, S.

    2016-12-01

    Quality assurance and control (QA/QC) is one of the most important yet challenging aspects of producing research-quality data. This is especially so for environmental sensor networks collecting numerous high-frequency measurement streams at distributed sites. Here, the quality issues are multi-faceted, including sensor malfunctions, unmet theoretical assumptions, and measurement interference from the natural environment. To complicate matters, there are often multiple personnel managing different sites or different steps in the data flow. For large, centrally managed sensor networks such as NEON, the separation of field and processing duties is extreme. Tower networks such as Ameriflux, ICOS, and NEON continue to grow in size and sophistication, yet tools for robust, efficient, scalable QA/QC have lagged. Quality control remains a largely manual process relying on visual inspection of the data. In addition, notes of observed measurement interference or visible problems are often recorded on paper without an explicit pathway to data flagging during processing. As such, an increase in network size requires a near-proportional increase in personnel devoted to QA/QC, quickly stressing the human resources available. There is a need for a scalable, operational QA/QC framework that combines the efficiency and standardization of automated tests with the power and flexibility of visual checks, and includes an efficient communication pathway from field personnel to data processors to end users. Here we propose such a framework and an accompanying set of tools in development, including a mobile application template for recording tower maintenance and an R/shiny application for efficiently monitoring and synthesizing data quality issues. This framework seeks to incorporate lessons learned from the Ameriflux community and provide tools to aid continued network advancements.
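
    As a hedged sketch of the automated layer such a framework would place beneath the visual checks: range, spike, and persistence tests on a raw measurement stream, with bitwise flags so multiple failures can coexist on one record. The thresholds and the synthetic "stuck sensor" series are illustrative assumptions.

        import numpy as np

        def auto_qaqc(x, lo, hi, max_step, min_var, window=30):
            x = np.asarray(x, dtype=float)
            flags = np.zeros(len(x), dtype=int)               # 0 = pass
            flags[(x < lo) | (x > hi)] |= 1                   # range test
            flags[1:][np.abs(np.diff(x)) > max_step] |= 2     # spike/step test
            for i in range(window, len(x)):                   # persistence (dead sensor)
                if np.var(x[i - window:i]) < min_var:
                    flags[i] |= 4
            return flags

        rng = np.random.default_rng(0)
        temps = np.concatenate([20 + rng.normal(size=60) * 0.3,   # healthy sensor
                                np.full(40, 21.7)])               # stuck sensor
        print(np.unique(auto_qaqc(temps, lo=-40, hi=60, max_step=5, min_var=1e-4),
                        return_counts=True))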

  3. Practical quality control tools for curves and surfaces

    NASA Technical Reports Server (NTRS)

    Small, Scott G.

    1992-01-01

    Curves and surfaces created by Computer Aided Geometric Design systems in the engineering environment must satisfy two basic quality criteria: the geometric shape must have the desired engineering properties; and the objects must be parameterized in a way which does not cause computational difficulty for geometric processing and engineering analysis. Interactive techniques are described which are in use at Boeing to evaluate the quality of aircraft geometry prior to Computational Fluid Dynamic analysis, including newly developed methods for examining surface parameterization and its effects.

  4. Test Analysis Tools to Ensure Higher Quality of On-Board Real Time Software for Space Applications

    NASA Astrophysics Data System (ADS)

    Boudillet, O.; Mescam, J.-C.; Dalemagne, D.

    2008-08-01

    EADS Astrium Space Transportation, at its Les Mureaux premises, is responsible for the onboard software (SW) of the French M51 nuclear deterrent missile. Over 1 million lines of code, mostly in Ada, have also been developed there for the Automated Transfer Vehicle (ATV) onboard SW and for the flight control SW of the ARIANE 5 launcher that put the ATV into orbit. As part of the ATV SW, ASTRIUM ST has developed the first Category A SW ever qualified for a European space application. To ensure that all this embedded SW is developed with the highest quality and reliability level, specific development tools have been designed to cover the steps of source code verification, automated validation testing, and complete target instruction coverage verification. Three such dedicated tools are presented here.

  5. FQC Dashboard: integrates FastQC results into a web-based, interactive, and extensible FASTQ quality control tool.

    PubMed

    Brown, Joseph; Pirrung, Meg; McCue, Lee Ann

    2017-06-09

    FQC is software that facilitates quality control of FASTQ files by carrying out a QC protocol using FastQC, parsing results, and aggregating quality metrics into an interactive dashboard designed to richly summarize individual sequencing runs. The dashboard groups samples in dropdowns for navigation among the data sets, utilizes human-readable configuration files to manipulate the pages and tabs, and is extensible with CSV data. FQC is implemented in Python 3 and JavaScript, and is maintained under an MIT license. Documentation and source code are available at: https://github.com/pnnl/fqc . joseph.brown@pnnl.gov. © The Author(s) 2017. Published by Oxford University Press.
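
    FQC's own parsing layer is richer, but the aggregation idea can be sketched: each FastQC run leaves a tab-separated summary.txt of PASS/WARN/FAIL module results, which can be collected into a single matrix for a dashboard to read. The fastqc_out/ directory layout below is an assumption, not FQC's required structure.

        import csv, glob, os

        # Collect every FastQC summary.txt into {module: {sample: status}}.
        rows = {}
        for path in glob.glob("fastqc_out/*_fastqc/summary.txt"):
            sample = os.path.basename(os.path.dirname(path)).replace("_fastqc", "")
            with open(path) as fh:
                for status, module, _ in csv.reader(fh, delimiter="\t"):
                    rows.setdefault(module, {})[sample] = status

        # Write a module-by-sample matrix for downstream visualization.
        with open("qc_matrix.csv", "w", newline="") as out:
            samples = sorted({s for d in rows.values() for s in d})
            w = csv.writer(out)
            w.writerow(["module"] + samples)
            for module, d in rows.items():
                w.writerow([module] + [d.get(s, "NA") for s in samples])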

  6. MO-F-211-01: Methods for Completing Practice Quality Improvement (PQI).

    PubMed

    Johnson, J; Brown, K; Ibbott, G; Pawlicki, T

    2012-06-01

    Practice Quality Improvement (PQI) is becoming an expected part of routine practice in healthcare as an approach to provide more efficient, effective and high quality care. Additionally, as part of the ABR's Maintenance of Certification (MOC) pathway, medical physicists are now expected to complete a PQI project. This session will describe the history behind and benefits of the ABR's MOC program, provide details of quality improvement methods, and explain how to successfully complete a PQI project. PQI methods include various commonly used engineering and management tools. The Plan-Do-Study-Act (PDSA) cycle will be presented as one project planning and implementation tool. Other PQI analysis instruments such as flowcharts, Pareto charts, process control charts and fishbone diagrams will also be explained with examples. Cause analysis, solution development and implementation, and post-implementation measurement will be presented. Guidance on project identification and definition, as well as on appropriate measurement tool selection, will be offered. Methods to choose key quality metrics (key quality indicators) will also be addressed. Several sample PQI projects and templates available through the AAPM and other organizations will be described, and at least three examples of completed PQI projects will be shared. Learning Objectives: (1) Identify and define a PQI project; (2) identify and select measurement methods/techniques for use with the PQI project; (3) describe example(s) of completed projects. © 2012 American Association of Physicists in Medicine.
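
    Among the instruments listed, Pareto analysis is easy to make concrete: tally problem categories and find the "vital few" that account for most events. The incident categories and counts below are invented.

        from collections import Counter

        events = Counter({"scheduling delay": 42, "missing order": 18,
                          "equipment fault": 9, "documentation gap": 21,
                          "communication miss": 10})

        # Rank causes by frequency and accumulate their share of all events.
        total = sum(events.values())
        cum = 0.0
        for cause, n in events.most_common():
            cum += 100 * n / total
            print(f"{cause:20s} {n:3d}  cumulative {cum:5.1f}%")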

  7. Technical Note: Independent component analysis for quality assurance in functional MRI.

    PubMed

    Astrakas, Loukas G; Kallistis, Nikolaos S; Kalef-Ezra, John A

    2016-02-01

    Independent component analysis (ICA) is an established method of analyzing human functional MRI (fMRI) data. Here, an ICA-based fMRI quality control (QC) tool for use with a commercial phantom was developed and used. In an attempt to assess the performance of the tool relative to preexisting alternative tools, it was used seven weeks before and eight weeks after repair of a faulty gradient amplifier of a non-state-of-the-art MRI unit. More specifically, its performance was compared with the AAPM 100 acceptance testing and quality assurance protocol and two fMRI QC protocols, proposed by Friedman et al. ["Report on a multicenter fMRI quality assurance protocol," J. Magn. Reson. Imaging 23, 827-839 (2006)] and Stocker et al. ["Automated quality assurance routines for fMRI data applied to a multicenter study," Hum. Brain Mapp. 25, 237-246 (2005)], respectively. The easily developed and applied ICA-based QC protocol provided fMRI QC indices and maps as sensitive to fMRI instabilities as the indices and maps of the other established protocols. The ICA fMRI QC indices were highly correlated with indices of the other fMRI QC protocols and in some cases theoretically related to them. Three or four independent components with slowly varying time series are detected under normal conditions. ICA applied to phantom measurements is an easy and efficient tool for fMRI QC. Additionally, it can protect against misinterpretation of artifact components as human brain activations. Evaluating fMRI QC indices in the central region of a phantom is not always the optimal choice.
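
    A hedged toy version of the phantom analysis, using scikit-learn's FastICA rather than the authors' implementation: mix a slow drift, a periodic artifact, and noise, then let ICA recover independent time courses, as the QC tool does on phantom fMRI series.

        import numpy as np
        from sklearn.decomposition import FastICA

        rng = np.random.default_rng(0)
        t = np.linspace(0, 300, 600)                          # 600 volumes
        sources = np.c_[0.01 * t,                             # slow drift
                        np.sin(2 * np.pi * 0.1 * t),          # periodic artifact
                        rng.normal(size=t.size)]              # noise
        mixing = rng.normal(size=(3, 3))
        observed = sources @ mixing.T                         # mixed "voxel" signals

        ica = FastICA(n_components=3, random_state=0)
        recovered = ica.fit_transform(observed)               # independent components
        print(recovered.shape)                                # (600, 3)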

  8. Quality-by-design III: application of near-infrared spectroscopy to monitor roller compaction in-process and product quality attributes of immediate release tablets.

    PubMed

    Kona, Ravikanth; Fahmy, Raafat M; Claycamp, Gregg; Polli, James E; Martinez, Marilyn; Hoag, Stephen W

    2015-02-01

    The objective of this study was to use near-infrared spectroscopy (NIRS) coupled with multivariate chemometric models to monitor granule and tablet quality attributes in the formulation development and manufacturing of ciprofloxacin hydrochloride (CIP) immediate release tablets. Critical roller compaction process parameters, compression force (CFt), and formulation variables identified from our earlier studies were evaluated in more detail. Multivariate principal component analysis (PCA) and partial least squares (PLS) models were developed during the development stage and used as a control tool to predict the quality of granules and tablets. Validated models were used to monitor and control batches manufactured at different sites to assess their robustness to change. The results showed that roll pressure (RP) and CFt played a critical role in the quality of the granules and the finished product within the range tested. Replacing the binder source did not significantly influence the quality attributes of the granules and tablets. However, lubricant type significantly impacted granule size. Blend uniformity, crushing force, and disintegration time during manufacturing were predicted using validated PLS regression models with acceptable standard error of prediction (SEP) values, whereas the models resulted in higher SEP for batches obtained from a different manufacturing site. From this study, we were able to identify critical factors which could impact the quality attributes of the CIP IR tablets. In summary, we demonstrated the ability of near-infrared spectroscopy coupled with chemometrics as a powerful tool to monitor critical quality attributes (CQA) identified during formulation development.
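
    The chemometric step can be sketched generically, with synthetic data standing in for real NIR spectra: fit a PLS model on calibration spectra, predict a quality attribute such as crushing force, and report the standard error of prediction (SEP) on held-out samples. Everything below is an illustrative assumption, not the paper's model.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression

        rng = np.random.default_rng(1)
        X_cal = rng.normal(size=(40, 200))            # 40 spectra x 200 wavelengths
        y_cal = X_cal[:, 50] * 3 + rng.normal(scale=0.1, size=40)
        X_val = rng.normal(size=(10, 200))            # held-out "batch"
        y_val = X_val[:, 50] * 3 + rng.normal(scale=0.1, size=10)

        pls = PLSRegression(n_components=5).fit(X_cal, y_cal)
        resid = y_val - pls.predict(X_val).ravel()
        print(f"SEP = {np.sqrt(np.mean(resid**2)):.3f}")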

  9. Development of National Program of Cancer Registries SAS Tool for Population-Based Cancer Relative Survival Analysis.

    PubMed

    Dong, Xing; Zhang, Kevin; Ren, Yuan; Wilson, Reda; O'Neil, Mary Elizabeth

    2016-01-01

    Studying population-based cancer survival by leveraging the high-quality cancer incidence data collected by the Centers for Disease Control and Prevention's National Program of Cancer Registries (NPCR) can offer valuable insight into the cancer burden and impact in the United States. We describe the development and validation of a SAS macro tool that calculates population-based, cancer site-specific relative survival estimates comparable to those obtained through SEER*Stat. The NPCR relative survival analysis SAS tool (NPCR SAS tool) was developed based on the relative survival method and SAS macros developed by Paul Dickman. NPCR cancer incidence data from 25 states submitted in November 2012 were used, specifically cases diagnosed from 2003 to 2010 with follow-up through 2010. Decennial and annual complete life tables published by the National Center for Health Statistics (NCHS) for 2000 through 2009 were used. To assess comparability between the 2 tools, 5-year relative survival rates were calculated for 25 cancer sites by sex, race, and age group using the NPCR SAS tool and the National Cancer Institute's SEER*Stat 8.1.5 software. A module to create data files for SEER*Stat was also developed for the NPCR SAS tool. Comparison of the results produced by both tools showed comparable and reliable relative survival estimates for NPCR data. For a majority of the sites, the net differences between the relative survival estimates produced by the NPCR SAS tool and SEER*Stat ranged from -0.1% to 0.1%. The estimated standard errors were highly comparable between the 2 tools as well. The NPCR SAS tool will allow researchers to estimate 5-year relative survival for NPCR data accurately and comparably to SEER*Stat. Comparison of output from the NPCR SAS tool and SEER*Stat provided additional quality control capabilities for evaluating data prior to producing NPCR relative survival estimates.
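
    The quantity estimated by both tools has a simple core (after Dickman's method): cumulative observed survival of the cancer cohort divided by the survival expected from general-population life tables. The interval values below are invented to show the arithmetic only.

        # Cumulative relative survival = observed cumulative survival divided by
        # expected cumulative survival built up from annual life-table probabilities.

        observed = [0.92, 0.85, 0.80, 0.76, 0.73]        # cohort survival, years 1-5
        expected_annual = [0.98, 0.98, 0.97, 0.97, 0.96] # life-table probabilities

        cum_expected = 1.0
        for year, (obs, exp_p) in enumerate(zip(observed, expected_annual), start=1):
            cum_expected *= exp_p
            print(f"year {year}: relative survival = {obs / cum_expected:.3f}")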

  10. Generating Options for Active Risk Control (GO-ARC): introducing a novel technique.

    PubMed

    Card, Alan J; Ward, James R; Clarkson, P John

    2014-01-01

    After investing significant amounts of time and money in conducting formal risk assessments, such as root cause analysis (RCA) or failure mode and effects analysis (FMEA), healthcare workers are left to their own devices in generating high-quality risk control options. They often experience difficulty in doing so, and tend toward an overreliance on administrative controls (the weakest category in the hierarchy of risk controls). This has important implications for patient safety and the cost effectiveness of risk management operations. This paper describes a before-and-after pilot study of the Generating Options for Active Risk Control (GO-ARC) technique, a novel tool to improve the quality of the risk control options generation process. The outcomes measured were the quantity, quality (using the three-tiered hierarchy of risk controls), variety, and novelty of the risk controls generated. Use of the GO-ARC technique was associated with improvement on all measures. While this pilot study has some notable limitations, it appears that the GO-ARC technique improved the risk control options generation process. Further research is needed to confirm this finding. It is also important to note that improved risk control options are a necessary, but not sufficient, step toward the implementation of more robust risk controls. © 2013 National Association for Healthcare Quality.

  11. Analytical performance evaluation of a high-volume hematology laboratory utilizing sigma metrics as standard of excellence.

    PubMed

    Shaikh, M S; Moiz, B

    2016-04-01

    Around two-thirds of important clinical decisions about the management of patients are based on laboratory test results. Clinical laboratories are required to adopt quality control (QC) measures to ensure the provision of accurate and precise results. Six sigma is a statistical tool that provides an opportunity to assess performance at the highest level of excellence. The purpose of this study was to assess the performance of our hematological parameters on the sigma scale in order to identify gaps, and hence areas of improvement, in patient care. The twelve analytes included in the study were hemoglobin (Hb), hematocrit (Hct), red blood cell count (RBC), mean corpuscular volume (MCV), red cell distribution width (RDW), total leukocyte count (TLC) with percentages of neutrophils (Neutr%) and lymphocytes (Lymph%), platelet count (Plt), mean platelet volume (MPV), prothrombin time (PT), and fibrinogen (Fbg). Internal quality control data and external quality assurance survey results were utilized for the calculation of sigma metrics for each analyte. An acceptable sigma value of ≥3 was obtained for the majority of the analytes included in the analysis. MCV, Plt, and Fbg achieved values of <3 for the level 1 (low abnormal) control. PT performed poorly on both level 1 and 2 controls, with sigma values of <3. Despite acceptable conventional QC tools, the application of sigma metrics can identify analytical deficits, and hence prospects for improvement, in clinical laboratories. © 2016 John Wiley & Sons Ltd.
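
    The sigma metric itself has a simple closed form, sigma = (TEa - |bias|) / CV, with total allowable error (TEa), bias, and CV all expressed in percent. The sketch below applies it to invented values, not the study's data.

        def sigma_metric(tea_pct: float, bias_pct: float, cv_pct: float) -> float:
            """Westgard-style sigma metric from percent TEa, bias, and CV."""
            return (tea_pct - abs(bias_pct)) / cv_pct

        for analyte, tea, bias, cv in [("Hb", 7.0, 0.8, 1.1),
                                       ("PT", 15.0, 6.0, 4.0)]:
            s = sigma_metric(tea, bias, cv)
            verdict = "acceptable" if s >= 3 else "needs improvement"
            print(f"{analyte}: sigma = {s:.1f} ({verdict})")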

  12. Toward Higher QA: From Parametric Release of Sterile Parenteral Products to PAT for Other Pharmaceutical Dosage Forms.

    PubMed

    Hock, Sia Chong; Constance, Neo Xue Rui; Wah, Chan Lai

    2012-01-01

    Pharmaceutical products are generally subjected to end-product batch testing as a means of quality control. Due to the inherent limitations of conventional batch testing, this is not the most ideal approach for determining the pharmaceutical quality of the finished dosage form. In the case of terminally sterilized parenteral products, the limitations of conventional batch testing have been successfully addressed with the application of parametric release (the release of a product based on control of process parameters instead of batch sterility testing at the end of the manufacturing process). Consequently, there has been an increasing interest in applying parametric release to other pharmaceutical dosage forms, beyond terminally sterilized parenteral products. For parametric release to be possible, manufacturers must be capable of designing quality into the product, monitoring the manufacturing processes, and controlling the quality of intermediates and finished products in real-time. Process analytical technology (PAT) has been thought to be capable of contributing to these prerequisites. It is believed that the appropriate use of PAT tools can eventually lead to the possibility of real-time release of other pharmaceutical dosage forms, by-passing the need for end-product batch testing. Hence, this literature review attempts to present the basic principles of PAT, introduce the various PAT tools that are currently available, present their recent applications to pharmaceutical processing, and explain the potential benefits that PAT can bring to conventional ways of processing and quality assurance of pharmaceutical products. Last but not least, current regulations governing the use of PAT and the manufacturing challenges associated with PAT implementation are also discussed.

  13. The utilization of six sigma and statistical process control techniques in surgical quality improvement.

    PubMed

    Sedlack, Jeffrey D

    2010-01-01

    Surgeons have been slow to incorporate industrial reliability techniques. Process control methods were applied to surgeon waiting time between cases, and to length of stay (LOS) after colon surgery. Waiting times between surgeries were evaluated by auditing the operating room records of a single hospital over a 1-month period. The medical records of 628 patients undergoing colon surgery over a 5-year period were reviewed. The average surgeon wait time between cases was 53 min, and the busiest surgeon spent 29½ hours in 1 month waiting between surgeries. Process control charting demonstrated poor overall control of the room turnover process. Average LOS after colon resection also demonstrated very poor control. Mean LOS was 10 days. Weibull's conditional analysis revealed a conditional LOS of 9.83 days. Serious process management problems were identified in both analyses. These process issues are expensive and adversely affect the quality of service offered by the institution. Process control mechanisms were suggested or implemented to improve these surgical processes. Industrial reliability and quality management tools can easily and effectively identify process control problems that occur on surgical services. © 2010 National Association for Healthcare Quality.
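
    As a hedged stand-in for the paper's Weibull analysis, the sketch below fits a Weibull distribution to synthetic length-of-stay data and reports the fitted parameters and mean, the kind of benchmark against which an observed LOS would be judged. All numbers are invented.

        import numpy as np
        from scipy.stats import weibull_min

        # Synthetic LOS sample standing in for the 628 colon-surgery records.
        los = weibull_min.rvs(c=1.6, scale=11.0, size=628, random_state=3)  # days

        shape, loc, scale = weibull_min.fit(los, floc=0)   # fix location at zero
        mean_los = weibull_min.mean(shape, loc=loc, scale=scale)
        print(f"shape={shape:.2f}, scale={scale:.1f} days, "
              f"fitted mean LOS={mean_los:.1f} days")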

  14. Quality control and primo-diagnosis of transurethral bladder resections with full-field OCT

    NASA Astrophysics Data System (ADS)

    Montagne, P.; Ducesne, I.; Anract, J.; Yang, C.; Sibony, M.; Beuvon, F.; Delongchamps, N. B.; Dalimier, E.

    2017-02-01

    Transurethral resections are commonly used for bladder cancer diagnosis, treatment and follow-up. Cancer staging relies largely on the analysis of muscle in the resections; however, muscle presence is uncertain at the time of the resection. An extemporaneous quality control tool would be of great use to certify the presence of muscle in the resection, and potentially to formulate a primo-diagnosis, in order to ensure optimum patient care. Full-field optical coherence tomography (FFOCT) offers a fast and non-destructive method of obtaining images of biological tissues at ultrahigh resolution (1 μm in all 3 directions), approaching traditional histological sections. This study aimed to evaluate the potential of FFOCT for the quality control and primo-diagnosis of transurethral bladder resections. Over 70 transurethral bladder resections were imaged with FFOCT within minutes, shortly after excision and before histological preparation. Side-by-side comparison with histology allowed reading criteria to be established for the presence of muscle and cancer in particular. Images of 24 specimens were read blindly by three non-pathologist readers (two resident urologists and a junior biomedical engineer), who were asked to note the presence of muscle and tumor. Results showed that, after appropriate training, 96% accuracy could be obtained on both tumor and muscle detection. FFOCT is a fast and nondestructive imaging technique that provides analysis results concordant with histology. Its implementation as a quality control and primo-diagnosis tool for transurethral bladder resections in the urology suite is feasible and promises high value for the patient.

  15. Development of a Comprehensive and Interactive Tool to Inform State Violence and Injury Prevention Plans.

    PubMed

    Wilson, Lauren; Deokar, Angela J; Zaesim, Araya; Thomas, Karen; Kresnow-Sedacca, Marcie-Jo

    Context: The Centers for Disease Control and Prevention's Core State Violence and Injury Prevention Program (Core SVIPP) provides an opportunity for states to engage with their partners to implement, evaluate, and disseminate strategies that lead to the reduction and prevention of injury and violence. Core SVIPP requires awardees to develop or update their state injury and violence plans. Currently, literature informing state planning efforts is limited, especially regarding materials related to injury and violence. Presumably, plans of higher quality have a greater impact on preventing injury and violence, so literature to improve quality would benefit prevention programming. Objective: (1) To create a comprehensive injury-specific index to aid in the development and revision of state injury and violence prevention plans, and (2) to assess the reliability and utility of this index. Design: Through an iterative development process, a workgroup of subject matter experts created the Violence and Injury Prevention: Comprehensive Index Tool (VIP:CIT). The tool was pilot tested on 3 state injury and violence prevention plans and assessed for initial usability. Following revisions to the tool (ie, a rubric was developed to further delineate consistent criteria for rating; items were added and clarified), the same state plans were reassessed to test interrater reliability and tool utility. Results: For the second assessment, reliability of the VIP:CIT improved, indicating that the rubric was a useful addition. Qualitative feedback from states suggested that the tool significantly helped guide plan development and communicate about planning processes. Conclusion: The final VIP:CIT is a tool that can help increase plan quality, decrease the research-to-practice gap, and increase connectivity to emerging public health paradigms. The tool provides an example of tailoring guidance materials to reflect academic literature, and it can be easily adapted to other topic areas to promote quality of strategic plans for numerous outcomes.

  16. Development of a Comprehensive and Interactive Tool to Inform State Violence and Injury Prevention Plans

    PubMed Central

    Wilson, Lauren; Deokar, Angela J.; Zaesim, Araya; Thomas, Karen; Kresnow-Sedacca, Marcie-jo

    2018-01-01

    Context The Centers for Disease Control and Prevention's Core State Violence and Injury Prevention Program (Core SVIPP) provides an opportunity for states to engage with their partners to implement, evaluate, and disseminate strategies that lead to the reduction and prevention of injury and violence. Core SVIPP requires awardees to develop or update their state injury and violence plans. Currently, literature informing state planning efforts is limited, especially regarding materials related to injury and violence. Presumably, higher-quality plans have a greater impact on preventing injury and violence, so literature to improve quality would benefit prevention programming. Objective (1) To create a comprehensive injury-specific index to aid in the development and revision of state injury and violence prevention plans, and (2) to assess the reliability and utility of this index. Design Through an iterative development process, a workgroup of subject matter experts created the Violence and Injury Prevention: Comprehensive Index Tool (VIP:CIT). The tool was pilot tested on 3 state injury and violence prevention plans and assessed for initial usability. Following revisions to the tool (ie, a rubric was developed to further delineate consistent criteria for rating; items were added and clarified), the same state plans were reassessed to test interrater reliability and tool utility. Results For the second assessment, reliability of the VIP:CIT improved, indicating that the rubric was a useful addition. Qualitative feedback from states suggested that the tool significantly helped guide plan development and communicate about planning processes. Conclusion The final VIP:CIT is a tool that can help increase plan quality, decrease the research-to-practice gap, and increase connectivity to emerging public health paradigms. The tool provides an example of tailoring guidance materials to reflect academic literature, and it can be easily adapted to other topic areas to promote quality of strategic plans for numerous outcomes. PMID:29189505
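
    As a rough illustration of the interrater reliability assessment described above, the sketch below computes Cohen's kappa for two raters. The item ratings, and the choice of kappa itself, are illustrative assumptions; the abstract does not state which reliability statistic was used.

        from collections import Counter

        def cohens_kappa(rater_a, rater_b):
            """Chance-corrected agreement between two raters over the same items."""
            n = len(rater_a)
            observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
            ca, cb = Counter(rater_a), Counter(rater_b)
            expected = sum((ca[c] / n) * (cb[c] / n) for c in set(ca) | set(cb))
            return (observed - expected) / (1 - expected)

        # Hypothetical "met / partly / not met" ratings of 10 plan criteria
        a = ["met", "met", "not", "partly", "met", "not", "met", "partly", "met", "not"]
        b = ["met", "partly", "not", "partly", "met", "not", "met", "met", "met", "not"]
        print(f"kappa = {cohens_kappa(a, b):.2f}")   # ~0.68 for these ratings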

  17. RNA-seq Data: Challenges in and Recommendations for Experimental Design and Analysis.

    PubMed

    Williams, Alexander G; Thomas, Sean; Wyman, Stacia K; Holloway, Alisha K

    2014-10-01

    RNA-seq is widely used to determine differential expression of genes or transcripts as well as identify novel transcripts, identify allele-specific expression, and precisely measure translation of transcripts. Thoughtful experimental design and choice of analysis tools are critical to ensure high-quality data and interpretable results. Important considerations for experimental design include number of replicates, whether to collect paired-end or single-end reads, sequence length, and sequencing depth. Common analysis steps in all RNA-seq experiments include quality control, read alignment, assigning reads to genes or transcripts, and estimating gene or transcript abundance. Our aims are two-fold: to make recommendations for common components of experimental design and assess tool capabilities for each of these steps. We also test tools designed to detect differential expression, since this is the most widespread application of RNA-seq. We hope that these analyses will help guide those who are new to RNA-seq and will generate discussion about remaining needs for tool improvement and development. Copyright © 2014 John Wiley & Sons, Inc.
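
    To make the quality control step concrete, here is a minimal sketch of a per-read FASTQ filter using only the Python standard library. The file name and thresholds are hypothetical, and production pipelines would normally use dedicated QC tools rather than hand-rolled filters.

        import gzip

        def mean_phred(qual, offset=33):
            """Mean base quality of one read (Sanger/Illumina 1.8+ encoding)."""
            return sum(ord(c) - offset for c in qual) / len(qual)

        def qc_filter(fastq_path, min_mean_q=20, min_len=50):
            """Yield (header, seq, qual) for reads passing simple thresholds."""
            opener = gzip.open if fastq_path.endswith(".gz") else open
            with opener(fastq_path, "rt") as fh:
                while True:
                    header = fh.readline().rstrip()
                    if not header:
                        break
                    seq = fh.readline().rstrip()
                    fh.readline()                     # '+' separator line
                    qual = fh.readline().rstrip()
                    if len(seq) >= min_len and mean_phred(qual) >= min_mean_q:
                        yield header, seq, qual

        # kept = sum(1 for _ in qc_filter("sample_R1.fastq.gz"))  # hypothetical file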

  18. Green Roofs for Stormwater Runoff Control - Abstract

    EPA Science Inventory

    This project evaluated green roofs as a stormwater management tool. Specifically, runoff quantity and quality from green and flat asphalt roofs were compared. Evapotranspiration from planted green roofs and evaporation from unplanted media roofs were also compared. The influence...

  19. Green Roofs for Stormwater Runoff Control

    EPA Science Inventory

    This project evaluated green roofs as a stormwater management tool. Specifically, runoff quantity and quality from green and flat asphalt roofs were compared. Evapotranspiration from planted green roofs and evaporation from unplanted media roofs were also compared. The influence...

  20. Rationale for Quality Assurance in Fecal Egg Monitoring of Soil-Transmitted Helminthiasis

    PubMed Central

    Hoekendijk, David J. L.; Hill, Philip C.; Sowerby, Stephen J.

    2016-01-01

    Substantial investment has been made into the once “neglected” tropical disease, soil-transmitted helminthiasis, and into control programs that operate within a framework of mapping baseline disease distribution, measuring the effectiveness of applied interventions, establishing when to cease drug administration, and for posttreatment evaluations. However, critical to each of these stages is the determination of helminth infection. The limitations of traditional microscope-based fecal egg diagnostics have not provided quality assurance in the monitoring of parasite disease and suboptimal treatment regimes provide for the potential development of parasite resistance to anthelmintic drugs. Improved diagnostic and surveillance tools are required to protect therapeutic effectiveness and to maintain funder confidence. Such tools may be on the horizon with emergent technologies that offer potential for enhanced visualization and quality-assured quantitation of helminth eggs. PMID:27352875

  1. Economics of infection control surveillance technology: cost-effective or just cost?

    PubMed

    Furuno, Jon P; Schweizer, Marin L; McGregor, Jessina C; Perencevich, Eli N

    2008-04-01

    Previous studies have suggested that informatics tools, such as automated alert and decision support systems, may increase the efficiency and quality of infection control surveillance. However, little is known about the cost-effectiveness of these tools. We focus on 2 types of economic analyses that have utility in assessing infection control interventions (cost-effectiveness analysis and business-case analysis) and review the available literature on the economics of computerized infection control surveillance systems. Previous studies on the effectiveness of computerized infection control surveillance have been limited to assessments of whether these tools increase the sensitivity and specificity of surveillance over traditional methods. Furthermore, we identified only 2 studies that assessed the costs associated with computerized infection control surveillance. Thus, it remains unknown whether computerized infection control surveillance systems are cost-effective and whether use of these systems improves patient outcomes. The existing data are insufficient to allow for a summary conclusion on the cost-effectiveness of infection control surveillance technology. All future studies of computerized infection control surveillance systems should aim to collect outcomes and economic data to inform decision making and assist hospitals with completing business-case analyses.

  2. Identifying Items to Assess Methodological Quality in Physical Therapy Trials: A Factor Analysis

    PubMed Central

    Cummings, Greta G.; Fuentes, Jorge; Saltaji, Humam; Ha, Christine; Chisholm, Annabritt; Pasichnyk, Dion; Rogers, Todd

    2014-01-01

    Background Numerous tools and individual items have been proposed to assess the methodological quality of randomized controlled trials (RCTs). The frequency of use of these items varies according to health area, which suggests a lack of agreement regarding their relevance to trial quality or risk of bias. Objective The objectives of this study were: (1) to identify the underlying component structure of items and (2) to determine relevant items to evaluate the quality and risk of bias of trials in physical therapy by using an exploratory factor analysis (EFA). Design A methodological research design was used, and an EFA was performed. Methods Randomized controlled trials used for this study were randomly selected from searches of the Cochrane Database of Systematic Reviews. Two reviewers used 45 items gathered from 7 different quality tools to assess the methodological quality of the RCTs. An exploratory factor analysis was conducted using the principal axis factoring (PAF) method followed by varimax rotation. Results Principal axis factoring identified 34 items loaded on 9 common factors: (1) selection bias; (2) performance and detection bias; (3) eligibility, intervention details, and description of outcome measures; (4) psychometric properties of the main outcome; (5) contamination and adherence to treatment; (6) attrition bias; (7) data analysis; (8) sample size; and (9) control and placebo adequacy. Limitation Because of the exploratory nature of the results, a confirmatory factor analysis is needed to validate this model. Conclusions To the authors' knowledge, this is the first factor analysis to explore the underlying component items used to evaluate the methodological quality or risk of bias of RCTs in physical therapy. The items and factors represent a starting point for evaluating the methodological quality and risk of bias in physical therapy trials. Empirical evidence of the association among these items with treatment effects and a confirmatory factor analysis of these results are needed to validate these items. PMID:24786942

  3. Identifying items to assess methodological quality in physical therapy trials: a factor analysis.

    PubMed

    Armijo-Olivo, Susan; Cummings, Greta G; Fuentes, Jorge; Saltaji, Humam; Ha, Christine; Chisholm, Annabritt; Pasichnyk, Dion; Rogers, Todd

    2014-09-01

    Numerous tools and individual items have been proposed to assess the methodological quality of randomized controlled trials (RCTs). The frequency of use of these items varies according to health area, which suggests a lack of agreement regarding their relevance to trial quality or risk of bias. The objectives of this study were: (1) to identify the underlying component structure of items and (2) to determine relevant items to evaluate the quality and risk of bias of trials in physical therapy by using an exploratory factor analysis (EFA). A methodological research design was used, and an EFA was performed. Randomized controlled trials used for this study were randomly selected from searches of the Cochrane Database of Systematic Reviews. Two reviewers used 45 items gathered from 7 different quality tools to assess the methodological quality of the RCTs. An exploratory factor analysis was conducted using the principal axis factoring (PAF) method followed by varimax rotation. Principal axis factoring identified 34 items loaded on 9 common factors: (1) selection bias; (2) performance and detection bias; (3) eligibility, intervention details, and description of outcome measures; (4) psychometric properties of the main outcome; (5) contamination and adherence to treatment; (6) attrition bias; (7) data analysis; (8) sample size; and (9) control and placebo adequacy. Because of the exploratory nature of the results, a confirmatory factor analysis is needed to validate this model. To the authors' knowledge, this is the first factor analysis to explore the underlying component items used to evaluate the methodological quality or risk of bias of RCTs in physical therapy. The items and factors represent a starting point for evaluating the methodological quality and risk of bias in physical therapy trials. Empirical evidence of the association among these items with treatment effects and a confirmatory factor analysis of these results are needed to validate these items. © 2014 American Physical Therapy Association.
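
    For readers who want to experiment with the factor-analytic approach described in these two records, the sketch below uses scikit-learn's FactorAnalysis with a varimax rotation. Note the hedges: scikit-learn fits factors by maximum likelihood rather than the principal axis factoring used in the study, and the item scores here are simulated placeholders, not the authors' data.

        import numpy as np
        from sklearn.decomposition import FactorAnalysis

        rng = np.random.default_rng(0)
        X = rng.normal(size=(200, 45))   # placeholder: 200 trials x 45 item scores

        fa = FactorAnalysis(n_components=9, rotation="varimax", random_state=0)
        fa.fit(X)
        loadings = fa.components_.T      # one row per item, one column per factor

        # Items loading at |0.4| or above on each factor, a common retention cut-off
        for k in range(loadings.shape[1]):
            items = np.flatnonzero(np.abs(loadings[:, k]) >= 0.4)
            print(f"factor {k + 1}: items {items.tolist()}")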

  4. Operational CryoSat Product Quality Assessment

    NASA Astrophysics Data System (ADS)

    Mannan, Rubinder; Webb, Erica; Hall, Amanda; Bouzinac, Catherine

    2013-12-01

    The performance and quality of the CryoSat data products are routinely assessed by the Instrument Data quality Evaluation and Analysis Service (IDEAS). This information is then conveyed to the scientific and user community in order to allow them to utilise CryoSat data with confidence. This paper presents details of the Quality Control (QC) activities performed for CryoSat products under the IDEAS contract. Details of the different QC procedures and tools deployed by IDEAS to assess the quality of operational data are presented. The latest updates to the Instrument Processing Facility (IPF) for the Fast Delivery Marine (FDM) products and the future update to Baseline-C are discussed.

  5. Importance of implementing an analytical quality control system in a core laboratory.

    PubMed

    Marques-Garcia, F; Garcia-Codesal, M F; Caro-Narros, M R; Contreras-SanFeliciano, T

    2015-01-01

    The aim of the clinical laboratory is to provide useful information for screening, diagnosis and monitoring of disease. The laboratory should ensure the quality of the extra-analytical and analytical processes, based on set criteria. To do this, it develops and implements a system of internal quality control designed to detect errors, and compares its data with those of other laboratories through external quality control. In this way it has a tool to verify that the objectives set are being met and, in case of errors, to take corrective actions and ensure the reliability of the results. This article describes the design and implementation of an internal quality control protocol, as well as its periodic assessment at 6-month intervals to determine compliance with pre-determined specifications (Stockholm Consensus(1)). A total of 40 biochemical and 15 immunochemical methods were evaluated using three different control materials. Next, a standard operating procedure was planned to develop a system of internal quality control that included calculating the error of the analytical process, setting quality specifications, and verifying compliance. The quality control data were then statistically depicted as means, standard deviations, and coefficients of variation, as well as systematic, random, and total errors. The quality specifications were then fixed and the operational rules to apply in the analytical process were calculated. Finally, our data were compared with those of other laboratories through an external quality assurance program. The development of an analytical quality control system is a highly structured process. It should be designed to detect errors that compromise the stability of the analytical process. The laboratory should review its quality indicators (systematic, random and total error) at regular intervals, in order to ensure that they are meeting pre-determined specifications and, if not, apply the appropriate corrective actions. Copyright © 2015 SECA. Published by Elsevier España. All rights reserved.
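
    A minimal sketch of the error calculations described above, using common textbook formulations (total error taken as bias + 1.65 × CV, one widely used model). The control values, target and allowable total error below are hypothetical.

        import statistics

        def qc_summary(values, target, tea_pct):
            """Imprecision, bias and total error for one control level."""
            mean = statistics.mean(values)
            sd = statistics.stdev(values)
            cv = 100 * sd / mean                      # random error (imprecision, %)
            bias = 100 * abs(mean - target) / target  # systematic error (%)
            te = bias + 1.65 * cv                     # one common total-error model
            return {"cv%": round(cv, 2), "bias%": round(bias, 2),
                    "te%": round(te, 2), "meets_spec": te <= tea_pct}

        # Hypothetical daily glucose control results; target 100 mg/dL, TEa 10%
        print(qc_summary([98, 101, 99, 103, 97, 100, 102], target=100, tea_pct=10))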

  6. High-Protein and High-Dietary Fiber Breakfasts Result in Equal Feelings of Fullness and Better Diet Quality in Low-Income Preschoolers Compared with Their Usual Breakfast.

    PubMed

    Kranz, Sibylle; Brauchla, Mary; Campbell, Wayne W; Mattes, Rickard D; Schwichtenberg, Amy J

    2017-03-01

    Background: In the United States, 17% of children are currently obese. Increasing feelings of fullness may prevent excessive energy intake, lead to better diet quality, and promote long-term maintenance of healthy weight. Objective: The purpose of this study was to develop a fullness-rating tool (aim 1) and to determine whether a high-protein (HP), high-fiber (HF), and combined HP and HF (HPHF) breakfast increases preschoolers' feelings of fullness before (pre) and after (post) breakfast and pre-lunch, as well as their diet quality, as measured by using a composite diet quality assessment tool, the Revised Children's Diet Quality Index (aim 2). Methods: Children aged 4 and 5 y (n = 41; 22 girls and 19 boys) from local Head Start centers participated in this randomized intervention trial. Sixteen percent of boys and 32% of girls were overweight or obese. After the baseline week, children rotated through four 1-wk periods of consuming ad libitum HP (19-20 g protein), HF (10-11 g fiber), HPHF (19-21 g protein, 10-12 g fiber), or usual (control) breakfasts. Food intake was estimated daily at breakfast, and for breakfast, lunch, and snack on day 3 of each study week. Student's t tests and ANOVA were used to determine statistical differences. Results: Children's post-breakfast and pre-lunch fullness ratings were ≥1 point higher than their pre-breakfast ratings (aim 1). Although children consumed, on average, 65 kcal less energy during the intervention breakfasts (P < 0.007) than during the control breakfast, fullness ratings did not differ (P = 0.76). Relative to the control breakfast, improved diet quality (12%) was calculated for the HP and HF breakfasts (P < 0.027) but not for the HPHF breakfast (aim 2). Conclusions: Post-breakfast fullness ratings were not affected by the intervention breakfasts relative to the control breakfast. HP and HF breakfasts resulted in higher diet quality. Serving HP or HF breakfasts may be valuable in improving diet quality without lowering feelings of satiation or satiety. This trial was registered at clinicaltrials.gov as NCT02122224. © 2017 American Society for Nutrition.

  7. The Development of a Myoelectric Training Tool for Above-Elbow Amputees

    PubMed Central

    Dawson, Michael R; Fahimi, Farbod; Carey, Jason P

    2012-01-01

    The objective of above-elbow myoelectric prostheses is to reestablish the functionality of missing limbs and increase the quality of life of amputees. By using electromyography (EMG) electrodes attached to the surface of the skin, amputees are able to control motors in myoelectric prostheses by voluntarily contracting the muscles of their residual limb. This work describes the development of an inexpensive myoelectric training tool (MTT) designed to help upper limb amputees learn how to use myoelectric technology in advance of receiving their actual myoelectric prosthesis. The training tool consists of a physical and simulated robotic arm, signal acquisition hardware, controller software, and a graphical user interface. The MTT improves over earlier training systems by allowing a targeted muscle reinnervation (TMR) patient to control up to two degrees of freedom simultaneously. The training tool has also been designed to function as a research prototype for novel myoelectric controllers. A preliminary experiment was performed in order to evaluate the effectiveness of the MTT as a learning tool and to identify any issues with the system. Five able-bodied participants performed a motor-learning task using the EMG controlled robotic arm with the goal of moving five balls from one box to another as quickly as possible. The results indicate that the subjects improved their skill in myoelectric control over the course of the trials. A usability survey was administered to the subjects after their trials. Results from the survey showed that the shoulder degree of freedom was the most difficult to control. PMID:22383905

  8. The development of a myoelectric training tool for above-elbow amputees.

    PubMed

    Dawson, Michael R; Fahimi, Farbod; Carey, Jason P

    2012-01-01

    The objective of above-elbow myoelectric prostheses is to reestablish the functionality of missing limbs and increase the quality of life of amputees. By using electromyography (EMG) electrodes attached to the surface of the skin, amputees are able to control motors in myoelectric prostheses by voluntarily contracting the muscles of their residual limb. This work describes the development of an inexpensive myoelectric training tool (MTT) designed to help upper limb amputees learn how to use myoelectric technology in advance of receiving their actual myoelectric prosthesis. The training tool consists of a physical and simulated robotic arm, signal acquisition hardware, controller software, and a graphical user interface. The MTT improves over earlier training systems by allowing a targeted muscle reinnervation (TMR) patient to control up to two degrees of freedom simultaneously. The training tool has also been designed to function as a research prototype for novel myoelectric controllers. A preliminary experiment was performed in order to evaluate the effectiveness of the MTT as a learning tool and to identify any issues with the system. Five able-bodied participants performed a motor-learning task using the EMG controlled robotic arm with the goal of moving five balls from one box to another as quickly as possible. The results indicate that the subjects improved their skill in myoelectric control over the course of the trials. A usability survey was administered to the subjects after their trials. Results from the survey showed that the shoulder degree of freedom was the most difficult to control.
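
    As an illustration of the general signal path from surface EMG to a motor command, here is a sketch of a moving-RMS envelope with a dead-band threshold. The sampling rate, window length and threshold are hypothetical; the abstract does not describe the MTT controller at this level of detail.

        import numpy as np

        def emg_envelope(signal, fs=1000, window_s=0.15):
            """Moving RMS envelope of a raw EMG signal sampled at fs Hz."""
            n = max(1, int(window_s * fs))
            kernel = np.ones(n) / n
            return np.sqrt(np.convolve(signal ** 2, kernel, mode="same"))

        def to_command(envelope, on_level=0.1):
            """Proportional motor command with a dead band below on_level."""
            cmd = (envelope - on_level).clip(min=0.0)
            return cmd / cmd.max() if cmd.max() > 0 else cmd

        # Hypothetical: 2 s of baseline noise with a contraction burst in the middle
        rng = np.random.default_rng(1)
        emg = rng.normal(0.0, 0.05, 2000)
        emg[800:1200] += rng.normal(0.0, 0.5, 400)
        command = to_command(emg_envelope(emg))   # ~0 except during the burst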

  9. Results of a multicentre randomised controlled trial of statistical process control charts and structured diagnostic tools to reduce ward-acquired meticillin-resistant Staphylococcus aureus: the CHART Project.

    PubMed

    Curran, E; Harper, P; Loveday, H; Gilmour, H; Jones, S; Benneyan, J; Hood, J; Pratt, R

    2008-10-01

    Statistical process control (SPC) charts have previously been advocated for infection control quality improvement. To determine their effectiveness, a multicentre randomised controlled trial was undertaken to explore whether monthly SPC feedback from infection control nurses (ICNs) to healthcare workers of ward-acquired meticillin-resistant Staphylococcus aureus (WA-MRSA) colonisation or infection rates would produce any reductions in incidence. Seventy-five wards in 24 hospitals in the UK were randomised into three arms: (1) wards receiving SPC chart feedback; (2) wards receiving SPC chart feedback in conjunction with structured diagnostic tools; and (3) control wards receiving neither type of feedback. Twenty-five months of pre-intervention WA-MRSA data were compared with 24 months of post-intervention data. Statistically significant and sustained decreases in WA-MRSA rates were identified in all three arms (P<0.001; P=0.015; P<0.001). The mean percentage reduction was 32.3% for wards receiving SPC feedback, 19.6% for wards receiving SPC and diagnostic feedback, and 23.1% for control wards, but with no significant difference between the control and intervention arms (P=0.23). There were significantly more post-intervention 'out-of-control' episodes (P=0.021) in the control arm (averages of 0.60, 0.28, and 0.28 for Control, SPC and SPC+Tools wards, respectively). Participants identified SPC charts as an effective communication tool and valuable for disseminating WA-MRSA data.
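
    For readers unfamiliar with SPC charts of infection rates, the sketch below computes the centre line and 3-sigma limits of a standard u-chart (rates with varying exposure). The monthly counts and patient-day denominators are invented for illustration.

        import math

        def u_chart(counts, exposures):
            """Centre line and 3-sigma u-chart limits for event rates."""
            u_bar = sum(counts) / sum(exposures)       # pooled rate (centre line)
            for c, n in zip(counts, exposures):
                half = 3 * math.sqrt(u_bar / n)
                lo, hi = max(0.0, u_bar - half), u_bar + half
                flag = "" if lo <= c / n <= hi else "  <-- out of control"
                print(f"rate {c / n:5.2f}  limits ({lo:.2f}, {hi:.2f}){flag}")

        # Hypothetical WA-MRSA acquisitions and patient-days (thousands) per month
        u_chart([4, 2, 5, 1, 3, 6, 2], [1.2, 1.0, 1.3, 0.9, 1.1, 1.2, 1.0])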

  10. Multi-criteria analysis for PM10 planning

    NASA Astrophysics Data System (ADS)

    Pisoni, Enrico; Carnevale, Claudio; Volta, Marialuisa

    To implement sound air quality policies, Regulatory Agencies require tools to evaluate the outcomes and costs associated with different emission reduction strategies. These tools are even more useful when considering atmospheric PM10 concentrations, due to the complex nonlinear processes that affect production and accumulation of the secondary fraction of this pollutant. The approaches presented in the literature (Integrated Assessment Modeling) are mainly cost-benefit and cost-effectiveness analyses. In this work, the formulation of a multi-objective problem to control particulate matter is proposed. The methodology defines: (a) the control objectives (the air quality indicator and the emission reduction cost functions); (b) the decision variables (precursor emission reductions); (c) the problem constraints (maximum feasible technology reductions). The cause-effect relations between air quality indicators and decision variables are identified by tuning nonlinear source-receptor models. The multi-objective problem solution provides the decision maker with a set of non-dominated scenarios representing the efficient trade-off between the air quality benefit and the internal costs (emission reduction technology costs). The methodology has been implemented for Northern Italy, an area often affected by high long-term exposure to PM10. The source-receptor models used in the multi-objective analysis are identified by processing long-term simulations of the GAMES multiphase modeling system, performed in the framework of the CAFE-Citydelta project.
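
    The non-dominated trade-off set described above can be illustrated with a brute-force Pareto filter. The cost and concentration numbers below are invented; a real analysis would couple this filter to the identified source-receptor models.

        def pareto_front(scenarios):
            """Keep scenarios not dominated on (cost, pm10); both minimised."""
            front = []
            for s in scenarios:
                dominated = any(
                    o["cost"] <= s["cost"] and o["pm10"] <= s["pm10"]
                    and (o["cost"] < s["cost"] or o["pm10"] < s["pm10"])
                    for o in scenarios
                )
                if not dominated:
                    front.append(s)
            return sorted(front, key=lambda s: s["cost"])

        # Hypothetical emission-reduction scenarios: cost (Meuro) vs PM10 (ug/m3)
        scenarios = [
            {"cost": 10, "pm10": 42}, {"cost": 25, "pm10": 35},
            {"cost": 30, "pm10": 36}, {"cost": 55, "pm10": 33},
            {"cost": 60, "pm10": 29},
        ]
        print(pareto_front(scenarios))   # the dominated (30, 36) scenario drops out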

  11. Automation of testing modules of controller ELSY-TMK

    NASA Astrophysics Data System (ADS)

    Dolotov, A. E.; Dolotova, R. G.; Petuhov, D. V.; Potapova, A. P.

    2017-01-01

    Modern automation tools for various processes make it possible to maintain high quality standards for released products and to raise labour efficiency. This paper presents data on the automation of the test process for the ELSY-TMK controller [1]. The ELSY-TMK programmable logic controller is an effective modular platform for building automation systems for small and medium-sized industrial production. The controller's modern, functional communication standard and open environment provide a powerful tool for a wide spectrum of industrial automation applications. The algorithm allows controller modules to be tested, by operating the switching system and external devices, faster and at a higher level of quality than a human could achieve without such means.
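
    A schematic of what such an automated module test loop might look like. The module names, test vectors and I/O callables below are entirely hypothetical; the paper's actual switching-system interface is not specified in this record.

        # Hypothetical test vectors: stimulus applied vs response expected
        TEST_VECTORS = {
            "DI-16": [(0b1010101010101010, 0b1010101010101010)],
            "DO-16": [(0b1111000011110000, 0b1111000011110000)],
        }

        def test_module(module_id, apply_input, read_output):
            """Drive the switching system and compare module responses."""
            for stimulus, expected in TEST_VECTORS[module_id]:
                apply_input(module_id, stimulus)     # route stimulus to module
                actual = read_output(module_id)      # read back the response
                if actual != expected:
                    return f"{module_id}: FAIL (got {actual:#06x})"
            return f"{module_id}: PASS"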

  12. [The participation of patients with dementia in individualised intervention plan meetings: the impact on their well-being and the quality of life].

    PubMed

    Villar, Feliciano; Vila-Miravent, Josep; Celdrán, Montserrat; Fernández, Elena

    2013-01-01

    An individualised intervention plan (IIP) offers a new paradigm in the care of the elderly with dementia, with the aim of increasing their quality of life through personalisation, respect for their freedom, and their participation in the decisions that affect their lives. To evaluate the impact on the well-being and quality of care of residential home patients with dementia when they take part in the interdisciplinary meeting at which their care plan is decided. A total of 52 elderly patients with dementia took part in the study. They were divided into two groups: experimental (37 residents) and control (15 residents). The Dementia Care Mapping (DCM) tool was used to assess the well-being and quality of care of the residents, twice: before and after the intervention. The well-being of the residents, evaluated using the DCM, was similar before and after the intervention in the experimental group. No differences were observed either when comparing the control and experimental groups. However, some indicators of carer behaviour differed before and after the intervention, and between the control and experimental groups. The inclusion of elderly persons with dementia in their IIP meeting had a positive effect on the interaction of the staff with the residents, but not on the well-being of the residents. Copyright © 2012 SEGG. Published by Elsevier España. All rights reserved.

  13. Evaluation of Fly Ash Quality Control Tools

    DOT National Transportation Integrated Search

    2010-06-30

    Many entities currently use fly ash in portland cement concrete (PCC) pavements and structures. Although the body of knowledge is great concerning the use of fly ash, several projects per year are subject to poor performance where fly ash is named ...

  14. Evaluation of fly ash quality control tools.

    DOT National Transportation Integrated Search

    2010-06-30

    Many entities currently use fly ash in portland cement concrete (PCC) pavements and structures. Although the body of knowledge is great concerning the use of fly ash, several projects per year are subject to poor performance where fly ash is named ...

  15. Sigma metrics as a tool for evaluating the performance of internal quality control in a clinical chemistry laboratory

    PubMed Central

    Kumar, B. Vinodh; Mohan, Thuthi

    2018-01-01

    OBJECTIVE: Six Sigma is one of the most popular quality management system tools employed for process improvement. The Six Sigma methods are usually applied when the outcome of the process can be measured. This study was done to assess the performance of individual biochemical parameters on a sigma scale by calculating the sigma metrics for individual parameters, and to follow the Westgard guidelines for the appropriate Westgard rules and levels of internal quality control (IQC) that need to be processed to improve target analyte performance based on the sigma metrics. MATERIALS AND METHODS: This is a retrospective study, and the data required were extracted between July 2015 and June 2016 from a Secondary Care Government Hospital, Chennai. The data obtained were the IQC coefficient of variation percentage and the External Quality Assurance Scheme (EQAS) bias% for 16 biochemical parameters. RESULTS: For the level 1 IQC, four analytes (alkaline phosphatase, magnesium, triglyceride, and high-density lipoprotein-cholesterol) showed an ideal performance of ≥6 sigma, and five analytes (urea, total bilirubin, albumin, cholesterol, and potassium) showed an average performance of <3 sigma; for the level 2 IQC, the same four analytes as level 1 showed a performance of ≥6 sigma, and four analytes (urea, albumin, cholesterol, and potassium) showed an average performance of <3 sigma. For all analytes below 6 sigma, the quality goal index (QGI) was <0.8, indicating imprecision as the area requiring improvement, except for cholesterol, whose QGI of >1.2 indicated inaccuracy. CONCLUSION: This study shows that sigma metrics is a good quality tool to assess the analytical performance of a clinical chemistry laboratory. Thus, sigma metric analysis provides a benchmark for the laboratory to design a protocol for IQC, address poor assay performance, and assess the efficiency of existing laboratory processes. PMID:29692587
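
    The two statistics used in this study follow standard formulas: sigma = (TEa − bias) / CV, and QGI = bias / (1.5 × CV). The sketch below applies them to a hypothetical analyte; the TEa, bias and CV values are invented.

        def sigma_metric(tea_pct, bias_pct, cv_pct):
            """Sigma = (TEa - |bias|) / CV, all expressed in percent."""
            return (tea_pct - abs(bias_pct)) / cv_pct

        def quality_goal_index(bias_pct, cv_pct):
            """QGI = bias / (1.5 x CV); <0.8 suggests imprecision, >1.2 inaccuracy."""
            return abs(bias_pct) / (1.5 * cv_pct)

        # Hypothetical analyte: allowable total error 10%, bias 2%, CV 1.2%
        print(f"sigma = {sigma_metric(10, 2, 1.2):.1f}")    # 6.7 -> ideal performance
        print(f"QGI   = {quality_goal_index(2, 1.2):.2f}")  # 1.11 -> mixed picture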

  16. Framework for establishing records control in hospitals as an ISO 9001 requirement.

    PubMed

    Al-Qatawneh, Lina

    2017-02-13

    Purpose The purpose of this paper is to present the process followed to control records in a Jordanian private community hospital as an ISO 9001:2008 standard requirement. Design/methodology/approach Under the hospital quality council's supervision, the quality management and development office staff were responsible for designing, planning and implementing the quality management system (QMS) using the ISO 9001:2008 standard. A policy for records control was established. An action plan for establishing the records control was developed and implemented. On completion, a coding system for records was specified to be used by hospital staff. Finally, an internal audit was performed to verify conformity to the ISO 9001:2008 standard requirements. Findings Successful certification by a neutral body ascertained that the hospital's QMS conformed to the ISO 9001:2008 requirements. A framework was developed that describes the records controlling process, which can be used by staff in any healthcare organization wanting to achieve ISO 9001:2008 accreditation. Originality/value Given the increased interest among healthcare organizations to achieve the ISO 9001 certification, the proposed framework for establishing records control is developed and is expected to be a valuable management tool to improve and sustain healthcare quality.

  17. A fuzzy logic approach to control anaerobic digestion.

    PubMed

    Domnanovich, A M; Strik, D P; Zani, L; Pfeiffer, B; Karlovits, M; Braun, R; Holubar, P

    2003-01-01

    One of the goals of the EU project AMONCO (Advanced Prediction, Monitoring and Controlling of Anaerobic Digestion Process Behaviour towards Biogas Usage in Fuel Cells) is to create a control tool for the anaerobic digestion process that predicts the volumetric organic loading rate (Bv) for the next day, in order to obtain high biogas quality and production. The biogas should contain a high methane concentration (over 50%) and low concentrations of components toxic to fuel cells, e.g. hydrogen sulphide, siloxanes, ammonia and mercaptans. To produce data for testing the control tool, four 20 l anaerobic continuously stirred tank reactors (CSTRs) are operated. For control, two systems were investigated: a pure fuzzy logic system and a hybrid system containing a fuzzy-based reactor condition calculation and a hierarchical neural net in a cascade of optimisation algorithms.
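
    A toy Sugeno-style fuzzy rule base gives the flavour of such a controller. The membership ranges, rules and gains below are invented and far simpler than the AMONCO system; they only illustrate the min/max rule evaluation and weighted-average defuzzification pattern.

        def tri(x, a, b, c):
            """Triangular membership function peaking at b."""
            if x <= a or x >= c:
                return 0.0
            return (x - a) / (b - a) if x < b else (c - x) / (c - b)

        def next_bv(ch4_pct, vfa_g_per_l, bv_today):
            """Two rules: good gas AND stable VFA -> raise Bv; otherwise lower it."""
            good_gas = tri(ch4_pct, 50, 65, 80)
            stable = tri(vfa_g_per_l, 0.0, 0.5, 2.0)
            increase = min(good_gas, stable)             # fuzzy AND -> min
            decrease = max(1 - good_gas, 1 - stable)     # fuzzy OR  -> max
            # Rule consequents: +10 % or -20 % change in Bv (hypothetical gains)
            delta = (0.10 * increase - 0.20 * decrease) / (increase + decrease)
            return bv_today * (1 + delta)

        print(next_bv(ch4_pct=62, vfa_g_per_l=0.6, bv_today=3.0))  # modest increase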

  18. Association between hospital size and quality improvement for pharmaceutical services.

    PubMed

    Nau, David P; Garber, Mathew C; Lipowski, Earlene E; Stevenson, James G

    2004-01-15

    The relationship between hospital size and quality improvement (QI) for pharmaceutical services was studied. A questionnaire on QI was sent to hospital pharmacy directors in Michigan and Florida in 2002. The questionnaire included items on QI lead-team composition, QI tools, QI training, and QI culture. Usable responses were received from 162 (57%) of 282 pharmacy directors. Pharmacy QI lead teams were present in 57% of institutions, with larger teams in large hospitals (≥300 patients). Only two QI tools were used by a majority of hospitals: root-cause analysis (62%) and flow charts (66%). Small hospitals (<50 patients) were less likely than medium-sized hospitals (50-299 patients) and large hospitals to use several QI tools, including control charts, cause-and-effect diagrams, root-cause analysis, flow charts, and histograms. Large hospitals were more likely than small and medium-sized hospitals to use root-cause analysis and control charts. There was no relationship between hospital size and the frequency with which physician or patient satisfaction with pharmaceutical services was measured. There were no differences in QI training or QI culture across hospital size categories. A survey suggested that a majority of hospital pharmacies in Michigan and Florida have begun to adopt QI techniques but that most are not using rigorous QI tools. Pharmacies in large hospitals had more QI lead-team members and were more likely to use certain QI tools, but there was no relationship between hospital size and satisfaction measurements, QI training, or QI culture.

  19. Clinical peer review program self-evaluation for US hospitals.

    PubMed

    Edwards, Marc T

    2010-01-01

    Prior research has shown wide variation in clinical peer review program structure, process, governance, and perceived effectiveness. This study sought to validate the utility of a Peer Review Program Self-Evaluation Tool as a potential guide to physician and hospital leaders seeking greater program value. Data from 330 hospitals show that the total score from the self-evaluation tool is strongly associated with perceived quality impact. Organizational culture also plays a significant role. When controlling for these factors, there was no evidence of benefit from a multispecialty review process. Physicians do not generally use reliable methods to measure clinical performance. A high rate of change since 2007 has not produced much improvement. The Peer Review Program Self-Evaluation Tool reliably differentiates hospitals along a continuum of perceived program performance. The full potential of peer review as a process to improve the quality and safety of care has yet to be realized.

  20. Guideline harmonization and implementation plan for the BETTER trial: Building on Existing Tools to Improve Chronic Disease Prevention and Screening in Family Practice

    PubMed Central

    Rogers, Jess; Manca, Donna; Lang-Robertson, Kelly; Bell, Stephanie; Salvalaggio, Ginetta; Greiver, Michelle; Korownyk, Christina; Klein, Doug; Carroll, June C.; Kahan, Mel; Meuser, Jamie; Buchman, Sandy; Barrett, Rebekah M.; Grunfeld, Eva

    2014-01-01

    Background The aim of the Building on Existing Tools to Improve Chronic Disease Prevention and Screening in Family Practice (BETTER) randomized controlled trial is to improve the primary prevention of and screening for multiple conditions (diabetes, cardiovascular disease, cancer) and some of the associated lifestyle factors (tobacco use, alcohol overuse, poor nutrition, physical inactivity). In this article, we describe how we harmonized the evidence-based clinical practice guideline recommendations and patient tools to determine the content for the BETTER trial. Methods We identified clinical practice guidelines and tools through a structured literature search; we included both indexed and grey literature. From these guidelines, recommendations were extracted and integrated into knowledge products and outcome measures for use in the BETTER trial. End-users (family physicians, nurse practitioners, nurses and dieticians) were engaged in reviewing the recommendations and tools, as well as tailoring the content to the needs of the BETTER trial and family practice. Results In total, 3–5 high-quality guidelines were identified for each condition; from these, we identified high-grade recommendations for the prevention of and screening for chronic disease. The guideline recommendations were limited by conflicting recommendations, vague wording and different taxonomies for strength of recommendation. There was a lack of quality evidence for manoeuvres to improve the uptake of guidelines among patients with depression. We developed the BETTER clinical algorithms for the implementation plan. Although it was difficult to identify high-quality tools, 180 tools of interest were identified. Interpretation The intervention for the BETTER trial was built by integrating existing guidelines and tools, and working with end-users throughout the process to increase the intervention’s utility for practice. Trial registration: ISRCTN07170460 PMID:25077119

  1. Facilitating quality control for spectra assignments of small organic molecules: nmrshiftdb2--a free in-house NMR database with integrated LIMS for academic service laboratories.

    PubMed

    Kuhn, Stefan; Schlörer, Nils E

    2015-08-01

    With its laboratory information management system (LIMS), nmrshiftdb2 supports the integration of electronic lab administration and management into academic NMR facilities. It also offers the setup of a local database, while granting full access to nmrshiftdb2's World Wide Web database. For lab users, this freely available system allows the submission of orders for measurement, transfers recorded data automatically or manually, and enables download of spectra via a web interface, with integrated access to the prediction, search, and assignment tools of the NMR database. For the staff and lab administration, the flow of all orders can be supervised; administrative tools also include user and hardware management, a statistics function for accounting purposes, and a 'QuickCheck' function for assignment control, to facilitate quality control of assignments submitted to the (local) database. The laboratory information management system and database are based on a web interface as the front end and are therefore independent of the operating system in use. Copyright © 2015 John Wiley & Sons, Ltd.

  2. Effectiveness of the Assessment of Burden of Chronic Obstructive Pulmonary Disease (ABC) tool: study protocol of a cluster randomised trial in primary and secondary care

    PubMed Central

    2014-01-01

    Background Chronic Obstructive Pulmonary Disease (COPD) is a growing worldwide problem that imposes a great burden on the daily life of patients. Since there is no cure, the goal of treating COPD is to maintain or improve quality of life. We have developed a new tool, the Assessment of Burden of COPD (ABC) tool, to assess and visualize the integrated health status of patients with COPD, and to provide patients and healthcare providers with a treatment algorithm. This tool may be used during consultations to monitor the burden of COPD and to adjust treatment if necessary. The aim of the current study is to analyse the effectiveness of the ABC tool compared with usual care on health related quality of life among COPD patients over a period of 18 months. Methods/Design A cluster randomised controlled trial will be conducted in COPD patients in both primary and secondary care throughout the Netherlands. An intervention group, receiving care based on the ABC tool, will be compared with a control group receiving usual care. The primary outcome will be the change in score on a disease-specific-quality-of-life questionnaire, the Saint George Respiratory Questionnaire. Secondary outcomes will be a different questionnaire (the COPD Assessment Test), lung function and number of exacerbations. During the 18 months follow-up, seven measurements will be conducted, including a baseline and final measurement. Patients will receive questionnaires to be completed at home. Additional data, such as number of exacerbations, will be recorded by the patients’ healthcare providers. A total of 360 patients will be recruited by 40 general practitioners and 20 pulmonologists. Additionally, a process evaluation will be performed among patients and healthcare providers. Discussion The new ABC tool complies with the 2014 Global Initiative for Chronic Obstructive Lung Disease guidelines, which describe the necessity to classify patients on both their airway obstruction and a comprehensive symptom assessment. It has been developed to classify patients, but also to provide visual insight into the burden of COPD and to provide treatment advice. Trial registration Netherlands Trial Register, NTR3788. PMID:25098313

  3. Patient information leaflets (PILs) for UK randomised controlled trials: a feasibility study exploring whether they contain information to support decision making about trial participation.

    PubMed

    Gillies, Katie; Huang, Wan; Skea, Zoë; Brehaut, Jamie; Cotton, Seonaidh

    2014-02-18

    Informed consent is regarded as a cornerstone of ethical healthcare research and is a requirement for most clinical research studies. Guidelines suggest that prospective randomised controlled trial (RCT) participants should understand a basic amount of key information about the RCTs they are being asked to enrol in, in order to provide valid informed consent. This information is usually provided to potential participants in a patient information leaflet (PIL). There is evidence that some trial participants fail to understand key components of trial processes or rationale. As such, the existing approach to information provision for potential RCT participants may not be optimal. Decision aids have been used for a variety of treatment and screening decisions to improve knowledge, but focus more on overall decision quality, and may be helpful to those making decisions about participating in an RCT. We investigated the feasibility of using a tool to identify which items recommended for good quality decision making are present in UK PILs. PILs were sampled from UK registered Clinical Trials Unit websites across a range of clinical areas. The evaluation tool, which is based on standards for supporting decision making, was applied to 20 PILs. Two researchers independently rated each PIL using the tool. In addition, word count and readability were assessed. PILs scored poorly on the evaluation tool, with the majority of leaflets scoring less than 50%. Specifically, the presenting probabilities, clarifying and expressing values, and structured guidance in deliberation and communication sub-sections scored consistently poorly. Tool score was associated with word count (r=0.802, P<0.01); there was no association between score and readability (r=-0.372, P=0.106). The tool was feasible to use to evaluate PILs for UK RCTs. PILs did not meet current standards of information to support good quality decision making. Writers of information leaflets could use the evaluation tool as a framework during PIL development to help ensure that items are included which promote and support more informed decisions about trial participation. Further research is required to evaluate the inclusion of such information.

  4. Randomized Trial of Reducing Ambulatory Malpractice and Safety Risk: Results of the Massachusetts PROMISES Project.

    PubMed

    Schiff, Gordon D; Reyes Nieva, Harry; Griswold, Paula; Leydon, Nicholas; Ling, Judy; Federico, Frank; Keohane, Carol; Ellis, Bonnie R; Foskett, Cathy; Orav, E John; Yoon, Catherine; Goldmann, Don; Weissman, Joel S; Bates, David W; Biondolillo, Madeleine; Singer, Sara J

    2017-08-01

    Evaluate application of quality improvement approaches to key ambulatory malpractice risk and safety areas. In total, 25 small-to-medium-sized primary care practices (16 intervention; 9 control) in Massachusetts. Controlled trial of a 15-month intervention including exposure to a learning network, webinars, face-to-face meetings, and coaching by improvement advisors targeting "3+1" high-risk domains: test result, referral, and medication management plus culture/communication issues evaluated by survey and chart review tools. Chart reviews conducted at baseline and postintervention for intervention sites. Staff and patient survey data collected at baseline and postintervention for intervention and control sites. Chart reviews demonstrated significant improvements in documentation of abnormal results, patient notification, documentation of an action or treatment plan, and evidence of a completed plan (all P<0.001). Mean days between laboratory test date and evidence of completed action/treatment plan decreased by 19.4 days (P<0.001). Staff surveys showed modest but nonsignificant improvement for intervention practices relative to controls overall and for the 3 high-risk domains that were the focus of PROMISES. A consortium of stakeholders, quality improvement tools, coaches, and learning network decreased selected ambulatory safety risks often seen in malpractice claims.

  5. Advances in photonic MOEMS-MEMS device thinning and polishing

    NASA Astrophysics Data System (ADS)

    McAneny, James J.; Kennedy, Mark; McGroggan, Tom

    2010-02-01

    As devices continue to increase in density and complexity, ever more stringent specifications are placed on wafer-scale equipment manufacturers to produce higher quality and higher output. This results in greater investment and more resources being diverted into producing tools and processes which can meet the latest demanding criteria. Substrate materials employed in the fabrication process range from Silicon through InP and include GaAs, InSb and other optical networking or waveguide materials. With this diversity of substrate materials, controlling the geometries and surfaces grows progressively more challenging. This article highlights the key parameters which require close monitoring and control in order to produce highly precise wafers as part of the fabrication process. Several as-cut and commercially available standard polished wafer materials were used in empirical trials to test tooling options for generating a high level of geometric control over the dimensions while producing high-quality surface finishes. Specific attention was given to the measurement and control of: flatness; parallelism/TTV; surface roughness; and final target thickness, as common specifications required by the industry. By combining the process variables of plate speed, download pressure, slurry flow rate and concentration, pad type, and wafer travel path across the polish pad, the effect of altering these variables was recorded and analysed to determine the optimum process conditions for the materials under test. The results were then used to design improved methods and tooling for the thinning and polishing of photonic materials applied to MOEMS-MEMS device fabrication.

  6. Quality Risk Management: Putting GMP Controls First.

    PubMed

    O'Donnell, Kevin; Greene, Anne; Zwitkovits, Michael; Calnan, Nuala

    2012-01-01

    This paper presents a practical way in which current approaches to quality risk management (QRM) may be improved, such that they better support qualification, validation programs, and change control proposals at manufacturing sites. The paper is focused on the treatment of good manufacturing practice (GMP) controls during QRM exercises. It specifically addresses why it is important to evaluate and classify such controls in terms of how they affect the severity, probability of occurrence, and detection ratings that may be assigned to potential failure modes or negative events. It also presents a QRM process that is designed to directly link the outputs of risk assessments and risk control activities with qualification and validation protocols in the GMP environment. This paper concerns the need for improvement in the use of risk-based principles and tools when working to ensure that the manufacturing processes used to produce medicines, and their related equipment, are appropriate. Manufacturing processes need to be validated (or proven) to demonstrate that they can produce a medicine of the required quality. The items of equipment used in such processes need to be qualified, in order to prove that they are fit for their intended use. Quality risk management (QRM) tools can be used to support such qualification and validation activities, but their use should be science-based and subject to as little subjectivity and uncertainty as possible. When changes are proposed to manufacturing processes, equipment, or related activities, they also need careful evaluation to ensure that any risks present are managed effectively. This paper presents a practical approach to how QRM may be improved so that it better supports qualification, validation programs, and change control proposals in a more scientific way. This improved approach is based on the treatment of what are called good manufacturing practice (GMP) controls during those QRM exercises. A GMP control can be considered to be any control that is put in place to assure product quality and regulatory compliance. This improved approach is also based on how the detectability of risks is assessed. This is important because when producing medicines, it is not always good practice to place a high reliance upon detection-type controls in the absence of an adequate level of assurance in the manufacturing process that leads to the finished medicine.
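
    One widely used risk-scoring scheme compatible with this argument is the FMEA risk priority number, sketched below. The scores are hypothetical; the point is that a preventive GMP control lowers the occurrence rating, whereas a detection-type control only lowers the detection rating.

        def risk_priority(severity, occurrence, detection):
            """RPN = S x O x D, each rated on a 1-10 scale (FMEA convention)."""
            return severity * occurrence * detection

        # Hypothetical failure mode, before and after adding a preventive control
        before = risk_priority(severity=8, occurrence=6, detection=4)   # 192
        after = risk_priority(severity=8, occurrence=2, detection=4)    # 64
        print(before, after)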

  7. Recommendations of the DNA Commission of the International Society for Forensic Genetics (ISFG) on quality control of autosomal Short Tandem Repeat allele frequency databasing (STRidER).

    PubMed

    Bodner, Martin; Bastisch, Ingo; Butler, John M; Fimmers, Rolf; Gill, Peter; Gusmão, Leonor; Morling, Niels; Phillips, Christopher; Prinz, Mechthild; Schneider, Peter M; Parson, Walther

    2016-09-01

    The statistical evaluation of autosomal Short Tandem Repeat (STR) genotypes is based on allele frequencies. These are empirically determined from sets of randomly selected human samples, compiled into STR databases that have been established in the course of population genetic studies. There is currently no agreed procedure of performing quality control of STR allele frequency databases, and the reliability and accuracy of the data are largely based on the responsibility of the individual contributing research groups. It has been demonstrated with databases of haploid markers (EMPOP for mitochondrial mtDNA, and YHRD for Y-chromosomal loci) that centralized quality control and data curation is essential to minimize error. The concepts employed for quality control involve software-aided likelihood-of-genotype, phylogenetic, and population genetic checks that allow the researchers to compare novel data to established datasets and, thus, maintain the high quality required in forensic genetics. Here, we present STRidER (http://strider.online), a publicly available, centrally curated online allele frequency database and quality control platform for autosomal STRs. STRidER expands on the previously established ENFSI DNA WG STRbASE and applies standard concepts established for haploid and autosomal markers as well as novel tools to reduce error and increase the quality of autosomal STR data. The platform constitutes a significant improvement and innovation for the scientific community, offering autosomal STR data quality control and reliable STR genotype estimates. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
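
    One likelihood-of-genotype style check that curation platforms can automate is a Hardy-Weinberg lack-of-fit test. The sketch below is a deliberately simplified biallelic version (STR loci are multi-allelic), with invented genotype counts; gross departures can flag genotyping or data-entry errors of the kind centralized curation is meant to catch.

        def hwe_chi_square(n_aa, n_ab, n_bb):
            """Chi-square lack-of-fit to Hardy-Weinberg for one biallelic locus."""
            n = n_aa + n_ab + n_bb
            p = (2 * n_aa + n_ab) / (2 * n)          # frequency of allele A
            q = 1 - p
            expected = (p * p * n, 2 * p * q * n, q * q * n)
            return sum((o - e) ** 2 / e
                       for o, e in zip((n_aa, n_ab, n_bb), expected) if e > 0)

        # Hypothetical genotype counts; chi-square >> 3.84 (1 df) would be suspicious
        print(f"chi2 = {hwe_chi_square(280, 480, 240):.2f}")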

  8. Implementing GermWatcher, an enterprise infection control application.

    PubMed

    Doherty, Joshua; Noirot, Laura A; Mayfield, Jennie; Ramiah, Sridhar; Huang, Christine; Dunagan, Wm Claiborne; Bailey, Thomas C

    2006-01-01

    Automated surveillance tools can provide significant advantages to infection control practitioners. When stored in a relational database, the data collected can also be used to support numerous research and quality improvement opportunities. A previously described electronic infection control surveillance system was remodeled to provide multi-hospital support, an XML-based rule set, and interoperability with an enterprise terminology server. This paper describes the new architecture being used at hospitals across BJC HealthCare.

  9. Recent Developments in Hyperspectral Imaging for Assessment of Food Quality and Safety

    PubMed Central

    Huang, Hui; Liu, Li; Ngadi, Michael O.

    2014-01-01

    Hyperspectral imaging which combines imaging and spectroscopic technology is rapidly gaining ground as a non-destructive, real-time detection tool for food quality and safety assessment. Hyperspectral imaging could be used to simultaneously obtain large amounts of spatial and spectral information on the objects being studied. This paper provides a comprehensive review on the recent development of hyperspectral imaging applications in food and food products. The potential and future work of hyperspectral imaging for food quality and safety control is also discussed. PMID:24759119
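
    The spatial-plus-spectral data structure at the heart of hyperspectral imaging is simply a 3-D array. The cube, band indices and threshold below are invented for illustration of how spectra and band images are indexed.

        import numpy as np

        rng = np.random.default_rng(2)
        cube = rng.random((100, 100, 120))      # 100 x 100 pixels, 120 bands

        pixel_spectrum = cube[40, 60, :]        # full spectrum at one pixel
        band_image = cube[:, :, 35]             # spatial image at one wavelength

        # A simple two-band ratio of the kind used to separate defect from sound tissue
        ratio = cube[:, :, 80] / (cube[:, :, 30] + 1e-9)
        defect_mask = ratio > 1.5               # hypothetical decision threshold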

  10. Data quality can make or break a research infrastructure

    NASA Astrophysics Data System (ADS)

    Pastorello, G.; Gunter, D.; Chu, H.; Christianson, D. S.; Trotta, C.; Canfora, E.; Faybishenko, B.; Cheah, Y. W.; Beekwilder, N.; Chan, S.; Dengel, S.; Keenan, T. F.; O'Brien, F.; Elbashandy, A.; Poindexter, C.; Humphrey, M.; Papale, D.; Agarwal, D.

    2017-12-01

    Research infrastructures (RIs) commonly support observational data provided by multiple, independent sources. Uniformity in the data distributed by such RIs is important in most applications, e.g., in comparative studies using data from two or more sources. Achieving uniformity in terms of data quality is challenging, especially considering that many data issues are unpredictable and cannot be detected until a first occurrence of the issue. As a result, many data quality control activities within RIs require a manual, human-in-the-loop element, making quality control an expensive activity. Our motivating example is the FLUXNET2015 dataset - a collection of ecosystem-level carbon, water, and energy fluxes between land and atmosphere from over 200 sites around the world, some sites with over 20 years of data. About 90% of the human effort to create the dataset was spent on data quality related activities. Based on this experience, we have been working on solutions to increase the automation of data quality control procedures. Since it is nearly impossible to fully automate all quality-related checks, we have been drawing from the experience with techniques used in software development, which shares a few common constraints. In both managing scientific data and writing software, human time is a precious resource; code bases, like science datasets, can be large, complex, and full of errors; both scientific and software endeavors can be pursued by individuals, but collaborative teams can accomplish a lot more. The lucrative and fast-paced nature of the software industry fueled the creation of methods and tools to increase automation and productivity within these constraints. Issue tracking systems, methods for translating problems into automated tests, and powerful version control tools are a few examples. Terrestrial and aquatic ecosystems research relies heavily on many types of observational data. As the volume of data collected increases, ensuring data quality is becoming an unwieldy challenge for RIs, and business-as-usual approaches to data quality do not work with larger data volumes. We believe RIs can benefit greatly from adapting and imitating this body of theory and practice from software quality into data quality, enabling systematic and reproducible safeguards against errors and mistakes in datasets as much as in software.
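
    The software-testing analogy can be made literal by expressing data-quality rules as assertion-based checks, in the style of a unit-test suite. The variable names and physical bounds below are hypothetical.

        import numpy as np

        def check_physical_range(values, low, high, name):
            """Fail if any observation falls outside plausible physical bounds."""
            bad = np.flatnonzero((values < low) | (values > high))
            assert bad.size == 0, f"{name}: {bad.size} value(s) outside [{low}, {high}]"

        def check_monotonic(timestamps):
            """Fail on duplicated or out-of-order records."""
            assert np.all(np.diff(timestamps) > 0), "timestamps not strictly increasing"

        # Hypothetical half-hourly CO2 flux record (umol m-2 s-1); 250.0 is implausible
        flux = np.array([1.2, -3.4, 2.2, 250.0, 0.7])
        check_physical_range(flux, low=-100, high=100, name="co2_flux")  # raises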

  11. [Compatibility of different quality control systems].

    PubMed

    Invernizzi, Enrico

    2002-01-01

    Management of the good laboratory practice (GLP) quality system presupposes linking it to a basic, recognized, and approved quality system from which it can draw management procedures common to all quality systems, such as the ISO 9000 series of standards. A quality system organized in this way can also be integrated with other dedicated quality systems, or parts of them, to obtain principles or management procedures for specific topics. The aim of this organization is to set up a reliable, recognized quality system compatible with the principles of GLP and other quality management systems, which provides users with a simplified set of easily accessible management tools and answers. The organization of this quality system is set out in the quality assurance programme, which is the document in which the test facility incorporates the GLP principles into its own quality organization.

  12. A new generation of tools for search, recovery and quality evaluation of World Wide Web medical resources.

    PubMed

    Aguillo, I

    2000-01-01

    Although the Internet is already a valuable information resource in medicine, there are important challenges to be faced before physicians and general users will have extensive access to this information. As a result of a research effort to compile a health-related Internet directory, new tools and strategies have been developed to solve key problems derived from the explosive growth of medical information on the Net and the great concern over the quality of such critical information. The current Internet search engines lack some important capabilities. We suggest using second generation tools (client-side based) able to deal with large quantities of data and to increase the usability of the records recovered. We tested the capabilities of these programs to solve health-related information problems, recognising six groups according to the kind of topics addressed: Z39.50 clients, downloaders, multisearchers, tracing agents, indexers and mappers. The evaluation of the quality of health information available on the Internet could require a large amount of human effort. A possible solution may be to use quantitative indicators based on the hypertext visibility of the Web sites. The cybermetric measures are valid for quality evaluation if they are derived from indirect peer review by experts with Web pages citing the site. The hypertext links acting as citations need to be extracted from a controlled sample of quality super-sites.
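
    A toy sketch of the "hypertext visibility" indicator described here: score a site by how many pages in a controlled sample of quality super-sites link to it. The sample links and the crude host extraction below are illustrative assumptions, not the authors' cybermetric method.

    ```python
    # Count inbound links from a vetted sample of "super-sites" as a rough
    # visibility score; the link sample and scoring are invented for the sketch.
    from collections import Counter

    # outbound links extracted from a controlled sample of quality super-sites
    links_from_supersites = [
        "https://siteA.example/guide", "https://siteB.example/",
        "https://siteA.example/faq", "https://siteC.example/",
        "https://siteA.example/guide",
    ]

    def site_of(url):
        return url.split("/")[2]          # crude host extraction

    visibility = Counter(site_of(u) for u in links_from_supersites)
    print(visibility.most_common())       # higher counts ~ more peer citations
    ```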

  13. PACOM: A Versatile Tool for Integrating, Filtering, Visualizing, and Comparing Multiple Large Mass Spectrometry Proteomics Data Sets.

    PubMed

    Martínez-Bartolomé, Salvador; Medina-Aunon, J Alberto; López-García, Miguel Ángel; González-Tejedo, Carmen; Prieto, Gorka; Navajas, Rosana; Salazar-Donate, Emilio; Fernández-Costa, Carolina; Yates, John R; Albar, Juan Pablo

    2018-04-06

    Mass-spectrometry-based proteomics has evolved into a high-throughput technology in which numerous large-scale data sets are generated from diverse analytical platforms. Furthermore, several scientific journals and funding agencies have emphasized the storage of proteomics data in public repositories to facilitate its evaluation, inspection, and reanalysis. (1) As a consequence, public proteomics data repositories are growing rapidly. However, tools are needed to integrate multiple proteomics data sets to compare different experimental features or to perform quality control analysis. Here, we present a new Java stand-alone tool, Proteomics Assay COMparator (PACOM), that is able to import, combine, and simultaneously compare numerous proteomics experiments to check the integrity of the proteomic data as well as verify data quality. With PACOM, the user can detect sources of error that may have been introduced in any step of a proteomics workflow and that influence the final results. Data sets can be easily compared and integrated, and data quality and reproducibility can be visually assessed through a rich set of graphical representations of proteomics data features as well as a wide variety of data filters. Its flexibility and easy-to-use interface make PACOM a unique tool for daily use in a proteomics laboratory. PACOM is available at https://github.com/smdb21/pacom.

  14. The construction of control chart for PM10 functional data

    NASA Astrophysics Data System (ADS)

    Shaadan, Norshahida; Jemain, Abdul Aziz; Deni, Sayang Mohd

    2014-06-01

    In this paper, a statistical procedure to construct a control chart for monitoring air quality (PM10) using functional data is proposed. A set of daily indices representing the daily PM10 curves was obtained using Functional Principal Component Analysis (FPCA). By means of an iterative charting procedure, a reference data set that represented a stable PM10 process was obtained. These data were then used as a reference for monitoring future data. The procedure was applied to seven years (2004-2010) of recorded data from the Klang air quality monitoring station located in the Klang Valley region of Peninsular Malaysia. The study showed that the control chart provided a useful visualization tool for monitoring air quality and was capable of detecting abnormality in the process system. In the case of the Klang station, the results showed that, with reference to 2004-2008, the air quality (PM10) in 2010 was better than that in 2009.
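
    As a rough sketch of the charting idea, the snippet below summarizes each simulated daily curve by its first principal-component score (ordinary PCA on discretized curves, a simplification of the FPCA used in the paper) and flags days whose score falls outside 3-sigma limits computed from a reference period. All data and limits are invented for illustration.

    ```python
    # PCA-based stand-in for the paper's FPCA indices: chart each day's first
    # principal-component score against 3-sigma limits from a reference period.
    import numpy as np

    rng = np.random.default_rng(0)
    days, hours = 120, 24
    diurnal = 50.0 + 10.0 * np.sin(np.linspace(0.0, 2.0 * np.pi, hours))
    level = rng.normal(0.0, 5.0, (days, 1))          # day-to-day pollution level
    curves = diurnal + level + rng.normal(0.0, 3.0, (days, hours))
    curves[110:] += 30.0                             # simulated polluted episode

    centered = curves - curves[:90].mean(axis=0)     # center on reference days
    _, _, vt = np.linalg.svd(centered[:90], full_matrices=False)
    score = centered @ vt[0]                         # first daily index

    cl, sd = score[:90].mean(), score[:90].std(ddof=1)
    signal = np.where(np.abs(score - cl) > 3.0 * sd)[0]
    print("days signalling:", signal)
    ```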

  15. Apprentice Machine Theory Outline.

    ERIC Educational Resources Information Center

    Connecticut State Dept. of Education, Hartford. Div. of Vocational-Technical Schools.

    This volume contains outlines for 16 courses in machine theory that are designed for machine tool apprentices. Addressed in the individual course outlines are the following topics: basic concepts; lathes; milling machines; drills, saws, and shapers; heat treatment and metallurgy; grinders; quality control; hydraulics and pneumatics;…

  16. Comparative mass spectrometry & nuclear magnetic resonance metabolomic approaches for nutraceuticals quality control analysis: a brief review.

    PubMed

    Farag, Mohamed A

    2014-01-01

    The number of botanical dietary supplements in the market has recently increased, primarily due to increased health awareness. Standardization and quality control of the constituents of these plant extracts is an important topic, particularly when such ingredients are used long term as dietary supplements, or in cases where higher doses are marketed as drugs. The development of fast, comprehensive, and effective untargeted analytical methods for plant extracts is of high interest. Nuclear magnetic resonance spectroscopy and mass spectrometry are the most informative tools, each of which enables high-throughput and global analysis of hundreds of metabolites in a single step. Although only one of the two techniques is utilized in the majority of plant metabolomics applications, there is a growing interest in combining the data from both platforms to effectively unravel the complexity of plant samples. The application of combined MS and NMR in the quality control of nutraceuticals forms the major part of this review. Finally, I look at future developments and perspectives of these two technologies for the quality control of herbal materials.

  17. Rationale for Quality Assurance in Fecal Egg Monitoring of Soil-Transmitted Helminthiasis.

    PubMed

    Hoekendijk, David J L; Hill, Philip C; Sowerby, Stephen J

    2016-09-07

    Substantial investment has been made into the once "neglected" tropical disease, soil-transmitted helminthiasis, and into control programs that operate within a framework of mapping baseline disease distribution, measuring the effectiveness of applied interventions, establishing when to cease drug administration, and for posttreatment evaluations. However, critical to each of these stages is the determination of helminth infection. The limitations of traditional microscope-based fecal egg diagnostics have not provided quality assurance in the monitoring of parasite disease and suboptimal treatment regimes provide for the potential development of parasite resistance to anthelmintic drugs. Improved diagnostic and surveillance tools are required to protect therapeutic effectiveness and to maintain funder confidence. Such tools may be on the horizon with emergent technologies that offer potential for enhanced visualization and quality-assured quantitation of helminth eggs. © The American Society of Tropical Medicine and Hygiene.

  18. Development and Testing of Control Laws for the Active Aeroelastic Wing Program

    NASA Technical Reports Server (NTRS)

    Dibley, Ryan P.; Allen, Michael J.; Clarke, Robert; Gera, Joseph; Hodgkinson, John

    2005-01-01

    The Active Aeroelastic Wing research program was a joint program between the U.S. Air Force Research Laboratory and NASA established to investigate the characteristics of an aeroelastic wing and the technique of using wing twist for roll control. The flight test program employed an F/A-18 aircraft modified by reducing the wing torsional stiffness and adding a custom research flight control system. The research flight control system was optimized to maximize roll rate using only wing surfaces to twist the wing while simultaneously maintaining design load limits, stability margins, and handling qualities. NASA Dryden Flight Research Center developed control laws using the software design tool called CONDUIT, which employs multi-objective function optimization to tune selected control system design parameters. Modifications were made to the Active Aeroelastic Wing implementation in this software design tool to incorporate the NASA Dryden Flight Research Center nonlinear F/A-18 simulation for time history analysis. This paper describes the design process, including how the control law requirements were incorporated into constraints for the optimization. Predicted performance is also compared to results from flight.
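
    CONDUIT itself is a NASA tool; the snippet below only illustrates the general idea of tuning control-system design parameters against multiple requirements, using an invented second-order plant, a hard overshoot constraint standing in for a load limit, and a weighted cost over the remaining specs.

    ```python
    # Toy multi-spec gain tuning: brute-force search of (kp, kd) subject to a
    # hard overshoot constraint, minimizing a weighted cost. Plant, specs, and
    # weights are invented for the sketch; this is not CONDUIT's algorithm.
    import numpy as np

    t = np.linspace(0.0, 10.0, 1000)
    dt = t[1] - t[0]

    def step_response(kp, kd):
        """Euler-integrate the toy plant y'' = kp*(1 - y) - kd*y'."""
        y, yd, out = 0.0, 0.0, []
        for _ in t:
            ydd = kp * (1.0 - y) - kd * yd
            yd += ydd * dt
            y += yd * dt
            out.append(y)
        return np.array(out)

    best = None
    for kp in np.linspace(0.5, 10.0, 20):
        for kd in np.linspace(0.5, 10.0, 20):
            y = step_response(kp, kd)
            overshoot = max(y.max() - 1.0, 0.0)
            if overshoot > 0.10:                 # hard spec, e.g. a load limit
                continue
            idx = int(np.argmax(y >= 0.9))
            if y[idx] < 0.9:                     # never reached 90% of the step
                continue
            cost = idx * dt + 5.0 * overshoot    # weighted soft specs
            if best is None or cost < best[0]:
                best = (cost, kp, kd)

    print("best (cost, kp, kd):", best)
    ```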

  19. Some aspects of precise laser machining - Part 1: Theory

    NASA Astrophysics Data System (ADS)

    Wyszynski, Dominik; Grabowski, Marcin; Lipiec, Piotr

    2018-05-01

    The paper describes the role of laser beam polarization and deflection in the quality of laser-beam-machined parts made of difficult-to-cut materials (used for cutting tools). Application of an efficient and precise cutting tool (the laser beam) has a significant impact on the preparation and finishing of cutting tools for aviation part manufacturing. Understanding the phenomena occurring in polarized-light laser cutting made it possible to design, build, and test opto-mechanical instrumentation to control and maintain process parameters and conditions. The research was carried out within the INNOLOT program funded by the Polish National Centre for Research and Development.

  20. Implementing a Data Quality Strategy to Simplify Access to Data

    NASA Astrophysics Data System (ADS)

    Druken, K. A.; Trenham, C. E.; Evans, B. J. K.; Richards, C. J.; Wang, J.; Wyborn, L. A.

    2016-12-01

    To ensure seamless programmatic access for data analysis (including machine learning), standardization of both data and services is vital. At the Australian National Computational Infrastructure (NCI) we have developed a Data Quality Strategy (DQS) that currently provides processes for: (1) the consistency of data structures in the underlying High Performance Data (HPD) platform; (2) quality control through compliance with recognized community standards; and (3) data quality assurance through demonstrated functionality across common platforms, tools and services. NCI hosts one of Australia's largest repositories (10+ PBytes) of research data collections spanning datasets from climate, coasts, oceans and geophysics through to astronomy, bioinformatics and the social sciences. A key challenge is the application of community-agreed data standards to the broad set of Earth systems and environmental data that are being used. Within these disciplines, data span a wide range of gridded, ungridded (i.e., line surveys, point clouds), and raster image types, as well as diverse coordinate reference projections and resolutions. By implementing our DQS we have seen progressive improvement in the quality of the datasets across the different subject domains, and through this, the ease with which users can programmatically access the data, either in situ or via web services. As part of its quality control procedures, NCI has developed a compliance checker based upon existing domain standards. The DQS also includes extensive Functionality Testing, which includes readability by commonly used libraries (e.g., netCDF, HDF, GDAL); accessibility by data servers (e.g., THREDDS, Hyrax, GeoServer); validation against scientific analysis and programming platforms (e.g., Python, Matlab, QGIS); and visualization tools (e.g., ParaView, NASA Web World Wind). These tests ensure smooth interoperability between products and services as well as exposing unforeseen requirements and dependencies. The results provide an important component of quality control within the DQS as well as clarifying the requirement for any extensions to the relevant standards that help support the uptake of data by broader international communities.
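
    As an illustration of the compliance-checking step, the sketch below verifies a small, assumed subset of global and per-variable attributes in a netCDF file using the netCDF4 library; it is not the actual NCI checker, and the required-attribute lists are invented.

    ```python
    # Attribute-level compliance sketch with the netCDF4 library; the required
    # attribute lists are an assumed subset of CF/ACDD conventions, not the
    # actual NCI checker rules.
    from netCDF4 import Dataset  # pip install netCDF4

    REQUIRED_GLOBAL = ("title", "summary", "Conventions")
    REQUIRED_PER_VARIABLE = ("units", "long_name")

    def check_file(path):
        """Return a list of human-readable compliance problems for one file."""
        problems = []
        with Dataset(path) as ds:
            for attr in REQUIRED_GLOBAL:
                if attr not in ds.ncattrs():
                    problems.append(f"missing global attribute: {attr}")
            for name, var in ds.variables.items():
                for attr in REQUIRED_PER_VARIABLE:
                    if attr not in var.ncattrs():
                        problems.append(f"variable {name}: missing {attr}")
        return problems

    for issue in check_file("example.nc"):   # path to a local netCDF file
        print(issue)
    ```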

  1. The Development of the CONDUIT Advanced Control System Design and Evaluation Interface with a Case Study Application to an Advanced Fly by Wire Helicopter Design

    NASA Technical Reports Server (NTRS)

    Colbourne, Jason

    1999-01-01

    This report details the development and use of CONDUIT (Control Designer's Unified Interface). CONDUIT is a design tool created at Ames Research Center for the purpose of evaluating and optimizing aircraft control systems against handling qualities. Three detailed design problems addressing the RASCAL UH-60A Black Hawk are included in this report to show the application of CONDUIT to helicopter control system design.

  2. Does the use of the revised psychosocial assessment tool (PATrev) result in improved quality of life and reduced psychosocial risk in Canadian families with a child newly diagnosed with cancer?

    PubMed

    Barrera, M; Hancock, K; Rokeach, A; Atenafu, E; Cataudella, D; Punnett, A; Johnston, D; Cassidy, M; Zelcer, S; Silva, M; Jansen, P; Bartels, U; Nathan, P C; Shama, W; Greenberg, C

    2014-02-01

    Early psychosocial screening may guide interventions and ameliorate the adverse psychosocial effects of childhood cancer. The revised psychosocial assessment tool provides risk information - Universal (typical distress), Targeted (additional specific distress), and Clinical (severe distress) - about the child with cancer and his or her family. This pilot study investigated the benefits of providing a summary of family psychosocial risk information to the medical team treating the newly diagnosed child (Experimental Group, EG). We conducted a pilot randomized controlled trial with a sample of 67 parents, comparing the EG to the control group (CG) on parental perception of family psychosocial difficulties (revised psychosocial assessment tool risk levels), child behavior (behavior assessment scale for children-2), pediatric quality of life (PedsQL), and parental anxiety (state-anxiety scale of the state-trait anxiety inventory), 2-4 weeks after diagnosis (Time 1) and 6 months later (Time 2). Compared to the CG, participants in the EG had significantly reduced targeted and clinical risk (p < 0.001) and improved pain-related PedsQL at Time 2 (p < 0.05). Scores for PedsQL total and nearly all subscales improved over time in both groups (p < 0.05 to p < 0.001). No changes in behavior scores were noted. Preliminary findings suggest that providing a summary of the Psychosocial Assessment Tool to the treating team shortly after diagnosis may help reduce family-wide psychosocial risk 6 months later and improve quality of life related to pain for children who are undergoing treatment for cancer. Copyright © 2013 John Wiley & Sons, Ltd.

  3. Use of ecological momentary assessment to determine which structural factors impact perceived teaching quality of attending rounds.

    PubMed

    Willett, Lisa; Houston, Thomas K; Heudebert, Gustavo R; Estrada, Carlos

    2012-09-01

    Providing high-quality teaching to residents during attending rounds is challenging. Reasons include structural factors that affect rounds and are beyond the attending's teaching style and control. Our aim was to develop a new evaluation tool to identify the structural components of ward rounds that most affect teaching quality in an internal medicine (IM) residency program. We developed a 10-item Ecological Momentary Assessment (EMA) tool and collected daily evaluations for 18 months from IM residents rotating on inpatient services. Residents ranked the quality of teaching on rounds that day and answered questions related to their service (general medicine, medical intensive care unit, and subspecialty services), patient census, absenteeism of team members, call status, and the number of teaching methods used by the attending. Residents completed 488 evaluation cards over 18 months. We found no association between perceived teaching quality and training level, team absenteeism, or call status. We observed differences by service (P < .001) and patient census (P = .009). After adjusting for type of service, census was no longer significant. Use of a larger variety of teaching methods was associated with higher perceived teaching quality, regardless of service or census (P for trend < .001). The EMA tool successfully identified that higher patient census was associated with lower perceived teaching quality, but the results were also influenced by the type of teaching service. We found that, regardless of census or teaching service, attendings can improve their teaching by diversifying the methods used in daily rounds.

  4. The Empower project - a new way of assessing and monitoring test comparability and stability.

    PubMed

    De Grande, Linde A C; Goossens, Kenneth; Van Uytfanghe, Katleen; Stöckl, Dietmar; Thienpont, Linda M

    2015-07-01

    Manufacturers and laboratories might benefit from using a modern integrated tool for quality management/assurance. The tool should not be confounded by commutability issues and should focus on the intrinsic analytical quality and comparability of assays as performed in routine laboratories. In addition, it should enable monitoring of the long-term stability of performance, with the possibility of quasi-real-time remedial action. Therefore, we developed the "Empower" project. The project comprises four pillars: (i) master comparisons with panels of frozen single-donation samples, (ii) monitoring of patient percentiles, (iii) monitoring of internal quality control data, and (iv) conceptual and statistical education about analytical quality. In the pillars described here (i and ii), state-of-the-art as well as biologically derived specifications are used. In the 2014 master comparisons survey, 125 laboratories forming 8 peer groups participated. It showed not only good intrinsic analytical quality of assays but also assay biases/non-comparability. Although laboratory performance was mostly satisfactory, sometimes huge between-laboratory differences were observed. In patient percentile monitoring, 100 laboratories currently participate with 182 devices. In particular, laboratories with a high daily throughput and low patient population variation show a stable moving median in time with good between-instrument concordance. Shifts/drifts due to lot changes are sometimes revealed. There is evidence that outpatient medians mirror the calibration set-points shown in the master comparisons. The Empower project gives manufacturers and laboratories a realistic view of assay quality/comparability as well as stability of performance and/or the reasons for increased variation. Therefore, it is a modern tool for quality management/assurance toward improved patient care.
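
    A minimal sketch of the patient-percentile idea (pillar ii): track a moving median of daily outpatient results and raise an alarm when it drifts from its target, e.g., after a reagent lot change. The analyte, window, and tolerance below are invented for illustration.

    ```python
    # Moving median of daily outpatient medians with a fixed tolerance band;
    # the analyte, window, and limits are illustrative assumptions.
    import numpy as np

    def moving_median(x, window=20):
        """Median of the trailing `window` values at each position."""
        return np.array([np.median(x[max(0, i - window + 1):i + 1])
                         for i in range(len(x))])

    rng = np.random.default_rng(1)
    stable = rng.normal(140.0, 2.0, 200)   # daily sodium medians, mmol/L
    biased = rng.normal(143.0, 2.0, 50)    # shift after a reagent lot change
    med = moving_median(np.concatenate([stable, biased]))

    target, tol = 140.0, 2.0               # allowed drift of the moving median
    alarms = np.where(np.abs(med[20:] - target) > tol)[0] + 20
    print("first alarm on day:", alarms[0] if alarms.size else "none")
    ```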

  5. Software development environments: Status and trends

    NASA Technical Reports Server (NTRS)

    Duffel, Larry E.

    1988-01-01

    Currently, software engineers are the essential integrating factor tying several components together. The components consist of processes, methods, computers, tools, support environments, and software engineers. Today the engineers empower the tools rather than the tools empowering the engineers. Among the issues in software engineering are quality, management of the software engineering process, and productivity. A strategy for addressing these issues is to promote the evolution of software engineering from an ad hoc, labor-intensive activity to a managed, technology-supported discipline. This strategy may be implemented by putting the process under management control, adopting appropriate methods, inserting the technology that provides automated support for the process and methods, collecting automated tools into an integrated environment, and educating the personnel.

  6. Enhanced methodology of focus control and monitoring on scanner tool

    NASA Astrophysics Data System (ADS)

    Chen, Yen-Jen; Kim, Young Ki; Hao, Xueli; Gomez, Juan-Manuel; Tian, Ye; Kamalizadeh, Ferhad; Hanson, Justin K.

    2017-03-01

    As technology nodes shrink from 14nm to 7nm, the reliability of the tool monitoring techniques used in advanced semiconductor fabs to achieve high yield and quality becomes more critical. Tool health monitoring methods involve periodic sampling of moderately processed test wafers to check for particles, defects, and tool stability in order to ensure proper tool health. For lithography TWINSCAN scanner tools, the requirements for overlay stability and focus control are very strict. Current scanner tool health monitoring methods include running BaseLiner to ensure proper tool stability on a periodic basis. The focus measurement on YIELDSTAR by real-time or library-based reconstruction of critical dimensions (CD) and side wall angle (SWA) has been demonstrated as an accurate metrology input to the control loop. The high accuracy and repeatability of the YIELDSTAR focus measurement provide a common reference for scanner setup and user processes. To further improve the metrology and matching performance, Diffraction Based Focus (DBF) metrology, enabling accurate, fast, and non-destructive focus acquisition, has been successfully utilized for focus monitoring/control of TWINSCAN NXT immersion scanners. The optimal DBF target was determined to have minimized dose crosstalk, dynamic precision, set-get residual, and lens aberration sensitivity. By exploiting this new measurement target design, 80% improvement in tool-to-tool matching, >16% improvement in run-to-run mean focus stability, and >32% improvement in focus uniformity have been demonstrated compared to the previous BaseLiner methodology. Matching <2.4 nm across multiple NXT immersion scanners has been achieved with the new methodology of setting a baseline reference. This baseline technique, with either the conventional BaseLiner low numerical aperture (NA=1.20) mode or the advanced illumination high NA mode (NA=1.35), has also been evaluated to have consistent performance. This enhanced methodology of focus control and monitoring on multiple illumination conditions opens an avenue to significantly reduce Focus-Exposure Matrix (FEM) wafer exposures for new product/layer best focus (BF) setup.

  7. System technology for laser-assisted milling with tool integrated optics

    NASA Astrophysics Data System (ADS)

    Hermani, Jan-Patrick; Emonts, Michael; Brecher, Christian

    2013-02-01

    High strength metal alloys and ceramics offer a huge potential for increased efficiency (e.g., engine components for aerospace or gas turbines). However, mass application is still hampered by cost- and time-consuming end-machining due to long processing times and high tool wear. Laser-induced heating shortly before machining can reduce the material strength and improve machinability significantly. The Fraunhofer IPT has developed and successfully realized a new approach for laser-assisted milling with spindle- and tool-integrated, co-rotating optics. The novel optical system inside the tool consists of one deflection prism to position the laser spot in front of the cutting insert and one focusing lens. Using a fiber laser with high beam quality, the laser spot diameter can be precisely adjusted to the chip size. Highly dynamic adaptation of the laser power signal to the engagement condition of the cutting tool was realized so that already-machined workpiece material is not irradiated. During tool engagement, the laser power is controlled in proportion to the current material removal rate, which has to be calculated continuously. The needed geometric values are generated by a CAD/CAM program and converted into a laser power signal by a real-time controller. The developed milling tool with integrated optics and the algorithm for laser power control enable multi-axis laser-assisted machining of complex parts.
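
    The control law described, laser power proportional to the current material removal rate, can be sketched in a few lines; the gain and power limits below are invented values, not the Fraunhofer IPT parameters.

    ```python
    # Power scheduling proportional to material removal rate, clamped to the
    # source limits; gain and limits are invented for the sketch.
    def laser_power(mrr_mm3_s, k_w_per_mm3_s=12.0, p_min=0.0, p_max=500.0):
        """Map a material removal rate to a commanded laser power in watts."""
        return min(max(k_w_per_mm3_s * mrr_mm3_s, p_min), p_max)

    # Engagement profile from CAD/CAM: MRR per control tick (mm^3/s)
    for mrr in (0.0, 5.0, 20.0, 60.0):
        print(f"MRR {mrr:5.1f} -> P = {laser_power(mrr):5.1f} W")
    ```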

  8. Innovative Quality-Assurance Strategies for Tuberculosis Surveillance in the United States

    PubMed Central

    Manangan, Lilia Ponce; Tryon, Cheryl; Magee, Elvin; Miramontes, Roque

    2012-01-01

    Introduction. The Centers for Disease Control and Prevention (CDC)'s National Tuberculosis Surveillance System (NTSS) is the national repository of tuberculosis (TB) data in the United States. Jurisdictions report to NTSS through the Report of Verified Case of Tuberculosis (RVCT) form, which transitioned to a web-based system in 2009. Materials and Methods. To improve RVCT data quality, CDC conducted a quality assurance (QA) needs assessment to develop QA strategies. These include QA components (case detection, data accuracy, completeness, timeliness, data security, and confidentiality); sample tools such as the National TB Indicators Project (NTIP) to identify TB case reporting discrepancies; a comprehensive training course; and a resource guide and toolkit. Results and Discussion. During July–September 2011, 73 staff from 34 (57%) of 60 reporting jurisdictions participated in QA training. Participants noted the usefulness of sharing jurisdictions' QA methods; 66 (93%) wrote that the QA tools would be effective for their activities. Several jurisdictions reported implementation of QA tools pertinent to their programs. Data from Secure Access Management Services, which monitors system usage, showed a >8% increase in NTSS and NTIP enrollment from August 2011 to February 2012. Conclusions. Despite challenges imposed by web-based surveillance systems, QA strategies can be developed with innovation and collaboration. These strategies can also be used by other disease programs to ensure high data quality. PMID:22685648

  9. Assessing Subjectivity in Sensor Data Post Processing via a Controlled Experiment

    NASA Astrophysics Data System (ADS)

    Jones, A. S.; Horsburgh, J. S.; Eiriksson, D.

    2017-12-01

    Environmental data collected by in situ sensors must be reviewed to verify validity, and conducting quality control often requires making edits in post processing to generate approved datasets. This process involves decisions by technicians, data managers, or data users on how to handle problematic data. Options include removing data from a series, retaining data with annotations, and altering data based on algorithms related to adjacent data points or the patterns of data at other locations or of other variables. Ideally, given the same dataset and the same quality control guidelines, multiple data quality control technicians would make the same decisions in data post processing. However, despite the development and implementation of guidelines aimed to ensure consistent quality control procedures, we have faced ambiguity when performing post processing, and we have noticed inconsistencies in the practices of individuals performing quality control post processing. Technicians with the same level of training and using the same input datasets may produce different results, affecting the overall quality and comparability of finished data products. Different results may also be produced by technicians that do not have the same level of training. In order to assess the effect of subjective decision making by the individual technician on the end data product, we designed an experiment in which multiple users performed quality control post processing on the same datasets using a consistent set of guidelines, field notes, and tools. We also assessed the effect of technician experience and training by conducting the same procedures with a group of novices unfamiliar with the data and the quality control process and comparing their results to those generated by a group of more experienced technicians. In this presentation, we report our observations of the degree of subjectivity in sensor data post processing, assessing and quantifying the impacts of the individual technician, and of technician experience, on quality-controlled data products.

  10. Monitoring scale scores over time via quality control charts, model-based approaches, and time series techniques.

    PubMed

    Lee, Yi-Hsuan; von Davier, Alina A

    2013-07-01

    Maintaining a stable score scale over time is critical for all standardized educational assessments. Traditional quality control tools and approaches for assessing scale drift either require special equating designs, or may be too time-consuming to be considered on a regular basis with an operational test that has a short time window between an administration and its score reporting. Thus, the traditional methods are not sufficient to catch unusual testing outcomes in a timely manner. This paper presents a new approach for score monitoring and assessment of scale drift. It involves quality control charts, model-based approaches, and time series techniques to accommodate the following needs of monitoring scale scores: continuous monitoring, adjustment of customary variations, identification of abrupt shifts, and assessment of autocorrelation. Performance of the methodologies is evaluated using manipulated data based on real responses from 71 administrations of a large-scale high-stakes language assessment.
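
    One standard charting option for this kind of score monitoring is an EWMA chart, which accumulates small shifts across administrations; the sketch below uses the textbook EWMA recursion and steady-state limits on simulated scale scores, and is not the authors' exact method.

    ```python
    # Textbook EWMA chart on simulated scale scores; lambda, limits, and the
    # drift scenario are illustrative, not the authors' operational settings.
    import numpy as np

    rng = np.random.default_rng(2)
    scores = np.concatenate([rng.normal(500.0, 4.0, 50),   # stable scale
                             rng.normal(504.0, 4.0, 21)])  # drift begins

    lam, mu0, sigma = 0.2, 500.0, 4.0
    halfwidth = 3 * sigma * np.sqrt(lam / (2 - lam))  # steady-state 3-sigma limit
    z = mu0
    for i, x in enumerate(scores):
        z = lam * x + (1 - lam) * z                   # EWMA recursion
        if abs(z - mu0) > halfwidth:
            print(f"administration {i}: EWMA {z:.1f} beyond +/-{halfwidth:.1f}")
            break
    ```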

  11. Improving the Quality of Welding Seam of Automatic Welding of Buckets Based on TCP

    NASA Astrophysics Data System (ADS)

    Hu, Min

    2018-02-01

    Since February 2014, welding defects had frequently appeared on the automatic welding line for buckets. The average repair time for each bucket was 26 min, which seriously affected production efficiency and welding quality. We conducted troubleshooting and found that the main causes of the welding defects were deviations in the center points of the robot tools and the poor quality of the locating welds. We corrected the grippers, welding torches, and repeat-positioning accuracy of the robots to control the quality of the locating welding. The welding defect rate of the buckets was greatly reduced, ensuring production efficiency and welding quality.

  12. Technology to improve quality and accountability.

    PubMed

    Kay, Jonathan

    2006-01-01

    A body of evidence has been accumulated to demonstrate that current practice is not sufficiently safe for several stages of central laboratory testing. In particular, while analytical and perianalytical steps that take place within the laboratory are subjected to quality control procedures, this is not the case for several pre- and post-analytical steps. The ubiquitous application of auto-identification technology seems to represent a valuable tool for reducing error rates. A series of projects in Oxford has attempted to improve processes which support several areas of laboratory medicine, including point-of-care testing, blood transfusion, delivery and interpretation of reports, and support of decision-making by clinicians. The key tools are auto-identification, Internet communication technology, process re-engineering, and knowledge management.

  13. Database and Analytical Tool Development for the Management of Data Derived from US DOE (NETL) Funded Fine Particulate (PM2.5) Research

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robinson Khosah

    2007-07-31

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project was conducted in two phases. Phase One included the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. Phase Two involved the development of a platform for on-line data analysis. Phase Two included the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now technically completed.

  14. Tools for surveying and improving the quality of life: people with special needs in focus.

    PubMed

    Hoyningen-Süess, Ursula; Oberholzer, David; Stalder, René; Brügger, Urs

    2012-01-01

    This article seeks to describe online tools for surveying and improving quality of life for people with disabilities living in assisted living centers and special education service organizations. Ensuring a decent quality of life for disabled people is an important welfare state goal. Using well-accepted quality of life conceptions, online diagnostic and planning tools were developed during an Institute for Education, University of Zurich, research project. The diagnostic tools measure, evaluate and analyze disabled people's quality of life. The planning tools identify factors that can affect their quality of life and suggest improvements. Instrument validity and reliability are not tested according to the standard statistical procedures. This will be done at a more advanced stage of the project. Instead, the tool is developed, refined and adjusted in cooperation with practitioners who are constantly judging it according to best practice standards. The tools support staff in assisted living centers and special education service organizations. These tools offer comprehensive resources for surveying, quantifying, evaluating, describing and simulating quality of life elements.

  15. Exploring Fusarium head blight disease control by RNA interference

    USDA-ARS?s Scientific Manuscript database

    RNA interference (RNAi) technology provides a novel tool to study gene function and plant protection strategies. Fusarium graminearum is the causal agent of Fusarium head blight (FHB), which reduces crop yield and quality by producing trichothecene mycotoxins including 3-acetyl deoxynivalenol (3-ADO...

  16. Evaluation of Fly Ash Quality Control Tools : Technical Summary

    DOT National Transportation Integrated Search

    2010-06-01

    Many entities currently use fly ash in portland cement concrete (PCC) pavements and structures. Although the body of knowledge is great concerning the use of fly ash, several projects per year are subject to poor performance where fly ash is n...

  17. Evaluation of fly ash quality control tools : tech summary.

    DOT National Transportation Integrated Search

    2010-06-01

    Many entities currently use fly ash in portland cement concrete (PCC) pavements and structures. Although the body of knowledge is great concerning the use of fly ash, several projects per year are subject to poor performance where fly ash is n...

  18. Texture, ride quality, and the uniformity of hot-mix asphalt pavements.

    DOT National Transportation Integrated Search

    2005-01-01

    Two years ago, the author completed a study with researchers at Virginia Tech that was designed to develop a tool to measure and control segregation of hot-mix asphalt pavements. This earlier work focused on the application of high-speed texture meas...

  19. Taking pictures to take control: Photovoice as a tool to facilitate empowerment among poor and racial/ethnic minority women with HIV

    PubMed Central

    Teti, Michelle; Pichon, Latrice; Kabel, Allison; Farnan, Rose; Binson, Diane

    2013-01-01

    Poor and racial/ethnic minority women comprise the majority of women living with HIV (WLH) in the United States. Race, gender, class, and HIV-based stigmas and inequities limit women's power over their health and compromise their quality of life. To help WLH counter this powerlessness, we implemented a photovoice project called Picturing New Possibilities (PNP), and explored how women experienced empowerment through photovoice. PNP participants (N = 30) photographed their life experiences, attended 3 group discussions and a community exhibit of their photos, and completed a follow-up interview. We used strategies of Grounded Theory to identify key empowerment themes. Participants described empowerment through enhanced self-esteem, self-confidence, critical thinking skills, and control. Our findings suggest that photovoice is an important tool for WLH. It offers women a way to access internal strengths and use these resources to improve their quality of life and health. PMID:24064314

  20. Prompt and Precise Prototyping

    NASA Technical Reports Server (NTRS)

    2003-01-01

    For Sanders Design International, Inc., of Wilton, New Hampshire, every passing second between the concept and realization of a product is essential to succeed in the rapid prototyping industry where amongst heavy competition, faster time-to-market means more business. To separate itself from its rivals, Sanders Design aligned with NASA's Marshall Space Flight Center to develop what it considers to be the most accurate rapid prototyping machine for fabrication of extremely precise tooling prototypes. The company's Rapid ToolMaker System has revolutionized production of high quality, small-to-medium sized prototype patterns and tooling molds with an exactness that surpasses that of computer numerically-controlled (CNC) machining devices. Created with funding and support from Marshall under a Small Business Innovation Research (SBIR) contract, the Rapid ToolMaker is a dual-use technology with applications in both commercial and military aerospace fields. The advanced technology provides cost savings in the design and manufacturing of automotive, electronic, and medical parts, as well as in other areas of consumer interest, such as jewelry and toys. For aerospace applications, the Rapid ToolMaker enables fabrication of high-quality turbine and compressor blades for jet engines on unmanned air vehicles, aircraft, and missiles.

  1. Improving the Quality of Hot Stamping Parts with Innovative Press Technology and Inline Process Control

    NASA Astrophysics Data System (ADS)

    Vollmer, R.; Palm, C.

    2017-09-01

    The increasing number of hot stamped parts in the automotive industry is challenging different process areas. This paper presents a method for improving production rates over the whole life cycle of a hot forming part. In the core element of a hot forming line, the hydraulic press, mainly two processing steps are performed: forming and quenching of the sheet metal part. In addition to the forming operation, it is essential to optimize the quenching condition at the bottom dead centre in order to reach a fully martensitic structure and tight geometrical tolerances of the part. Deviations in the blank thickness, tool wear, and polishing of classical tools impair the quenching condition and therefore the part quality over time. A new press and tool design has been developed to counter this effect by providing homogeneous contact pressure over the whole die. The new method is especially advantageous with a multi-cavity tool. Test series have shown that the new tool and press concept can produce parts with a blank thickness of 1.0 mm within an 8.0 s cycle time. The so-called PCH flex principle makes it possible to produce such high output rates under reliable conditions.

  2. Proportional integral derivative, modeling and ways of stabilization for the spark plasma sintering process

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Manière, Charles; Lee, Geuntak; Olevsky, Eugene A.

    The stability of the proportional–integral–derivative (PID) control of temperature in the spark plasma sintering (SPS) process is investigated. The PID regulation of this process is tested for different SPS tooling dimensions, physical parameter conditions, and areas of temperature control. It is shown that the PID regulation quality strongly depends on the heating time lag between the area of heat generation and the area of temperature control. Tooling temperature rate maps are studied to reveal potential areas for highly efficient PID control. The convergence of the model and experiment indicates that even with non-optimal initial PID coefficients, it is possible to reduce the temperature regulation inaccuracy to less than 4 K by positioning the temperature control location in highly responsive areas revealed by the finite-element calculations of the temperature spatial distribution.

  3. Proportional integral derivative, modeling and ways of stabilization for the spark plasma sintering process

    DOE PAGES

    Manière, Charles; Lee, Geuntak; Olevsky, Eugene A.

    2017-04-21

    The stability of the proportional–integral–derivative (PID) control of temperature in the spark plasma sintering (SPS) process is investigated. The PID regulation of this process is tested for different SPS tooling dimensions, physical parameter conditions, and areas of temperature control. It is shown that the PID regulation quality strongly depends on the heating time lag between the area of heat generation and the area of temperature control. Tooling temperature rate maps are studied to reveal potential areas for highly efficient PID control. The convergence of the model and experiment indicates that even with non-optimal initial PID coefficients, it is possible to reduce the temperature regulation inaccuracy to less than 4 K by positioning the temperature control location in highly responsive areas revealed by the finite-element calculations of the temperature spatial distribution.
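
    The paper's central observation, that regulation quality depends on the lag between heat generation and the sensed temperature, can be illustrated with a toy simulation: the same PID gains are run against a one-node thermal model whose sensor reading is delayed by a variable number of steps. All constants below are invented.

    ```python
    # Toy one-node thermal model with a delayed sensor; the same PID gains can
    # regulate well with no lag and degrade when the sensed temperature lags
    # the heated zone. All plant constants and gains are invented.
    import numpy as np

    def run_pid(lag_steps, kp=2.0, ki=0.05, kd=5.0, setpoint=1200.0, n=4000):
        temp, integ, prev_err = 300.0, 0.0, 0.0
        history = [temp] * (lag_steps + 1)
        for _ in range(n):
            measured = history[-1 - lag_steps]               # delayed reading
            err = setpoint - measured
            integ = min(max(integ + err, -20000.0), 20000.0) # anti-windup clamp
            power = kp * err + ki * integ + kd * (err - prev_err)
            power = min(max(power, 0.0), 2000.0)             # heater limits, W
            prev_err = err
            temp += 0.01 * power - 0.005 * (temp - 300.0)    # heat in, loss out
            history.append(temp)
        return np.array(history)

    for lag in (0, 50):
        tail = run_pid(lag)[-500:]
        print(f"lag {lag:2d} steps: final ripple = {tail.max() - tail.min():.1f} K")
    ```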

  4. Acupuncture and moxibustion for lateral elbow pain: a systematic review of randomized controlled trials

    PubMed Central

    2014-01-01

    Background Acupuncture and moxibustion have been widely used to treat lateral elbow pain (LEP). A comprehensive systematic review of randomized controlled trials (RCTs), covering both English and Chinese databases, was conducted to assess the efficacy of acupuncture and moxibustion in the treatment of LEP. Methods The revised STRICTA (2010) criteria were used to appraise the acupuncture procedures, and the Cochrane risk of bias tool was used to assess the methodological quality of the studies. A total of 19 RCTs that compared acupuncture and/or moxibustion with sham acupuncture, another form of acupuncture, or conventional treatment were included. Results All studies had at least one domain rated as high risk or uncertain risk of bias in the Cochrane risk of bias tool. Results from three RCTs of moderate quality showed that acupuncture was more effective than sham acupuncture. Results from 10 RCTs of mostly low quality showed that acupuncture or moxibustion was superior or equal to conventional treatment, such as local anesthetic injection, local steroid injection, non-steroidal anti-inflammatory drugs, or ultrasound. Six low-quality RCTs compared acupuncture combined with moxibustion against manual acupuncture alone, and all showed that the combination was superior. Conclusion Moderate-quality studies suggest that acupuncture is more effective than sham acupuncture. Interpretations of findings regarding acupuncture vs. conventional treatment, and acupuncture combined with moxibustion vs. manual acupuncture alone, are limited by the methodological quality of these studies. Future studies with improved methodological design are warranted to confirm the efficacy of acupuncture and moxibustion for LEP. PMID:24726029

  5. Effect-directed analysis supporting monitoring of aquatic environments--An in-depth overview.

    PubMed

    Brack, Werner; Ait-Aissa, Selim; Burgess, Robert M; Busch, Wibke; Creusot, Nicolas; Di Paolo, Carolina; Escher, Beate I; Mark Hewitt, L; Hilscherova, Klara; Hollender, Juliane; Hollert, Henner; Jonker, Willem; Kool, Jeroen; Lamoree, Marja; Muschket, Matthias; Neumann, Steffen; Rostkowski, Pawel; Ruttkies, Christoph; Schollee, Jennifer; Schymanski, Emma L; Schulze, Tobias; Seiler, Thomas-Benjamin; Tindall, Andrew J; De Aragão Umbuzeiro, Gisela; Vrana, Branislav; Krauss, Martin

    2016-02-15

    Aquatic environments are often contaminated with complex mixtures of chemicals that may pose a risk to ecosystems and human health. This contamination cannot be addressed with target analysis alone but tools are required to reduce this complexity and identify those chemicals that might cause adverse effects. Effect-directed analysis (EDA) is designed to meet this challenge and is attracting increasing interest in water and sediment quality monitoring. Thus, the present paper summarizes current experience with the EDA approach and the tools required, and provides practical advice on their application. The paper highlights the need for proper problem formulation and gives general advice for study design. As the EDA approach is directed by toxicity, basic principles for the selection of bioassays are given as well as a comprehensive compilation of appropriate assays, including their strengths and weaknesses. A specific focus is given to strategies for sampling, extraction and bioassay dosing since they strongly impact prioritization of toxicants in EDA. Reduction of sample complexity mainly relies on fractionation procedures, which are discussed in this paper, including quality assurance and quality control. Automated combinations of fractionation, biotesting and chemical analysis using so-called hyphenated tools can enhance the throughput and might reduce the risk of artifacts in laboratory work. The key to determining the chemical structures causing effects is analytical toxicant identification. The latest approaches, tools, software and databases for target-, suspect and non-target screening as well as unknown identification are discussed together with analytical and toxicological confirmation approaches. A better understanding of optimal use and combination of EDA tools will help to design efficient and successful toxicant identification studies in the context of quality monitoring in multiply stressed environments. Copyright © 2015 Elsevier B.V. All rights reserved.

  6. Artificial intelligence-based computer modeling tools for controlling slag foaming in electric arc furnaces

    NASA Astrophysics Data System (ADS)

    Wilson, Eric Lee

    Due to increased competition in a world economy, steel companies are currently interested in developing techniques that will allow for the improvement of the steelmaking process, either by increasing output efficiency, by improving the quality of their product, or both. Slag foaming is one practice that has been shown to contribute to both these goals. However, slag foaming is highly dynamic and difficult to model or control. This dissertation describes an effort to use artificial intelligence-based tools (genetic algorithms, fuzzy logic, and neural networks) to both model and control the slag foaming process. Specifically, a neural network is trained and tested on slag foaming data provided by a steel plant. This neural network model is then controlled by a fuzzy logic controller, which in turn is optimized by a genetic algorithm. The tuned controller was then installed at a steel plant, where it proved to be a more efficient slag foaming controller than the one previously used by the plant.
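
    As a highly simplified sketch of the tuning loop, the snippet below uses a genetic algorithm to search two controller parameters against a surrogate process model; the dissertation's trained neural network and fuzzy controller are replaced here by toy stand-ins, and every constant is invented.

    ```python
    # Genetic-algorithm tuning of a toy controller against a toy process model;
    # stand-ins for the work's neural-network model and fuzzy controller.
    import numpy as np

    rng = np.random.default_rng(3)

    def plant(height, carbon_rate):
        """Toy surrogate for the neural-network slag foam model."""
        return 0.95 * height + 0.1 * carbon_rate

    def cost(kp, ki, target=2.0, steps=200):
        """Sum of squared foam-height error under a PI-like controller."""
        height, integ, total = 0.5, 0.0, 0.0
        for _ in range(steps):
            err = target - height
            integ += err
            carbon_rate = max(0.0, kp * err + ki * integ)  # injection rate >= 0
            height = plant(height, carbon_rate)
            total += err * err
        return total

    pop = rng.uniform(0.0, 2.0, size=(30, 2))              # (kp, ki) individuals
    for _ in range(40):
        fitness = np.array([cost(kp, ki) for kp, ki in pop])
        elite = pop[np.argsort(fitness)[:10]]              # truncation selection
        children = elite[rng.integers(0, 10, 20)] + rng.normal(0.0, 0.05, (20, 2))
        pop = np.vstack([elite, np.clip(children, 0.0, 2.0)])

    kp, ki = pop[0]
    print(f"tuned kp={kp:.2f}, ki={ki:.2f}, cost={cost(kp, ki):.2f}")
    ```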

  7. Statistical Process Control: A Quality Tool for a Venous Thromboembolic Disease Registry.

    PubMed

    Posadas-Martinez, Maria Lourdes; Rojas, Liliana Paloma; Vazquez, Fernando Javier; De Quiros, Fernan Bernaldo; Waisman, Gabriel Dario; Giunta, Diego Hernan

    2016-01-01

    We aim to describe Statistical Process Control as a quality tool for the Institutional Registry of Venous Thromboembolic Disease (IRTD), a registry developed in a community-care tertiary hospital in Buenos Aires, Argentina. The IRTD is a prospective cohort. The process of data acquisition began with the creation of a computerized alert generated whenever physicians requested an imaging or laboratory study to diagnose venous thromboembolism, which defined eligible patients. The process then followed a structured methodology for patient inclusion, evaluation, and posterior data entry. To control this process, process performance indicators were designed to be measured monthly. These included the number of eligible patients, the number of included patients, the median time to patient evaluation, and the percentage of patients lost to evaluation. Control charts were graphed for each indicator. The registry was evaluated over 93 months, during which 25,757 patients were reported and 6,798 patients met inclusion criteria. The median time to evaluation was 20 hours (SD, 12), and 7.7% of the total were lost to evaluation. Each indicator presented trends over time, caused by structural changes and improvement cycles, and the center line therefore shifted at these inflections. Statistical process control through process performance indicators allowed us to monitor the performance of the registry over time and to detect systematic problems. We postulate that this approach could be reproduced for other clinical registries.
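
    One of the registry's indicators lends itself to a textbook p-chart; the sketch below charts a simulated monthly percentage of patients lost to evaluation around the reported 7.7% rate, with limits from the standard p-chart formula. The monthly counts are invented.

    ```python
    # p-chart for the monthly fraction of patients lost to evaluation, with
    # variable monthly denominators; counts are simulated around the reported
    # 7.7% rate and the limits follow the standard p-chart formula.
    import numpy as np

    rng = np.random.default_rng(4)
    n = rng.integers(60, 90, 24)              # eligible patients per month
    lost = rng.binomial(n, 0.077)             # patients lost to evaluation
    p = lost / n

    pbar = lost.sum() / n.sum()               # center line
    sigma = np.sqrt(pbar * (1 - pbar) / n)    # limit width varies with n
    ucl = pbar + 3 * sigma
    lcl = np.maximum(pbar - 3 * sigma, 0.0)

    for m in np.where((p > ucl) | (p < lcl))[0]:
        print(f"month {m}: p={p[m]:.3f} outside ({lcl[m]:.3f}, {ucl[m]:.3f})")
    ```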

  8. Tribology in secondary wood machining

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ko, P.L.; Hawthorne, H.M.; Andiappan, J.

    Secondary wood manufacturing covers a wide range of products from furniture, cabinets, doors and windows, to musical instruments. Many of these are now mass produced in sophisticated, high speed numerical controlled machines. The performance and the reliability of the tools are key to an efficient and economical manufacturing process as well as to the quality of the finished products. A program concerned with three aspects of tribology of wood machining, namely, tool wear, tool-wood friction characteristics and wood surface quality characterization, was set up in the Integrated Manufacturing Technologies Institute (IMTI) of the National Research Council of Canada. The studies include friction and wear mechanism identification and modeling, wear performance of surface-engineered tool materials, friction-induced vibration and cutting efficiency, and the influence of wear and friction on finished products. This research program underlines the importance of tribology in secondary wood manufacturing and at the same time adds new challenges to tribology research since wood is a complex, heterogeneous material and its behavior during machining is highly sensitive to the surrounding environments and to the moisture content in the work piece.

  9. Report Central: quality reporting tool in an electronic health record.

    PubMed

    Jung, Eunice; Li, Qi; Mangalampalli, Anil; Greim, Julie; Eskin, Michael S; Housman, Dan; Isikoff, Jeremy; Abend, Aaron H; Middleton, Blackford; Einbinder, Jonathan S

    2006-01-01

    Quality reporting tools, integrated with ambulatory electronic health records, can help clinicians and administrators understand performance, manage populations, and improve quality. Report Central is a secure web report delivery tool built on Crystal Reports XI™ and ASP.NET technologies. Pilot evaluation of Report Central indicates that clinicians prefer a quality reporting tool that is integrated with our home-grown EHR to support clinical workflow.

  10. Quality control and quality assurance plan for bridge channel-stability assessments in Massachusetts

    USGS Publications Warehouse

    Parker, Gene W.; Pinson, Harlow

    1993-01-01

    A quality control and quality assurance plan has been implemented as part of the Massachusetts bridge scour and channel-stability assessment program. This program is being conducted by the U.S. Geological Survey, Massachusetts-Rhode Island District, in cooperation with the Massachusetts Highway Department. Project personnel training, data-integrity verification, and new data-management technologies are being utilized in the channel-stability assessment process to improve current data-collection and management techniques. An automated data-collection procedure has been implemented to standardize channel-stability assessments on a regular basis within the State. An object-oriented data structure and new image management tools are used to produce a data base enabling management of multiple data object classes. Data will be reviewed by assessors and data base managers before being merged into a master bridge-scour data base, which includes automated data-verification routines.

  11. A technique for the assessment of fighter aircraft precision controllability

    NASA Technical Reports Server (NTRS)

    Sisk, T. R.

    1978-01-01

    Today's emerging fighter aircraft are maneuvering as well at normal accelerations of 7 to 8 g's as their predecessors did at 4 to 5 g's. This improved maneuvering capability has significantly expanded their operating envelope and made the task of evaluating handling qualities more difficult. This paper describes a technique for assessing the precision controllability of highly maneuverable aircraft, a technique that was developed to evaluate the effects of buffet intensity on gunsight tracking capability and found to be a useful tool for the general assessment of fighter aircraft handling qualities. It has also demonstrated its usefulness for evaluating configuration and advanced flight control system refinements. This technique is believed to have application to future aircraft dynamics and pilot-vehicle interface studies.

  12. A senior manufacturing laboratory for determining injection molding process capability

    NASA Technical Reports Server (NTRS)

    Wickman, Jerry L.; Plocinski, David

    1992-01-01

    The following is a laboratory experiment designed to further understanding of materials science. This subject material is directed at an upper level undergraduate/graduate student in an Engineering or Engineering Technology program. It is assumed that the student has a thorough understanding of the process and quality control. The format of this laboratory does not follow that which is normally recommended because of the nature of process capability and that of the injection molding equipment and tooling. This laboratory is instead developed to be used as a point of departure for determining process capability for any process in either a quality control laboratory or a manufacturing environment where control charts, process capability, and experimental or product design are considered important topics.
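
    The capability computation at the heart of such a laboratory is compact enough to sketch: Cp and Cpk from sampled part dimensions against specification limits. The sample data and limits below are invented.

    ```python
    # Cp (potential) and Cpk (actual, accounting for off-center mean) from a
    # sample of molded part dimensions; data and spec limits are invented.
    import numpy as np

    def capability(samples, lsl, usl):
        mu, sd = samples.mean(), samples.std(ddof=1)
        cp = (usl - lsl) / (6 * sd)                  # potential capability
        cpk = min(usl - mu, mu - lsl) / (3 * sd)     # penalizes off-center mean
        return cp, cpk

    rng = np.random.default_rng(5)
    widths = rng.normal(25.02, 0.01, 50)             # molded part width, mm
    cp, cpk = capability(widths, lsl=24.97, usl=25.03)
    print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}")
    ```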

  13. Using Lean Six Sigma Methodology to Improve Quality of the Anesthesia Supply Chain in a Pediatric Hospital.

    PubMed

    Roberts, Renée J; Wilson, Ashley E; Quezado, Zenaide

    2017-03-01

    Six Sigma and Lean methodologies are effective quality improvement tools in many health care settings. We applied the DMAIC methodology (define, measure, analyze, improve, control) to address deficiencies in our pediatric anesthesia supply chain. We defined supply chain problems by mapping existing processes and soliciting comments from those involved. We used daily distance walked by anesthesia technicians and number of callouts for missing supplies as measurements that we analyzed before and after implementing improvements (anesthesia cart redesign). We showed improvement in the metrics after those interventions were implemented, and those improvements were sustained and thus controlled 1 year after implementation.

  14. Automated batch characterization of inkjet-printed elastomer lenses using a LEGO platform.

    PubMed

    Sung, Yu-Lung; Garan, Jacob; Nguyen, Hoang; Hu, Zhenyu; Shih, Wei-Chuan

    2017-09-10

    Small, self-adhesive, inkjet-printed elastomer lenses have enabled smartphone cameras to image and resolve microscopic objects. However, the performance of different lenses within a batch is affected by hard-to-control environmental variables. We present a cost-effective platform to perform automated batch characterization of 300 lens units simultaneously for quality inspection. The system was designed and configured with LEGO bricks, 3D printed parts, and a digital camera. The scheme presented here may become the basis of a high-throughput, in-line inspection tool for quality control purposes and can also be employed for optimization of the manufacturing process.

  15. Quality of Life in Atrial Fibrillation: Measurement Tools and Impact of Interventions

    PubMed Central

    Reynolds, Matthew R.; Ellis, Ethan; Zimetbaum, Peter

    2008-01-01

    QoL in AF. Quality of life (QoL) is of central importance in atrial fibrillation as both a treatment goal and an endpoint in the evaluation of new therapies. QoL appears to be impaired in the majority of patients with AF. A number of interventions for AF have been shown to improve QoL, including pharmacologic and nonpharmacologic rate control, antiarrhythmic drugs, and nonpharmacologic rhythm control strategies. This paper will review the rationale, design, strengths, and limitations of the questionnaires most commonly used to assess QoL in AF studies, and present QoL outcomes from major studies of AF interventions. PMID:18266667

  16. Enhancing Leadership Quality. TQ Source Tips & Tools: Emerging Strategies to Enhance Educator Quality

    ERIC Educational Resources Information Center

    National Comprehensive Center for Teacher Quality, 2008

    2008-01-01

    Teaching Quality (TQ) Source Tips & Tools: Emerging Strategies to Enhance Educator Quality is an online resource developed by the TQ Center. It is designed to help education practitioners tap into strategies and resources they can use to enhance educator quality. This publication is based on the TQ Source Tips & Tools topic area "Enhancing…

  17. Effect of Tooling Material on the Internal Surface Quality of Ti6Al4V Parts Fabricated by Hot Isostatic Pressing

    NASA Astrophysics Data System (ADS)

    Cai, Chao; Song, Bo; Wei, Qingsong; Yan, Wu; Xue, Pengju; Shi, Yusheng

    2017-01-01

    For the net-shape hot isostatic pressing (HIP) process, control of the internal surface roughness of as-HIPped parts remains a challenge for practical engineering. To reveal the evolution mechanism of the internal surface of the parts during the HIP process, the effect of different tooling materials (H13, T8, Cr12 steel, and graphite) as internal cores on the interfacial diffusion and surface roughness was systematically studied.

  18. Development and application of a microarray meter tool to optimize microarray experiments

    PubMed Central

    Rouse, Richard JD; Field, Katrine; Lapira, Jennifer; Lee, Allen; Wick, Ivan; Eckhardt, Colleen; Bhasker, C Ramana; Soverchia, Laura; Hardiman, Gary

    2008-01-01

    Background: Successful microarray experimentation requires a complex interplay between the slide chemistry, the printing pins, the nucleic acid probes and targets, and the hybridization milieu. Optimization of these parameters and a careful evaluation of emerging slide chemistries are a prerequisite to any large scale array fabrication effort. We have developed a 'microarray meter' tool which assesses the inherent variations associated with microarray measurement prior to embarking on large scale projects. Findings: The microarray meter consists of nucleic acid targets (reference and dynamic range control) and probe components. Different plate designs containing identical probe material were formulated to accommodate different robotic and pin designs. We examined the variability in probe quality and quantity (as judged by the amount of DNA printed and remaining post-hybridization) using three robots equipped with capillary printing pins. Discussion: The generation of microarray data with minimal variation requires consistent quality control of the (DNA microarray) manufacturing and experimental processes. Spot reproducibility is a measure primarily of the variations associated with printing. The microarray meter assesses array quality by measuring the DNA content for every feature. It provides a post-hybridization analysis of array quality by scoring probe performance using three metrics: (a) a measure of variability in the signal intensities, (b) a measure of the signal dynamic range, and (c) a measure of variability of the spot morphologies. PMID:18710498
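
    The three scoring metrics lend themselves to a compact implementation. The following sketch assumes per-spot intensities and spot areas as inputs; it illustrates the kind of metrics described, not the microarray meter's actual code:

```python
import numpy as np

def array_quality_metrics(intensities, spot_areas):
    """Score one array on three metrics of the kind the record describes:
    (a) signal variability, (b) dynamic range, (c) spot-morphology variability."""
    inten = np.asarray(intensities, dtype=float)
    areas = np.asarray(spot_areas, dtype=float)
    return {
        "signal_cv": inten.std() / inten.mean(),                      # (a)
        "dynamic_range_log10": np.log10(inten.max() / inten.min()),   # (b), decades
        "morphology_cv": areas.std() / areas.mean(),                  # (c)
    }

# Hypothetical spot intensities and spot areas from one scanned array
print(array_quality_metrics([1200, 1350, 980, 45000, 300],
                            [110, 95, 105, 120, 98]))
```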

  19. Kaizen and ergonomics: the perfect marriage.

    PubMed

    Rodriguez, Martin Antonio; Lopez, Luis Fernando

    2012-01-01

    This paper describes how Kaizen (continuous improvement) and ergonomics can be implemented together in the workplace. Toyota's team members are the owners of this job, applying tools and techniques to improve work conditions using the Kaizen philosophy in a QCC (Quality Control Circle) activity.

  20. Remote Control and Data Acquisition: A Case Study

    NASA Technical Reports Server (NTRS)

    DeGennaro, Alfred J.; Wilkinson, R. Allen

    2000-01-01

    This paper details software tools developed to remotely command experimental apparatus, and to acquire and visualize the associated data in soft real time. The work was undertaken because commercial products failed to meet the project's needs. This work identified six key factors intrinsic to the development of quality research laboratory software. Capabilities include access to all new instrument functions without any programming or dependence on others to write drivers or virtual instruments; a simple full-screen, text-based experiment configuration and control user interface; months of continuous experiment run-times; on the order of 1% CPU load for the condensed matter physics experiment described here; very little imposition of software tool choices on remote users; and total remote control from anywhere in the world over the Internet, or from home on a 56 kbps modem, as if the user were sitting in the laboratory. This work yielded a set of simple, robust tools that are highly reliable, resource conserving, extensible, and versatile, with a uniform, simple interface.

  1. Automated Quality Control of in Situ Soil Moisture from the North American Soil Moisture Database Using NLDAS-2 Products

    NASA Astrophysics Data System (ADS)

    Ek, M. B.; Xia, Y.; Ford, T.; Wu, Y.; Quiring, S. M.

    2015-12-01

    The North American Soil Moisture Database (NASMD) was initiated in 2011 to provide support for developing climate forecasting tools, calibrating land surface models and validating satellite-derived soil moisture algorithms. The NASMD has collected data from over 30 soil moisture observation networks providing millions of in situ soil moisture observations in all 50 states as well as Canada and Mexico. It is recognized that the quality of measured soil moisture in NASMD is highly variable due to the diversity of climatological conditions, land cover, soil texture, and topographies of the stations and differences in measurement devices (e.g., sensors) and installation. It is also recognized that error, inaccuracy and imprecision in the data set can have significant impacts on practical operations and scientific studies. Therefore, developing an appropriate quality control procedure is essential to ensure the data is of the best quality. In this study, an automated quality control approach is developed using the North American Land Data Assimilation System phase 2 (NLDAS-2) Noah soil porosity, soil temperature, and fraction of liquid and total soil moisture to flag erroneous and/or spurious measurements. Overall results show that this approach is able to flag unreasonable values when the soil is partially frozen. A validation example using NLDAS-2 multiple model soil moisture products at the 20 cm soil layer showed that the quality control procedure had a significant positive impact in Alabama, North Carolina, and West Texas. It had a greater impact in colder regions, particularly during spring and autumn. Over 433 NASMD stations have been quality controlled using the methodology proposed in this study, and the algorithm will be implemented to control data quality from the other ~1,200 NASMD stations in the near future.
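
    A minimal sketch of such flagging logic is given below; the variable names and thresholds (soil moisture outside physical bounds, sub-zero soil temperature indicating partially frozen soil) are assumptions for illustration, not the NASMD algorithm itself:

```python
import numpy as np

def qc_flags(soil_moisture, soil_temp_c, porosity):
    """Flag in situ volumetric soil moisture values against model-derived
    porosity and soil temperature, in the spirit of the NLDAS-2-based checks."""
    sm = np.asarray(soil_moisture, dtype=float)
    st = np.asarray(soil_temp_c, dtype=float)
    flags = np.full(sm.shape, "good", dtype=object)
    flags[(sm < 0.0) | (sm > porosity)] = "out_of_range"  # beyond physical bounds
    flags[st < 0.0] = "frozen"  # sensors misread partially frozen soil
    return flags

# Hypothetical daily values: soil moisture (m3/m3) and soil temperature (deg C)
print(qc_flags([0.21, 0.55, 0.18, -0.02], [4.2, 3.0, -1.5, 2.2], porosity=0.45))
```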

  2. Influence of tool geometry and processing parameters on welding defects and mechanical properties for friction stir welding of 6061 Aluminium alloy

    NASA Astrophysics Data System (ADS)

    Daneji, A.; Ali, M.; Pervaiz, S.

    2018-04-01

    Friction stir welding (FSW) is a form of solid-state welding process for joining metals, alloys, and selective composites. Over the years, FSW development has provided an improved way of producing welded joints, and consequently it has been accepted in numerous industries such as aerospace, automotive, rail, and marine. In FSW, the base metal properties control the material's plastic flow under the influence of a rotating tool, whereas the process and tool parameters play a vital role in the quality of the weld. In the current investigation, an array of square butt joints of 6061 Aluminium alloy was welded under varying FSW process and tool geometry related parameters, after which the resulting welds were evaluated for the corresponding mechanical properties and welding defects. The study incorporates FSW process and tool parameters such as welding speed, pin height, and pin thread pitch as input parameters, while the weld quality related defects and mechanical properties were treated as output parameters. The experimentation paves the way to investigate the correlation between the inputs and the outputs. This correlation was used as a tool to predict the optimized FSW process and tool parameters for a desired weld output of the base metals under investigation. The study also provides reflection on the effect of the said parameters on a welding defect such as wormhole formation.

  3. Report Central: Quality Reporting Tool in an Electronic Health Record

    PubMed Central

    Jung, Eunice; Li, Qi; Mangalampalli, Anil; Greim, Julie; Eskin, Michael S.; Housman, Dan; Isikoff, Jeremy; Abend, Aaron H.; Middleton, Blackford; Einbinder, Jonathan S.

    2006-01-01

    Quality reporting tools, integrated with ambulatory electronic health records, can help clinicians and administrators understand performance, manage populations, and improve quality. Report Central is a secure web report delivery tool built on Crystal Reports XI™ and ASP.NET technologies. Pilot evaluation of Report Central indicates that clinicians prefer a quality reporting tool that is integrated with our home-grown EHR to support clinical workflow. PMID:17238590

  4. CONDUIT: A New Multidisciplinary Integration Environment for Flight Control Development

    NASA Technical Reports Server (NTRS)

    Tischler, Mark B.; Colbourne, Jason D.; Morel, Mark R.; Biezad, Daniel J.; Levine, William S.; Moldoveanu, Veronica

    1997-01-01

    A state-of-the-art computational facility for aircraft flight control design, evaluation, and integration called CONDUIT (Control Designer's Unified Interface) has been developed. This paper describes the CONDUIT tool and case study applications to complex rotary- and fixed-wing fly-by-wire flight control problems. Control system analysis and design optimization methods are presented, including definition of design specifications and system models within CONDUIT, and the multi-objective function optimization (CONSOL-OPTCAD) used to tune the selected design parameters. Design examples are based on flight test programs for which extensive data are available for validation. CONDUIT is used to analyze baseline control laws against pertinent military handling qualities and control system specifications. In both case studies, CONDUIT successfully exploits trade-offs between forward loop and feedback dynamics to significantly improve the expected handling qualities and minimize the required actuator authority. The CONDUIT system provides a new environment for integrated control system analysis and design, and has potential for significantly reducing the time and cost of control system flight test optimization.

  5. TU-AB-BRD-04: Development of Quality Management Program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thomadsen, B.

    2015-06-15

    Current quality assurance and quality management guidelines provided by various professional organizations are prescriptive in nature, focusing principally on performance characteristics of planning and delivery devices. However, published analyses of events in radiation therapy show that most events are often caused by flaws in clinical processes rather than by device failures. This suggests the need for the development of a quality management program that is based on integrated approaches to process and equipment quality assurance. Industrial engineers have developed various risk assessment tools that are used to identify and eliminate potential failures from a system or a process before a failure impacts a customer. These tools include, but are not limited to, process mapping, failure modes and effects analysis, and fault tree analysis. Task Group 100 of the American Association of Physicists in Medicine has developed these tools and used them to formulate an example risk-based quality management program for intensity-modulated radiotherapy. This is a prospective risk assessment approach that analyzes potential error pathways inherent in a clinical process and then ranks them according to relative risk, typically before implementation, followed by the design of a new process or modification of the existing process. Appropriate controls are then put in place to ensure that failures are less likely to occur and, if they do, they will more likely be detected before they propagate through the process, compromising treatment outcome and causing harm to the patient. Such a prospective approach forms the basis of the work of Task Group 100 that has recently been approved by the AAPM. This session will be devoted to a discussion of these tools and practical examples of how these tools can be used in a given radiotherapy clinic to develop a risk-based quality management program. Learning Objectives: (1) learn how to design a process map for a radiotherapy process; (2) learn how to perform failure modes and effects analysis for a given process; (3) learn what fault trees are all about; (4) learn how to design a quality management program based upon the information obtained from process mapping, failure modes and effects analysis, and fault tree analysis. Disclosures: Dunscombe is Director of TreatSafely, LLC and of the Center for the Assessment of Radiological Sciences, and a consultant to IAEA and Varian; Thomadsen is President of the Center for the Assessment of Radiological Sciences; Palta is Vice President of the Center for the Assessment of Radiological Sciences.
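
    The ranking step is straightforward to express in code. The sketch below computes RPN = S * O * D and sorts failure modes by it; the process steps and scores are invented for illustration and are not TG-100 values:

```python
# Each failure mode is scored 1-10 for severity (S), occurrence (O), and
# lack of detectability (D); the risk priority number is RPN = S * O * D.
# Steps and scores below are illustrative only.
failure_modes = [
    {"step": "contouring",    "S": 8, "O": 4, "D": 6},
    {"step": "plan transfer", "S": 9, "O": 2, "D": 3},
    {"step": "patient setup", "S": 7, "O": 5, "D": 4},
]

for fm in failure_modes:
    fm["RPN"] = fm["S"] * fm["O"] * fm["D"]

# Rank by RPN to prioritize where quality management controls are most needed
for fm in sorted(failure_modes, key=lambda fm: fm["RPN"], reverse=True):
    print(f"{fm['step']:<14} RPN = {fm['RPN']}")
```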

  6. Modelling raw water quality: development of a drinking water management tool.

    PubMed

    Kübeck, Ch; van Berk, W; Bergmann, A

    2009-01-01

    Ensuring the future drinking water supply requires rigorous management of groundwater resources. However, current practices of economic resource control often do not take into account the hydrogeochemical and geohydraulic groundwater system. To analyse the available quantity and quality of future raw water, effective resource management requires a full understanding of the hydrogeochemical and geohydraulic processes within the aquifer. For example, knowledge of how raw water quality develops over time helps in working out water treatment strategies as well as in planning financial resources. In turn, the effectiveness of planned measures to reduce the infiltration of harmful substances such as nitrate can be checked and optimized by hydrogeochemical modelling. Thus, within the framework of the InnoNet program funded by the Federal Ministry of Economics and Technology, a network of research institutes and water suppliers is working in close cooperation to develop a planning and management tool oriented particularly toward water management problems. The tool involves an innovative material flux model that calculates the hydrogeochemical processes, taking into account the dynamics of agricultural land use. The program's integrated graphical data evaluation is aligned with the needs of water suppliers.

  7. High-Protein and High–Dietary Fiber Breakfasts Result in Equal Feelings of Fullness and Better Diet Quality in Low-Income Preschoolers Compared with Their Usual Breakfast

    PubMed Central

    Kranz, Sibylle; Brauchla, Mary; Campbell, Wayne W; Mattes, Rickard D

    2017-01-01

    Background: In the United States, 17% of children are currently obese. Increasing feelings of fullness may prevent excessive energy intake, lead to better diet quality, and promote long-term maintenance of healthy weight. Objective: The purpose of this study was to develop a fullness-rating tool (aim 1) and to determine whether a high-protein (HP), high-fiber (HF), and combined HP and HF (HPHF) breakfast increases preschoolers’ feelings of fullness before (pre) and after (post) breakfast and pre-lunch, as well as their diet quality, as measured by using a composite diet quality assessment tool, the Revised Children’s Diet Quality Index (aim 2). Methods: Children aged 4 and 5 y (n = 41; 22 girls and 19 boys) from local Head Start centers participated in this randomized intervention trial. Sixteen percent of the boys and 32% of the girls were overweight or obese. After the baseline week, children rotated through four 1-wk periods of consuming ad libitum HP (19–20 g protein), HF (10–11 g fiber), HPHF (19–21 g protein, 10–12 g fiber), or usual (control) breakfasts. Food intake was estimated daily at breakfast, and for breakfast, lunch, and snack on day 3 of each study week. Student’s t tests and ANOVA were used to determine statistical differences. Results: Children’s post-breakfast and pre-lunch fullness ratings were ≥1 point higher than their pre-breakfast ratings (aim 1). Although children consumed, on average, 65 kcal less energy during the intervention breakfasts (P < 0.007) than during the control breakfast, fullness ratings did not differ (P = 0.76). Relative to the control breakfast, improved diet quality (12%) was calculated for the HP and HF breakfasts (P < 0.027) but not for the HPHF breakfast (aim 2). Conclusions: Post-breakfast fullness ratings were not affected by the intervention breakfasts relative to the control breakfast. HP and HF breakfasts resulted in higher diet quality. Serving HP or HF breakfasts may be valuable in improving diet quality without lowering feelings of satiation or satiety. This trial was registered at clinicaltrials.gov as NCT02122224. PMID:28077732

  8. Investigation of improving MEMS-type VOA reliability

    NASA Astrophysics Data System (ADS)

    Hong, Seok K.; Lee, Yeong G.; Park, Moo Y.

    2003-12-01

    MEMS technologies have been applied to many areas, such as optical communications, gyroscopes, and bio-medical components. In the optical communication field, MEMS technologies are essential, especially in multi-dimensional optical switches and variable optical attenuators (VOAs). This paper describes the process for the development of MEMS-type VOAs with good optical performance and improved reliability. Generally, MEMS VOAs are fabricated by a silicon micro-machining process, precise fibre alignment, and a sophisticated packaging process. Because each device is composed of many structures with various materials, it is difficult to make devices reliable. We developed MEMS-type VOAs with many failure mode considerations (FMEA: Failure Mode Effect Analysis) in the initial design step, predicted critical failure factors and revised the design, and confirmed the reliability by preliminary testing. The predicted failure factors were moisture, the bonding strength of the wire connecting the MEMS chip and the TO-CAN, and instability of the supplied signals. Statistical quality control tools (ANOVA, t-tests, and so on) were used to control these potential failure factors and produce optimum manufacturing conditions. In summary, we have successfully developed reliable MEMS-type VOAs with good optical performance by controlling potential failure factors and using statistical quality control tools. As a result, the developed VOAs passed international reliability standards (Telcordia GR-1221-CORE).

  9. Investigation of improving MEMS-type VOA reliability

    NASA Astrophysics Data System (ADS)

    Hong, Seok K.; Lee, Yeong G.; Park, Moo Y.

    2004-01-01

    MEMS technologies have been applied to many areas, such as optical communications, gyroscopes, and bio-medical components. In the optical communication field, MEMS technologies are essential, especially in multi-dimensional optical switches and variable optical attenuators (VOAs). This paper describes the process for the development of MEMS-type VOAs with good optical performance and improved reliability. Generally, MEMS VOAs are fabricated by a silicon micro-machining process, precise fibre alignment, and a sophisticated packaging process. Because each device is composed of many structures with various materials, it is difficult to make devices reliable. We developed MEMS-type VOAs with many failure mode considerations (FMEA: Failure Mode Effect Analysis) in the initial design step, predicted critical failure factors and revised the design, and confirmed the reliability by preliminary testing. The predicted failure factors were moisture, the bonding strength of the wire connecting the MEMS chip and the TO-CAN, and instability of the supplied signals. Statistical quality control tools (ANOVA, t-tests, and so on) were used to control these potential failure factors and produce optimum manufacturing conditions. In summary, we have successfully developed reliable MEMS-type VOAs with good optical performance by controlling potential failure factors and using statistical quality control tools. As a result, the developed VOAs passed international reliability standards (Telcordia GR-1221-CORE).

  10. Arduino Due based tool to facilitate in vivo two-photon excitation microscopy.

    PubMed

    Artoni, Pietro; Landi, Silvia; Sato, Sebastian Sulis; Luin, Stefano; Ratto, Gian Michele

    2016-04-01

    Two-photon excitation spectroscopy is a powerful technique for the characterization of the optical properties of genetically encoded and synthetic fluorescent molecules. Excitation spectroscopy requires tuning the wavelength of the Ti:sapphire laser while carefully monitoring the delivered power. To assist laser tuning and the control of delivered power, we developed an Arduino Due based tool for the automatic acquisition of high quality spectra. This tool is portable, fast, affordable and precise. It allowed studying the impact of scattering and of blood absorption on two-photon excitation light. In this way, we determined the wavelength-dependent deformation of excitation spectra occurring in deep tissues in vivo.

  11. Advanced Wavefront Control Techniques

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Olivier, S S; Brase, J M; Avicola, K

    2001-02-21

    Programs at LLNL that involve large laser systems--ranging from the National Ignition Facility to new tactical laser weapons--depend on the maintenance of laser beam quality through precise control of the optical wavefront. This can be accomplished using adaptive optics, which compensate for time-varying aberrations that are often caused by heating in a high-power laser system. Over the past two decades, LLNL has developed a broad capability in adaptive optics technology for both laser beam control and high-resolution imaging. This adaptive optics capability has been based on thin deformable glass mirrors with individual ceramic actuators bonded to the back. In the case of high-power lasers, these adaptive optics systems have successfully improved beam quality. However, as we continue to extend our applications requirements, the existing technology base for wavefront control cannot satisfy them. To address this issue, this project studied improved modeling tools to increase our detailed understanding of the performance of these systems, and evaluated novel approaches to low-order wavefront control that offer the possibility of reduced cost and complexity. We also investigated improved beam control technology for high-resolution wavefront control. Many high-power laser systems suffer from high-spatial-frequency aberrations that require control of hundreds or thousands of phase points to provide adequate correction. However, the cost and size of current deformable mirrors can become prohibitive for applications requiring more than a few tens of phase control points. New phase control technologies are becoming available which offer control of many phase points with small, low-cost devices. The goal of this project was to expand our wavefront control capabilities with improved modeling tools, new devices that reduce system cost and complexity, and extensions to high spatial and temporal frequencies using new adaptive optics technologies. In FY 99, the second year of this project, work was performed in four areas: (1) advanced modeling tools for deformable mirrors, (2) low-order wavefront correctors with Alvarez lenses, (3) a direct phase measuring heterodyne wavefront sensor, and (4) high-spatial-frequency wavefront control using spatial light modulators.

  12. Safety of ceftriaxone in paediatrics: a systematic review protocol.

    PubMed

    Zeng, Linan; Choonara, Imti; Zhang, Lingli; Xue, Song; Chen, Zhe; He, Miaomiao

    2017-08-21

    Ceftriaxone is widely used in children in the treatment of sepsis. However, concerns have been raised about the safety of ceftriaxone, especially in young children. The aim of this review is to systematically evaluate the safety of ceftriaxone in children of all age groups. MEDLINE, PubMed, the Cochrane Central Register of Controlled Trials, EMBASE, CINAHL, International Pharmaceutical Abstracts and adverse drug reaction (ADR) monitoring systems will be systematically searched for randomised controlled trials (RCTs), cohort studies, case-control studies, cross-sectional studies, case series and case reports evaluating the safety of ceftriaxone in children. The Cochrane risk of bias tool, the Newcastle-Ottawa scale and quality assessment tools developed by the National Institutes of Health will be used for quality assessment. Meta-analysis of the incidence of ADRs from RCTs and prospective studies will be done. Subgroup analyses will be performed for age and dosage regimen. Formal ethical approval is not required as no primary data are collected. This systematic review will be disseminated through a peer-reviewed publication and at conference meetings. CRD42017055428. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  13. Using Six-Sigma To Change and Measure Improvement.

    ERIC Educational Resources Information Center

    Feld, Karl G.; Stone, William K.

    2002-01-01

    Explains why Honeywell's Market Research Department replaced its traditional data collection and paper-based tabulation with blended modes, including electronic interviewing and real-time reporting. Describes how the Six-Sigma quality control process demonstrated that a hybrid approach with blended modes and Web-based reporting tools could deliver…

  14. Simulation-based decision-making tool for adaptive traffic signal control on Tarrytown Road in the City of White Plains.

    DOT National Transportation Integrated Search

    2013-01-01

    Transportation corridors are vital for our region's and even the nation's economy and quality of life. A corridor is usually a complicated system that may span multiple jurisdictions, contain multiple modes, include both freeways and local arterials, a...

  15. Life tables as tools of evaluation and quality control for arthropod mass production

    USDA-ARS?s Scientific Manuscript database

    Life tables, as a basic concept, are descriptions of survival potential at various ages or stages. Understanding critical life stages of arthropod development and their influence on the population structure is of great importance for arthropod rearing systems. Another important advantage of a life t...

  16. A COMPUTER-CONTROLLED SYSTEM FOR GENERATING UNIFORM SURFACE DEPOSITS TO STUDY THE TRANSPORT OF PARTICULATE MATTER

    EPA Science Inventory

    Improved methods for measuring and assessing microenvironmental exposure in individuals are needed. How human activities affect particulate matter in the personal cloud is poorly understood. A quality assurance tool to aid the study of particle transport mechanisms (e.g., re-en...

  17. Phaedra, a protocol-driven system for analysis and validation of high-content imaging and flow cytometry.

    PubMed

    Cornelissen, Frans; Cik, Miroslav; Gustin, Emmanuel

    2012-04-01

    High-content screening has brought new dimensions to cellular assays by generating rich data sets that characterize cell populations in great detail and detect subtle phenotypes. To derive relevant, reliable conclusions from these complex data, it is crucial to have informatics tools supporting quality control, data reduction, and data mining. These tools must reconcile the complexity of advanced analysis methods with the user-friendliness demanded by the user community. After review of existing applications, we realized the possibility of adding innovative new analysis options. Phaedra was developed to support workflows for drug screening and target discovery, interact with several laboratory information management systems, and process data generated by a range of techniques including high-content imaging, multicolor flow cytometry, and traditional high-throughput screening assays. The application is modular and flexible, with an interface that can be tuned to specific user roles. It offers user-friendly data visualization and reduction tools for HCS but also integrates Matlab for custom image analysis and the Konstanz Information Miner (KNIME) framework for data mining. Phaedra features efficient JPEG2000 compression and full drill-down functionality from dose-response curves down to individual cells, with exclusion and annotation options, cell classification, statistical quality controls, and reporting.
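
    As an example of the dose-response analysis such a system automates, a four-parameter logistic fit can be done in a few lines with SciPy; the concentrations and responses below are invented, and this is only a sketch of the general technique, not Phaedra's implementation:

```python
import numpy as np
from scipy.optimize import curve_fit

def four_pl(x, bottom, top, ec50, hill):
    """Four-parameter logistic dose-response model."""
    return bottom + (top - bottom) / (1.0 + (ec50 / x) ** hill)

# Hypothetical per-well responses at increasing compound concentrations (uM)
conc = np.array([0.01, 0.03, 0.1, 0.3, 1.0, 3.0, 10.0])
resp = np.array([3.0, 5.0, 14.0, 38.0, 71.0, 92.0, 97.0])

params, _ = curve_fit(four_pl, conc, resp, p0=[0.0, 100.0, 0.5, 1.0])
bottom, top, ec50, hill = params
print(f"EC50 = {ec50:.2f} uM, Hill slope = {hill:.2f}")
```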

  18. Mediator effect of statistical process control between Total Quality Management (TQM) and business performance in Malaysian Automotive Industry

    NASA Astrophysics Data System (ADS)

    Ahmad, M. F.; Rasi, R. Z.; Zakuan, N.; Hisyamudin, M. N. N.

    2015-12-01

    In today's highly competitive market, Total Quality Management (TQM) is a vital management tool for ensuring that a company can succeed in its business. In order to survive in the global market with intense competition amongst regions and enterprises, the adoption of tools and techniques is essential in improving business performance. Previous studies have consistently found a relationship between TQM and business performance. However, only a few previous studies have examined a mediator effect, namely statistical process control (SPC), between TQM and business performance. A mediator is a third variable that changes the association between an independent variable and an outcome variable. This study proposes a TQM performance model with the mediator effect of SPC, using structural equation modelling, which is a more comprehensive model for developing countries, specifically for Malaysia. A questionnaire was prepared and sent to 1500 companies from the automotive industry and the related vendors in Malaysia, giving a 21.8 per cent response rate. The finding of a significant mediator effect between TQM practices and business performance showed that SPC is an important tool and technique in TQM implementation. The results conclude that SPC partially mediates the relationship between TQM and business performance, with an indirect effect (IE) of 0.25, which can be categorised as a high mediation effect.
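
    The mediation logic (indirect effect as the product of the TQM-to-SPC path and the SPC-to-performance path) can be illustrated with ordinary least squares on simulated, standardized data; the coefficients below are invented and are not the study's estimates:

```python
import numpy as np

# Simulated standardized firm-level scores (assumed data for illustration)
rng = np.random.default_rng(1)
n = 300
tqm = rng.normal(size=n)
spc = 0.6 * tqm + rng.normal(scale=0.8, size=n)                # path a
perf = 0.4 * spc + 0.3 * tqm + rng.normal(scale=0.8, size=n)   # paths b, c'

def ols(y, X):
    """OLS coefficients for y ~ X (no intercept; data are mean-zero)."""
    return np.linalg.lstsq(X, y, rcond=None)[0]

a = ols(spc, tqm[:, None])[0]                  # TQM -> SPC
b = ols(perf, np.column_stack([spc, tqm]))[0]  # SPC -> perf, controlling TQM
print(f"indirect effect a*b = {a * b:.2f}")    # analogous to the reported IE
```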

  19. Using PICO Methodology to Answer Questions About Smoking in COPD Patients.

    PubMed

    Jiménez Ruiz, Carlos A; Buljubasich, Daniel; Riesco Miranda, Juan Antonio; Acuña Izcaray, Agustín; de Granda Orive, José Ignacio; Chatkin, José Miguel; Zabert, Gustavo; Guerreros Benavides, Alfredo; Paez Espinel, Nelson; Noé, Valeri; Sánchez-Angarita, Efraín; Núñez-Sánchez, Ingrid; Sansores, Raúl H; Casas, Alejandro; Palomar Lever, Andrés; Alfageme Michavila, Inmaculada

    2017-11-01

    The ALAT and SEPAR Treatment and Control of Smoking Groups have collaborated in the preparation of this document which attempts to answer, by way of PICO methodology, different questions on health interventions for helping COPD patients to stop smoking. The main recommendations are: (i) moderate-quality evidence and strong recommendation for performing spirometry in COPD patients and in smokers with a high risk of developing the disease, as a motivational tool (particularly for showing evidence of lung age), a diagnostic tool, and for active case-finding; (ii) high-quality evidence and strong recommendation for using intensive dedicated behavioral counselling and drug treatment for helping COPD patients to stop smoking; (iii) high-quality evidence and strong recommendation for initiating interventions for helping COPD patients to stop smoking during hospitalization, with improvement when the intervention is prolonged after discharge; and (iv) high-quality evidence and strong recommendation for funding treatment of smoking in COPD patients, in view of the impact on health and health economics. Copyright © 2017 SEPAR. Published by Elsevier España, S.L.U. All rights reserved.

  20. The business process management software for successful quality management and organization: A case study from the University of Split School of Medicine.

    PubMed

    Sapunar, Damir; Grković, Ivica; Lukšić, Davor; Marušić, Matko

    2016-05-01

    Our aim was to describe a comprehensive model of internal quality management (QM) at a medical school founded on a business process analysis (BPA) software tool. The BPA software tool was used as the core element for the description of all working processes in our medical school, and subsequently the system served as a comprehensive model of internal QM. The quality management system at the University of Split School of Medicine included the documentation and analysis of all business processes within the School. The analysis revealed 80 weak points related to one or several business processes. A precise analysis of medical school business processes allows identification of unfinished, unclear, and inadequate points in these processes, and subsequently the respective improvements, an increase in the QM level, and ultimately a rationalization of the institution's work. Our approach offers a potential reference model for the development of a common QM framework allowing continuous quality control, i.e. adjustment and adaptation to the contemporary educational needs of medical students. Copyright © 2016 by Academy of Sciences and Arts of Bosnia and Herzegovina.

  1. [Air quality surveillance in France].

    PubMed

    Téton, S; Robin, D; Genève, C

    2009-10-01

    As air quality has a direct impact on human health, its monitoring is imperative. In France, this task was entrusted by the government (Air Law of 1996) to organisations with territorial responsibility: the Registered Associations for the Surveillance of Air Quality. The type and level of pollution evolve: from industrial and sulphur pollution in the seventies, to urban and photochemical pollution today and to nanoparticles, pesticides and pollutants in buildings tomorrow. The tools, the skills and the roles of the different people involved in air quality control follow these sometimes rapid transitions in connection with an increasingly precise understanding of the relationship between health and the environment and of the considerable research on the subject. This article describes the mechanisms of air quality monitoring in France.

  2. Personal health and consumer informatics. The impact of health oriented social media applications on health outcomes.

    PubMed

    Gibbons, M C

    2013-01-01

    The rapid evolution in the world-wide use of Social Media tools suggests the emergence of a global phenomenon that may have implications in the Personal Health and Consumer Health Informatics domains. However the impact of these tools on health outcomes is not known. The goal of this research was to review the randomized controlled trial (RCT) evidence of the impact of health oriented Social Media informatics tools on health outcomes. Evaluations of Social Media consumer health tools were systematically reviewed. Research was limited to studies published in the English language, published in Medline, published in the calendar year 2012 and limited to studies that utilized a RCT methodological design. Two high quality Randomized Controlled Trials among over 600 articles published in Medline were identified. These studies indicate that Social Media interventions may be able to significantly improve pain control among patients with chronic pain and enhance weight loss maintenance among individuals attempting to lose weight. Significantly more research needs to be done to confirm these early findings, evaluate additional health outcomes and further evaluate emerging health oriented Social Media interventions. Chronic pain and weight control have both socially oriented determinants. These studies suggest that understanding the social component of a disease may ultimately provide novel therapeutic targets and socio-clinical interventional strategies.

  3. Statistical process management: An essential element of quality improvement

    NASA Astrophysics Data System (ADS)

    Buckner, M. R.

    Successful quality improvement requires a balanced program involving the three elements that control quality: organization, people and technology. The focus of the SPC/SPM User's Group is to advance the technology component of Total Quality by networking within the Group and by providing an outreach within Westinghouse to foster the appropriate use of statistical techniques to achieve Total Quality. SPM encompasses the disciplines by which a process is measured against its intrinsic design capability, in the face of measurement noise and other obscuring variability. SPM tools facilitate decisions about the process that generated the data. SPM deals typically with manufacturing processes, but with some flexibility of definition and technique it accommodates many administrative processes as well. The techniques of SPM are those of Statistical Process Control, Statistical Quality Control, Measurement Control, and Experimental Design. In addition, techniques such as job and task analysis, and concurrent engineering, are important elements of systematic planning and analysis that are needed early in the design process to ensure success. The SPC/SPM User's Group is endeavoring to achieve its objectives by sharing successes that have occurred within the members' own Westinghouse departments as well as within other US and foreign industries. In addition, failures are reviewed to establish lessons learned in order to improve future applications. In broader terms, the Group is interested in making SPM the accepted way of doing business within Westinghouse.
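
    A minimal example of the kind of SPC technique the Group promotes is the Shewhart control chart: derive limits from an in-control history, then flag new points outside them. The data here are hypothetical:

```python
import statistics

def xbar_limits(subgroup_means):
    """Control limits as center line +/- 3 standard deviations, the
    textbook Shewhart rule of thumb."""
    center = statistics.mean(subgroup_means)
    sigma = statistics.stdev(subgroup_means)
    return center - 3 * sigma, center, center + 3 * sigma

# Hypothetical subgroup means from a stable (in-control) period
history = [50.1, 49.8, 50.3, 50.0, 49.9, 50.2, 50.1, 49.7, 50.0, 50.2]
lcl, cl, ucl = xbar_limits(history)
for x in [50.0, 49.9, 51.9]:  # new observations to monitor
    status = "in control" if lcl <= x <= ucl else "out of control"
    print(f"{x}: {status}")
```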

  4. General principles of institutional risks influence on pension systems

    NASA Astrophysics Data System (ADS)

    Nepp, A. N.; Shilkov, A. A.; Sheveleva, A. Y.; Mamedbakov, M. R.

    2016-12-01

    This paper examines the tools used to study the influence of institutional factors on investment returns. The research objects are the tools used in the evaluation of institutional risks in the pension system, in particular, the correlation model of factors impacting the 'anti-director' index, econometric estimates combining the different determinants of savings, the model of endogenous institutional change, etc. Research work focusing on issues of institutional factors affecting pension systems (authored by La Porta, Guiso, Gianetti, El-Mekkaoui de Freitas, Neyapti, and others) is reviewed. The model is examined in terms of the impact of institutional risks on pension systems, especially with regard to the funded part. The study identified the following factors that affect financial institutions, including pension institutions: management quality, regulation quality, rule of law, political stability, and corruption control.

  5. A case management tool for occupational health nurses: development, testing, and application.

    PubMed

    Mannon, J A; Conrad, K M; Blue, C L; Muran, S

    1994-08-01

    1. Case management is a process of coordinating an individual client's health care services to achieve optimal, quality care delivered in a cost effective manner. The case manager establishes a provider network, recommends treatment plans that assure quality and efficacy while controlling costs, monitors outcomes, and maintains a strong communication link among all the parties. 2. Through development of audit tools such as the one presented in this article, occupational health nurses can document case management activities and provide employers with measurable outcomes. 3. The Case Management Activity Checklist was tested using data from 61 firefighters' musculoskeletal injury cases. 4. The activities on the checklist are a step by step process: case identification/case disposition; assessment; return to work plan; resource identification; collaborative communication; and evaluation.

  6. Analytical study of beam handling and emittance control

    NASA Astrophysics Data System (ADS)

    Thompson, James R.; Sloan, M. L.

    1993-12-01

    The thrust of our research on beam handling and emittance control was to explore how one might design high current electron accelerators with the preservation of high beam quality as the primary design consideration. We considered high current induction linacs in the parameter class of the ETA/ATA accelerators at LLNL, but with improvements to the accelerator gap design and other features to permit a significant increase in the deliverable beam brightness. Our approach for beam quality control centered on the use of solenoidal magnetic focusing through such induction accelerators, together with gently-shaped (adiabatic) acceleration gaps. This approach offers several tools for the control of beam quality. The strength and axial variation of the solenoidal magnetic field may be designed, as may the length and shape of the acceleration gaps, the loading of the gaps, and the axial spacing from gap to gap. This research showed that each of these design features may individually be optimized to contribute to improved beam quality control, and by exploiting these features it appears feasible to produce high current, high energy electron beams possessing breakthrough beam quality and brightness. Applications which have been technologically unachievable may for the first time become possible. One such application is the production of high performance free electron lasers at very short wavelengths, extending down to the optical (less than 1 micron) regime.

  7. [Implementation of quality standard UNE-EN ISO/IEC 17043 in the External Quality Control Program of the Spanish Society of Infectious Diseases and Clinical Microbiology].

    PubMed

    Poveda Gabaldón, Marta; Ovies, María Rosario; Orta Mira, Nieves; Serrano, M del Remedio Guna; Avila, Javier; Giménez, Alicia; Cardona, Concepción Gimeno

    2011-12-01

    The quality standard "UNE-EN-ISO 17043: 2010. Conformity assessment. General requirements for proficiency testing" applies to centers that organize intercomparisons in all areas. In the case of clinical microbiology laboratories, these intercomparisons must meet the management and technical standards required to achieve maximum quality in the performance of microbiological analysis and the preparation of test items (sample, product, data or other information used in the proficiency test) to enable them to be accredited. Once accredited, these laboratories can operate as a tool for quality control laboratories and competency assessment. In Spain, accreditation is granted by the Spanish Accreditation Body [Entidad Nacional de Acreditación (ENAC)]. The objective of this review is to explain how to apply the requirements of the standard to laboratories providing intercomparisons in the field of clinical microbiology (the organization responsible for all the tasks related to the development and operation of a proficiency testing program). This requires defining the scope and specifying the technical requirements (personnel management, control of equipment, facilities and environment, the design of the proficiency testing and data analysis for performance evaluation, communication with participants and confidentiality) and management requirements (document control, purchasing control, monitoring of complaints / claims, non-compliance, internal audits and management reviews). Copyright © 2011 Elsevier España S.L. All rights reserved.

  8. PACS quality control and automatic problem notifier

    NASA Astrophysics Data System (ADS)

    Honeyman-Buck, Janice C.; Jones, Douglas; Frost, Meryll M.; Staab, Edward V.

    1997-05-01

    One side effect of installing a clinical PACS is that users become dependent upon the technology, and in some cases it can be very difficult to revert back to a film-based system if components fail. The nature of system failures ranges from slow deterioration of function, as seen in the loss of monitor luminance, through sudden catastrophic loss of the entire PACS network. This paper describes the quality control procedures in place at the University of Florida and the automatic notification system that alerts PACS personnel when a failure has happened or is anticipated. The goal is to recover from a failure with a minimum of downtime and no data loss. Routine quality control is practiced on all aspects of PACS, from acquisition, through network routing, through display, and including archiving. Whenever possible, the system components perform self and between-platform checks for active processes, file system status, errors in log files, and system uptime. When an error is detected or an exception occurs, an automatic page is sent to a pager with a diagnostic code. Documentation on each code, troubleshooting procedures, and repairs is kept on an intranet server accessible only to people involved in maintaining the PACS. In addition to the automatic paging system for error conditions, acquisition is assured by an automatic fax report sent on a daily basis to all technologists acquiring PACS images, to be used as a cross-check that all studies are archived prior to being removed from the acquisition systems. Daily quality control is performed to assure that studies can be moved from each acquisition system and that contrast adjustment functions correctly. The results of selected quality control reports will be presented. The intranet documentation server will be described along with the automatic pager system. Monitor quality control reports will be described, and the cost of quality control will be quantified. As PACS is accepted as a clinical tool, the same standards of quality control must be established as are expected of other equipment used in the diagnostic process.
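
    The automatic notifier pattern described here (scan component logs, page on error) is easy to sketch. Everything below, from file paths to the mail host, is invented for illustration; the record does not describe its implementation in this detail:

```python
import re
import smtplib
from email.message import EmailMessage

ERROR_PATTERN = re.compile(r"ERROR|archive failed|disk full", re.IGNORECASE)

def check_log_and_page(log_path, mail_host, pager_address):
    """Scan one component log for error conditions and send an alert
    with diagnostic lines, mimicking the automatic paging idea."""
    with open(log_path) as f:
        errors = [line.strip() for line in f if ERROR_PATTERN.search(line)]
    if errors:
        msg = EmailMessage()
        msg["Subject"] = f"PACS QC alert: {len(errors)} error(s) in {log_path}"
        msg["From"] = "pacs-monitor@example.org"
        msg["To"] = pager_address
        msg.set_content("\n".join(errors[:20]))  # first 20 matches as diagnostics
        with smtplib.SMTP(mail_host) as smtp:
            smtp.send_message(msg)

# Example (hypothetical paths and hosts):
# check_log_and_page("/var/log/pacs/archive.log", "mail.example.org",
#                    "oncall-pager@example.org")
```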

  9. An introduction to statistical process control in research proteomics.

    PubMed

    Bramwell, David

    2013-12-16

    Statistical process control is a well-established and respected method which provides a general purpose, and consistent framework for monitoring and improving the quality of a process. It is routinely used in many industries where the quality of final products is critical and is often required in clinical diagnostic laboratories [1,2]. To date, the methodology has been little utilised in research proteomics. It has been shown to be capable of delivering quantitative QC procedures for qualitative clinical assays [3] making it an ideal methodology to apply to this area of biological research. To introduce statistical process control as an objective strategy for quality control and show how it could be used to benefit proteomics researchers and enhance the quality of the results they generate. We demonstrate that rules which provide basic quality control are easy to derive and implement and could have a major impact on data quality for many studies. Statistical process control is a powerful tool for investigating and improving proteomics research work-flows. The process of characterising measurement systems and defining control rules forces the exploration of key questions that can lead to significant improvements in performance. This work asserts that QC is essential to proteomics discovery experiments. Every experimenter must know the current capabilities of their measurement system and have an objective means for tracking and ensuring that performance. Proteomic analysis work-flows are complicated and multi-variate. QC is critical for clinical chemistry measurements and huge strides have been made in ensuring the quality and validity of results in clinical biochemistry labs. This work introduces some of these QC concepts and works to bridge their use from single analyte QC to applications in multi-analyte systems. This article is part of a Special Issue entitled: Standardization and Quality Control in Proteomics. Copyright © 2013 The Author. Published by Elsevier B.V. All rights reserved.
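
    As an example of how simple such control rules can be, the sketch below implements two widely used Westgard-style rules (1-3s and 2-2s) against an established mean and SD for a QC analyte; the values are hypothetical, and these particular rules are offered as common examples rather than the article's own rule set:

```python
def control_rule_flags(values, mean, sd):
    """Flag a QC series with two common Westgard-style rules: 1-3s (one
    point beyond 3 SD) and 2-2s (two consecutive points beyond 2 SD on
    the same side of the mean)."""
    z = [(v - mean) / sd for v in values]
    flags = []
    for i, zi in enumerate(z):
        if abs(zi) > 3:
            flags.append((i, "1_3s"))
        if i > 0 and abs(zi) > 2 and abs(z[i - 1]) > 2 and zi * z[i - 1] > 0:
            flags.append((i, "2_2s"))
    return flags

# Hypothetical QC-peptide peak areas with an established mean and SD
print(control_rule_flags([100, 104, 95, 111, 112, 84], mean=100, sd=5))
# -> [(4, '2_2s'), (5, '1_3s')]
```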

  10. Randomized clinical trial of nutritional counseling for malnourished hospital patients.

    PubMed

    Casals, C; García-Agua-Soler, N; Vázquez-Sánchez, M Á; Requena-Toro, M V; Padilla-Romero, L; Casals-Sánchez, J L

    2015-01-01

    Malnutrition is associated with an increased risk of mortality and morbidity, longer hospital stays and general loss of quality of life. The aim of this study is to assess the impact of dietary counseling for malnourished hospital patients. Prospective, randomized, open-label study of 106 hospital patients with malnutrition (54 in the control group and 52 in the intervention group). The intervention group received dietary counseling, and the control group underwent standard treatment. We determined the patients' nutritional state (body mass index, laboratory parameters, malnutrition universal screening tool), degree of dependence (Barthel index), quality of life (SF-12), degree of satisfaction (CSQ-8), the number and length of readmissions and mortality. The patients who underwent the "intervention" increased their weight at 6 months, while the controls lost weight (difference in body mass index, 2.14kg/m(2); p<.001). The intervention group had better results when compared with the control group in the Malnutrition Universal Screening Tool scores (difference, -1.29; p<.001), Barthel index (difference, 7.49; p=.025), SF-12 (difference, 13.72; p<.001) and CSQ-8 (difference, 4.34, p<.001) and required fewer readmissions (difference, -0.37; p=.04) and shorter stays for readmissions (difference, -6.75; p=.035). Mortality and laboratory parameters were similar for the 2 groups. Nutritional counseling improved the patients' nutritional state, quality of life and degree of dependence and decreased the number of hospital readmissions. Copyright © 2015 Elsevier España, S.L.U. y Sociedad Española de Medicina Interna (SEMI). All rights reserved.

  11. Examining the Effect of the Die Angle on Tool Load and Wear in the Extrusion Process

    NASA Astrophysics Data System (ADS)

    Nowotyńska, Irena; Kut, Stanisław

    2014-04-01

    Tool durability is a crucial factor in every manufacturing process, including extrusion. Striving to achieve higher product quality should be accompanied by a long tool life and a reduction in production costs. This article presents comparative research on the load and wear of dies with various working cone angles during concurrent extrusion. The numerical calculations of tool load during concurrent extrusion were performed with the MSC MARC software using the finite element method (FEM). The Archard model, implemented in the software via FEM, was used to determine and compare die wear. The tool deformations and stress distribution were determined based on the performed analyses, and the die wear depth at various working cone angles was determined. A properly shaped die affects not only the properties of the extruded material, but also the loads, elastic deformation, and life of the tool.
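
    The Archard law used for the wear comparison relates worn volume to load, sliding distance, and hardness, V = K*F*s/H. A minimal numeric sketch follows, with all input values assumed for illustration rather than taken from the study:

```python
def archard_wear_volume(k, normal_load_n, sliding_distance_m, hardness_pa):
    """Archard wear law: V = K * F * s / H, with V the worn volume (m^3),
    K a dimensionless wear coefficient, F the normal load (N), s the
    sliding distance (m), and H the hardness of the softer surface (Pa)."""
    return k * normal_load_n * sliding_distance_m / hardness_pa

# Hypothetical die values: wear coefficient, load, sliding distance, hardness
V = archard_wear_volume(k=1e-5, normal_load_n=5.0e4,
                        sliding_distance_m=120.0, hardness_pa=2.0e9)
print(f"worn volume ~ {V * 1e9:.1f} mm^3")  # 1 m^3 = 1e9 mm^3
```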

  12. Computer Simulation of Replaceable Many Sider Plates (RMSP) with Enhanced Chip-Breaking Characteristics

    NASA Astrophysics Data System (ADS)

    Korchuganova, M.; Syrbakov, A.; Chernysheva, T.; Ivanov, G.; Gnedasch, E.

    2016-08-01

    Out of all common chip-curling methods, a special tool face form has become the most widespread; it is produced either by grinding or by profile pressing during the production of RMSP. Currently, over 15 large tool manufacturers produce tools using instrument materials of over 500 brands. To this we must add a large variety of tool face geometries, whose purpose includes control over the form and dimensions of the chip. Taking into account the many processed materials, the specific tasks of the process planner, and the requirements for the quality of manufactured products, all this makes it significantly harder to choose a proper tool that can perform the processing in the most effective way. Over recent years, the nomenclature of RMSP for lathe tools with mechanical mounting has been considerably broadened through the diversification of their faces.

  13. A Real-Time Tool Positioning Sensor for Machine-Tools

    PubMed Central

    Ruiz, Antonio Ramon Jimenez; Rosas, Jorge Guevara; Granja, Fernando Seco; Honorato, Jose Carlos Prieto; Taboada, Jose Juan Esteve; Serrano, Vicente Mico; Jimenez, Teresa Molina

    2009-01-01

    In machining, natural oscillations and elastic, gravitational, or temperature deformations are still a problem for guaranteeing the quality of fabricated parts. In this paper we present an optical measurement system designed to track and localize in 3D a reference retro-reflector close to the machine tool's drill. The complete system and its components are described in detail. Several tests, some static (including impacts and rotations) and others dynamic (executing linear and circular trajectories), were performed on two different machine tools. For the first time, a laser tracking system has been integrated into the position control loop of a machine tool. Results indicate that oscillations and deformations close to the tool can be estimated with micrometric resolution and a bandwidth from 0 to more than 100 Hz. This sensor therefore opens the possibility of on-line compensation of oscillations and deformations. PMID:22408472

  14. WE-H-BRC-01: Failure Mode and Effects Analysis of Skin Electronic Brachytherapy Using Esteya Unit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ibanez-Rosello, B; Bautista-Ballesteros, J; Bonaque, J

    Purpose: A failure mode and effects analysis (FMEA) of the skin lesion treatment process using the Esteya™ device (Elekta Brachytherapy, Veenendaal, The Netherlands) was performed, with the aim of increasing treatment quality and reducing the likelihood of unwanted events. Methods: A multidisciplinary team with experience in the treatment process met to establish the process map, which outlines the flow of the various stages for patients undergoing skin treatment. Potential failure modes (FM) were identified, and the severity (S), frequency of occurrence (O), and lack of detectability (D) of each proposed FM were scored individually on a scale of 1 to 10, following the TG-100 guidelines of the AAPM. These failure modes were ranked according to their risk priority number (RPN) and S scores. The efficiency of existing quality management tools was analyzed through a consensus reassessment of O and D. Results: 149 FM were identified, 43 of which had RPN ≥ 100 and 30 of which had S ≥ 7. After introduction of the quality management tools, only 3 FM had RPN ≥ 100 and 22 FM had RPN ≥ 50. These 22 FM were thoroughly analyzed and new quality management tools were proposed. The most common causes of the highest-RPN FM were the heavy patient workload and the need for continuous, accurate applicator-skin contact during treatment. To address the latter, regular quality control and a setup review by a second individual before each treatment session were proposed. Conclusion: FMEA revealed some potential FM that were not predicted during the initial implementation of the quality management tools. This exercise was useful in identifying the need for periodic updates of the FMEA process, as new potential failures can be identified.
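
    As a hedged illustration of the TG-100 scoring described above (the failure modes and scores below are invented placeholders, not data from the Esteya study): the risk priority number is simply the product RPN = S x O x D, ranked alongside severity:

        # Minimal sketch of TG-100-style FMEA ranking; failure modes and
        # scores are hypothetical, not taken from this record.
        failure_modes = [
            # (name, severity S, occurrence O, lack of detectability D), each 1-10
            ("wrong applicator size", 7, 3, 4),
            ("poor applicator-skin contact", 6, 5, 5),
            ("plan transcription error", 9, 2, 3),
        ]

        ranked = sorted(
            ((name, s * o * d, s) for name, s, o, d in failure_modes),
            key=lambda item: item[1],
            reverse=True,
        )
        for name, rpn, severity in ranked:
            flag = "review" if rpn >= 100 or severity >= 7 else "ok"
            print(f"{name}: RPN={rpn}, S={severity} -> {flag}")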

  15. Failure mode and effects analysis of skin electronic brachytherapy using Esteya® unit

    PubMed Central

    Bautista-Ballesteros, Juan Antonio; Bonaque, Jorge; Celada, Francisco; Lliso, Françoise; Carmona, Vicente; Gimeno-Olmos, Jose; Ouhib, Zoubir; Rosello, Joan; Perez-Calatayud, Jose

    2016-01-01

    Purpose Esteya® (Nucletron, an Elekta company, Elekta AB, Stockholm, Sweden) is an electronic brachytherapy device used for the treatment of skin cancer lesions. In order to establish an adequate level of treatment quality, a risk analysis of the Esteya treatment process was carried out, following the methodology proposed by the TG-100 guidelines of the American Association of Physicists in Medicine (AAPM). Material and methods A multidisciplinary team familiar with the treatment process was formed. This team developed a process map (PM) outlining the stages through which a patient passes when undergoing Esteya treatment. They identified potential failure modes (FM), and each individual FM was assessed for severity (S), frequency of occurrence (O), and lack of detectability (D). A list of existing quality management tools was developed and the FMs were reevaluated by consensus. Finally, the FMs were ranked according to their risk priority number (RPN) and their S. Results 146 FMs were identified, 106 of which had RPN ≥ 50 and 30 of which had S ≥ 7. After introducing the quality management tools, only 21 FMs had RPN ≥ 50. The importance of ensuring contact between the applicator and the surface of the patient's skin was emphasized, so the setup is reviewed by a second individual before each treatment session, with periodic quality control to ensure stability of the applicator pressure. Among the essential quality management tools already implemented in the installation are simple templates for reproducible positioning of skin applicators, which help in marking the treatment area and positioning the X-ray tube. Conclusions New quality management tools have been established as a result of applying failure mode and effects analysis (FMEA) to the treatment process. However, periodic updating of the FMEA process is necessary, since clinical experience suggests that new potential failure modes may arise. PMID:28115958

  16. The Healthy Lifestyle and Personal Control Questionnaire (HLPCQ): a novel tool for assessing self-empowerment through a constellation of daily activities.

    PubMed

    Darviri, Christina; Alexopoulos, Evangelos C; Artemiadis, Artemios K; Tigani, Xanthi; Kraniotou, Christina; Darvyri, Panagiota; Chrousos, George P

    2014-09-24

    The main goal of stress management and health promotion programs is to improve health by empowering people to take control over their lives. Daily health-related lifestyle choices are integral targets of these interventions and critical to evaluating their efficacy. To date, concepts such as self-efficacy, self-control and empowerment are assessed by tools that only partially address daily lifestyle choices. The aim of this study is to validate a novel measurement tool, the Healthy Lifestyle and Personal Control Questionnaire (HLPCQ), designed to assess the concept of empowerment through a constellation of daily activities. Therefore, we performed principal component analysis (PCA) of 26 items that were derived from the qualitative data of several stress management programs conducted by our research team. The PCA resulted in the following five-factor solution: 1) Dietary Healthy Choices, 2) Dietary Harm Avoidance, 3) Daily Routine, 4) Organized Physical Exercise and 5) Social and Mental Balance. All subscales showed satisfactory internal consistency and variance, relative to theoretical score ranges. Subscale scores and the total score were significantly correlated with perceived stress and health locus of control, implying good criterion validity. Associations with sociodemographic data and other variables, such as sleep quality and health assessments, were also found. The HLPCQ is a good tool for assessing the efficacy of future health-promoting interventions aimed at improving individuals' lifestyle and wellbeing.

  17. Quality of life, mental health and self-esteem in hirsute adolescent females.

    PubMed

    Drosdzol, Agnieszka; Skrzypulec, Violetta; Plinta, Ryszard

    2010-09-01

    The aim of the study was to evaluate the influence of hirsutism on general quality of life, self-esteem and the prevalence of anxiety and depressive symptoms among adolescent girls. Fifty adolescent females with hirsutism, aged 13-18 years, were enrolled in the research group. The control group comprised 50 non-hirsute adolescents. A specific questionnaire was used as the research tool. It included self-evaluation inventories: Short Form-36 Health Survey Version 2, Hospital Anxiety and Depression Scale and Rosenberg Self-Esteem Scale. Quality of life indices for hirsute girls scored lower than for the controls and statistically significantly so with regard to physical functioning (p = 0.04), general health (p = 0.002) and social functioning (p = 0.007). Anxiety was diagnosed in 26% in the group of hirsute girls as compared with 10% of the controls (p = 0.03). The study analysis revealed more clinically significant problems of low self-esteem in hirsute adolescents compared with non-hirsute girls (14% vs. 2%). Hirsutism is associated with a decreased quality of life, a higher prevalence of anxiety disorder and lower self-esteem in adolescent females. The mother's level of education is associated with the quality of life in adolescent girls.

  18. Dimensional and material characteristics of direct deposited tool steel by CO2 laser

    NASA Astrophysics Data System (ADS)

    Choi, J.

    2006-01-01

    The laser aided direct metal/material deposition (DMD) process builds metallic parts layer-by-layer directly from the CAD representation. In general, the process feeds powdered metals/materials into a melt pool, creating fully dense parts. The success of this technology in the die and tool industry depends on the part quality that can be achieved. To obtain the designed geometric dimensions and material properties, delicate control of parameters such as laser power, spot diameter, traverse speed, and powder mass flow rate is critical. In this paper, the dimensional and material characteristics of direct deposited H13 tool steel by CO2 laser are investigated for the DMD process with a feedback height control system. The relationships between DMD process variables and product characteristics are analyzed using statistical techniques. The performance of the DMD process is examined in terms of hardness, porosity, microstructure, and composition.
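
    As a hedged sketch of the kind of statistical analysis mentioned (an ordinary least-squares fit; the process values and layer heights below are invented, not data from this study):

        # Relate DMD process variables to a deposit characteristic (here,
        # layer height) by least squares; all numbers are made-up placeholders.
        import numpy as np

        # Columns: laser power (W), traverse speed (mm/s), powder feed (g/min).
        X = np.array([
            [600.0, 10.0, 4.0],
            [800.0, 10.0, 4.0],
            [800.0, 15.0, 6.0],
            [1000.0, 15.0, 6.0],
            [1000.0, 20.0, 8.0],
        ])
        height_mm = np.array([0.42, 0.55, 0.58, 0.70, 0.66])

        A = np.column_stack([np.ones(len(X)), X])  # prepend an intercept column
        coef, *_ = np.linalg.lstsq(A, height_mm, rcond=None)
        print("intercept + coefficients:", coef.round(4))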

  19. Quality of clinical brain tumor MR spectra judged by humans and machine learning tools.

    PubMed

    Kyathanahally, Sreenath P; Mocioiu, Victor; Pedrosa de Barros, Nuno; Slotboom, Johannes; Wright, Alan J; Julià-Sapé, Margarida; Arús, Carles; Kreis, Roland

    2018-05-01

    To investigate and compare human judgment and machine learning tools for quality assessment of clinical MR spectra of brain tumors. A very large set of 2574 single-voxel spectra with short and long echo times from the eTUMOUR and INTERPRET databases was used for this analysis. Original human quality ratings from these studies, as well as new human guidelines, were used to train different machine learning algorithms for automatic quality control (AQC) based on various feature extraction methods and classification tools. The performance was compared with the variance in human judgment. The AQC built using the RUSBoost classifier, which combats imbalanced training data, performed best. When furnished with a large range of spectral and derived features, from which the most crucial ones had been selected by the TreeBagger algorithm, it showed better specificity (98%) in judging spectra from an independent test set than previously published methods. Optimal performance was reached with a virtual three-class ranking system. Our results suggest that the feature space should be relatively large for MR tumor spectra and that three-class labels may be beneficial for AQC. The best AQC algorithm showed a performance in rejecting spectra that was comparable to that of a panel of human expert spectroscopists. Magn Reson Med 79:2500-2510, 2018. © 2017 International Society for Magnetic Resonance in Medicine.
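
    As a hedged sketch of the classifier family named above (the original work is MATLAB-based; here the RUSBoost idea of random undersampling plus boosting is shown via the imbalanced-learn Python package on random placeholder features, not spectra):

        # Imbalanced two-class quality control with RUSBoost; synthetic data.
        import numpy as np
        from imblearn.ensemble import RUSBoostClassifier
        from sklearn.metrics import classification_report
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(0)
        X = rng.normal(size=(2000, 40))           # placeholder spectral features
        y = (rng.random(2000) < 0.1).astype(int)  # ~10% "bad" spectra: imbalanced

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)
        clf = RUSBoostClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
        print(classification_report(y_te, clf.predict(X_te)))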

  20. [The Italian instrument evaluating nursing students' clinical learning quality].

    PubMed

    Palese, Alvisa; Grassetti, Luca; Mansutti, Irene; Destrebecq, Anne; Terzoni, Stefano; Altini, Pietro; Bevilacqua, Anita; Brugnolli, Anna; Benaglio, Carla; Dal Ponte, Adriana; De Biasio, Laura; Dimonte, Valerio; Gambacorti, Benedetta; Fasci, Adriana; Grosso, Silvia; Mantovan, Franco; Marognolli, Oliva; Montalti, Sandra; Nicotera, Raffaela; Randon, Giulia; Stampfl, Brigitte; Tollini, Morena; Canzan, Federica; Saiani, Luisa; Zannini, Lucia

    2017-01-01

    The Clinical Learning Quality Evaluation Index for nursing students. Italian nursing programs need to introduce tools evaluating the quality of clinical learning as perceived by nursing students. Several tools already exist; however, their limitations suggest the need to develop a new one. A national project was therefore undertaken to develop and validate a new instrument capable of measuring the clinical learning quality as experienced by nursing students. A validation study was conducted from 2015 to 2016. All national nursing programs (n=43) were invited to participate by including all nursing students regularly attending their clinical placements. The tool, developed on the basis of a) the literature, b) validated tools already established among other healthcare professionals, and c) consensus among experts and nursing students, was administered to the eligible students. 9606 nursing students in 27 universities (62.8%) participated. The psychometric properties of the new instrument ranged from good to excellent. According to the findings, the tool consists of 22 items and five factors: a) quality of the tutorial strategies, b) learning opportunities, c) safety and nursing care quality, d) self-directed learning, and e) quality of the learning environment. The tool is already in use. Its systematic adoption may support comparisons among settings and across different programs; moreover, it may also support the accreditation of new settings and the measurement of the effects of strategies aimed at improving the quality of clinical learning.

  1. Evaluation of the reliability, usability, and applicability of AMSTAR, AMSTAR 2, and ROBIS: protocol for a descriptive analytic study.

    PubMed

    Gates, Allison; Gates, Michelle; Duarte, Gonçalo; Cary, Maria; Becker, Monika; Prediger, Barbara; Vandermeer, Ben; Fernandes, Ricardo M; Pieper, Dawid; Hartling, Lisa

    2018-06-13

    Systematic reviews (SRs) of randomised controlled trials (RCTs) can provide the best evidence to inform decision-making, but their methodological and reporting quality varies. Tools exist to guide the critical appraisal of quality and risk of bias in SRs, but evaluations of their measurement properties are limited. We will investigate the interrater reliability (IRR), usability, and applicability of A MeaSurement Tool to Assess systematic Reviews (AMSTAR), AMSTAR 2, and Risk Of Bias In Systematic reviews (ROBIS) for SRs in the fields of biomedicine and public health. An international team of researchers at three collaborating centres will undertake the study. We will use a random sample of 30 SRs of RCTs investigating therapeutic interventions indexed in MEDLINE in February 2014. Two reviewers at each centre will appraise the quality and risk of bias in each SR using AMSTAR, AMSTAR 2, and ROBIS. We will record the time to complete each assessment and for the two reviewers to reach consensus for each SR. We will extract the descriptive characteristics of each SR, the included studies, participants, interventions, and comparators. We will also extract the direction and strength of the results and conclusions for the primary outcome. We will summarise the descriptive characteristics of the SRs using means and standard deviations, or frequencies and proportions. To test for interrater reliability between reviewers and between the consensus agreements of reviewer pairs, we will use Gwet's AC1 statistic. For comparability to previous evaluations, we will also calculate weighted Cohen's kappa and Fleiss' kappa statistics. To estimate usability, we will calculate the mean time to complete the appraisal and to reach consensus for each tool. To inform applications of the tools, we will test for statistical associations between quality scores and risk of bias judgments, and the results and conclusions of the SRs. Appraising the methodological and reporting quality of SRs is necessary to determine the trustworthiness of their conclusions. Which tool may be most reliably applied and how the appraisals should be used is uncertain; the usability of newly developed tools is unknown. This investigation of common (AMSTAR) and newly developed (AMSTAR 2, ROBIS) tools will provide empirical data to inform their application, interpretation, and refinement.
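
    As a hedged illustration of the agreement statistics planned above (the ratings below are invented; Gwet's AC1 is not in scikit-learn and would need a separate implementation, but Cohen's kappa, weighted or unweighted, is available):

        # Interrater agreement on hypothetical 3-level appraisals of 7 SRs.
        from sklearn.metrics import cohen_kappa_score

        levels = {"low": 0, "moderate": 1, "high": 2}
        reviewer_a = ["high", "low", "moderate", "high", "low", "moderate", "high"]
        reviewer_b = ["high", "low", "high", "high", "low", "moderate", "moderate"]

        print(cohen_kappa_score(reviewer_a, reviewer_b))  # unweighted kappa
        a = [levels[r] for r in reviewer_a]
        b = [levels[r] for r in reviewer_b]
        print(cohen_kappa_score(a, b, weights="linear"))  # linearly weighted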

  2. Contamination-Free Manufacturing: Tool Component Qualification, Verification and Correlation with Wafers

    NASA Astrophysics Data System (ADS)

    Tan, Samantha H.; Chen, Ning; Liu, Shi; Wang, Kefei

    2003-09-01

    As part of the semiconductor industry "contamination-free manufacturing" effort, significant emphasis has been placed on reducing potential sources of contamination from process equipment and process equipment components. Process tools contain process chambers and components that are exposed to the process environment or process chemistry and in some cases are in direct contact with production wafers. Any contamination from these sources must be controlled or eliminated in order to maintain high process yields, device performance, and device reliability. This paper discusses new nondestructive analytical methods for quantitative measurement of the cleanliness of metal, quartz, polysilicon and ceramic components that are used in process equipment tools. The goal of these new procedures is to measure the effectiveness of cleaning procedures and to verify whether a tool component part is sufficiently clean for installation and subsequent routine use in the manufacturing line. These procedures provide a reliable "qualification method" for tool component certification and also provide a routine quality control method for reliable operation of cleaning facilities. Cost advantages to wafer manufacturing include higher yields due to improved process cleanliness and elimination of yield loss and downtime resulting from the installation of "bad" components in process tools. We also discuss a representative example of wafer contamination having been linked to a specific process tool component.

  3. CRCDA—Comprehensive resources for cancer NGS data analysis

    PubMed Central

    Thangam, Manonanthini; Gopal, Ramesh Kumar

    2015-01-01

    Next-generation sequencing (NGS) innovations have been a landmark in the life sciences and have changed the direction of research in clinical oncology through their power to diagnose and treat cancer. The aim of our portal, Comprehensive Resources for Cancer NGS Data Analysis (CRCDA), is to provide a collection of different NGS tools and pipelines under diverse classes, together with cancer pathways and databases, and furthermore literature information from PubMed. The literature data are constrained to the 18 most common cancer types, such as breast cancer and colon cancer, that occur in the worldwide population. For convenience, NGS cancer tools have been categorized into cancer genomics, cancer transcriptomics, cancer epigenomics, quality control, and visualization. Pipelines for variant detection, quality control, and data analysis are listed to provide an out-of-the-box solution for NGS data analysis, which may help researchers overcome the challenges of selecting and configuring individual tools for analysing exome, whole genome, and transcriptome data. An extensive search page was developed that can be queried by (i) type of data [literature, gene data, and sequence read archive (SRA) data] and (ii) type of cancer (selected based on global incidence and accessibility of data). For each category of analysis a variety of tools is available, and the biggest challenge lies in searching for and using the right tool for the right application. The objective of this work is to collect the tools available in each category in various places and to arrange the tools and other data in a simple, user-friendly manner so that biologists and oncologists can find information more easily. To the best of our knowledge, we have collected and presented a comprehensive package of most of the resources available for NGS data analysis in cancer. Given these factors, we believe that this website will be a useful resource for the NGS research community working on cancer. Database URL: http://bioinfo.au-kbc.org.in/ngs/ngshome.html. PMID:26450948

  4. TH-C-18A-08: A Management Tool for CT Dose Monitoring, Analysis, and Protocol Review

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, J; Chan, F; Newman, B

    2014-06-15

    Purpose: To develop a customizable tool for enterprise-wide management of CT protocols and analysis of radiation dose information from CT exams for a variety of quality control applications. Methods: All clinical CT protocols implemented on the 11 CT scanners at our institution were extracted in digital format. The original protocols had been preset by our CT management team. A commercial CT dose tracking software (DoseWatch, GE Healthcare, WI) was used to collect exam information (exam date, patient age, etc.), scanning parameters, and radiation doses for all CT exams. We developed a Matlab-based program (MathWorks, MA) with a graphical user interface that allows analysis of the scanning protocols together with the actual dose estimates, and comparison of the data to national (ACR, AAPM) and internal reference values for CT quality control. Results: The CT protocol review portion of our tool allows the user to look up the scanning and image reconstruction parameters of any protocol on any of the installed CT systems, among about 120 protocols per scanner. In the dose analysis tool, dose information for all CT exams (from 05/2013 to 02/2014) was stratified at the protocol level, and within a protocol down to the series level, i.e., each individual exposure event. This allows numerical and graphical review of dose information for any combination of scanner models, protocols, and series. The key functions of the tool include statistics of CTDI, DLP, and SSDE; dose monitoring using user-set CTDI/DLP/SSDE thresholds; look-up of dose data for any CT exam; and CT protocol review. Conclusion: Our in-house CT management tool provides radiologists, technologists, and administration first-hand, near real-time, enterprise-wide knowledge of CT dose levels for different exam types. Medical physicists use this tool to manage CT protocols and to compare and optimize dose levels across different scanner models. It provides technologists feedback on CT scanning operation, and knowledge of important dose baselines and thresholds.
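
    A hedged sketch of the threshold-based dose monitoring described above (column names and limits are hypothetical, not the tool's actual configuration):

        # Flag CT exams whose dose metrics exceed user-set thresholds.
        import pandas as pd

        exams = pd.DataFrame({
            "protocol": ["head", "head", "chest", "abdomen"],
            "CTDIvol_mGy": [55.0, 78.2, 12.1, 18.4],
            "DLP_mGycm": [900.0, 1310.0, 420.0, 760.0],
        })
        limits = {"CTDIvol_mGy": 75.0, "DLP_mGycm": 1250.0}  # user-set thresholds

        for metric, limit in limits.items():
            for _, row in exams[exams[metric] > limit].iterrows():
                print(f"{row['protocol']}: {metric}={row[metric]} exceeds {limit}")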

  5. A Monte Carlo analysis of breast screening randomized trials.

    PubMed

    Zamora, Luis I; Forastero, Cristina; Guirado, Damián; Lallena, Antonio M

    2016-12-01

    To analyze breast screening randomized trials with a Monte Carlo simulation tool. A simulation tool previously developed to simulate breast screening programmes was adapted for this purpose. The history of women participating in the trials was simulated, including a model for survival after local treatment of invasive cancers. Distributions of the time gained by screening detection over symptomatic detection, and the overall screening sensitivity, were used as inputs. Several randomized controlled trials were simulated. Except for the age range of the women involved, all simulations used the same population characteristics, which permitted analysis of their external validity. The relative risks obtained were compared to those quoted for the trials, whose internal validity was addressed by further investigating the reasons for the disagreements observed. The Monte Carlo simulations produce results that are in good agreement with most of the randomized trials analyzed, indicating their methodological quality and external validity. A reduction in breast cancer mortality of around 20% appears to be a reasonable value according to the results of the methodologically correct trials. Discrepancies observed with the Canada I and II trials may be attributed to low mammography quality and some methodological problems. The Kopparberg trial appears to show low methodological quality. Monte Carlo simulations are a powerful tool for investigating breast screening randomized controlled trials, helping to establish those whose results are reliable enough to be extrapolated to other populations, to design trial strategies and, eventually, to adapt them during their development. Copyright © 2016 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.

  6. AIMBAT: A Python/Matplotlib Tool for Measuring Teleseismic Arrival Times

    NASA Astrophysics Data System (ADS)

    Lou, X.; van der Lee, S.; Lloyd, S.

    2013-12-01

    Python is an open-source, platform-independent, object-oriented scripting language. It has become more popular in the seismology community since the appearance of ObsPy (Beyreuther et al. 2010, Megies et al. 2011), which provides a powerful framework for seismic data access and processing. This study introduces a new Python-based tool named AIMBAT (Automated and Interactive Measurement of Body-wave Arrival Times) for measuring teleseismic body-wave arrival times in large-scale seismic event data (Lou et al. 2013). Compared to ObsPy, AIMBAT is a lighter tool focused on a particular aspect of seismic data processing. It originates from the widely used multi-channel cross-correlation (MCCC) method developed by VanDecar and Crosson (1990). On top of the original MCCC procedure, AIMBAT automates the initial phase picking and is interactive in quality control. The core cross-correlation function is implemented in Fortran to boost performance beyond pure Python. The GUI (graphical user interface) of AIMBAT depends on Matplotlib's GUI-neutral widgets and event-handling API. A number of sorting and (de)selecting options are designed to facilitate the quality control of seismograms. Using AIMBAT, both relative and absolute teleseismic body-wave arrival times are measured. AIMBAT significantly improves the efficiency and quality of the measurements. User interaction is needed only to pick the target phase arrival and to set a time window on the array stack. The package is easy to install and use, open-source, and publicly available.
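
    A hedged sketch of the core operation underlying MCCC-style alignment (synthetic traces, not AIMBAT's Fortran code): the relative delay between two seismograms is the lag that maximizes their cross-correlation:

        # Estimate the relative delay between two traces via cross-correlation.
        import numpy as np

        dt = 0.05                                 # sampling interval, seconds
        t = np.arange(0.0, 60.0, dt)
        pulse = np.exp(-((t - 30.0) / 1.5) ** 2)  # synthetic arrival near 30 s
        trace_a = pulse + 0.05 * np.random.default_rng(1).normal(size=t.size)
        trace_b = np.roll(pulse, 24)              # same arrival, delayed 24 samples

        xcorr = np.correlate(trace_a - trace_a.mean(),
                             trace_b - trace_b.mean(), mode="full")
        lag = np.argmax(xcorr) - (t.size - 1)     # negative: trace_b arrives later
        print(f"relative delay: {lag * dt:.2f} s")  # approx -1.20 s here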

  7. 3D models mapping optimization through an integrated parameterization approach: case studies from Ravenna

    NASA Astrophysics Data System (ADS)

    Cipriani, L.; Fantini, F.; Bertacchi, S.

    2014-06-01

    Image-based modelling tools based on SfM algorithms have gained great popularity since several software houses provided applications able to produce 3D textured models easily and automatically. The aim of this paper is to point out the importance of controlling the model parameterization process, considering that the automatic solutions included in these modelling tools can produce poor results in terms of texture utilization. To achieve better-quality textured models from image-based modelling applications, this research presents a series of practical strategies aimed at providing a better balance between the geometric resolution of models from passive sensors and their corresponding (u,v) map reference systems. This aspect is essential for achieving a high-quality 3D representation, since "apparent colour" is fundamental in the field of Cultural Heritage documentation. Complex meshes without native parameterization have to be "flattened" or "unwrapped" into the (u,v) parameter space, with the main objective of being mapped with a single image. This result can be obtained with two different strategies: the first automatic and faster, the second manual and time-consuming. Reverse modelling applications provide automatic solutions based on splitting the models by means of different algorithms, producing a sort of "atlas" of the original model in the parameter space that is in many instances inadequate and negatively affects the overall quality of the representation. By using different solutions in synergy, ranging from semantics-aware modelling techniques to quad-dominant meshes obtained with retopology tools, it is possible to gain complete control of the parameterization process.

  8. A Systematic Approach to Capacity Strengthening of Laboratory Systems for Control of Neglected Tropical Diseases in Ghana, Kenya, Malawi and Sri Lanka

    PubMed Central

    Njelesani, Janet; Dacombe, Russell; Palmer, Tanith; Smith, Helen; Koudou, Benjamin; Bockarie, Moses; Bates, Imelda

    2014-01-01

    Background The lack of capacity in laboratory systems is a major barrier to achieving the aims of the London Declaration (2012) on neglected tropical diseases (NTDs). To counter this, capacity strengthening initiatives have been carried out in NTD laboratories worldwide. Many of these initiatives focus on individuals' skills or institutional processes and structures ignoring the crucial interactions between the laboratory and the wider national and international context. Furthermore, rigorous methods to assess these initiatives once they have been implemented are scarce. To address these gaps we developed a set of assessment and monitoring tools that can be used to determine the capacities required and achieved by laboratory systems at the individual, organizational, and national/international levels to support the control of NTDs. Methodology and principal findings We developed a set of qualitative and quantitative assessment and monitoring tools based on published evidence on optimal laboratory capacity. We implemented the tools with laboratory managers in Ghana, Malawi, Kenya, and Sri Lanka. Using the tools enabled us to identify strengths and gaps in the laboratory systems from the following perspectives: laboratory quality benchmarked against ISO 15189 standards, the potential for the laboratories to provide support to national and regional NTD control programmes, and the laboratory's position within relevant national and international networks and collaborations. Conclusion We have developed a set of mixed methods assessment and monitoring tools based on evidence derived from the components needed to strengthen the capacity of laboratory systems to control NTDs. Our tools help to systematically assess and monitor individual, organizational, and wider system level capacity of laboratory systems for NTD control and can be applied in different country contexts. PMID:24603407

  9. Supporting Goal-Oriented Primary Health Care for Seniors with Complex Care Needs Using Mobile Technology: Evaluation and Implementation of the Health System Performance Research Network, Bridgepoint Electronic Patient Reported Outcome Tool.

    PubMed

    Steele Gray, Carolyn; Wodchis, Walter P; Upshur, Ross; Cott, Cheryl; McKinstry, Brian; Mercer, Stewart; Palen, Ted E; Ramsay, Tim; Thavorn, Kednapa

    2016-06-24

    Older adults experiencing multiple chronic illnesses are at high risk of hospitalization and health decline if they are unable to manage the significant challenges posed by their health conditions. Goal-oriented care approaches can provide better care for these complex patients, but clinicians find the process of ascertaining goals "too complex and too time-consuming," and goals are often not agreed upon between complex patients and their providers. The electronic patient reported outcomes (ePRO) mobile app and portal offers an innovative approach to creating and monitoring goal-oriented patient-care plans to improve patient self-management and shared decision-making between patients and health care providers. The ePRO tool also supports proactive patient monitoring by the patient, caregiver(s), and health care provider. It was developed with and for older adults with complex care needs as a means to improve their quality of life. Our proposed project will evaluate the use, effectiveness, and value for money of the ePRO tool in a 12-month multicenter, randomized controlled trial in Ontario, targeting individuals 65 or over with two or more chronic conditions that require frequent health care visits to manage their health conditions. Intervention groups using the ePRO tool will be compared with control groups on measures of quality of life, patient experience, and cost-effectiveness. We will also evaluate the implementation of the tool. The proposed project presented in this paper will be funded through the Canadian Institutes of Health Research (CIHR) eHealth Innovation Partnerships Program (eHIPP; CIHR-348362). The expected completion date of the study is November, 2019. We anticipate our program of work will support improved quality of life and patient self-management, improved patient-centered primary care delivery, and will encourage the adoption of goal-oriented care approaches across primary health care systems. We have partnered with family health teams and quality improvement organizations in Ontario to ensure that our research is practical and that findings are shared widely. We will work with our established international network to develop an implementation framework to support continued adaptation and adoption across Canada and internationally.

  10. Supporting Goal-Oriented Primary Health Care for Seniors with Complex Care Needs Using Mobile Technology: Evaluation and Implementation of the Health System Performance Research Network, Bridgepoint Electronic Patient Reported Outcome Tool

    PubMed Central

    Wodchis, Walter P; Upshur, Ross; Cott, Cheryl; McKinstry, Brian; Mercer, Stewart; Palen, Ted E; Ramsay, Tim; Thavorn, Kednapa

    2016-01-01

    Background Older adults experiencing multiple chronic illnesses are at high risk of hospitalization and health decline if they are unable to manage the significant challenges posed by their health conditions. Goal-oriented care approaches can provide better care for these complex patients, but clinicians find the process of ascertaining goals “too complex and too time-consuming,” and goals are often not agreed upon between complex patients and their providers. The electronic patient reported outcomes (ePRO) mobile app and portal offers an innovative approach to creating and monitoring goal-oriented patient-care plans to improve patient self-management and shared decision-making between patients and health care providers. The ePRO tool also supports proactive patient monitoring by the patient, caregiver(s), and health care provider. It was developed with and for older adults with complex care needs as a means to improve their quality of life. Objective Our proposed project will evaluate the use, effectiveness, and value for money of the ePRO tool in a 12-month multicenter, randomized controlled trial in Ontario, targeting individuals 65 or over with two or more chronic conditions that require frequent health care visits to manage their health conditions. Methods Intervention groups using the ePRO tool will be compared with control groups on measures of quality of life, patient experience, and cost-effectiveness. We will also evaluate the implementation of the tool. Results The proposed project presented in this paper will be funded through the Canadian Institutes of Health Research (CIHR) eHealth Innovation Partnerships Program (eHIPP; CIHR–143559). The expected completion date of the study is November, 2019. Conclusions We anticipate our program of work will support improved quality of life and patient self-management, improved patient-centered primary care delivery, and will encourage the adoption of goal-oriented care approaches across primary health care systems. We have partnered with family health teams and quality improvement organizations in Ontario to ensure that our research is practical and that findings are shared widely. We will work with our established international network to develop an implementation framework to support continued adaptation and adoption across Canada and internationally. PMID:27341765

  11. Quality Controlling CMIP datasets at GFDL

    NASA Astrophysics Data System (ADS)

    Horowitz, L. W.; Radhakrishnan, A.; Balaji, V.; Adcroft, A.; Krasting, J. P.; Nikonov, S.; Mason, E. E.; Schweitzer, R.; Nadeau, D.

    2017-12-01

    As GFDL makes the switch from model development to production in light of the Coupled Model Intercomparison Project (CMIP), GFDL's efforts have shifted to testing and, more importantly, to establishing guidelines and protocols for quality control and semi-automated data publishing. Every CMIP cycle introduces key challenges, and the upcoming CMIP6 is no exception. The new CMIP experimental design comprises multiple MIPs facilitating research in different focus areas. This paradigm has implications not only for the groups that develop the models and conduct the runs, but also for the groups that monitor, analyze, and quality control the datasets before data publishing, and before their knowledge makes its way into reports like the IPCC (Intergovernmental Panel on Climate Change) Assessment Reports. In this talk, we discuss some of the paths taken at GFDL to quality control the CMIP-ready datasets, including: Jupyter notebooks; PrePARE; and a LAMP (Linux, Apache, MySQL, PHP/Python/Perl) technology-driven tracker system to monitor the status of experiments qualitatively and quantitatively and to provide additional metadata and analysis services, along with some built-in controlled-vocabulary validations in the workflow. In addition, we discuss the integration of community-based model evaluation software (ESMValTool, PCMDI Metrics Package, and ILAMB) as part of our CMIP6 workflow.

  12. Likelihood of achieving air quality targets under model uncertainties.

    PubMed

    Digar, Antara; Cohan, Daniel S; Cox, Dennis D; Kim, Byeong-Uk; Boylan, James W

    2011-01-01

    Regulatory attainment demonstrations in the United States typically apply a bright-line test to predict whether a control strategy is sufficient to attain an air quality standard. Photochemical models are the best tools available to project future pollutant levels and are a critical part of regulatory attainment demonstrations. However, because photochemical models are uncertain and future meteorology is unknowable, future pollutant levels cannot be predicted perfectly and attainment cannot be guaranteed. This paper introduces a computationally efficient methodology for estimating the likelihood that an emission control strategy will achieve an air quality objective in light of uncertainties in photochemical model input parameters (e.g., uncertain emission and reaction rates, deposition velocities, and boundary conditions). The method incorporates Monte Carlo simulations of a reduced-form model representing pollutant-precursor response under parametric uncertainty to probabilistically predict the improvement in air quality due to emission controls. The method is applied to recent 8-h ozone attainment modeling for Atlanta, Georgia, to assess the likelihood that additional controls would achieve fixed (well-defined) or flexible (due to meteorological variability and uncertain emission trends) targets of air pollution reduction. The results show that, in certain instances, the ranking of the predicted effectiveness of control strategies may differ between probabilistic and deterministic analyses.
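
    A hedged sketch of the probabilistic idea above (a toy one-parameter reduced-form model with made-up numbers, not the paper's ozone model): sample the uncertain response, propagate it, and report the fraction of draws attaining the target:

        # Likelihood that an emission cut attains an air quality target.
        import numpy as np

        rng = np.random.default_rng(0)
        baseline_ppb = 88.0   # hypothetical current ozone design value
        target_ppb = 84.0     # hypothetical standard to attain

        # Uncertain ozone benefit of the planned controls, in ppb, modeled as
        # a lognormal spread around a nominal 5 ppb reduction (assumption).
        benefit = rng.lognormal(mean=np.log(5.0), sigma=0.35, size=100_000)

        attained = baseline_ppb - benefit <= target_ppb
        print(f"likelihood of attainment: {attained.mean():.1%}")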

  13. Easy parallel screening of reagent stability, quality control, and metrology in solid phase peptide synthesis (SPPS) and peptide couplings for microarrays

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Achyuthan, Komandoor E.; Wheeler, David R.

    Evaluating the stability of coupling reagents, quality control (QC), and surface functionalization metrology are all critical to the production of high quality peptide microarrays. We describe a broadly applicable screening technique for evaluating the fidelity of solid phase peptide synthesis (SPPS), the stability of activation/coupling reagents, and a microarray surface metrology tool. This technique was used to assess the stability of the activation reagent 1-{[1-(Cyano-2-ethoxy-2-oxo-ethylidenaminooxy)dimethylamino-morpholinomethylene]}methaneaminium hexafluorophosphate (COMU) (Sigma-Aldrich, St. Louis, MO, USA) by SPPS of Leu-Enkephalin (YGGFL) or the coupling of commercially synthesized YGGFL peptides to (3-aminopropyl)triethoxysilane-modified glass surfaces. Coupling efficiency was quantitated by fluorescence signaling based on immunoreactivity of the YGGFL motif. It was concluded that COMU solutions should be prepared fresh and used within 5 h when stored at ~23 °C and not beyond 24 h if stored refrigerated, both in closed containers. Caveats to gauging COMU stability by absorption spectroscopy are discussed. Commercial YGGFL peptides needed independent QC, due to immunoreactivity variations for the same sequence synthesized by different vendors. This technique is useful in evaluating the stability of other activation/coupling reagents besides COMU and as a metrology tool for SPPS and peptide microarrays.

  14. Easy parallel screening of reagent stability, quality control, and metrology in solid phase peptide synthesis (SPPS) and peptide couplings for microarrays

    DOE PAGES

    Achyuthan, Komandoor E.; Wheeler, David R.

    2015-08-27

    Evaluating the stability of coupling reagents, quality control (QC), and surface functionalization metrology are all critical to the production of high quality peptide microarrays. We describe a broadly applicable screening technique for evaluating the fidelity of solid phase peptide synthesis (SPPS), the stability of activation/coupling reagents, and a microarray surface metrology tool. This technique was used to assess the stability of the activation reagent 1-{[1-(Cyano-2-ethoxy-2-oxo-ethylidenaminooxy)dimethylamino-morpholinomethylene]}methaneaminium hexafluorophosphate (COMU) (Sigma-Aldrich, St. Louis, MO, USA) by SPPS of Leu-Enkephalin (YGGFL) or the coupling of commercially synthesized YGGFL peptides to (3-aminopropyl)triethoxysilane-modified glass surfaces. Coupling efficiency was quantitated by fluorescence signaling based on immunoreactivity of the YGGFL motif. It was concluded that COMU solutions should be prepared fresh and used within 5 h when stored at ~23 °C and not beyond 24 h if stored refrigerated, both in closed containers. Caveats to gauging COMU stability by absorption spectroscopy are discussed. Commercial YGGFL peptides needed independent QC, due to immunoreactivity variations for the same sequence synthesized by different vendors. This technique is useful in evaluating the stability of other activation/coupling reagents besides COMU and as a metrology tool for SPPS and peptide microarrays.

  15. Evaluation of medical record quality and communication skills among pediatric interns after standardized parent training history-taking in China.

    PubMed

    Yu, Mu Xue; Jiang, Xiao Yun; Li, Yi Juan; Shen, Zhen Yu; Zhuang, Si Qi; Gu, Yu Fen

    2018-02-01

    The effect of history-taking training with standardized parents on the quality of medical records and the communication skills of pediatric interns was determined. Fifth-year interns undertaking a pediatric clinical practice rotation were randomized to intervention and control groups. All pediatric interns received history-taking training by lecture and bedside teaching; interns in the intervention group additionally received history-taking training with standardized parents. Two outcome measures were used: the scores of medical records written by the pediatric interns after history-taking from real parents of pediatric patients, and the communication assessment tool (CAT) completed by real parents. The general information, history of present illness (HPI), past medical history, personal history, family history, diagnosis, diagnostic analysis, and differential diagnosis scores in the intervention group were significantly higher than in the control group (p < 0.05). Assessment of the CAT indicated that real parents were more satisfied with the pediatric interns in the intervention group. History-taking training with standardized parents is effective in improving the quality of medical records written by pediatric interns, and is a superior teaching tool for clinical reasoning ability as well as communication skills in clinical pediatric practice.

  16. [Preliminary studies on critical control points of a traceability system for wolfberry].

    PubMed

    Liu, Sai; Xu, Chang-Qing; Li, Jian-Ling; Lin, Chen; Xu, Rong; Qiao, Hai-Li; Guo, Kun; Chen, Jun

    2016-07-01

    As a traditional Chinese medicine, wolfberry (Lycium barbarum) has a long cultivation history and a good foundation for industrial development. With the development of wolfberry production, the expansion of the cultivation area, and the increased attention of governments and consumers to food safety, higher quality and safety requirements are being placed on wolfberry. Quality tracing and a traceability system covering the entire production process are important technological tools for protecting wolfberry safety and maintaining the sustained, healthy development of the wolfberry industry. This article therefore analyzed wolfberry quality management from the actual production situation; the sources of safety hazards were discussed according to HACCP (hazard analysis and critical control point) principles and GAP (good agricultural practice for Chinese crude drugs), to provide a reference for a wolfberry traceability system. Copyright© by the Chinese Pharmaceutical Association.

  17. Iterative evaluation in a mobile counseling and testing program to reach people of color at risk for HIV--new strategies improve program acceptability, effectiveness, and evaluation capabilities.

    PubMed

    Spielberg, Freya; Kurth, Ann; Reidy, William; McKnight, Teka; Dikobe, Wame; Wilson, Charles

    2011-06-01

    This article highlights findings from an evaluation that explored the impact of mobile versus clinic-based testing, rapid versus central-lab based testing, incentives for testing, and the use of a computer counseling program to guide counseling and automate evaluation in a mobile program reaching people of color at risk for HIV. The program's results show that an increased focus on mobile outreach using rapid testing, incentives and health information technology tools may improve program acceptability, quality, productivity and timeliness of reports. This article describes program design decisions based on continuous quality assessment efforts. It also examines the impact of the Computer Assessment and Risk Reduction Education computer tool on HIV testing rates, staff perception of counseling quality, program productivity, and on the timeliness of evaluation reports. The article concludes with a discussion of implications for programmatic responses to the Centers for Disease Control and Prevention's HIV testing recommendations.

  18. ITERATIVE EVALUATION IN A MOBILE COUNSELING AND TESTING PROGRAM TO REACH PEOPLE OF COLOR AT RISK FOR HIV—NEW STRATEGIES IMPROVE PROGRAM ACCEPTABILITY, EFFECTIVENESS, AND EVALUATION CAPABILITIES

    PubMed Central

    Spielberg, Freya; Kurth, Ann; Reidy, William; McKnight, Teka; Dikobe, Wame; Wilson, Charles

    2016-01-01

    This article highlights findings from an evaluation that explored the impact of mobile versus clinic-based testing, rapid versus central-lab based testing, incentives for testing, and the use of a computer counseling program to guide counseling and automate evaluation in a mobile program reaching people of color at risk for HIV. The program’s results show that an increased focus on mobile outreach using rapid testing, incentives and health information technology tools may improve program acceptability, quality, productivity and timeliness of reports. This article describes program design decisions based on continuous quality assessment efforts. It also examines the impact of the Computer Assessment and Risk Reduction Education computer tool on HIV testing rates, staff perception of counseling quality, program productivity, and on the timeliness of evaluation reports. The article concludes with a discussion of implications for programmatic responses to the Centers for Disease Control and Prevention’s HIV testing recommendations. PMID:21689041

  19. Using a statistical process control chart during the quality assessment of cancer registry data.

    PubMed

    Myles, Zachary M; German, Robert R; Wilson, Reda J; Wu, Manxia

    2011-01-01

    Statistical process control (SPC) charts may be used to detect acute variations in the data while simultaneously evaluating unforeseen aberrations that may warrant further investigation by the data user. Using cancer stage data captured by the Summary Stage 2000 (SS2000) variable, we sought to present a brief report highlighting the utility of the SPC chart during the quality assessment of cancer registry data. Using a county-level caseload for the diagnosis period 2001-2004 (n=25,648), we found the overall variation of the SS2000 variable to be in control during the diagnosis years 2001 and 2002; it exceeded the lower control limit (LCL) in 2003 and the upper control limit (UCL) in 2004. In situ/localized stages were in control throughout the diagnosis period, regional stage exceeded the UCL in 2004, and distant stage exceeded the LCL in 2001 and the UCL in 2004. Our application of the SPC chart to cancer registry data illustrates that it may serve as a readily available and timely tool for identifying areas of concern during the data collection and quality assessment of central cancer registry data.
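
    A hedged sketch of the control-limit arithmetic behind such a chart (invented counts, not the registry's data): for a p-chart, the limits are the pooled proportion plus or minus three standard errors:

        # p-chart limits for yearly proportions of one stage category.
        import math

        yearly = {  # year: (cases in category, total cases); made-up numbers
            2001: (1480, 6300),
            2002: (1512, 6450),
            2003: (1385, 6350),
            2004: (1725, 6548),
        }

        total_x = sum(x for x, _ in yearly.values())
        total_n = sum(n for _, n in yearly.values())
        p_bar = total_x / total_n  # center line

        for year, (x, n) in yearly.items():
            sigma = math.sqrt(p_bar * (1.0 - p_bar) / n)
            lcl, ucl = p_bar - 3.0 * sigma, p_bar + 3.0 * sigma
            p = x / n
            status = "in control" if lcl <= p <= ucl else "out of control"
            print(f"{year}: p={p:.3f} LCL={lcl:.3f} UCL={ucl:.3f} -> {status}")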

  20. Accurate estimation of short read mapping quality for next-generation genome sequencing

    PubMed Central

    Ruffalo, Matthew; Koyutürk, Mehmet; Ray, Soumya; LaFramboise, Thomas

    2012-01-01

    Motivation: Several software tools specialize in the alignment of short next-generation sequencing reads to a reference sequence. Some of these tools report a mapping quality score for each alignment—in principle, this quality score tells researchers the likelihood that the alignment is correct. However, the reported mapping quality often correlates weakly with actual accuracy, and the qualities of many mappings are underestimated, encouraging researchers to discard correct mappings. Further, these low-quality mappings tend to correlate with variations in the genome (both single nucleotide and structural), and such mappings are important in accurately identifying genomic variants. Approach: We develop a machine learning tool, LoQuM (LOgistic regression tool for calibrating the Quality of short read Mappings), to assign reliable mapping quality scores to mappings of Illumina reads returned by any alignment tool. LoQuM uses statistics on the read (base quality scores reported by the sequencer) and the alignment (number of matches, mismatches and deletions, mapping quality score returned by the alignment tool, if available, and number of mappings) as features for classification, and uses simulated reads to learn a logistic regression model that relates these features to actual mapping quality. Results: We test the predictions of LoQuM on an independent dataset generated by the ART short read simulation software and observe that LoQuM can ‘resurrect’ many mappings that are assigned zero quality scores by the alignment tools and are therefore likely to be discarded by researchers. We also observe that the recalibration of mapping quality scores greatly enhances the precision of called single nucleotide polymorphisms. Availability: LoQuM is available as open source at http://compbio.case.edu/loqum/. Contact: matthew.ruffalo@case.edu. PMID:22962451
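
    A hedged sketch of the recalibration idea (synthetic features and labels, not LoQuM itself): fit a logistic regression from alignment features to correctness on simulated reads, then report Phred-scaled predicted probabilities as new mapping qualities:

        # Recalibrate mapping qualities with logistic regression; toy data.
        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(0)
        n = 5000
        X = np.column_stack([
            rng.poisson(2, n),       # mismatch count
            rng.integers(0, 61, n),  # aligner-reported mapping quality
            rng.integers(1, 5, n),   # number of candidate mappings
        ]).astype(float)

        # Synthetic "alignment correct?" labels standing in for simulated truth.
        logits = -0.8 * X[:, 0] + 0.1 * X[:, 1] - 0.5 * X[:, 2] + 1.0
        y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logits))).astype(int)

        model = LogisticRegression(max_iter=1000).fit(X, y)
        p_correct = model.predict_proba(X)[:, 1]
        phred = -10.0 * np.log10(np.clip(1.0 - p_correct, 1e-6, None))
        print(phred[:5].round(1))  # recalibrated Phred-scaled qualities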

  1. Emerging developments in the standardized chemical characterization of indoor air quality.

    PubMed

    Nehr, Sascha; Hösen, Elisabeth; Tanabe, Shin-Ichi

    2017-01-01

    Despite the fact that the special characteristics of indoor air pollution make closed environments quite different from outdoor environments, the conceptual ideas for assessing air quality indoors and outdoors are similar. Therefore, the elaboration of International Standards for air quality characterization, with a view to controlling indoor air quality, should build on this common basis. In this short review we describe the possibilities for standardizing tools dedicated to indoor air quality characterization, with a focus on tools that permit the study of indoor air chemistry. The link between indoor exposure and health, as well as the critical processes driving indoor air quality, are introduced. Available International Standards for the assessment of indoor air quality are described. The standards comprise requirements for on-site sampling, analytical procedures, and the determination of material emissions. To date, these standardized procedures ensure that indoor air, settled dust, and material samples are analyzed in a comparable manner. However, existing International Standards exclusively specify conventional, event-driven target screening using discontinuous measurement methods for long-lived pollutants. This review therefore draws a parallel between physico-chemical processes in indoor and outdoor environments. The achievements of the atmospheric sciences also improve our understanding of indoor environments, and the community of atmospheric scientists can serve both as a model and as a source of support for researchers in the area of indoor air quality characterization. This short review concludes with propositions for future standardization activities for the chemical characterization of indoor air quality. Future standardization efforts should focus on: (i) the elaboration of standardized measurement methods and measurement strategies for online monitoring of long-lived and short-lived pollutants, (ii) the assessment of the potential and the limitations of non-target screening, (iii) the paradigm shift from event-driven investigations to systematic approaches to characterize indoor environments, and (iv) the development of tools for policy implementation. Copyright © 2016 Elsevier Ltd. All rights reserved.

  2. Failure analysis in the identification of synergies between cleaning monitoring methods.

    PubMed

    Whiteley, Greg S; Derry, Chris; Glasbey, Trevor

    2015-02-01

    The 4 monitoring methods used to manage the quality assurance of cleaning outcomes within health care settings are visual inspection, microbial recovery, fluorescent marker assessment, and rapid ATP bioluminometry. These methods each generate different types of information, presenting a challenge to the successful integration of monitoring results. A systematic approach to safety and quality control can be used to interrogate the known qualities of cleaning monitoring methods and provide a prospective management tool for infection control professionals. We investigated the use of failure mode and effects analysis (FMEA) for measuring failure risk arising through each cleaning monitoring method. FMEA uses existing data in a structured risk assessment tool that identifies weaknesses in products or processes. Our FMEA approach used the literature and a small experienced team to construct a series of analyses to investigate the cleaning monitoring methods in a way that minimized identified failure risks. FMEA applied to each of the cleaning monitoring methods revealed failure modes for each. The combined use of cleaning monitoring methods in sequence is preferable to their use in isolation. When these 4 cleaning monitoring methods are used in combination in a logical sequence, the failure modes noted for any 1 can be complemented by the strengths of the alternatives, thereby circumventing the risk of failure of any individual cleaning monitoring method. Copyright © 2015 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Elsevier Inc. All rights reserved.

  3. Process evaluation of software using the international classification of external causes of injuries for collecting burn injury data at burn centers in the United States.

    PubMed

    Villaveces, Andrés; Peck, Michael; Faraklas, Iris; Hsu-Chang, Naiwei; Joe, Victor; Wibbenmeyer, Lucy

    2014-01-01

    Detailed information on the cause of burns is necessary to construct effective prevention programs. The International Classification of External Causes of Injury (ICECI) is a data collection tool that allows comprehensive categorization of multiple facets of injury events. The objective of this study was to conduct a process evaluation of software designed to improve the ease of use of the ICECI so as to identify key additional variables useful for understanding the occurrence of burn injuries, and compare this software with existing data-collection practices conducted for burn injuries. The authors completed a process evaluation of the implementation and ease of use of the software in six U.S. burn centers. They also collected preliminary burn injury data and compared them with existing variables reported to the American Burn Association's National Burn Repository (NBR). The authors accomplished their goals of 1) creating a data-collection tool for the ICECI, which can be linked to existing operational programs of the NBR, 2) training registrars in the use of this tool, 3) establishing quality-control mechanisms for ensuring accuracy and reliability, 4) incorporating ICECI data entry into the weekly routine of the burn registrar, and 5) demonstrating the quality differences between data collected using this tool and the NBR. Using this or similar tools with the ICECI structure or key selected variables can improve the quantity and quality of data on burn injuries in the United States and elsewhere and thus can be more useful in informing prevention strategies.

  4. A Cluster Randomized-Controlled Trial of the Impact of the Tools of the Mind Curriculum on Self-Regulation in Canadian Preschoolers.

    PubMed

    Solomon, Tracy; Plamondon, Andre; O'Hara, Arland; Finch, Heather; Goco, Geraldine; Chaban, Peter; Huggins, Lorrie; Ferguson, Bruce; Tannock, Rosemary

    2017-01-01

Early self-regulation predicts school readiness, academic success, and quality of life in adulthood. Its development in the preschool years is rapid and also malleable. Thus, preschool curricula that promote the development of self-regulation may help set children on a more positive developmental trajectory. We conducted a cluster-randomized controlled trial of the Tools of the Mind preschool curriculum, a program that targets self-regulation through imaginative play and self-regulatory language (Tools; clinical trials identifier NCT02462733). Previous research with Tools is limited, with mixed evidence of its effectiveness. Moreover, it is unclear whether it would benefit all preschoolers or primarily those with poorly developed cognitive capacities (e.g., language, executive function, attention). The study goals were to ascertain whether the Tools program leads to greater gains in self-regulation compared to Playing to Learn (YMCA PTL), another play-based program that does not target self-regulation specifically, and whether the effects were moderated by children's initial language and hyperactivity/inattention. Two hundred and sixty 3- to 4-year-olds attending 20 largely urban daycares were randomly assigned, at the site level, to receive either Tools or YMCA PTL (the business-as-usual curriculum) for 15 months. We assessed self-regulation at pre-, mid-, and post-intervention, using two executive function tasks and two questionnaires regarding behavior at home and at school, to capture development in cognitive as well as socio-emotional aspects of self-regulation. Fidelity data showed that only the teachers at the Tools sites implemented Tools, and did so with reasonable success. We found that children who received Tools made greater gains on a behavioral measure of executive function than their YMCA PTL peers, but the difference was significant only for those children whose parents rated them high in hyperactivity/inattention initially. The effect of Tools did not vary with children's initial language skills. We suggest that, as both programs promote quality play and the two groups fared similarly well overall, Tools and YMCA PTL may be effective curriculum choices for a diverse preschool classroom. However, Tools may be advantageous in classrooms with children experiencing greater challenges with self-regulation, at no apparent cost to those less challenged in this regard.

  5. A Cluster Randomized-Controlled Trial of the Impact of the Tools of the Mind Curriculum on Self-Regulation in Canadian Preschoolers

    PubMed Central

    Solomon, Tracy; Plamondon, Andre; O’Hara, Arland; Finch, Heather; Goco, Geraldine; Chaban, Peter; Huggins, Lorrie; Ferguson, Bruce; Tannock, Rosemary

    2018-01-01

Early self-regulation predicts school readiness, academic success, and quality of life in adulthood. Its development in the preschool years is rapid and also malleable. Thus, preschool curricula that promote the development of self-regulation may help set children on a more positive developmental trajectory. We conducted a cluster-randomized controlled trial of the Tools of the Mind preschool curriculum, a program that targets self-regulation through imaginative play and self-regulatory language (Tools; clinical trials identifier NCT02462733). Previous research with Tools is limited, with mixed evidence of its effectiveness. Moreover, it is unclear whether it would benefit all preschoolers or primarily those with poorly developed cognitive capacities (e.g., language, executive function, attention). The study goals were to ascertain whether the Tools program leads to greater gains in self-regulation compared to Playing to Learn (YMCA PTL), another play-based program that does not target self-regulation specifically, and whether the effects were moderated by children’s initial language and hyperactivity/inattention. Two hundred and sixty 3- to 4-year-olds attending 20 largely urban daycares were randomly assigned, at the site level, to receive either Tools or YMCA PTL (the business-as-usual curriculum) for 15 months. We assessed self-regulation at pre-, mid-, and post-intervention, using two executive function tasks and two questionnaires regarding behavior at home and at school, to capture development in cognitive as well as socio-emotional aspects of self-regulation. Fidelity data showed that only the teachers at the Tools sites implemented Tools, and did so with reasonable success. We found that children who received Tools made greater gains on a behavioral measure of executive function than their YMCA PTL peers, but the difference was significant only for those children whose parents rated them high in hyperactivity/inattention initially. The effect of Tools did not vary with children’s initial language skills. We suggest that, as both programs promote quality play and the two groups fared similarly well overall, Tools and YMCA PTL may be effective curriculum choices for a diverse preschool classroom. However, Tools may be advantageous in classrooms with children experiencing greater challenges with self-regulation, at no apparent cost to those less challenged in this regard. PMID:29403411

  6. Framing quality improvement tools and techniques in healthcare: the case of improvement leaders' guides.

    PubMed

    Millar, Ross

    2013-01-01

    The purpose of this paper is to present a study of how quality improvement tools and techniques are framed within healthcare settings. The paper employs an interpretive approach to understand how quality improvement tools and techniques are mobilised and legitimated. It does so using a case study of the NHS Modernisation Agency Improvement Leaders' Guides in England. Improvement Leaders' Guides were framed within a service improvement approach encouraging the use of quality improvement tools and techniques within healthcare settings. Their use formed part of enacting tools and techniques across different contexts. Whilst this enactment was believed to support the mobilisation of tools and techniques, the experience also illustrated the challenges in distributing such approaches. The paper provides an important contribution in furthering our understanding of framing the "social act" of quality improvement. Given the ongoing emphasis on quality improvement in health systems and the persistent challenges involved, it also provides important information for healthcare leaders globally in seeking to develop, implement or modify similar tools and distribute leadership within health and social care settings.

  7. The Digital electronic Guideline Library (DeGeL): a hybrid framework for representation and use of clinical guidelines.

    PubMed

    Shahar, Yuval; Young, Ohad; Shalom, Erez; Mayaffit, Alon; Moskovitch, Robert; Hessing, Alon; Galperin, Maya

    2004-01-01

We propose to present a poster (and potentially also a demonstration of the implemented system) summarizing the current state of our work on a hybrid, multiple-format representation of clinical guidelines that facilitates conversion of guidelines from free text to a formal representation. We describe a distributed Web-based architecture (DeGeL) and a set of tools using the hybrid representation. The tools enable performing tasks such as guideline specification, semantic markup, search, retrieval, visualization, eligibility determination, runtime application and retrospective quality assessment. The representation includes four parallel formats: Free text (one or more original sources); semistructured text (labeled by the target guideline-ontology semantic labels); semiformal text (which includes some control specification); and a formal, machine-executable representation. The specification, indexing, search, retrieval, and browsing tools are essentially independent of the ontology chosen for guideline representation, but editing the semiformal and formal formats requires ontology-specific tools, which we have developed in the case of the Asbru guideline-specification language. The four formats support increasingly sophisticated computational tasks. The hybrid guidelines are stored in a Web-based library. All tools, such as for runtime guideline application or retrospective quality assessment, are designed to operate on all representations. We demonstrate the hybrid framework by providing examples from the semantic markup and search tools.

  8. TU-AB-BRD-00: Task Group 100

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    2015-06-15

Current quality assurance and quality management guidelines provided by various professional organizations are prescriptive in nature, focusing principally on performance characteristics of planning and delivery devices. However, published analyses of events in radiation therapy show that most events are caused by flaws in clinical processes rather than by device failures. This suggests the need for a quality management program that is based on integrated approaches to process and equipment quality assurance. Industrial engineers have developed various risk assessment tools that are used to identify and eliminate potential failures from a system or a process before a failure impacts a customer. These tools include, but are not limited to, process mapping, failure modes and effects analysis, and fault tree analysis. Task Group 100 of the American Association of Physicists in Medicine has developed these tools and used them to formulate an example risk-based quality management program for intensity-modulated radiotherapy. This is a prospective risk assessment approach that analyzes potential error pathways inherent in a clinical process and then ranks them according to relative risk, typically before implementation, followed by the design of a new process or modification of the existing process. Appropriate controls are then put in place to ensure that failures are less likely to occur and, if they do, that they are more likely to be detected before they propagate through the process, compromising treatment outcome and causing harm to the patient. Such a prospective approach forms the basis of the work of Task Group 100, which has recently been approved by the AAPM. This session will be devoted to a discussion of these tools and practical examples of how they can be used in a given radiotherapy clinic to develop a risk-based quality management program. Learning Objectives: (1) Learn how to design a process map for a radiotherapy process; (2) learn how to perform failure modes and effects analysis for a given process; (3) learn what fault trees are all about; (4) learn how to design a quality management program based upon the information obtained from process mapping, failure modes and effects analysis, and fault tree analysis. Disclosures: Dunscombe: Director, TreatSafely, LLC, and Center for the Assessment of Radiological Sciences; consultant to IAEA and Varian. Thomadsen: President, Center for the Assessment of Radiological Sciences. Palta: Vice President of the Center for the Assessment of Radiological Sciences.

  9. TU-AB-BRD-03: Fault Tree Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dunscombe, P.

    2015-06-15

Current quality assurance and quality management guidelines provided by various professional organizations are prescriptive in nature, focusing principally on performance characteristics of planning and delivery devices. However, published analyses of events in radiation therapy show that most events are caused by flaws in clinical processes rather than by device failures. This suggests the need for a quality management program that is based on integrated approaches to process and equipment quality assurance. Industrial engineers have developed various risk assessment tools that are used to identify and eliminate potential failures from a system or a process before a failure impacts a customer. These tools include, but are not limited to, process mapping, failure modes and effects analysis, and fault tree analysis. Task Group 100 of the American Association of Physicists in Medicine has developed these tools and used them to formulate an example risk-based quality management program for intensity-modulated radiotherapy. This is a prospective risk assessment approach that analyzes potential error pathways inherent in a clinical process and then ranks them according to relative risk, typically before implementation, followed by the design of a new process or modification of the existing process. Appropriate controls are then put in place to ensure that failures are less likely to occur and, if they do, that they are more likely to be detected before they propagate through the process, compromising treatment outcome and causing harm to the patient. Such a prospective approach forms the basis of the work of Task Group 100, which has recently been approved by the AAPM. This session will be devoted to a discussion of these tools and practical examples of how they can be used in a given radiotherapy clinic to develop a risk-based quality management program. Learning Objectives: (1) Learn how to design a process map for a radiotherapy process; (2) learn how to perform failure modes and effects analysis for a given process; (3) learn what fault trees are all about; (4) learn how to design a quality management program based upon the information obtained from process mapping, failure modes and effects analysis, and fault tree analysis. Disclosures: Dunscombe: Director, TreatSafely, LLC, and Center for the Assessment of Radiological Sciences; consultant to IAEA and Varian. Thomadsen: President, Center for the Assessment of Radiological Sciences. Palta: Vice President of the Center for the Assessment of Radiological Sciences.
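
    As a concrete illustration of the fault tree component of this toolkit: basic-event probabilities propagate to the top event through AND gates (all inputs must fail) and OR gates (any input failing suffices), assuming independent events. The sketch below uses hypothetical events and probabilities invented for demonstration; they are not from the session.

        # Illustrative fault-tree evaluation sketch (hypothetical numbers).
        def and_gate(*probs):
            # All inputs must fail: multiply probabilities (independence assumed).
            p = 1.0
            for q in probs:
                p *= q
            return p

        def or_gate(*probs):
            # At least one input fails: complement of all inputs surviving.
            p_none = 1.0
            for q in probs:
                p_none *= 1.0 - q
            return 1.0 - p_none

        # Hypothetical top event: wrong dose delivered if
        # (plan error AND physics check missed) OR (machine miscalibration).
        p_top = or_gate(and_gate(0.01, 0.10), 0.001)
        print(f"P(top event) = {p_top:.6f}")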

  10. TU-AB-BRD-01: Process Mapping

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Palta, J.

    2015-06-15

Current quality assurance and quality management guidelines provided by various professional organizations are prescriptive in nature, focusing principally on performance characteristics of planning and delivery devices. However, published analyses of events in radiation therapy show that most events are caused by flaws in clinical processes rather than by device failures. This suggests the need for a quality management program that is based on integrated approaches to process and equipment quality assurance. Industrial engineers have developed various risk assessment tools that are used to identify and eliminate potential failures from a system or a process before a failure impacts a customer. These tools include, but are not limited to, process mapping, failure modes and effects analysis, and fault tree analysis. Task Group 100 of the American Association of Physicists in Medicine has developed these tools and used them to formulate an example risk-based quality management program for intensity-modulated radiotherapy. This is a prospective risk assessment approach that analyzes potential error pathways inherent in a clinical process and then ranks them according to relative risk, typically before implementation, followed by the design of a new process or modification of the existing process. Appropriate controls are then put in place to ensure that failures are less likely to occur and, if they do, that they are more likely to be detected before they propagate through the process, compromising treatment outcome and causing harm to the patient. Such a prospective approach forms the basis of the work of Task Group 100, which has recently been approved by the AAPM. This session will be devoted to a discussion of these tools and practical examples of how they can be used in a given radiotherapy clinic to develop a risk-based quality management program. Learning Objectives: (1) Learn how to design a process map for a radiotherapy process; (2) learn how to perform failure modes and effects analysis for a given process; (3) learn what fault trees are all about; (4) learn how to design a quality management program based upon the information obtained from process mapping, failure modes and effects analysis, and fault tree analysis. Disclosures: Dunscombe: Director, TreatSafely, LLC, and Center for the Assessment of Radiological Sciences; consultant to IAEA and Varian. Thomadsen: President, Center for the Assessment of Radiological Sciences. Palta: Vice President of the Center for the Assessment of Radiological Sciences.

  11. TU-AB-BRD-02: Failure Modes and Effects Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huq, M.

    2015-06-15

Current quality assurance and quality management guidelines provided by various professional organizations are prescriptive in nature, focusing principally on performance characteristics of planning and delivery devices. However, published analyses of events in radiation therapy show that most events are caused by flaws in clinical processes rather than by device failures. This suggests the need for a quality management program that is based on integrated approaches to process and equipment quality assurance. Industrial engineers have developed various risk assessment tools that are used to identify and eliminate potential failures from a system or a process before a failure impacts a customer. These tools include, but are not limited to, process mapping, failure modes and effects analysis, and fault tree analysis. Task Group 100 of the American Association of Physicists in Medicine has developed these tools and used them to formulate an example risk-based quality management program for intensity-modulated radiotherapy. This is a prospective risk assessment approach that analyzes potential error pathways inherent in a clinical process and then ranks them according to relative risk, typically before implementation, followed by the design of a new process or modification of the existing process. Appropriate controls are then put in place to ensure that failures are less likely to occur and, if they do, that they are more likely to be detected before they propagate through the process, compromising treatment outcome and causing harm to the patient. Such a prospective approach forms the basis of the work of Task Group 100, which has recently been approved by the AAPM. This session will be devoted to a discussion of these tools and practical examples of how they can be used in a given radiotherapy clinic to develop a risk-based quality management program. Learning Objectives: (1) Learn how to design a process map for a radiotherapy process; (2) learn how to perform failure modes and effects analysis for a given process; (3) learn what fault trees are all about; (4) learn how to design a quality management program based upon the information obtained from process mapping, failure modes and effects analysis, and fault tree analysis. Disclosures: Dunscombe: Director, TreatSafely, LLC, and Center for the Assessment of Radiological Sciences; consultant to IAEA and Varian. Thomadsen: President, Center for the Assessment of Radiological Sciences. Palta: Vice President of the Center for the Assessment of Radiological Sciences.

  12. Enabling Mobile Air Quality App Development with an AirNow API

    NASA Astrophysics Data System (ADS)

    Dye, T.; White, J. E.; Ludewig, S. A.; Dickerson, P.; Healy, A. N.; West, J. W.; Prince, L. A.

    2013-12-01

    The U.S. Environmental Protection Agency's (EPA) AirNow program works with over 130 participating state, local, and federal air quality agencies to obtain, quality control, and store real-time air quality observations and forecasts. From these data, the AirNow system generates thousands of maps and products each hour. Each day, information from AirNow is published online and in other media to assist the public in making health-based decisions related to air quality. However, an increasing number of people use mobile devices as their primary tool for obtaining information, and AirNow has responded to this trend by publishing an easy-to-use Web API that is useful for mobile app developers. This presentation will describe the various features of the AirNow application programming interface (API), including Representational State Transfer (REST)-type web services, file outputs, and RSS feeds. In addition, a web portal for the AirNow API will be shown, including documentation on use of the system, a query tool for configuring and running web services, and general information about the air quality data and forecasts available. Data published via the AirNow API includes corresponding Air Quality Index (AQI) levels for each pollutant. We will highlight examples of mobile apps that are using the AirNow API to provide location-based, real-time air quality information. Examples will include mobile apps developed for Minnesota ('Minnesota Air') and Washington, D.C. ('Clean Air Partners Air Quality'), and an app developed by EPA ('EPA AirNow').
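
    For developers, a request to the AirNow API's current-observations-by-ZIP-code service looks roughly like the following Python sketch. The endpoint path, parameter names, and response fields are assumptions based on the public AirNow API documentation (docs.airnowapi.org) and should be verified there; the API key is a placeholder.

        # Minimal sketch of querying AirNow for current AQI observations.
        # Endpoint, parameters, and JSON field names are assumptions to verify
        # against the AirNow API docs; API_KEY is a hypothetical placeholder.
        import requests

        API_KEY = "YOUR_AIRNOW_API_KEY"

        resp = requests.get(
            "https://www.airnowapi.org/aq/observation/zipCode/current/",
            params={
                "format": "application/json",
                "zipCode": "20002",
                "distance": 25,       # search radius in miles
                "API_KEY": API_KEY,
            },
            timeout=10,
        )
        resp.raise_for_status()
        for obs in resp.json():
            print(obs["ParameterName"], obs["AQI"], obs["Category"]["Name"])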

  13. Standard Free Droplet Digital Polymerase Chain Reaction as a New Tool for the Quality Control of High-Capacity Adenoviral Vectors in Small-Scale Preparations

    PubMed Central

    Boehme, Philip; Stellberger, Thorsten; Solanki, Manish; Zhang, Wenli; Schulz, Eric; Bergmann, Thorsten; Liu, Jing; Doerner, Johannes; Baiker, Armin E.

    2015-01-01

High-capacity adenoviral vectors (HCAdVs) are promising tools for gene therapy as well as for genetic engineering. However, one limitation of the HCAdV vector system is the complex, time-consuming, and labor-intensive production process and the subsequent quality control procedure. Since HCAdVs are deleted for all viral coding sequences, a helper virus (HV) is needed in the production process to provide the sequences for all viral proteins in trans. For the purification procedure of HCAdV, cesium chloride density gradient centrifugation is usually performed, followed by buffer exchange using dialysis or comparable methods. However, performing these steps is technically difficult, potentially error-prone, and not scalable. Here, we establish a new protocol for small-scale production of HCAdV based on commercially available adenovirus purification systems and a standard method for the quality control of final HCAdV preparations. For titration of final vector preparations, we established a droplet digital polymerase chain reaction (ddPCR) that uses a standard-free end-point PCR in small droplets of defined volume. By using different probes, this method is capable of detecting and quantifying HCAdV and HV in one reaction independent of reference material, rendering this method attractive for accurately comparing viral titers between different laboratories. In summary, we demonstrate that it is possible to produce HCAdV on a small scale of sufficient quality and quantity to perform experiments in cell culture, and we established a reliable protocol for vector titration based on ddPCR. Our method significantly reduces the time and equipment required for HCAdV production. In the future the ddPCR technology could be advantageous for titration of other viral vectors commonly used in gene therapy. PMID:25640117
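
    The standard-free quantification underlying ddPCR follows from Poisson statistics: if k of n droplets fluoresce positive at end point, the mean number of copies per droplet is lambda = -ln(1 - k/n), and dividing by the droplet volume yields absolute concentration without a calibration curve. A minimal sketch, assuming a nominal droplet volume (an instrument constant chosen for illustration, not a value from the paper):

        # Poisson quantification sketch for ddPCR.
        import math

        def ddpcr_copies_per_ul(positive: int, total: int, droplet_nl: float = 0.85) -> float:
            """Concentration in copies/uL. droplet_nl (0.85 nL) is an assumed
            instrument-specific droplet volume, not a value from the paper."""
            lam = -math.log(1.0 - positive / total)   # mean copies per droplet
            return lam / (droplet_nl * 1e-3)          # convert nL to uL

        # Example: 4,200 positive droplets out of 15,000 read.
        print(f"{ddpcr_copies_per_ul(4200, 15000):.1f} copies/uL")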

  14. Comparison of Three Quality of Life Instruments in Lymphatic Filariasis: DLQI, WHODAS 2.0, and LFSQQ

    PubMed Central

    Thomas, Cristina; Narahari, Saravu R.; Bose, Kuthaje S.; Vivekananda, Kuthaje; Nwe, Steven; West, Dennis P.; Kwasny, Mary; Kundu, Roopal V.

    2014-01-01

Background The Global Program to Eliminate Lymphatic Filariasis aims to interrupt transmission of lymphatic filariasis and manage morbidity in people currently living with the disease. A component of morbidity management is improving health-related quality of life (HRQoL) in patients. Measurement of HRQoL in current management programs is varied because of the lack of a standard HRQoL tool for use in the lymphatic filariasis population. Methodology/Principal Findings In this study, the psychometric properties of three health status measures were compared when used in a group of lymphatic filariasis patients and healthy controls. The World Health Organization Disability Assessment Schedule 2.0 (WHODAS 2.0), the Dermatology Life Quality Index (DLQI), and the Lymphatic Filariasis Quality of Life Questionnaire (LFSQQ) were administered to 36 stage II and stage III lymphatic filariasis subjects and 36 age- and sex-matched controls in Kerala, India. All three tools yielded missing value rates lower than 10%, suggesting high feasibility. Highest internal consistency was seen in the LFSQQ (α = 0.97). Discriminant validity analysis demonstrated that HRQoL was significantly lower in the LF group than in controls for the WHODAS 2.0, DLQI, and LFSQQ, but total HRQoL scores did not differ between stage II and stage III lymphedema subjects. The LFSQQ total score correlated most strongly with the WHODAS 2.0 (r = 0.91, p<0.001) and DLQI (r = 0.81, p<0.001). Conclusions/Significance The WHODAS 2.0, DLQI, and LFSQQ demonstrate acceptable feasibility, internal consistency, discriminant validity, and construct validity. Based on our psychometric analyses, the LFSQQ performs the best and is recommended for use in the lymphatic filariasis population. PMID:24587467
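
    Internal consistency figures like the α = 0.97 reported here are Cronbach's alpha, which can be computed directly from an items-by-respondents score matrix: alpha = k/(k-1) * (1 - sum of item variances / variance of the total score). A minimal sketch on simulated data (the scores below are invented, not study data):

        # Cronbach's alpha sketch on made-up item scores
        # (rows = respondents, columns = questionnaire items).
        import numpy as np

        def cronbach_alpha(items: np.ndarray) -> float:
            k = items.shape[1]
            item_vars = items.var(axis=0, ddof=1)
            total_var = items.sum(axis=1).var(ddof=1)
            return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

        rng = np.random.default_rng(0)
        latent = rng.normal(size=(36, 1))               # shared trait per respondent
        data = latent + 0.5 * rng.normal(size=(36, 8))  # 8 correlated items
        print(f"alpha = {cronbach_alpha(data):.2f}")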

  15. Epidemiology and quality assurance: applications at farm level.

    PubMed

    Noordhuizen, J P; Frankena, K

    1999-03-29

Animal production is relevant with respect to farm income and the position of the sector in the market, but also with respect to the quality and safety of products of animal origin, which relate to public health. Animal production is part of a chain of food production. Therefore, producers have to take consumer expectations and demands in the domains of animal health, welfare and environment into account. A different attitude towards production has to be adopted; this attitude can be made visible in good farming practice (GFP) codes. Farmers who focus on quality in its broadest sense need a system supporting them in their management and control of quality risks. Generally speaking, there are three systems for that purpose: GFP, ISO and HACCP. If animal health, like welfare and environmental issues, is regarded as a feature of quality, then animal health care can be executed following quality control principles. The HACCP concept is well suited for quality control at farm level, involving risk identification and risk management. The on-farm monitoring and surveillance system of critical control points in the animal production process is the most important tool in this procedure. Principles for HACCP application, as well as the certification fitness of HACCP, are elaborated upon. They are illustrated using salmonellosis in meat-pig farms as the objective of an HACCP approach. It is further discussed that, in addition to animal health and quality, animal welfare and environmental issues could also be covered by an HACCP-like system in an integrated manner. Ultimately, the HACCP modules could be incorporated into an overall ISO certification.

  16. Objective Evaluation Tool for Texture-Modified Food (OET-TMF): Development of the Tool and Validation.

    PubMed

    Calleja-Fernández, Alicia; Pintor-de-la-Maza, Begoña; Vidal-Casariego, Alfonso; Cano-Rodríguez, Isidoro; Ballesteros-Pomar, María D

    2016-06-01

    Texture-modified diets (TMDs) should fulfil nutritional goals, guarantee homogenous texture, and meet food safety regulations. The food industry has created texture-modified food (TMF) that meets the TMD requirements of quality and safety for inpatients. To design and develop a tool that allows the objective selection of foodstuffs for TMDs that ensures nutritional requirements and swallowing safety of inpatients in order to improve their quality of life, especially regarding their food satisfaction. An evaluation tool was designed to objectively determine the adequacy of food included in the TMD menus of a hospital. The "Objective Evaluation Tool for Texture-Modified Food" (OET-TMF) consists of seven items that evaluate the food's nutritional quality (energy and protein input), presence of allergens, texture and viscosity, cooking, storage type, useful life, and patient acceptance. The total score ranged from 0 to 64 and was divided into four categories: high quality, good quality, medium quality, and low quality. Studying four different commercial TMFs contributed to the validation of the tool. All the evaluated products scored between high and good regarding quality. There was a tendency (p = 0.077) towards higher consumption and a higher overall quality of the product obtained with the OET-TMF. The product that scored highest with the tool was the best accepted; the product with the lowest score had the highest rate of refusal. The OET-TMF allows for the objective discrimination of the quality of TMF. In addition, it shows a certain relationship between the observed and assessed quality intake.

  17. The Three-item ALERT-B Questionnaire Provides a Validated Screening Tool to Detect Chronic Gastrointestinal Symptoms after Pelvic Radiotherapy in Cancer Survivors.

    PubMed

    Taylor, S; Byrne, A; Adams, R; Turner, J; Hanna, L; Staffurth, J; Farnell, D; Sivell, S; Nelson, A; Green, J

    2016-10-01

Although pelvic radiotherapy is an effective treatment for various malignancies, around half of patients develop significant gastrointestinal problems. These symptoms often remain undetected, despite the existence of effective treatments. This study developed and refined a simple screening tool to detect common gastrointestinal symptoms in outpatient clinics. These symptoms have a significant effect on quality of life. This tool will increase detection rates and so enable access to specialist gastroenterologists, which will in turn lead to improved symptom control and quality of life after treatment. A literature review and expert consensus meeting identified four items for the ALERT-B (Assessment of Late Effects of RadioTherapy - Bowel) screening tool. ALERT-B was face-tested for its usability and acceptability using cognitive interviews with 12 patients experiencing late gastrointestinal symptoms after pelvic radiotherapy. Thematic analysis and probe category were used to analyse interview transcripts. Interview data were presented to a group of experts to agree on the final content and format of the tool. ALERT-B was assessed for reliability and tested for validity against the Gastrointestinal Symptom Rating Scale in a clinical study (EAGLE). Overall, the tool was found to be acceptable in terms of wording, response format and completion time. Participant-reported experiences, including lifestyle modifications and the psychological effect of the symptoms, led to further modifications of the tool. The refined tool includes three questions covering rectal bleeding, incontinence, nocturnal bowel movements and impact on quality of life, including mood, relationships and socialising. ALERT-B was successfully validated against the Gastrointestinal Symptom Rating Scale in the EAGLE study, with the tool shown broadly to be internally consistent (Cronbach's α = 0.61 and all item-subscale correlation [Spearman] coefficients > 0.6). The ALERT-B screening tool can be used in clinical practice to improve post-treatment supportive care by triggering the clinical assessment of patients suitable for referral to a gastroenterologist. Copyright © 2016 The Royal College of Radiologists. Published by Elsevier Ltd. All rights reserved.

  18. THE ATMOSPHERIC MODEL EVALUATION TOOL (AMET); AIR QUALITY MODULE

    EPA Science Inventory

This presentation reviews the development of the Atmospheric Model Evaluation Tool (AMET) air quality module. The AMET tool is being developed to aid in model evaluation. This presentation focuses on the air quality evaluation portion of AMET. Presented are examples of the...

  19. Quality Assessment of Comparative Diagnostic Accuracy Studies: Our Experience Using a Modified Version of the QUADAS-2 Tool

    ERIC Educational Resources Information Center

    Wade, Ros; Corbett, Mark; Eastwood, Alison

    2013-01-01

    Assessing the quality of included studies is a vital step in undertaking a systematic review. The recently revised Quality Assessment of Diagnostic Accuracy Studies (QUADAS) tool (QUADAS-2), which is the only validated quality assessment tool for diagnostic accuracy studies, does not include specific criteria for assessing comparative studies. As…

  20. The PEPR GeneChip data warehouse, and implementation of a dynamic time series query tool (SGQT) with graphical interface.

    PubMed

    Chen, Josephine; Zhao, Po; Massaro, Donald; Clerch, Linda B; Almon, Richard R; DuBois, Debra C; Jusko, William J; Hoffman, Eric P

    2004-01-01

Publicly accessible DNA databases (genome browsers) are rapidly accelerating post-genomic research (see http://www.genome.ucsc.edu/), with integrated genomic DNA, gene structure, EST/splicing and cross-species ortholog data. DNA databases have relatively low dimensionality; the genome is a linear code that anchors all associated data. In contrast, RNA expression and protein databases need to be able to handle very high dimensional data, with time, tissue, cell type and genes, as interrelated variables. The high dimensionality of microarray expression profile data, and the lack of a standard experimental platform have complicated the development of web-accessible databases and analytical tools. We have designed and implemented a public resource of expression profile data containing 1024 human, mouse and rat Affymetrix GeneChip expression profiles, generated in the same laboratory, and subject to the same quality and procedural controls (Public Expression Profiling Resource; PEPR). Our Oracle-based PEPR data warehouse includes a novel time series query analysis tool (SGQT), enabling dynamic generation of graphs and spreadsheets showing the action of any transcript of interest over time. In this report, we demonstrate the utility of this tool using a 27 time point, in vivo muscle regeneration series. This data warehouse and associated analysis tools provide access to multidimensional microarray data through web-based interfaces, both for download of all types of raw data for independent analysis, and also for straightforward gene-based queries. Planned implementations of PEPR will include web-based remote entry of projects adhering to quality control and standard operating procedure (QC/SOP) criteria, and automated output of alternative probe set algorithms for each project (see http://microarray.cnmcresearch.org/pgadatatable.asp).

  1. The PEPR GeneChip data warehouse, and implementation of a dynamic time series query tool (SGQT) with graphical interface

    PubMed Central

    Chen, Josephine; Zhao, Po; Massaro, Donald; Clerch, Linda B.; Almon, Richard R.; DuBois, Debra C.; Jusko, William J.; Hoffman, Eric P.

    2004-01-01

Publicly accessible DNA databases (genome browsers) are rapidly accelerating post-genomic research (see http://www.genome.ucsc.edu/), with integrated genomic DNA, gene structure, EST/splicing and cross-species ortholog data. DNA databases have relatively low dimensionality; the genome is a linear code that anchors all associated data. In contrast, RNA expression and protein databases need to be able to handle very high dimensional data, with time, tissue, cell type and genes, as interrelated variables. The high dimensionality of microarray expression profile data, and the lack of a standard experimental platform have complicated the development of web-accessible databases and analytical tools. We have designed and implemented a public resource of expression profile data containing 1024 human, mouse and rat Affymetrix GeneChip expression profiles, generated in the same laboratory, and subject to the same quality and procedural controls (Public Expression Profiling Resource; PEPR). Our Oracle-based PEPR data warehouse includes a novel time series query analysis tool (SGQT), enabling dynamic generation of graphs and spreadsheets showing the action of any transcript of interest over time. In this report, we demonstrate the utility of this tool using a 27 time point, in vivo muscle regeneration series. This data warehouse and associated analysis tools provide access to multidimensional microarray data through web-based interfaces, both for download of all types of raw data for independent analysis, and also for straightforward gene-based queries. Planned implementations of PEPR will include web-based remote entry of projects adhering to quality control and standard operating procedure (QC/SOP) criteria, and automated output of alternative probe set algorithms for each project (see http://microarray.cnmcresearch.org/pgadatatable.asp). PMID:14681485

  2. The development of a quality appraisal tool for studies of diagnostic reliability (QAREL).

    PubMed

    Lucas, Nicholas P; Macaskill, Petra; Irwig, Les; Bogduk, Nikolai

    2010-08-01

    In systematic reviews of the reliability of diagnostic tests, no quality assessment tool has been used consistently. The aim of this study was to develop a specific quality appraisal tool for studies of diagnostic reliability. Key principles for the quality of studies of diagnostic reliability were identified with reference to epidemiologic principles, existing quality appraisal checklists, and the Standards for Reporting of Diagnostic Accuracy (STARD) and Quality Assessment of Diagnostic Accuracy Studies (QUADAS) resources. Specific items that encompassed each of the principles were developed. Experts in diagnostic research provided feedback on the items that were to form the appraisal tool. This process was iterative and continued until consensus among experts was reached. The Quality Appraisal of Reliability Studies (QAREL) checklist includes 11 items that explore seven principles. Items cover the spectrum of subjects, spectrum of examiners, examiner blinding, order effects of examination, suitability of the time interval among repeated measurements, appropriate test application and interpretation, and appropriate statistical analysis. QAREL has been developed as a specific quality appraisal tool for studies of diagnostic reliability. The reliability of this tool in different contexts needs to be evaluated. Copyright (c) 2010 Elsevier Inc. All rights reserved.

  3. Evaluating radiographers' diagnostic accuracy in screen-reading mammograms: what constitutes a quality study?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Debono, Josephine C, E-mail: josephine.debono@bci.org.au; Poulos, Ann E; Westmead Breast Cancer Institute, Westmead, New South Wales

The aim of this study was to first evaluate the quality of studies investigating the diagnostic accuracy of radiographers as mammogram screen-readers and then to develop an adapted tool for determining the quality of screen-reading studies. A literature search was used to identify relevant studies, and a quality evaluation tool was constructed by combining the criteria for quality of Whiting, Rutjes, Dinnes et al. and Brealey and Westwood. This constructed tool was then applied to the studies and subsequently adapted specifically for use in evaluating quality in studies investigating diagnostic accuracy of screen-readers. Eleven studies were identified and the constructed tool applied to evaluate quality. This evaluation resulted in the identification of quality issues with the studies, such as potential for bias, applicability of results, study conduct, reporting of the study and observer characteristics. An assessment of the applicability and relevance of the tool for this area of research resulted in adaptations to the criteria and the development of a tool specifically for evaluating diagnostic accuracy in screen-reading. This tool, with further refinement and rigorous validation, can make a significant contribution to promoting well-designed studies in this important area of research and practice.

  4. Advanced process control framework initiative

    NASA Astrophysics Data System (ADS)

    Hill, Tom; Nettles, Steve

    1997-01-01

The semiconductor industry, one of the world's most fiercely competitive industries, is driven by increasingly complex process technologies and global competition to improve cycle time, quality, and process flexibility. Due to the complexity of these problems, current process control techniques are generally nonautomated, time-consuming, reactive, nonadaptive, and focused on individual fabrication tools and processes. As the semiconductor industry moves into higher density processes, radical new approaches are required. To address the need for advanced factory-level process control in this environment, Honeywell, Advanced Micro Devices (AMD), and SEMATECH formed the Advanced Process Control Framework Initiative (APCFI) joint research project. The project defines and demonstrates an Advanced Process Control (APC) approach based on SEMATECH's Computer Integrated Manufacturing (CIM) Framework. Its scope includes the coordination of Manufacturing Execution Systems, process control tools, and wafer fabrication equipment to provide necessary process control capabilities. Moreover, it takes advantage of the CIM Framework to integrate and coordinate applications from other suppliers that provide services necessary for the overall system to function. This presentation discusses the key concept of model-based process control that differentiates the APC Framework. This major improvement over current methods enables new systematic process control by linking the knowledge of key process settings to desired product characteristics that reside in models created with commercial model development tools. The unique framework-based approach facilitates integration of commercial tools and reuse of their data by tying them together in an object-based structure. The presentation also explores the perspective of each organization's involvement in the APCFI project. Each has complementary goals and expertise to contribute; Honeywell represents the supplier viewpoint, AMD represents the user with 'real customer requirements', and SEMATECH provides a consensus-building organization that widely disseminates technology to suppliers and users in the semiconductor industry that face similar equipment and factory control systems challenges.

  5. Feasibility of UV-VIS-Fluorescence spectroscopy combined with pattern recognition techniques to authenticate a new category of plant food supplements.

    PubMed

    Boggia, Raffaella; Turrini, Federica; Anselmo, Marco; Zunin, Paola; Donno, Dario; Beccaro, Gabriele L

    2017-07-01

Bud extracts, also named "gemmoderivatives", are a new category of natural products obtained by macerating fresh meristematic tissues of trees and plants. In the European Community these botanical remedies are classified as plant food supplements. Nowadays these products are still poorly studied, even though they are widely used and commercialized. Several analytical tools for the quality control of these very expensive supplements are urgently needed in order to avoid mislabelling and frauds. In fact, besides the usual quality controls common to other botanical dietary supplements, these extracts should be checked in order to quickly detect whether the cheaper adult parts of the plants have been deceptively used in place of the corresponding buds, whose harvest period and production are extremely limited. This study aims to provide a screening analytical method based on UV-VIS-Fluorescence spectroscopy coupled with multivariate analysis for rapid, inexpensive and non-destructive quality control of these products.

  6. How to Sustain Change and Support Continuous Quality Improvement

    PubMed Central

    McQuillan, Rory; Harel, Ziv; Weizman, Adam V.; Thomas, Alison; Nesrallah, Gihad; Bell, Chaim M.; Chan, Christopher T.; Chertow, Glenn M.

    2016-01-01

To achieve sustainable change, quality improvement initiatives must become the new way of working rather than something added on to routine clinical care. However, most organizational change is not maintained. In this next article in this Moving Points in Nephrology feature on quality improvement, we provide health care professionals with strategies to sustain and support quality improvement. Threats to sustainability may be identified both at the beginning of a project and when it is ready for implementation. The National Health Service Sustainability Model is reviewed as one example to help identify issues that affect long-term success of quality improvement projects. Tools to help sustain improvement include process control boards, performance boards, standard work, and improvement huddles. Process control and performance boards are methods to communicate improvement results to staff and leadership. Standard work is a written or visual outline of current best practices for a task and provides a framework to ensure that changes that have improved patient care are consistently and reliably applied to every patient encounter. Improvement huddles are short, regular meetings among staff to anticipate problems, review performance, and support a culture of improvement. Many of these tools rely on principles of visual management, which makes systems transparent and simple so that every staff member can rapidly distinguish normal from abnormal working conditions. Even when quality improvement methods are properly applied, the success of a project still depends on contextual factors. Context refers to aspects of the local setting in which the project operates. Context affects resources, leadership support, data infrastructure, team motivation, and team performance. For these reasons, the same project may thrive in a supportive context and fail in a different context. To demonstrate the practical applications of these quality improvement principles, these principles are applied to a hypothetical quality improvement initiative that aims to promote home dialysis (home hemodialysis and peritoneal dialysis). PMID:27016498

  7. Altered States of Consciousness Profile: An Afro-Centric Intrapsychic Evaluation Tool

    PubMed Central

    Bell, Carl C.; Thompson, Belinda; Shorter-Gooden, Kumea; Mays, Raymond; Shakoor, Bambade

    1985-01-01

In an effort to develop an Afro-centric intrapsychic evaluation tool, the Community Mental Health Council, Inc., Altered States of Consciousness Research Team, developed a structured interview used to quantify and qualify the 17 states of consciousness that occurred in black control, precare, and aftercare subjects. Differences were noted in the three groups as to the incidence, prevalence, and quality of the various states of consciousness. It was also noted that the profile obtained from the interviews yielded a sharp clinical picture of the subjects' total intrapsychic propensities. PMID:4057274

  8. Practical applications of surface analytic tools in tribology

    NASA Technical Reports Server (NTRS)

    Ferrante, J.

    1980-01-01

Many of the currently available, widely used tools for surface analysis are described. Those which have the highest applicability for giving elemental and/or compound analysis for problems of interest in tribology, and which are truly surface sensitive (that is, sampling less than 10 atomic layers), are presented. The latter group is evaluated in detail in terms of strengths and weaknesses. Emphasis is placed on post facto analysis of experiments performed under 'real' conditions (e.g., in air with lubricants). It is further indicated that such equipment could be used for screening and quality control.

  9. 3D measurement by digital photogrammetry

    NASA Astrophysics Data System (ADS)

    Schneider, Carl T.

    1993-12-01

Photogrammetry is well known in geodetic surveying as aerial photogrammetry and in close-range applications such as architectural photogrammetry. Photogrammetric methods and algorithms, combined with digital cameras and digital image processing methods, are now being introduced for industrial applications such as automation and quality control. This paper describes the photogrammetric and digital image processing algorithms and the calibration methods. These algorithms and methods are demonstrated with application examples: a digital photogrammetric workstation as a mobile multi-purpose 3D measuring tool, and a tube measuring system as an example of a single-purpose tool.

  10. Design and Testing of a Tool for Evaluating the Quality of Diabetes Consumer-Information Web Sites

    PubMed Central

    Steinwachs, Donald; Rubin, Haya R

    2003-01-01

    Background Most existing tools for measuring the quality of Internet health information focus almost exclusively on structural criteria or other proxies for quality information rather than evaluating actual accuracy and comprehensiveness. Objective This research sought to develop a new performance-measurement tool for evaluating the quality of Internet health information, test the validity and reliability of the tool, and assess the variability in diabetes Web site quality. Methods An objective, systematic tool was developed to evaluate Internet diabetes information based on a quality-of-care measurement framework. The principal investigator developed an abstraction tool and trained an external reviewer on its use. The tool included 7 structural measures and 34 performance measures created by using evidence-based practice guidelines and experts' judgments of accuracy and comprehensiveness. Results Substantial variation existed in all categories, with overall scores following a normal distribution and ranging from 15% to 95% (mean was 50% and median was 51%). Lin's concordance correlation coefficient to assess agreement between raters produced a rho of 0.761 (Pearson's r of 0.769), suggesting moderate to high agreement. The average agreement between raters for the performance measures was 0.80. Conclusions Diabetes Web site quality varies widely. Alpha testing of this new tool suggests that it could become a reliable and valid method for evaluating the quality of Internet health sites. Such an instrument could help lay people distinguish between beneficial and misleading information. PMID:14713658
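
    Lin's concordance correlation coefficient, used above to assess inter-rater agreement, differs from Pearson's r in that it also penalizes systematic location and scale shifts between raters: rho_c = 2*cov(x, y) / (var(x) + var(y) + (mean(x) - mean(y))^2). A minimal sketch with invented rater scores (not data from the study):

        # Lin's concordance correlation coefficient (CCC) sketch.
        import numpy as np

        def lins_ccc(x: np.ndarray, y: np.ndarray) -> float:
            mx, my = x.mean(), y.mean()
            vx, vy = x.var(), y.var()                 # population (ddof=0) variances
            cov = ((x - mx) * (y - my)).mean()
            return 2 * cov / (vx + vy + (mx - my) ** 2)

        rater1 = np.array([15, 32, 48, 51, 60, 72, 88, 95], dtype=float)
        rater2 = np.array([18, 30, 45, 55, 58, 70, 90, 92], dtype=float)
        print(f"CCC = {lins_ccc(rater1, rater2):.3f}")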

  11. Quality of surgical randomized controlled trials for acute cholecystitis: assessment based on CONSORT and additional check items.

    PubMed

    Shikata, Satoru; Nakayama, Takeo; Yamagishi, Hisakazu

    2008-01-01

In this study, we conducted a limited survey of reports of surgical randomized controlled trials, using the Consolidated Standards of Reporting Trials (CONSORT) statement and additional check items to clarify problems in the evaluation of surgical reports. A total of 13 randomized trials were selected from two recent review articles on biliary surgery. Each randomized trial was evaluated according to 28 quality measures comprising items from the CONSORT statement plus additional items. Analysis focused on relationships between the quality of each study and the estimated effect gap ("pooled estimate in meta-analysis" minus "estimated effect of each study"). No definite relationships were found between individual study quality and the estimated effect gap. The following items could have been described but were not provided in almost all the surgical RCT reports: "clearly defined outcomes"; "details of randomization"; "participant flow charts"; "intention-to-treat analysis"; "ancillary analyses"; and "financial conflicts of interest". The item "participation of a trial methodologist in the study" was not found in any of the reports. Although the quality of reporting trials is not always related to a biased estimation of treatment effect, the items used for quality measures must be described to enable readers to evaluate the quality and applicability of the reporting. Further development of an assessment tool is needed for items specific to surgical randomized controlled trials.
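
    The "estimated effect gap" used here is simply the pooled meta-analytic estimate minus each study's own estimate. A minimal sketch using fixed-effect inverse-variance pooling on invented effect sizes and standard errors (not values from the study):

        # Fixed-effect inverse-variance pooling and per-study effect gaps.
        import numpy as np

        effects = np.array([0.40, 0.15, 0.55, 0.30])   # per-study effect estimates
        ses = np.array([0.20, 0.10, 0.25, 0.15])       # their standard errors

        weights = 1.0 / ses**2                          # inverse-variance weights
        pooled = (weights * effects).sum() / weights.sum()
        for eff, gap in zip(effects, pooled - effects):
            print(f"study effect {eff:+.2f}  gap {gap:+.2f}")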

  12. RZWQM predicted effects of soil N testing with incorporated automatic parameter optimization software (PEST) and weather input quality control

    USDA-ARS?s Scientific Manuscript database

    Among the most promising tools available for determining precise N requirements are soil mineral N tests. Field tests that evaluated this practice, however, have been conducted under only limited weather and soil conditions. Previous research has shown that using agricultural systems models such as ...

  13. Runoff delay exerts a strong control on the field-scale removal of manure-borne fecal bacteria with runoff

    USDA-ARS?s Scientific Manuscript database

The microbial safety of surface waters is an ongoing issue, threatened by the transport of manure-borne bacteria to water sources used for irrigation or recreation. Predictive modeling has become an effective tool to forecast the microbial quality of water during precipitation events, however...

  14. Use of Longitudinal Regression in Quality Control. Research Report. ETS RR-14-31

    ERIC Educational Resources Information Center

    Lu, Ying; Yen, Wendy M.

    2014-01-01

    This article explores the use of longitudinal regression as a tool for identifying scoring inaccuracies. Student progression patterns, as evaluated through longitudinal regressions, typically are more stable from year to year than are scale score distributions and statistics, which require representative samples to conduct credibility checks.…

  15. Acoustic measurements on trees and logs: a review and analysis

    Treesearch

    Xiping Wang

    2013-01-01

    Acoustic technologies have been well established as material evaluation tools in the past several decades, and their use has become widely accepted in the forest products industry for online quality control and products grading. Recent research developments on acoustic sensing technology offer further opportunities to evaluate standing trees and logs for general wood...

  16. The Allocation of Visual Attention in Multimedia Search Interfaces

    ERIC Educational Resources Information Center

    Hughes, Edith Allen

    2017-01-01

Multimedia analysts are challenged by the massive numbers of unconstrained video clips generated daily. Such clips can include any possible scene and events, and generally have limited quality control. Analysts who must work with such data are overwhelmed by its volume and by the lack of computational tools to probe it effectively. Even with advances…

  17. Inverse simulation system for manual-controlled rendezvous and docking based on artificial neural network

    NASA Astrophysics Data System (ADS)

    Zhou, Wanmeng; Wang, Hua; Tang, Guojin; Guo, Shuai

    2016-09-01

The time-consuming experimental method for handling qualities assessment cannot meet the increasingly fast design requirements of manned space flight. As a tool for aircraft handling qualities research, the model-predictive-control structured inverse simulation (MPC-IS) has potential applications in the aerospace field to guide the astronauts' operations and evaluate the handling qualities more effectively. Therefore, this paper establishes MPC-IS for the manual-controlled rendezvous and docking (RVD) and proposes a novel artificial neural network inverse simulation system (ANN-IS) to further decrease the computational cost. The novel system was obtained by replacing the inverse model of MPC-IS with the artificial neural network. The optimal neural network was trained by the genetic Levenberg-Marquardt algorithm, and finally determined by the Levenberg-Marquardt algorithm. In order to validate MPC-IS and ANN-IS, manual-controlled RVD experiments on the simulator were carried out. The comparisons between simulation results and experimental data demonstrated the validity of the two systems and the high computational efficiency of ANN-IS.

  18. Effects of yoga on chronic neck pain: a systematic review of randomized controlled trials

    PubMed Central

    Kim, Sang-Dol

    2016-01-01

[Purpose] The aim of this study was to investigate the effectiveness of yoga in the management of chronic neck pain. [Subjects and Methods] Five electronic databases were searched to identify randomized controlled trials (RCTs) of yoga intervention on chronic neck pain. The trials were published in the English language between January 1966 and December 2015. The Cochrane Risk of Bias Tool was used to assess the quality of the trials. [Results] Three trials were identified and included in this review. A critical appraisal was performed on the trials, and the result indicated a high risk of bias. A narrative description was used because of the small number of RCTs. Neck pain intensity and functional disability were significantly lower in the yoga groups than in the control groups. [Conclusion] Evidence from the 3 randomized controlled trials shows that yoga may be beneficial for chronic neck pain. The low-quality result of the critical appraisal and the small number of trials suggest that high-quality RCTs are required to examine further the effects of yoga intervention on chronic neck pain relief. PMID:27512290

  19. Donated chemical probes for open science.

    PubMed

    Müller, Susanne; Ackloo, Suzanne; Arrowsmith, Cheryl H; Bauser, Marcus; Baryza, Jeremy L; Blagg, Julian; Böttcher, Jark; Bountra, Chas; Brown, Peter J; Bunnage, Mark E; Carter, Adrian J; Damerell, David; Dötsch, Volker; Drewry, David H; Edwards, Aled M; Edwards, James; Elkins, Jon M; Fischer, Christian; Frye, Stephen V; Gollner, Andreas; Grimshaw, Charles E; IJzerman, Adriaan; Hanke, Thomas; Hartung, Ingo V; Hitchcock, Steve; Howe, Trevor; Hughes, Terry V; Laufer, Stefan; Li, Volkhart Mj; Liras, Spiros; Marsden, Brian D; Matsui, Hisanori; Mathias, John; O'Hagan, Ronan C; Owen, Dafydd R; Pande, Vineet; Rauh, Daniel; Rosenberg, Saul H; Roth, Bryan L; Schneider, Natalie S; Scholten, Cora; Singh Saikatendu, Kumar; Simeonov, Anton; Takizawa, Masayuki; Tse, Chris; Thompson, Paul R; Treiber, Daniel K; Viana, Amélia Yi; Wells, Carrow I; Willson, Timothy M; Zuercher, William J; Knapp, Stefan; Mueller-Fahrnow, Anke

    2018-04-20

    Potent, selective and broadly characterized small molecule modulators of protein function (chemical probes) are powerful research reagents. The pharmaceutical industry has generated many high-quality chemical probes and several of these have been made available to academia. However, probe-associated data and control compounds, such as inactive structurally related molecules and their associated data, are generally not accessible. The lack of data and guidance makes it difficult for researchers to decide which chemical tools to choose. Several pharmaceutical companies (AbbVie, Bayer, Boehringer Ingelheim, Janssen, MSD, Pfizer, and Takeda) have therefore entered into a pre-competitive collaboration to make available a large number of innovative high-quality probes, including all probe-associated data, control compounds and recommendations on use (https://openscienceprobes.sgc-frankfurt.de/). Here we describe the chemical tools and target-related knowledge that have been made available, and encourage others to join the project. © 2018, Müller et al.

  20. Improving and integrating data on invasive species collected by citizen scientists

    USGS Publications Warehouse

    2010-01-01

    Limited resources make it difficult to effectively document, monitor, and control invasive species across large areas, resulting in large gaps in our knowledge of current and future invasion patterns. We surveyed 128 citizen science program coordinators and interviewed 15 of them to evaluate their potential role in filling these gaps. Many programs collect data on invasive species and are willing to contribute these data to public databases. Although resources for education and monitoring are readily available, groups generally lack tools to manage and analyze data. Potential users of these data also retain concerns over data quality. We discuss how to address these concerns about citizen scientist data and programs while preserving the advantages they afford. A unified yet flexible national citizen science program aimed at tracking invasive species location, abundance, and control efforts could be designed using centralized data sharing and management tools. Such a system could meet the needs of multiple stakeholders while allowing efficiencies of scale, greater standardization of methods, and improved data quality testing and sharing. Finally, we present a prototype for such a system (see www.citsci.org).

  1. Analysis of ChIP-seq Data in R/Bioconductor.

    PubMed

    de Santiago, Ines; Carroll, Thomas

    2018-01-01

    The development of novel high-throughput sequencing methods for ChIP (chromatin immunoprecipitation) has provided a very powerful tool to study gene regulation in multiple conditions at unprecedented resolution and scale. Proactive quality control and appropriate data analysis techniques are of critical importance to extract the most meaningful results from the data. Over the last few years, an array of R/Bioconductor tools has been developed allowing researchers to process and analyze ChIP-seq data. This chapter provides an overview of the methods available to analyze ChIP-seq data based primarily on software packages from the open-source Bioconductor project. Protocols described in this chapter cover basic steps including data alignment, peak calling, quality control and data visualization, as well as more complex methods such as the identification of differentially bound regions and functional analyses to annotate regulatory regions. The steps in the data analysis process are demonstrated on publicly available data sets, illustrating the computational procedures routinely used for the analysis of ChIP-seq data in R/Bioconductor, from which readers can construct their own analysis pipelines.
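
    A common ChIP-seq quality-control statistic in such pipelines is the fraction of reads in peaks (FRiP). The chapter itself works in R/Bioconductor, so the following plain-Python sketch, with invented read positions and peak intervals, is only meant to show what the metric computes.

    ```python
    # Illustrative sketch only: FRiP = fraction of reads falling inside
    # called peaks. Toy values below; real pipelines would read alignments
    # and a peak file instead.
    read_starts = [105, 210, 260, 480, 505, 640, 900, 1210, 1250, 1900]
    peaks = [(100, 300), (500, 650), (1200, 1300)]  # (start, end) intervals

    def in_any_peak(pos, intervals):
        return any(start <= pos < end for start, end in intervals)

    n_in_peaks = sum(in_any_peak(r, peaks) for r in read_starts)
    frip = n_in_peaks / len(read_starts)
    print(f"FRiP = {frip:.2f}")  # 0.70 for this toy data
    ```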

  2. Laser-assisted micro sheet forming

    NASA Astrophysics Data System (ADS)

    Holtkamp, Jens; Gillner, Arnold

    2008-01-01

    The fast growing market for micro technical products requires parts with increasing complexity. While sheet metal forming enables low-cost mass production with short cycle times, it is limited by the maximum degree of deformation and the quality of the cut edge. The technology of warm forming partially eliminates these deficiencies. This operation takes place at elevated temperatures, before structural transformation is initiated, and combines characteristic advantages of traditional cold and hot forming processes. Lasers as heat sources provide a high, selective and controllable energy input. Achieving a uniform temperature distribution during heating is generally difficult; here it is accomplished by using an axicon, which generates an annular intensity profile on the sheet metal surface. The temperature of the workpiece, measured by a pyrometer, is tuned by a PI controller. A tool incorporating a multistage die is used to manufacture up to three parts at the same time. The tool is integrated into a hydraulic press. A gearwheel made of the magnesium alloy AZ31 was chosen as the metal demonstrator. The quality of these punched parts could be significantly improved at elevated temperatures.

  3. Monitoring the sensory quality of canned white asparagus through cluster analysis.

    PubMed

    Arana, Inés; Ibañez, Francisco C; Torre, Paloma

    2016-05-01

    White asparagus is one of the 30 vegetables most consumed in the world. This paper unifies the stages of its sensory quality control. The aims of this work were to describe the sensory properties of canned white asparagus and their quality control, and to evaluate the applicability of agglomerative hierarchical clustering (AHC) for classifying and monitoring the sensory quality of manufacturers. Sixteen sensory descriptors and their evaluation technique were defined. The sensory profile of canned white asparagus showed, among other characteristics, high characteristic flavor, little acidity and bitterness, medium firmness and very light fibrosity. The dendrogram established groups of manufacturers that had similar scores in the same set of descriptors, and each cluster grouped the manufacturers that had a similar quality profile. The sensory profile of canned white asparagus was clearly defined through the intensity evaluation of 16 descriptors, and the sensory quality report provided to the manufacturers is detailed and easy to interpret. AHC grouped the manufacturers according to the highest quality scores in certain descriptors and is a useful tool because it is highly visual. © 2015 Society of Chemical Industry.
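
    As a minimal sketch of the clustering step described (with invented descriptor scores, not the study's data), agglomerative hierarchical clustering with Ward linkage can group manufacturers by sensory profile:

    ```python
    # Toy AHC example: rows are manufacturers, columns are mean scores for
    # a few sensory descriptors (flavor, acidity, bitterness, firmness,
    # fibrosity). All numbers are invented.
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster

    scores = np.array([
        [7.8, 2.1, 1.9, 5.2, 1.0],
        [7.5, 2.3, 2.0, 5.0, 1.2],
        [5.1, 4.0, 3.8, 3.9, 3.5],
        [5.3, 3.8, 4.1, 4.1, 3.3],
    ])

    Z = linkage(scores, method="ward")              # dendrogram structure
    labels = fcluster(Z, t=2, criterion="maxclust") # cut into 2 clusters
    print(labels)  # e.g. [2 2 1 1]: two quality groups of manufacturers
    ```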

  4. Acuity systems dialogue and patient classification system essentials.

    PubMed

    Harper, Kelle; McCully, Crystal

    2007-01-01

    Obtaining resources for quality patient care is a major responsibility of nurse leaders and requires accurate information in the political world of budgeting. Patient classification systems (PCS) assist nurse managers in controlling cost and improving patient care while appropriately using financial resources. This paper communicates acuity systems development, background, flaws, and components while discussing a few tools currently available. It also describes the development of a new acuity tool, the Patient Classification System. The PCS tool, developed in a small rural hospital, uses 5 broad concepts: (1) medications, (2) complicated procedures, (3) education, (4) psychosocial issues, and (5) complicated intravenous medications. These concepts are scored on a 4-tiered scale that differentiates significant patient characteristics, supporting equitable patient assignments and improving quality of care and performance. Data obtained through use of the PCS can be used by nurse leaders to effectively and objectively lobby for appropriate patient care resources. Two questionnaires distributed to registered nurses on a medical-surgical unit evaluated the nurses' opinions of the 5 concepts and their importance for establishing patient acuity for in-patient care. Interrater reliability among nurses was 87% with the authors' acuity tool.

  5. Production of rotational parts in small-series and computer-aided planning of its production engineering

    NASA Astrophysics Data System (ADS)

    Dudas, Illes; Berta, Miklos; Cser, Istvan

    1998-12-01

    Modern manufacturing equipment for small-series production of rotational parts includes lathe centers and CNC grinding machines with a high concentration of manufacturing operations. With these machine tools, parts with increased accuracy and surface-quality requirements can be produced. Lathe centers, which combine the manufacturing procedures of lathes using stationary tools with those of drilling-milling machine tools using rotating tools, can also produce the non-rotational surfaces of rotational parts. The high concentration of manufacturing operations makes it necessary to plan and program measuring, monitoring and quality control into the technological process during the manufacturing operation. Taking into consideration the technological possibilities of lathe centers, the scope of computer-aided process planning duties therefore increases significantly. A basic requirement is that the descriptions of the blank and of the finished part be given only once. Starting from these considerations, we have been developing a process planning system for bodies of revolution on the basis of the GTIPROG/EC system, which is used for programming lathe centers. Our paper deals with the results of this development and the problems encountered.

  6. Design and simulation of a sensor for heliostat field closed loop control

    NASA Astrophysics Data System (ADS)

    Collins, Mike; Potter, Daniel; Burton, Alex

    2017-06-01

    Significant research has been completed in pursuit of capital cost reductions for heliostats [1],[2]. The camera array closed loop control concept has the potential to radically alter the way heliostats are controlled and installed by replacing high-quality open loop targeting systems with low-quality targeting devices that rely on measurement of image position to remove tracking errors during operation. Although the system could be used for any heliostat size, it particularly benefits small heliostats by reducing actuation costs, enabling large numbers of heliostats to be calibrated simultaneously, and enabling calibration of heliostats that produce low irradiance (similar to or less than ambient light) on Lambertian calibration targets, such as small heliostats that are far from the tower. A simulation method for the camera array has been designed and verified experimentally. The simulation tool demonstrates that closed loop calibration or control is possible using this device.
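
    A hypothetical sketch of the closed-loop idea follows: the reflected image's measured offset from the aim point drives a small proportional correction of the heliostat axes. The gain, geometry and convergence model below are invented for illustration only.

    ```python
    # Hypothetical closed-loop correction sketch (not the paper's system):
    # measure the image centroid offset on the target and nudge the axes.
    def centroid_offset(image_xy, target_xy):
        return (target_xy[0] - image_xy[0], target_xy[1] - image_xy[1])

    def control_step(axes_deg, image_xy, target_xy, gain=0.05):
        dx, dy = centroid_offset(image_xy, target_xy)
        # Map image-plane error (m) to small axis corrections (deg); in a
        # real system this mapping comes from heliostat-tower geometry.
        return (axes_deg[0] + gain * dx, axes_deg[1] + gain * dy)

    axes = (30.0, 15.0)                    # azimuth, elevation in degrees
    measured, target = (1.2, -0.4), (0.0, 0.0)
    for _ in range(5):
        axes = control_step(axes, measured, target)
        # Pretend the image offset halves after each correction.
        measured = (measured[0] * 0.5, measured[1] * 0.5)
    print(axes)
    ```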

  7. Generating community-built tools for data sharing and analysis in environmental networks

    USGS Publications Warehouse

    Read, Jordan S.; Gries, Corinna; Read, Emily K.; Klug, Jennifer; Hanson, Paul C.; Hipsey, Matthew R.; Jennings, Eleanor; O'Reilley, Catherine; Winslow, Luke A.; Pierson, Don; McBride, Christopher G.; Hamilton, David

    2016-01-01

    Rapid data growth in many environmental sectors has necessitated tools to manage and analyze these data. The development of tools often lags behind the proliferation of data, however, which may slow exploratory opportunities and scientific progress. The Global Lake Ecological Observatory Network (GLEON) collaborative model supports an efficient and comprehensive data–analysis–insight life cycle, including implementations of data quality control checks, statistical calculations/derivations, models, and data visualizations. These tools are community-built and openly shared. We discuss the network structure that enables tool development and a culture of sharing, leading to optimized output from limited resources. Specifically, data sharing and a flat collaborative structure encourage the development of tools that enable scientific insights from these data. Here we provide a cross-section of scientific advances derived from global-scale analyses in GLEON. We document enhancements to science capabilities made possible by the development of analytical tools and highlight opportunities to expand this framework to benefit other environmental networks.

  8. Arduino Due based tool to facilitate in vivo two-photon excitation microscopy

    PubMed Central

    Artoni, Pietro; Landi, Silvia; Sato, Sebastian Sulis; Luin, Stefano; Ratto, Gian Michele

    2016-01-01

    Two-photon excitation spectroscopy is a powerful technique for the characterization of the optical properties of genetically encoded and synthetic fluorescent molecules. Excitation spectroscopy requires tuning the wavelength of the Ti:sapphire laser while carefully monitoring the delivered power. To assist laser tuning and the control of delivered power, we developed an Arduino Due based tool for the automatic acquisition of high quality spectra. This tool is portable, fast, affordable and precise. It allowed studying the impact of scattering and of blood absorption on two-photon excitation light. In this way, we determined the wavelength-dependent deformation of excitation spectra occurring in deep tissues in vivo. PMID:27446677
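
    Two-photon excited fluorescence scales with the square of the excitation power, so excitation spectra are conventionally corrected by dividing by P squared at each wavelength. The sketch below, with invented signal and power readings, shows that correction step only; it is not the authors' acquisition code.

    ```python
    # Power-correction sketch for a two-photon excitation spectrum.
    # Signal and delivered-power values are invented.
    import numpy as np

    wavelengths = np.arange(760, 1001, 40)  # nm, Ti:sapphire tuning range
    raw_signal = np.array([120.0, 340.0, 560.0, 410.0, 220.0, 90.0, 40.0])
    power_mw = np.array([18.0, 22.0, 25.0, 24.0, 20.0, 16.0, 12.0])

    spectrum = raw_signal / power_mw**2     # two-photon signal ~ P^2
    spectrum /= spectrum.max()              # normalize to peak = 1
    for wl, s in zip(wavelengths, spectrum):
        print(f"{wl} nm: {s:.2f}")
    ```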

  9. Approaching the Practice Quality Improvement Project in Interventional Radiology.

    PubMed

    Reis, Stephen P; White, Benjamin; Sutphin, Patrick D; Pillai, Anil K; Kalva, Sanjeeva P; Toomay, Seth M

    2015-12-01

    An important component of maintenance of certification and quality improvement in radiology is the practice quality improvement (PQI) project. In this article, the authors describe several methodologies for initiating and completing PQI projects. Furthermore, the authors illustrate several tools that are vital in compiling, analyzing, and presenting data in an easily understandable and reproducible manner. Last, they describe two PQI projects performed in an interventional radiology division that have successfully improved the quality of care for patients. Using the DMAIC (define, measure, analyze, improve, control) quality improvement framework, interventional radiology throughput has been increased, reducing mediport wait times from 43 to 8 days, and mediport infection rates have decreased from more than 2% to less than 0.4%. Copyright © 2015 American College of Radiology. Published by Elsevier Inc. All rights reserved.
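
    One standard SPC tool for a DMAIC project tracking a rate such as mediport infections is the p-chart. The sketch below, with invented monthly counts, computes conventional 3-sigma limits for the proportion:

    ```python
    # Illustrative p-chart (attribute control chart) for an infection rate.
    # Monthly counts are invented, not the article's data.
    import math

    n_procedures = [48, 52, 50, 47, 55, 51]   # mediports placed per month
    n_infections = [1, 0, 2, 1, 0, 1]

    p_bar = sum(n_infections) / sum(n_procedures)  # overall proportion
    for n, x in zip(n_procedures, n_infections):
        sigma = math.sqrt(p_bar * (1 - p_bar) / n)
        ucl = p_bar + 3 * sigma
        lcl = max(0.0, p_bar - 3 * sigma)
        flag = "out of control" if not (lcl <= x / n <= ucl) else "ok"
        print(f"n={n:3d} p={x / n:.3f} LCL={lcl:.3f} UCL={ucl:.3f} {flag}")
    ```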

  10. [Evaluation of the "initiative pain-free clinic" for quality improvement in postoperative pain management. A prospective controlled study].

    PubMed

    Lehmkuhl, D; Meissner, W; Neugebauer, E A M

    2011-09-01

    The aim was to demonstrate improved postoperative pain management through implementation of the S3 guidelines on treatment of acute perioperative and posttraumatic pain, through the integrated quality management concept "quality management acute pain" of the TÜV Rheinland, or through participation in the benchmark project "Quality improvement in postoperative pain management" (QUIPS). A prospective controlled study (pre-post design) was carried out in hospitals with various levels of care, comparing three hospital groups (n = 17/7/3, respectively). Group 1: participation in the QUIPS project (intraclinic and interclinic comparison of outcome data of postoperative pain treatment); group 2: participation in the quality management acute pain program (certified by TÜV Rheinland); group 3: control group with no involvement in either of the two concepts. In all three groups, anonymous data were collected on patient-reported pain intensity, side effects, pain-related disability and patient satisfaction. A pain therapy intervention was carried out only in group 2, through an integrated quality management concept (certification project: quality management acute pain) with a package of measures to improve structure, process and outcome quality. The TÜV Rheinland certified clinics (group 2) showed a significant improvement in the pre-post comparison (before versus after certification) for maximum pain (visual analogue scale, VAS, from 4.6 to 3.7), stress pain (5.3 to 3.9), pain-related impairment (proportion of patients with pain-related reduction of mobility and movement, 26% to 16.1%; of coughing and breathing, 23.1% to 14.3%) and patient satisfaction (from 13.2 to 13.7; scale 0 = completely unsatisfied, 15 = very satisfied). The clinics participating in QUIPS for 2 years also showed a significant improvement in stress pain (numeric rating scale, NRS, 4.5 to 4.2), pain-related limitation of coughing and breathing (28% to 23.6%), and patient satisfaction (from 11.9 to 12.4). There were no differences in postoperative nausea and vomiting between any of the groups. The certification concept quality management acute pain, as a tool for the successful implementation of the S3 guidelines on treatment of acute perioperative and posttraumatic pain, led to a significant improvement in patient outcomes. Participation in QUIPS is an ideal supplement to TÜV Rheinland certification and can be recommended as a benchmarking tool to evaluate outcomes.

  11. Appraisal Tools for Clinical Practice Guidelines: A Systematic Review

    PubMed Central

    Siering, Ulrich; Eikermann, Michaela; Hausner, Elke; Hoffmann-Eßer, Wiebke; Neugebauer, Edmund A.

    2013-01-01

    Introduction Clinical practice guidelines can improve healthcare processes and patient outcomes, but are often of low quality. Guideline appraisal tools aim to help potential guideline users in assessing guideline quality. We conducted a systematic review of publications describing guideline appraisal tools in order to identify and compare existing tools. Methods Among other sources, we searched MEDLINE, EMBASE and the Cochrane Database of Systematic Reviews from 1995 to May 2011 for relevant primary and secondary publications. We also handsearched the reference lists of relevant publications. On the basis of the available literature we first generated 34 items to be used in the comparison of appraisal tools and grouped them into thirteen quality dimensions. We then extracted formal characteristics as well as questions and statements of the appraisal tools and assigned them to the items. Results We identified 40 different appraisal tools. They covered between three and thirteen of the thirteen possible quality dimensions and between three and 29 of the possible 34 items. The main focus of the appraisal tools were the quality dimensions “evaluation of evidence” (mentioned in 35 tools; 88%), “presentation of guideline content” (34 tools; 85%), “transferability” (33 tools; 83%), “independence” (32 tools; 80%), “scope” (30 tools; 75%), and “information retrieval” (29 tools; 73%). The quality dimensions “consideration of different perspectives” and “dissemination, implementation and evaluation of the guideline” were covered by only twenty (50%) and eighteen tools (45%) respectively. Conclusions Most guideline appraisal tools assess whether the literature search and the evaluation, synthesis and presentation of the evidence in guidelines follow the principles of evidence-based medicine. Although conflicts of interest and norms and values of guideline developers, as well as patient involvement, affect the trustworthiness of guidelines, they are currently insufficiently considered. Greater focus should be placed on these issues in the further development of guideline appraisal tools. PMID:24349397

  12. Quantifying Solar Cell Cracks in Photovoltaic Modules by Electroluminescence Imaging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spataru, Sergiu; Hacke, Peter; Sera, Dezso

    2015-06-14

    This article proposes a method for quantifying the percentage of partially and totally disconnected solar cell cracks by analyzing electroluminescence images of the photovoltaic module taken under high- and low-current forward bias. The method is based on the analysis of the module's electroluminescence intensity distribution, applied at module and cell level. These concepts are demonstrated on a crystalline silicon photovoltaic module that was subjected to several rounds of mechanical loading and humidity-freeze cycling, causing increasing levels of solar cell cracks. The proposed method can be used as a diagnostic tool to rate cell damage or quality of modules after transportation. Moreover, the method can be automated and used in quality control by module manufacturers and installers, or as a diagnostic tool by plant operators and diagnostic service providers.
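
    As a toy illustration of classifying damage from an electroluminescence intensity distribution (thresholds and image values invented, not the paper's algorithm), one can compute the dark-area fraction of a simulated cell image:

    ```python
    # Toy sketch: fraction of dark pixels below an intensity threshold as a
    # crude proxy for disconnected (dark) cell area in an EL image.
    import numpy as np

    rng = np.random.default_rng(1)
    cell = rng.normal(200.0, 15.0, size=(64, 64))         # bright, intact
    cell[:, :16] = rng.normal(40.0, 10.0, size=(64, 16))  # dark strip

    threshold = 100.0                                     # counts; assumed
    dark_fraction = float(np.mean(cell < threshold))
    print(f"dark-area fraction: {dark_fraction:.2%}")     # ~25% here
    ```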

  13. Principles of continuous quality improvement applied to intravenous therapy.

    PubMed

    Dunavin, M K; Lane, C; Parker, P E

    1994-01-01

    Documentation of the application of the principles of continuous quality improvement (CQI) to the health care setting is crucial for understanding the transition from traditional management models to CQI models. A CQI project was designed and implemented by the IV Therapy Department at Lawrence Memorial Hospital to test the application of these principles to intravenous therapy and as a learning tool for the entire organization. Through a prototype inventory project, significant savings in cost and time were demonstrated using check sheets, flow diagrams, control charts, and other statistical tools, as well as the Plan-Do-Check-Act cycle. As a result, a primary goal, increased time for direct patient care, was achieved. Eight hours per week of nursing time were saved, relationships between two work areas were improved, and $6,000 in personnel costs, storage space, and inventory was saved.

  14. Systematic review of the behavioural assessment of pain in cats.

    PubMed

    Merola, Isabella; Mills, Daniel S

    2016-02-01

    The objectives were to review systematically the range of assessment tools used in cats to detect the behavioural expression of pain and the evidence of their quality; and to examine behavioural metrics (considering both the sensory and affective domains) used to assess pain. A search of PubMed and ScienceDirect, alongside articles known to the authors, from 2000 onwards, for papers in English was performed. This was followed by a manual search of the references within the primary data sources. Only peer-reviewed publications that provided information on the assessment tool used to evaluate the behavioural expression of pain in cats, in conscious animals (not anaesthetised cats), were included. No previous systematic reviews were identified. One hundred papers were included in the final assessment. Studies were primarily related to the assessment of pain in relation to surgical procedures, and no clear distinction was made concerning the onset of acute and chronic pain. Ten broad types of instrument to assess pain were identified, and generally the quality of evidence to support the use of the various instruments was poor. Only one specific instrument (UNESP-Botucatu scale) had published evidence of validity, reliability and sensitivity at the level of a randomised control trial, but with a positive rather than placebo control, and limited to its use in the ovariohysterectomy situation. The metrics used within the tools appeared to focus primarily on the sensory aspect of pain, with no study clearly discriminating between the sensory and affective components of pain. Further studies are required to provide a higher quality of evidence for methods used to assess pain in cats. Furthermore, a consistent definition for acute and chronic pain is needed. Tools need to be validated that can detect pain in a range of conditions and by different evaluators (veterinary surgeons and owners), which consider both the sensory and emotional aspects of pain. © ISFM and AAFP 2015.

  15. TU-FG-201-05: Varian MPC as a Statistical Process Control Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carver, A; Rowbottom, C

    Purpose: Quality assurance in radiotherapy requires the measurement of various machine parameters to ensure they remain within permitted values over time. In Truebeam release 2.0 the Machine Performance Check (MPC) was released, allowing beam output and machine axis movements to be assessed in a single test. We aim to evaluate the Varian MPC as a tool for Statistical Process Control (SPC). Methods: Varian's MPC tool was used on three Truebeam and one EDGE linac for a period of approximately one year. MPC was commissioned against independent systems. After this period the data were reviewed to determine whether or not the MPC was useful as a process control tool. Individual tests were analysed with Shewhart control charts, using Matlab for the analysis. Principal component analysis was used to determine whether a multivariate model was of any benefit in analysing the data. Results: Control charts were found to be useful to detect beam output changes, worn T-nuts and jaw calibration issues. Upper and lower control limits were defined at the 95% level. Multivariate SPC was performed using principal component analysis. We found little evidence of clustering beyond that which might naively be expected, such as between beam uniformity and beam output. While this makes multivariate analysis of little use, it suggests that each test gives independent information. Conclusion: The variety of independent parameters tested in MPC makes it a sensitive tool for routine machine QA. We have determined that using control charts in our QA programme would rapidly detect changes in machine performance. The use of control charts allows large quantities of tests to be performed on all linacs without visual inspection of all results. The use of control limits alerts users when data are inconsistent with previous measurements before they become out of specification. A. Carver has received a speaker's honorarium from Varian.
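
    A minimal sketch of a Shewhart individuals chart of the kind described follows, using the conventional 2.66 x mean-moving-range (roughly 3-sigma) limits rather than the paper's 95% limits; the daily output readings are invented:

    ```python
    # Shewhart individuals chart sketch for daily beam-output readings.
    # Values are invented; 2.66 is the standard individuals-chart constant
    # (3/d2 for moving ranges of size 2).
    import numpy as np

    output = np.array([100.1, 99.8, 100.3, 100.0, 99.6, 100.2, 101.4, 100.1])

    mr = np.abs(np.diff(output))      # moving ranges of successive points
    mr_bar = mr.mean()
    center = output.mean()
    ucl = center + 2.66 * mr_bar
    lcl = center - 2.66 * mr_bar
    for i, x in enumerate(output):
        flag = "ALERT" if not (lcl <= x <= ucl) else ""
        print(f"day {i + 1}: {x:6.2f}  [{lcl:.2f}, {ucl:.2f}] {flag}")
    ```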

  16. Comparison of methodological quality rating of systematic reviews on neuropathic pain using AMSTAR and R-AMSTAR.

    PubMed

    Dosenovic, Svjetlana; Jelicic Kadic, Antonia; Vucic, Katarina; Markovina, Nikolina; Pieper, Dawid; Puljak, Livia

    2018-05-08

    Systematic reviews (SRs) in the field of neuropathic pain (NeuP) are increasingly important for decision-making. However, methodological flaws in SRs can reduce the validity of conclusions. Hence, it is important to assess the methodological quality of NeuP SRs critically. Additionally, it remains unclear which assessment tool should be used. We studied the methodological quality of SRs published in the field of NeuP and compared two assessment tools. We systematically searched 5 electronic databases to identify SRs of randomized controlled trials of interventions for NeuP available up to March 2015. Two independent reviewers assessed the methodological quality of the studies using the Assessment of Multiple Systematic Reviews (AMSTAR) and the revised AMSTAR (R-AMSTAR) tools. The scores were converted to percentiles and ranked into 4 grades to allow comparison between the two checklists. Gwet's AC1 coefficient was used for interrater reliability assessment. The 97 included SRs had a wide range of methodological quality scores (AMSTAR median (IQR): 6 (5-8) vs. R-AMSTAR median (IQR): 30 (26-35)). The overall agreement score between the 2 raters was 0.62 (95% CI 0.39-0.86) for AMSTAR and 0.62 (95% CI 0.53-0.70) for R-AMSTAR. The 31 Cochrane systematic reviews (CSRs) were consistently ranked higher than the 66 non-Cochrane systematic reviews (NCSRs). The analysis of individual domains showed the best compliance in a comprehensive literature search (item 3) on both checklists. The results for the domain that was the least compliant differed: conflict of interest (item 11) was the item most poorly reported on AMSTAR vs. publication bias assessment (item 10) on R-AMSTAR. A high positive correlation between the total AMSTAR and R-AMSTAR scores for all SRs, as well as for CSRs and NCSRs, was observed. The methodological quality of the analyzed SRs in the field of NeuP was not optimal, and CSRs had a higher quality than NCSRs. Both AMSTAR and R-AMSTAR tools produced comparable quality ratings. Our results point to weaknesses in the methodology of existing SRs on interventions for the management of NeuP and call for future improvement through better adherence to the analyzed quality checklists, either AMSTAR or R-AMSTAR.
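
    For two raters and two categories, Gwet's AC1 has a closed form: AC1 = (pa - pe)/(1 - pe) with pe = 2*pi*(1 - pi), where pi is the mean prevalence of the first category across both raters and pa is the observed agreement. A minimal sketch with invented ratings:

    ```python
    # Minimal two-rater, two-category Gwet's AC1 sketch (ratings invented).
    ratings_a = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]  # e.g. "adequate" = 1
    ratings_b = [1, 0, 0, 1, 0, 1, 1, 0, 1, 0]

    n = len(ratings_a)
    pa = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n  # observed
    pi = (sum(ratings_a) + sum(ratings_b)) / (2 * n)  # mean prevalence
    pe = 2 * pi * (1 - pi)                            # AC1 chance agreement
    ac1 = (pa - pe) / (1 - pe)
    print(f"observed agreement {pa:.2f}, AC1 {ac1:.2f}")
    ```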

  17. Design and thermal analysis of a mold used in the injection of elastomers

    NASA Astrophysics Data System (ADS)

    Fekiri, Nasser; Canto, Cécile; Madec, Yannick; Mousseau, Pierre; Plot, Christophe; Sarda, Alain

    2017-10-01

    In the process of injection molding of elastomers, improving the energy efficiency of the tools is a current challenge for industry in terms of energy consumption, productivity and product quality. In the rubber industry, 20% of the energy consumed by capital goods comes from heating processes; more than 50% of heat losses are linked to insufficient control and thermal insulation of molds. The design of the tooling is evolving, in particular towards reduction of the heated mass and thermal insulation of the molds. In this paper, we present a complex tool composed, on the one hand, of a multi-cavity mold designed with a reduced heated mass and equipped with independent control zones placed closest to each molding cavity and, on the other hand, of a regulated channel block (RCB) which makes it possible to limit the waste of rubber during injection. The originality of this tool lies in thermally isolating the regulated channel block from the mold, and the cavities from each other, in order to better control the temperature field in the material being transformed. We present the design and instrumentation of the experimental set-up. Experimental measurements allow us to understand the thermal behaviour of the tool and to show the thermal heterogeneities on the surface of the mold and in the various cavities. Injection molding tests with rubber and a thermal balance of the energy consumption of the tool are carried out.

  18. A comprehensive analysis of the IMRT dose delivery process using statistical process control (SPC)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gerard, Karine; Grandhaye, Jean-Pierre; Marchesi, Vincent

    The aim of this study is to introduce tools to improve the security of each IMRT patient treatment by determining action levels for the dose delivery process. To achieve this, the patient-specific quality control results performed with an ionization chamber--and which characterize the dose delivery process--have been retrospectively analyzed using a method borrowed from industry: statistical process control (SPC). The latter consisted in fulfilling four principal well-structured steps. The authors first quantified the short-term variability of ionization chamber measurements regarding the clinical tolerances used in the cancer center (±4% of deviation between the calculated and measured doses) by calculating a control process capability (C(pc)) index. The C(pc) index was found superior to 4, which implies that the observed variability of the dose delivery process is not biased by the short-term variability of the measurement. Then, the authors demonstrated using a normality test that the quality control results could be approximated by a normal distribution with two parameters (mean and standard deviation). Finally, the authors used two complementary tools--control charts and performance indices--to thoroughly analyze the IMRT dose delivery process. Control charts aim at monitoring the process over time using statistical control limits to distinguish random (natural) variations from significant changes in the process, whereas performance indices aim at quantifying the ability of the process to produce data that are within the clinical tolerances, at a precise moment. The authors retrospectively showed that the analysis of three selected control charts (individual value, moving-range, and EWMA control charts) allowed efficient drift detection of the dose delivery process for prostate and head-and-neck treatments before the quality controls were outside the clinical tolerances. Therefore, when analyzed in real time, during quality controls, they should improve the security of treatments. They also showed that the dose delivery processes in the cancer center were in control for prostate and head-and-neck treatments. In parallel, long-term process performance indices (P(p), P(pk), and P(pm)) have been analyzed. Their analysis helped define which actions should be undertaken in order to improve the performance of the process. The prostate dose delivery process has been shown statistically capable (0.08% of the results are expected to be outside the clinical tolerances), contrary to the head-and-neck dose delivery process (5.76% of the results are expected to be outside the clinical tolerances).
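
    A sketch of the EWMA chart mentioned among the three selected control charts, with invented dose deviations; the weight, limit multiplier and in-control estimates are common textbook choices, not values from the paper:

    ```python
    # EWMA control chart sketch over percent deviations between measured
    # and calculated dose. All numbers are invented.
    import math

    devs = [0.5, -0.8, 1.1, 0.2, -0.3, 1.8, 2.2, 2.6]  # percent deviation
    lam, L = 0.2, 3.0            # common EWMA weight and limit multiplier
    mu = sum(devs) / len(devs)
    sigma = math.sqrt(sum((d - mu) ** 2 for d in devs) / (len(devs) - 1))

    z = mu                       # EWMA statistic, started at the mean
    for i, d in enumerate(devs, start=1):
        z = lam * d + (1 - lam) * z
        # Exact time-varying EWMA limit half-width.
        half = L * sigma * math.sqrt(lam / (2 - lam) * (1 - (1 - lam) ** (2 * i)))
        flag = "DRIFT" if abs(z - mu) > half else ""
        print(f"t={i}: EWMA={z:+.2f} limit=±{half:.2f} {flag}")
    ```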

  19. A comprehensive analysis of the IMRT dose delivery process using statistical process control (SPC).

    PubMed

    Gérard, Karine; Grandhaye, Jean-Pierre; Marchesi, Vincent; Kafrouni, Hanna; Husson, François; Aletti, Pierre

    2009-04-01

    The aim of this study is to introduce tools to improve the security of each IMRT patient treatment by determining action levels for the dose delivery process. To achieve this, the patient-specific quality control results performed with an ionization chamber--and which characterize the dose delivery process--have been retrospectively analyzed using a method borrowed from industry: statistical process control (SPC). The latter consisted in fulfilling four principal well-structured steps. The authors first quantified the short-term variability of ionization chamber measurements regarding the clinical tolerances used in the cancer center (±4% of deviation between the calculated and measured doses) by calculating a control process capability (C(pc)) index. The C(pc) index was found superior to 4, which implies that the observed variability of the dose delivery process is not biased by the short-term variability of the measurement. Then, the authors demonstrated using a normality test that the quality control results could be approximated by a normal distribution with two parameters (mean and standard deviation). Finally, the authors used two complementary tools--control charts and performance indices--to thoroughly analyze the IMRT dose delivery process. Control charts aim at monitoring the process over time using statistical control limits to distinguish random (natural) variations from significant changes in the process, whereas performance indices aim at quantifying the ability of the process to produce data that are within the clinical tolerances, at a precise moment. The authors retrospectively showed that the analysis of three selected control charts (individual value, moving-range, and EWMA control charts) allowed efficient drift detection of the dose delivery process for prostate and head-and-neck treatments before the quality controls were outside the clinical tolerances. Therefore, when analyzed in real time, during quality controls, they should improve the security of treatments. They also showed that the dose delivery processes in the cancer center were in control for prostate and head-and-neck treatments. In parallel, long-term process performance indices (P(p), P(pk), and P(pm)) have been analyzed. Their analysis helped define which actions should be undertaken in order to improve the performance of the process. The prostate dose delivery process has been shown statistically capable (0.08% of the results are expected to be outside the clinical tolerances), contrary to the head-and-neck dose delivery process (5.76% of the results are expected to be outside the clinical tolerances).
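
    The long-term performance indices named in the abstract have standard definitions: Pp = (USL - LSL)/(6*sigma), Ppk = min(USL - mu, mu - LSL)/(3*sigma), and the Taguchi-style Ppm, which penalizes deviation from the target. A sketch against the ±4% tolerances, with an invented sample of deviations:

    ```python
    # Long-term performance indices for dose deviations against the ±4%
    # clinical tolerances cited in the abstract. Sample values are invented.
    import statistics as st

    devs = [0.4, -1.2, 0.9, 1.6, -0.6, 0.2, -1.8, 1.1, 0.7, -0.9]
    usl, lsl, target = 4.0, -4.0, 0.0

    mu = st.mean(devs)
    sigma = st.stdev(devs)       # long-term (overall) standard deviation

    pp = (usl - lsl) / (6 * sigma)
    ppk = min(usl - mu, mu - lsl) / (3 * sigma)
    ppm = (usl - lsl) / (6 * (sigma ** 2 + (mu - target) ** 2) ** 0.5)
    print(f"Pp={pp:.2f} Ppk={ppk:.2f} Ppm={ppm:.2f}")
    ```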

  20. The complications of controlling agency time discretion: FDA review deadlines and postmarket drug safety.

    PubMed

    Carpenter, Daniel; Chattopadhyay, Jacqueline; Moffitt, Susan; Nall, Clayton

    2012-01-01

    Public agencies have discretion on the time domain, and politicians deploy numerous policy instruments to constrain it. Yet little is known about how administrative procedures that affect timing also affect the quality of agency decisions. We examine whether administrative deadlines shape decision timing and the observed quality of decisions. Using a unique and rich dataset of FDA drug approvals that allows us to examine decision timing and quality, we find that this administrative tool induces a piling of decisions before deadlines, and that these “just-before-deadline” approvals are linked with higher rates of postmarket safety problems (market withdrawals, severe safety warnings, safety alerts). Examination of data from FDA advisory committees suggests that the deadlines may impede quality by impairing late-stage deliberation and agency risk communication. Our results both support and challenge reigning theories about administrative procedures, suggesting they embody expected control-expertise trade-offs, but may also create unanticipated constituency losses.

  1. Designing a Clinical Data Warehouse Architecture to Support Quality Improvement Initiatives.

    PubMed

    Chelico, John D; Wilcox, Adam B; Vawdrey, David K; Kuperman, Gilad J

    2016-01-01

    Clinical data warehouses, initially directed towards clinical research or financial analyses, are evolving to support quality improvement efforts, and must now address the quality improvement life cycle. In addition, data that are needed for quality improvement often do not reside in a single database, requiring easier methods to query data across multiple disparate sources. We created a virtual data warehouse at NewYork-Presbyterian Hospital that allowed us to bring together data from several source systems throughout the organization. We also created a framework to match the maturity of a data request in the quality improvement life cycle to the proper tools needed for each request. As projects progress through the Define, Measure, Analyze, Improve, Control stages of quality improvement, resources are properly matched to the data needs at each step. We describe the analysis and design that created a robust model for applying clinical data warehousing to quality improvement.

  2. Designing a Clinical Data Warehouse Architecture to Support Quality Improvement Initiatives

    PubMed Central

    Chelico, John D.; Wilcox, Adam B.; Vawdrey, David K.; Kuperman, Gilad J.

    2016-01-01

    Clinical data warehouses, initially directed towards clinical research or financial analyses, are evolving to support quality improvement efforts, and must now address the quality improvement life cycle. In addition, data that are needed for quality improvement often do not reside in a single database, requiring easier methods to query data across multiple disparate sources. We created a virtual data warehouse at NewYork-Presbyterian Hospital that allowed us to bring together data from several source systems throughout the organization. We also created a framework to match the maturity of a data request in the quality improvement life cycle to the proper tools needed for each request. As projects progress through the Define, Measure, Analyze, Improve, Control stages of quality improvement, resources are properly matched to the data needs at each step. We describe the analysis and design that created a robust model for applying clinical data warehousing to quality improvement. PMID:28269833

  3. Evidence-based decision-making in infectious diseases epidemiology, prevention and control: matching research questions to study designs and quality appraisal tools

    PubMed Central

    2014-01-01

    Background The Project on a Framework for Rating Evidence in Public Health (PRECEPT) was initiated and is being funded by the European Centre for Disease Prevention and Control (ECDC) to define a methodology for evaluating and grading evidence and strength of recommendations in the field of public health, with emphasis on infectious disease epidemiology, prevention and control. One of the first steps was to review existing quality appraisal tools (QATs) for individual research studies of various designs relevant to this area, using a question-based approach. Methods Through team discussions and expert consultations, we identified 20 relevant types of public health questions, which were grouped into six domains, i.e. characteristics of the pathogen, burden of disease, diagnosis, risk factors, intervention, and implementation of intervention. Previously published systematic reviews were used and supplemented by expert consultation to identify suitable QATs. Finally, a matrix was constructed for matching questions to study designs suitable to address them and respective QATs. Key features of each of the included QATs were then analyzed, in particular in respect to its intended use, types of questions and answers, presence/absence of a quality score, and if a validation was performed. Results In total we identified 21 QATs and 26 study designs, and matched them. Four QATs were suitable for experimental quantitative study designs, eleven for observational quantitative studies, two for qualitative studies, three for economic studies, one for diagnostic test accuracy studies, and one for animal studies. Included QATs consisted of six to 28 items. Six of the QATs had a summary quality score. Fourteen QATs had undergone at least one validation procedure. Conclusions The results of this methodological study can be used as an inventory of potentially relevant questions, appropriate study designs and QATs for researchers and authorities engaged with evidence-based decision-making in infectious disease epidemiology, prevention and control. PMID:24886571

  4. Evidence-based decision-making in infectious diseases epidemiology, prevention and control: matching research questions to study designs and quality appraisal tools.

    PubMed

    Harder, Thomas; Takla, Anja; Rehfuess, Eva; Sánchez-Vivar, Alex; Matysiak-Klose, Dorothea; Eckmanns, Tim; Krause, Gérard; de Carvalho Gomes, Helena; Jansen, Andreas; Ellis, Simon; Forland, Frode; James, Roberta; Meerpohl, Joerg J; Morgan, Antony; Schünemann, Holger; Zuiderent-Jerak, Teun; Wichmann, Ole

    2014-05-21

    The Project on a Framework for Rating Evidence in Public Health (PRECEPT) was initiated and is being funded by the European Centre for Disease Prevention and Control (ECDC) to define a methodology for evaluating and grading evidence and strength of recommendations in the field of public health, with emphasis on infectious disease epidemiology, prevention and control. One of the first steps was to review existing quality appraisal tools (QATs) for individual research studies of various designs relevant to this area, using a question-based approach. Through team discussions and expert consultations, we identified 20 relevant types of public health questions, which were grouped into six domains, i.e. characteristics of the pathogen, burden of disease, diagnosis, risk factors, intervention, and implementation of intervention. Previously published systematic reviews were used and supplemented by expert consultation to identify suitable QATs. Finally, a matrix was constructed for matching questions to study designs suitable to address them and respective QATs. Key features of each of the included QATs were then analyzed, in particular in respect to its intended use, types of questions and answers, presence/absence of a quality score, and if a validation was performed. In total we identified 21 QATs and 26 study designs, and matched them. Four QATs were suitable for experimental quantitative study designs, eleven for observational quantitative studies, two for qualitative studies, three for economic studies, one for diagnostic test accuracy studies, and one for animal studies. Included QATs consisted of six to 28 items. Six of the QATs had a summary quality score. Fourteen QATs had undergone at least one validation procedure. The results of this methodological study can be used as an inventory of potentially relevant questions, appropriate study designs and QATs for researchers and authorities engaged with evidence-based decision-making in infectious disease epidemiology, prevention and control.

  5. Development of an analysis tool for cloud base height and visibility

    NASA Astrophysics Data System (ADS)

    Umdasch, Sarah; Reinhold, Steinacker; Manfred, Dorninger; Markus, Kerschbaum; Wolfgang, Pöttschacher

    2014-05-01

    The meteorological variables cloud base height (CBH) and horizontal atmospheric visibility (VIS) at surface level are of vital importance for safety and effectiveness in aviation. Around 20% of all civil aviation accidents in the USA from 2003 to 2007 were due to weather related causes, around 18% of which were owing to decreased visibility or ceiling (mainly CBH). The aim of this study is to develop a system generating quality-controlled gridded analyses of the two parameters based on the integration of various kinds of observational data. Upon completion, the tool is planned to provide guidance for nowcasting during take-off and landing as well as for flights operated under visual flight rules. Primary input data consist of manual as well as instrumental observations of CBH and VIS. In Austria, restructuring of part of the standard meteorological stations from human observation to automatic measurement of VIS and CBH is currently in progress. As ancillary data, satellite derived products can add 2-dimensional information, e.g. Cloud Type by NWC SAF (Nowcasting Satellite Application Facilities) MSG (Meteosat Second Generation). Other useful available data are meteorological surface measurements (in particular of temperature, humidity, wind and precipitation), radiosonde, radar and high resolution topography data. A one-year data set is used to study the spatial and weather-dependent representativeness of the CBH and VIS measurements. The VERA (Vienna Enhanced Resolution Analysis) system of the Institute of Meteorology and Geophysics of the University of Vienna provides the framework for the analysis development. Its integrated "Fingerprint" technique allows the insertion of empirical prior knowledge and ancillary information in the form of spatial patterns. Prior to the analysis, a quality control of input data is performed. For CBH and VIS, quality control can consist of internal consistency checks between different data sources. The possibility of two-dimensional consistency checks has to be explored. First results in the development of quality control features and fingerprints will be shown.
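
    A hypothetical sketch of two of the steps described, a plausibility (range) check on raw station reports followed by interpolation to a regular grid; the stations, values and limits below are invented and much simpler than the VERA scheme:

    ```python
    # Invented mini-example: range-check station visibility reports, then
    # interpolate the accepted values to a regular grid.
    import numpy as np
    from scipy.interpolate import griddata

    stations = np.array([[0.1, 0.2], [0.8, 0.3], [0.5, 0.9], [0.2, 0.7]])
    vis_m = np.array([8000.0, 12000.0, -50.0, 6000.0])  # one bad report

    ok = (vis_m >= 0) & (vis_m <= 50000)       # simple plausibility check
    gx, gy = np.mgrid[0:1:25j, 0:1:25j]        # 25 x 25 analysis grid
    grid = griddata(stations[ok], vis_m[ok], (gx, gy), method="linear")
    print(f"grid cells filled: {np.count_nonzero(~np.isnan(grid))}")
    ```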

  6. Development and testing of an active boring bar for increased chatter immunity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Redmond, J.; Barney, P.

    Recent advances in smart materials have renewed interest in the development of improved manufacturing processes featuring sensing, processing, and active control. In particular, vibration suppression in metal cutting has received much attention because of its potential for enhancing part quality while reducing the time and cost of production. Although active tool clamps have been recently demonstrated, they are often accompanied by interfacing issues that limit their applicability to specific machines. Under the auspices of the Laboratory Directed Research and Development program, the project titled "Smart Cutting Tools for Precision Manufacturing" developed an alternative approach to active vibration control in machining. Using the boring process as a vehicle for exploration, a commercially available tool was modified to incorporate PZT stack actuators for active suppression of its bending modes. Since the modified tool requires no specialized mounting hardware, it can be readily mounted on many machines. Cutting tests conducted on a horizontal lathe fitted with a hardened steel workpiece verify that the actively damped boring bar yields significant vibration reduction and improved surface finishes as compared to an unmodified tool.

  7. International Society of Human and Animal Mycology (ISHAM)-ITS reference DNA barcoding database--the quality controlled standard tool for routine identification of human and animal pathogenic fungi.

    PubMed

    Irinyi, Laszlo; Serena, Carolina; Garcia-Hermoso, Dea; Arabatzis, Michael; Desnos-Ollivier, Marie; Vu, Duong; Cardinali, Gianluigi; Arthur, Ian; Normand, Anne-Cécile; Giraldo, Alejandra; da Cunha, Keith Cassia; Sandoval-Denis, Marcelo; Hendrickx, Marijke; Nishikaku, Angela Satie; de Azevedo Melo, Analy Salles; Merseguel, Karina Bellinghausen; Khan, Aziza; Parente Rocha, Juliana Alves; Sampaio, Paula; da Silva Briones, Marcelo Ribeiro; e Ferreira, Renata Carmona; de Medeiros Muniz, Mauro; Castañón-Olivares, Laura Rosio; Estrada-Barcenas, Daniel; Cassagne, Carole; Mary, Charles; Duan, Shu Yao; Kong, Fanrong; Sun, Annie Ying; Zeng, Xianyu; Zhao, Zuotao; Gantois, Nausicaa; Botterel, Françoise; Robbertse, Barbara; Schoch, Conrad; Gams, Walter; Ellis, David; Halliday, Catriona; Chen, Sharon; Sorrell, Tania C; Piarroux, Renaud; Colombo, Arnaldo L; Pais, Célia; de Hoog, Sybren; Zancopé-Oliveira, Rosely Maria; Taylor, Maria Lucia; Toriello, Conchita; de Almeida Soares, Célia Maria; Delhaes, Laurence; Stubbe, Dirk; Dromer, Françoise; Ranque, Stéphane; Guarro, Josep; Cano-Lira, Jose F; Robert, Vincent; Velegraki, Aristea; Meyer, Wieland

    2015-05-01

    Human and animal fungal pathogens are a growing threat worldwide, leading to emerging infections and creating new risks for established ones. There is a growing need for rapid and accurate identification of pathogens to enable early diagnosis and targeted antifungal therapy. Morphological and biochemical identification methods are time-consuming and require trained experts. Alternatively, molecular methods, such as DNA barcoding, a powerful and easy tool for rapid monophasic identification, offer a practical approach for species identification that is less demanding in terms of taxonomic expertise. However, its widespread use is still limited by a lack of quality-controlled reference databases and the evolving recognition and definition of new fungal species/complexes. An international consortium of medical mycology laboratories was formed aiming to establish a quality-controlled ITS database under the umbrella of the ISHAM working group on "DNA barcoding of human and animal pathogenic fungi." A new database was established, containing 2800 ITS sequences representing 421 fungal species and providing the medical community with a freely accessible tool at http://www.isham.org/ and http://its.mycologylab.org/ to rapidly and reliably identify most agents of mycoses. The generated sequences included in the new database were used to evaluate the variation and overall utility of the ITS region for the identification of pathogenic fungi at the intra- and interspecies level. The average intraspecies variation ranged from 0 to 2.25%. This highlighted selected pathogenic fungal species, such as the dermatophytes and emerging yeast, for which additional molecular methods/genetic markers are required for their reliable identification from clinical and veterinary specimens. © The Author 2015. Published by Oxford University Press on behalf of The International Society for Human and Animal Mycology. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
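
    Intraspecies ITS variation of the kind reported (0 to 2.25% on average) is essentially a pairwise percent difference between aligned sequences. A toy sketch with invented, artificially short sequences:

    ```python
    # Pairwise percent difference between aligned sequences (invented and
    # far shorter than a real ITS region).
    from itertools import combinations

    aligned = {
        "isolate_1": "ACGTACGTAGCTAGCTA",
        "isolate_2": "ACGTACGTAGCTAGCTA",
        "isolate_3": "ACGTACCTAGCTAGCTA",
    }

    for (n1, s1), (n2, s2) in combinations(aligned.items(), 2):
        diff = sum(a != b for a, b in zip(s1, s2)) / len(s1) * 100
        print(f"{n1} vs {n2}: {diff:.2f}% difference")
    ```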

  8. Microcontroller based automatic temperature control for oyster mushroom plants

    NASA Astrophysics Data System (ADS)

    Sihombing, P.; Astuti, T. P.; Herriyance; Sitompul, D.

    2018-03-01

    Oyster mushroom cultivation requires special care because oyster mushrooms are susceptible to disease. Mushroom growth will be inhibited if temperature and humidity are not well controlled, since both can affect mold growth. Oyster mushroom growth is usually optimal at temperatures around 22-28°C and humidity around 70-90%. This problem is often encountered in the cultivation of oyster mushrooms, so it is very important to control the temperature and humidity of the oyster mushroom growing room. In this paper, we developed an Arduino Uno microcontroller-based tool for automatic temperature monitoring in oyster mushroom cultivation. The tool controls temperature and humidity automatically via an Android smartphone. If the temperature in the growing room rises above 28°C, the tool automatically turns on a pump that runs water to lower the room temperature; if the room temperature falls below 22°C, a lamp is turned on to heat the room. The temperature in the oyster mushroom room thus remains stable, so the oyster mushrooms can grow with good quality.
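
    The control rule itself is simple threshold logic: pump on above 28°C to cool, lamp on below 22°C to heat. A Python sketch of that rule follows; the sensor readings and device interface are invented stand-ins for the Arduino code:

    ```python
    # Threshold control sketch for the rule described in the abstract.
    def control(temp_c, pump_on, lamp_on):
        if temp_c > 28.0:
            return True, False   # run water pump to cool
        if temp_c < 22.0:
            return False, True   # switch lamp on to heat
        return pump_on, lamp_on  # inside the band: keep current state

    pump = lamp = False
    for reading in [21.0, 23.5, 27.0, 29.2, 28.5, 25.0]:  # invented °C
        pump, lamp = control(reading, pump, lamp)
        print(f"{reading:4.1f} °C -> pump={pump} lamp={lamp}")
    ```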

  9. The effect of massage therapy on the quality of sleep in breast cancer patients.

    PubMed

    Kashani, Fahimeh; Kashani, Parisa

    2014-03-01

    Annually, about 6000 new cases of breast cancer are diagnosed in Iran. Iranian women are affected by breast cancer about a decade earlier than women in other countries, and 70% of them are diagnosed at an advanced stage. Insomnia is the most common disorder following breast cancer, and interference with sleep quality and rest causes changes in physiological functions and reduces the body's daily performance. The objective of this study was to determine the effect of massage therapy on the quality of sleep in patients with breast cancer. This clinical trial was conducted for about 1 month in a referral chemotherapy clinic of a teaching hospital in Isfahan, Iran. The participants consisted of 57 women with breast cancer who were selected by simple random sampling. They were randomly assigned to two groups, control and experimental. The control group was treated only by usual medical therapy, whereas the case group was treated by combined medical-massage therapy. Data collection tools were the validated Pittsburgh Sleep Quality Index and a demographic questionnaire. Data were analyzed by SPSS using descriptive statistics, the Chi-square test, paired t-test, and Student's t-test. The results showed significant differences in the mean scores of quality of sleep before and after the intervention in the case group, while no significant differences were observed in the mean scores of quality of sleep before and after the intervention in the control group. In addition, no significant differences were observed in the mean scores of quality of sleep before the intervention between the case and control groups. However, significant differences were observed in the mean scores of quality of sleep after the intervention between the case and control groups. According to the results of this study, learning and applying massage techniques by medical staff promotes health and improves the quality of sleep in cancer patients. Furthermore, massage therapy is suggested as a non-pharmacologic method to improve sleep quality in these patients.

  10. Marky: a tool supporting annotation consistency in multi-user and iterative document annotation projects.

    PubMed

    Pérez-Pérez, Martín; Glez-Peña, Daniel; Fdez-Riverola, Florentino; Lourenço, Anália

    2015-02-01

    Document annotation is a key task in the development of Text Mining methods and applications. High quality annotated corpora are invaluable, but their preparation requires a considerable amount of resources and time. Although the existing annotation tools offer good user interaction interfaces to domain experts, project management and quality control abilities are still limited. Therefore, the current work introduces Marky, a new Web-based document annotation tool equipped to manage multi-user and iterative projects, and to evaluate annotation quality throughout the project life cycle. At the core, Marky is a Web application based on the open source CakePHP framework. User interface relies on HTML5 and CSS3 technologies. Rangy library assists in browser-independent implementation of common DOM range and selection tasks, and Ajax and JQuery technologies are used to enhance user-system interaction. Marky grants solid management of inter- and intra-annotator work. Most notably, its annotation tracking system supports systematic and on-demand agreement analysis and annotation amendment. Each annotator may work over documents as usual, but all the annotations made are saved by the tracking system and may be further compared. So, the project administrator is able to evaluate annotation consistency among annotators and across rounds of annotation, while annotators are able to reject or amend subsets of annotations made in previous rounds. As a side effect, the tracking system minimises resource and time consumption. Marky is a novel environment for managing multi-user and iterative document annotation projects. Compared to other tools, Marky offers a similar visually intuitive annotation experience while providing unique means to minimise annotation effort and enforce annotation quality, and therefore corpus consistency. Marky is freely available for non-commercial use at http://sing.ei.uvigo.es/marky. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
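
    One way a tracking system like Marky's can quantify inter-annotator consistency is a chance-corrected agreement score; the sketch below computes Cohen's kappa over invented labels and is not taken from the tool itself:

    ```python
    # Cohen's kappa sketch for two annotators' labels (labels invented).
    from collections import Counter

    ann1 = ["GENE", "GENE", "DRUG", "NONE", "DRUG", "GENE", "NONE", "GENE"]
    ann2 = ["GENE", "DRUG", "DRUG", "NONE", "DRUG", "GENE", "GENE", "GENE"]

    n = len(ann1)
    po = sum(a == b for a, b in zip(ann1, ann2)) / n           # observed
    c1, c2 = Counter(ann1), Counter(ann2)
    pe = sum(c1[k] * c2[k] for k in set(c1) | set(c2)) / n**2  # chance
    kappa = (po - pe) / (1 - pe)
    print(f"observed {po:.2f}, chance {pe:.2f}, kappa {kappa:.2f}")
    ```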

  11. Quality of Reporting Randomized Controlled Trials in Five Leading Neurology Journals in 2008 and 2013 Using the Modified "Risk of Bias" Tool.

    PubMed

    Zhai, Xiao; Cui, Jin; Wang, Yiran; Qu, Zhiquan; Mu, Qingchun; Li, Peiwen; Zhang, Chaochao; Yang, Mingyuan; Chen, Xiao; Chen, Ziqiang; Li, Ming

    2017-03-01

    To examine the risk of bias in the methodological quality of reporting of randomized clinical trials (RCTs) in major neurology journals before and after the 2011 update of the Cochrane risk of bias tool. RCTs in 5 leading neurology journals in 2008 and 2013 were searched systematically. Characteristics were extracted based on the list of the modified Cochrane Collaboration's tool. Country, number of patients, type of intervention, and funding source also were examined for further analysis. A total of 138 RCTs were included in this study. The rates of following a trial plan were 61.6% for allocation sequence generation, 52.9% for allocation concealment, 84.8% for blinding of participants or personnel, 34.8% for blinding of outcome assessment, 78.3% for incomplete outcome data, and 67.4% for selective reporting. A significant setback was found for "selective reporting" in 2013 compared with 2008. Multicenter and large-scale trials were significantly more likely to be rated "low risk of bias". Not only was the number of surgical trials (5.8%) much smaller than that of drug trials (73.9%), but the reporting quality of surgical trials was also worse (P = 0.008). Finally, only 17.4% of trials met the criterion of "low risk of bias." The modified "risk of bias" tool is an improved version for assessment. The methodological quality of RCT reporting in the 5 neurology journals is unsatisfactory, especially for surgical RCTs, and it could be further improved. Copyright © 2017 Elsevier Inc. All rights reserved.

  12. Effect-directed analysis supporting monitoring of aquatic ...

    EPA Pesticide Factsheets

    Aquatic environments are often contaminated with complex mixtures of chemicals that may pose a risk to ecosystems and human health. This contamination cannot be addressed with target analysis alone but tools are required to reduce this complexity and identify those chemicals that might cause adverse effects. Effect-directed analysis (EDA) is designed to meet this challenge and faces increasing interest in water and sediment quality monitoring. Thus, the present paper summarizes current experience with the EDA approach and the tools required, and provides practical advice on their application. The paper highlights the need for proper problem formulation and gives general advice for study design. As the EDA approach is directed by toxicity, basic principles for the selection of bioassays are given as well as a comprehensive compilation of appropriate assays, including their strengths and weaknesses. A specific focus is given to strategies for sampling, extraction and bioassay dosing since they strongly impact prioritization of toxicants in EDA. Reduction of sample complexity mainly relies on fractionation procedures, which are discussed in this paper, including quality assurance and quality control. Automated combinations of fractionation, biotesting and chemical analysis using so-called hyphenated tools can enhance the throughput and might reduce the risk of artifacts in laboratory work. The key to determining the chemical structures causing effects is analytical toxi…

  13. Impact of an Information Technology-Enabled Initiative on the Quality of Prostate Multiparametric MRI Reports

    PubMed Central

    Silveira, Patricia C.; Dunne, Ruth; Sainani, Nisha I.; Lacson, Ronilda; Silverman, Stuart G.; Tempany, Clare M.; Khorasani, Ramin

    2015-01-01

    Rationale and Objectives: Assess the impact of implementing a structured report template and a computer-aided diagnosis (CAD) tool on the quality of prostate multiparametric MRI (mp-MRI) reports. Materials and Methods: Institutional Review Board approval was obtained for this HIPAA-compliant study performed at an academic medical center. The study cohort included all prostate mp-MRI reports (n=385) finalized 6 months before and after implementation of a structured report template and a CAD tool (collectively the IT tools) integrated into the PACS workstation. The primary outcome measure was the quality of prostate mp-MRI reports. An expert panel of our institution’s subspecialty-trained abdominal radiologists defined prostate mp-MRI report quality as optimal, satisfactory or unsatisfactory based on documentation of 9 variables. Reports were reviewed to extract the predefined quality variables and determine whether the IT tools were used to create each report. Chi-square and Student’s t-tests were used to compare report quality before and after implementation of the IT tools. Results: The overall proportion of optimal or satisfactory reports increased from 29.8% (47/158) to 53.3% (121/227) (p<0.001) after implementing the IT tools. While the proportion of optimal or satisfactory reports increased among reports generated using at least one of the IT tools (47/158 [29.8%] vs. 105/161 [65.2%]; p<0.001), there was no change in quality among reports generated without use of the IT tools (47/158 [29.8%] vs. 16/66 [24.2%]; p=0.404). Conclusion: The use of a structured template and CAD tool improved the quality of prostate mp-MRI reports compared to free-text report format and subjective measurement of contrast enhancement kinetic curve. PMID:25863794
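
    For readers who want to reproduce the headline comparison, the following is a minimal sketch of a chi-square test using the counts reported in this abstract (SciPy assumed; this is not the authors' code).

        # Chi-square test on the reported proportions of optimal-or-satisfactory
        # reports: 47/158 before vs. 121/227 after the IT tools were introduced.
        from scipy.stats import chi2_contingency

        before = [47, 158 - 47]   # [optimal/satisfactory, unsatisfactory]
        after = [121, 227 - 121]
        chi2, p, dof, _ = chi2_contingency([before, after])
        print(f"chi2 = {chi2:.1f}, dof = {dof}, p = {p:.5f}")  # p < 0.001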

  14. Advances in food crystallization.

    PubMed

    Hartel, Richard W

    2013-01-01

    Crystals often play an important role in food product quality and shelf life. Controlling crystallization to obtain the desired crystal content, size distribution, shape, and polymorph is key to manufacturing products with desired functionality and shelf life. Technical developments in the field have improved the tools with which we study and characterize crystals in foods. These developments also help our understanding of the physico-chemical phenomena that govern crystallization and improve our ability to control it during processing and storage. In this review, some of the more important recent developments in measuring and controlling crystallization are discussed.

  15. Process analytical technology in the pharmaceutical industry: a toolkit for continuous improvement.

    PubMed

    Scott, Bradley; Wilcock, Anne

    2006-01-01

    Process analytical technology (PAT) refers to a series of tools used to ensure that quality is built into products while at the same time improving the understanding of processes, increasing efficiency, and decreasing costs. It has not been widely adopted by the pharmaceutical industry. As the setting for this paper, the current pharmaceutical manufacturing paradigm and PAT guidance to date are discussed prior to the review of PAT principles and tools, benefits, and challenges. The PAT toolkit contains process analyzers, multivariate analysis tools, process control tools, and continuous improvement/knowledge management/information technology systems. The integration and implementation of these tools is complex, and has resulted in uncertainty with respect to both regulation and validation. The paucity of staff knowledgeable in this area may complicate adoption. Studies to quantitate the benefits resulting from the adoption of PAT within the pharmaceutical industry would be a valuable addition to the qualitative studies that are currently available.

  16. Management of fish populations in large rivers: a review of tools and approaches

    USGS Publications Warehouse

    Petts, Geoffrey E.; Imhoff, Jack G.; Manny, Bruce A.; Maher, John F. B.; Weisberg, Stephen B.

    1989-01-01

    In common with most branches of science, the management of riverine fish populations is characterised by reductionist and isolationist philosophies. Traditional fish management focuses on stocking and controls on fishing. This paper presents a consensus of scientists involved in the LARS workshop on the management of fish populations in large rivers. A move towards a more holistic philosophy is advocated, with fish management forming an integral part of sustainable river development. Based upon a questionnaire survey of LARS members, with wide-ranging expertise and experience from all parts of the world, lists of management tools currently in use are presented. Four categories of tools are described: flow, water-quality, habitat, and biological. The potential applications of tools for fish management in large rivers are discussed and research needs are identified. The lack of scientific evaluations of the different tools remains the major constraint to their wider application.

  17. Control of Space-Based Electron Beam Free Form Fabrication

    NASA Technical Reports Server (NTRS)

    Seufzer, W. J.; Taminger, K. M.

    2007-01-01

    Engineering a closed-loop control system for an electron beam welder for space-based additive manufacturing is challenging. For Earth- and space-based applications, components must work in a vacuum, and optical components become occluded with metal vapor deposition. For extraterrestrial applications, added components increase launch weight, complexity, and space-flight certification effort. Here we present a software tool that closely couples path planning and e-beam parameter controls into the build process to increase flexibility. In an environment where data collection hinders real-time control, another approach is considered that will still yield a high-quality build.

  18. Real-time feedback control of twin-screw wet granulation based on image analysis.

    PubMed

    Madarász, Lajos; Nagy, Zsombor Kristóf; Hoffer, István; Szabó, Barnabás; Csontos, István; Pataki, Hajnalka; Démuth, Balázs; Szabó, Bence; Csorba, Kristóf; Marosi, György

    2018-06-04

    The present paper reports the first dynamic image analysis-based feedback control of a continuous twin-screw wet granulation process. Granulation of a blend of lactose and starch was selected as a model process. The size and size distribution of the obtained particles were successfully monitored by a process camera coupled with image analysis software developed by the authors. Validation of the developed system showed that the particle size analysis tool can determine the size of the granules with an error of less than 5 µm. The next step was to implement real-time feedback control of the process by controlling the liquid feeding rate of the pump through a PC, based on the particle size results determined in real time. After the establishment of the feedback control, the system could correct different real-life disturbances, creating a Process Analytically Controlled Technology (PACT) that guarantees real-time monitoring and control of granule quality. In the event of changes or bad tendencies in the particle size, the system can automatically compensate for the effect of disturbances, ensuring proper product quality. This kind of quality assurance approach is especially important in the case of continuous pharmaceutical technologies. Copyright © 2018 Elsevier B.V. All rights reserved.
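
    The control law itself is not given in the abstract; a toy proportional feedback loop in the same spirit (the setpoint, gain, and measurement stub are all assumptions, not the authors' implementation) might look like:

        # Toy proportional control: measured granule size drives the pump rate.
        # SETPOINT_UM, KP and measure_d50() are hypothetical placeholders.
        import random

        SETPOINT_UM = 400.0   # target median granule size (d50), micrometres
        KP = 0.02             # proportional gain, mL/min per micrometre
        pump_rate = 10.0      # initial liquid feeding rate, mL/min

        def measure_d50():
            """Stand-in for the process camera plus image-analysis software."""
            return SETPOINT_UM + random.uniform(-50.0, 50.0)

        for step in range(10):
            error = SETPOINT_UM - measure_d50()
            # Undersized granules (positive error) call for more granulation liquid.
            pump_rate = max(0.0, pump_rate + KP * error)
            print(f"step {step}: pump rate = {pump_rate:.2f} mL/min")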

  19. Electronic monitoring of patient adherence to oral antihypertensive medical treatment: a systematic review.

    PubMed

    Christensen, Arne; Osterberg, Lars G; Hansen, Ebba Holme

    2009-08-01

    Poor patient adherence is often the reason for suboptimal blood pressure control. Electronic monitoring is one method of assessing adherence. The aim was to systematically review the literature on electronic monitoring of patient adherence to self-administered oral antihypertensive medications. We searched the Pubmed, Embase, Cinahl and Psychinfo databases and websites of suppliers of electronic monitoring devices. The quality of the studies was assessed according to the quality criteria proposed by Haynes et al. Sixty-two articles were included; three met the criteria proposed by Haynes et al. and nine reported the use of electronic adherence monitoring for feedback interventions. Adherence rates were generally high, whereas average study quality was low with a recent tendency towards improved quality. One study detected investigator fraud based on electronic monitoring data. Use of electronic monitoring of patient adherence according to the quality criteria proposed by Haynes et al. has been rather limited during the past two decades. Electronic monitoring has mainly been used as a measurement tool, but it seems to have the potential to significantly improve blood pressure control as well and should be used more widely.

  20. Application of spatial technology in malaria research & control: some new insights.

    PubMed

    Saxena, Rekha; Nagpal, B N; Srivastava, Aruna; Gupta, S K; Dash, A P

    2009-08-01

    Geographical Information Systems (GIS) have emerged as the core of spatial technology, integrating a wide range of datasets from different sources including Remote Sensing (RS) and the Global Positioning System (GPS). Literature published during the decade 1998-2007 has been compiled and grouped into six categories according to the usage of the technology in malaria epidemiology. Different GIS modules, such as spatial data sources, mapping and geo-processing tools, distance calculation, digital elevation models (DEM), buffer zones and geo-statistical analysis, have been investigated in detail and illustrated with examples as per the derived results. These GIS tools have contributed immensely to understanding the epidemiological processes of malaria, and the examples drawn show that GIS is now widely used for research and decision making in malaria control. Statistical data analysis is currently the most consistent and established set of tools for analyzing spatial datasets. The desired future development of GIS lies in the utilization of geo-statistical tools, which, combined with high quality data, have the capability to provide new insight into malaria epidemiology and the complexity of its transmission potential in endemic areas.
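
    Of the modules listed, buffer-zone analysis is the easiest to illustrate in code; a small sketch with the shapely library (coordinates are invented and assume a metric projected coordinate system) is:

        # Buffer-zone sketch: flag hypothetical households within 500 m of a
        # mosquito breeding site. Coordinates are placeholders in metres.
        from shapely.geometry import Point

        breeding_site = Point(1000.0, 2000.0)
        buffer_500m = breeding_site.buffer(500.0)   # circular buffer polygon

        households = {"A": Point(1200.0, 2100.0), "B": Point(1900.0, 2900.0)}
        for name, location in households.items():
            status = "inside" if buffer_500m.contains(location) else "outside"
            print(f"household {name} is {status} the 500 m buffer")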

  1. Cluster-Randomized Trial of Personalized Site Performance Feedback in Get With The Guidelines-Heart Failure.

    PubMed

    DeVore, Adam D; Cox, Margueritte; Heidenreich, Paul A; Fonarow, Gregg C; Yancy, Clyde W; Eapen, Zubin J; Peterson, Eric D; Hernandez, Adrian F

    2015-07-01

    There is significant variation in the delivery of evidence-based care for patients with heart failure (HF), but there is limited evidence defining the best methods to improve the quality of care. We performed a cluster-randomized trial of personalized site performance feedback at 147 hospitals participating in the Get With The Guidelines-Heart Failure quality improvement program from October 2009 to March 2011. The intervention provided sites with specific data on their heart failure achievement and quality measures in addition to the usual Get With The Guidelines-Heart Failure tools. The primary outcome for our trial was improvement in site composite quality of care score. Overall, 73 hospitals (n=33 886 patients) received the intervention, whereas 74 hospitals (n=37 943 patients) did not. One year after the intervention, both the intervention and control arms had a similar mean change in percentage points in their composite quality score (absolute change, +0.31 [SE, 1.51] versus +3.18 [SE, 1.68] in control; P=0.21). Similarly, none of the individual achievement measures or quality measures improved more at intervention versus control hospitals. Our site-based intervention, which included personalized site feedback on adherence to quality metrics, was not able to elicit more quality improvement beyond that already associated with participation in the Get With The Guidelines-Heart Failure program. URL: http://www.clinicaltrials.gov. Unique identifier: NCT00979264. © 2015 American Heart Association, Inc.

  2. Automatic Control of the Concrete Mixture Homogeneity in Cycling Mixers

    NASA Astrophysics Data System (ADS)

    Tikhonov, Anatoly Fedorovich; Drozdov, Anatoly

    2018-03-01

    The article describes the factors affecting concrete mixture quality related to the moisture content of aggregates, since the effectiveness of concrete mixture production is largely determined by the availability of quality management tools at all stages of the technological process. It is established that unaccounted-for moisture in the aggregates adversely affects the homogeneity of the concrete mixture and, accordingly, the strength of building structures. A new control method and an automatic control system for concrete mixture homogeneity during the mixing of components are proposed, since the task of producing a consistent concrete mixture is performed by an automatic control system for kneading-and-mixing machinery with operational automatic control of homogeneity. Theoretical underpinnings of mixture homogeneity control are presented, relating homogeneity to changes in the frequency of vibrodynamic vibrations of the mixer body. The structure of the technical means of the automatic control system that regulates the water supply is determined by the change in concrete mixture homogeneity during continuous mixing of the components. The following technical means for establishing automatic control have been chosen: vibro-acoustic sensors, remote terminal units, electropneumatic control actuators, etc. To identify the quality indicator of automatic control, a block diagram with transfer functions is provided that determines the operation of the automatic control system in the transient dynamic mode.

  3. Interpolity exchange of basalt tools facilitated via elite control in Hawaiian archaic states

    PubMed Central

    Kirch, Patrick V.; Mills, Peter R.; Lundblad, Steven P.; Sinton, John; Kahn, Jennifer G.

    2012-01-01

    Ethnohistoric accounts of late precontact Hawaiian archaic states emphasize the independence of chiefly controlled territories (ahupua‘a) based on an agricultural, staple economy. However, elite control of unevenly distributed resources, such as high-quality volcanic rock for adze production, may have provided an alternative source of economic power. To test this hypothesis we used nondestructive energy-dispersive X-ray fluorescence (ED-XRF) analysis of 328 lithic artifacts from 36 archaeological features in the Kahikinui district, Maui Island, to geochemically characterize the source groups. This process was followed by a limited sampling using destructive wavelength-dispersive X-ray fluorescence (WD-XRF) analysis to more precisely characterize certain nonlocal source groups. Seventeen geochemical groups were defined, eight of which represent extra-Maui Island sources. Although the majority of stone tools were derived from Maui Island sources (71%), a significant quantity (27%) of tools derived from extraisland sources, including the large Mauna Kea quarry on Hawai‘i Island as well as quarries on O‘ahu, Moloka‘i, and Lāna‘i islands. Importantly, tools quarried from extralocal sources are found in the highest frequency in elite residential features and in ritual contexts. These results suggest a significant role for a wealth economy based on the control and distribution of nonagricultural goods and resources during the rise of the Hawaiian archaic states. PMID:22203984

  4. Database and Analytical Tool Development for the Management of Data Derived from US DOE (NETL) Funded Fine Particulate (PM2.5) Research

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robinson P. Khosah; Frank T. Alex

    2007-02-11

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project is being conducted in two phases. Phase One includes the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. Phase Two, which is currently underway, involves the development of a platform for on-line data analysis. Phase Two includes the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now into its forty-eighth month of development activities.

  5. DATABASE AND ANALYTICAL TOOL DEVELOPMENT FOR THE MANAGEMENT OF DATA DERIVED FROM US DOE (NETL) FUNDED FINE PARTICULATE (PM 2.5) RESEARCH

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robinson P. Khosah; Charles G. Crawford

    2006-02-11

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project is being conducted in two phases. Phase One includes the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. Phase Two, which is currently underway, involves the development of a platform for on-line data analysis. Phase Two includes the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now into its forty-second month of development activities.

  6. DATABASE AND ANALYTICAL TOOL DEVELOPMENT FOR THE MANAGEMENT OF DATA DERIVED FROM US DOE (NETL) FUNDED FINE PARTICULATE (PM2.5) RESEARCH

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robinson P. Khosah; Charles G. Crawford

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project is being conducted in two phases. Phase 1, which is currently in progress and will take twelve months to complete, will include the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. In Phase 2, which will be completed in the second year of the project, a platform for on-line data analysis will be developed. Phase 2 will include the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now into its eleventh month of Phase 1 development activities.

  7. Investigating output and energy variations and their relationship to delivery QA results using Statistical Process Control for helical tomotherapy.

    PubMed

    Binny, Diana; Mezzenga, Emilio; Lancaster, Craig M; Trapp, Jamie V; Kairn, Tanya; Crowe, Scott B

    2017-06-01

    The aims of this study were to investigate machine beam parameters using the TomoTherapy quality assurance (TQA) tool, establish a correlation to patient delivery quality assurance results, and evaluate the relationship between energy variations detected using different TQA modules. TQA daily measurement results from two treatment machines for periods of up to 4 years were acquired. Analyses of beam quality and of helical and static output variations were made. Variations from planned dose were also analysed using the Statistical Process Control (SPC) technique, and their relationship to output trends was studied. Energy variations appeared to be one of the contributing factors to the delivered output dose seen in the analysis. Ion chamber measurements were reliable indicators of energy and output variations and were linear with patient dose verifications. Crown Copyright © 2017. Published by Elsevier Ltd. All rights reserved.

  8. The development of a tool for assessing the quality of closed circuit camera footage for use in forensic gait analysis.

    PubMed

    Birch, Ivan; Vernon, Wesley; Walker, Jeremy; Saxelby, Jai

    2013-10-01

    Gait analysis from closed circuit camera footage is now commonly used as evidence in criminal trials. The biomechanical analysis of human gait is a well established science in both clinical and laboratory settings. However, closed circuit camera footage is rarely of the quality of that taken in the more controlled clinical and laboratory environments. The less than ideal quality of much of this footage is associated with a range of issues, the combination of which can often render the footage unsuitable for gait analysis. The aim of this work was to develop a tool for assessing the suitability of closed circuit camera footage for the purpose of forensic gait analysis. A Delphi technique was employed with a small sample of expert forensic gait analysis practitioners to identify key quality elements of CCTV footage used in legal proceedings. Five elements of the footage were identified and then subdivided into 15 contributing sub-elements, each of which was scored using a 5-point Likert scale. A Microsoft Excel worksheet was developed to automatically calculate an overall score from the fifteen sub-element scores. Five expert witnesses experienced in using CCTV footage for gait analysis then trialled the prototype tool on current case footage. A repeatability study was also undertaken using standardized CCTV footage. The results showed the tool to be a simple and repeatable means of assessing the suitability of closed circuit camera footage for use in forensic gait analysis. The inappropriate use of poor quality footage could lead to challenges to the practice of forensic gait analysis. All parties involved in criminal proceedings must therefore understand the fitness for purpose of any footage used. This tool offers a method of achieving that goal and helps to assure the continued role of forensic gait analysis as an aid to the identification process. Copyright © 2013 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights reserved.
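
    The abstract gives the structure of the tool (15 sub-elements on 5-point Likert scales, aggregated in an Excel worksheet) but not the aggregation formula; a hedged Python equivalent using a plain sum, with invented sub-element names, is:

        # Sketch of the footage-suitability score. The sub-element names and the
        # plain-sum aggregation are assumptions; the published worksheet may
        # weight elements differently.
        ratings = {
            "frame_rate": 4, "resolution": 3, "lighting": 2, "contrast": 3,
            "camera_angle": 4, "subject_distance": 3, "occlusion": 2,
            "gait_cycles_visible": 4, "footwear_visible": 3, "compression": 3,
            "motion_blur": 2, "field_of_view": 4, "timestamp": 5,
            "continuity": 4, "multiple_views": 1,
        }
        assert len(ratings) == 15 and all(1 <= v <= 5 for v in ratings.values())

        total = sum(ratings.values())   # ranges from 15 (worst) to 75 (best)
        print(f"overall suitability score: {total}/75")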

  9. Modelling and control for laser based welding processes: modern methods of process control to improve quality of laser-based joining methods

    NASA Astrophysics Data System (ADS)

    Zäh, Ralf-Kilian; Mosbach, Benedikt; Hollwich, Jan; Faupel, Benedikt

    2017-02-01

    To ensure the competitiveness of manufacturing companies it is indispensable to optimize their manufacturing processes. Slight variations of process parameters and machine settings should have only marginal effects on product quality; therefore, the largest possible processing window is required. Such parameters are, for example, the movement of the laser beam across the component in laser keyhole welding. That is why it is necessary to keep the formation of weld seams within specified limits. Today, the quality of laser welding processes is ensured by using post-process methods, such as ultrasonic inspection, or special in-process methods. These in-process systems achieve only a simple evaluation, showing whether the weld seam is acceptable or not. Furthermore, in-process systems use no feedback to change control variables such as the speed of the laser or the adjustment of laser power. In this paper the research group presents current results in the research fields of online monitoring, online control and model predictive control of laser welding processes to increase product quality. To record the characteristics of the welding process, tested online methods are used during the process. Based on the measurement data, a state space model is identified which includes all the control variables of the system. Using simulation tools, a model predictive controller (MPC) is designed for the model and integrated into an NI real-time system.
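
    As a concrete, much-simplified illustration of the approach, a receding-horizon controller on a toy discrete state-space model x[k+1] = A·x[k] + B·u[k], y = C·x[k] can be sketched as follows; the matrices, setpoint and candidate inputs are invented and do not come from the paper.

        # Minimal receding-horizon (MPC-style) sketch on a toy weld model.
        # A, B, C, the setpoint and the candidate inputs are illustrative only.
        import numpy as np

        A = np.array([[0.9, 0.1], [0.0, 0.8]])
        B = np.array([[0.0], [0.5]])
        C = np.array([[1.0, 0.0]])
        setpoint = 1.0                            # desired seam signal
        candidates = np.linspace(0.0, 2.0, 21)    # laser-power levels to try

        def predicted_cost(x, u, horizon=5):
            cost = 0.0
            for _ in range(horizon):
                x = A @ x + B * u                 # constant input over the horizon
                cost += ((C @ x - setpoint) ** 2).item()
            return cost

        x = np.zeros((2, 1))
        for k in range(8):
            u = min(candidates, key=lambda u: predicted_cost(x, u))
            x = A @ x + B * u                     # apply only the first move
            print(f"k={k}: u={u:.2f}, y={(C @ x).item():.3f}")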

  10. Clinical decision support tools for osteoporosis disease management: a systematic review of randomized controlled trials.

    PubMed

    Kastner, Monika; Straus, Sharon E

    2008-12-01

    Studies indicate a gap between evidence and clinical practice in osteoporosis management. Tools that facilitate clinical decision making at the point of care are promising strategies for closing these practice gaps. To systematically review the literature to identify and describe the effectiveness of tools that support clinical decision making in osteoporosis disease management. Medline, EMBASE, CINAHL, and EBM Reviews (CDSR, DARE, CCTR, and ACP J Club), and contact with experts in the field. Randomized controlled trials (RCTs) in any language from 1966 to July 2006 investigating disease management interventions in patients at risk for osteoporosis. Outcomes included fractures and bone mineral density (BMD) testing. Two investigators independently assessed articles for relevance and study quality, and extracted data using standardized forms. Of 1,246 citations that were screened for relevance, 13 RCTs met the inclusion criteria. Reported study quality was generally poor. Meta-analysis was not done because of methodological and clinical heterogeneity; 77% of studies included a reminder or education as a component of their intervention. Three studies of reminders plus education targeted to physicians and patients showed increased BMD testing (RR range 1.43 to 8.67) and osteoporosis medication use (RR range 1.60 to 8.67). A physician reminder plus a patient risk assessment strategy found reduced fractures [RR 0.58, 95% confidence interval (CI) 0.37 to 0.90] and increased osteoporosis therapy (RR 2.44, CI 1.43 to 4.17). Multi-component tools that are targeted to physicians and patients may be effective for supporting clinical decision making in osteoporosis disease management.
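
    The effects above are reported as relative risks with 95% confidence intervals; for reference, the following is a small sketch computing an RR and its CI from a 2x2 table (the counts are invented, not taken from the review).

        # Relative risk with a 95% CI from a hypothetical 2x2 table.
        import math

        a, b = 30, 70   # intervention arm: events, non-events
        c, d = 15, 85   # control arm:      events, non-events

        rr = (a / (a + b)) / (c / (c + d))
        se_log_rr = math.sqrt(1/a - 1/(a + b) + 1/c - 1/(c + d))
        low = math.exp(math.log(rr) - 1.96 * se_log_rr)
        high = math.exp(math.log(rr) + 1.96 * se_log_rr)
        print(f"RR = {rr:.2f} (95% CI {low:.2f} to {high:.2f})")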

  11. Computer-Controlled Cylindrical Polishing Process for Large X-Ray Mirror Mandrels

    NASA Technical Reports Server (NTRS)

    Khan, Gufran S.; Gubarev, Mikhail; Speegle, Chet; Ramsey, Brian

    2010-01-01

    We are developing high-energy grazing incidence shell optics for hard-x-ray telescopes. The resolution of a mirror shell depends on the quality of the cylindrical mandrel from which it is replicated. Mid-spatial-frequency axial figure error is a dominant contributor to the error budget of the mandrel. This paper presents our efforts to develop a deterministic cylindrical polishing process that keeps mid-spatial-frequency axial figure errors to a minimum. Simulation software was developed to model the residual surface figure errors of a mandrel due to the polishing process parameters and the tools used, as well as to compute the optical performance of the optics. The study carried out using the developed software focused on establishing a relationship between the polishing process parameters and the generation of mid-spatial-frequency errors. The process parameters modeled are the speeds of the lap and the mandrel, the tool's influence function, the contour path (dwell) of the tools, their shape, and the distribution of the tools on the polishing lap. Using inputs from the mathematical model, a mandrel with conically approximated Wolter-1 geometry was polished on a newly developed computer-controlled cylindrical polishing machine. The preliminary results of a series of polishing experiments demonstrate qualitative agreement with the developed model. We report our first experimental results and discuss plans for further improvements in the polishing process. The ability to simulate the polishing process is critical to optimizing the process, improving mandrel quality and significantly reducing the cost of mandrel production.
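
    In deterministic polishing models of this kind, the material removed is commonly treated as the convolution of the tool influence function with the dwell-time map; a one-dimensional sketch (the Gaussian influence function and figure-error profile are invented for illustration) is:

        # 1-D sketch: removal = dwell map convolved with the influence function.
        # The influence function and figure-error profile are illustrative.
        import numpy as np

        x = np.linspace(-1.0, 1.0, 201)             # axial position, arbitrary units
        influence = np.exp(-(x / 0.05) ** 2)        # Gaussian tool influence function
        error = 0.5 + 0.5 * np.cos(8 * np.pi * x)   # mid-spatial-frequency figure error

        dwell = error / influence.sum()             # naive dwell map: dwell ~ error
        removal = np.convolve(dwell, influence, mode="same")
        residual = error - removal

        print(f"rms figure error before: {error.std():.4f}, after: {residual.std():.4f}")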

  12. Tools used for evaluation of Brazilian children's quality of life

    PubMed Central

    Souza, João Gabriel S.; Pamponet, Marcela Antunes; Souza, Tamirys Caroline S.; Pereira, Alessandra Ribeiro; Souza, Andrey George S.; Martins, Andréa Maria E. de B. L.

    2014-01-01

    OBJECTIVE: To review the available tools to evaluate children's quality of life validated for Brazilian language and culture. DATA SOURCES: Search of scientific articles in Medline, Lilacs and SciELO databases using the combination of descriptors "quality of life", "child" and "questionnaires" in Portuguese and English. DATA SYNTHESIS: Among the tools designed to assess children's quality of life validated for the Brazilian language and culture, the Auto questionnaire Qualité de Vie Enfant Imagé (AUQEI), the Child Health Questionnaire - Parent Form 50 (CHQ-PF50), the Pediatric Quality of Life Inventory (PedsQL(tm)) version 4.0 and the Kidscreen-52 are highlighted. Some tools do not include all range of ages and some lack domains that are currently considered relevant in the context of childhood, such as bullying. Moreover, due to the cultural diversity of Brazil, it may be necessary to adapt some instruments or to validate other tools. CONCLUSIONS: There are validated instruments to evaluate children's quality of life in Brazil. However, the validation or the adaptation of other international tools have to be considered in order to overcome current deficiencies. PMID:25119761

  13. Investigating the Effect of Approach Angle and Nose Radius on Surface Quality of Inconel 718

    NASA Astrophysics Data System (ADS)

    Kumar, Sunil; Singh, Dilbag; Kalsi, Nirmal S.

    2017-11-01

    This experimental work presents a surface quality evaluation of a nickel-Cr-Fe-based Inconel 718 superalloy, which has many applications in aero engine and turbine components. During machining, however, early tool wear leads to a decrease in surface quality. The coating on a cutting tool plays a significant role in increasing the wear resistance and life of the tool. The aim of this work is to study the surface quality of Inconel 718 machined with TiAlN-coated carbide tools. The influence of geometrical parameters (tool nose radius, approach angle) and machining variables (cutting velocity, feed rate) on the quality of the machined surface (surface roughness) was determined using a central composite design (CCD) matrix, and a mathematical model of the same was developed. Analysis of variance was used to find the significance of the parameters. Results showed that the tool nose radius and feed were the main active factors. The present experiments established that TiAlN-coated carbide inserts give better surface quality than uncoated carbide inserts.
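
    Central composite designs are usually analysed by fitting a second-order response-surface model; a hedged least-squares sketch (the factor levels and roughness values below are invented placeholders, not the study's data) is:

        # Second-order response-surface fit of the kind used with CCD data.
        # Nose radius r (mm), feed f (mm/rev) and roughness Ra (um) are placeholders.
        import numpy as np

        r = np.array([0.4, 0.4, 0.8, 0.8, 0.6, 0.6, 0.6, 0.32, 0.88])
        f = np.array([0.10, 0.20, 0.10, 0.20, 0.15, 0.08, 0.22, 0.15, 0.15])
        ra = np.array([0.92, 1.65, 0.70, 1.31, 1.02, 0.65, 1.70, 1.21, 0.83])

        # Ra = b0 + b1*r + b2*f + b3*r*f + b4*r^2 + b5*f^2
        X = np.column_stack([np.ones_like(r), r, f, r * f, r ** 2, f ** 2])
        coeffs, *_ = np.linalg.lstsq(X, ra, rcond=None)
        print("fitted coefficients:", np.round(coeffs, 3))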

  14. So you want to conduct a randomised trial? Learnings from a 'failed' feasibility study of a Crisis Resource Management prompt during simulated paediatric resuscitation.

    PubMed

    Teis, Rachel; Allen, Jyai; Lee, Nigel; Kildea, Sue

    2017-02-01

    No study has tested a Crisis Resource Management prompt on resuscitation performance. We conducted a feasibility, unblinded, parallel-group, randomised controlled trial at one Australian paediatric hospital (June-September 2014). Eligible participants were any doctor, nurse, or nurse manager who would normally be involved in a Medical Emergency Team simulation. The unit of block randomisation was one of six scenarios (3 control:3 intervention) with or without a verbal prompt. The primary outcomes tested the feasibility and utility of the intervention and data collection tools. The secondary outcomes measured resuscitation quality and team performance. Data were analysed from six resuscitation scenarios (n=49 participants); three control groups (n=25) and three intervention groups (n=24). The ability to measure all data items on the data collection tools was hindered by problems with the recording devices both in the mannequins and the video camera. For a pilot study, greater training for the prompt role and pre-briefing participants about assessment of their cardio-pulmonary resuscitation quality should be undertaken. Data could be analysed in real time with independent video analysis to validate findings. Two cameras would strengthen reliability of the methods. Copyright © 2016 College of Emergency Nursing Australasia. Published by Elsevier Ltd. All rights reserved.

  15. Review of quality assessment tools for the evaluation of pharmacoepidemiological safety studies

    PubMed Central

    Neyarapally, George A; Hammad, Tarek A; Pinheiro, Simone P; Iyasu, Solomon

    2012-01-01

    Objectives: Pharmacoepidemiological studies are an important hypothesis-testing tool in the evaluation of postmarketing drug safety. Despite the potential to produce robust value-added data, interpretation of findings can be hindered due to well-recognised methodological limitations of these studies. Therefore, assessment of their quality is essential to evaluating their credibility. The objective of this review was to evaluate the suitability and relevance of available tools for the assessment of pharmacoepidemiological safety studies. Design: We created an a priori assessment framework consisting of reporting elements (REs) and quality assessment attributes (QAAs). A comprehensive literature search identified distinct assessment tools, and the prespecified elements and attributes were evaluated. Primary and secondary outcome measures: The primary outcome measure was the percentage representation of each domain, RE and QAA for the quality assessment tools. Results: A total of 61 tools were reviewed. Most tools were not designed to evaluate pharmacoepidemiological safety studies. More than 50% of the reviewed tools considered REs under the research aims, analytical approach, outcome definition and ascertainment, study population and exposure definition and ascertainment domains. REs under the discussion and interpretation, results and study team domains were considered in less than 40% of the tools. Except for the data source domain, quality attributes were considered in less than 50% of the tools. Conclusions: Many tools failed to include critical assessment elements relevant to observational pharmacoepidemiological safety studies and did not distinguish between REs and QAAs. Further, there is a lack of consideration of the relative weights of different domains and elements. The development of a quality assessment tool would facilitate consistent, objective and evidence-based assessments of pharmacoepidemiological safety studies. PMID:23015600

  16. Simulation of textile manufacturing processes for planning, scheduling, and quality control purposes

    NASA Astrophysics Data System (ADS)

    Cropper, A. E.; Wang, Z.

    1995-08-01

    Simulation, as a management information tool, has been applied to engineering manufacture and assembly operations. The application of the principles to textile manufacturing (fiber to fabric) is discussed. The particular problems and solutions in applying the simulation software package to the yarn production processes are discussed, with an indication of how the software achieves the production schedule. The system appears to have application in planning, scheduling, and quality assurance, the latter being a result of the traceability made possible through a process involving mixing and splitting of material.

  17. Formative evaluation of a patient-specific clinical knowledge summarization tool

    PubMed Central

    Del Fiol, Guilherme; Mostafa, Javed; Pu, Dongqiuye; Medlin, Richard; Slager, Stacey; Jonnalagadda, Siddhartha R.; Weir, Charlene R.

    2015-01-01

    Objective: To iteratively design a prototype of a computerized clinical knowledge summarization (CKS) tool aimed at helping clinicians find answers to their clinical questions; and to conduct a formative assessment of the usability, usefulness, efficiency, and impact of the CKS prototype on physicians’ perceived decision quality compared with standard search of UpToDate and PubMed. Materials and methods: Mixed-methods observations of the interactions of 10 physicians with the CKS prototype vs. standard search in an effort to solve clinical problems posed as case vignettes. Results: The CKS tool automatically summarizes patient-specific and actionable clinical recommendations from PubMed (high quality randomized controlled trials and systematic reviews) and UpToDate. Two thirds of the study participants completed 15 out of 17 usability tasks. The median time to task completion was less than 10 s for 12 of the 17 tasks. The difference in search time between the CKS and standard search was not significant (median = 4.9 vs. 4.5 min). Physicians’ perceived decision quality was significantly higher with the CKS than with manual search (mean = 16.6 vs. 14.4; p = 0.036). Conclusions: The CKS prototype was well-accepted by physicians both in terms of usability and usefulness. Physicians perceived better decision quality with the CKS prototype compared to standard search of PubMed and UpToDate within a similar search time. Due to the formative nature of this study and a small sample size, conclusions regarding efficiency and efficacy are exploratory. PMID:26612774

  18. Formative evaluation of a patient-specific clinical knowledge summarization tool.

    PubMed

    Del Fiol, Guilherme; Mostafa, Javed; Pu, Dongqiuye; Medlin, Richard; Slager, Stacey; Jonnalagadda, Siddhartha R; Weir, Charlene R

    2016-02-01

    To iteratively design a prototype of a computerized clinical knowledge summarization (CKS) tool aimed at helping clinicians find answers to their clinical questions; and to conduct a formative assessment of the usability, usefulness, efficiency, and impact of the CKS prototype on physicians' perceived decision quality compared with standard search of UpToDate and PubMed. Mixed-methods observations of the interactions of 10 physicians with the CKS prototype vs. standard search in an effort to solve clinical problems posed as case vignettes. The CKS tool automatically summarizes patient-specific and actionable clinical recommendations from PubMed (high quality randomized controlled trials and systematic reviews) and UpToDate. Two thirds of the study participants completed 15 out of 17 usability tasks. The median time to task completion was less than 10 s for 12 of the 17 tasks. The difference in search time between the CKS and standard search was not significant (median = 4.9 vs. 4.5 min). Physicians' perceived decision quality was significantly higher with the CKS than with manual search (mean = 16.6 vs. 14.4; p = 0.036). The CKS prototype was well-accepted by physicians both in terms of usability and usefulness. Physicians perceived better decision quality with the CKS prototype compared to standard search of PubMed and UpToDate within a similar search time. Due to the formative nature of this study and a small sample size, conclusions regarding efficiency and efficacy are exploratory. Published by Elsevier Ireland Ltd.

  19. The usefulness of Quality of Life Childhood Epilepsy (QOLCE) questionnaire in evaluating the quality of life of children with epilepsy.

    PubMed

    Talarska, D

    2007-01-01

    Evaluation of quality of life has become a frequently used method of monitoring treatment effects. The Quality of Life Childhood Epilepsy (QOLCE) questionnaire, which is completed by patients' parents, was developed for children with epilepsy and measures quality of life in children aged 4-18 years. The aim of the study was to show the usefulness of the QOLCE questionnaire in evaluating the quality of life of children with epilepsy. 160 epileptic children aged 8-18 years and their parents were examined in the Chair and Department of Developmental Neurology, K. Marcinkowski University of Medical Sciences in Poznań. The QOLCE questionnaire was completed by parents, and a "Young people and epilepsy" questionnaire was designed for the children. The reliability of the complete questionnaire, both in our research and in the original, was a Cronbach's alpha coefficient of 0.93. Drug-resistant epileptic children constituted 28% of the examined group. Parents of children with controlled seizures rated their children's functioning higher in the analyzed areas of quality of life. 1. The QOLCE questionnaire is a suitable tool for evaluating the quality of children's and adolescents' lives. 2. The most significant differences in functioning between drug-resistant patients and those with controlled seizures were observed in the areas of cognitive processes and social activity.
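
    Since the abstract leans on a Cronbach's alpha of 0.93 as its reliability index, a compact sketch of how alpha is computed from an item-by-respondent score matrix (the scores are invented placeholders) may be useful:

        # Cronbach's alpha: k/(k-1) * (1 - sum of item variances / total variance).
        # The 4 items x 6 respondents matrix is an invented placeholder.
        import numpy as np

        scores = np.array([        # rows = items, columns = respondents
            [3, 4, 2, 5, 4, 3],
            [3, 5, 2, 4, 4, 3],
            [2, 4, 3, 5, 3, 2],
            [3, 4, 2, 4, 5, 3],
        ])
        k = scores.shape[0]
        item_variances = scores.var(axis=1, ddof=1).sum()
        total_variance = scores.sum(axis=0).var(ddof=1)
        alpha = (k / (k - 1)) * (1 - item_variances / total_variance)
        print(f"Cronbach's alpha = {alpha:.2f}")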

  20. Alternative indicators for monitoring the quality of a continuous intervention program on antibiotic prescribing during changing healthcare conditions.

    PubMed

    Bantar, C; Franco, D; Heft, C; Vesco, E; Arango, C; Izaguirre, M; Alcázar, G; Boleas, M; Oliva, M E

    2005-06-01

    We recently published on the impact of a four-phase hospital-wide intervention program designed to optimize the quality of antibiotic use, in which a multidisciplinary team (MDT) could modify prescriptions in the last phase. Because healthcare conditions were changing during the last 5 years (late 1999 to early 2004), we developed certain indicators to monitor the quality of our intervention over time. Different periods were defined as baseline (pre-intervention), initial intervention-active control, pre-crisis control, crisis control, post-crisis control and end-of-crisis control. Major indicators were the rate of prescription modification by the MDT, the rate of prescription for an uncertain infection, and a novel index formula (RIcarb) to estimate the rationale for carbapenem use. We assessed 2115 antimicrobial prescriptions. The prescription modification rate was 30% at the beginning and decreased thereafter to stable levels. The rate of prescriptions ordered for cases of both uncertain infection and unknown source of infection decreased significantly after intervention (i.e. from baseline to active control). In contrast, a doubling of culture-directed prescriptions was observed between these periods. RIcarb values lower and higher than 60% (the modal cut-off) were taken to indicate carbapenem overuse and underuse, respectively. Overuse was observed in the pre-intervention period, while pronounced underuse was shown during the crisis (RIcarb, 45% and 87%, respectively). The present study demonstrates that certain indicators, other than the widely adopted impact outcomes, are a suitable tool for monitoring the quality of a continuous, long-term, active intervention on antimicrobial prescribing practice, especially when applied in a changing healthcare setting.

  1. Using Statistical Process Control to Drive Improvement in Neonatal Care: A Practical Introduction to Control Charts.

    PubMed

    Gupta, Munish; Kaplan, Heather C

    2017-09-01

    Quality improvement (QI) is based on measuring performance over time, and variation in data measured over time must be understood to guide change and make optimal improvements. Common cause variation is natural variation owing to factors inherent to any process; special cause variation is unnatural variation owing to external factors. Statistical process control methods, and particularly control charts, are robust tools for understanding data over time and identifying common and special cause variation. This review provides a practical introduction to the use of control charts in health care QI, with a focus on neonatology. Copyright © 2017 Elsevier Inc. All rights reserved.
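
    To make the common-cause/special-cause distinction concrete, a minimal individuals (XmR-style) control chart can be sketched as below; the monthly rates are invented, not taken from the article.

        # Individuals chart sketch: 3-sigma limits estimated from the average
        # moving range (d2 = 1.128 for subgroups of size 2). Data are invented.
        import numpy as np

        rates = np.array([12.1, 11.8, 12.4, 12.0, 11.9, 12.3, 15.8, 12.2, 11.7])
        center = rates.mean()
        sigma = np.abs(np.diff(rates)).mean() / 1.128
        ucl, lcl = center + 3 * sigma, center - 3 * sigma

        for month, rate in enumerate(rates, start=1):
            cause = "special" if (rate > ucl or rate < lcl) else "common"
            print(f"month {month}: {rate:5.1f} ({cause} cause)")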

  2. Diagnosis of the Computer-Controlled Milling Machine, Definition of the Working Errors and Input Corrections on the Basis of Mathematical Model

    NASA Astrophysics Data System (ADS)

    Starikov, A. I.; Nekrasov, R. Yu; Teploukhov, O. J.; Soloviev, I. V.; Narikov, K. A.

    2016-10-01

    Machinery and equipment improve constructively as science and technology advance, and requirements for quality and longevity rise accordingly. In particular, the requirements for the surface quality and manufacturing precision of oil and gas equipment parts are constantly increasing. Production of oil and gas engineering products on modern machine tools with computer numerical control is a complex synthesis of the mechanical and electrical parts of the equipment as well as the machining procedure. The mechanical part of the machine wears during operation, while mathematical errors accumulate in the electrical part. These shortcomings in either part of the metalworking equipment affect the manufacturing process as a whole and, as a result, lead to flaws.

  3. Controlling Quality in CME/CPD by Measuring and Illuminating Bias

    ERIC Educational Resources Information Center

    Dixon, David; Takhar, Jatinder; Macnab, Jennifer; Eadie, Jason; Lockyer, Jocelyn; Stenerson, Heather; Francois, Jose; Bell, Mary; Monette, Celine; Campbell, Craig; Marlow, Bernie

    2011-01-01

    Introduction: There has been a surge of interest in the area of bias in industry-supported continuing medical education/continuing professional development (CME/CPD) activities. In 2007, we published our first study on measuring bias in CME, demonstrating that our assessment tool was valid and reliable. In light of the increasing interest in this…

  4. Runoff delay exerts a strong control on the field-scale removal of manure-borne fecal bacteria with runoff

    USDA-ARS?s Scientific Manuscript database

    The microbial safety of surface waters is an ongoing issue which is threatened by the transport of manure-borne bacteria to water sources used for irrigation or recreation. Predictive modeling has become an effective tool to forecast the microbial quality of water during precipitation events, howeve...

  5. Administering Examinations for Quality Control in Distance Education: The National Open University of Nigeria Perspective

    ERIC Educational Resources Information Center

    Ibara, E. C.

    2008-01-01

    Examinations are an important tool for evaluating students' learning outcomes and require proper planning to meet high standards. This paper therefore examines the processes leading to the administration of face-to-face examinations in distance education, with a focus on the National Open University of Nigeria. It highlights some procedures such as test…

  6. MAKER-P: a tool-kit for the creation, management, and quality control of plant genome annotations

    USDA-ARS?s Scientific Manuscript database

    We have optimized and extended the widely used annotation-engine MAKER for use on plant genomes. We have benchmarked the resulting software, MAKER-P, using the A. thaliana genome and the TAIR10 gene models. Here we demonstrate the ability of the MAKER-P toolkit to generate de novo repeat databases, ...

  7. Integration of visual and motion cues for simulator requirements and ride quality investigation

    NASA Technical Reports Server (NTRS)

    Young, L. R.

    1976-01-01

    Practical tools which can extend the state of the art of moving base flight simulation for research and training are developed. Main approaches to this research effort include: (1) application of the vestibular model for perception of orientation based on motion cues: optimum simulator motion controls; and (2) visual cues in landing.

  8. Efficacy of a Self-Monitoring Tool for Improving the Quality of the Language Environment in the Preschool Classroom

    ERIC Educational Resources Information Center

    Strasser, Katherine; Mendive, Susana; Vergara, Daniela; Darricades, Michelle

    2018-01-01

    Research Findings: This study evaluated the impact of a self-monitoring intervention on preschool teachers' use of language and on children's language growth. Nineteen classrooms from Santiago de Chile participated (10 intervention, 9 control). Twice a week, intervention teachers filled out a checklist to monitor the language stimulation they…

  9. Implementation of Quality Management in Core Service Laboratories

    PubMed Central

    Creavalle, T.; Haque, K.; Raley, C.; Subleski, M.; Smith, M.W.; Hicks, B.

    2010-01-01

    CF-28 The Genetics and Genomics group of the Advanced Technology Program of SAIC-Frederick exists to bring innovative genomic expertise, tools and analysis to NCI and the scientific community. The Sequencing Facility (SF) provides next generation short read (Illumina) sequencing capacity to investigators using a streamlined production approach. The Laboratory of Molecular Technology (LMT) offers a wide range of genomics core services including microarray expression analysis, miRNA analysis, array comparative genome hybridization, long read (Roche) next generation sequencing, quantitative real time PCR, transgenic genotyping, Sanger sequencing, and clinical mutation detection services to investigators from across the NIH. As the technology supporting this genomic research becomes more complex, the need for basic quality processes within all aspects of the core service groups becomes critical. The Quality Management group works alongside members of these labs to establish or improve processes supporting operations control (equipment, reagent and materials management), process improvement (reengineering/optimization, automation, acceptance criteria for new technologies and tech transfer), and quality assurance and customer support (controlled documentation/SOPs, training, service deficiencies and continual improvement efforts). Implementation and expansion of quality programs within unregulated environments demonstrates SAIC-Frederick's dedication to providing the highest quality products and services to the NIH community.

  10. The use of a non-nuclear density gauge for monitoring the compaction process of asphalt pavement

    NASA Astrophysics Data System (ADS)

    Van den bergh, Wim; Vuye, Cedric; Kara, Patricia; Couscheir, Karolien; Blom, Johan; Van Bouwel, Philippe

    2017-09-01

    The mechanical performance of an asphalt pavement affects its durability and thus its carbon footprint. Many parameters contribute to a durable asphalt mix, e.g. material selection, an accurate mix design and even the road design in which the asphalt mix quality is quantified. The quality of the asphalt mix, through its mechanical properties, is also related to the compaction degree. However, specifically at high production rates, the laying process at the construction site currently lacks an effective method to monitor compaction quality and adjust it immediately, before cooling and without damaging the layer. In this paper the use of a non-nuclear density gauge (PQI, Pavement Quality Indicator) is evaluated, based on a site at Brussels Airport. Considering the outcome of the present research, the PQI is recommended as a tool for continuous density measurements that allows immediate adjustments during compaction, decreases the number of core drillings needed for quality control, and serves as an a posteriori pavement density test where coring is prohibited. The PQI could be recommended as part of the standard quality control process in the Flemish region.
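
    The core check such a gauge supports is simple: compare each measured density against a reference density and flag spots below an acceptance threshold. Below is a minimal illustrative sketch of that calculation in Python; the reference density, threshold, and station readings are invented values, not figures from the study.

```python
# Illustrative compaction-degree check for density gauge readings.
# REFERENCE_DENSITY and MIN_COMPACTION are assumed values, not study data.

REFERENCE_DENSITY = 2450.0  # kg/m^3, assumed laboratory reference density
MIN_COMPACTION = 0.97       # assumed acceptance threshold (97%)

def compaction_degree(measured_density: float) -> float:
    """Compaction degree as a fraction of the reference density."""
    return measured_density / REFERENCE_DENSITY

def flag_low_spots(readings: list[tuple[str, float]]) -> list[str]:
    """Return stations whose compaction degree falls below the threshold."""
    return [station for station, density in readings
            if compaction_degree(density) < MIN_COMPACTION]

if __name__ == "__main__":
    # Hypothetical gauge readings along the paved lane (station, kg/m^3)
    pqi_readings = [("0+020", 2395.0), ("0+040", 2310.0), ("0+060", 2412.0)]
    print(flag_low_spots(pqi_readings))  # -> ['0+040']
```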

  11. Understanding Interrater Reliability and Validity of Risk Assessment Tools Used to Predict Adverse Clinical Events.

    PubMed

    Siedlecki, Sandra L; Albert, Nancy M

    This article describes how to assess the interrater reliability and validity of risk assessment tools, using easy-to-follow formulas, and provides calculations that demonstrate the principles discussed. Clinical nurse specialists should be able to identify risk assessment tools that provide high interrater reliability and the highest validity for predicting true events of importance to clinical settings. Making best-practice recommendations for assessment tool use is critical to high-quality patient care and safe practices that impact patient outcomes and nursing resources. Optimal risk assessment tool selection requires knowledge of interrater reliability and tool validity. The clinical nurse specialist will understand the reliability and validity issues associated with risk assessment tools, and be able to evaluate tools using basic calculations. Risk assessment tools are developed to objectively predict quality and safety events and, ultimately, to reduce the risk of event occurrence through preventive interventions. To ensure high-quality tool use, clinical nurse specialists must critically assess tool properties. The better the tool's ability to predict adverse events, the more likely it is that event risk is mitigated. Interrater reliability and validity assessment is a relatively easy skill to master and will result in better decisions when selecting or making recommendations for risk assessment tool use.
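
    As a concrete illustration of the kind of calculation the article describes, the sketch below computes Cohen's kappa for interrater agreement on a 2x2 table, plus sensitivity and specificity as basic validity measures. All counts are invented for demonstration; this is not the article's worked example.

```python
def cohens_kappa(a: int, b: int, c: int, d: int) -> float:
    """Interrater agreement for a 2x2 table of two raters' judgments:
    a = both say at-risk, d = both say not-at-risk,
    b and c = the two disagreement cells."""
    n = a + b + c + d
    p_observed = (a + d) / n
    # Chance agreement from each rater's marginal proportions
    p_chance = ((a + b) * (a + c) + (c + d) * (b + d)) / n ** 2
    return (p_observed - p_chance) / (1 - p_chance)

def sensitivity(tp: int, fn: int) -> float:
    """Proportion of true events the tool correctly flags."""
    return tp / (tp + fn)

def specificity(tn: int, fp: int) -> float:
    """Proportion of non-events the tool correctly clears."""
    return tn / (tn + fp)

# Invented counts: 40 agree at-risk, 5 and 10 disagreements, 45 agree not-at-risk
print(round(cohens_kappa(40, 5, 10, 45), 2))  # 0.7
# Invented validity counts against observed adverse events
print(round(sensitivity(tp=30, fn=8), 2))     # 0.79
print(round(specificity(tn=50, fp=12), 2))    # 0.81
```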

  12. Atmospheric Model Evaluation Tool for meteorological and air quality simulations

    EPA Pesticide Factsheets

    The Atmospheric Model Evaluation Tool compares model predictions to observed data from various meteorological and air quality observation networks to help evaluate meteorological and air quality simulations.
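
    Model-to-observation comparison of this kind rests on a handful of paired statistics. The sketch below computes mean bias and RMSE for paired model/observation values; it is a generic illustration of the statistics such evaluation tools report, not AMET's implementation, and the ozone values are invented.

```python
import math

def evaluation_stats(model: list[float], obs: list[float]) -> dict[str, float]:
    """Mean bias and RMSE of paired model/observation values, two of the
    basic statistics a model-evaluation tool reports."""
    if len(model) != len(obs) or not obs:
        raise ValueError("need equal-length, non-empty series")
    n = len(obs)
    bias = sum(m - o for m, o in zip(model, obs)) / n
    rmse = math.sqrt(sum((m - o) ** 2 for m, o in zip(model, obs)) / n)
    return {"mean_bias": bias, "rmse": rmse}

# Invented hourly ozone values (ppb) at a single monitor
print(evaluation_stats(model=[42.0, 55.0, 61.0], obs=[40.0, 58.0, 59.0]))
```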

  13. [The OPTIMISE study (Optimal Type 2 Diabetes Management Including Benchmarking and Standard Treatment): Results for Luxembourg].

    PubMed

    Michel, G

    2012-01-01

    The OPTIMISE study (NCT00681850) has been run in six European countries, including Luxembourg, to prospectively assess the effect of benchmarking on the quality of primary care in patients with type 2 diabetes, using major modifiable vascular risk factors as critical quality indicators. Primary care centers treating type 2 diabetic patients were randomized to give standard care (control group) or standard care with feedback benchmarked against other centers in each country (benchmarking group). The primary endpoint was the percentage of patients in the benchmarking group achieving pre-set targets for the critical quality indicators: glycated hemoglobin (HbA1c), systolic blood pressure (SBP) and low-density lipoprotein (LDL) cholesterol after 12 months of follow-up. In Luxembourg, more patients in the benchmarking group achieved the target for SBP (40.2% vs. 20%) and for LDL cholesterol (50.4% vs. 44.2%). 12.9% of patients in the benchmarking group met all three targets, compared with 8.3% in the control group. In this randomized, controlled study, benchmarking was shown to be an effective tool for improving critical quality indicator targets, which are the principal modifiable vascular risk factors in type 2 diabetes.
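
    The benchmarking calculation itself reduces to the share of a center's patients meeting each pre-set target. A minimal sketch follows; the target cutoffs and patient records are invented stand-ins, not the study's actual criteria.

```python
# Assumed target cutoffs for the three critical quality indicators;
# the real study's pre-set targets may differ.
TARGETS = {"hba1c": 7.0, "sbp": 130.0, "ldl": 100.0}

def pct_at_target(patients: list[dict], indicator: str) -> float:
    """Percentage of a center's patients at or below the indicator target."""
    hits = sum(1 for p in patients if p[indicator] <= TARGETS[indicator])
    return 100.0 * hits / len(patients)

# Invented patient records for one center
center = [
    {"hba1c": 6.8, "sbp": 128.0, "ldl": 95.0},
    {"hba1c": 7.4, "sbp": 142.0, "ldl": 88.0},
    {"hba1c": 6.5, "sbp": 125.0, "ldl": 115.0},
]
for indicator in TARGETS:
    print(f"{indicator}: {pct_at_target(center, indicator):.1f}% at target")
```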

  14. deepTools2: a next generation web server for deep-sequencing data analysis.

    PubMed

    Ramírez, Fidel; Ryan, Devon P; Grüning, Björn; Bhardwaj, Vivek; Kilpert, Fabian; Richter, Andreas S; Heyne, Steffen; Dündar, Friederike; Manke, Thomas

    2016-07-08

    We present an update to our Galaxy-based web server for processing and visualizing deeply sequenced data. Its core tool set, deepTools, allows users to perform complete bioinformatic workflows ranging from quality control and normalization of aligned reads to integrative analyses, including clustering and visualization approaches. Since we first described our deepTools Galaxy server in 2014, we have implemented new solutions for many requests from the community and our users. Here, we introduce significant enhancements and new tools to further improve data visualization and interpretation. deepTools continues to be open to all users and freely available as a web service at deeptools.ie-freiburg.mpg.de. The new deepTools2 suite can be easily deployed within any Galaxy framework via the toolshed repository, and we also provide source code for command-line usage under Linux and Mac OS X. A public and documented API for access to deepTools functionality is also available. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.
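
    For command-line usage, the deepTools programs can be driven from a script. The sketch below calls bamCoverage, one of the core deepTools programs, via Python's subprocess module; the file paths are placeholders, and exact flag availability should be checked against `bamCoverage --help` for the installed version.

```python
# Driving a deepTools command-line program from Python; paths are
# placeholders, not real files.
import subprocess

subprocess.run(
    [
        "bamCoverage",
        "-b", "aligned_reads.bam",  # input BAM of aligned reads (placeholder)
        "-o", "coverage.bw",        # bigWig coverage track to write (placeholder)
        "--binSize", "25",          # genomic bin width for the track
    ],
    check=True,  # raise CalledProcessError if the tool fails
)
```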

  15. Experimental evaluation of tool run-out in micro milling

    NASA Astrophysics Data System (ADS)

    Attanasio, Aldo; Ceretti, Elisabetta

    2018-05-01

    This paper deals with the micro milling cutting process, focusing on tool run-out measurement. Among the effects of the scale reduction from macro to micro (i.e., size effects), tool run-out plays an important role. This research aims to develop an easy and reliable method to measure tool run-out in micro milling based on experimental tests and an analytical model. From an Industry 4.0 perspective, this measuring strategy can be integrated into an adaptive system for controlling cutting forces, with the objective of improving production quality and process stability while reducing tool wear and machining costs. The proposed procedure estimates the tool run-out parameters from the tool diameter, the channel width, and the phase angle between the cutting edges; the cutting-edge phase measurement is based on force signal analysis. The developed procedure was tested on data from micro milling experiments performed on a Ti6Al4V sample, and the results showed that it can be successfully used for tool run-out estimation.
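
    The geometric idea can be illustrated with a deliberately simplified model: with run-out e, the outermost cutting edge sweeps a radius of roughly the nominal tool radius plus e, so the milled channel comes out about 2e wider than the tool diameter. The sketch below implements only this simplification, not the authors' full analytical model, and the measurements are invented.

```python
def runout_offset_um(channel_width_um: float, tool_diameter_um: float) -> float:
    """Simplified run-out estimate: with run-out e, the outermost cutting
    edge sweeps roughly (tool radius + e), so the channel is about 2*e
    wider than the nominal tool diameter. Ignores deflection and wear."""
    return (channel_width_um - tool_diameter_um) / 2.0

# Invented measurements from a micro-milled channel (micrometres)
print(f"estimated run-out: {runout_offset_um(812.0, 800.0):.1f} um")  # 6.0 um
```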

  16. Application of ESE Data and Tools to Air Quality Management: Services for Helping the Air Quality Community use ESE Data (SHAirED)

    NASA Technical Reports Server (NTRS)

    Falke, Stefan; Husar, Rudolf

    2011-01-01

    The goal of this REASoN applications and technology project was to deliver and use Earth Science Enterprise (ESE) data and tools in support of air quality management. Its scope falls within the domain of air quality management and aims to develop a federated air quality information sharing network that includes data from NASA, EPA, US states and others. Project goals were achieved through access to satellite and ground observation data, web services information technology, interoperability standards, and air quality community collaboration. In contributing to a network of NASA ESE data in support of particulate air quality management, the project developed access to distributed data, built Web infrastructure, and created tools for data processing and analysis. The key technologies used in the project include emerging web services for developing self-describing and modular data access and processing tools, and a service-oriented architecture for chaining web services together to assemble customized air quality management applications. The technology and tools required for this project were developed within DataFed.net, a shared infrastructure that supports collaborative atmospheric data sharing and processing web services. Much of the collaboration was facilitated through community interactions within the Federation of Earth Science Information Partners (ESIP) Air Quality Workgroup. The main activities during the project that successfully advanced DataFed, enabled air quality applications and established community-oriented infrastructures were: developing access to distributed data (surface and satellite); building Web infrastructure to support data access, processing and analysis; creating tools for data processing and analysis; and fostering air quality community collaboration and interoperability.
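
    The service-chaining pattern the project describes amounts to feeding the output of one data web service into the next. The sketch below shows that pattern with Python's requests library; both endpoint URLs and parameter names are hypothetical stand-ins, not actual DataFed services.

```python
# Chaining two hypothetical data web services: the observation service's
# output becomes the processing service's input. Neither URL is real.
import requests

raw = requests.get(
    "https://example.org/aq/observations",  # hypothetical data service
    params={"pollutant": "PM2.5", "date": "2011-06-01"},
    timeout=30,
).json()

summary = requests.post(
    "https://example.org/aq/aggregate",     # hypothetical processing service
    json={"records": raw, "statistic": "daily_mean"},
    timeout=30,
).json()

print(summary)
```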

  17. Contamination and Surface Preparation Effects on Composite Bonding

    NASA Technical Reports Server (NTRS)

    Kutscha, Eileen O.; Vahey, Paul G.; Belcher, Marcus A.; VanVoast, Peter J.; Grace, William B.; Blohowiak, Kay Y.; Palmieri, Frank L.; Connell, John W.

    2017-01-01

    Results presented here demonstrate the effect of several prebond surface contaminants (hydrocarbon, machining fluid, latex, silicone, peel ply residue, release film) on bond quality, as measured by the fracture toughness and failure modes of carbon fiber reinforced epoxy substrates bonded in secondary and co-bond configurations with paste and film adhesives. Additionally, the capability of various prebond surface property measurement tools to detect contaminants and potentially predict the subsequent bond performance of three different adhesives is shown. Surface measurement methods included water contact angle, Dyne solution wettability, optically stimulated electron emission spectroscopy, surface free energy, inverse gas chromatography, and Fourier transform infrared spectroscopy with chemometrics analysis. Information is also provided on the effectiveness of mechanical and energetic surface treatments in recovering a bondable surface after contamination. The benefits and drawbacks of the various surface analysis tools for detecting contaminants and evaluating prebond surfaces after surface treatment were assessed, as well as their ability to correlate with bond performance. The surface analysis tools were also evaluated for their potential use in in-line quality control of adhesive bonding parameters in the manufacturing environment.

  18. The impact of a novel resident leadership training curriculum.

    PubMed

    Awad, Samir S; Hayley, Barbara; Fagan, Shawn P; Berger, David H; Brunicardi, F Charles

    2004-11-01

    Today's complex health care environment, coupled with the 80-hour workweek mandate, has required that surgical resident team interactions evolve from a military command-and-control style to a collaborative leadership style. A novel educational curriculum was implemented with the objective of training residents to create and manage powerful teams through alignment, communication, and integrity, tools integral to practicing a collaborative leadership style while working 80 hours per week. Specific strategies were: (1) to focus on the quality of patient care and service while receiving a high education-to-service ratio, and (2) to maximize efficiency through time management. This article shows that leadership training as part of a resident curriculum can significantly increase a resident's view of leadership in the areas of alignment, communication, and integrity, tools previously shown in business models to be vital for effective and efficient teams. This curriculum, over the course of the surgical residency, can provide residents with the necessary tools to deliver efficient, quality care while working within the 80-hour workweek mandate in a more collaborative environment.

  19. CÆLIS: software for assimilation, management and processing data of an atmospheric measurement network

    NASA Astrophysics Data System (ADS)

    Fuertes, David; Toledano, Carlos; González, Ramiro; Berjón, Alberto; Torres, Benjamín; Cachorro, Victoria E.; de Frutos, Ángel M.

    2018-02-01

    Given the importance of atmospheric aerosol, the number of instruments and measurement networks focused on its characterization is growing. Many challenges derive from the standardization of protocols, the monitoring of instrument status to evaluate network data quality, and the manipulation and distribution of large volumes of raw and processed data. CÆLIS is a software system which aims to simplify the management of a network, providing tools for monitoring the instruments, processing the data in real time and offering the scientific community a new tool to work with the data. Since 2008, CÆLIS has been successfully applied to the photometer calibration facility managed by the University of Valladolid, Spain, in the framework of the Aerosol Robotic Network (AERONET). Thanks to the use of advanced tools, this facility has been able to analyze a growing number of stations and data in real time, which greatly benefits network management and data quality control. The present work describes the system architecture of CÆLIS and gives some examples of applications and data processing.
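
    A small example of the kind of real-time check such a system might run is sketched below: flagging a station whose instrument has gone silent or whose latest value is implausible. The thresholds and field names are invented for illustration and are not drawn from CÆLIS itself.

```python
from datetime import datetime, timedelta, timezone

# Invented thresholds for a generic instrument-monitoring check
MAX_SILENCE = timedelta(hours=2)  # alert if a station reports nothing for this long
VALID_RANGE = (0.0, 5.0)          # assumed plausible range for the measured quantity

def qc_flags(last_seen: datetime, latest_value: float) -> list[str]:
    """Return human-readable quality flags for one station's latest data."""
    flags = []
    if datetime.now(timezone.utc) - last_seen > MAX_SILENCE:
        flags.append("stale: instrument has stopped reporting")
    if not VALID_RANGE[0] <= latest_value <= VALID_RANGE[1]:
        flags.append("out-of-range measurement")
    return flags

# Hypothetical station that last reported three hours ago
stale = datetime.now(timezone.utc) - timedelta(hours=3)
print(qc_flags(stale, 1.2))  # -> ['stale: instrument has stopped reporting']
```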

  20. Duly noted: Lessons from a two-site intervention to assess and improve the quality of clinical documentation in the electronic health record.

    PubMed

    Fanucchi, Laura; Yan, Donglin; Conigliaro, Rosemarie L

    2016-07-06

    Communication errors are identified as a root cause contributing to a majority of sentinel events. The clinical note is a cornerstone of physician communication, yet there are few published interventions on teaching note writing in the electronic health record (EHR). This was a prospective, two-site quality improvement project to assess and improve the quality of clinical documentation in the EHR using a validated assessment tool. Internal Medicine (IM) residents at the University of Kentucky College of Medicine (UK) and Montefiore Medical Center/Albert Einstein College of Medicine (MMC) received one of two interventions during an inpatient ward month: either a lecture, or a lecture and individual feedback on progress notes. A third group of residents in each program served as a control. Notes were evaluated with the Physician Documentation Quality Instrument 9 (PDQI-9). Due to a significant difference in baseline PDQI-9 scores at MMC, the sites were not combined. Of 75 residents at the UK site, 22 were eligible, 20 (91%) enrolled, and 76 notes were scored; of 156 residents at MMC, 22 were eligible, 18 (82%) enrolled, and 40 notes were scored. Note quality did not improve as measured by the PDQI-9. This educational quality improvement project did not improve the quality of clinical documentation as measured by the PDQI-9, underscoring the difficulty of improving note quality. Further efforts should explore more effective educational tools to improve the quality of clinical documentation in the EHR.
