Sample records for processing quality control

  1. [Development of whole process quality control and management system of traditional Chinese medicine decoction pieces based on traditional Chinese medicine quality tree].

    PubMed

    Yu, Wen-Kang; Dong, Ling; Pei, Wen-Xuan; Sun, Zhi-Rong; Dai, Jun-Dong; Wang, Yun

    2017-12-01

    The whole-process quality control and management of traditional Chinese medicine (TCM) decoction pieces is a systems engineering task involving the production base environment, seeds and seedlings, harvesting, processing, and many other steps, so it is very important to accurately identify the factors in the TCM production process that may induce quality risk and to adopt reasonable quality control measures. At present, the concept of quality risk is mainly addressed in terms of management and regulations; there has been no comprehensive analysis of the possible risks in the quality control process of TCM decoction pieces, nor a summary of effective quality control schemes. This study proposes a whole-process quality control and management system for TCM decoction pieces based on the TCM quality tree. The system effectively combines the process analysis method of the TCM quality tree with quality risk management, and can help managers make real-time decisions while realizing whole-process quality control of TCM. By providing a personalized web interface, the system realizes user-oriented information feedback and makes it convenient for users to predict, evaluate, and control the quality of TCM. In application, the system can identify related quality factors such as the base environment, cultivation, and pieces processing; extend and modify the existing scientific workflow according to an enterprise's own production conditions; and provide different enterprises with their own quality systems, achieving personalized service. As a new quality management model, this work can provide a reference for improving the quality of Chinese medicine production and quality standardization. Copyright© by the Chinese Pharmaceutical Association.

  2. A case study: application of statistical process control tool for determining process capability and sigma level.

    PubMed

    Chopra, Vikram; Bairagi, Mukesh; Trivedi, P; Nagar, Mona

    2012-01-01

    Statistical process control is the application of statistical methods to the measurement and analysis of process variation. Various regulatory authorities such as Validation Guidance for Industry (2011), International Conference on Harmonisation ICH Q10 (2009), the Health Canada guidelines (2009), Health Science Authority, Singapore: Guidance for Product Quality Review (2008), and International Organization for Standardization ISO-9000:2005 provide regulatory support for the application of statistical process control for better process control and understanding. In this study, risk assessments, normal probability distributions, control charts, and capability charts are employed for selection of critical quality attributes, determination of normal probability distribution, statistical stability, and capability of production processes, respectively. The objective of this study is to determine tablet production process quality in the form of sigma process capability. By interpreting data and graph trends, forecasting of critical quality attributes, sigma process capability, and stability of process were studied. The overall study contributes to an assessment of the process at the sigma level with respect to out-of-specification product produced. Finally, the study points to areas where the application of quality improvement and quality risk assessment principles can achieve six sigma-capable processes. Statistical process control is the most advantageous tool for determining the quality of any production process, and it is new to the pharmaceutical tablet production process, where the quality control parameters act as quality assessment parameters. Application of risk assessment enables selection of critical quality attributes among quality control parameters. Sequential application of normality distributions, control charts, and capability analyses provides a valid statistical process control study of the process. Interpretation of such a study provides information about stability, process variability, changing trends, and quantification of process capability against defective production. Comparative evaluation of critical quality attributes by Pareto charts identifies the least capable and most variable processes, which are candidates for improvement. Statistical process control thus proves to be an important tool for six sigma-capable process development and continuous quality improvement.
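
    The sigma-level assessment described above can be illustrated with a short computation. This is a minimal sketch over hypothetical tablet-weight data; the function name and the common 3·Cpk short-term sigma approximation are illustrative assumptions, not the authors' exact procedure:

```python
import statistics

def capability_indices(samples, lsl, usl):
    """Estimate Cp, Cpk and an approximate sigma level for a process
    from measurements of one critical quality attribute."""
    mean = statistics.fmean(samples)
    sd = statistics.stdev(samples)                 # sample standard deviation
    cp = (usl - lsl) / (6 * sd)                    # potential capability
    cpk = min(usl - mean, mean - lsl) / (3 * sd)   # capability with centering
    return cp, cpk, 3 * cpk                        # sigma level ~ 3 * Cpk

# Hypothetical tablet weights (mg) against specification limits 194-206 mg
cp, cpk, sigma = capability_indices([199, 200, 201, 200, 199, 201], 194, 206)
```

    A centered process (Cpk equal to Cp) with a sigma level near 6 would, as in the study, indicate a six sigma-capable operation.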

  3. Quality transitivity and traceability system of herbal medicine products based on quality markers.

    PubMed

    Liu, Changxiao; Guo, De-An; Liu, Liang

    2018-05-15

    Because a variety of factors affect herb quality, the existing quality management model is unable to evaluate process control. The development of the "quality marker" (Q-marker) concept lays the basis for establishing an independent process quality control system for herbal products. To ensure the highest degree of safety, effectiveness and quality process control of herbal products, this work aimed to establish a transitivity and traceability system for quality and process control from raw materials to finished herbal products. Based on the key issues and challenges of quality assessment, the current status of quality and process controls from raw materials to herbal medicinal products listed in the Pharmacopoeia was analyzed, and research models covering the discovery and identification of Q-markers and the analysis and management of quality risk were designed. The authors introduce several technologies and methodologies, such as DNA barcoding, chromatographic technologies, fingerprint analysis, chemical markers, bio-responses, and risk management, as solutions for quality process control. Quality and process control models for herbal medicinal products were proposed, and a transitivity and traceability system from raw materials to finished products was constructed to improve herbal quality across the entire supply and production chain. The system is built on quality markers, particularly on controlling the production process under Good Engineering Practices and on implementing risk management for quality and process control in herbal medicine production. Copyright © 2018 Elsevier GmbH. All rights reserved.

  4. 21 CFR 111.110 - What quality control operations are required for laboratory operations associated with the...

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... laboratory operations associated with the production and process control system? 111.110 Section 111.110 Food... OPERATIONS FOR DIETARY SUPPLEMENTS Production and Process Control System: Requirements for Quality Control... production and process control system? Quality control operations for laboratory operations associated with...

  5. Quality by control: Towards model predictive control of mammalian cell culture bioprocesses.

    PubMed

    Sommeregger, Wolfgang; Sissolak, Bernhard; Kandra, Kulwant; von Stosch, Moritz; Mayer, Martin; Striedner, Gerald

    2017-07-01

    The industrial production of complex biopharmaceuticals using recombinant mammalian cell lines is still mainly built on a quality by testing approach, which is represented by fixed process conditions and extensive testing of the end-product. In 2004 the FDA launched the process analytical technology initiative, aiming to guide the industry towards advanced process monitoring and better understanding of how critical process parameters affect the critical quality attributes. Implementation of process analytical technology into the bio-production process enables moving from the quality by testing to a more flexible quality by design approach. The application of advanced sensor systems in combination with mathematical modelling techniques offers enhanced process understanding, allows on-line prediction of critical quality attributes and subsequently real-time product quality control. In this review opportunities and unsolved issues on the road to a successful quality by design and dynamic control implementation are discussed. A major focus is directed on the preconditions for the application of model predictive control for mammalian cell culture bioprocesses. Design of experiments providing information about the process dynamics upon parameter change, dynamic process models, on-line process state predictions and powerful software environments seem to be a prerequisite for quality by control realization. © 2017 The Authors. Biotechnology Journal published by WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
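
    The model predictive control idea in record 5 reduces, in its simplest form, to choosing the control input whose model-predicted next state lands closest to the quality setpoint. The following one-step sketch uses a hypothetical linear process model; the coefficients and candidate feed rates are invented for illustration:

```python
def mpc_one_step(state, setpoint, candidates, model):
    """Receding-horizon control with horizon 1: evaluate each candidate
    input against the process model and keep the best prediction."""
    return min(candidates, key=lambda u: (model(state, u) - setpoint) ** 2)

# Hypothetical linear model: next CQA value = 0.9 * state + 0.5 * input
model = lambda x, u: 0.9 * x + 0.5 * u
u_best = mpc_one_step(state=2.0, setpoint=4.0,
                      candidates=[0.0, 1.0, 2.0, 3.0, 4.0], model=model)
```

    In practice the prediction horizon is longer and the cell-culture model far richer (hybrid or mechanistic), but the selection principle is the same.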

  6. A real time quality control application for animal production by image processing.

    PubMed

    Sungur, Cemil; Özkan, Halil

    2015-11-01

    Standards of hygiene and health are of major importance in food production, and quality control has become obligatory in this field. Thanks to rapidly developing technologies, it is now possible for automatic and safe quality control of food production. For this purpose, image-processing-based quality control systems used in industrial applications are being employed to analyze the quality of food products. In this study, quality control of chicken (Gallus domesticus) eggs was achieved using a real time image-processing technique. In order to execute the quality control processes, a conveying mechanism was used. Eggs passing on a conveyor belt were continuously photographed in real time by cameras located above the belt. The images obtained were processed by various methods and techniques. Using digital instrumentation, the volume of the eggs was measured, broken/cracked eggs were separated and dirty eggs were determined. In accordance with international standards for classifying the quality of eggs, the class of separated eggs was determined through a fuzzy implication model. According to tests carried out on thousands of eggs, a quality control process with an accuracy of 98% was possible. © 2014 Society of Chemical Industry.
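
    The dirty-egg check in record 6 can be caricatured as a pixel-threshold rule on the camera image. This toy sketch works on a grayscale image represented as a list of rows; the thresholds are invented, not those of the paper:

```python
def classify_egg(image, dark_thresh=80, dirty_frac=0.05):
    """Flag an egg image as dirty when the fraction of dark pixels
    exceeds a cutoff; otherwise call it clean."""
    pixels = [p for row in image for p in row]
    dark = sum(1 for p in pixels if p < dark_thresh)
    return "dirty" if dark / len(pixels) > dirty_frac else "clean"

clean_egg = [[200] * 4 for _ in range(4)]          # uniformly bright shell
dirty_egg = [row[:] for row in clean_egg]
dirty_egg[1][1] = dirty_egg[1][2] = 30             # a dark patch
```

    A production system would add segmentation, crack detection, volume estimation, and the fuzzy classification stage described in the abstract.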

  7. [Analysis and countermeasure for quality risk in process of traditional Chinese medicine preparations].

    PubMed

    Yang, Ming; Yang, Yuan-Zhen; Wang, Ya-Qi; Wu, Zhen-Feng; Wang, Xue-Cheng; Luo, Jing

    2017-03-01

    Product quality relies not only on testing methods, but also on design and development, production control, and logistics management across all aspects of product manufacturing; quality is determined by the level of process control. Therefore, it is very important to accurately identify the factors that may induce quality risk in the production process and to adopt corresponding quality control measures. This article systematically analyzes the sources of quality risk in all aspects of the production process of traditional Chinese medicine preparations, discusses ways and methods for identifying quality risks in such preparations, and provides references for perfecting whole-process quality management of traditional Chinese medicine preparations. Copyright© by the Chinese Pharmaceutical Association.

  8. [A strategy of constructing the technological system for quality control of Chinese medicine based on process control and management].

    PubMed

    Cheng, Yi-Yu; Qian, Zhong-Zhi; Zhang, Bo-Li

    2017-01-01

    The current situation, bottleneck problems and severe challenges in the quality control technology of Chinese medicine (CM) are briefly described. It is proposed to change drug regulation's reliance on post-hoc testing as the main means of control and its neglect of process control, to reverse the underdevelopment of process control and management technology for pharmaceutical manufacturing, and to reconstruct the technological system for quality control of CM products. A regulation and technology system for controlling CM quality based on process control and management should be established to solve weighty practical problems of the CM industry at their root causes, including backward quality control technology, weak quality risk control measures, and the poor reputation of product quality. In this way, the obstacles arising from the poor controllability of CM product quality could be removed. Focusing on the difficult problems and weak links in the technical field of CM quality control, it is proposed to build a CMC (Chemistry, Manufacturing and Controls) regulation for CM products with Chinese characteristics and to promote its international recognition as soon as possible. A CMC technical framework was designed that is clinical efficacy-oriented, manufacturing manner-centered and process control-focused. To address the clinical characteristics of traditional Chinese medicine (TCM) and the production features of CM manufacturing, it is suggested to establish quality control engineering for CM manufacturing by integrating pharmaceutical analysis, TCM chemistry, TCM pharmacology, pharmaceutical engineering, control engineering, management engineering and other disciplines. Further, a theoretical model of quality control engineering for CM manufacturing and a methodology of digital pharmaceutical engineering are proposed. A technology pathway for promoting CM standards and realizing the strategic goal of CM internationalization is elaborated. Copyright© by the Chinese Pharmaceutical Association.

  9. Pharmaceutical quality by design: product and process development, understanding, and control.

    PubMed

    Yu, Lawrence X

    2008-04-01

    The purpose of this paper is to discuss pharmaceutical Quality by Design (QbD) and to describe how it can be used to ensure pharmaceutical quality. QbD is described and some of its elements identified. Process parameters and quality attributes were identified for each unit operation during the manufacture of solid oral dosage forms. The use of QbD was contrasted with the evaluation of product quality by testing alone. QbD is a systematic approach to pharmaceutical development: it means designing and developing formulations and manufacturing processes to ensure predefined product quality. QbD elements include: defining the target product quality profile; designing the product and manufacturing processes; identifying critical quality attributes, process parameters, and sources of variability; and controlling manufacturing processes to produce consistent quality over time. Under QbD, pharmaceutical quality is assured by understanding and controlling formulation and manufacturing variables, while product testing confirms product quality. Implementation of QbD will enable transformation of the chemistry, manufacturing, and controls (CMC) review of abbreviated new drug applications (ANDAs) into a science-based pharmaceutical quality assessment.

  10. A Quality Assurance Initiative for Commercial-Scale Production in High-Throughput Cryopreservation of Blue Catfish Sperm

    PubMed Central

    Hu, E; Liao, T. W.; Tiersch, T. R.

    2013-01-01

    Cryopreservation of fish sperm has been studied for decades at a laboratory (research) scale. However, high-throughput cryopreservation of fish sperm has recently been developed to enable industrial-scale production. This study treated blue catfish (Ictalurus furcatus) sperm high-throughput cryopreservation as a manufacturing production line and initiated quality assurance plan development. The main objectives were to identify: 1) the main production quality characteristics; 2) the process features for quality assurance; 3) the internal quality characteristics and their specification designs; 4) the quality control and process capability evaluation methods, and 5) the directions for further improvements and applications. The essential product quality characteristics were identified as fertility-related characteristics. Specification design, which established tolerance levels according to demand and process constraints, was performed based on these quality characteristics. Meanwhile, to ensure integrity throughout the process, internal quality characteristics (characteristics at each quality control point within the process) that could affect fertility-related quality characteristics were defined with specifications. Because the process features 100% inspection (quality inspection of every fish), cumulative sum (CUSUM) control charts were applied to monitor each quality characteristic. An index of overall process evaluation, process capability, was analyzed based on the in-control process and the designed specifications, further integrating the quality assurance plan. With the established quality assurance plan, the process could operate stably and product quality would be reliable. PMID:23872356
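
    The CUSUM monitoring used in record 10 accumulates small deviations from a target until they cross a decision interval, which makes it sensitive to sustained shifts that no single point would reveal. A minimal tabular CUSUM, with illustrative slack (k) and decision (h) values rather than the authors' chart parameters:

```python
def cusum_signals(values, target, k, h):
    """Tabular CUSUM: track upward (hi) and downward (lo) cumulative
    deviations beyond slack k; report indices where either crosses h."""
    hi = lo = 0.0
    signals = []
    for i, x in enumerate(values):
        hi = max(0.0, hi + (x - target - k))
        lo = max(0.0, lo + (target - x - k))
        if hi > h or lo > h:
            signals.append(i)
    return signals

# A small sustained shift is caught even though no single point is extreme
out = cusum_signals([10, 10, 10, 12, 12, 12, 12], target=10, k=0.5, h=4)
```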

  11. TQM (Total Quality Management) SPARC (Special Process Action Review Committees) Handbook

    DTIC Science & Technology

    1989-08-01

    This document describes the techniques used to support and guide the Special Process Action Review Committees for accomplishing their goals for Total Quality Management (TQM). It includes concepts and definitions, checklists, sample formats, and assessment criteria. Keywords: Continuous process improvement; Logistics information; Process analysis; Quality control; Quality assurance; Total Quality Management ; Statistical processes; Management Planning and control; Management training; Management information systems.

  12. [Study on "multi-dimensional structure and process dynamics quality control system" of Danshen infusion solution based on component structure theory].

    PubMed

    Feng, Liang; Zhang, Ming-Hua; Gu, Jun-Fei; Wang, Gui-You; Zhao, Zi-Yu; Jia, Xiao-Bin

    2013-11-01

    As traditional Chinese medicine (TCM) preparation products feature complex compounds and multiple preparation processes, the implementation of quality control in line with the characteristics of TCM preparation products provides a firm guarantee for the clinical efficacy and safety of TCM preparation products. Danshen infusion solution is a preparation commonly used in the clinic, but its quality control is restricted to indexes of finished products, which cannot guarantee its inherent quality. Our study group has proposed a "multi-dimensional structure and process dynamics quality control system" on the basis of "component structure theory", for the purpose of controlling the quality of Danshen infusion solution at multiple levels and in multiple links, from the efficacy-related material basis and the safety-related material basis to the characteristics of the dosage form and the preparation process. In this article, we bring forward new ideas and models for the quality control of TCM preparation products.

  13. Quality control process improvement of flexible printed circuit board by FMEA

    NASA Astrophysics Data System (ADS)

    Krasaephol, Siwaporn; Chutima, Parames

    2018-02-01

    This research focuses on the quality control process improvement of Flexible Printed Circuit Board (FPCB), centred on model 7-Flex, using the Failure Mode and Effect Analysis (FMEA) method to decrease the proportion of defective finished goods found at the final inspection process. Because defective units are found only at the final inspection process, defects may escape to customers. The problem stems from a quality control process that is not efficient enough to filter defective products in-process, as there is no In-Process Quality Control (IPQC) or sampling inspection in the process. Therefore, the quality control process has to be improved by setting inspection gates and IPQCs at critical processes in order to filter out defective products. The critical processes are analysed by the FMEA method. IPQC is used for detecting defective products and reducing the chance of defective finished goods escaping to customers. Reducing the proportion of defective finished goods also decreases scrap cost, because finished goods incur higher scrap cost than work in-process. Moreover, defective products found during the process can reveal abnormal processes, so engineers and operators can solve problems in a timely manner. The improved quality control was implemented on 7-Flex production lines from July 2017 to September 2017. The results show decreases in the average proportion of defective finished goods and in the average Customer Manufacturers Lot Reject Rate (%LRR of CMs) of 4.5% and 4.1%, respectively. Furthermore, the cost saving from this quality control process amounts to 100K Baht.
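
    The FMEA prioritization described in record 13 typically ranks failure modes by Risk Priority Number (severity x occurrence x detection). A sketch with hypothetical FPCB process scores; the process names and 1-10 ratings are invented for illustration:

```python
def rank_by_rpn(failure_modes):
    """Rank failure modes by RPN = severity * occurrence * detection;
    the top entries are candidates for IPQC inspection gates."""
    scored = [(name, s * o * d) for name, (s, o, d) in failure_modes.items()]
    return sorted(scored, key=lambda item: item[1], reverse=True)

ranking = rank_by_rpn({
    "lamination": (7, 5, 6),   # (severity, occurrence, detection)
    "etching":    (8, 3, 4),
    "drilling":   (5, 2, 3),
})
```

    Inspection gates would then be placed at the highest-RPN steps first, which is how FMEA output feeds the quality control plan.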

  14. [Feedforward control strategy and its application in quality improvement of ethanol precipitation process of danhong injection].

    PubMed

    Yan, Bin-Jun; Guo, Zheng-Tai; Qu, Hai-Bin; Zhao, Bu-Chang; Zhao, Tao

    2013-06-01

    In this work, a feedforward control strategy based on the concept of quality by design was established for the manufacturing of traditional Chinese medicine, to reduce the impact of raw-material quality variation on the drug product. The ethanol precipitation process of Danhong injection was taken as an application case. A Box-Behnken design of experiments was conducted, and mathematical models relating the attributes of the concentrate, the process parameters, and the quality of the supernatants produced were established. An optimization model was then built to calculate the best process parameters based on the attributes of the concentrate. The quality of the supernatants produced by ethanol precipitation with optimized and non-optimized process parameters was compared. The results showed that using the feedforward control strategy for process parameter optimization can control the quality of the supernatants effectively. The proposed feedforward control strategy can enhance the batch-to-batch consistency of the supernatants produced by ethanol precipitation.
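
    The feedforward strategy in record 14 amounts to: measure the raw-material attribute, then pick the process parameter that a response-surface model (such as one fitted from a Box-Behnken design) predicts will give the best quality. A sketch with an invented quadratic model; the coefficients, density value and candidate ratios are assumptions for illustration:

```python
def feedforward_setpoint(attribute, model, candidates):
    """Choose the process parameter maximizing model-predicted quality
    for the measured raw-material attribute (feedforward control)."""
    return max(candidates, key=lambda u: model(attribute, u))

# Hypothetical fitted model: quality peaks at an ethanol ratio that
# shifts with the measured concentrate density
model = lambda density, u: 10.0 - (u - (2.0 + 0.5 * density)) ** 2
best_ratio = feedforward_setpoint(attribute=1.2, model=model,
                                  candidates=[2.0, 2.2, 2.4, 2.6, 2.8, 3.0])
```

    Because the setpoint is recomputed per batch from the measured concentrate attributes, batch-to-batch variation in raw material is compensated before precipitation rather than detected afterwards.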

  15. 42 CFR 431.830 - Basic elements of the Medicaid quality control (MQC) claims processing assessment system.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 4 2010-10-01 2010-10-01 false Basic elements of the Medicaid quality control (MQC... & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL ASSISTANCE PROGRAMS STATE ORGANIZATION AND GENERAL ADMINISTRATION Quality Control Medicaid Quality Control (mqc) Claims Processing...

  16. Process-based quality for thermal spray via feedback control

    NASA Astrophysics Data System (ADS)

    Dykhuizen, R. C.; Neiser, R. A.

    2006-09-01

    Quality control of a thermal spray system manufacturing process is difficult due to the many input variables that need to be controlled. Great care must be taken to ensure that the process remains constant to obtain a consistent quality of the parts. Control is greatly complicated by the fact that measurement of particle velocities and temperatures is a noisy stochastic process. This article illustrates the application of quality control concepts to a wire flame spray process. A central feature of the real-time control system is an automatic feedback control scheme that provides fine adjustments to ensure that uncontrolled variations are accommodated. It is shown how the control vectors can be constructed from simple process maps to independently control particle velocity and temperature. This control scheme is shown to perform well in a real production environment. We also demonstrate that slight variations in the feed wire curvature can greatly influence the process. Finally, the geometry of the spray system and sensor must remain constant for the best reproducibility.
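
    The automatic feedback scheme of record 16 can be pictured as a PI loop nudging a control input until the measured particle velocity sits on its setpoint. This sketch assumes a linear plant response and invented gains; it is a caricature of the noisy, multivariable thermal spray problem:

```python
def pi_controller(setpoint, kp, ki):
    """Return a PI feedback step: maps a measurement to an input adjustment."""
    integral = 0.0
    def step(measurement):
        nonlocal integral
        error = setpoint - measurement
        integral += error
        return kp * error + ki * integral
    return step

ctrl = pi_controller(setpoint=150.0, kp=0.1, ki=0.05)
u, v = 0.0, 0.0
for _ in range(200):
    v = 100.0 + 2.0 * u      # assumed linear plant: velocity vs. input
    u += ctrl(v)             # fine adjustment from the controller
```

    The article's control vectors built from process maps play the role of the plant model here, decoupling velocity from temperature so each can be regulated independently.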

  17. Importance of good manufacturing practices in microbiological monitoring in processing human tissues for transplant.

    PubMed

    Pianigiani, Elisa; Ierardi, Francesca; Fimiani, Michele

    2013-12-01

    Skin allografts represent an important therapeutic resource in the treatment of severe skin loss. The risk associated with application of processed tissues in humans is very low, however, human material always carries the risk of disease transmission. To minimise the risk of contamination of grafts, processing is carried out in clean rooms where air quality is monitored. Procedures and quality control tests are performed to standardise the production process and to guarantee the final product for human use. Since we only validate and distribute aseptic tissues, we conducted a study to determine what type of quality controls for skin processing are the most suitable for detecting processing errors and intercurrent contamination, and for faithfully mapping the process without unduly increasing production costs. Two different methods for quality control were statistically compared using the Fisher exact test. On the basis of the current study we selected our quality control procedure based on pre- and post-processing tissue controls, operator and environmental controls. Evaluation of the predictability of our control methods showed that tissue control was the most reliable method of revealing microbial contamination of grafts. We obtained 100 % sensitivity by doubling tissue controls, while maintaining high specificity (77 %).
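
    The statistical comparison in record 17 uses the Fisher exact test on 2x2 contingency tables. A self-contained two-sided version via the hypergeometric distribution (using math.comb rather than a statistics library); the example counts are invented, not the study's data:

```python
from math import comb

def fisher_exact_2x2(a, b, c, d):
    """Two-sided Fisher exact p-value for the table [[a, b], [c, d]]:
    sum hypergeometric probabilities no larger than the observed one."""
    n, row1, col1 = a + b + c + d, a + b, a + c
    def p(x):                       # P(top-left cell = x) given the margins
        return comb(col1, x) * comb(n - col1, row1 - x) / comb(n, row1)
    p_obs = p(a)
    lo, hi = max(0, row1 - (n - col1)), min(row1, col1)
    return sum(p(x) for x in range(lo, hi + 1) if p(x) <= p_obs + 1e-12)

# Hypothetical detection counts: method A caught 3/4 contaminations, method B 1/4
p_value = fisher_exact_2x2(3, 1, 1, 3)
```

    The exact test is appropriate here because contamination events are rare and cell counts small, where chi-squared approximations break down.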

  18. [Quality process control system of Chinese medicine preparation based on "holistic view"].

    PubMed

    Wang, Ya-Qi; Jiao, Jiao-Jiao; Wu, Zhen-Feng; Zheng, Qin; Yang, Ming

    2018-01-01

    "High quality, safety and effectiveness" are the primary principles for the pharmaceutical research and development process in China. The quality of products relies not only on the inspection method, but also on the design and development, process control and standardized management. The quality depends on the process control level. In this paper, the history and current development of quality control of traditional Chinese medicine (TCM) preparations are reviewed systematically. Based on the development model of international drug quality control and the misunderstanding of quality control of TCM preparations, the reasons for impacting the homogeneity of TCM preparations are analyzed and summarized. According to TCM characteristics, efforts were made to control the diversity of TCM, make "unstable" TCM into "stable" Chinese patent medicines, put forward the concepts of "holistic view" and "QbD (quality by design)", so as to create the "holistic, modular, data, standardized" model as the core of TCM preparation quality process control model. Scientific studies shall conform to the actual production of TCM preparations, and be conducive to supporting advanced equipment and technology upgrade, thoroughly applying the scientific research achievements in Chinese patent medicines, and promoting the cluster application and transformation application of TCM pharmaceutical technology, so as to improve the quality and effectiveness of the TCM industry and realize the green development. Copyright© by the Chinese Pharmaceutical Association.

  19. [Establishment and application of "multi-dimensional structure and process dynamic quality control technology system" in preparation products of traditional Chinese medicine (I)].

    PubMed

    Gu, Jun-Fei; Feng, Liang; Zhang, Ming-Hua; Wu, Chan; Jia, Xiao-Bin

    2013-11-01

    Safety is an important component of the quality control of traditional Chinese medicine (TCM) preparation products, as well as an important guarantee for clinical application. Currently, the quality control of TCMs in the Chinese Pharmacopoeia mostly focuses on compounds indicative of TCM efficacy. TCM preparations involve multiple links from raw materials to products, and each procedure may have an impact on the safety of the preparation. We summarize and analyze the factors affecting safety during the preparation of TCM products, and then expound the important role of the "multi-dimensional structure and process dynamic quality control technology system" in the quality safety of TCM preparations. Because the product quality of TCM preparations is closely related to safety, control over the safety-related material basis is an important component of the product quality control of TCM preparations. Implementing quality control over the dynamic process of TCM preparations from raw materials to products, and improving TCM quality safety control at the microcosmic level, helps lay a firm foundation for the modernization of TCM preparations.

  20. 5 CFR 1315.3 - Responsibilities.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... vendors under this part are subject to periodic quality control validation to be conducted no less frequently than once annually. Quality control processes will be used to confirm that controls are effective and that processes are efficient. Each agency head is responsible for establishing a quality control...

  21. 5 CFR 1315.3 - Responsibilities.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... vendors under this part are subject to periodic quality control validation to be conducted no less frequently than once annually. Quality control processes will be used to confirm that controls are effective and that processes are efficient. Each agency head is responsible for establishing a quality control...

  22. 5 CFR 1315.3 - Responsibilities.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... vendors under this part are subject to periodic quality control validation to be conducted no less frequently than once annually. Quality control processes will be used to confirm that controls are effective and that processes are efficient. Each agency head is responsible for establishing a quality control...

  23. Assessing Subjectivity in Sensor Data Post Processing via a Controlled Experiment

    NASA Astrophysics Data System (ADS)

    Jones, A. S.; Horsburgh, J. S.; Eiriksson, D.

    2017-12-01

    Environmental data collected by in situ sensors must be reviewed to verify validity, and conducting quality control often requires making edits in post processing to generate approved datasets. This process involves decisions by technicians, data managers, or data users on how to handle problematic data. Options include: removing data from a series, retaining data with annotations, and altering data based on algorithms related to adjacent data points or the patterns of data at other locations or of other variables. Ideally, given the same dataset and the same quality control guidelines, multiple data quality control technicians would make the same decisions in data post processing. However, despite the development and implementation of guidelines aimed to ensure consistent quality control procedures, we have faced ambiguity when performing post processing, and we have noticed inconsistencies in the practices of individuals performing quality control post processing. Technicians with the same level of training and using the same input datasets may produce different results, affecting the overall quality and comparability of finished data products. Different results may also be produced by technicians that do not have the same level of training. In order to assess the effect of subjective decision making by the individual technician on the end data product, we designed an experiment where multiple users performed quality control post processing on the same datasets using a consistent set of guidelines, field notes, and tools. We also assessed the effect of technician experience and training by conducting the same procedures with a group of novices unfamiliar with the data and the quality control process and compared their results to those generated by a group of more experienced technicians. 
In this presentation, we report our observations of the degree of subjectivity in sensor data post processing, assessing and quantifying the impacts of the individual technician, as well as of technician experience, on quality controlled data products.

  4. Total Quality Management Implementation Strategy: Directorate of Quality Assurance

    DTIC Science & Technology

    1989-05-01

    … Total Quality Control; Harrington, H. James, The Improvement Process; Imai, Masaaki, Kaizen; Ishikawa, Kaoru, What is Total Quality Control; Ishikawa, Kaoru, … Statistical Quality Control; Juran, J. M., Managerial Breakthrough; Juran, J. M., Quality Control Handbook; Mizuno (Ed.), Managing for Quality Improvements

  5. [Quality by design approaches for pharmaceutical development and manufacturing of Chinese medicine].

    PubMed

    Xu, Bing; Shi, Xin-Yuan; Wu, Zhi-Sheng; Zhang, Yan-Ling; Wang, Yun; Qiao, Yan-Jiang

    2017-03-01

    Pharmaceutical quality is built by design, formed in the manufacturing process, and improved over the product lifecycle. Based on a comprehensive literature review of pharmaceutical quality by design (QbD), the essential ideas and implementation strategies of pharmaceutical QbD were interpreted. Considering the complex nature of Chinese medicine, a "4H" model was proposed for implementing QbD in the pharmaceutical development and industrial manufacture of Chinese medicine products. "4H" stands for holistic design, holistic information analysis, holistic quality control, and holistic process optimization, consistent with the holistic concept of Chinese medicine theory. Holistic design aims at constructing both the quality problem space from patient requirements and the quality solution space from multidisciplinary knowledge. Holistic information analysis emphasizes understanding the quality pattern of Chinese medicine by integrating and mining multi-source data and information at a relatively high level. Batch-to-batch quality consistency and manufacturing system reliability can be realized by the comprehensive application of inspective, statistical, predictive, and intelligent quality control strategies. Holistic process optimization improves product quality and process capability during product lifecycle management. Implementing QbD helps to eliminate contradictions in the pharmaceutical development and manufacturing ecosystem of Chinese medicine products, and helps ensure cost effectiveness. Copyright© by the Chinese Pharmaceutical Association.

  6. Statistical process control methods allow the analysis and improvement of anesthesia care.

    PubMed

    Fasting, Sigurd; Gisvold, Sven E

    2003-10-01

    Quality aspects of the anesthetic process are reflected in the rate of intraoperative adverse events. The purpose of this report is to illustrate how the quality of the anesthesia process can be analyzed using statistical process control methods, and exemplify how this analysis can be used for quality improvement. We prospectively recorded anesthesia-related data from all anesthetics for five years. The data included intraoperative adverse events, which were graded into four levels, according to severity. We selected four adverse events, representing important quality and safety aspects, for statistical process control analysis. These were: inadequate regional anesthesia, difficult emergence from general anesthesia, intubation difficulties and drug errors. We analyzed the underlying process using 'p-charts' for statistical process control. In 65,170 anesthetics we recorded adverse events in 18.3%; mostly of lesser severity. Control charts were used to define statistically the predictable normal variation in problem rate, and then used as a basis for analysis of the selected problems with the following results: Inadequate plexus anesthesia: stable process, but unacceptably high failure rate; Difficult emergence: unstable process, because of quality improvement efforts; Intubation difficulties: stable process, rate acceptable; Medication errors: methodology not suited because of low rate of errors. By applying statistical process control methods to the analysis of adverse events, we have exemplified how this allows us to determine if a process is stable, whether an intervention is required, and if quality improvement efforts have the desired effect.
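
    A minimal sketch of the p-chart construction described above, using hypothetical monthly adverse-event counts rather than the study's data:

```python
import math

def p_chart(events, cases):
    """Compute p-chart center line and 3-sigma control limits
    for adverse-event proportions with varying subgroup sizes."""
    pbar = sum(events) / sum(cases)  # center line: pooled proportion
    out = []
    for x, n in zip(events, cases):
        sigma = math.sqrt(pbar * (1 - pbar) / n)
        ucl = pbar + 3 * sigma
        lcl = max(0.0, pbar - 3 * sigma)  # proportion cannot go below 0
        p = x / n
        out.append((p, lcl, ucl, lcl <= p <= ucl))  # in-control flag
    return pbar, out

# Hypothetical data: adverse events / anesthetics per month
events = [19, 22, 18, 40, 21]
cases = [110, 120, 105, 115, 118]
pbar, points = p_chart(events, cases)
for i, (p, lcl, ucl, ok) in enumerate(points, 1):
    print(f"month {i}: p={p:.3f} limits=({lcl:.3f}, {ucl:.3f}) "
          f"{'in' if ok else 'OUT of'} control")
```

    Because subgroup sizes differ, the limits are recomputed per month; a point outside its limits (month 4 here) signals a special cause rather than the predictable normal variation.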

  7. [Investigation on production process quality control of traditional Chinese medicine--Banlangen granule as an example].

    PubMed

    Tan, Manrong; Yan, Dan; Qiu, Lingling; Chen, Longhu; Yan, Yan; Jin, Cheng; Li, Hanbing; Xiao, Xiaohe

    2012-04-01

    The quality management system for herbal medicines, intermediates, and finished products suffers from a "short board" (weakest-link) effect in its methodologies. Based on the concept of process control, new strategies and methods for production process quality control were established, taking into account the actual production of traditional Chinese medicine and the characteristics of Chinese medicine. Taking Banlangen granule, an effective and widely used product, as a practical example, character identification, determination of index components, chemical fingerprinting, and biometric technology were used in sequence to assess the quality of Banlangen herbal materials, intermediates (water extraction and alcohol precipitation), and the finished product. With the transfer rate of chemical information and biological potency as indicators, the effectiveness and transmission of the above assessment and control methods were studied. Ultimately, process quality control methods for Banlangen granule, based on chemical composition analysis and biometric analysis, were established. These can not only resolve the current situation in which many manufacturers produce Banlangen granule of varying quality, but also ensure and enhance its clinical efficacy. Furthermore, this work provides a foundation for constructing quality control of the traditional Chinese medicine production process.

  8. 21 CFR 111.105 - What must quality control personnel do?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 2 2010-04-01 2010-04-01 false What must quality control personnel do? 111.105..., LABELING, OR HOLDING OPERATIONS FOR DIETARY SUPPLEMENTS Production and Process Control System: Requirements for Quality Control § 111.105 What must quality control personnel do? Quality control personnel must...

  9. Root Cause Analysis of Quality Defects Using HPLC-MS Fingerprint Knowledgebase for Batch-to-batch Quality Control of Herbal Drugs.

    PubMed

    Yan, Binjun; Fang, Zhonghua; Shen, Lijuan; Qu, Haibin

    2015-01-01

    The batch-to-batch quality consistency of herbal drugs has always been an important issue. The objective was to propose a methodology for batch-to-batch quality control based on HPLC-MS fingerprints and a process knowledgebase. The extraction process of Compound E-jiao Oral Liquid was taken as a case study. After establishing the HPLC-MS fingerprint analysis method, the fingerprints of extract solutions produced under normal and abnormal operating conditions were obtained. Multivariate statistical models were built for fault detection, and a discriminant analysis model was built using the probabilistic discriminant partial least squares method for fault diagnosis. Based on multivariate statistical analysis, process knowledge was acquired and the cause-effect relationship between process deviations and quality defects was revealed. The quality defects were detected successfully by multivariate statistical control charts, and the types of process deviations were diagnosed correctly by discriminant analysis. This work has demonstrated the benefits of combining HPLC-MS fingerprints, process knowledge, and multivariate analysis for the quality control of herbal drugs. Copyright © 2015 John Wiley & Sons, Ltd.
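
    The fault-detection step, monitoring new batches against a multivariate control limit, can be sketched for a simplified two-indicator case. The indicator values and batches below are invented for illustration (the paper's actual models used full HPLC-MS fingerprints, and limits are often based on the F distribution rather than the chi-square approximation used here):

```python
import math

def t2_monitor(reference, new_batches, alpha=0.01):
    """Hotelling T^2 fault detection for two quality indicators
    (e.g. two fingerprint peak areas). Reference batches define
    normal operation; a new batch is flagged when its T^2 exceeds
    the chi-square limit (exact closed form for 2 degrees of freedom)."""
    n = len(reference)
    mx = sum(x for x, _ in reference) / n
    my = sum(y for _, y in reference) / n
    # sample covariance matrix of the reference (normal) batches
    sxx = sum((x - mx) ** 2 for x, _ in reference) / (n - 1)
    syy = sum((y - my) ** 2 for _, y in reference) / (n - 1)
    sxy = sum((x - mx) * (y - my) for x, y in reference) / (n - 1)
    det = sxx * syy - sxy ** 2
    limit = -2.0 * math.log(alpha)  # chi2 quantile with df = 2
    results = []
    for x, y in new_batches:
        dx, dy = x - mx, y - my
        # T^2 = d' S^{-1} d, with the 2x2 matrix inverse written out
        t2 = (syy * dx * dx - 2 * sxy * dx * dy + sxx * dy * dy) / det
        results.append((t2, t2 > limit))
    return results

# Hypothetical peak areas for six normal reference batches
reference = [(10.1, 5.0), (9.9, 5.1), (10.0, 4.9),
             (10.2, 5.2), (9.8, 4.8), (10.0, 5.0)]
# One normal-looking and one deviating batch to monitor
flags = t2_monitor(reference, [(10.05, 5.05), (10.4, 4.6)])
```

    Unlike two separate univariate charts, the T² statistic accounts for the correlation between indicators, so a batch that breaks the usual correlation pattern is flagged even if each indicator alone looks acceptable.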

  10. A system framework of inter-enterprise machining quality control based on fractal theory

    NASA Astrophysics Data System (ADS)

    Zhao, Liping; Qin, Yongtao; Yao, Yiyong; Yan, Peng

    2014-03-01

    In order to meet the quality control requirements of dynamic and complicated product machining processes among enterprises, a system framework of inter-enterprise machining quality control based on fractal theory was proposed. In this system framework, the fractal-specific characteristic of the inter-enterprise machining quality control function was analysed, and the model of inter-enterprise machining quality control was constructed from the nature of fractal structures. Furthermore, the goal-driven strategy of inter-enterprise quality control and the dynamic organisation strategy of inter-enterprise quality improvement were constructed by characteristic analysis of this model. In addition, the architecture of inter-enterprise machining quality control based on fractal theory was established by means of Web services. Finally, a case study was presented. The results showed that the proposed method was feasible and could provide guidance for quality control and support for product reliability in inter-enterprise machining processes.

  11. Off-Line Quality Control In Integrated Circuit Fabrication Using Experimental Design

    NASA Astrophysics Data System (ADS)

    Phadke, M. S.; Kackar, R. N.; Speeney, D. V.; Grieco, M. J.

    1987-04-01

    Off-line quality control is a systematic method of optimizing production processes and product designs. It is widely used in Japan to produce high quality products at low cost. The method was introduced to us by Professor Genichi Taguchi who is a Deming-award winner and a former Director of the Japanese Academy of Quality. In this paper we will i) describe the off-line quality control method, and ii) document our efforts to optimize the process for forming contact windows in 3.5 μm CMOS circuits fabricated in the Murray Hill Integrated Circuit Design Capability Laboratory. In the fabrication of integrated circuits it is critically important to produce contact windows of size very near the target dimension. Windows which are too small or too large lead to loss of yield. The off-line quality control method has improved both the process quality and productivity. The variance of the window size has been reduced by a factor of four. Also, processing time for window photolithography has been substantially reduced. The key steps of off-line quality control are: i) Identify important manipulatable process factors and their potential working levels. ii) Perform fractional factorial experiments on the process using orthogonal array designs. iii) Analyze the resulting data to determine the optimum operating levels of the factors. Both the process mean and the process variance are considered in this analysis. iv) Conduct an additional experiment to verify that the new factor levels indeed give an improvement.
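
    Steps ii) and iii), running an orthogonal-array experiment and choosing factor levels that keep the response on target with small variance, can be sketched with Taguchi's nominal-the-best signal-to-noise ratio. The L4 array and replicate measurements below are illustrative, not the paper's window-photolithography data:

```python
import math

# L4 orthogonal array: 3 two-level factors in 4 runs (levels coded 0/1);
# every pair of columns contains each level combination equally often.
L4 = [(0, 0, 0), (0, 1, 1), (1, 0, 1), (1, 1, 0)]

def sn_nominal(values):
    """Nominal-the-best signal-to-noise ratio: 10 log10(mean^2 / variance).
    Larger is better: a high ratio means small spread around the mean."""
    n = len(values)
    m = sum(values) / n
    var = sum((v - m) ** 2 for v in values) / (n - 1)
    return 10 * math.log10(m * m / var)

def best_levels(measurements):
    """measurements[i] holds the replicate results for L4 row i.
    Returns, for each factor, the level with the higher mean S/N."""
    sn = [sn_nominal(m) for m in measurements]
    best = []
    for f in range(3):
        # each level of each factor appears in exactly 2 of the 4 runs
        avg = {lv: sum(s for row, s in zip(L4, sn) if row[f] == lv) / 2
               for lv in (0, 1)}
        best.append(max(avg, key=avg.get))
    return best

# Hypothetical window-size replicates (arbitrary units) for the 4 runs;
# level 1 of factor 0 visibly tightens the spread.
meas = [[3.0, 3.4, 2.6], [3.1, 3.5, 2.7], [3.0, 3.05, 2.95], [3.2, 3.25, 3.15]]
best = best_levels(meas)
```

    Averaging the S/N ratio over the runs at each level isolates each factor's effect, which is what the balanced structure of the orthogonal array makes possible with only 4 of the 8 full-factorial runs.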

  12. The process of managerial control in quality improvement initiatives.

    PubMed

    Slovensky, D J; Fottler, M D

    1994-11-01

    The fundamental intent of strategic management is to position an organization within its market to exploit organizational competencies and strengths to gain competitive advantage. Competitive advantage may be achieved through such strategies as low cost, high quality, or unique services or products. For health care organizations accredited by the Joint Commission on Accreditation of Healthcare Organizations, continually improving both processes and outcomes of organizational performance--quality improvement--in all operational areas of the organization is a mandated strategy. Defining and measuring quality and controlling the quality improvement strategy remain problematic. The article discusses the nature and processes of managerial control, some potential measures of quality, and related information needs.

  13. [Discussion on research thinking of traditional Chinese medicine standardization system based on whole process quality control].

    PubMed

    Dong, Ling; Sun, Yu; Pei, Wen-Xuan; Dai, Jun-Dong; Wang, Zi-Yu; Pan, Meng; Chen, Jiang-Peng; Wang, Yun

    2017-12-01

    The concept of "quality by design" indicates that good design over the whole life cycle of pharmaceutical production enables the drug to meet the expected quality requirements. Aiming at the existing problems of the traditional Chinese medicine (TCM) industry, a TCM standardization system was put forward in this paper at the national strategic level, guided by the quality control ideas of the international manufacturing industry and with consideration of the TCM industry's own characteristics and development status. The strategy is to establish five interrelated systems: a multi-indicator system based on the tri-indicator system, a quality standard and specification system for TCM herbal materials and decoction pieces, a quality traceability system, a data monitoring system based on whole-process quality control, and a whole-process quality management system for TCM, and to achieve systematic and scientific study of the whole process in the TCM industry through a "top-level design, implement in steps, system integration" workflow. This article analyzed the correlations among the quality standards of all links, established standard operating procedures for each link and for the whole process, and constructed a high-standard overall quality management system for TCM industry chains, in order to provide a demonstration for establishing a TCM whole-process quality control system and a systematic reference and basis for the standardization strategy in the TCM industry. Copyright© by the Chinese Pharmaceutical Association.

  14. The utilization of six sigma and statistical process control techniques in surgical quality improvement.

    PubMed

    Sedlack, Jeffrey D

    2010-01-01

    Surgeons have been slow to incorporate industrial reliability techniques. Process control methods were applied to surgeon waiting time between cases, and to length of stay (LOS) after colon surgery. Waiting times between surgeries were evaluated by auditing the operating room records of a single hospital over a 1-month period. The medical records of 628 patients undergoing colon surgery over a 5-year period were reviewed. The average surgeon wait time between cases was 53 min, and the busiest surgeon spent 29½ hr in 1 month waiting between surgeries. Process control charting demonstrated poor overall control of the room turnover process. Average LOS after colon resection also demonstrated very poor control. Mean LOS was 10 days. Weibull's conditional analysis revealed a conditional LOS of 9.83 days. Serious process management problems were identified in both analyses. These process issues are both expensive and adversely affect the quality of service offered by the institution. Process control mechanisms were suggested or implemented to improve these surgical processes. Industrial reliability and quality management tools can easily and effectively identify process control problems that occur on surgical services. © 2010 National Association for Healthcare Quality.
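
    An individuals/moving-range (XmR) chart is one common way to put a continuous measure such as length of stay under process control, as in the charting described above. The LOS values below are hypothetical, not the study's data:

```python
def xmr_limits(values):
    """Individuals/moving-range (XmR) chart limits, a standard SPC
    choice for individual continuous observations."""
    n = len(values)
    xbar = sum(values) / n
    mrs = [abs(b - a) for a, b in zip(values, values[1:])]
    mr_bar = sum(mrs) / len(mrs)
    # 2.66 = 3 / d2, with d2 = 1.128 for moving ranges of size 2
    ucl = xbar + 2.66 * mr_bar
    lcl = xbar - 2.66 * mr_bar
    return xbar, lcl, ucl

# Hypothetical LOS values (days) for successive colon resections
los = [9, 11, 8, 10, 12, 9, 30, 10, 11, 9]
xbar, lcl, ucl = xmr_limits(los)
outliers = [v for v in los if v > ucl or v < lcl]
```

    Points outside the limits (the 30-day stay here) indicate special-cause variation worth investigating individually; if most points crowd the limits, the process itself is poorly controlled.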

  15. Image processing system performance prediction and product quality evaluation

    NASA Technical Reports Server (NTRS)

    Stein, E. K.; Hammill, H. B. (Principal Investigator)

    1976-01-01

    The author has identified the following significant results. A new technique for image processing system performance prediction and product quality evaluation was developed. It was entirely objective, quantitative, and general, and should prove useful in system design and quality control. The technique and its application to determination of quality control procedures for the Earth Resources Technology Satellite NASA Data Processing Facility are described.

  16. Interim results of quality-control sampling of surface water for the Upper Colorado River National Water-Quality Assessment Study Unit, water years 1995-96

    USGS Publications Warehouse

    Spahr, N.E.; Boulger, R.W.

    1997-01-01

    Quality-control samples provide part of the information needed to estimate the bias and variability that result from sample collection, processing, and analysis. Quality-control samples of surface water collected for the Upper Colorado River National Water-Quality Assessment study unit for water years 1995–96 are presented and analyzed in this report. The types of quality-control samples collected include pre-processing split replicates, concurrent replicates, sequential replicates, post-processing split replicates, and field blanks. Analysis of the pre-processing split replicates, concurrent replicates, sequential replicates, and post-processing split replicates is based on differences between analytical results of the environmental samples and analytical results of the quality-control samples. Results of these comparisons indicate that variability introduced by sample collection, processing, and handling is low and will not affect interpretation of the environmental data. The differences for most water-quality constituents are on the order of plus or minus 1 or 2 lowest rounding units. A lowest rounding unit is equivalent to the magnitude of the least significant figure reported for analytical results. The use of lowest rounding units avoids some of the difficulty in comparing differences between pairs of samples when concentrations span orders of magnitude and provides a measure of the practical significance of the effect of variability. Analysis of field-blank quality-control samples indicates that, with the exception of chloride and silica, no systematic contamination of samples is apparent. Chloride contamination probably was the result of incomplete rinsing of the dilute cleaning solution from the outlet ports of the decaport sample splitter. Silica contamination seems to have been introduced by the blank water. Sampling and processing procedures for water year 1997 have been modified as a result of these analyses.
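
    The comparison of replicate pairs in lowest rounding units can be sketched as follows; the reported values and the constituent named are invented examples, not the study's data:

```python
from decimal import Decimal

def lru_difference(env, qc):
    """Difference between an environmental-sample result and its
    replicate quality-control result, expressed in lowest rounding
    units (the place value of the least significant reported digit).
    Results are passed as strings so the reported precision survives."""
    e, q = Decimal(env), Decimal(qc)
    # exponent of the least significant figure, e.g. "0.12" -> -2
    exp = min(e.as_tuple().exponent, q.as_tuple().exponent)
    lru = Decimal(1).scaleb(exp)  # 10 ** exp
    return int((e - q) / lru)

# Hypothetical replicate pairs: reported dissolved chloride, mg/L
print(lru_difference("2.4", "2.3"))   # 1 lowest rounding unit apart
print(lru_difference("156", "154"))   # 2 lowest rounding units apart
```

    Expressing both a low-concentration pair and a high-concentration pair in lowest rounding units puts them on the same practical-significance scale, which is the point the report makes about comparing concentrations that span orders of magnitude.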

  17. Project T.E.A.M. (Technical Education Advancement Modules). Advanced Statistical Process Control.

    ERIC Educational Resources Information Center

    Dunlap, Dale

    This instructional guide, one of a series developed by the Technical Education Advancement Modules (TEAM) project, is a 20-hour advanced statistical process control (SPC) and quality improvement course designed to develop the following competencies: (1) understanding quality systems; (2) knowing the process; (3) solving quality problems; and (4)…

  18. Artificial Intelligence Approach to Support Statistical Quality Control Teaching

    ERIC Educational Resources Information Center

    Reis, Marcelo Menezes; Paladini, Edson Pacheco; Khator, Suresh; Sommer, Willy Arno

    2006-01-01

    Statistical quality control--SQC (consisting of Statistical Process Control, Process Capability Studies, Acceptance Sampling and Design of Experiments) is a very important tool to obtain, maintain and improve the Quality level of goods and services produced by an organization. Despite its importance, and the fact that it is taught in technical and…

  19. Measuring Software Product Quality: The ISO 25000 Series and CMMI

    DTIC Science & Technology

    2004-06-14

    performance objectives” covers objectives and requirements for product quality, service quality, and process performance. Process performance objectives...such that product quality, service quality, and process performance attributes are measurable and controlled throughout the project (internal and

  20. Development of a Premium Quality Plasma-derived IVIg (IQYMUNE®) Utilizing the Principles of Quality by Design-A Worked-through Case Study.

    PubMed

    Paolantonacci, Philippe; Appourchaux, Philippe; Claudel, Béatrice; Ollivier, Monique; Dennett, Richard; Siret, Laurent

    2018-01-01

    Polyvalent human normal immunoglobulins for intravenous use (IVIg), indicated for rare and often severe diseases, are complex plasma-derived protein preparations. A quality by design approach has been used to develop the Laboratoire Français du Fractionnement et des Biotechnologies new-generation IVIg, targeting a high level of purity to generate an enhanced safety profile while maintaining a high level of efficacy. A modular approach of quality by design was implemented, consisting of five consecutive steps to cover all the stages from the product design to the final product control strategy. A well-defined target product profile was translated into 27 product quality attributes that formed the basis of the process design. In parallel, a product risk analysis was conducted and identified 19 critical quality attributes among the product quality attributes. Process risk analysis was carried out to establish the links between process parameters and critical quality attributes. Twelve critical steps were identified, and for each of these steps a risk mitigation plan was established. Among the different process risk mitigation exercises, five process robustness studies were conducted at qualified small scale with a design of experiment approach. For each process step, critical process parameters were identified and, for each critical process parameter, proven acceptable ranges were established. 
The quality risk management and risk mitigation outputs, including verification of proven acceptable ranges, were used to design the process verification exercise at industrial scale. Finally, the control strategy was established using a mix, or hybrid, of the traditional approach plus elements of the quality by design enhanced approach, as illustrated, to more robustly assign material and process controls and to securely meet product specifications. The advantages of this quality by design approach were improved process knowledge for industrial design and process validation, and a clear justification of the process and product specifications as a basis for the control strategy and future comparability exercises. © PDA, Inc. 2018.

  1. Importance of implementing an analytical quality control system in a core laboratory.

    PubMed

    Marques-Garcia, F; Garcia-Codesal, M F; Caro-Narros, M R; Contreras-SanFeliciano, T

    2015-01-01

    The aim of the clinical laboratory is to provide useful information for the screening, diagnosis, and monitoring of disease. The laboratory should ensure the quality of the extra-analytical and analytical processes, based on set criteria. To do this, it develops and implements an internal quality control system designed to detect errors, and compares its data with those of other laboratories through external quality control, giving it a tool to verify that the objectives set are being met and, when errors occur, to take corrective actions and ensure the reliability of results. This article describes the design and implementation of an internal quality control protocol, together with its periodic assessment (at 6-month intervals) to determine compliance with pre-determined specifications (Stockholm Consensus(1)). A total of 40 biochemical and 15 immunochemical methods were evaluated using three different control materials. Next, a standard operating procedure was planned to develop a system of internal quality control that included calculating the error of the analytical process, setting quality specifications, and verifying compliance. The quality control data were then statistically summarized as means, standard deviations, and coefficients of variation, as well as systematic, random, and total errors. The quality specifications were then fixed, and the operational rules to apply in the analytical process were calculated. Finally, our data were compared with those of other laboratories through an external quality assurance program. The development of an analytical quality control system is a highly structured process, and should be designed to detect errors that compromise the stability of the analytical process. The laboratory should review its quality indicators (systematic, random, and total error) at regular intervals to ensure that pre-determined specifications are being met, and if not, apply the appropriate corrective actions. 
Copyright © 2015 SECA. Published by Elsevier Espana. All rights reserved.
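
    The quality-control statistics mentioned, mean, standard deviation, coefficient of variation, and systematic/random/total error, can be sketched as follows. The control results, analyte, target, and allowable-error limit are hypothetical, and TE = |bias%| + 1.65·CV% is one common formulation rather than necessarily the one this laboratory used:

```python
import math

def qc_metrics(results, target):
    """Internal QC summary for one control material: mean, SD, CV%,
    systematic error (bias% vs. target), random error (CV%), and
    total error, using the common TE = |bias%| + 1.65 * CV% formula."""
    n = len(results)
    mean = sum(results) / n
    sd = math.sqrt(sum((x - mean) ** 2 for x in results) / (n - 1))
    cv = 100 * sd / mean            # random error as a percentage
    bias = 100 * (mean - target) / target  # systematic error
    total_error = abs(bias) + 1.65 * cv
    return {"mean": mean, "sd": sd, "cv": cv, "bias": bias, "te": total_error}

# Hypothetical glucose control material with a 100 mg/dL target
m = qc_metrics([102, 98, 101, 99, 100, 103, 97, 100], target=100.0)
ok = m["te"] <= 6.9  # compare against an (assumed) allowable total error
```

    Reviewing these indicators at regular intervals, and comparing the computed total error against the allowable total error from the chosen quality specification, is how compliance with pre-determined specifications is verified in practice.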

  2. [Establishment of industry promotion technology system in Chinese medicine secondary exploitation based on "component structure theory"].

    PubMed

    Cheng, Xu-Dong; Feng, Liang; Zhang, Ming-Hua; Gu, Jun-Fei; Jia, Xiao-Bin

    2014-10-01

    The purpose of the secondary exploitation of Chinese medicine is to improve the quality of Chinese medicine products, enhance their core competitiveness, make them more useful in clinical practice, and more effectively relieve patient suffering. Herbal materials, extraction, separation, refining, preparation, and quality control are all involved in the industrial production side of the secondary exploitation of Chinese medicine. Quality improvement and industry promotion of Chinese medicine can be realized through whole-process optimization, quality control, and overall process improvement. Based on the "component structure theory", the "multi-dimensional structure & process dynamic quality control system", and the systematic and holistic character of Chinese medicine, impacts on the whole process were discussed. A technology system for Chinese medicine industry promotion was built to provide a theoretical basis for improving the quality and efficacy of secondarily developed traditional Chinese medicine products.

  3. Control by quality: proposition of a typology.

    PubMed

    Pujo, P; Pillet, M

    The application of Quality tools and methods in industrial management has always had a fundamental impact on the control of production. It influences the behavior of the actors concerned, while introducing the necessary notions and formalizations, especially for production systems with little or no automation, which constitute a large part of the industrial activity. Several quality approaches are applied in the workshop and are implemented at the level of the control. In this paper, the authors present a typology of the various approaches that have successively influenced control, such as statistical process control, quality assurance, and continuous improvement. First the authors present a parallel between production control and quality organizational structure. They note the duality between control, which is aimed at increasing productivity, and quality, which aims to satisfy the needs of the customer. They also note the hierarchical organizational structure of these two systems of management with, at each level, the notion of a feedback loop. This notion is fundamental to any kind of decision making. The paper is organized around the operational, tactical, and strategic levels, by describing for each level the main methods and tools for control by quality. The overview of these tools and methods starts at the operational level, with the Statistical Process Control, the Taguchi technique, and the "six sigma" approach. On the tactical level, we find a quality system approach, with a documented description of the procedures introduced in the firm. The management system can refer here to Quality Assurance, Total Productive Maintenance, or Management by Total Quality. The formalization through procedures of the rules of decision governing the process control enhances the validity of these rules. This leads to the enhancement of their reliability and to their consolidation. 
All this counterbalances the intrinsically variable behavior of the human operators performing control. Strategic control by quality is then detailed, and the two main approaches, the continuous improvement approach and the proactive improvement approach, are introduced. Finally, the authors observe that at each of the three levels, continuous process improvement, which is a component of Total Quality, becomes an essential preoccupation for the control. Ultimately, the recursive utilization of the Deming cycle remains the best practice for control by quality.

  4. 7 CFR 58.928 - Quality control tests.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 3 2011-01-01 2011-01-01 false Quality control tests. 58.928 Section 58.928... Procedures § 58.928 Quality control tests. All dairy products and other ingredients shall be subject to inspection for quality and condition throughout each processing operation. Quality control tests shall be...

  5. 7 CFR 58.335 - Quality control tests.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 3 2011-01-01 2011-01-01 false Quality control tests. 58.335 Section 58.335... Procedures § 58.335 Quality control tests. All milk, cream and related products are subject to inspection for quality and condition throughout each processing operation. Quality control tests shall be made on flow...

  6. Quality control education in the community college

    NASA Technical Reports Server (NTRS)

    Greene, J. Griffen; Wilson, Steve

    1966-01-01

    This paper describes the Quality Control Program at Daytona Beach Junior College, including course descriptions. The program in quality control required communication between the college and the American Society for Quality Control (ASQC). The college has machinery established for certification of the learning process, and the society has the source of teachers who are competent in the technical field and who are the employers of the educational products. The associate degree for quality control does not have a fixed program, which can serve all needs, any more than all engineering degrees have identical programs. The main ideas which would be common to all quality control programs are the concept of economic control of a repetitive process and the concept of developing individual potentialities into individuals who are needed and productive.

  7. 21 CFR 111.117 - What quality control operations are required for equipment, instruments, and controls?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 2 2010-04-01 2010-04-01 false What quality control operations are required for equipment, instruments, and controls? 111.117 Section 111.117 Food and Drugs FOOD AND DRUG ADMINISTRATION... and Process Control System: Requirements for Quality Control § 111.117 What quality control operations...

  8. 42 CFR 431.834 - Access to records: Claims processing assessment systems.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... ADMINISTRATION Quality Control Medicaid Quality Control (mqc) Claims Processing Assessment System § 431.834 Access to records: Claims processing assessment systems. The agency, upon written request, must provide HHS staff with access to all records pertaining to its MQC claims processing assessment system reviews...

  9. Defining the best quality-control systems by design and inspection.

    PubMed

    Hinckley, C M

    1997-05-01

    Not all of the many approaches to quality control are equally effective. Nonconformities in laboratory testing are caused basically by excessive process variation and mistakes. Statistical quality control can effectively control process variation, but it cannot detect or prevent most mistakes. Because mistakes or blunders are frequently the dominant source of nonconformities, we conclude that statistical quality control by itself is not effective. I explore the 100% inspection methods essential for controlling mistakes. Unlike the inspection techniques that Deming described as ineffective, the new "source" inspection methods can detect mistakes and enable corrections before nonconformities are generated, achieving the highest degree of quality at a fraction of the cost of traditional methods. Key relationships between task complexity and nonconformity rates are also described, along with cultural changes that are essential for implementing the best quality-control practices.

  10. 75 FR 16022 - Airworthiness Directives; Turbomeca S.A. MAKILA 1A and 1A1 Turboshaft Engines

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-31

    ... investigation by Turboméca are that these malfunctions are due to a lapse of quality control in the... Turbomeca are that these malfunctions are due to a lapse of quality control in the varnishing process... these malfunctions are due to a lapse of quality control in the varnishing process applied to the boards...

  11. 78 FR 69103 - 30-Day Notice of Proposed Information Collection: Quality Control for Rental Assistance Subsidy...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-18

    ... Information Collection: Quality Control for Rental Assistance Subsidy Determinations AGENCY: Office of the... Collection Title of Information Collection: Quality Control for Rental Assistance Subsidy Determinations. OMB... Quality Control process involves selecting a nationally representative sample of assisted households to...

  12. 21 CFR 111.135 - What quality control operations are required for product complaints?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 2 2010-04-01 2010-04-01 false What quality control operations are required for... MANUFACTURING, PACKAGING, LABELING, OR HOLDING OPERATIONS FOR DIETARY SUPPLEMENTS Production and Process Control System: Requirements for Quality Control § 111.135 What quality control operations are required for...

  13. 21 CFR 111.65 - What are the requirements for quality control operations?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 2 2010-04-01 2010-04-01 false What are the requirements for quality control... Process Control System § 111.65 What are the requirements for quality control operations? You must implement quality control operations in your manufacturing, packaging, labeling, and holding operations for...

  14. Interrupted Time Series Versus Statistical Process Control in Quality Improvement Projects.

    PubMed

    Andersson Hagiwara, Magnus; Andersson Gäre, Boel; Elg, Mattias

    2016-01-01

    To measure the effect of quality improvement interventions, it is appropriate to use analysis methods that measure data over time. Examples of such methods include statistical process control analysis and interrupted time series with segmented regression analysis. This article compares the use of statistical process control analysis and interrupted time series with segmented regression analysis for evaluating the longitudinal effects of quality improvement interventions, using an example study on an evaluation of a computerized decision support system.
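
The segmented-regression model behind interrupted time series analysis is compact enough to sketch. Below is a minimal numpy-only illustration on synthetic data; the intervention point, coefficients and noise level are invented for the example.

```python
import numpy as np

def fit_segmented(t, y, t0):
    """Interrupted-time-series (segmented regression) fit:
    y = b0 + b1*t + b2*post + b3*(t - t0)*post,
    where post = 1 after the intervention at time t0.
    b2 is the level change, b3 the trend change."""
    t = np.asarray(t, dtype=float)
    post = (t >= t0).astype(float)
    X = np.column_stack([np.ones_like(t), t, post, (t - t0) * post])
    beta, *_ = np.linalg.lstsq(X, np.asarray(y, dtype=float), rcond=None)
    return beta

# Synthetic monthly series: after month 24 the level drops by 5
# and the trend steepens by -0.3 per month.
rng = np.random.default_rng(0)
t = np.arange(48)
y = 50 + 0.2 * t - 5.0 * (t >= 24) - 0.3 * (t - 24) * (t >= 24)
y = y + rng.normal(0, 0.5, size=t.size)
b0, b1, b2, b3 = fit_segmented(t, y, t0=24)
print(round(b2, 1), round(b3, 2))  # recovers the level and trend changes
```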

  15. 33 CFR 385.21 - Quality control.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 33 Navigation and Navigable Waters 3 2011-07-01 2011-07-01 false Quality control. 385.21 Section... Processes § 385.21 Quality control. (a) The Corps of Engineers and the non-Federal sponsor shall prepare a quality control plan, in accordance with applicable Corps of Engineers regulations, for each product that...

  16. 33 CFR 385.21 - Quality control.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 33 Navigation and Navigable Waters 3 2013-07-01 2013-07-01 false Quality control. 385.21 Section... Processes § 385.21 Quality control. (a) The Corps of Engineers and the non-Federal sponsor shall prepare a quality control plan, in accordance with applicable Corps of Engineers regulations, for each product that...

  17. 33 CFR 385.21 - Quality control.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 33 Navigation and Navigable Waters 3 2010-07-01 2010-07-01 false Quality control. 385.21 Section... Processes § 385.21 Quality control. (a) The Corps of Engineers and the non-Federal sponsor shall prepare a quality control plan, in accordance with applicable Corps of Engineers regulations, for each product that...

  18. [Quality control in herbal supplements].

    PubMed

    Oelker, Luisa

    2005-01-01

    Quality and safety of food and herbal supplements are the result of a number of different elements, such as good manufacturing practice and process control. The process control must be active and able to identify and correct all possible hazards. The main and most widely used instrument is the hazard analysis and critical control point (HACCP) system, the correct application of which can guarantee the safety of the product. Herbal supplements need, in addition to standard quality control, a set of checks to assure the harmlessness and safety of the plants used.

  19. Modelling and control for laser based welding processes: modern methods of process control to improve quality of laser-based joining methods

    NASA Astrophysics Data System (ADS)

    Zäh, Ralf-Kilian; Mosbach, Benedikt; Hollwich, Jan; Faupel, Benedikt

    2017-02-01

    To ensure the competitiveness of manufacturing companies, it is indispensable to optimize their manufacturing processes. Slight variations of process parameters and machine settings should have only marginal effects on product quality; therefore, the largest possible processing window is required. Such parameters are, for example, the movement of the laser beam across the component in laser keyhole welding. It is therefore necessary to keep the formation of welding seams within specified limits. Today, the quality of laser welding processes is ensured by post-process methods, such as ultrasonic inspection, or by special in-process methods. These in-process systems achieve only a simple evaluation, which shows whether the weld seam is acceptable or not. Furthermore, they provide no feedback for changing control variables such as the speed of the laser or the adjustment of laser power. In this paper the research group presents current results in the research fields of online monitoring, online control and model predictive control of laser welding processes to increase product quality. To record the characteristics of the welding process, tested online methods are used during the process. Based on the measurement data, a state-space model is identified which includes all the control variables of the system. Using simulation tools, a model predictive controller (MPC) is designed for the model and integrated into an NI real-time system.
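
The receding-horizon loop at the heart of model predictive control can be sketched without any welding-specific detail. The scalar plant model, cost weights and input grid below are illustrative assumptions, not the state-space model identified in the paper.

```python
import itertools

# Assumed scalar process x[k+1] = A*x[k] + B*u[k]; in the paper's setting
# x would be a weld characteristic and u a control variable such as
# laser power, but the numbers here are purely illustrative.
A, B = 0.9, 0.5
U_GRID = [round(i * 0.1, 1) for i in range(-10, 11)]  # discretized inputs
HORIZON = 3
R = 0.05  # control-effort weight in the quadratic cost

def mpc_step(x, ref):
    """Enumerate short input sequences, score each with a quadratic
    tracking cost over the horizon, and apply only the first input of
    the cheapest sequence (the receding-horizon principle)."""
    best_u, best_cost = 0.0, float("inf")
    for seq in itertools.product(U_GRID, repeat=HORIZON):
        xp, cost = x, 0.0
        for u in seq:
            xp = A * xp + B * u
            cost += (xp - ref) ** 2 + R * u ** 2
        if cost < best_cost:
            best_cost, best_u = cost, seq[0]
    return best_u

x, ref = 0.0, 2.0
for _ in range(30):
    x = A * x + B * mpc_step(x, ref)
print(round(x, 2))  # the controlled state settles near the reference
```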

  20. An introduction to statistical process control in research proteomics.

    PubMed

    Bramwell, David

    2013-12-16

    Statistical process control is a well-established and respected method which provides a general-purpose, consistent framework for monitoring and improving the quality of a process. It is routinely used in many industries where the quality of final products is critical and is often required in clinical diagnostic laboratories [1,2]. To date, the methodology has been little utilised in research proteomics. It has been shown to be capable of delivering quantitative QC procedures for qualitative clinical assays [3], making it an ideal methodology to apply to this area of biological research. To introduce statistical process control as an objective strategy for quality control and show how it could be used to benefit proteomics researchers and enhance the quality of the results they generate. We demonstrate that rules which provide basic quality control are easy to derive and implement and could have a major impact on data quality for many studies. Statistical process control is a powerful tool for investigating and improving proteomics research work-flows. The process of characterising measurement systems and defining control rules forces the exploration of key questions that can lead to significant improvements in performance. This work asserts that QC is essential to proteomics discovery experiments. Every experimenter must know the current capabilities of their measurement system and have an objective means for tracking and ensuring that performance. Proteomic analysis work-flows are complicated and multi-variate. QC is critical for clinical chemistry measurements and huge strides have been made in ensuring the quality and validity of results in clinical biochemistry labs. This work introduces some of these QC concepts and works to bridge their use from single analyte QC to applications in multi-analyte systems. This article is part of a Special Issue entitled: Standardization and Quality Control in Proteomics. Copyright © 2013 The Author. 
Published by Elsevier B.V. All rights reserved.
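
Control rules of the kind described are indeed easy to derive and implement. A minimal sketch for a single monitored metric (say, the peak area of a spiked standard tracked run-to-run), using two classic Shewhart-style rules; the baseline figures are invented.

```python
import statistics

def control_flags(history, new_points, n_run=8):
    """Flag QC violations against limits derived from in-control history:
      '3s'  - point beyond mean +/- 3 standard deviations
      'run' - n_run consecutive points on the same side of the mean"""
    mu = statistics.mean(history)
    sd = statistics.stdev(history)
    flags, side_run, last_side = [], 0, 0
    for x in new_points:
        rule = "3s" if abs(x - mu) > 3 * sd else None
        side = 1 if x > mu else (-1 if x < mu else 0)
        side_run = side_run + 1 if side == last_side and side != 0 else 1
        last_side = side
        if rule is None and side != 0 and side_run >= n_run:
            rule = "run"
        flags.append(rule)
    return flags

baseline = [100, 102, 98, 101, 99, 100, 103, 97, 100, 101]
flags = control_flags(baseline, [101, 120, 99, 100, 98])
print(flags)  # the 120 violates the 3-sigma rule
```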

  1. Measuring and improving the quality of postoperative epidural analgesia for major abdominal surgery using statistical process control charts.

    PubMed

    Duncan, Fiona; Haigh, Carol

    2013-10-01

    To explore and improve the quality of continuous epidural analgesia for pain relief using Statistical Process Control tools. Measuring the quality of pain management interventions is complex. Intermittent audits do not accurately capture the results of quality improvement initiatives. The failure rate for one intervention, epidural analgesia, is approximately 30% in everyday practice, so it is an important area for improvement. Continuous measurement and analysis are required to understand the multiple factors involved in providing effective pain relief. Process control and quality improvement. Routine, prospectively acquired data collection started in 2006. Patients were asked about their pain and side effects of treatment. Statistical Process Control methods were applied for continuous data analysis. A multidisciplinary group worked together to identify reasons for variation in the data and instigated ideas for improvement. The key measure for improvement was a reduction in the percentage of patients with an epidural in severe pain. The baseline control charts illustrated the recorded variation in the rate of several processes and outcomes for 293 surgical patients. The mean pain score (VNRS) was four. There was no special cause variation when data were stratified by surgeons, clinical area or patients who had experienced pain before surgery. Fifty-seven per cent of patients were hypotensive on the first day after surgery. We were able to demonstrate a significant improvement in the failure rate of epidurals as the project continued with quality improvement interventions. Statistical Process Control is a useful tool for measuring and improving the quality of pain management. The applications of Statistical Process Control methods offer the potential to learn more about the process of change and outcomes in an Acute Pain Service both locally and nationally. 
We have been able to develop measures for improvement and benchmarking in routine care that has led to the establishment of a national pain registry. © 2013 Blackwell Publishing Ltd.
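
The key measure, the percentage of patients with an epidural in severe pain, is naturally tracked on a p-chart (a control chart for proportions). A minimal sketch with invented audit figures, not the study's data:

```python
import math

def p_chart_limits(failures, totals):
    """Center line and 3-sigma control limits for a p-chart; the
    limits widen for periods with smaller audit samples."""
    pbar = sum(failures) / sum(totals)
    limits = []
    for n in totals:
        s = math.sqrt(pbar * (1 - pbar) / n)
        limits.append((max(0.0, pbar - 3 * s), min(1.0, pbar + 3 * s)))
    return pbar, limits

failures = [9, 7, 12, 6, 8]    # patients in severe pain (made-up figures)
totals = [30, 28, 31, 29, 30]  # epidurals audited per period
pbar, limits = p_chart_limits(failures, totals)
signals = [not (lo <= f / n <= hi)
           for f, n, (lo, hi) in zip(failures, totals, limits)]
print(round(pbar, 3), signals)  # only common-cause variation here
```

Points outside the limits would indicate special-cause variation worth investigating; points inside reflect the process's ordinary variability.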

  2. Near infrared spectroscopy based monitoring of extraction processes of raw material with the help of dynamic predictive modeling

    NASA Astrophysics Data System (ADS)

    Wang, Haixia; Suo, Tongchuan; Wu, Xiaolin; Zhang, Yue; Wang, Chunhua; Yu, Heshui; Li, Zheng

    2018-03-01

    The control of batch-to-batch quality variations remains a challenging task for pharmaceutical industries, e.g., traditional Chinese medicine (TCM) manufacturing. One difficult problem is to produce pharmaceutical products with consistent quality from raw material with large quality variations. In this paper, an integrated methodology combining near infrared spectroscopy (NIRS) and dynamic predictive modeling is developed for the monitoring and control of the batch extraction process of licorice. With the spectral data in hand, the initial state of the process is first estimated with a state-space model to construct a process monitoring strategy for the early detection of variations induced by the initial process inputs such as raw materials. Secondly, the quality property of the end product is predicted at mid-course during the extraction process with a partial least squares (PLS) model. The batch-end-time (BET) is then adjusted accordingly to minimize the quality variations. In conclusion, our study shows that with the help of dynamic predictive modeling, NIRS can offer past and future information about the process, which enables more accurate monitoring and control of process performance and product quality.
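
The PLS step can be sketched compactly. Below is a one-component PLS1 fit via the NIPALS algorithm, numpy-only and run on synthetic data standing in for centered NIR spectra; it is an illustration of the technique, not the authors' calibration.

```python
import numpy as np

def pls1_fit(Xc, yc):
    """One-component PLS1 regression (NIPALS form) on mean-centered
    data. Returns the regression vector b with yhat = Xc @ b."""
    w = Xc.T @ yc
    w = w / np.linalg.norm(w)   # weight vector
    t = Xc @ w                  # scores
    tt = t @ t
    p = Xc.T @ t / tt           # X loadings
    q = (yc @ t) / tt           # y loading
    return w * (q / (p @ w))    # b = W (P'W)^-1 q for one component

# Synthetic "spectra": 200 samples, 10 channels, one informative channel.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 10))
true_b = np.zeros(10)
true_b[2] = 1.5
y = X @ true_b + rng.normal(0, 0.05, 200)
Xc, yc = X - X.mean(axis=0), y - y.mean()
b = pls1_fit(Xc, yc)
pred = Xc @ b + y.mean()
r = float(np.corrcoef(pred, y)[0, 1])
print(round(r, 3))  # close to 1 on this synthetic data
```

Real NIR calibrations would use several components and cross-validation to pick their number; the one-component form just shows the core computation.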

  3. 42 CFR 431.834 - Access to records: Claims processing assessment systems.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... ADMINISTRATION Quality Control Medicaid Quality Control (mqc) Claims Processing Assessment System § 431.834... to which the State has access, including information available under part 435, subpart J, of this...

  4. 42 CFR 431.834 - Access to records: Claims processing assessment systems.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... ADMINISTRATION Quality Control Medicaid Quality Control (mqc) Claims Processing Assessment System § 431.834... to which the State has access, including information available under part 435, subpart J, of this...

  5. 42 CFR 431.832 - Reporting requirements for claims processing assessment systems.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... assessment systems. 431.832 Section 431.832 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES... GENERAL ADMINISTRATION Quality Control Medicaid Quality Control (mqc) Claims Processing Assessment System § 431.832 Reporting requirements for claims processing assessment systems. (a) The agency must submit...

  6. Adopting Quality Assurance Technology in Customer-Vendor Relationships: A Case Study of How Interorganizational Relationships Influence the Process

    NASA Astrophysics Data System (ADS)

    Heeager, Lise Tordrup; Tjørnehøj, Gitte

    Quality assurance technology is a formal control mechanism aiming at increasing the quality of the product exchanged between vendors and customers. Studies of the adoption of this technology in the field of system development rarely focus on the role of the relationship between the customer and vendor in the process. We have studied how the process of adopting quality assurance technology by a small Danish IT vendor developing pharmacy software for a customer in the public sector was influenced by the relationship with the customer. The case study showed that the adoption process was shaped to a high degree by the relationship and vice versa. The prior high level of trust and mutual knowledge helped the parties negotiate mutually feasible solutions throughout the adoption process. We thus advise enhancing trust-building processes to strengthen the relationships and to balance formal control and social control to increase the likelihood of a successful outcome of the adoption of quality assurance technology in a customer-vendor relationship.

  7. 75 FR 17942 - Notice of Proposed Information Collection for Public Comment on the Quality Control for Rental...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-08

    ... Information Collection for Public Comment on the Quality Control for Rental Assistance Subsidy Determinations... respondent burden (e.g., permitting electronic submission of responses). Title of Proposal: Quality Control... covered by the Public Housing and Section 8 housing subsidies. The Quality Control process involves...

  8. Parametric Optimization Of Gas Metal Arc Welding Process By Using Grey Based Taguchi Method On Aisi 409 Ferritic Stainless Steel

    NASA Astrophysics Data System (ADS)

    Ghosh, Nabendu; Kumar, Pradip; Nandi, Goutam

    2016-10-01

    Welding input process parameters play a very significant role in determining the quality of the welded joint. Only by properly controlling every element of the process can product quality be controlled. For better quality in MIG welding of ferritic stainless steel AISI 409, precise control of process parameters, parametric optimization of the process parameters, prediction and control of the desired responses (quality indices), and continued, elaborate experiments, analysis and modeling are needed. A knowledge base may thus be generated that practicing engineers and technicians can use to produce good-quality welds more precisely, reliably and predictively. In the present work, X-ray radiographic testing was conducted to detect surface and sub-surface defects in weld specimens made of ferritic stainless steel. The quality of the weld was evaluated in terms of yield strength, ultimate tensile strength and percentage elongation of the welded specimens. The observed data were interpreted, discussed and analyzed by considering ultimate tensile strength, yield strength and percentage elongation combined, using the Grey-Taguchi methodology.
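
The grey relational step of the Grey-Taguchi method combines the separate quality responses into a single grade per experimental run, so runs can be ranked on one scale. A minimal sketch with invented response values, not the study's measurements:

```python
def grey_relational_grades(runs, larger_better, zeta=0.5):
    """Grey relational analysis sketch: normalize each response,
    compute grey relational coefficients against the ideal sequence,
    and average them into one grade per run."""
    cols = list(zip(*runs))
    norm_cols = []
    for col, lb in zip(cols, larger_better):
        lo, hi = min(col), max(col)
        norm_cols.append([(x - lo) / (hi - lo) if lb else (hi - x) / (hi - lo)
                          for x in col])
    grades = []
    for row in zip(*norm_cols):
        # deviation from the ideal (normalized value 1), zeta = 0.5 typical
        coeffs = [zeta / (abs(1 - v) + zeta) for v in row]
        grades.append(sum(coeffs) / len(coeffs))
    return grades

# Per run: (UTS in MPa, yield strength in MPa, elongation in %),
# all treated as larger-the-better; values are invented.
runs = [(430, 300, 28), (455, 320, 25), (470, 335, 22), (445, 310, 30)]
grades = grey_relational_grades(runs, [True, True, True])
best = grades.index(max(grades))
print(best, [round(g, 3) for g in grades])
```

The run with the highest grade is the best compromise across all responses; standard Taguchi analysis of the grades then identifies the most influential parameters.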

  9. 21 CFR 211.100 - Written procedures; deviations.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... Process Controls § 211.100 Written procedures; deviations. (a) There shall be written procedures for production and process control designed to assure that the drug products have the identity, strength, quality... approved by the appropriate organizational units and reviewed and approved by the quality control unit. (b...

  10. 21 CFR 211.100 - Written procedures; deviations.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... Process Controls § 211.100 Written procedures; deviations. (a) There shall be written procedures for production and process control designed to assure that the drug products have the identity, strength, quality... approved by the appropriate organizational units and reviewed and approved by the quality control unit. (b...

  11. 21 CFR 211.100 - Written procedures; deviations.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... Process Controls § 211.100 Written procedures; deviations. (a) There shall be written procedures for production and process control designed to assure that the drug products have the identity, strength, quality... approved by the appropriate organizational units and reviewed and approved by the quality control unit. (b...

  12. 21 CFR 211.100 - Written procedures; deviations.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... Process Controls § 211.100 Written procedures; deviations. (a) There shall be written procedures for production and process control designed to assure that the drug products have the identity, strength, quality... approved by the appropriate organizational units and reviewed and approved by the quality control unit. (b...

  13. 21 CFR 211.100 - Written procedures; deviations.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... Process Controls § 211.100 Written procedures; deviations. (a) There shall be written procedures for production and process control designed to assure that the drug products have the identity, strength, quality... approved by the appropriate organizational units and reviewed and approved by the quality control unit. (b...

  14. An efficient transmission power control scheme for temperature variation in wireless sensor networks.

    PubMed

    Lee, Jungwook; Chung, Kwangsue

    2011-01-01

    Wireless sensor networks collect data from several nodes dispersed at remote sites. Sensor nodes can be installed in harsh environments such as deserts, cities, and indoors, where the link quality changes considerably over time. In particular, changes in link quality may be caused by temperature, humidity, and other factors. In order to compensate for link quality changes, existing schemes detect the link quality changes between nodes and control transmission power through a series of feedback processes, but these approaches can cause heavy overhead with the additional control packets needed. In this paper, the change of link quality according to temperature is examined through empirical experimentation. A new power control scheme combining both temperature-aware link quality compensation and a closed-loop feedback process to adapt to link quality changes is proposed. We show that the proposed scheme effectively adapts the transmission power to the changing link quality with less control overhead and energy consumption.
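
The combination of open-loop temperature compensation and closed-loop feedback can be sketched as a single power-update rule. All constants, thresholds and units below are illustrative assumptions, not the paper's calibration:

```python
def next_tx_power(power, rssi, target_rssi, temp, prev_temp,
                  comp_per_degc=0.2, step=1, p_min=0, p_max=31):
    """One update of a hybrid power controller (sketch): an open-loop
    term pre-compensates temperature-induced attenuation without any
    extra control packets, and a closed-loop term nudges power only
    when the measured RSSI leaves the target band."""
    power += comp_per_degc * (temp - prev_temp)   # open-loop compensation
    if rssi < target_rssi - 2:                    # closed-loop feedback
        power += step
    elif rssi > target_rssi + 2:
        power -= step
    return max(p_min, min(p_max, round(power)))

# Temperature rises and the RSSI sags below the target band, so the
# node raises its transmission power level.
p = next_tx_power(10, rssi=-92, target_rssi=-87, temp=40, prev_temp=25)
print(p)
```

The open-loop term reacts to a locally measured quantity (temperature), so it costs no radio traffic; feedback packets are only needed when the RSSI band is actually violated.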

  15. 40 CFR 85.2233 - Steady state test equipment calibrations, adjustments, and quality control-EPA 91.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... compensated for automatically and statistical process control demonstrates equal or better quality control... calibrations, adjustments, and quality control-EPA 91. 85.2233 Section 85.2233 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) CONTROL OF AIR POLLUTION FROM MOBILE...

  16. HDP for the Neutralized pH Value Control in the Clarifying Process of Sugar Cane Juice

    NASA Astrophysics Data System (ADS)

    Lin, Xiaofeng; Yang, Jiaran

    2009-05-01

    Neutralizing the pH value of sugar cane juice is an important step in the clarifying process and a key factor influencing the output and quality of white sugar. On the one hand, controlling the neutralized pH value within a required range is of vital significance for acquiring high-quality purified juice, reducing energy consumption and raising sucrose recovery. On the other hand, neutralization is a complicated physico-chemical process characterized by strong non-linearity, time-varying behavior, large time delays and multiple inputs, so there has not been a fully satisfactory solution for controlling the neutralized pH value. In this chapter, a neural network model of the clarifying process of sugar juice is first established from 1200 groups of real-time sample data gathered in a sugar factory. Then, the HDP (heuristic dynamic programming) method is used to optimize and control the neutralized pH value in the clarifying process. Simulation results indicate that this method has a good control effect, laying a foundation for stabilizing the clarifying process and enhancing the quality of the purified juice and, ultimately, of the white sugar.

  17. 42 CFR 431.834 - Access to records: Claims processing assessment systems.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 42 Public Health 4 2013-10-01 2013-10-01 false Access to records: Claims processing assessment systems. 431.834 Section 431.834 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF... ADMINISTRATION Quality Control Medicaid Quality Control (mqc) Claims Processing Assessment System § 431.834...

  18. 42 CFR 431.834 - Access to records: Claims processing assessment systems.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 42 Public Health 4 2011-10-01 2011-10-01 false Access to records: Claims processing assessment systems. 431.834 Section 431.834 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF... ADMINISTRATION Quality Control Medicaid Quality Control (mqc) Claims Processing Assessment System § 431.834...

  19. 42 CFR 431.836 - Corrective action under the MQC claims processing assessment system.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... assessment system. 431.836 Section 431.836 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT... ADMINISTRATION Quality Control Medicaid Quality Control (mqc) Claims Processing Assessment System § 431.836 Corrective action under the MQC claims processing assessment system. The agency must— (a) Take action to...

  20. Statistical Process Control: Going to the Limit for Quality.

    ERIC Educational Resources Information Center

    Training, 1987

    1987-01-01

    Defines the concept of statistical process control, a quality control method used especially in manufacturing. Generally, users of the method set specific standard levels that must be met. Makes the point that although employees work directly with the method, management is responsible for its success within the plant. (CH)
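
The "specific standard levels" in statistical process control are typically Shewhart control limits computed from subgroup statistics. A minimal sketch using the standard A2 constant for subgroups of five; the measurements are made-up process data:

```python
def xbar_limits(subgroups, a2=0.577):
    """X-bar chart center line and 3-sigma control limits from
    subgroup means and ranges; a2 = 0.577 is the standard chart
    constant for subgroups of size 5."""
    means = [sum(s) / len(s) for s in subgroups]
    ranges = [max(s) - min(s) for s in subgroups]
    center = sum(means) / len(means)          # grand mean
    rbar = sum(ranges) / len(ranges)          # average range
    return center - a2 * rbar, center, center + a2 * rbar

subgroups = [
    [10.1, 9.9, 10.0, 10.2, 9.8],
    [10.0, 10.1, 9.9, 10.0, 10.1],
    [9.8, 10.0, 10.2, 9.9, 10.1],
]
lcl, cl, ucl = xbar_limits(subgroups)
print(round(lcl, 3), round(cl, 3), round(ucl, 3))
```

A subgroup mean falling outside (lcl, ucl) signals special-cause variation that the operator, and ultimately management, must act on.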

  1. Assessing the influence of component processing and donor characteristics on quality of red cell concentrates using quality control data.

    PubMed

    Jordan, A; Chen, D; Yi, Q-L; Kanias, T; Gladwin, M T; Acker, J P

    2016-07-01

    Quality control (QC) data collected by blood services are used to monitor production and to ensure compliance with regulatory standards. We demonstrate how analysis of quality control data can be used to highlight the sources of variability within red cell concentrates (RCCs). We merged Canadian Blood Services QC data with manufacturing and donor records for 28 227 RCC between June 2011 and October 2014. Units were categorized based on processing method, bag manufacturer, donor age and donor sex, then assessed based on product characteristics: haemolysis and haemoglobin levels, unit volume, leucocyte count and haematocrit. Buffy-coat method (top/bottom)-processed units exhibited lower haemolysis than units processed using the whole-blood filtration method (top/top). Units from female donors exhibited lower haemolysis than male donations. Processing method influenced unit volume and the ratio of additive solution to residual plasma. Stored red blood cell characteristics are influenced by prestorage processing and donor factors. Understanding the relationship between processing, donors and RCC quality will help blood services to ensure the safety of transfused products. © 2016 International Society of Blood Transfusion.
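
The kind of stratified comparison described, QC metrics grouped by processing method or donor factors, reduces to a grouped mean. A sketch with hypothetical field names and figures, standing in for the merged QC/manufacturing/donor dataset:

```python
from collections import defaultdict
from statistics import mean

def mean_by_group(records, key_field, value_field):
    """Average a QC metric after grouping records by a processing or
    donor factor (e.g. mean haemolysis by component-processing method)."""
    groups = defaultdict(list)
    for rec in records:
        groups[rec[key_field]].append(rec[value_field])
    return {k: mean(v) for k, v in groups.items()}

# Hypothetical per-unit QC records; values are invented.
records = [
    {"method": "buffy-coat", "donor_sex": "F", "hemolysis_pct": 0.21},
    {"method": "buffy-coat", "donor_sex": "M", "hemolysis_pct": 0.28},
    {"method": "whole-blood", "donor_sex": "F", "hemolysis_pct": 0.30},
    {"method": "whole-blood", "donor_sex": "M", "hemolysis_pct": 0.38},
]
by_method = mean_by_group(records, "method", "hemolysis_pct")
print({k: round(v, 3) for k, v in sorted(by_method.items())})
```

The same helper applied with `"donor_sex"` as the key gives the donor-factor comparison; a real analysis would of course add significance testing on top of the group means.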

  2. Statistical process management: An essential element of quality improvement

    NASA Astrophysics Data System (ADS)

    Buckner, M. R.

    Successful quality improvement requires a balanced program involving the three elements that control quality: organization, people and technology. The focus of the SPC/SPM User's Group is to advance the technology component of Total Quality by networking within the Group and by providing an outreach within Westinghouse to foster the appropriate use of statistical techniques to achieve Total Quality. SPM encompasses the disciplines by which a process is measured against its intrinsic design capability, in the face of measurement noise and other obscuring variability. SPM tools facilitate decisions about the process that generated the data. SPM deals typically with manufacturing processes, but with some flexibility of definition and technique it accommodates many administrative processes as well. The techniques of SPM are those of Statistical Process Control, Statistical Quality Control, Measurement Control, and Experimental Design. In addition, techniques such as job and task analysis, and concurrent engineering are important elements of systematic planning and analysis that are needed early in the design process to ensure success. The SPC/SPM User's Group is endeavoring to achieve its objectives by sharing successes that have occurred within the members' own Westinghouse departments as well as within other US and foreign industry. In addition, failures are reviewed to establish lessons learned in order to improve future applications. In broader terms, the Group is interested in making SPM the accepted way of doing business within Westinghouse.

  3. Real-time product attribute control to manufacture antibodies with defined N-linked glycan levels.

    PubMed

    Zupke, Craig; Brady, Lowell J; Slade, Peter G; Clark, Philip; Caspary, R Guy; Livingston, Brittney; Taylor, Lisa; Bigham, Kyle; Morris, Arvia E; Bailey, Robert W

    2015-01-01

    Pressures for cost-effective new therapies and an increased emphasis on emerging markets require technological advancements and a flexible future manufacturing network for the production of biologic medicines. The safety and efficacy of a product is crucial, and consistent product quality is an essential feature of any therapeutic manufacturing process. The active control of product quality in a typical biologic process is challenging because of measurement lags and nonlinearities present in the system. The current study uses nonlinear model predictive control to maintain a critical product quality attribute at a predetermined value during pilot scale manufacturing operations. This approach to product quality control ensures a more consistent product for patients, enables greater manufacturing efficiency, and eliminates the need for extensive process characterization by providing direct measures of critical product quality attributes for real time release of drug product. © 2015 American Institute of Chemical Engineers.

  4. [Discussion on research and development of new traditional Chinese medicine preparation process based on idea of QbD].

    PubMed

    Feng, Yi; Hong, Yan-Long; Xian, Jie-Chen; Du, Ruo-Fei; Zhao, Li-Jie; Shen, Lan

    2014-09-01

    Traditional processes are mostly adopted in traditional Chinese medicine (TCM) preparation production, and product quality is mostly controlled by end-product testing. Potential problems in the production process are unpredictable and, in most cases, are handled by relying on experience. Therefore, it is hard to find the key points affecting the preparation process and quality control. A pattern for the research and development of TCM preparation processes based on the idea of Quality by Design (QbD) was proposed after introducing the latest research achievements. Basic theories of micromeritics and rheology were used to characterize the physical properties of TCM raw materials. The TCM preparation process was designed in a more scientific and rational way by studying the correlation among the physical properties of the raw material, the preparation process and the product quality of the preparation. In this way, the factors affecting the quality of TCM production can be identified, and problems that might occur in the pilot process can be predicted. This would be a foundation for the R&D and production of TCM preparations, as well as support for gradually realizing "process control" of TCM in the future.

  5. Quality management of manufacturing process based on manufacturing execution system

    NASA Astrophysics Data System (ADS)

    Zhang, Jian; Jiang, Yang; Jiang, Weizhuo

    2017-04-01

    Quality control elements in the manufacturing process are elaborated, and an approach to quality management of the manufacturing process based on a manufacturing execution system (MES) is discussed. The functions of the MES for a microcircuit production line are introduced in conclusion.

  6. A multiple objective optimization approach to quality control

    NASA Technical Reports Server (NTRS)

    Seaman, Christopher Michael

    1991-01-01

    The use of product quality as the performance criterion for manufacturing system control is explored. The goal in manufacturing, for economic reasons, is to optimize product quality. The problem is that since quality is a rather nebulous product characteristic, there is seldom an analytic function that can be used as a measure; therefore standard control approaches, such as optimal control, cannot readily be applied. A second problem with optimizing product quality is that it is typically measured along many dimensions: there are many aspects of quality which must be optimized simultaneously. Very often these different aspects are incommensurate and competing. The concept of optimality must now include accepting tradeoffs among the different quality characteristics. These problems are addressed using multiple objective optimization. It is shown that the quality control problem can be defined as a multiple objective optimization problem, and a controller structure is defined on this basis. Then, an algorithm is presented which an operator can use to interactively find the best operating point. Essentially, the algorithm uses process data to provide the operator with two pieces of information: (1) if it is possible to simultaneously improve all quality criteria, determine what changes to the process inputs or controller parameters should be made to do this; and (2) if it is not possible to improve all criteria, and the current operating point is not a desirable one, select a criterion in which a tradeoff should be made, and make input changes to improve all other criteria. If no tradeoff has to be made to move to a new operating point, the process is not operating at an optimal point in any sense. This algorithm ensures that operating points are optimal in some sense and provides the operator with information about tradeoffs when seeking the best operating point. 
The multiobjective algorithm was implemented in two different injection molding scenarios: tuning of process controllers to meet specified performance objectives and tuning of process inputs to meet specified quality objectives. Five case studies are presented.
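The algorithm's first question, whether all quality criteria can still be improved simultaneously, can be illustrated with a minimal sketch. The two quadratic quality criteria and the random-direction search below are illustrative assumptions, not the injection-molding models from the paper:

```python
import math
import random

# Two hypothetical quality criteria (higher is better), functions of two
# process inputs x = (x1, x2). Purely illustrative stand-ins.
def q1(x):
    return -(x[0] - 1.0) ** 2 - (x[1] - 2.0) ** 2

def q2(x):
    return -(x[0] - 3.0) ** 2 - (x[1] - 2.0) ** 2

def improving_direction(x, step=1e-3, tries=2000, seed=0):
    """Search for a direction that improves ALL criteria at once.
    Returns a unit direction, or None if x appears Pareto-optimal,
    i.e. any further gain requires a tradeoff."""
    rng = random.Random(seed)
    base = (q1(x), q2(x))
    for _ in range(tries):
        angle = rng.uniform(0.0, 2.0 * math.pi)
        d = (math.cos(angle), math.sin(angle))
        trial = (x[0] + step * d[0], x[1] + step * d[1])
        if q1(trial) > base[0] and q2(trial) > base[1]:
            return d
    return None

# At x = (0, 0) both criteria can still be improved together;
# at x = (2, 2), between the two optima, any move trades one off.
print(improving_direction((0.0, 0.0)) is not None)  # True
print(improving_direction((2.0, 2.0)))              # None
```

When the search returns a direction, the operator can move along it (case 1 above); when it returns None, the point is Pareto-optimal and only tradeoffs remain (case 2).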

  7. Multicenter Cell Processing for Cardiovascular Regenerative Medicine Applications - The Cardiovascular Cell Therapy Research Network (CCTRN) Experience

    PubMed Central

    Gee, Adrian P.; Richman, Sara; Durett, April; McKenna, David; Traverse, Jay; Henry, Timothy; Fisk, Diann; Pepine, Carl; Bloom, Jeannette; Willerson, James; Prater, Karen; Zhao, David; Koç, Jane Reese; Ellis, Steven; Taylor, Doris; Cogle, Christopher; Moyé, Lemuel; Simari, Robert; Skarlatos, Sonia

    2013-01-01

Background Aims: Multi-center cellular therapy clinical trials require the establishment and implementation of standardized cell processing protocols and associated quality control mechanisms. The aims here were to develop such an infrastructure in support of the Cardiovascular Cell Therapy Research Network (CCTRN) and to report on the results of processing for the first 60 patients. Methods: Standardized cell preparations, consisting of autologous bone marrow mononuclear cells prepared using the Sepax device, were manufactured at each of the five processing facilities that supported the clinical treatment centers. Processing staff underwent centralized training that included proficiency evaluation. Quality was subsequently monitored by a central quality control program that included product evaluation by the CCTRN biorepositories. Results: Data from the first 60 procedures demonstrate that uniform products that met all release criteria could be manufactured at all five sites within 7 hours of receipt of the bone marrow. Uniformity was facilitated by use of automated systems (the Sepax for processing and the Endosafe device for endotoxin testing), standardized procedures, and centralized quality control. Conclusions: Complex multicenter cell therapy and regenerative medicine protocols can, where necessary, successfully utilize local processing facilities once an effective infrastructure is in place to provide training and quality control. PMID:20524773

  8. Quality data collection and management technology of aerospace complex product assembly process

    NASA Astrophysics Data System (ADS)

    Weng, Gang; Liu, Jianhua; He, Yongxi; Zhuang, Cunbo

    2017-04-01

Aiming at the problems of difficult management and poor traceability of quality data in discrete assembly processes, a data collection and management method is proposed that takes the assembly process and the BOM as its core. The method comprises a data collection approach based on workflow technology, a data model based on the BOM, and quality traceability of the assembly process. Finally, an assembly process quality data management system was developed, realizing effective control and management of quality information for the complex product assembly process.

  9. Measuring the quality of therapeutic apheresis care in the pediatric intensive care unit.

    PubMed

    Sussmane, Jeffrey B; Torbati, Dan; Gitlow, Howard S

    2012-01-01

Our goal was to measure the quality of care provided in the Pediatric Intensive Care Unit (PICU) during Therapeutic Apheresis (TA). We described the care as a step-by-step process and designed a flow chart to carefully document each step. We then defined each step with a unique clinical indicator (CI) that represented the exact task we felt provided quality care. These CIs were studied and modified for 1 year. We measured our performance in this process as the number of times we accomplished the CI versus the total number of CIs that were to be performed. The degree of compliance with these clinical indicators was analyzed and used as a metric for quality, by calculating how closely the process ran exactly as planned, or "in control." The Apheresis Process was in control (compliance) for 47% of the indicators, as measured in the aggregate for the first observational year. We then applied the theory of Total Quality Management (TQM) through our Define, Measure, Analyze, Improve, and Control (DMAIC) model. We were able to improve the process and bring it into control by increasing compliance to >99.74%, in the aggregate, for the third and fourth quarters of the second year. We have implemented TQM to increase compliance, and thus control, of a highly complex and multidisciplinary Pediatric Intensive Care therapy. We have shown a reproducible and scalable measure of quality for a complex clinical process in the PICU, without additional capital expenditure. Copyright © 2011 Wiley-Liss, Inc.

  10. Application of Sigma Metrics Analysis for the Assessment and Modification of Quality Control Program in the Clinical Chemistry Laboratory of a Tertiary Care Hospital.

    PubMed

    Iqbal, Sahar; Mustansar, Tazeen

    2017-03-01

Sigma is a metric that quantifies the performance of a process as a rate of defects per million opportunities. In clinical laboratories, sigma metric analysis is used to assess the performance of the laboratory process system. The sigma metric is also used as a quality management strategy for a laboratory process, improving quality by addressing errors after identification. The aim of this study is to evaluate the errors in quality control of the analytical phase of the laboratory system by sigma metric. For this purpose, sigma metric analysis was done for analytes using internal and external quality control as quality indicators, and the results were used to identify gaps and the need for modification in the strategy of the laboratory quality control procedure. The sigma metric was calculated for the quality control program of ten clinical chemistry analytes, including glucose, chloride, cholesterol, triglyceride, HDL, albumin, direct bilirubin, total bilirubin, protein and creatinine, at two control levels. To calculate the sigma metric, imprecision and bias were calculated from internal and external quality control data, respectively. The minimum acceptable performance was set at 3 sigma. Westgard sigma rules were applied to customize the quality control procedure. The sigma level was found acceptable (≥3) for glucose (L2), cholesterol, triglyceride, HDL, direct bilirubin and creatinine at both levels of control. For the rest of the analytes the sigma metric was found to be <3. The lowest sigma value was found for chloride (1.1) at L2, and the highest for creatinine (10.1) at L3. HDL had the highest sigma values at both control levels (8.8 and 8.0 at L2 and L3, respectively). We conclude that analytes with a sigma value <3 require strict monitoring and modification of the quality control procedure. In this study, application of the sigma rules provided a practical solution for an improved and focused design of the QC procedure.
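The sigma calculation the authors describe combines allowable error, bias and imprecision. A minimal sketch, assuming the commonly used laboratory formula sigma = (TEa - |bias|) / CV with all quantities in percent; the example numbers are hypothetical, not the study's data:

```python
def sigma_metric(tea_pct, bias_pct, cv_pct):
    """Sigma metric as commonly defined for laboratory QC:
    sigma = (TEa - |bias|) / CV, all expressed in percent.
    TEa: total allowable error; bias from external QA; CV from internal QC."""
    return (tea_pct - abs(bias_pct)) / cv_pct

# Hypothetical analyte: TEa 10%, bias 2%, CV 2%  ->  sigma 4.0
s = sigma_metric(10.0, 2.0, 2.0)
print(round(s, 1))                                       # 4.0
print("acceptable" if s >= 3 else "needs stricter QC")   # acceptable
```

An analyte scoring below 3 on this scale would, as in the study, call for stricter monitoring and a modified QC procedure.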

  11. Quality Control Barriers in Adapting "Metro-Centric" Education to Regional Needs

    ERIC Educational Resources Information Center

    Nagy, Judy; Robinson, Susan R.

    2013-01-01

    The massification and globalization of higher education, combined with the widespread adoption of processes underpinning accreditation and quality control of university programs, have tended to result in learning contexts that are increasingly narrowly conceived and tightly controlled. Underlying many quality control measures is a "one size…

  12. The remote supervisory and controlling experiment system of traditional Chinese medicine production based on Fieldbus

    NASA Astrophysics Data System (ADS)

    Zhan, Jinliang; Lu, Pei

    2006-11-01

Since the quality of traditional Chinese medicine products is affected by raw materials, machining and many other factors, it is difficult for the traditional Chinese medicine production process, especially the extracting process, to ensure steady and homogeneous quality. At the same time, there exist some quality control blind spots due to the lack of on-line quality detection means. If infrared spectrum analysis technology were used in the traditional Chinese medicine production process, on the basis of off-line analysis, to detect the quality of semi-manufactured goods in real time, assisted by advanced automatic control techniques, steady and homogeneous quality could be obtained. It can be seen that on-line detection of the extracting process plays an important role in the development of the Chinese patent medicine industry. In this paper, the design and implementation of a traditional Chinese medicine extracting process monitoring experiment system based on the PROFIBUS-DP field bus, OPC, and Internet technology is introduced. The system integrates intelligent nodes that gather data with a supervisory sub-system that provides graphical configuration and remote supervision; during the traditional Chinese medicine production process it monitors temperature, pressure, quality and other parameters, and it can be controlled by remote nodes in the VPN (Virtual Private Network). Experiments and applications have proved that the system fully achieves the anticipated effect, with the merits of operational stability, real-time reliable behaviour, and convenient, simple manipulation.

  13. Integrated Application of Quality-by-Design Principles to Drug Product Development: A Case Study of Brivanib Alaninate Film-Coated Tablets.

    PubMed

    Badawy, Sherif I F; Narang, Ajit S; LaMarche, Keirnan R; Subramanian, Ganeshkumar A; Varia, Sailesh A; Lin, Judy; Stevens, Tim; Shah, Pankaj A

    2016-01-01

Modern drug product development is expected to follow the quality-by-design (QbD) paradigm. At the same time, although there are several issue-specific examples in the literature that demonstrate the application of QbD principles, a holistic demonstration of the application of QbD principles to drug product development and control strategy is lacking. This article provides an integrated case study on the systematic application of QbD to product development and demonstrates the implementation of QbD concepts in the different aspects of product and process design for brivanib alaninate film-coated tablets. Using a risk-based approach, the strategy for development entailed identification of product critical quality attributes (CQAs), assessment of risks to the CQAs, and performing experiments to understand and mitigate identified risks. Quality risk assessments and design of experiments were performed to understand the quality of the input raw materials required for a robust formulation and the impact of manufacturing process parameters on CQAs. In addition to the material property and process parameter controls, the proposed control strategy includes use of process analytical technology and conventional analytical tests to control in-process material attributes and ensure quality of the final product. Copyright © 2016. Published by Elsevier Inc.

  14. Severe postpartum haemorrhage after vaginal delivery: a statistical process control chart to report seven years of continuous quality improvement.

    PubMed

    Dupont, Corinne; Occelli, Pauline; Deneux-Tharaux, Catherine; Touzet, Sandrine; Duclos, Antoine; Bouvier-Colle, Marie-Hélène; Rudigoz, René-Charles; Huissoud, Cyril

    2014-07-01

To use statistical process control charts to describe trends in the prevalence of severe postpartum haemorrhage after vaginal delivery. This assessment was performed 7 years after we initiated a continuous quality improvement programme that began with regular criteria-based audits. Observational descriptive study, in a French maternity unit in the Rhône-Alpes region. Quarterly clinical audit meetings to analyse all cases of severe postpartum haemorrhage after vaginal delivery and provide feedback on quality of care with statistical process control tools. The primary outcomes were the prevalence of severe PPH after vaginal delivery and its quarterly monitoring with a control chart. The secondary outcomes included the global quality of care for women with severe postpartum haemorrhage, including the performance rate of each recommended procedure. Differences in these variables between 2005 and 2012 were tested. From 2005 to 2012, the prevalence of severe postpartum haemorrhage declined significantly, from 1.2% to 0.6% of vaginal deliveries (p<0.001). Since 2010, the quarterly rate of severe PPH has not exceeded the upper control limit, that is, has not been out of statistical control. The proportion of cases that were managed consistently with the guidelines increased for all of their main components. Implementation of continuous quality improvement efforts began seven years ago and used, among other tools, statistical process control charts. During this period, the prevalence of severe postpartum haemorrhage after vaginal delivery was reduced by 50%. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
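Quarterly monitoring of a prevalence against control limits, as described above, can be sketched with standard 3-sigma p-chart limits for a proportion. The quarter size and rates below are hypothetical, not the study's data:

```python
import math

def p_chart_limits(p_bar, n):
    """3-sigma control limits for a proportion (p-chart), used to judge
    whether a quarterly severe-PPH rate is within common-cause variation.
    p_bar: baseline prevalence; n: vaginal deliveries in the quarter."""
    sd = math.sqrt(p_bar * (1.0 - p_bar) / n)
    return max(0.0, p_bar - 3.0 * sd), p_bar + 3.0 * sd

# Hypothetical quarter: baseline prevalence 1.2%, 900 deliveries.
lcl, ucl = p_chart_limits(0.012, 900)
quarter_rate = 0.006
print(lcl <= quarter_rate <= ucl)  # True: within the control limits
```

A quarterly rate above the upper limit would signal special-cause variation and trigger the audit feedback loop; a sustained fall below the centre line is the improvement the programme aims for.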

  15. Statistical process control: separating signal from noise in emergency department operations.

    PubMed

    Pimentel, Laura; Barrueto, Fermin

    2015-05-01

    Statistical process control (SPC) is a visually appealing and statistically rigorous methodology very suitable to the analysis of emergency department (ED) operations. We demonstrate that the control chart is the primary tool of SPC; it is constructed by plotting data measuring the key quality indicators of operational processes in rationally ordered subgroups such as units of time. Control limits are calculated using formulas reflecting the variation in the data points from one another and from the mean. SPC allows managers to determine whether operational processes are controlled and predictable. We review why the moving range chart is most appropriate for use in the complex ED milieu, how to apply SPC to ED operations, and how to determine when performance improvement is needed. SPC is an excellent tool for operational analysis and quality improvement for these reasons: 1) control charts make large data sets intuitively coherent by integrating statistical and visual descriptions; 2) SPC provides analysis of process stability and capability rather than simple comparison with a benchmark; 3) SPC allows distinction between special cause variation (signal), indicating an unstable process requiring action, and common cause variation (noise), reflecting a stable process; and 4) SPC keeps the focus of quality improvement on process rather than individual performance. Because data have no meaning apart from their context, and every process generates information that can be used to improve it, we contend that SPC should be seriously considered for driving quality improvement in emergency medicine. Copyright © 2015 Elsevier Inc. All rights reserved.
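The moving range chart the authors recommend can be sketched with the standard XmR (individuals and moving range) constants. The daily door-to-doctor times below are hypothetical, not ED data from the paper:

```python
def xmr_limits(values):
    """Individuals and moving-range (XmR) chart limits, the chart type
    recommended for ED operations data. Uses the standard constants
    2.66 (= 3/d2) and 3.27 (= D4) for moving ranges of size n = 2."""
    mean = sum(values) / len(values)
    mrs = [abs(b - a) for a, b in zip(values, values[1:])]
    mr_bar = sum(mrs) / len(mrs)
    return {
        "x_ucl": mean + 2.66 * mr_bar,   # individuals chart upper limit
        "x_lcl": mean - 2.66 * mr_bar,   # individuals chart lower limit
        "mr_ucl": 3.27 * mr_bar,         # moving range chart upper limit
    }

# Hypothetical daily median door-to-doctor times (minutes):
times = [32, 35, 31, 38, 34, 33, 36, 30, 34, 37]
limits = xmr_limits(times)
signals = [t for t in times if not limits["x_lcl"] <= t <= limits["x_ucl"]]
print(signals)  # [] -> only common-cause variation (noise), no special cause
```

Points outside the limits would be special-cause variation (signal) requiring action; points inside reflect a stable process, matching the signal/noise distinction drawn above.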

  16. [Statistical process control applied to intensity modulated radiotherapy pretreatment controls with portal dosimetry].

    PubMed

    Villani, N; Gérard, K; Marchesi, V; Huger, S; François, P; Noël, A

    2010-06-01

The first purpose of this study was to illustrate the contribution of statistical process control to better security in intensity-modulated radiotherapy (IMRT) treatments. This improvement is possible by controlling the dose delivery process, characterized by pretreatment quality control results, so it is necessary to put portal dosimetry measurements under control (the ionisation chamber measurements were already monitored with statistical process control tools). The second objective was to determine whether the ionisation chamber can be replaced by portal dosimetry in order to optimize the time devoted to pretreatment quality control. At the Alexis-Vautrin center, pretreatment quality controls in IMRT for prostate and head and neck treatments were performed for each beam of each patient. These controls were made with an ionisation chamber, which is the reference detector for absolute dose measurement, and with portal dosimetry for the verification of dose distribution. Statistical process control is a statistical analysis method, originating in industry, used to control and improve the quality of the studied process. It uses graphical tools such as control charts to follow up the process and warn the operator in case of failure, and quantitative tools to evaluate the ability of the process to respect guidelines: the capability study. The study was performed on 450 head and neck beams and on 100 prostate beams. Control charts of the mean and standard deviation, showing both slow, weak drifts and strong, fast ones, were established and revealed a special cause that had been introduced (a manual shift of the leaf gap of the multileaf collimator). The correlation between the dose measured at one point by the EPID and by the ionisation chamber was evaluated at more than 97%, and cases of disagreement between the two measurements were identified.
The study demonstrated the feasibility of reducing the time devoted to pretreatment controls by substituting EPID measurements for those of the ionisation chamber, and showed that statistical process control monitoring of the data provided a guarantee of security. 2010 Société française de radiothérapie oncologique (SFRO). Published by Elsevier SAS. All rights reserved.
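The capability study mentioned above evaluates whether a process can respect its guidelines. A minimal sketch using the standard Cp/Cpk indices; the dose-deviation values and the ±4% tolerance below are hypothetical, not the study's acceptance criteria:

```python
def capability_indices(values, lsl, usl):
    """Process capability indices used in the capability-study step of
    statistical process control. Cp ignores centering; Cpk penalises a
    mean shifted toward either specification limit."""
    n = len(values)
    mean = sum(values) / n
    sd = (sum((v - mean) ** 2 for v in values) / (n - 1)) ** 0.5
    cp = (usl - lsl) / (6.0 * sd)
    cpk = min(usl - mean, mean - lsl) / (3.0 * sd)
    return cp, cpk

# Hypothetical pretreatment dose deviations (%), tolerance +/- 4%:
devs = [0.5, -0.2, 1.1, 0.3, -0.8, 0.9, 0.0, -0.4, 0.6, 0.2]
cp, cpk = capability_indices(devs, -4.0, 4.0)
print(cp > 1.33, cpk > 1.33)  # True True; 1.33 is a common capability rule of thumb
```

A capable, in-control process by this criterion is what would justify substituting the EPID measurements for the ionisation chamber.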

  17. Statistical quality control through overall vibration analysis

    NASA Astrophysics Data System (ADS)

    Carnero, M. a. Carmen; González-Palma, Rafael; Almorza, David; Mayorga, Pedro; López-Escobar, Carlos

    2010-05-01

The present study introduces the concept of statistical quality control in automotive wheel bearing manufacturing processes. Defects in the products under analysis can have a direct influence on passengers' safety and comfort. At present, the use of vibration analysis on machine tools for quality control purposes is not very extensive in manufacturing facilities. Noise and vibration are common quality problems in bearings. These failure modes are likely to occur under certain operating conditions; they do not require high vibration amplitudes but relate to certain vibration frequencies. The vibration frequencies are affected by the type of surface problems (chattering) of the ball races that are generated through grinding processes. The purpose of this paper is to identify grinding process variables that affect the quality of bearings by using statistical principles in the field of machine tools. In addition, an evaluation of the quality results of the finished parts under different combinations of process variables is assessed. This paper intends to establish the foundations to predict the quality of the products through the analysis of self-induced vibrations during the contact between the grinding wheel and the parts. To achieve this goal, the overall self-induced vibration readings under different combinations of process variables are analysed using statistical tools. The analysis of data and design of experiments follow a classical approach, considering all potential interactions between variables. The analysis of data is conducted through analysis of variance (ANOVA) for data sets that meet normality and homoscedasticity criteria. This paper utilizes different statistical tools to support the conclusions, such as the chi-squared, Shapiro-Wilk, symmetry, kurtosis, Cochran, Bartlett, Hartley and Kruskal-Wallis tests. The analysis presented is the starting point for extending the use of predictive techniques (vibration analysis) for quality control.
This paper demonstrates the existence of predictive variables (high-frequency vibration displacements) that are sensitive to the process setup and to the quality of the products obtained. Based on the results of this overall vibration analysis, a second paper will analyse self-induced vibration spectra in order to define limit vibration bands, controllable every cycle or connected to permanent vibration-monitoring systems able to adjust the sensitive process variables identified by ANOVA once the vibration readings exceed established quality limits.
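The ANOVA step can be sketched by computing the F statistic directly. The vibration readings for the two grinding setups below are hypothetical, not the study's measurements:

```python
def one_way_anova_f(groups):
    """F statistic for a one-way ANOVA: between-group variance over
    within-group variance, the test applied to overall vibration
    readings under different process-variable settings."""
    all_vals = [v for g in groups for v in g]
    n, k = len(all_vals), len(groups)
    grand = sum(all_vals) / n
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum((v - sum(g) / len(g)) ** 2 for g in groups for v in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical overall vibration readings (mm/s) for two grinding setups:
setup_a = [0.41, 0.43, 0.40, 0.44, 0.42]
setup_b = [0.55, 0.53, 0.56, 0.54, 0.57]
f = one_way_anova_f([setup_a, setup_b])
print(f > 5.32)  # True: exceeds the F(1,8) critical value at alpha = 0.05
```

A significant F here means the process variable distinguishing the two setups affects the vibration reading, which is exactly the kind of predictive variable the paper identifies.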

  18. Drop-on-Demand System for Manufacturing of Melt-based Solid Oral Dosage: Effect of Critical Process Parameters on Product Quality.

    PubMed

    Içten, Elçin; Giridhar, Arun; Nagy, Zoltan K; Reklaitis, Gintaras V

    2016-04-01

The features of a drop-on-demand-based system developed for the manufacture of melt-based pharmaceuticals have been previously reported. In this paper, a supervisory control system designed to ensure reproducible production of high-quality melt-based solid oral dosages is presented. This control system enables the production of individual dosage forms with the desired critical quality attributes (amount of active ingredient and drug morphology) by monitoring and controlling critical process parameters, such as drop size and product and process temperatures. The effects of these process parameters on the final product quality are investigated, and the properties of the produced dosage forms are characterized using various techniques, such as Raman spectroscopy, optical microscopy, and dissolution testing. A crystallization temperature control strategy, including controlled temperature cycles, is presented to tailor the crystallization behavior of the drug deposits and to achieve consistent drug morphology. This control strategy can be used to achieve the desired bioavailability of the drug by mitigating variations in the dissolution profiles. The supervisory control strategy enables the application of the drop-on-demand system to the production of the individualized dosages required for personalized drug regimens.

  19. Terms of Productivity, Including the Relationship Between Productivity, Effectiveness and Efficiency.

    DTIC Science & Technology

    1989-04-01

    for Awareness Juran on Planning for Quality, 1988, J.M. Juran What is Total Quality Control? The Japanese Way, 1985, Kaoru Ishikawa Guide to Quality...Control, 1982, Kaoru Ishikawa Andrews, M. (1985). Statistical Process Control: Mandatory Management Tool. Production April 1985. Bushe, G. (1988

  20. Can a combination of average of normals and "real time" External Quality Assurance replace Internal Quality Control?

    PubMed

    Badrick, Tony; Graham, Peter

    2018-03-28

Internal Quality Control and External Quality Assurance are separate but related processes that have developed independently in laboratory medicine over many years. They have different sample frequencies, statistical interpretations and immediacy. Both processes have evolved, absorbing new understandings of the concept of laboratory error, sample material matrix and assay capability. However, at the coalface we do not believe that either process has led to much improvement in patient outcomes recently. It is the increasing reliability and automation of analytical platforms, along with the improved stability of reagents, that has reduced systematic and random error, which in turn has minimised the risk of running IQC less frequently. We suggest that it is time to rethink the role of both these processes and unite them into a single approach using an Average of Normals model supported by more frequent External Quality Assurance samples. This new paradigm may lead to less confusion for laboratory staff and quicker responses to, and identification of, out-of-control situations.
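The Average of Normals model the authors propose can be sketched as a rolling mean over patient results that fall inside the reference interval; a sustained shift of that mean flags systematic analytical error without any dedicated QC sample. The sodium values, reference interval and window size below are hypothetical:

```python
def average_of_normals(results, low, high, window=20):
    """Average of Normals sketch: keep only patient results inside the
    reference interval, then track a rolling mean of the retained
    results; a sustained shift suggests systematic analytical error."""
    normals = [r for r in results if low <= r <= high]
    return [sum(normals[i - window:i]) / window
            for i in range(window, len(normals) + 1)]

# Hypothetical sodium results (mmol/L), reference interval 135-145:
baseline = [138, 140, 142, 139, 141] * 12      # stable assay, mean 140
shifted = [r + 3 for r in baseline]            # simulated +3 mmol/L assay shift
means = average_of_normals(baseline + shifted, 135, 145)
print(means[0], means[-1])  # 140.0 143.0 -> the shift is clearly visible
```

In the combined scheme the authors suggest, a drift in this rolling mean would prompt investigation, with periodic External Quality Assurance samples anchoring the absolute calibration.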

  1. Analysis And Control System For Automated Welding

    NASA Technical Reports Server (NTRS)

    Powell, Bradley W.; Burroughs, Ivan A.; Kennedy, Larry Z.; Rodgers, Michael H.; Goode, K. Wayne

    1994-01-01

    Automated variable-polarity plasma arc (VPPA) welding apparatus operates under electronic supervision by welding analysis and control system. System performs all major monitoring and controlling functions. It acquires, analyzes, and displays weld-quality data in real time and adjusts process parameters accordingly. Also records pertinent data for use in post-weld analysis and documentation of quality. System includes optoelectronic sensors and data processors that provide feedback control of welding process.

  2. The impact of pay-for-performance on quality of care for minority patients.

    PubMed

    Epstein, Arnold M; Jha, Ashish K; Orav, E John

    2014-10-01

    To determine whether racial disparities in process quality and outcomes of care change under hospital pay-for-performance. Retrospective cohort study comparing the change in racial disparities in process quality and outcomes of care between 2004 and 2008 in hospitals participating in the Premier Hospital Quality Incentive Demonstration versus control hospitals. Using patient-level Hospital Quality Alliance (HQA) data, we identified 226,096 patients in Premier hospitals, which were subject to pay-for-performance (P4P) contracts and 1,607,575 patients at control hospitals who had process of care measured during hospitalization for acute myocardial infarction (AMI), congestive heart failure (CHF), or pneumonia. We additionally identified 123,241 Medicare patients in Premier hospitals and 995,107 in controls who were hospitalized for AMI, CHF, pneumonia, or coronary artery bypass graft (CABG) surgery. We then compared HQA process quality indicators for AMI, CHF, and pneumonia between P4P and control hospitals, as well as risk-adjusted mortality rates for AMI, CHF, pneumonia, and CABG. Black patients initially had lower performance on process quality indicators in both Premier and non-Premier hospitals. The racial gap decreased over time in both groups; the reduction in the gap in Premier hospitals was greater than the gap reduction in non-Premier hospitals for AMI patients. During the study period, mortality generally decreased for blacks relative to whites for AMI, CHF, and pneumonia in both Premier and non-Premier hospitals, with the relative reduction for blacks greatest in Premier hospitals for CHF. Our results show no evidence of a deleterious impact of P4P in the Premier HQID on racial disparities in process quality or outcomes.

  3. Manufacture and quality control of interconnecting wire hardnesses, Volume 1

    NASA Technical Reports Server (NTRS)

    1972-01-01

A standard is presented for the manufacture, installation, and quality control of eight types of interconnecting wire harnesses. The processes, process controls, and inspection and test requirements reflected are based on acknowledgment of harness design requirements, acknowledgment of harness installation requirements, identification of the various parts, materials, etc., utilized in harness manufacture, and formulation of a typical manufacturing flow diagram identifying each manufacturing and quality control process, operation, inspection, and test. The document covers interconnecting wire harnesses defined in the design standard, including type 1, enclosed in fluorocarbon elastomer convolute tubing; type 2, enclosed in TFE convolute tubing lined with fiberglass braid; type 3, enclosed in TFE convolute tubing; and type 5, a combination of types 3 and 4. Knowledge gained through experience on the Saturn 5 program, coupled with recent advances in techniques, materials, and processes, was incorporated.

  4. CRN5EXP: Expert system for statistical quality control

    NASA Technical Reports Server (NTRS)

    Hentea, Mariana

    1991-01-01

The purpose of the expert system CRN5EXP is to assist in checking the quality of the coils at two very important mills, Hot Rolling and Cold Rolling, in a steel plant. The system interprets the statistical quality control charts, and diagnoses and predicts the quality of the steel. Measurements of process control variables are recorded in a database, and sample statistics such as the mean and the range are computed and plotted on a control chart. The chart is analyzed for patterns using the C Language Integrated Production System (CLIPS) and a forward-chaining technique to reach a conclusion about the causes of defects and to take management measures for the improvement of quality control techniques. The expert system combines the certainty factors associated with the process control variables to predict the quality of the steel. The paper presents the approach used to extract data from the database, the rationale for combining certainty factors, and the architecture and use of the expert system. The interpretation of control chart patterns requires the human expert's knowledge, and lends itself to expression as expert system rules.
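Control chart pattern interpretation of the kind encoded in such rules can be sketched procedurally. The two rules and the coil-thickness readings below are illustrative assumptions (classic Western Electric patterns), not CRN5EXP's actual rule base:

```python
def western_electric_signals(values, mean, sigma):
    """Two classic control-chart pattern rules: a point beyond 3 sigma,
    and 8 consecutive points on one side of the mean. Returns the
    0-based indices at which a rule fires."""
    signals = []
    run_side, run_len = 0, 0
    for i, v in enumerate(values):
        if abs(v - mean) > 3 * sigma:
            signals.append((i, "beyond 3 sigma"))
        side = 1 if v > mean else (-1 if v < mean else 0)
        if side != 0 and side == run_side:
            run_len += 1
        else:
            run_len = 1 if side != 0 else 0
        run_side = side
        if run_len >= 8:
            signals.append((i, "8 on one side"))
    return signals

# Hypothetical coil-thickness samples: a drift keeps readings above the mean.
data = [10.1, 9.8, 10.0, 10.2, 10.3, 10.1, 10.2, 10.4, 10.2, 10.3, 10.1, 10.2]
print(western_electric_signals(data, mean=10.0, sigma=0.2))
# [(10, '8 on one side'), (11, '8 on one side')]
```

In a CLIPS-style system, each such pattern would be one forward-chaining rule whose firing contributes, with its certainty factor, to the diagnosis of a defect cause.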

  5. Multicapillary SDS-gel electrophoresis for the analysis of fluorescently labeled mAb preparations: a high throughput quality control process for the production of QuantiPlasma and PlasmaScan mAb libraries.

    PubMed

    Székely, Andrea; Szekrényes, Akos; Kerékgyártó, Márta; Balogh, Attila; Kádas, János; Lázár, József; Guttman, András; Kurucz, István; Takács, László

    2014-08-01

Molecular heterogeneity of mAb preparations is the result of various co- and post-translational modifications and of contaminants related to the production process. Changes in molecular composition result in alterations of functional performance; therefore quality control and validation of therapeutic or diagnostic protein products is essential. A special case is the consistent production of mAb libraries (QuantiPlasma™ and PlasmaScan™) for proteome profiling, quality control of which represents a challenge because of the high number of mAbs (>1000). Here, we devise a generally applicable multicapillary SDS-gel electrophoresis process for the analysis of fluorescently labeled mAb preparations, for the high-throughput quality control of mAbs of the QuantiPlasma™ and PlasmaScan™ libraries. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  6. Real time monitoring of powder blend bulk density for coupled feed-forward/feed-back control of a continuous direct compaction tablet manufacturing process.

    PubMed

    Singh, Ravendra; Román-Ospino, Andrés D; Romañach, Rodolfo J; Ierapetritou, Marianthi; Ramachandran, Rohit

    2015-11-10

The pharmaceutical industry is strictly regulated; precise and accurate control of end product quality is necessary to ensure the effectiveness of drug products. For such control, process and raw material variability ideally need to be fed forward in real time into an automatic control system so that proactive action can be taken before they affect end product quality. Variations in raw material properties (e.g., particle size), feeder hopper level, amount of lubrication, milling and blending action, and the shear applied in different processing stages can significantly affect blend density, and thereby tablet weight, hardness and dissolution. Therefore, real-time monitoring of powder bulk density variability, and its incorporation into the automatic control system so that its effect can be mitigated proactively and efficiently, is highly desired. However, real-time monitoring of powder bulk density is still a challenging task because of several levels of complexity. In this work, powder bulk density, which has a significant effect on the critical quality attributes (CQAs), has been monitored in real time in a pilot-plant facility using a NIR sensor. The sensitivity of the powder bulk density to critical process parameters (CPPs) and CQAs has been analyzed, and a feed-forward controller has been designed accordingly. The measured signal can be used for feed-forward control so that corrective actions on density variations can be taken before they influence product quality. The coupled feed-forward/feed-back control system demonstrates improved control performance and improvements in final product quality in the presence of process and raw material variations. Copyright © 2015 Elsevier B.V. All rights reserved.
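The feed-forward idea, acting on a density reading before it affects tablet weight, can be sketched with a simple mass-balance correction. The relationship (weight = fill volume x bulk density) and all parameter names and values below are illustrative assumptions, not the authors' controller design:

```python
def feed_forward_fill_depth(target_weight_mg, measured_density_mg_mm3,
                            die_area_mm2):
    """Feed-forward correction sketch: since tablet weight equals fill
    volume times bulk density, a density reading (e.g., from a NIR
    sensor) can be translated into a corrected die fill depth BEFORE
    the blend reaches the compression stage."""
    fill_volume = target_weight_mg / measured_density_mg_mm3  # mm^3
    return fill_volume / die_area_mm2                          # mm

# Nominal blend density 0.55 mg/mm^3; the sensor reports a 5% denser blend.
nominal = feed_forward_fill_depth(400.0, 0.55, 80.0)
corrected = feed_forward_fill_depth(400.0, 0.55 * 1.05, 80.0)
print(corrected < nominal)  # True: denser blend -> shallower fill, same weight
```

A feed-back loop on measured tablet weight would then trim any residual error, which is the coupled feed-forward/feed-back structure the paper evaluates.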

  7. Pulse-Flow Microencapsulation System

    NASA Technical Reports Server (NTRS)

    Morrison, Dennis R.

    2006-01-01

    The pulse-flow microencapsulation system (PFMS) is an automated system that continuously produces a stream of liquid-filled microcapsules for delivery of therapeutic agents to target tissues. Prior microencapsulation systems have relied on batch processes that involve transfer of batches between different apparatuses for different stages of production followed by sampling for acquisition of quality-control data, including measurements of size. In contrast, the PFMS is a single, microprocessor-controlled system that performs all processing steps, including acquisition of quality-control data. The quality-control data can be used as real-time feedback to ensure the production of large quantities of uniform microcapsules.

  8. Quality Control Study of the GSL Reinsurance System. Final Report.

    ERIC Educational Resources Information Center

    Advanced Technology, Inc., Reston, VA.

    A quality control plan for the U.S. Department of Education's Guaranteed Student Loan (GSL) reinsurance process was developed. To identify existing errors, systems documentation and past analyses of the reinsurance system were analyzed, and interviews were conducted. Corrective actions were proposed, and a quality control checklist was developed…

  9. Quality control system preparation for photogrammetric and laser scanning missions of Spanish national plan of aerial orthophotography (PNOA). (Polish Title: Opracowanie systemu kontroli jakości realizacji nalotów fotogrametrycznych i skaningowych dla hiszpańskiego narodowego planu ortofotomapy lotniczej (PNOA))

    NASA Astrophysics Data System (ADS)

    Rzonca, A.

    2013-12-01

The paper presents the state of the art in quality control of photogrammetric and laser scanning data captured by airborne sensors. This subject is very important for the execution of photogrammetric and LiDAR projects, because the quality of the captured data largely determines the quality of the final product. At the same time, a precise and effective quality control process allows missions to be flown without a wide safety margin, especially in mountain-area projects. As an introduction, the author presents the theoretical background of quality control, based on his own experience, instructions, and technical documentation, and describes several organizational variants. There are essentially two main approaches: quality control of the captured data, and control of discrepancies between the flight plan and the results of its execution. Both can make use of automated tests and manual analysis: a test is an automatic algorithm that checks the data and generates a control report, whereas an analysis is a simpler process based on a manual check of documentation, data, and metadata. An example of a quality control system for a large-area project is presented: a project realized periodically for the whole territory of Spain, the National Plan of Aerial Orthophotography (Plan Nacional de Ortofotografía Aérea, PNOA). The internal control system delivers its results soon after the flight and informs the company's flight team, which allows errors to be corrected shortly after the flight and can stop the transfer of the data to another team or company for further processing. The described system comprises geometric and radiometric control of photogrammetric data and geometric control of LiDAR data; it checks all the specified parameters and generates reports, which are very helpful when errors or low-quality data occur. 
The paper draws on the author's experience in the field of data quality control and presents conclusions and suggestions on organizational and technical aspects, together with a short definition of the necessary control software.

  10. Implementation of "Quality by Design (QbD)" Approach for the Development of 5-Fluorouracil Loaded Thermosensitive Hydrogel.

    PubMed

    Dalwadi, Chintan; Patel, Gayatri

    2016-01-01

The purpose of this study was to investigate the Quality by Design (QbD) principle for the preparation of hydrogel products, to prove both the practicability and the utility of applying the QbD concept to hydrogel-based controlled release systems. Product and process understanding helps to decrease the variability of critical material and process parameters, which yields quality product output and reduces risk. The study includes the identification of the Quality Target Product Profile (QTPP) and Critical Quality Attributes (CQAs) from the literature or preliminary studies. To identify and control variability in process and material attributes, two QbD tools were utilized: Quality Risk Management (QRM) and experimental design; these also help to identify the effect of such attributes on the CQAs. Potential risk factors were identified from a fishbone diagram, screened by risk assessment, and optimized by a 3-level, 2-factor experimental design with center points in triplicate, to analyze the precision of the target process. The optimized formulation was further characterized by gelling time, gelling temperature, rheological parameters, in-vitro biodegradation, and in-vitro drug release. A design space was created using the experimental design tool, giving the control space; working within this control space keeps all failure modes below the risk level. In conclusion, the QbD approach with the QRM tool provides a potent and effective framework for building quality into the hydrogel.
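The design step named in this abstract is compact enough to sketch. The factor names below are invented; only the structure (3 levels, 2 factors, centre point in triplicate) follows the abstract.

```python
from itertools import product

# Illustrative sketch of the experimental-design step: a 3-level, 2-factor
# full factorial in coded units (-1, 0, +1), with the centre point run in
# triplicate to estimate the precision of the target process.
# Factor names x1/x2 are placeholders, not the study's actual factors.
levels = (-1, 0, 1)
runs = [{"x1": a, "x2": b} for a, b in product(levels, levels)]
runs += [{"x1": 0, "x2": 0}] * 2    # centre point now appears 3 times in total
print(len(runs))                    # 9 factorial runs + 2 extra centre runs
```

The replicated centre point is what allows pure experimental error to be estimated independently of the model, which is how the design "analyzes the precision of the target process".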

  11. ProSens: integrated production control by automated inspection planning and efficient multisensor metrology

    NASA Astrophysics Data System (ADS)

    Glaser, Ulf; Li, Zhichao; Bichmann, Stephan, II; Pfeifer, Tilo

    2003-05-01

With China's entry into the WTO, Chinese as well as German companies face the question of how to minimize the risk of unfamiliar cooperation partners when developing products. Rising customer demands concerning quality and product diversity, together with pressure to reduce expenses, require flexibility, efficiency, and reliable component suppliers. In order to build and strengthen Sino-German cooperations, manufacturing control using homogenized and efficient measures to assure high quality is of vital importance. Without such unification, identical measurements conducted at subcontractors or customers may be carried out with different measurement processes, which leads to incomparable results. Rapidly growing company cooperations and a simultaneously decreasing manufacturing scope cause substantial difficulties when coordinating joint quality control activities. "ProSens," a Sino-German project consortium consisting of industrial users, technology producers, and research institutes, aims at improving selected production processes by: creation of a homogeneous quality awareness in Sino-German cooperations; sensitization for process-accompanying metrology at an early stage of product development; increase of process performance by the use of integrated metrology; reduction of production time and cost; and unification of quality control of complex products by means of efficient measurement strategies and CAD-based inspection planning.

  12. Nuclear Technology Series. Course 14: Introduction to Quality Assurance/Quality Control.

    ERIC Educational Resources Information Center

    Technical Education Research Center, Waco, TX.

    This technical specialty course is one of thirty-five courses designed for use by two-year postsecondary institutions in five nuclear technician curriculum areas: (1) radiation protection technician, (2) nuclear instrumentation and control technician, (3) nuclear materials processing technician, (4) nuclear quality-assurance/quality-control…

  13. Quality nursing care: a qualitative enquiry.

    PubMed

    Hogston, R

    1995-01-01

In spite of the wealth of literature on quality nursing care, a disparity exists in defining quality. The purpose of this study was to elicit practising nurses' perceptions of quality nursing care and to present a definition of quality as described by nurses. Eighteen nurses from a large hospital in the south of England were interviewed. Qualitative analysis based on a modified grounded theory approach revealed three categories described as 'structure', 'process' and 'outcome'. This supports previous work on evaluating quality care but postulates that structure, process and outcome could also be used as a mechanism for defining quality. The categories are defined using the words of the informants in order to explain the essential attributes of quality nursing care. The findings demonstrate that more informants cited quality in terms of process and outcome than structure. It is speculated that this is because nurses have direct control over process and outcome, whereas the political and economic climate in which nurses work is beyond their control and decisions over structure lie with their managers.

  14. Quality Control of the Print with the Application of Statistical Methods

    NASA Astrophysics Data System (ADS)

    Simonenko, K. V.; Bulatova, G. S.; Antropova, L. B.; Varepo, L. G.

    2018-04-01

The basis for standardizing the offset printing process is the control of print quality indicators. This problem can be approached in various ways, among which statistical methods are the most important; their practical implementation for managing the quality of the printing process is highly relevant and is the subject of this paper. The possibility of using a control chart to identify the causes of deviations in optical density for a triad of inks in offset printing is shown.
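A minimal version of the control-chart idea applied here can be sketched as follows. The readings are invented; the paper's actual data and chart type are not reproduced, only the standard individuals-chart construction.

```python
import statistics

# Illustrative individuals (X) control chart for optical-density readings of
# one ink: 3-sigma limits estimated from the average moving range
# (d2 = 1.128 for subgroups of size 2). Data are invented for illustration.
density = [1.45, 1.47, 1.44, 1.46, 1.43, 1.48, 1.45, 1.80, 1.46, 1.44]

center = statistics.mean(density)
mr_bar = statistics.mean(abs(a - b) for a, b in zip(density, density[1:]))
sigma = mr_bar / 1.128                       # moving-range estimate of sigma
ucl, lcl = center + 3 * sigma, center - 3 * sigma
out = [i for i, x in enumerate(density) if not lcl <= x <= ucl]
print(out)    # the 1.80 reading falls above the upper control limit
```

Points outside the limits signal an assignable cause (e.g. an inking fault) rather than natural process variation, which is exactly the distinction the paper uses the chart for.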

  15. An NCME Instructional Module on Quality Control Procedures in the Scoring, Equating, and Reporting of Test Scores

    ERIC Educational Resources Information Center

    Allalouf, Avi

    2007-01-01

    There is significant potential for error in long production processes that consist of sequential stages, each of which is heavily dependent on the previous stage, such as the SER (Scoring, Equating, and Reporting) process. Quality control procedures are required in order to monitor this process and to reduce the number of mistakes to a minimum. In…

  16. Quality Control in Clinical Laboratory Samples

    DTIC Science & Technology

    2015-01-01

is able to find and correct flaws in the analytical processes of a lab before potentially incorrect patient results are released. According to...verifies that the results produced are accurate and precise. Clinical labs use management of documentation as well as incorporation of a continuous...improvement process to streamline the overall quality control process. QC samples are expected to be identical and tested identically to patient

  17. Information integration and diagnosis analysis of equipment status and production quality for machining process

    NASA Astrophysics Data System (ADS)

    Zan, Tao; Wang, Min; Hu, Jianzhong

    2010-12-01

Machining status monitoring by multiple sensors can acquire and analyze machining process information to implement abnormality diagnosis and fault warning. Statistical quality control is normally used to distinguish abnormal fluctuations from normal ones through statistical methods. In this paper, by comparing the advantages and disadvantages of the two methods, the necessity and feasibility of their integration and fusion are introduced. An approach is then proposed that integrates multi-sensor status monitoring and statistical process control based on artificial intelligence, Internet, and database techniques. Based on virtual instrument techniques, the authors developed the machining quality assurance system MoniSysOnline, which has been used to monitor the grinding process. By analyzing the quality data and the AE signal information of the wheel dressing process, the cause of machining quality fluctuation was identified. The experimental results indicate that the approach is suitable for status monitoring and analysis of the machining process.
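The statistical side of the integration described above can be sketched with two standard out-of-control tests. This is not the authors' system: the rules shown (a 3-sigma rule and a run rule) are generic SPC detection rules, and the data are invented.

```python
# Minimal sketch of SPC-style abnormality detection on in-process quality
# data: a Shewhart 3-sigma rule plus a run rule (8 consecutive points on one
# side of the centre line). Thresholds and data are invented for illustration.

def spc_alarms(xs, mu, sigma):
    alarms = []
    for i, x in enumerate(xs):
        if abs(x - mu) > 3 * sigma:              # sudden large deviation
            alarms.append((i, "beyond 3-sigma"))
    for i in range(len(xs) - 7):                 # slow drift / sustained shift
        window = xs[i:i + 8]
        if all(x > mu for x in window) or all(x < mu for x in window):
            alarms.append((i + 7, "run of 8 on one side"))
    return alarms

# A slow drift (e.g. wheel wear) trips the run rule before any single point
# exceeds the 3-sigma limit:
drifted = [0.2, -0.1, 0.3, 0.5, 0.4, 0.6, 0.7, 0.5, 0.6, 0.8]
print(spc_alarms(drifted, mu=0.0, sigma=1.0))
```

A sensor-based monitor (such as the AE signal in the abstract) catches sudden faults; the statistical rules catch gradual shifts, which is why the two are worth fusing.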

  18. 77 FR 11484 - Agency Information Collection Activities: Proposed Collection; Comment Request-Negative QC Review...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-27

    ... Quality Control process for the Supplemental Nutrition Assistance Program and the FNS-248 will be removed... other forms of information technology. Comments may be sent to: Francis B. Heil, Chief, Quality Control... directed to Francis B. Heil, (703) 305-2442. SUPPLEMENTARY INFORMATION: Title: Negative Quality Control...

  19. Metrology: Measurement Assurance Program Guidelines

    NASA Technical Reports Server (NTRS)

    Eicke, W. G.; Riley, J. P.; Riley, K. J.

    1995-01-01

    The 5300.4 series of NASA Handbooks for Reliability and Quality Assurance Programs have provisions for the establishment and utilization of a documented metrology system to control measurement processes and to provide objective evidence of quality conformance. The intent of these provisions is to assure consistency and conformance to specifications and tolerances of equipment, systems, materials, and processes procured and/or used by NASA, its international partners, contractors, subcontractors, and suppliers. This Measurement Assurance Program (MAP) guideline has the specific objectives to: (1) ensure the quality of measurements made within NASA programs; (2) establish realistic measurement process uncertainties; (3) maintain continuous control over the measurement processes; and (4) ensure measurement compatibility among NASA facilities. The publication addresses MAP methods as applied within and among NASA installations and serves as a guide to: control measurement processes at the local level (one facility); conduct measurement assurance programs in which a number of field installations are joint participants; and conduct measurement integrity (round robin) experiments in which a number of field installations participate to assess the overall quality of particular measurement processes at a point in time.

  20. Quality status display for a vibration welding process

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spicer, John Patrick; Abell, Jeffrey A.; Wincek, Michael Anthony

    A method includes receiving, during a vibration welding process, a set of sensory signals from a collection of sensors positioned with respect to a work piece during formation of a weld on or within the work piece. The method also includes receiving control signals from a welding controller during the process, with the control signals causing the welding horn to vibrate at a calibrated frequency, and processing the received sensory and control signals using a host machine. Additionally, the method includes displaying a predicted weld quality status on a surface of the work piece using a status projector. The method may include identifying and displaying a quality status of a suspect weld. The laser projector may project a laser beam directly onto or immediately adjacent to the suspect welds, e.g., as a red, green, or blue laser, or a gas laser having a switched color filter.

  1. A comprehensive analysis of the IMRT dose delivery process using statistical process control (SPC)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gerard, Karine; Grandhaye, Jean-Pierre; Marchesi, Vincent

The aim of this study is to introduce tools to improve the security of each IMRT patient treatment by determining action levels for the dose delivery process. To achieve this, the patient-specific quality control results performed with an ionization chamber, which characterize the dose delivery process, have been retrospectively analyzed using a method borrowed from industry: statistical process control (SPC). The latter consists of four principal well-structured steps. The authors first quantified the short-term variability of ionization chamber measurements with respect to the clinical tolerances used in the cancer center (±4% deviation between the calculated and measured doses) by calculating a control process capability (Cpc) index. The Cpc index was found to be greater than 4, which implies that the observed variability of the dose delivery process is not biased by the short-term variability of the measurement. Then, the authors demonstrated using a normality test that the quality control results could be approximated by a normal distribution with two parameters (mean and standard deviation). Finally, the authors used two complementary tools, control charts and performance indices, to thoroughly analyze the IMRT dose delivery process. Control charts monitor the process over time, using statistical control limits to distinguish random (natural) variations from significant changes in the process, whereas performance indices quantify the ability of the process to produce data within the clinical tolerances at a precise moment. The authors retrospectively showed that the analysis of three selected control charts (individual value, moving-range, and EWMA control charts) allowed efficient drift detection of the dose delivery process for prostate and head-and-neck treatments before the quality controls were outside the clinical tolerances. 
Therefore, when analyzed in real time during quality controls, they should improve the security of treatments. The authors also showed that the dose delivery processes in the cancer center were in control for prostate and head-and-neck treatments. In parallel, long-term process performance indices (Pp, Ppk, and Ppm) have been analyzed. Their analysis helped to define which actions should be undertaken in order to improve the performance of the process. The prostate dose delivery process was shown to be statistically capable (0.08% of results expected to be outside the clinical tolerances), in contrast to the head-and-neck dose delivery process (5.76% of results expected to be outside the clinical tolerances).
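The long-term performance indices named in this record (Pp, Ppk) have a simple closed form, sketched below. The ±4% tolerances follow the abstract; the deviation data are invented for illustration.

```python
import statistics

# Hedged sketch of long-term process performance indices for patient-specific
# QC dose deviations, with clinical tolerances of +/-4% as in the abstract.
# The sample deviations below are invented, not the study's data.
deviations = [0.5, -1.2, 0.8, 1.5, -0.3, 0.9, -1.8, 0.2, 1.1, -0.7]  # percent
lsl, usl = -4.0, 4.0                          # clinical tolerance limits

mu = statistics.mean(deviations)
sigma = statistics.stdev(deviations)          # overall (long-term) sigma
pp = (usl - lsl) / (6 * sigma)                # potential performance
ppk = min(usl - mu, mu - lsl) / (3 * sigma)   # performance including centering
print(round(pp, 2), round(ppk, 2))
```

Values above roughly 1.33 are conventionally read as "capable"; the gap between Pp and Ppk measures how far the process mean is off-center, which is what guides the corrective actions the authors mention.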

  2. A comprehensive analysis of the IMRT dose delivery process using statistical process control (SPC).

    PubMed

    Gérard, Karine; Grandhaye, Jean-Pierre; Marchesi, Vincent; Kafrouni, Hanna; Husson, François; Aletti, Pierre

    2009-04-01

    The aim of this study is to introduce tools to improve the security of each IMRT patient treatment by determining action levels for the dose delivery process. To achieve this, the patient-specific quality control results performed with an ionization chamber--and which characterize the dose delivery process--have been retrospectively analyzed using a method borrowed from industry: Statistical process control (SPC). The latter consisted in fulfilling four principal well-structured steps. The authors first quantified the short-term variability of ionization chamber measurements regarding the clinical tolerances used in the cancer center (+/- 4% of deviation between the calculated and measured doses) by calculating a control process capability (C(pc)) index. The C(pc) index was found superior to 4, which implies that the observed variability of the dose delivery process is not biased by the short-term variability of the measurement. Then, the authors demonstrated using a normality test that the quality control results could be approximated by a normal distribution with two parameters (mean and standard deviation). Finally, the authors used two complementary tools--control charts and performance indices--to thoroughly analyze the IMRT dose delivery process. Control charts aim at monitoring the process over time using statistical control limits to distinguish random (natural) variations from significant changes in the process, whereas performance indices aim at quantifying the ability of the process to produce data that are within the clinical tolerances, at a precise moment. The authors retrospectively showed that the analysis of three selected control charts (individual value, moving-range, and EWMA control charts) allowed efficient drift detection of the dose delivery process for prostate and head-and-neck treatments before the quality controls were outside the clinical tolerances. 
Therefore, when analyzed in real time during quality controls, they should improve the security of treatments. The authors also showed that the dose delivery processes in the cancer center were in control for prostate and head-and-neck treatments. In parallel, long-term process performance indices (P(p), P(pk), and P(pm)) have been analyzed. Their analysis helped to define which actions should be undertaken in order to improve the performance of the process. The prostate dose delivery process was shown to be statistically capable (0.08% of results expected to be outside the clinical tolerances), in contrast to the head-and-neck dose delivery process (5.76% of results expected to be outside the clinical tolerances).

  3. Multicriteria Gain Tuning for Rotorcraft Flight Controls (also entitled The Development of the Conduit Advanced Control System Design and Evaluation Interface with a Case Study Application Fly by Wire Helicopter Design)

    NASA Technical Reports Server (NTRS)

    Biezad, Daniel

    1997-01-01

Handling qualities analysis and control law design would seem to be naturally complementary components of aircraft flight control system design; however, these two closely coupled disciplines are often not well integrated in practice. Handling qualities engineers and control system engineers may work in separate groups within an aircraft company. Flight control system engineers and handling quality specialists often come from different backgrounds and schooling and are frequently unaware of the other group's research. Thus, while the handling qualities specifications represent desired aircraft response characteristics, they are rarely incorporated directly in the control system design process. Instead, modern control system design techniques are based on servo-loop robustness specifications and simple representations of the desired control response. Comprehensive handling qualities analysis is often left until the end of the design cycle and performed as a check of the completed design for satisfactory performance. This can lead to costly redesign, or to less-than-satisfactory aircraft handling qualities when the flight-testing phase is reached. The desire to integrate the fields of handling qualities and flight control systems led to the development of the CONDUIT system. This tool facilitates control system designs that achieve desired handling quality requirements and servo-loop specifications in a single design process. With CONDUIT, the control system engineer is able to design control systems directly to meet the complete handling specifications. CONDUIT allows the designer to retain a preferred control law structure and then tunes the system parameters to meet the handling quality requirements.

  4. Engineering Quality Software: 10 Recommendations for Improved Software Quality Management

    DTIC Science & Technology

    2010-04-27

    lack of user involvement • Inadequate Software Process Management & Control by Contractors • No "Team" of Vendors and Users; little SME participation...1990 Quality Perspectives • Process Quality (CMMI) • Product Quality (ISO/IEC 2500x) – Internal Quality Attributes – External Quality Attributes...CMMI/ISO 9000 Assessments – Capture organizational knowledge • Identify best practices, lessons learned • Know where you are, and where you need to be

  5. Network-based production quality control

    NASA Astrophysics Data System (ADS)

    Kwon, Yongjin; Tseng, Bill; Chiou, Richard

    2007-09-01

This study investigates the feasibility of remote quality control using a host of advanced automation equipment with Internet accessibility. The recent emphasis on product quality and waste reduction stems from a dynamic, globalized, customer-driven market, which brings companies both opportunities and threats, depending on their response speed and production strategies. Current industry trends also include the wide spread of distributed manufacturing systems, in which design, production, and management facilities are geographically dispersed. This situation demands not only accessibility to remotely located production equipment for monitoring and control, but also efficient means of responding to a changing environment to counter process variations and diverse customer demands. To compete in such an environment, companies are striving to achieve 100%, sensor-based, automated inspection for zero-defect manufacturing. In this study, the Internet-based quality control scheme is referred to as "E-Quality for Manufacturing," or "EQM" for short. By definition, EQM refers to a holistic approach to designing and embedding efficient quality control functions in the context of network-integrated manufacturing systems. Such a system lets designers located far from the production facility monitor, control, and adjust the quality inspection processes as the production design evolves.

  6. Quality Control through Design and Process: Gambrel Roof Truss Challenge

    ERIC Educational Resources Information Center

    Ward, Dell; Jones, James

    2011-01-01

    Customers determine whether a product fulfills their needs or satisfies them. "Quality control", then, is the process of finding out what the customer wants, along with designing, producing, delivering, and servicing the product--and ultimately satisfying the customer's expectations. For many years, people considered a product to be of good…

  7. Quality control troubleshooting tools for the mill floor

    Treesearch

    John Dramm

    2000-01-01

    Statistical Process Control (SPC) provides effective tools for improving process quality in the forest products industry resulting in reduced costs and improved productivity. Implementing SPC helps identify and locate problems that occur in wood products manufacturing. SPC tools achieve their real value when applied on the mill floor for monitoring and troubleshooting...

  8. Rapid evaluation and quality control of next generation sequencing data with FaQCs.

    PubMed

    Lo, Chien-Chi; Chain, Patrick S G

    2014-11-19

    Next generation sequencing (NGS) technologies that parallelize the sequencing process and produce thousands to millions, or even hundreds of millions of sequences in a single sequencing run, have revolutionized genomic and genetic research. Because of the vagaries of any platform's sequencing chemistry, the experimental processing, machine failure, and so on, the quality of sequencing reads is never perfect, and often declines as the read is extended. These errors invariably affect downstream analysis/application and should therefore be identified early on to mitigate any unforeseen effects. Here we present a novel FastQ Quality Control Software (FaQCs) that can rapidly process large volumes of data, and which improves upon previous solutions to monitor the quality and remove poor quality data from sequencing runs. Both the speed of processing and the memory footprint of storing all required information have been optimized via algorithmic and parallel processing solutions. The trimmed output compared side-by-side with the original data is part of the automated PDF output. We show how this tool can help data analysis by providing a few examples, including an increased percentage of reads recruited to references, improved single nucleotide polymorphism identification as well as de novo sequence assembly metrics. FaQCs combines several features of currently available applications into a single, user-friendly process, and includes additional unique capabilities such as filtering the PhiX control sequences, conversion of FASTQ formats, and multi-threading. The original data and trimmed summaries are reported within a variety of graphics and reports, providing a simple way to do data quality control and assurance.
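The core trimming idea behind tools of this kind can be shown in miniature. This is not FaQCs itself, just a toy sketch of 3'-end quality trimming; the threshold values and the read are invented.

```python
# Toy sketch of FASTQ quality trimming: decode Phred+33 quality characters,
# clip the 3' end of a read while quality stays below a threshold, and
# discard reads that become too short. Parameters are illustrative.

def phred(qual_str, offset=33):
    """Decode an ASCII quality string into Phred scores."""
    return [ord(c) - offset for c in qual_str]

def trim_3prime(seq, qual_str, min_q=20, min_len=5):
    quals = phred(qual_str)
    end = len(seq)
    while end > 0 and quals[end - 1] < min_q:   # quality declines toward 3'
        end -= 1
    trimmed = seq[:end]
    return trimmed if len(trimmed) >= min_len else None

read = "ACGTACGTAC"
qual = "IIIIIIII#!"        # 'I' = Phred 40; '#' and '!' are Phred 2 and 0
print(trim_3prime(read, qual))
```

Real tools add windowed trimming, adapter and contaminant filtering (e.g. the PhiX filtering the abstract mentions), and parallel processing, but the quality-decline-toward-the-3'-end logic is the same.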

  9. Quality by design for herbal drugs: a feedforward control strategy and an approach to define the acceptable ranges of critical quality attributes.

    PubMed

    Yan, Binjun; Li, Yao; Guo, Zhengtai; Qu, Haibin

    2014-01-01

The concept of quality by design (QbD) has been widely accepted and applied in the pharmaceutical manufacturing industry, but two key issues remain to be addressed in its implementation for herbal drugs: the quality variation of herbal raw materials, and the difficulty of defining the acceptable ranges of critical quality attributes (CQAs). This work proposes a feedforward control strategy for the first issue and a method for defining the acceptable ranges of CQAs for the second. In a case study of the ethanol precipitation process of Danshen (Radix Salvia miltiorrhiza) injection, regression models linking input material attributes and process parameters to CQAs were built first, and an optimisation model for calculating the best process parameters according to the input materials was established. Then, the feasible material space was defined and the acceptable ranges of CQAs for the preceding process were determined. Satisfactory regression models were built, with cross-validated regression coefficients (Q(2)) all above 91%. The feedforward control strategy was applied successfully to compensate for the quality variation of the input materials and was able to control the CQAs within 90-110% of the desired values. In addition, the feasible material space for the ethanol precipitation process was built successfully, showing the acceptable ranges of the CQAs for the concentration process. The proposed methodology can help to promote the implementation of QbD for herbal drugs. Copyright © 2013 John Wiley & Sons, Ltd.
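The feedforward idea in this record reduces to inverting a fitted model batch by batch. The sketch below uses a made-up linear model with invented coefficients, not the paper's regression models, purely to show the mechanism.

```python
# Hedged sketch of feedforward QbD control: a fitted model links a measured
# input-material attribute x and a process parameter p to a CQA y; for each
# incoming batch, p is re-solved so y hits its target, compensating
# raw-material variation. Model form and coefficients are invented.

B0, BX, BP = 10.0, -2.0, 4.0            # illustrative model: y = B0 + BX*x + BP*p

def feed_forward_parameter(x, y_target):
    """Invert the model for p, given the measured material attribute x."""
    return (y_target - B0 - BX * x) / BP

y_target = 50.0
for x in (1.0, 1.5, 2.0):               # batch-to-batch material variation
    p = feed_forward_parameter(x, y_target)
    y = B0 + BX * x + BP * p            # predicted CQA with the adjusted p
    print(f"x={x:.1f}  p={p:.2f}  predicted CQA={y:.1f}")
```

The paper's version solves an optimisation over several CQAs and parameters rather than a one-line inversion, but the principle is the same: measure the input, then act before the process runs rather than after.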

  10. MapReduce Based Parallel Bayesian Network for Manufacturing Quality Control

    NASA Astrophysics Data System (ADS)

    Zheng, Mao-Kuan; Ming, Xin-Guo; Zhang, Xian-Yu; Li, Guo-Ming

    2017-09-01

The increasing complexity of industrial products and manufacturing processes has challenged conventional statistics-based quality management approaches under dynamic production conditions. An integrated Bayesian network and big data analytics approach for manufacturing process quality analysis and control is proposed. Based on the Hadoop distributed architecture and the MapReduce parallel computing model, the large volume and variety of quality-related data generated during the manufacturing process can be handled. Artificial intelligence algorithms, including Bayesian network learning, classification, and reasoning, are embedded in the Reduce step. Relying on the Bayesian network's ability to deal with dynamic and uncertain problems and on the parallel computing power of MapReduce, Bayesian networks of factors affecting quality are built from prior probability distributions and modified with posterior probability distributions. A case study on hull-segment manufacturing precision management for ship and offshore platform building shows that computing speed accelerates almost in direct proportion to the number of computing nodes. The proposed model is also shown to be feasible for locating and reasoning about root causes, forecasting manufacturing outcomes, and intelligent decision-making for precision problem solving. The integration of big data analytics and the BN method offers a whole new perspective on manufacturing quality control.
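The map/reduce split behind this approach can be illustrated without Hadoop. Below, mappers count (parent, child) state pairs from partitions of quality records and a reducer aggregates them into one node's conditional probability table; the variable names and data are invented.

```python
from collections import Counter

# Toy sketch of MapReduce-style Bayesian-network parameter learning: mappers
# emit (parent-state, child-state) counts from data partitions; the reducer
# merges counts and normalises them into a CPT for one node. Node names
# ("machine_ok", "defect") and records are invented for illustration.

def mapper(records):
    return Counter((r["machine_ok"], r["defect"]) for r in records)

def reducer(partial_counts):
    total = Counter()
    for c in partial_counts:
        total.update(c)
    cpt = {}
    for (parent, child), n in total.items():
        parent_total = sum(v for (p, _), v in total.items() if p == parent)
        cpt[(parent, child)] = n / parent_total   # P(child | parent)
    return cpt

part1 = [{"machine_ok": True, "defect": False}] * 8 \
      + [{"machine_ok": True, "defect": True}] * 2
part2 = [{"machine_ok": False, "defect": True}] * 3 \
      + [{"machine_ok": False, "defect": False}] * 1
cpt = reducer([mapper(part1), mapper(part2)])
print(cpt[(True, True)], cpt[(False, True)])
```

The mappers are embarrassingly parallel over data partitions, which is why the case study's computing speed scales almost linearly with the number of nodes; the learning and reasoning logic lives in the reduce step, as the abstract describes.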

  11. The development of a contract quality assurance program within the Virginia Department of Highways.

    DOT National Transportation Integrated Search

    1989-01-01

    In order to assure the quality of construction products and processes, the Virginia Department of Transportation has established three levels of construction control. First, contractors themselves provide oversight and quality control as set out in t...

  12. A Process Management System for Networked Manufacturing

    NASA Astrophysics Data System (ADS)

    Liu, Tingting; Wang, Huifen; Liu, Linyan

    With the development of computers, communication and networks, networked manufacturing has become one of the main manufacturing paradigms of the 21st century. In a networked manufacturing environment there are large numbers of cooperative tasks susceptible to alteration, conflicts over resources, and problems of cost and quality, all of which increase the complexity of administration. Process management is a technology used to design, enact, control, and analyze networked manufacturing processes; it supports efficient execution, effective management, conflict resolution, cost containment and quality control. In this paper we propose an integrated process management system for networked manufacturing. Requirements of process management are analyzed and the architecture of the system is presented, and a process model considering process cost and quality is developed. Finally a case study explains how the system runs efficiently.

  13. The Use of Logistics in the Quality Parameters Control System of Material Flow

    ERIC Educational Resources Information Center

    Karpova, Natalia P.; Toymentseva, Irina A.; Shvetsova, Elena V.; Chichkina, Vera D.; Chubarkova, Elena V.

    2016-01-01

    The relevance of the research problem stems from the need to justify the use of logistics methodologies in the quality parameters control process of material flows. The goal of the article is to develop theoretical principles and practical recommendations for logistical control of the quality parameters of material flows. A leading…

  14. Contribution au developpement d'une methode de controle des procedes dans une usine de bouletage

    NASA Astrophysics Data System (ADS)

    Gosselin, Claude

    This thesis, a collaborative effort between Ecole de technologie superieure and ArcelorMittal Company, presents the development of a methodology for monitoring and quality control of multivariable industrial production processes. This innovation research mandate was developed at the ArcelorMittal Exploitation Miniere (AMEM) pellet plant in Port-Cartier (Quebec, Canada). With this undertaking, ArcelorMittal is striving to maintain its world class level of excellence and continues to pursue initiatives that can augment its competitive advantage worldwide. The plant's gravimetric classification process was retained as a prototype and development laboratory due to its effect on the company's competitiveness and its impact on subsequent steps leading to final production of iron oxide pellets. Concretely, the development of this expertise in process control and in situ monitoring will establish a firm basic knowledge in the fields of complex system physical modeling, data reconciliation, statistical observers, multivariate command and quality control using real-time monitoring of the desirability function. The hydraulic classifier is mathematically modeled. Using planned disturbances on the production line, an identification procedure was established to provide empirical estimations of the model's structural parameters. A new sampling campaign and a previously unpublished data collection and consolidation policy were implemented plant-wide. Access to these invaluable data sources has enabled the establishment of new thresholds that govern the production process and its control. Finally, as a substitute for the traditional quality control process, we have implemented a new strategy based on the use of the desirability function. Our innovation is not in using this function as an indicator of overall (economic) satisfaction in the production process, but rather in proposing it as an "observer" of the system's state. The first implementation steps have already demonstrated the method's feasibility as well as numerous other industrial impacts on production processes within the company, notably the emergence of the economic aspect as a strategic variable that assures better governance of production processes in which quality variables present strategic issues.

  15. Advanced Manufacturing Systems in Food Processing and Packaging Industry

    NASA Astrophysics Data System (ADS)

    Shafie Sani, Mohd; Aziz, Faieza Abdul

    2013-06-01

    In this paper, several advanced manufacturing systems in the food processing and packaging industry are reviewed, including biodegradable smart packaging and nanocomposites, and advanced automation control systems consisting of fieldbus technology, distributed control systems and food safety inspection features. The main purpose of current technology in the food processing and packaging industry is discussed in light of the major concerns of plant process efficiency, productivity, quality, and safety. These applications were chosen because they are robust, flexible, reconfigurable, preserve the quality of the food, and are efficient.

  16. [Discussion on Quality Evaluation Method of Medical Device During Life-Cycle in Operation Based on the Analytic Hierarchy Process].

    PubMed

    Zheng, Caixian; Zheng, Kun; Shen, Yunming; Wu, Yunyun

    2016-01-01

    The quality of a medical device during its operational life cycle covers daily use, repair volume, preventive maintenance, quality control and adverse event monitoring. In view of this, the article discusses a quality evaluation method for medical devices during their operational life cycle based on the Analytic Hierarchy Process (AHP). The presented method is shown to be effective by evaluating patient monitors as an example. The method can promote and guide device quality control work, and it can provide valuable inputs to decisions about the purchase of new devices.

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kegel, T.M.

    Calibration laboratories are faced with the need to become accredited or registered to one or more quality standards. One requirement common to all of these standards is the need to have in place a measurement assurance program. What is a measurement assurance program? Brian Belanger, in Measurement Assurance Programs: Part 1, describes it as a "quality assurance program for a measurement process that quantifies the total uncertainty of the measurements (both random and systematic components of error) with respect to national or designated standards and demonstrates that the total uncertainty is sufficiently small to meet the user's requirements." Rolf Schumacher is more specific in Measurement Assurance in Your Own Laboratory. He states, "Measurement assurance is the application of broad quality control principles to measurements of calibrations." Here, the focus is on one important part of any measurement assurance program: implementation of statistical process control (SPC). Paraphrasing Juran's Quality Control Handbook, a process is in statistical control if the only observed variations are those that can be attributed to random causes. Conversely, a process that exhibits variations due to assignable causes is not in a state of statistical control. Finally, Carrol Croarkin states, "In the measurement assurance context the measurement algorithm including instrumentation, reference standards and operator interactions is the process that is to be controlled, and its direct product is the measurement per se. The measurements are assumed to be valid if the measurement algorithm is operating in a state of control." Implicit in this statement is the important fact that an out-of-control process cannot produce valid measurements. 7 figs.

  18. Scater: pre-processing, quality control, normalization and visualization of single-cell RNA-seq data in R.

    PubMed

    McCarthy, Davis J; Campbell, Kieran R; Lun, Aaron T L; Wills, Quin F

    2017-04-15

    Single-cell RNA sequencing (scRNA-seq) is increasingly used to study gene expression at the level of individual cells. However, preparing raw sequence data for further analysis is not a straightforward process. Biases, artifacts and other sources of unwanted variation are present in the data, requiring substantial time and effort to be spent on pre-processing, quality control (QC) and normalization. We have developed the R/Bioconductor package scater to facilitate rigorous pre-processing, quality control, normalization and visualization of scRNA-seq data. The package provides a convenient, flexible workflow to process raw sequencing reads into a high-quality expression dataset ready for downstream analysis. scater provides a rich suite of plotting tools for single-cell data and a flexible data structure that is compatible with existing tools and can be used as infrastructure for future software development. The open-source code, along with installation instructions, vignettes and case studies, is available through Bioconductor at http://bioconductor.org/packages/scater . davis@ebi.ac.uk. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press.
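scater itself is an R/Bioconductor package; as a language-agnostic illustration, the kind of per-cell QC filtering such tools apply can be sketched in Python: cells with too few total counts or too few detected genes are flagged for exclusion. The thresholds and the toy count matrix below are assumptions for illustration only.

```python
# Hedged sketch of per-cell QC filtering in the style of scRNA-seq toolkits:
# flag cells whose library size or number of detected genes is too low.

counts = {                       # cell -> per-gene read counts (toy data)
    "cell_1": [5, 3, 0, 8, 2],
    "cell_2": [0, 0, 1, 0, 0],   # low quality: almost nothing detected
    "cell_3": [7, 9, 4, 6, 3],
}

def qc_pass(gene_counts, min_total=5, min_detected=3):
    """Keep a cell only if total counts and detected genes clear thresholds."""
    total = sum(gene_counts)
    detected = sum(1 for c in gene_counts if c > 0)
    return total >= min_total and detected >= min_detected

passed = [cell for cell, genes in counts.items() if qc_pass(genes)]
assert passed == ["cell_1", "cell_3"]
```

Real pipelines compute many more metrics (mitochondrial fraction, spike-in proportions) and often set thresholds adaptively, but the filtering principle is the same.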

  19. Quality Control of Structural MRI Images Applied Using FreeSurfer—A Hands-On Workflow to Rate Motion Artifacts

    PubMed Central

    Backhausen, Lea L.; Herting, Megan M.; Buse, Judith; Roessner, Veit; Smolka, Michael N.; Vetter, Nora C.

    2016-01-01

    In structural magnetic resonance imaging motion artifacts are common, especially when not scanning healthy young adults. It has been shown that motion affects the analysis with automated image-processing techniques (e.g., FreeSurfer). This can bias results. Several developmental and adult studies have found reduced volume and thickness of gray matter due to motion artifacts. Thus, quality control is necessary in order to ensure an acceptable level of quality and to define exclusion criteria of images (i.e., determine participants with most severe artifacts). However, information about the quality control workflow and image exclusion procedure is largely lacking in the current literature and the existing rating systems differ. Here, we propose a stringent workflow of quality control steps during and after acquisition of T1-weighted images, which enables researchers dealing with populations that are typically affected by motion artifacts to enhance data quality and maximize sample sizes. As an underlying aim we established a thorough quality control rating system for T1-weighted images and applied it to the analysis of developmental clinical data using the automated processing pipeline FreeSurfer. This hands-on workflow and quality control rating system will aid researchers in minimizing motion artifacts in the final data set, and therefore enhance the quality of structural magnetic resonance imaging studies. PMID:27999528

  20. [Application of quality by design in granulation process for Ginkgo leaf tablet (Ⅲ): process control strategy based on design space].

    PubMed

    Cui, Xiang-Long; Xu, Bing; Sun, Fei; Dai, Sheng-Yun; Shi, Xin-Yuan; Qiao, Yan-Jiang

    2017-03-01

    In this paper, under the guidance of the quality by design (QbD) concept, a control strategy for the high shear wet granulation process of the ginkgo leaf tablet based on the design space was established to improve process controllability and product quality consistency. The median granule size (D50) and bulk density (Da) of the granules were identified as critical quality attributes (CQAs), and potential critical process parameters (pCPPs) were determined by failure modes and effects analysis (FMEA). A Plackett-Burman experimental design was used to screen the pCPPs, and the results demonstrated that the binder amount, the wet massing time and the wet mixing impeller speed were critical process parameters (CPPs). The design space of the high shear wet granulation process was developed within the pCPP ranges based on a Box-Behnken design and quadratic polynomial regression models. ANOVA showed that the P-values of the models were less than 0.05 and the P-values of the lack-of-fit tests were more than 0.1, indicating that the relationship between CQAs and CPPs was well described by the mathematical models. D50 could be controlled within 170 to 500 μm, and the bulk density within 0.30 to 0.44 g·cm⁻³, by using any CPP combination within the design space. In addition, granules produced with process parameters inside the design space also met the tensile strength requirement of the ginkgo leaf tablet. Copyright© by the Chinese Pharmaceutical Association.
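The design-space logic above reduces to: a fitted quadratic model maps CPPs to a CQA, and a parameter combination lies inside the design space if the predicted CQA falls in its acceptable range (here D50 within 170-500 μm, as stated in the abstract). The response-surface coefficients below are invented for illustration and are not the paper's fitted model.

```python
# Hedged sketch of a design-space membership check against a hypothetical
# quadratic response surface (CPPs in coded units 0..1).

def predict_d50(binder, massing_time, impeller_speed):
    """Hypothetical quadratic model of median granule size (um)."""
    return (150 + 220 * binder + 90 * massing_time + 60 * impeller_speed
            - 80 * binder * massing_time + 40 * binder ** 2)

def in_design_space(binder, massing_time, impeller_speed, lo=170.0, hi=500.0):
    """Acceptable-range check on the predicted CQA, per the abstract's D50 spec."""
    return lo <= predict_d50(binder, massing_time, impeller_speed) <= hi

assert in_design_space(0.5, 0.5, 0.5)        # mid-range CPPs: predicted D50 = 325
assert not in_design_space(0.0, 0.0, 0.0)    # predicted D50 = 150, below range
```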

  1. Statistical Methods for Quality Control of Steel Coils Manufacturing Process using Generalized Linear Models

    NASA Astrophysics Data System (ADS)

    García-Díaz, J. Carlos

    2009-11-01

    Fault detection and diagnosis is an important problem in process engineering, since process equipment is subject to malfunctions during operation. Galvanized steel is a value-added product, furnishing effective performance by combining the corrosion resistance of zinc with the strength and formability of steel. Fault detection and diagnosis is particularly important in continuous hot dip galvanizing, and the increasingly stringent quality requirements of the automotive industry have demanded ongoing efforts in process control to make the process more robust. When faults occur, they change the relationships among the observed variables. This work compares different statistical regression models proposed in the literature for estimating the quality of galvanized steel coils on the basis of short time histories. Data for 26 batches were available. Five variables were selected for monitoring the process: the steel strip velocity, four bath temperatures, and the bath level. The entire data set of 48 galvanized steel coils was divided into two sets: a training set of 25 conforming coils and a second set of 23 nonconforming coils. Logistic regression is a modeling tool in which the dependent variable is categorical, most often binary. The results show that logistic generalized linear models provide good estimates of coil quality and can be useful for quality control in the manufacturing process.
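The modeling tool this record applies, a logistic (binary) GLM separating conforming from nonconforming coils, can be sketched on toy data. A single process variable and a plain gradient-descent fit stand in for the paper's five-variable models; data and learning rate are illustrative assumptions.

```python
# Hedged sketch: fit a one-variable logistic regression by gradient descent
# on the log-loss, then classify conforming (0) vs nonconforming (1) coils.
import math

# (process variable, label) with 1 = nonconforming; higher x -> more defects
data = [(0.0, 0), (0.5, 0), (1.0, 0), (2.0, 1), (2.5, 1), (3.0, 1)]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

w, b = 0.0, 0.0
for _ in range(5000):                      # gradient descent on log-loss
    gw = gb = 0.0
    for x, y in data:
        err = sigmoid(w * x + b) - y
        gw += err * x
        gb += err
    w -= 0.1 * gw
    b -= 0.1 * gb

predictions = [round(sigmoid(w * x + b)) for x, _ in data]
assert predictions == [y for _, y in data]   # toy data is cleanly separable
```

In practice one would use a GLM library with proper convergence criteria and validate on held-out coils, as the paper does with its training and nonconforming sets.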

  2. Tacit Quality Leadership: Operationalized Quality Perceptions as a Source of Influence in the American Higher Education Accreditation Process

    ERIC Educational Resources Information Center

    Saurbier, Ann L.

    2013-01-01

    American post-secondary education faces unprecedented challenges in the dynamic 21st century environment. An appreciation of the higher education accreditation process, as a quality control mechanism, therefore may be seen as a significant priority. When American higher education is viewed systemically, the perceptions of quality held and…

  3. A system identification approach for developing model predictive controllers of antibody quality attributes in cell culture processes

    PubMed Central

    Schmitt, John; Beller, Justin; Russell, Brian; Quach, Anthony; Hermann, Elizabeth; Lyon, David; Breit, Jeffrey

    2017-01-01

    As the biopharmaceutical industry evolves to include more diverse protein formats and processes, more robust control of Critical Quality Attributes (CQAs) is needed to maintain processing flexibility without compromising quality. Active control of CQAs has been demonstrated using model predictive control techniques, which allow development of processes which are robust against disturbances associated with raw material variability and other potentially flexible operating conditions. Wide adoption of model predictive control in biopharmaceutical cell culture processes has been hampered, however, in part due to the large amount of data and expertise required to make a predictive model of controlled CQAs, a requirement for model predictive control. Here we developed a highly automated, perfusion apparatus to systematically and efficiently generate predictive models using application of system identification approaches. We successfully created a predictive model of %galactosylation using data obtained by manipulating galactose concentration in the perfusion apparatus in serialized step change experiments. We then demonstrated the use of the model in a model predictive controller in a simulated control scenario to successfully achieve a %galactosylation set point in a simulated fed‐batch culture. The automated model identification approach demonstrated here can potentially be generalized to many CQAs, and could be a more efficient, faster, and highly automated alternative to batch experiments for developing predictive models in cell culture processes, and allow the wider adoption of model predictive control in biopharmaceutical processes. © 2017 The Authors Biotechnology Progress published by Wiley Periodicals, Inc. on behalf of American Institute of Chemical Engineers Biotechnol. Prog., 33:1647–1661, 2017 PMID:28786215
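The system-identification step described above, using step changes in galactose concentration to build a predictive model of %galactosylation, can be sketched with a static linear fit: estimate the input/output gain from step-response data, then invert the model to choose the input that should reach a set point. The linear static model and all numbers are illustrative stand-ins for the paper's identified dynamic model.

```python
# Hedged sketch of model identification from step experiments plus a
# model-based input choice (the simplest possible "predictive control").

# (galactose concentration, observed %galactosylation) from step experiments
steps = [(0.0, 10.0), (2.0, 14.0), (4.0, 18.0), (6.0, 22.0)]

n = len(steps)
mean_u = sum(u for u, _ in steps) / n
mean_y = sum(y for _, y in steps) / n
gain = (sum((u - mean_u) * (y - mean_y) for u, y in steps)
        / sum((u - mean_u) ** 2 for u, _ in steps))   # least-squares slope
offset = mean_y - gain * mean_u

def input_for(target):
    """Invert the identified model to pick the next galactose setpoint."""
    return (target - offset) / gain

assert abs(gain - 2.0) < 1e-9 and abs(offset - 10.0) < 1e-9
assert abs(input_for(16.0) - 3.0) < 1e-9
```

A real MPC formulation optimizes a trajectory of future inputs against a dynamic model with constraints; the gain-inversion above only conveys the identify-then-invert idea.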

  4. Application of Statistical Quality Control Techniques to Detonator Fabrication: Feasibility Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jones, J. Frank

    1971-05-20

    A feasibility study was performed on the use of process control techniques that might reduce the need for duplicate inspection by production inspection and quality control inspection. Two active detonator fabrication programs were selected for the study. Inspection areas accounting for the greatest percentage of total inspection costs were selected by applying "Pareto's Principle of Maldistribution." Data from these areas were then gathered and analyzed by a process capability study.

  5. A method for evaluating treatment quality using in vivo EPID dosimetry and statistical process control in radiation therapy.

    PubMed

    Fuangrod, Todsaporn; Greer, Peter B; Simpson, John; Zwan, Benjamin J; Middleton, Richard H

    2017-03-13

    Purpose: Due to increasing complexity, modern radiotherapy techniques require comprehensive quality assurance (QA) programmes, which to date have generally focused on the pre-treatment stage. The purpose of this paper is to provide a method for individual patient treatment QA evaluation and identification of a "quality gap" for continuous quality improvement. Design/methodology/approach: Statistical process control (SPC) was applied to evaluate treatment delivery using in vivo electronic portal imaging device (EPID) dosimetry. A moving range control chart was constructed to monitor individual patient treatment performance based on a control limit generated from initial data of 90 intensity-modulated radiotherapy (IMRT) and ten volumetric-modulated arc therapy (VMAT) patient deliveries. A process capability index was used to evaluate continuing treatment quality in three quality classes: treatment type-specific, treatment linac-specific, and body site-specific. Findings: The determined control limits were 62.5 and 70.0 per cent of the χ pass-rate for IMRT and VMAT deliveries, respectively. In total, 14 patients were selected for a pilot study, the results of which showed that about 1 per cent of all treatments contained errors relating to unexpected anatomical changes between treatment fractions. Both rectum and pelvis cancer treatments demonstrated process capability indices of less than 1, indicating potential for quality improvement, and hence may benefit from further assessment. Research limitations/implications: The study relied on the application of in vivo EPID dosimetry for patients treated at a specific centre, and the sample used to generate the control limits was limited to 100 patients. Whilst the quantitative results are specific to the clinical techniques and equipment used, the described method is generally applicable to IMRT and VMAT treatment QA. Whilst more work is required to determine the level of clinical significance, the authors have demonstrated the capability of the method for both treatment-specific QA and continuing quality improvement. Practical implications: The proposed method is a valuable tool for assessing the accuracy of treatment delivery whilst also improving treatment quality and patient safety. Originality/value: Assessing in vivo EPID dosimetry with SPC can be used to improve the quality of radiation treatment for cancer patients.
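The two SPC tools this record applies, a moving-range control limit on per-delivery pass-rates and a process capability index, can be sketched generically. The pass-rate data, the lower specification, and the one-sided index below are illustrative; the paper's own limits (62.5/70.0 per cent χ pass-rate) came from its 100-patient sample.

```python
# Hedged sketch: individuals/moving-range lower control limit and a
# one-sided Cpk-style capability index on toy per-fraction pass-rates.
import statistics

pass_rates = [96.0, 94.5, 95.2, 93.8, 96.5, 94.0, 95.8, 94.9]

moving_ranges = [abs(a - b) for a, b in zip(pass_rates, pass_rates[1:])]
mr_bar = sum(moving_ranges) / len(moving_ranges)
center = sum(pass_rates) / len(pass_rates)
lcl = center - 2.66 * mr_bar          # standard individuals-chart constant

def capability(values, lower_spec):
    """One-sided Cpk-style index: (mean - LSL) / (3 * sigma)."""
    mu = sum(values) / len(values)
    sigma = statistics.stdev(values)
    return (mu - lower_spec) / (3 * sigma)

assert all(x > lcl for x in pass_rates)       # in control on this toy data
assert capability(pass_rates, 90.0) > 1.0     # capable against a 90% floor
```

An index below 1, as the abstract reports for rectum and pelvis treatments, signals that the process spread is too wide for the specification and flags the class for further assessment.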

  6. Fermentanomics: Relating quality attributes of a monoclonal antibody to cell culture process variables and raw materials using multivariate data analysis.

    PubMed

    Rathore, Anurag S; Kumar Singh, Sumit; Pathak, Mili; Read, Erik K; Brorson, Kurt A; Agarabi, Cyrus D; Khan, Mansoor

    2015-01-01

    Fermentanomics is an emerging field of research and involves understanding the underlying controlled process variables and their effect on process yield and product quality. Although major advancements have occurred in process analytics over the past two decades, accurate real-time measurement of significant quality attributes for a biotech product during production culture is still not feasible. Researchers have used an amalgam of process models and analytical measurements for monitoring and process control during production. This article focuses on using multivariate data analysis as a tool for monitoring the internal bioreactor dynamics, the metabolic state of the cell, and interactions among them during culture. Quality attributes of the monoclonal antibody product that were monitored include glycosylation profile of the final product along with process attributes, such as viable cell density and level of antibody expression. These were related to process variables, raw materials components of the chemically defined hybridoma media, concentration of metabolites formed during the course of the culture, aeration-related parameters, and supplemented raw materials such as glucose, methionine, threonine, tryptophan, and tyrosine. This article demonstrates the utility of multivariate data analysis for correlating the product quality attributes (especially glycosylation) to process variables and raw materials (especially amino acid supplements in cell culture media). The proposed approach can be applied for process optimization to increase product expression, improve consistency of product quality, and target the desired quality attribute profile. © 2015 American Institute of Chemical Engineers.
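The core of the multivariate-analysis step described above is hunting for relationships between process variables or raw-material supplements and a quality attribute across batches. Real studies use PLS/PCA on many variables at once; a plain Pearson correlation on invented batch data illustrates the simplest form of that relationship hunt. The variable names and values are assumptions.

```python
# Hedged sketch: correlate a hypothetical supplement level with a
# hypothetical quality attribute across batches.
import math

supplement_level = [4.0, 5.0, 6.0, 7.0, 8.0]      # per-batch feed level
quality_attribute = [12.1, 13.0, 14.2, 14.9, 16.0]  # per-batch measurement

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

r = pearson(supplement_level, quality_attribute)
assert r > 0.99    # strongly correlated on this toy data
```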

  7. Real-time feedback control of twin-screw wet granulation based on image analysis.

    PubMed

    Madarász, Lajos; Nagy, Zsombor Kristóf; Hoffer, István; Szabó, Barnabás; Csontos, István; Pataki, Hajnalka; Démuth, Balázs; Szabó, Bence; Csorba, Kristóf; Marosi, György

    2018-06-04

    The present paper reports the first dynamic image analysis-based feedback control of a continuous twin-screw wet granulation process. Granulation of a blend of lactose and starch was selected as the model process. The size and size distribution of the obtained particles were successfully monitored by a process camera coupled with image analysis software developed by the authors. Validation of the developed system showed that the particle size analysis tool can determine the size of the granules with an error of less than 5 µm. The next step was to implement real-time feedback control of the process by adjusting the liquid feeding rate of the pump through a PC, based on the particle size results determined in real time. Once the feedback control was established, the system could correct various real-life disturbances, creating a Process Analytically Controlled Technology (PACT) that guarantees real-time monitoring and control of granule quality. In the event of changes or adverse trends in the particle size, the system can automatically compensate for the effect of disturbances, ensuring proper product quality. This kind of quality assurance approach is especially important for continuous pharmaceutical technologies. Copyright © 2018 Elsevier B.V. All rights reserved.
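The feedback loop this record describes, comparing the camera's real-time granule size with a setpoint and nudging the liquid pump rate, can be sketched with a proportional controller against a toy plant model. The plant relation, gain, and setpoint below are illustrative assumptions, not the paper's process.

```python
# Hedged sketch of the camera-to-pump feedback loop as a simple P controller.

def plant(liquid_rate):
    """Toy granulator: more granulation liquid -> larger granules (um)."""
    return 100.0 + 40.0 * liquid_rate

def control_step(measured, setpoint, rate, kp=0.01):
    """Proportional update of the pump's liquid feed rate."""
    return rate + kp * (setpoint - measured)

setpoint, rate = 300.0, 2.0          # target median size (um), initial rate
for _ in range(200):                 # closed loop: measure, then adjust pump
    size = plant(rate)
    rate = control_step(size, setpoint, rate)

assert abs(plant(rate) - setpoint) < 1.0   # loop settles near the setpoint
```

A disturbance (e.g. a shift in the plant offset) would be rejected the same way: the measured size drifts off the setpoint and the integrating effect of repeated proportional corrections walks the pump rate back.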

  8. Software Quality Assurance and Controls Standard

    DTIC Science & Technology

    2010-04-27

    Software Quality Assurance and Controls Standard. Sue Carroll, Principal Software Quality Analyst, SAS; John Walz, VP Technology and… • What is in a Software Life Cycle (SLC) process? • What is in a SQA process? • Where are SQA controls? • What is the SQA standards history? • What is changing in SQA?

  9. Research to Assembly Scheme for Satellite Deck Based on Robot Flexibility Control Principle

    NASA Astrophysics Data System (ADS)

    Guo, Tao; Hu, Ruiqin; Xiao, Zhengyi; Zhao, Jingjing; Fang, Zhikai

    2018-03-01

    Deck assembly is a critical quality control point in the final satellite assembly process, and cable extrusion and structure collision problems during assembly directly affect the development quality and schedule of the satellite. To address the problems in the deck assembly process, an assembly scheme for satellite decks based on the robot flexibility control principle is proposed in this paper. The scheme is introduced first; secondly, its key technologies of end force perception and flexible docking control are studied; then, the implementation process of the assembly scheme is described in detail; finally, an actual application case is given. Results show that, compared with the traditional assembly scheme, the proposed scheme has obvious advantages in work efficiency, reliability, universality and other aspects.

  10. Instrument Quality Control.

    PubMed

    Jayakody, Chatura; Hull-Ryde, Emily A

    2016-01-01

    Well-defined quality control (QC) processes are used to determine whether a certain procedure or action conforms to a widely accepted standard and/or set of guidelines, and are important components of any laboratory quality assurance program (Popa-Burke et al., J Biomol Screen 14: 1017-1030, 2009). In this chapter, we describe QC procedures useful for monitoring the accuracy and precision of laboratory instrumentation, most notably automated liquid dispensers. Two techniques, gravimetric QC and photometric QC, are highlighted in this chapter. When used together, these simple techniques provide a robust process for evaluating liquid handler accuracy and precision, and critically underpin high-quality research programs.

  11. 40 CFR 455.21 - Specialized definitions.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... pollution control blowdown, steam jet blowdown, vacuum pump water, pump seal water, safety equipment.../process laboratory quality control wastewater. Notwithstanding any other regulation, process wastewater...

  12. 40 CFR 455.21 - Specialized definitions.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... pollution control blowdown, steam jet blowdown, vacuum pump water, pump seal water, safety equipment.../process laboratory quality control wastewater. Notwithstanding any other regulation, process wastewater...

  13. 40 CFR 455.21 - Specialized definitions.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... pollution control blowdown, steam jet blowdown, vacuum pump water, pump seal water, safety equipment.../process laboratory quality control wastewater. Notwithstanding any other regulation, process wastewater...

  14. The relationship between health related quality of life and sensory deficits among patients with diabetes mellitus.

    PubMed

    Engel-Yeger, Batya; Darawsha Najjar, Sanaa; Darawsha, Mahmud

    2017-08-13

    (1) To profile sensory deficits, examined as the ability to process sensory information from the daily environment and to discriminate between tactile stimuli, among patients with controlled and uncontrolled diabetes mellitus. (2) To examine the relationship between these sensory deficits and patients' health-related quality of life. This study included 115 participants aged 33-55 with uncontrolled (n = 22) or controlled (n = 24) glycemic levels together with healthy subjects (n = 69). All participants completed the brief World Health Organization Quality of Life Questionnaire and the Adolescent/Adult Sensory Profile and performed the tactile discrimination test. Sensory deficits were more pronounced among patients with uncontrolled glycemic levels, expressed as difficulties in registering sensory input, lower sensation seeking in daily environments and difficulties in discriminating between tactile stimuli. These patients also reported the lowest physical and social quality of life of the three groups. Better sensory seeking and registration predicted better quality of life, with disease control and duration contributing to these predictions. Difficulties in processing sensory information from the daily environment are particularly prevalent among patients with uncontrolled glycemic levels and significantly impact their quality of life. Clinicians should screen for sensory processing difficulties among patients with diabetes mellitus and understand their impact on patients' quality of life. Implications for Rehabilitation: Patients with diabetes mellitus, and particularly those with uncontrolled glycemic levels, may have difficulties in processing sensory information from the daily environment. A multidisciplinary intervention approach is recommended: clinicians should screen for sensory processing deficits among patients with diabetes mellitus and understand their impact on patients' daily life.
By providing the patients with environmental adaptations and coping strategies, clinicians may assist in optimizing sensory experiences in real life context and elevate patients' quality of life. Relating to quality of life and emphasizing a multidisciplinary approach is of major importance in broadening our understanding of health conditions and providing holistic treatment for patients.

  15. A new hyperspectral imaging based device for quality control in plastic recycling

    NASA Astrophysics Data System (ADS)

    Bonifazi, G.; D'Agostini, M.; Dall'Ava, A.; Serranti, S.; Turioni, F.

    2013-05-01

    The quality control of the contamination level in recycled plastic streams has been identified as a key factor for increasing the value of the recycled material by both the plastic recycling and compounding industries. Existing quality control methods for detecting plastic and non-plastic contaminants in plastic waste streams at different stages of the industrial process (e.g. feed, intermediate and final products) are currently based on the manual collection of a sample from the stream and subsequent off-line laboratory analyses. The results of such analyses are usually available only hours, or sometimes days, after the material has been processed, and the laboratory analyses are time-consuming and expensive, both in terms of equipment cost and maintenance and of labour cost. Therefore, a fast on-line assessment to monitor plastic waste feed streams and to characterize the composition of the different plastic products is fundamental to increasing the value of secondary plastics. This paper describes and evaluates the development of an HSI-based device and of the related software architectures and processing algorithms for quality assessment of plastics in recycling plants, with particular reference to polyolefins (PO). NIR-HSI sensing devices coupled with multivariate data analysis methods were demonstrated to be an objective, rapid and non-destructive technique that can be used for on-line quality and process control in the recycling of POs. In particular, the adoption of the previously mentioned integrated hardware and software architectures can provide a solution to one of the major problems of the recycling industry: the lack of accurate quality certification of materials obtained by recycling processes. These results could therefore assist in developing strategies to certify the composition of recycled PO products.

  16. A senior manufacturing laboratory for determining injection molding process capability

    NASA Technical Reports Server (NTRS)

    Wickman, Jerry L.; Plocinski, David

    1992-01-01

    The following is a laboratory experiment designed to further understanding of materials science. This subject material is directed at an upper level undergraduate/graduate student in an Engineering or Engineering Technology program. It is assumed that the student has a thorough understanding of the process and quality control. The format of this laboratory does not follow that which is normally recommended because of the nature of process capability and that of the injection molding equipment and tooling. This laboratory is instead developed to be used as a point of departure for determining process capability for any process in either a quality control laboratory or a manufacturing environment where control charts, process capability, and experimental or product design are considered important topics.
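The process capability determination that this laboratory is built around can be illustrated with a short calculation. The sketch below is not part of the laboratory materials; it is a minimal Python illustration of the standard Cp and Cpk indices, using hypothetical part-weight data and specification limits.

```python
import statistics

def process_capability(samples, lsl, usl):
    """Estimate Cp and Cpk from measured samples and spec limits (LSL/USL)."""
    mean = statistics.fmean(samples)
    s = statistics.stdev(samples)          # sample standard deviation
    cp = (usl - lsl) / (6 * s)             # spec width vs. process spread
    cpk = min(usl - mean, mean - lsl) / (3 * s)  # penalizes off-center processes
    return cp, cpk

# Hypothetical part weights (g) from an injection molding run
weights = [10.02, 9.98, 10.05, 9.97, 10.01, 10.03, 9.99, 10.00]
cp, cpk = process_capability(weights, lsl=9.85, usl=10.15)
```

Cp compares the specification width to the natural process spread; Cpk additionally accounts for centering, so Cpk ≤ Cp always, with equality only for a perfectly centered process.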

  17. Application of a tablet film coating model to define a process-imposed transition boundary for robust film coating.

    PubMed

    van den Ban, Sander; Pitt, Kendal G; Whiteman, Marshall

    2018-02-01

    A scientific understanding of interaction of product, film coat, film coating process, and equipment is important to enable design and operation of industrial scale pharmaceutical film coating processes that are robust and provide the level of control required to consistently deliver quality film coated product. Thermodynamic film coating conditions provided in the tablet film coating process impact film coat formation and subsequent product quality. A thermodynamic film coating model was used to evaluate film coating process performance over a wide range of film coating equipment from pilot to industrial scale (2.5-400 kg). An approximate process-imposed transition boundary, from operating in a dry to a wet environment, was derived, for relative humidity and exhaust temperature, and used to understand the impact of the film coating process on product formulation and process control requirements. This approximate transition boundary may aid in an enhanced understanding of risk to product quality, application of modern Quality by Design (QbD) based product development, technology transfer and scale-up, and support the science-based justification of critical process parameters (CPPs).

  18. SB certification for mixture-based specification for flexible base.

    DOT National Transportation Integrated Search

    2012-10-01

Presentation topics: Establish List of Qualified Producers; Producers Responsible for Process Control/Quality Control; Reduce TxDOT Sampling and Testing; Expedite Aggregate Base Acceptance; Share Responsibility (Producer/TxDOT) for Quality ...

  19. Quality engineering as a profession.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kolb, Rachel R.; Hoover, Marcey L.

Over the course of time, the profession of quality engineering has witnessed significant change, from its original emphasis on quality control and inspection to a more contemporary focus on upholding quality processes throughout the organization and its product realization activities. This paper describes the profession of quality engineering, exploring how today's quality engineers and quality professionals are certified individuals committed to upholding quality processes and principles while working with different dimensions of product development. It also discusses the future of the quality engineering profession and the future of the quality movement as a whole.

  20. Transforming nanomedicine manufacturing toward Quality by Design and microfluidics.

    PubMed

    Colombo, Stefano; Beck-Broichsitter, Moritz; Bøtker, Johan Peter; Malmsten, Martin; Rantanen, Jukka; Bohr, Adam

    2018-04-05

    Nanopharmaceuticals aim at translating the unique features of nano-scale materials into therapeutic products and consequently their development relies critically on the progression in manufacturing technology to allow scalable processes complying with process economy and quality assurance. The relatively high failure rate in translational nanopharmaceutical research and development, with respect to new products on the market, is at least partly due to immature bottom-up manufacturing development and resulting sub-optimal control of quality attributes in nanopharmaceuticals. Recently, quality-oriented manufacturing of pharmaceuticals has undergone an unprecedented change toward process and product development interaction. In this context, Quality by Design (QbD) aims to integrate product and process development resulting in an increased number of product applications to regulatory agencies and stronger proprietary defense strategies of process-based products. Although QbD can be applied to essentially any production approach, microfluidic production offers particular opportunities for QbD-based manufacturing of nanopharmaceuticals. Microfluidics provides unique design flexibility, process control and parameter predictability, and also offers ample opportunities for modular production setups, allowing process feedback for continuously operating production and process control. The present review aims at outlining emerging opportunities in the synergistic implementation of QbD strategies and microfluidic production in contemporary development and manufacturing of nanopharmaceuticals. In doing so, aspects of design and development, but also technology management, are reviewed, as is the strategic role of these tools for aligning nanopharmaceutical innovation, development, and advanced industrialization in the broader pharmaceutical field. Copyright © 2018 Elsevier B.V. All rights reserved.

  1. A Controlled Agitation Process for Improving Quality of Canned Green Beans during Agitation Thermal Processing.

    PubMed

    Singh, Anika; Pratap Singh, Anubhav; Ramaswamy, Hosahalli S

    2016-06-01

This work introduces the concept of a controlled agitation thermal process to reduce quality damage in liquid-particulate products during agitation thermal processing. Reciprocating agitation thermal processing (RA-TP) was used as the agitation thermal process. In order to reduce the impact of agitation, a new concept of "stopping agitation after sufficient development of cold-spot temperature" was proposed. Green beans were processed in No. 2 (307×409) cans filled with liquids of various consistency (0% to 2% CMC) at various frequencies (1 to 3 Hz) of RA-TP using a full-factorial design, and heat penetration results were collected. The corresponding operator's process time to impart a 10-min process lethality (Fo) and agitation time (AT) were calculated from the heat penetration results. Accordingly, products were processed again by stopping agitation according to 3 agitation regimes, namely full-time agitation, equilibration-time agitation, and partial-time agitation. Processed products were photographed and tested for visual quality, color, texture, breakage of green beans, turbidity, and percentage of insoluble solids in the can liquid. Results showed that stopping agitation after sufficient development of cold-spot temperatures is an effective way of reducing product damage caused by agitation (for example, breakage of beans and leaching into the liquid). Agitation until a one-log temperature difference gave the best color, texture, and visual product quality for low-viscosity liquid-particulate mixtures, while extended agitation until equilibration time was best for high-viscosity products. Thus, it was shown that a controlled agitation thermal process is more effective in obtaining high product quality than a regular agitation thermal process. © 2016 Institute of Food Technologists®
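The process lethality (Fo) referred to above is conventionally obtained by integrating the lethal rate over the cold-spot time-temperature history. The sketch below is a generic illustration of that standard calculation (reference temperature 121.1 °C, z = 10 °C, trapezoidal integration), using a hypothetical temperature profile rather than data from this study.

```python
def process_lethality(times_min, temps_c, t_ref=121.1, z=10.0):
    """Integrate the lethal rate L = 10**((T - Tref)/z) over time (trapezoid rule)."""
    fo = 0.0
    for i in range(1, len(times_min)):
        l_prev = 10 ** ((temps_c[i - 1] - t_ref) / z)
        l_curr = 10 ** ((temps_c[i] - t_ref) / z)
        fo += 0.5 * (l_prev + l_curr) * (times_min[i] - times_min[i - 1])
    return fo  # minutes of equivalent lethality at t_ref

# Hypothetical cold-spot history: (minutes, °C) sampled during come-up, hold, cooling
times = [0, 5, 10, 15, 20, 25]
temps = [80, 105, 118, 121, 121, 100]
fo = process_lethality(times, temps)
```

Note that temperatures well below the reference contribute almost nothing to Fo, which is why the cold-spot come-up period matters so much for total process time.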

  2. Developments in Signature Process Control

    NASA Astrophysics Data System (ADS)

    Keller, L. B.; Dominski, Marty

    1993-01-01

Developments in the adaptive process control technique known as Signature Process Control for Advanced Composites (SPCC) are described. This computer control method for autoclave processing of composites was used to develop an optimum cure cycle for AFR 700B polyimide and for an experimental poly-isoimide. An improved process cycle was developed for Avimid N polyimide. The potential for extending the SPCC technique to prepreg quality control, press molding, pultrusion and RTM is briefly discussed.

  3. Quality Audit in the Fastener Industry

    NASA Technical Reports Server (NTRS)

    Reagan, John R.

    1995-01-01

Both the financial and quality communities rely on audits to verify customers' records. The financial community is highly structured around three categories of risk: INHERENT RISK, CONTROL RISK, and DETECTION RISK. Combined, the product of these three categories constitutes the AUDIT RISK. The financial community establishes CONTROL RISK based in large part on a systems-level understanding of the process flow. This systems-level understanding is best expressed in a flowchart. The quality community may be able to adopt this structure and thereby reduce cost while maintaining and enhancing quality. The quality community should attempt to flowchart the systems-level quality process before beginning substantive testing. This theory needs to be applied in several trial cases to prove or disprove this hypothesis.

  4. Quality audit in the fastener industry

    NASA Astrophysics Data System (ADS)

    Reagan, John R.

    1995-09-01

Both the financial and quality communities rely on audits to verify customers' records. The financial community is highly structured around three categories of risk: INHERENT RISK, CONTROL RISK, and DETECTION RISK. Combined, the product of these three categories constitutes the AUDIT RISK. The financial community establishes CONTROL RISK based in large part on a systems-level understanding of the process flow. This systems-level understanding is best expressed in a flowchart. The quality community may be able to adopt this structure and thereby reduce cost while maintaining and enhancing quality. The quality community should attempt to flowchart the systems-level quality process before beginning substantive testing. This theory needs to be applied in several trial cases to prove or disprove this hypothesis.

  5. Toward Higher QA: From Parametric Release of Sterile Parenteral Products to PAT for Other Pharmaceutical Dosage Forms.

    PubMed

    Hock, Sia Chong; Constance, Neo Xue Rui; Wah, Chan Lai

    2012-01-01

Pharmaceutical products are generally subjected to end-product batch testing as a means of quality control. Due to the inherent limitations of conventional batch testing, this is not the most ideal approach for determining the pharmaceutical quality of the finished dosage form. In the case of terminally sterilized parenteral products, the limitations of conventional batch testing have been successfully addressed with the application of parametric release (the release of a product based on control of process parameters instead of batch sterility testing at the end of the manufacturing process). Consequently, there has been an increasing interest in applying parametric release to other pharmaceutical dosage forms, beyond terminally sterilized parenteral products. For parametric release to be possible, manufacturers must be capable of designing quality into the product, monitoring the manufacturing processes, and controlling the quality of intermediates and finished products in real-time. Process analytical technology (PAT) has been thought to be capable of contributing to these prerequisites. It is believed that the appropriate use of PAT tools can eventually lead to the possibility of real-time release of other pharmaceutical dosage forms, by-passing the need for end-product batch testing. Hence, this literature review attempts to present the basic principles of PAT, introduce the various PAT tools that are currently available, present their recent applications to pharmaceutical processing, and explain the potential benefits that PAT can bring to conventional ways of processing and quality assurance of pharmaceutical products. Last but not least, current regulations governing the use of PAT and the manufacturing challenges associated with PAT implementation are also discussed.

  6. Quality Management and Control of Low Pressure Cast Aluminum Alloy

    NASA Astrophysics Data System (ADS)

    Zhang, Dianxi; Zhang, Yanbo; Yang, Xiufan; Chen, Zhaosong; Jiang, Zelan

    2018-01-01

This paper briefly reviews the history of low pressure casting and summarizes its major production processes. It briefly introduces the quality management and control of low pressure cast aluminum alloy. The main processes include: preparation of raw materials, melting, refining, physical and chemical analysis, K-mold inspection, sand cores, molds, heat treatment, and so on.

  7. Cluster randomized trial assessing the effects of rapid ethical assessment on informed consent comprehension in a low-resource setting.

    PubMed

    Addissie, Adamu; Abay, Serebe; Feleke, Yeweyenhareg; Newport, Melanie; Farsides, Bobbie; Davey, Gail

    2016-07-12

Maximizing comprehension is a major challenge for informed consent processes in low-literacy and resource-limited settings. Application of rapid qualitative assessments to improve the informed consent process is increasingly considered useful. This study assessed the effects of Rapid Ethical Assessment (REA) on comprehension, retention and quality of the informed consent process. A cluster randomized trial was conducted among participants of an HPV sero-prevalence study in two districts of Northern Ethiopia in 2013. A total of 300 study participants, 150 in the intervention and 150 in the control group, were included in the study. For the intervention group, the informed consent process was designed with further revisions based on REA findings. Informed consent comprehension levels and quality of the consent process were measured using the Modular Informed Consent Comprehension Assessment (MICCA) and Quality of Informed Consent (QuIC) process assessment tools, respectively. Study recruitment rates were 88.7 % and 80.7 % (p = 0.05), while study retention rates were 85.7 % and 70.3 % (p < 0.005) for the intervention and control groups, respectively. Overall, the mean informed consent comprehension scores for the intervention and control groups were 73.1 % and 45.2 %, respectively, with a mean difference in comprehension score of 27.9 % (95 % CI 24.0 % - 33.4 %; p < 0.001). Mean scores for quality of informed consent for the intervention and control groups were 89.1 % and 78.5 %, respectively, with a mean difference of 10.5 % (95 % CI 6.8 % - 14.2 %; p < 0.001). Levels of informed consent comprehension, quality of the consent process, and study recruitment and retention rates were significantly improved in the intervention group. We recommend REA as a potential modality to improve informed consent comprehension and the quality of the informed consent process in low-resource settings.
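Group differences like the retention rates reported above are typically accompanied by a Wald-type confidence interval. The sketch below illustrates the standard two-proportion calculation; the counts are illustrative values rounded from the reported percentages (85.7 % and 70.3 % of 150), not the authors' analysis code or raw data.

```python
from math import sqrt

def two_proportion_ci(x1, n1, x2, n2, z=1.96):
    """Wald 95% confidence interval for the difference of two proportions."""
    p1, p2 = x1 / n1, x2 / n2
    diff = p1 - p2
    se = sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)  # standard error of the difference
    return diff - z * se, diff + z * se

# Retention: roughly 129/150 (intervention) vs 105/150 (control)
lo, hi = two_proportion_ci(129, 150, 105, 150)
```

An interval that excludes zero, as here, is consistent with the significant improvement the authors report.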

  8. 75 FR 60013 - Approval and Promulgation of Air Quality Implementation Plans; Maryland; Control of Volatile...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-29

    ... Promulgation of Air Quality Implementation Plans; Maryland; Control of Volatile Organic Compounds Emissions... Maryland's Volatile Organic Compounds from Specific Processes Regulation. Maryland has adopted standards... (RACT) requirements for sources of volatile organic compounds (VOCs) covered by control techniques...

  9. Total Quality Management (TQM), an Overview

    DTIC Science & Technology

    1991-09-01

    Quality Management (TQM). It discusses the reasons TQM is a current growth industry, what it is, and how one implements it. It describes the basic analytical tools, statistical process control, some advanced analytical tools, tools used by process improvement teams to enhance their own operations, and action plans for making improvements. The final sections discuss assessing quality efforts and measuring the quality to knowledge

  10. Honing process optimization algorithms

    NASA Astrophysics Data System (ADS)

    Kadyrov, Ramil R.; Charikov, Pavel N.; Pryanichnikova, Valeria V.

    2018-03-01

    This article considers the relevance of honing processes for creating high-quality mechanical engineering products. The features of the honing process are revealed and such important concepts as the task for optimization of honing operations, the optimal structure of the honing working cycles, stepped and stepless honing cycles, simulation of processing and its purpose are emphasized. It is noted that the reliability of the mathematical model determines the quality parameters of the honing process control. An algorithm for continuous control of the honing process is proposed. The process model reliably describes the machining of a workpiece in a sufficiently wide area and can be used to operate the CNC machine CC743.

  11. Applications of optical sensing for laser cutting and drilling.

    PubMed

    Fox, Mahlen D T; French, Paul; Peters, Chris; Hand, Duncan P; Jones, Julian D C

    2002-08-20

    Any reliable automated production system must include process control and monitoring techniques. Two laser processing techniques potentially lending themselves to automation are percussion drilling and cutting. For drilling we investigate the performance of a modification of a nonintrusive optical focus control system we previously developed for laser welding, which exploits the chromatic aberrations of the processing optics to determine focal error. We further developed this focus control system for closed-loop control of laser cutting. We show that an extension of the technique can detect deterioration in cut quality, and we describe practical trials carried out on different materials using both oxygen and nitrogen assist gas. We base our techniques on monitoring the light generated by the process, captured nonintrusively by the effector optics and processed remotely from the workpiece. We describe the relationship between the temporal and the chromatic modulation of the detected light and process quality and show how the information can be used as the basis of a process control system.

  12. Quality Risk Management: Putting GMP Controls First.

    PubMed

    O'Donnell, Kevin; Greene, Anne; Zwitkovits, Michael; Calnan, Nuala

    2012-01-01

This paper presents a practical way in which current approaches to quality risk management (QRM) may be improved, such that they better support qualification, validation programs, and change control proposals at manufacturing sites. The paper is focused on the treatment of good manufacturing practice (GMP) controls during QRM exercises. It specifically addresses why it is important to evaluate and classify such controls in terms of how they affect the severity, probability of occurrence, and detection ratings that may be assigned to potential failure modes or negative events. It also presents a QRM process that is designed to directly link the outputs of risk assessments and risk control activities with qualification and validation protocols in the GMP environment. This paper concerns the need for improvement in the use of risk-based principles and tools when working to ensure that the manufacturing processes used to produce medicines, and their related equipment, are appropriate. Manufacturing processes need to be validated (or proven) to demonstrate that they can produce a medicine of the required quality. The items of equipment used in such processes need to be qualified, in order to prove that they are fit for their intended use. Quality risk management (QRM) tools can be used to support such qualification and validation activities, but their use should be science-based and subject to as little subjectivity and uncertainty as possible. When changes are proposed to manufacturing processes, equipment, or related activities, they also need careful evaluation to ensure that any risks present are managed effectively. This paper presents a practical approach to how QRM may be improved so that it better supports qualification, validation programs, and change control proposals in a more scientific way. This improved approach is based on the treatment of what are called good manufacturing practice (GMP) controls during those QRM exercises.
A GMP control can be considered to be any control that is put in place to assure product quality and regulatory compliance. This improved approach is also based on how the detectability of risks is assessed. This is important because when producing medicines, it is not always good practice to place a high reliance upon detection-type controls in the absence of an adequate level of assurance in the manufacturing process that leads to the finished medicine.

  13. [Post-marketing reevaluation for potential quality risk and quality control in clinical application of traditional Chinese medicines].

    PubMed

    Li, Hong-jiao; He, Li-yun; Liu, Bao-yan

    2015-06-01

    The effective quality control in clinical practices is an effective guarantee for the authenticity and scientificity of the findings. The post-marketing reevaluation for traditional Chinese medicines (TCM) focuses on the efficacy, adverse reaction, combined medication and effective dose of drugs in the market by expanded clinical trials, and requires a larger sample size and a wider range of patients. Therefore, this increases the difficulty of quality control in clinical practices. With the experience in quality control in clinical practices for the post-marketing reevaluation for Kangbingdu oral for cold, researchers in this study reviewed the study purpose, project, scheme design and clinical practice process from an overall point of view, analyzed the study characteristics of the post-marketing reevaluation for TCMs and the quality control risks, designed the quality control contents with quality impacting factors, defined key review contents and summarized the precautions in clinical practices, with the aim to improve the efficiency of quality control of clinical practices. This study can provide reference to clinical units and quality control-related personnel in the post-marketing reevaluation for TCMs.

  14. Flight-Test Validation and Flying Qualities Evaluation of a Rotorcraft UAV Flight Control System

    NASA Technical Reports Server (NTRS)

    Mettler, Bernard; Tuschler, Mark B.; Kanade, Takeo

    2000-01-01

This paper presents a process of design, flight-test validation, and flying qualities evaluation of a flight control system for a rotorcraft-based unmanned aerial vehicle (RUAV). The keystone of this process is an accurate flight-dynamic model of the aircraft, derived by using system identification modeling. The model captures the most relevant dynamic features of our unmanned rotorcraft, and explicitly accounts for the presence of a stabilizer bar. Using the identified model we were able to determine the performance margins of our original control system and identify limiting factors. The performance limitations were addressed and the attitude control system was optimized for three different performance levels: slow, medium, and fast. The optimized control laws will be implemented in our RUAV. We will first determine the validity of our control design approach by flight-test validating our optimized controllers. Subsequently, we will fly a series of maneuvers with the three optimized controllers to determine the level of flying qualities that can be attained. The outcome will enable us to draw important conclusions on the flying qualities requirements for small-scale RUAVs.

  15. An Application of Six Sigma to Reduce Supplier Quality Cost

    NASA Astrophysics Data System (ADS)

    Gaikwad, Lokpriya Mohanrao; Teli, Shivagond Nagappa; Majali, Vijay Shashikant; Bhushi, Umesh Mahadevappa

    2016-01-01

This article presents an application of Six Sigma to reduce supplier quality cost in the manufacturing industry. Although there is wider acceptance of Six Sigma in many organizations today, there is still a lack of in-depth case studies of Six Sigma. For the present research the case study methodology was used. The company decided to reduce quality cost and improve selected processes using Six Sigma methodologies. Given the lack of case studies dealing with Six Sigma, especially in individual manufacturing organizations, this article could also be of great importance for practitioners. This paper discusses quality and productivity improvement in a supplier enterprise through a case study. The paper deals with an application of the Six Sigma define-measure-analyze-improve-control methodology in an industry, which provides a framework to identify, quantify and eliminate sources of variation in the operational process in question, to optimize the operation variables, and to improve and sustain performance, viz. process yield, with well-executed control plans. Six Sigma improves the process performance (process yield) of the critical operational process, leading to better utilization of resources, decreased variation, and consistent quality of the process output.

  16. Sampling the food processing environment: taking up the cudgel for preventive quality management in food processing environments.

    PubMed

    Wagner, Martin; Stessl, Beatrix

    2014-01-01

    The Listeria monitoring program for Austrian cheese factories was established in 1988. The basic idea is to control the introduction of L. monocytogenes into the food processing environment, preventing the pathogen from contaminating the food under processing. The Austrian Listeria monitoring program comprises four levels of investigation, dealing with routine monitoring of samples and consequences of finding a positive sample. Preventive quality control concepts attempt to detect a foodborne hazard along the food processing chain, prior to food delivery, retailing, and consumption. The implementation of a preventive food safety concept provokes a deepened insight by the manufacturers into problems concerning food safety. The development of preventive quality assurance strategies contributes to the national food safety status and protects public health.

  17. Process control charts in infection prevention: Make it simple to make it happen.

    PubMed

    Wiemken, Timothy L; Furmanek, Stephen P; Carrico, Ruth M; Mattingly, William A; Persaud, Annuradha K; Guinn, Brian E; Kelley, Robert R; Ramirez, Julio A

    2017-03-01

    Quality improvement is central to Infection Prevention and Control (IPC) programs. Challenges may occur when applying quality improvement methodologies like process control charts, often due to the limited exposure of typical IPs. Because of this, our team created an open-source database with a process control chart generator for IPC programs. The objectives of this report are to outline the development of the application and demonstrate application using simulated data. We used Research Electronic Data Capture (REDCap Consortium, Vanderbilt University, Nashville, TN), R (R Foundation for Statistical Computing, Vienna, Austria), and R Studio Shiny (R Foundation for Statistical Computing) to create an open source data collection system with automated process control chart generation. We used simulated data to test and visualize both in-control and out-of-control processes for commonly used metrics in IPC programs. The R code for implementing the control charts and Shiny application can be found on our Web site (https://github.com/ul-research-support/spcapp). Screen captures of the workflow and simulated data indicating both common cause and special cause variation are provided. Process control charts can be easily developed based on individual facility needs using freely available software. Through providing our work free to all interested parties, we hope that others will be able to harness the power and ease of use of the application for improving the quality of care and patient safety in their facilities. Copyright © 2017 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Elsevier Inc. All rights reserved.
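The process control charts described above can be generated with very little code. As a minimal illustration (in Python rather than the R/Shiny stack the authors used, and with simulated rates rather than their data), an individuals (I) chart derives its limits from the average moving range:

```python
import statistics

def individuals_chart_limits(values):
    """Shewhart I-chart limits: sigma estimated from the average moving range.

    The bias-correction constant d2 = 1.128 applies to moving ranges of size 2.
    """
    center = statistics.fmean(values)
    moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
    sigma_hat = statistics.fmean(moving_ranges) / 1.128
    return center - 3 * sigma_hat, center, center + 3 * sigma_hat

# Simulated monthly infection rates per 1,000 device-days (hypothetical)
rates = [2.1, 1.8, 2.4, 2.0, 1.9, 2.2, 2.3, 1.7, 2.0, 2.1]
lcl, center, ucl = individuals_chart_limits(rates)
out_of_control = [r for r in rates if r < lcl or r > ucl]
```

Points outside the limits signal special-cause variation worth investigating; points within them reflect common-cause variation, which this simulated series exhibits.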

  18. Intelligent Processing Equipment Projects at DLA

    NASA Technical Reports Server (NTRS)

    Obrien, Donald F.

    1992-01-01

    The Defense Logistics Agency is successfully incorporating Intelligent Processing Equipment (IPE) into each of its Manufacturing Technology thrust areas. Several IPE applications are addressed in the manufacturing of two 'soldier support' items: combat rations and military apparel. In combat rations, in-line sensors for food processing are being developed or modified from other industries. In addition, many process controls are being automated to achieve better quality and to gain higher use (soldier) acceptance. IPE applications in military apparel include: in-process quality controls for identification of sewing defects, use of robots in the manufacture of shirt collars, and automated handling of garments for pressing.

  19. Manufacturing processes for fabricating graphite/PMR 15 polyimide structural elements

    NASA Technical Reports Server (NTRS)

    Sheppard, C. H.; Hoggatt, J. T.; Symonds, W. A.

    1979-01-01

    Investigations were conducted to obtain commercially available graphite/PMR-15 polyimide prepreg, develop an autoclave manufacturing process, and demonstrate the process by manufacturing structural elements. Controls were established on polymer, prepreg, composite fabrication, and quality assurance, Successful material quality control and processes were demonstrated by fabricating major structural elements including flat laminates, hat sections, I beam sections, honeycomb sandwich structures, and molded graphite reinforced fittings. Successful fabrication of structural elements and simulated section of the space shuttle aft body flap shows that the graphite/PMR-15 polyimide system and the developed processes are ready for further evaluation in flight test hardware.

  20. Intelligent processing equipment projects at DLA

    NASA Astrophysics Data System (ADS)

    Obrien, Donald F.

    1992-04-01

    The Defense Logistics Agency is successfully incorporating Intelligent Processing Equipment (IPE) into each of its Manufacturing Technology thrust areas. Several IPE applications are addressed in the manufacturing of two 'soldier support' items: combat rations and military apparel. In combat rations, in-line sensors for food processing are being developed or modified from other industries. In addition, many process controls are being automated to achieve better quality and to gain higher use (soldier) acceptance. IPE applications in military apparel include: in-process quality controls for identification of sewing defects, use of robots in the manufacture of shirt collars, and automated handling of garments for pressing.

  1. Influence of fiber quality on draftometer measurements

    USDA-ARS?s Scientific Manuscript database

    Fiber-to-fiber and fiber-to-machine friction play an important role in determining textile processing efficiency and end-product quality. A process, known as drafting, is used to control the attenuation of the fiber mass being processed in carding, drawing and spinning. The amount of attenuation t...

  2. A system identification approach for developing model predictive controllers of antibody quality attributes in cell culture processes.

    PubMed

    Downey, Brandon; Schmitt, John; Beller, Justin; Russell, Brian; Quach, Anthony; Hermann, Elizabeth; Lyon, David; Breit, Jeffrey

    2017-11-01

    As the biopharmaceutical industry evolves to include more diverse protein formats and processes, more robust control of Critical Quality Attributes (CQAs) is needed to maintain processing flexibility without compromising quality. Active control of CQAs has been demonstrated using model predictive control techniques, which allow development of processes that are robust against disturbances associated with raw material variability and other potentially flexible operating conditions. Wide adoption of model predictive control in biopharmaceutical cell culture processes has been hampered, however, in part due to the large amount of data and expertise required to make a predictive model of controlled CQAs, a requirement for model predictive control. Here we developed a highly automated perfusion apparatus to systematically and efficiently generate predictive models using system identification approaches. We successfully created a predictive model of %galactosylation using data obtained by manipulating galactose concentration in the perfusion apparatus in serialized step-change experiments. We then demonstrated the use of the model in a model predictive controller in a simulated control scenario to successfully achieve a %galactosylation set point in a simulated fed-batch culture. The automated model identification approach demonstrated here can potentially be generalized to many CQAs, and could be a more efficient, faster, and highly automated alternative to batch experiments for developing predictive models in cell culture processes, allowing the wider adoption of model predictive control in biopharmaceutical processes. © 2017 The Authors Biotechnology Progress published by Wiley Periodicals, Inc. on behalf of American Institute of Chemical Engineers Biotechnol. Prog., 33:1647-1661, 2017.
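The serialized step-change approach described above can be illustrated with a minimal sketch: fitting a first-order ARX model to input/output data by ordinary least squares. The dynamics, signal names, and parameter values below are hypothetical illustrations, not taken from the paper.

```python
import numpy as np

def fit_arx(u, y):
    """Fit a first-order ARX model y[k] = a*y[k-1] + b*u[k-1]
    by ordinary least squares from step-change input/output data."""
    # Regressor matrix built from lagged outputs and inputs.
    X = np.column_stack([y[:-1], u[:-1]])
    theta, *_ = np.linalg.lstsq(X, y[1:], rcond=None)
    return theta  # [a, b]

# Synthetic step-change experiment: a manipulated feed (u) driving a
# quality attribute (y) through an assumed first-order response.
a_true, b_true = 0.9, 0.5
u = np.concatenate([np.zeros(20), np.ones(40)])  # serialized step change
y = np.zeros_like(u)
for k in range(1, len(u)):
    y[k] = a_true * y[k - 1] + b_true * u[k - 1]

a_hat, b_hat = fit_arx(u, y)
print(round(a_hat, 3), round(b_hat, 3))  # recovers the assumed dynamics
```

Once identified, such a model is what a model predictive controller would invert to choose feed moves that steer the attribute toward its set point.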

  3. TU-FG-201-12: Designing a Risk-Based Quality Assurance Program for a Newly Implemented Y-90 Microspheres Procedure

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vile, D; Zhang, L; Cuttino, L

    2016-06-15

    Purpose: To create a quality assurance program based upon a risk-based assessment of a newly implemented SirSpheres Y-90 procedure. Methods: A process map was created for a newly implemented SirSpheres procedure at a community hospital. The process map documented each step of this collaborative procedure, as well as the roles and responsibilities of each member. From the process map, potential failure modes were determined, as well as any current controls in place. From this list, a full failure mode and effects analysis (FMEA) was performed by grading each failure mode's likelihood of occurrence, likelihood of detection, and potential severity. These numbers were then multiplied to compute the risk priority number (RPN) for each potential failure mode. Failure modes were then ranked based on their RPN. Additional controls were then added, with failure modes corresponding to the highest RPNs taking priority. Results: A process map was created that succinctly outlined each step in the SirSpheres procedure in its current implementation. From this, 72 potential failure modes were identified and ranked according to their associated RPN. Quality assurance controls and safety barriers were then added, with the failure modes associated with the highest risk addressed first. Conclusion: A quality assurance program was created from a risk-based assessment of the SirSpheres process. Process mapping and FMEA were effective in identifying potential high-risk failure modes for this new procedure, which were prioritized for new quality assurance controls. TG 100 recommends the fault tree analysis methodology to design a comprehensive and effective QC/QM program, yet we found that simply introducing additional safety barriers to address high-RPN failure modes makes the whole process simpler and safer.
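The RPN calculation described above (occurrence × detection × severity, then ranking) can be sketched as follows. The failure modes and scores below are invented for illustration; they are not the 72 modes from the study.

```python
# Hypothetical failure modes with 1-10 FMEA scores (illustrative only).
failure_modes = [
    {"mode": "wrong activity ordered", "occurrence": 3, "detection": 4, "severity": 9},
    {"mode": "catheter misplacement",  "occurrence": 2, "detection": 6, "severity": 8},
    {"mode": "dose mislabeled",        "occurrence": 4, "detection": 2, "severity": 7},
]

# RPN = occurrence x detection x severity; rank descending so the
# highest-risk modes receive new quality assurance controls first.
for fm in failure_modes:
    fm["rpn"] = fm["occurrence"] * fm["detection"] * fm["severity"]

ranked = sorted(failure_modes, key=lambda fm: fm["rpn"], reverse=True)
for fm in ranked:
    print(fm["rpn"], fm["mode"])
```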

  4. The six critical attributes of the next generation of quality management software systems.

    PubMed

    Clark, Kathleen

    2011-07-01

    Driven by both the need to meet regulatory requirements and a genuine desire to drive improved quality, quality management systems encompassing standard operating procedures, corrective and preventive actions, and related processes have existed for many years, both in paper and electronic form. The impact of quality management systems on 'actual' quality, however, is often reported as far less than desired. A quality management software system that moves beyond formal forms-driven processes to include a true closed-loop design, manage disparate processes across the enterprise, provide support for collaborative processes and deliver insight into the overall state of control has the potential to close the gap between simply accomplishing regulatory compliance and delivering measurable improvements in quality and efficiency.

  5. Assessing the structure of non-routine decision processes in Airline Operations Control.

    PubMed

    Richters, Floor; Schraagen, Jan Maarten; Heerkens, Hans

    2016-03-01

    Unfamiliar severe disruptions challenge Airline Operations Control professionals most, as their expertise is stretched to its limits. This study elicited the structure of Airline Operations Control professionals' decision process during unfamiliar disruptions by mapping three macrocognitive activities on the decision ladder: sensemaking, option evaluation and action planning. The relationship between this structure and decision quality was measured. A simulated task was staged, from which think-aloud protocols were obtained. Results show that the general decision process resembles that of experts working under routine conditions, both in the general structure of the macrocognitive activities and in the rule-based approach used to identify options and actions. Surprisingly, high quality of decision outcomes was found to relate to the use of rule-based strategies. This implies that successful professionals are capable of dealing with unfamiliar problems by reframing them into familiar ones, rather than engaging in knowledge-based processing. Practitioner Summary: We examined the macrocognitive structure of Airline Operations Control professionals' decision process during a simulated unfamiliar disruption in relation to decision quality. Results suggest that successful professionals are capable of dealing with unfamiliar problems by reframing them into familiar ones, rather than engaging in knowledge-based processing.

  6. System of error detection in the manufacture of garments using artificial vision

    NASA Astrophysics Data System (ADS)

    Moreno, J. J.; Aguila, A.; Partida, E.; Martinez, C. L.; Morales, O.; Tejeida, R.

    2017-12-01

    A computer vision system is implemented to detect errors in the cutting stage within the manufacturing process of garments in the textile industry. It provides a solution to errors within the process that cannot be easily detected by any employee, in addition to significantly increasing the speed of quality review. In the textile industry, as in many others, quality control is required for manufactured products, and over the years this has been carried out manually by means of visual inspection by employees. For this reason, the objective of this project is to design a quality control system using computer vision to identify errors in the cutting stage within the garment manufacturing process, increasing the productivity of textile processes by reducing costs.

  7. [Research advances in secondary development of Chinese patent medicines based on quality by design concept].

    PubMed

    Gong, Xing-Chu; Chen, Teng; Qu, Hai-Bin

    2017-03-01

    Quality by design (QbD) is an advanced pharmaceutical quality control concept. The application of the QbD concept in the research and development of pharmaceutical processes of traditional Chinese medicines (TCM) mainly contains five parts: the definition of critical processes and their evaluation criteria, the determination of critical process parameters and critical material attributes, the establishment of quantitative models, the development of design space, and the application and continuous improvement of the control strategy. In this work, recent research advances in QbD implementation methods in the secondary development of Chinese patent medicines were reviewed, and five promising fields for the implementation of the QbD concept were pointed out: the research and development of TCM new drugs and Chinese medicine granules for formulation, modeling of pharmaceutical processes, development of control strategies based on industrial big data, strengthening the research of process amplification rules, and the development of new pharmaceutical equipment. Copyright© by the Chinese Pharmaceutical Association.

  8. Control Strategies for Drug Product Continuous Direct Compression-State of Control, Product Collection Strategies, and Startup/Shutdown Operations for the Production of Clinical Trial Materials and Commercial Products.

    PubMed

    Almaya, Ahmad; De Belder, Lawrence; Meyer, Robert; Nagapudi, Karthik; Lin, Hung-Ren Homer; Leavesley, Ian; Jayanth, Jayanthy; Bajwa, Gurjit; DiNunzio, James; Tantuccio, Anthony; Blackwood, Dan; Abebe, Admassu

    2017-04-01

    Continuous manufacturing (CM) has emerged in the pharmaceutical industry as a paradigm shift with significant advantages related to cost, efficiency, flexibility, and higher assurance of quality. The inherent differences from batch processes justify examining the CM control strategy more holistically. This article describes the current thinking for the control and implementation of CM, using the example of a direct compression process and taking into consideration the ICH Q10 definition of "state of control" and process validation requirements. Statistical process control using control charts, sources of variation, process capability, and process performance is explained as a useful concept that can help assess the impact of variation within a batch and indicate whether a process is in a state of control. The potentially time-variant nature of startup and shutdown in CM is discussed, to assure product quality while minimizing waste, as are different options for detection and isolation of non-conforming materials due to process upsets. While different levels of control are possible with CM, an appropriate balance between process control and end-product testing is needed depending on the level of process understanding at the different stages of development, from the production of clinical supplies through commercialization. Copyright © 2017 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.
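The process-capability concept mentioned above can be illustrated with a short sketch computing the Cpk index from in-process measurements against specification limits. The data and limits below are illustrative assumptions, not values from the article.

```python
import statistics

def cpk(samples, lsl, usl):
    """Process capability index: distance from the process mean to the
    nearest specification limit, in units of three standard deviations."""
    mu = statistics.mean(samples)
    sigma = statistics.stdev(samples)  # sample standard deviation
    return min(usl - mu, mu - lsl) / (3 * sigma)

# Hypothetical tablet potency measurements (% label claim), 95-105% specs.
potency = [99.8, 100.2, 100.5, 99.6, 100.1, 99.9, 100.3, 100.0]
print(round(cpk(potency, lsl=95.0, usl=105.0), 2))  # → 5.73, a highly capable process
```

A Cpk well above 1.33 (a commonly cited benchmark) indicates the process comfortably fits within its specification limits, supporting a "state of control" claim.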

  9. Food safety and nutritional quality for the prevention of non communicable diseases: the Nutrient, hazard Analysis and Critical Control Point process (NACCP).

    PubMed

    Di Renzo, Laura; Colica, Carmen; Carraro, Alberto; Cenci Goga, Beniamino; Marsella, Luigi Tonino; Botta, Roberto; Colombo, Maria Laura; Gratteri, Santo; Chang, Ting Fa Margherita; Droli, Maurizio; Sarlo, Francesca; De Lorenzo, Antonino

    2015-04-23

    The important role of food and nutrition in public health is being increasingly recognized as crucial for its potential impact on health-related quality of life and the economy, both at the societal and individual levels. The prevalence of non-communicable diseases calls for a reformulation of our view of food. The Hazard Analysis and Critical Control Point (HACCP) system, first implemented in the EU with the Directive 43/93/CEE, later replaced by Regulation CE 178/2002 and Regulation CE 852/2004, is the internationally agreed approach for food safety control. Our aim is to develop a new procedure for the assessment of the Nutrient, hazard Analysis and Critical Control Point (NACCP) process, for total quality management (TQM), and to optimize nutritional levels. NACCP is based on four general principles: i) guarantee of health maintenance; ii) evaluation and assurance of the nutritional quality of food and TQM; iii) correct information to the consumers; iv) an ethical profit. There are three stages for the application of the NACCP process: 1) application of NACCP for quality principles; 2) application of NACCP for health principles; 3) implementation of the NACCP process. The actions are: 1) identification of nutritional markers, which must remain intact throughout the food supply chain; 2) identification of critical control points which must be monitored in order to minimize the likelihood of a reduction in quality; 3) establishment of critical limits to maintain adequate levels of nutrients; 4) establishment and implementation of effective monitoring procedures for critical control points; 5) establishment of corrective actions; 6) identification of metabolic biomarkers; 7) evaluation of the effects of food intake, through the application of specific clinical trials; 8) establishment of procedures for consumer information; 9) implementation of the Health Claims Regulation EU 1924/2006; 10) starting a training program. We calculate the risk assessment as follows: Risk (R) = probability (P) × damage (D). The NACCP process considers the entire food supply chain "from farm to consumer"; at each point of the chain it is necessary to implement tight monitoring in order to guarantee optimal nutritional quality.
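The risk formula given above, Risk (R) = probability (P) × damage (D), can be sketched directly. The control points and 1-5 scores below are hypothetical examples, not values from the NACCP paper.

```python
def risk(probability, damage):
    """NACCP risk score: Risk (R) = probability (P) x damage (D)."""
    return probability * damage

# Hypothetical nutrient-loss control points, scored 1-5 for P and D.
control_points = {
    "vitamin C loss in storage": risk(4, 2),
    "protein denaturation in processing": risk(2, 4),
    "omega-3 oxidation in transport": risk(3, 5),
}

# Monitor the highest-risk nutrient markers most tightly.
for point, r in sorted(control_points.items(), key=lambda kv: kv[1], reverse=True):
    print(r, point)
```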

  10. 78 FR 41075 - Federal Housing Administration (FHA): Single Family Quality Assurance-Solicitation of Information...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-09

    ... the efficiency and effectiveness of FHA's quality assurance process (QAP). The objective of FHA's QAP... control plan (QCP).\\1\\ A copy of the plan must be submitted by the lender when applying for FHA lender... processes: post-endorsement technical reviews, Quality Assurance Division reviews and targeted lender...

  11. 21 CFR 106.1 - Status and applicability of the quality control procedures regulation.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... AND HUMAN SERVICES (CONTINUED) FOOD FOR HUMAN CONSUMPTION INFANT FORMULA QUALITY CONTROL PROCEDURES... infant formula meets the safety, quality, and nutrient requirements of section 412 of the act and the..., processing, and packaging of an infant formula shall render such formula adulterated under section 412(a)(1...

  12. 21 CFR 106.1 - Status and applicability of the quality control procedures regulation.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... AND HUMAN SERVICES (CONTINUED) FOOD FOR HUMAN CONSUMPTION INFANT FORMULA QUALITY CONTROL PROCEDURES... infant formula meets the safety, quality, and nutrient requirements of section 412 of the act and the..., processing, and packaging of an infant formula shall render such formula adulterated under section 412(a)(1...

  13. Quality changes of pomegranate arils throughout shelf life affected by deficit irrigation and pre-processing storage.

    PubMed

    Peña-Estévez, María E; Artés-Hernández, Francisco; Artés, Francisco; Aguayo, Encarna; Martínez-Hernández, Ginés Benito; Galindo, Alejandro; Gómez, Perla A

    2016-10-15

    This study investigated the influence of sustained deficit irrigation (SDI, 78% less water supply than the reference evapotranspiration, ET0) compared to a control (100% ET0) on the physicochemical and sensory qualities and health-promoting compounds of pomegranate arils stored for 14 days at 5°C. Prior to processing, the fruits were stored for 0, 30, 60 or 90 days at 5°C, and the effect of the pre-processing storage duration was also examined. Physicochemical and sensory qualities were maintained during the storage period. Arils from SDI fruit had lower punicalagin-α and ellagic acid losses than the control (13% vs 50%). However, the anthocyanin content decreased during the shelf life (72%) regardless of the treatment. Ascorbic acid decreased slightly. Arils from SDI experienced a smaller glucose/fructose ratio loss (19%) than the control (35%). In general, arils from SDI showed better quality and health attributes during the shelf life than did the control samples. Copyright © 2016 Elsevier Ltd. All rights reserved.

  14. BatMass: a Java Software Platform for LC-MS Data Visualization in Proteomics and Metabolomics.

    PubMed

    Avtonomov, Dmitry M; Raskind, Alexander; Nesvizhskii, Alexey I

    2016-08-05

    Mass spectrometry (MS) coupled to liquid chromatography (LC) is a commonly used technique in metabolomic and proteomic research. As the size and complexity of LC-MS-based experiments grow, it becomes increasingly more difficult to perform quality control of both raw data and processing results. In a practical setting, quality control steps for raw LC-MS data are often overlooked, and assessment of an experiment's success is based on some derived metrics such as "the number of identified compounds". The human brain interprets visual data much better than plain text, hence the saying "a picture is worth a thousand words". Here, we present the BatMass software package, which allows for performing quick quality control of raw LC-MS data through its fast visualization capabilities. It also serves as a testbed for developers of LC-MS data processing algorithms by providing a data access library for open mass spectrometry file formats and a means of visually mapping processing results back to the original data. We illustrate the utility of BatMass with several use cases of quality control and data exploration.

  15. BatMass: a Java software platform for LC/MS data visualization in proteomics and metabolomics

    PubMed Central

    Avtonomov, Dmitry; Raskind, Alexander; Nesvizhskii, Alexey I.

    2017-01-01

    Mass spectrometry (MS) coupled to liquid chromatography (LC) is a commonly used technique in metabolomic and proteomic research. As the size and complexity of LC/MS-based experiments grow, it becomes increasingly more difficult to perform quality control of both raw data and processing results. In a practical setting, quality control steps for raw LC/MS data are often overlooked, and assessment of an experiment's success is based on some derived metrics such as "the number of identified compounds". The human brain interprets visual data much better than plain text, hence the saying "a picture is worth a thousand words". Here we present the BatMass software package, which allows quick quality control of raw LC/MS data through its fast visualization capabilities. It also serves as a testbed for developers of LC/MS data processing algorithms by providing a data access library for open mass spectrometry file formats and a means of visually mapping processing results back to the original data. We illustrate the utility of BatMass with several use cases of quality control and data exploration. PMID:27306858

  16. QCloud: A cloud-based quality control system for mass spectrometry-based proteomics laboratories

    PubMed Central

    Chiva, Cristina; Olivella, Roger; Borràs, Eva; Espadas, Guadalupe; Pastor, Olga; Solé, Amanda

    2018-01-01

    The increasing number of biomedical and translational applications in mass spectrometry-based proteomics poses new analytical challenges and raises the need for automated quality control systems. Despite previous efforts to set standard file formats, data processing workflows and key evaluation parameters for quality control, automated quality control systems are not yet widespread among proteomics laboratories, which limits the acquisition of high-quality results, inter-laboratory comparisons and the assessment of variability of instrumental platforms. Here we present QCloud, a cloud-based system to support proteomics laboratories in daily quality assessment using a user-friendly interface, easy setup, automated data processing and archiving, and unbiased instrument evaluation. QCloud supports the most common targeted and untargeted proteomics workflows, accepts data formats from different vendors, and enables the annotation of acquired data and the reporting of incidents. A complete version of the QCloud system has successfully been developed and is now open to the proteomics community (http://qcloud.crg.eu). The QCloud system is an open source project, publicly available under a Creative Commons License Attribution-ShareAlike 4.0. PMID:29324744

  17. Manufacturing Squares: An Integrative Statistical Process Control Exercise

    ERIC Educational Resources Information Center

    Coy, Steven P.

    2016-01-01

    In the exercise, students in a junior-level operations management class are asked to manufacture a simple product. Given product specifications, they must design a production process, create roles and design jobs for each team member, and develop a statistical process control plan that efficiently and effectively controls quality during…

  18. Using Statistical Process Control to Enhance Student Progression

    ERIC Educational Resources Information Center

    Hanna, Mark D.; Raichura, Nilesh; Bernardes, Ednilson

    2012-01-01

    Public interest in educational outcomes has markedly increased in the most recent decade; however, quality management and statistical process control have not deeply penetrated the management of academic institutions. This paper presents results of an attempt to use Statistical Process Control (SPC) to identify a key impediment to continuous…

  19. [Quality control in anesthesiology].

    PubMed

    Muñoz-Ramón, J M

    1995-03-01

    The process of quality control and auditing in anesthesiology allows us to evaluate the care given by a service and solve the problems that are detected. Quality control is a basic element of caregiving and is only secondarily an area of academic research; the effort is therefore meaningless if the information does not serve to improve departmental procedures. Quality assurance procedures assume certain infrastructural requirements and an initial period of implementation and adjustment. The main objectives of quality control are the reduction of morbidity and mortality due to anesthesia, assurance of the availability and proper management of resources and, finally, the well-being and safety of the patient.

  20. Using the scanning electron microscope on the production line to assure quality semiconductors

    NASA Technical Reports Server (NTRS)

    Adolphsen, J. W.; Anstead, R. J.

    1972-01-01

    The use of the scanning electron microscope to detect metallization defects introduced during batch processing of semiconductor devices is discussed. A method of determining metallization integrity was developed which culminates in a procurement specification using the scanning microscope on the production line as a quality control tool. Batch process control of the metallization operation is monitored early in the manufacturing cycle.

  1. Total Quality Management: Getting Started

    DTIC Science & Technology

    1990-08-01

    Quality Management (TQM) program using Organizational Development (OD) intervention techniques to gain acceptance of the program. It emphasizes human behavior and the need for collaborative management and consensus in organizational change. Lessons learned stress the importance of choosing a skilled TQM facilitator, training process action teams, and fostering open communication and teamwork to minimize resistance to change. Keywords: Management planning and control, Quality control, Quality, Management, Organization change, Organization development,

  2. Guidelines for preparation of the 1996 state water quality assessments (305(b) reports)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1995-05-01

    The Federal Water Pollution Control Act (PL92-500, commonly known as the Clean Water Act) establishes a process for States to use to develop information on the quality of the Nation's water resources and to report this information to the U.S. Environmental Protection Agency (EPA), the U.S. Congress, and the citizens of this country. Each State must develop a program to monitor the quality of its surface and ground waters and prepare a report every 2 years describing the status of its water quality. EPA compiles the data from the State reports, summarizes them, and transmits the summaries to Congress along with an analysis of the status of water quality nationwide. This process, referred to as the 305(b) process, is an essential aspect of the Nation's water pollution control effort.

  3. Teaching Quality Control with Chocolate Chip Cookies

    ERIC Educational Resources Information Center

    Baker, Ardith

    2014-01-01

    Chocolate chip cookies are used to illustrate the importance and effectiveness of control charts in Statistical Process Control. By counting the number of chocolate chips, creating the spreadsheet, calculating the control limits and graphing the control charts, the student becomes actively engaged in the learning process. In addition, examining…
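The control-chart computation the exercise walks students through can be sketched with a c-chart for count data: the center line is the mean chip count, and the control limits sit three standard deviations (√c̄, assuming Poisson-distributed counts) away. The chip counts below are invented for illustration.

```python
import math

# Chips counted in each of 10 cookies (illustrative data).
chips = [12, 15, 11, 14, 13, 16, 12, 15, 13, 14]

c_bar = sum(chips) / len(chips)               # center line
ucl = c_bar + 3 * math.sqrt(c_bar)            # upper control limit
lcl = max(0.0, c_bar - 3 * math.sqrt(c_bar))  # lower limit, floored at 0

print(round(c_bar, 1), round(lcl, 1), round(ucl, 1))  # → 13.5 2.5 24.5
out_of_control = [c for c in chips if not lcl <= c <= ucl]
print(out_of_control)  # empty list → every cookie is in control
```

Graphing `chips` against the three horizontal lines reproduces the chart the students draw by hand.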

  4. Considerations In The Design And Specifications Of An Automatic Inspection System

    NASA Astrophysics Data System (ADS)

    Lee, David T.

    1980-05-01

    Considerable activity has centered on the automation of manufacturing quality control and inspection functions. Several reasons can be cited for this development. The continuous pressure of rising direct and indirect labor costs is only one of the obvious motivations. With the drive for electronics miniaturization come more and more complex processes where control parameters are critical and the yield is highly susceptible to inadequate process monitoring and inspection. With multi-step, multi-layer processes for substrate fabrication, process defects that are not detected and corrected at certain critical points may render the entire subassembly useless. As a process becomes more complex, the time required to test the product increases significantly in the total build cycle. The urgency to reduce test time brings more pressure to improve in-process control and inspection. Advances and improvements in components, assemblies and systems such as microprocessors, microcomputers, programmable controllers, and other intelligent devices have made the automation of quality control much more cost effective and justifiable.

  5. [Establishment of Quality Control System of Nucleic Acid Detection for Ebola Virus in Sierra Leone-China Friendship Biological Safety Laboratory].

    PubMed

    Wang, Qin; Zhang, Yong; Nie, Kai; Wang, Huanyu; Du, Haijun; Song, Jingdong; Xiao, Kang; Lei, Wenwen; Guo, Jianqiang; Wei, Hejiang; Cai, Kun; Wang, Yanhai; Wu, Jiang; Gerald, Bangura; Kamara, Idrissa Laybohr; Liang, Mifang; Wu, Guizhen; Dong, Xiaoping

    2016-03-01

    The quality control process throughout Ebola virus nucleic acid detection in the Sierra Leone-China Friendship Biological Safety Laboratory (SLE-CHN Biosafety Lab) is described in detail, in order to comprehensively display the scientific, rigorous, accurate and efficient detection practice of the first batch detection team in the SLE-CHN Biosafety Lab. Firstly, the key points of the laboratory quality control system are described, including management and organization, quality control documents and information management, instruments, reagents and supplies, assessment, facilities design and space allocation, laboratory maintenance and biosecurity. Secondly, the application of quality control methods in the whole process of Ebola virus detection, before, during and after the test, is analyzed. Excellent, professional laboratory staff and the implementation of humanized management are the cornerstones of success; high-level biological safety protection is the premise for effective quality control and completion of Ebola virus detection tasks; and professional logistics is a prerequisite for launching the laboratory diagnosis of Ebola virus. The establishment and running of the SLE-CHN Biosafety Lab has landmark significance for the friendship between Sierra Leone and China, and the lab has become the most important base for Ebola virus laboratory testing in Sierra Leone.

  6. Rapid evaluation and quality control of next generation sequencing data with FaQCs

    DOE PAGES

    Lo, Chien -Chi; Chain, Patrick S. G.

    2014-12-01

    Background: Next generation sequencing (NGS) technologies that parallelize the sequencing process and produce thousands to millions, or even hundreds of millions of sequences in a single sequencing run, have revolutionized genomic and genetic research. Because of the vagaries of any platform's sequencing chemistry, the experimental processing, machine failure, and so on, the quality of sequencing reads is never perfect, and often declines as the read is extended. These errors invariably affect downstream analysis/application and should therefore be identified early on to mitigate any unforeseen effects. Results: Here we present a novel FastQ Quality Control Software (FaQCs) that can rapidly process large volumes of data, and which improves upon previous solutions to monitor the quality and remove poor quality data from sequencing runs. Both the speed of processing and the memory footprint of storing all required information have been optimized via algorithmic and parallel processing solutions. The trimmed output compared side-by-side with the original data is part of the automated PDF output. We show how this tool can help data analysis by providing a few examples, including an increased percentage of reads recruited to references, improved single nucleotide polymorphism identification as well as de novo sequence assembly metrics. Conclusion: FaQCs combines several features of currently available applications into a single, user-friendly process, and includes additional unique capabilities such as filtering the PhiX control sequences, conversion of FASTQ formats, and multi-threading. The original data and trimmed summaries are reported within a variety of graphics and reports, providing a simple way to do data quality control and assurance.
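The quality-trimming step that tools like FaQCs perform can be illustrated with a simplified sliding-window sketch; the window size, threshold, and scores below are illustrative assumptions, not FaQCs' actual algorithm or defaults.

```python
def trim_read(quals, window=4, threshold=20):
    """Find a 3' trim point: cut where the mean Phred quality of a
    sliding window first drops below the threshold. A simplified
    illustration of window-based quality trimming, not FaQCs itself."""
    for i in range(len(quals) - window + 1):
        if sum(quals[i:i + window]) / window < threshold:
            return i  # keep bases [0, i)
    return len(quals)  # whole read passes

# Phred scores declining toward the 3' end, as is typical of NGS reads.
quals = [35, 34, 36, 33, 30, 28, 25, 18, 12, 8, 5, 3]
cut = trim_read(quals)
print(cut)  # → 6: the low-quality tail from position 6 onward is trimmed
```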

  7. [Cleaning and disinfection in nursing homes. Data on quality of structure, process and outcome in nursing homes in Frankfurt am Main, Germany, 2011].

    PubMed

    Heudorf, U; Gasteyer, S; Samoiski, Y; Voigt, K

    2012-08-01

    Due to the Infectious Disease Prevention Act, public health services in Germany are obliged to check infection prevention in hospitals and other medical facilities as well as in nursing homes. In Frankfurt/Main, Germany, standardized control visits have been performed for many years; in 2011 the focus was on cleaning and disinfection of surfaces. All 41 nursing homes were checked according to a standardized checklist covering quality of structure (i.e., staffing, hygiene concept), quality of process (observation of the cleaning processes in the homes) and quality of outcome, which was monitored by checking whether fluorescent marks, applied some days before, had been removed by cleaning before the final check. In more than two thirds of the homes cleaning personnel were salaried; in one third external personnel were hired. 85% of the homes provided service clothing and all of them offered protective clothing. All homes had established hygiene and cleaning concepts; however, in 15% of the homes concepts for the handling of Norovirus and in 30% concepts for the handling of Clostridium difficile were missing. Regarding process quality, only half of the observed cleaning processes (hand-contact surfaces such as handrails, washing areas and bins) were performed correctly. Only 44% of the cleaning controls were correct, with enormous differences between the homes (0-100%). The correlation between quality of process and quality of outcome was significant. There was good quality of structure in the homes, but regarding quality of process and outcome there was great need for improvement. This was especially due to faults in communication and coordination between cleaning personnel and nursing personnel. Quality of outcome was associated neither with the number of resident places nor with staffing. Thus, not only quality of structure but also quality of process and outcome should be checked by the public health services.

  8. [Quality control and assessment system. Its location within a program for food, nutrition and metabolic intervention].

    PubMed

    Santana Porbén, S

    2012-01-01

    A design proposal for an HQCAS (Hospital Quality Control and Assessment System) covering the nutritional and feeding care processes conducted in a hospital environment is presented in this article. The proposal is accompanied by the results of inspections conducted by the hospital Nutritional Support Group (NST) between 2005 and 2010. The system design includes the quality policies that should govern the useful and safe conduct of such processes, the recording and documentary foundations of the system, and the quality control and assessment exercises for continuous verification of the established policies. The current state of these processes was documented from secondary records opened by the NST after satisfying consultation requests from the medical care teams of the institution. Inspections conducted by the NST revealed that fewer than half of clinical charts contained information minimally sufficient for elaborating nutritional judgments, that almost one fifth of the assisted patients were nil per os without a prescribed nutritional support scheme, and that prescription and use of artificial nutrition schemes were low. Corrective measures adopted by the NST served to significantly increase the rates of successful completion of the inspected processes. Quality assurance of feeding and nutritional care processes is a practical as well as an intellectual activity subject to constant remodeling, in order to always warrant fulfillment of the quality policies advanced by the NST and, thus, that the patient benefits from the prescribed nutritional intervention strategy.

  9. [Application of traditional Chinese medicine reference standards in quality control of Chinese herbal pieces].

    PubMed

    Lu, Tu-Lin; Li, Jin-Ci; Yu, Jiang-Yong; Cai, Bao-Chang; Mao, Chun-Qin; Yin, Fang-Zhou

    2014-01-01

    Traditional Chinese medicine (TCM) reference standards play an important role in the quality control of Chinese herbal pieces. This paper overviews the development of TCM reference standards. By analyzing the 2010 edition of the Chinese Pharmacopoeia, the application of TCM reference standards in the quality control of Chinese herbal pieces is summarized, and the problems existing in the system are put forward. To improve the quality control of Chinese herbal pieces, various advanced methods and technologies should be used to research reference standards characteristic of Chinese herbal pieces, and more reasonable reference standards should be introduced into the quality control system. This article discusses solutions from the perspective of TCM reference standards and offers a prospect for the future development of quality control of Chinese herbal pieces.

  10. Real-time control of combined surface water quantity and quality: polder flushing.

    PubMed

    Xu, M; van Overloop, P J; van de Giesen, N C; Stelling, G S

    2010-01-01

    In open water systems, keeping both water depths and water quality at specified values is critical for maintaining a 'healthy' water system. Many systems still require manual operation, at least for water quality management. When applying real-time control, both quantity and quality standards need to be met. In this paper, an artificial polder flushing case is studied. Model Predictive Control (MPC) is developed to control the system. In addition to MPC, a 'forward estimation' procedure is used to acquire water quality predictions for the simplified model used in MPC optimization. In order to illustrate the advantages of MPC, classical control [Proportional-Integral control (PI)] has been developed for comparison in the test case. The results show that both algorithms are able to control the polder flushing process, but MPC is more efficient in functionality and control flexibility.
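The PI baseline the paper compares MPC against can be illustrated with a minimal water-balance sketch: a single storage area whose depth is regulated by a PI-controlled net outflow. All parameters (gains, storage area, inflow, setpoint) are illustrative assumptions, not the paper's polder model.

```python
def simulate_pi(setpoint=2.0, depth0=2.5, area=1000.0, inflow=0.5,
                kp=400.0, ki=20.0, dt=1.0, steps=200):
    """Return the water-depth trajectory (m) under PI control of net outflow."""
    depth, integral = depth0, 0.0
    trajectory = [depth]
    for _ in range(steps):
        error = depth - setpoint                  # m; positive = too full
        integral += error * dt
        # Net outflow (m^3/s); negative means the inlet gate admits water.
        outflow = kp * error + ki * integral
        depth += (inflow - outflow) * dt / area   # mass balance over one step
        trajectory.append(depth)
    return trajectory

traj = simulate_pi()  # depth settles near the 2.0 m setpoint
```

Unlike this PI loop, which reacts only to the current depth error, MPC optimizes the control actions over a prediction horizon, which is what allows it to also anticipate water quality dynamics during flushing.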

  11. [Application progress on near infrared spectroscopy in quality control and process monitoring of traditional Chinese medicine].

    PubMed

    Li, Wenlong; Qu, Haibin

    2017-01-25

    The traditional Chinese medicine (TCM) industry encounters problems such as quality fluctuation of raw materials and unstandardized production processes. Near infrared (NIR) spectroscopy is widely used in the quality control of TCM because of its abundant information and its fast and nondestructive character. The main applications include quantitative analysis of Chinese medicinal materials, intermediates and Chinese patent medicines; identification of the authenticity, species, origins and manufacturers of TCM; and monitoring and control of the extraction, alcohol precipitation, column chromatography and blending processes. This article reviews the progress in applying NIR spectroscopy in the TCM field. In view of the problems existing in these applications, the article proposes that standardization of NIR analysis methods should be developed according to the specific characteristics of TCM, which will promote the application of NIR technology in the TCM industry.

  12. Double shell tanks (DST) chemistry control data quality objectives

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    BANNING, D.L.

    2001-10-09

    One of the main functions of the River Protection Project is to store the Hanford Site tank waste until the Waste Treatment Plant (WTP) is ready to receive and process the waste. Waste from the older single-shell tanks is being transferred to the newer double-shell tanks (DSTs). Therefore, the integrity of the DSTs must be maintained until the waste from all tanks has been retrieved and transferred to the WTP. To help maintain the integrity of the DSTs over the life of the project, specific chemistry limits have been established to control corrosion of the DSTs. These waste chemistry limits are presented in the Technical Safety Requirements (TSR) document HNF-SD-WM-TSR-006, Section 5.15, Rev. 2B (CHG 2001). In order to control the chemistry in the DSTs, the Chemistry Control Program will require analyses of the tank waste. This document describes the Data Quality Objectives (DQO) process undertaken to ensure appropriate data will be collected to control the waste chemistry in the DSTs. The DQO process was implemented in accordance with Data Quality Objectives for Sampling and Analyses, HNF-IP-0842, Rev. 1b, Vol. IV, Section 4.16 (Banning 2001) and the U.S. Environmental Protection Agency EPA QA/G4, Guidance for the Data Quality Objectives Process (EPA 1994), with some modifications to accommodate project- or tank-specific requirements and constraints.

  13. A conceptual study of automatic and semi-automatic quality assurance techniques for ground image processing

    NASA Technical Reports Server (NTRS)

    1983-01-01

    This report summarizes the results of a study conducted by Engineering and Economics Research (EER), Inc. under NASA Contract Number NAS5-27513. The study involved the development of preliminary concepts for automatic and semiautomatic quality assurance (QA) techniques for ground image processing. A distinction is made between quality assessment and the more comprehensive quality assurance which includes decision making and system feedback control in response to quality assessment.

  14. Using Deming To Improve Quality in Colleges and Universities.

    ERIC Educational Resources Information Center

    Cornesky, Robert A.; And Others

    Of all the people known for stressing quality in industry, W. Edwards Deming is the pioneer. He stresses statistical process control (SPC) and a 14-point process for managers to improve quality and productivity. His approach is humanistic and treats people as intelligent human beings who want to do a good job. Twelve administrators in a university…

  15. ISO 9002 as Literacy Practice: Coping with Quality-Control Documents in a High-Tech Company

    ERIC Educational Resources Information Center

    Kleifgen, Jo Anne

    2005-01-01

    This study describes the process by which a circuit board manufacturing company became certified in an international quality control program known as ISO 9002. Particular attention is paid to how quality documents were made and used in actual practice and to the relationship between these standardized procedures (official literacies) and…

  16. Application of the suggestion system in the improvement of the production process and product quality control

    NASA Astrophysics Data System (ADS)

    Gołaś, H.; Mazur, A.; Gruszka, J.; Szafer, P.

    2016-08-01

    This paper is a case study; the research was carried out in the company Alco-Mot Ltd., which employs 120 people. The company specializes in the production of lead poles for industrial and traction batteries using gravity casting. The elements embedded in the cast are manufactured on a machining centre, which provides stability of the process and of the product dimensions as well as a very short production time. As a result of observation and analysis, the authors have developed a concept for the implementation of a dynamic suggestion system in Alco-Mot, including a standard for actions in the implementation of the suggestion system as well as clear guidelines for processing and presenting the activities undertaken between the formulation of a suggestion and the benefits analysis after the proposed solutions have been implemented. The authors also present how suggestions proposed by Alco-Mot staff contributed to the improvement of the production and quality control processes: employees offered more than 30 suggestions, of which more than half are now being implemented and further actions are being prepared for implementation. The results include improvements in, for example, tool replacement time and scrap reduction, showing how the production and quality control processes looked before and after the implementation of employee suggestions and how kaizen can improve them.

  17. Design and implementation of a control structure for quality products in a crude oil atmospheric distillation column.

    PubMed

    Sotelo, David; Favela-Contreras, Antonio; Sotelo, Carlos; Jiménez, Guillermo; Gallegos-Canales, Luis

    2017-11-01

    In recent years, interest in petrochemical processes has been increasing, especially in the refining area. However, the high variability of the dynamic characteristics of the atmospheric distillation column poses a challenge to obtaining quality products. To improve distillate quality in spite of changes in the input crude oil composition, this paper details a new control strategy for a conventional crude oil distillation plant, designed using formal interaction analysis tools. The process dynamics and its control are simulated in the Aspen HYSYS® dynamic environment under real operating conditions. The simulation results are compared against a typical control strategy commonly used in crude oil atmospheric distillation columns. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.

  18. ITS data quality control and the calculation of mobility performance measures

    DOT National Transportation Integrated Search

    2000-09-01

    This report describes the results of research on the use of intelligent transportation system (ITS) data in calculating mobility performance measures for ITS operations. The report also describes a data quality control process developed for the Trans...

  19. Statistical Methods in Assembly Quality Management of Multi-Element Products on Automatic Rotor Lines

    NASA Astrophysics Data System (ADS)

    Pries, V. V.; Proskuriakov, N. E.

    2018-04-01

    To control the assembly quality of multi-element mass-produced products on automatic rotor lines, control methods with operational feedback are required. However, due to possible failures in the devices and systems of an automatic rotor line, there is always a real probability of defective (incomplete) products entering the output process stream. Therefore, continuous sampling control of product completeness, based on statistical methods, remains an important element in managing the assembly quality of multi-element mass products on automatic rotor lines. A feature of continuous sampling control of multi-element product completeness in the assembly process is that the inspection is destructive, which excludes the possibility of returning component parts to the process stream after sampling control and leads to a decrease in the actual productivity of the assembly equipment. Therefore, statistical procedures for continuous sampling control of multi-element product completeness during assembly on automatic rotor lines require sampling plans that ensure a minimum size of control samples. Comparison of the limit values of the average outgoing defect level for the continuous sampling plan (CSP-1) and the automated continuous sampling plan (ACSP-1) shows that the ACSP-1 provides lower limit values for the average outgoing defect level. The average sample size when using the ACSP-1 plan is also smaller than when using the CSP-1 plan. Thus, the application of statistical methods in the assembly quality management of multi-element products on automatic rotor lines, using the proposed plans and methods for continuous sampling control, makes it possible to automate sampling control procedures and to ensure the required quality level of assembled products while minimizing sample size.
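The classical CSP-1 plan referenced above can be sketched as a small simulation (Dodge's CSP-1: screen 100% of units until i consecutive conforming units are seen, then inspect only a random fraction f until a defect is found, then return to screening). This is an illustrative sketch, not the paper's ACSP-1 procedure; all parameter values are assumptions.

```python
import random

def csp1_fraction_inspected(p_defect=0.02, i=20, f=0.1,
                            n_units=100_000, seed=0):
    """Simulate Dodge's CSP-1 and return the fraction of units inspected."""
    rng = random.Random(seed)
    screening, run, inspected = True, 0, 0
    for _ in range(n_units):
        defective = rng.random() < p_defect
        if screening:
            inspected += 1
            run = 0 if defective else run + 1
            if run >= i:                 # i good units in a row: sample only
                screening, run = False, 0
        elif rng.random() < f:
            inspected += 1
            if defective:                # defect found: back to 100% screening
                screening = True
    return inspected / n_units
```

A higher incoming defect rate keeps the plan in the screening phase longer, so the average fraction inspected (and hence the inspection burden the ACSP-1 tries to reduce) rises with the defect rate.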

  20. Knowledge acquisition for a simple expert controller

    NASA Technical Reports Server (NTRS)

    Bieker, B.

    1987-01-01

    A method is presented for process control which is incremental, cyclic and top-down. It is described on the basis of the development of an expert controller for a simple but nonlinear controlled system. A quality comparison between the expert controller and a process operator shows the suitability of the method for knowledge acquisition.

  1. Feasibility study of using statistical process control to customize quality assurance in proton therapy.

    PubMed

    Rah, Jeong-Eun; Shin, Dongho; Oh, Do Hoon; Kim, Tae Hyun; Kim, Gwe-Ya

    2014-09-01

    To evaluate and improve the reliability of proton quality assurance (QA) processes and to provide an optimal customized tolerance level using the statistical process control (SPC) methodology. The authors investigated the consistency check of dose per monitor unit (D/MU) and range in proton beams to see whether it was within the tolerance level of the daily QA process. This study analyzed the difference between the measured and calculated ranges along the central axis to improve the patient-specific QA process in proton beams by using process capability indices. The authors established a customized tolerance level of ±2% for D/MU and ±0.5 mm for beam range in the daily proton QA process. In the authors' analysis of the process capability indices, the patient-specific range measurements were capable of a specification limit of ±2% in clinical plans. SPC methodology is a useful tool for customizing optimal QA tolerance levels and improving the quality of proton machine maintenance, treatment delivery, and ultimately patient safety.
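The process capability indices mentioned here are standard SPC quantities: Cp compares the specification width to the process spread, and Cpk additionally penalizes an off-center process. The sketch below uses invented measured-minus-calculated range differences (mm) checked against the ±0.5 mm daily-QA tolerance from the abstract; the data values are illustrative assumptions.

```python
import statistics

def capability_indices(diffs, lsl, usl):
    """Return (Cp, Cpk) for a sample against lower/upper spec limits."""
    mu = statistics.mean(diffs)
    sigma = statistics.stdev(diffs)              # sample standard deviation
    cp = (usl - lsl) / (6 * sigma)               # spec width vs. spread
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)  # worst-side capability
    return cp, cpk

# Hypothetical measured-minus-calculated range differences (mm):
diffs = [0.05, -0.10, 0.12, 0.00, -0.06, 0.08, -0.02, 0.10]
cp, cpk = capability_indices(diffs, lsl=-0.5, usl=0.5)
```

A common rule of thumb treats Cpk ≥ 1.33 as "capable"; Cpk is always at most Cp, with equality only for a perfectly centered process.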

  2. Advances in Process Control.

    ERIC Educational Resources Information Center

    Morrison, David L.; And Others

    1982-01-01

    Advances in electronics and computer science have enabled industries (pulp/paper, iron/steel, petroleum/chemical) to attain better control of their processes with resulting increases in quality, productivity, profitability, and compliance with government regulations. (JN)

  3. 21 CFR 111.120 - What quality control operations are required for components, packaging, and labels before use in...

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... components, packaging, and labels before use in the manufacture of a dietary supplement? 111.120 Section 111..., OR HOLDING OPERATIONS FOR DIETARY SUPPLEMENTS Production and Process Control System: Requirements for... labels before use in the manufacture of a dietary supplement? Quality control operations for components...

  4. 21 CFR 111.120 - What quality control operations are required for components, packaging, and labels before use in...

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... components, packaging, and labels before use in the manufacture of a dietary supplement? 111.120 Section 111..., OR HOLDING OPERATIONS FOR DIETARY SUPPLEMENTS Production and Process Control System: Requirements for... labels before use in the manufacture of a dietary supplement? Quality control operations for components...

  5. 21 CFR 111.120 - What quality control operations are required for components, packaging, and labels before use in...

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... components, packaging, and labels before use in the manufacture of a dietary supplement? 111.120 Section 111..., OR HOLDING OPERATIONS FOR DIETARY SUPPLEMENTS Production and Process Control System: Requirements for... labels before use in the manufacture of a dietary supplement? Quality control operations for components...

  6. 21 CFR 111.120 - What quality control operations are required for components, packaging, and labels before use in...

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... components, packaging, and labels before use in the manufacture of a dietary supplement? 111.120 Section 111..., OR HOLDING OPERATIONS FOR DIETARY SUPPLEMENTS Production and Process Control System: Requirements for... labels before use in the manufacture of a dietary supplement? Quality control operations for components...

  7. 76 FR 5386 - Draft Guidance for Industry: Pre-Storage Leukocyte Reduction of Whole Blood and Blood Components...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-31

    ... intended for transfusion, including recommendations for validation and quality control monitoring of the..., including recommendations for validation and quality control monitoring of the leukocyte reduction process... control number 0910-0052; the collections of information in 21 CFR 606.100(b), 606.100(c), and 606.121...

  8. 21 CFR 111.120 - What quality control operations are required for components, packaging, and labels before use in...

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... components, packaging, and labels before use in the manufacture of a dietary supplement? 111.120 Section 111..., OR HOLDING OPERATIONS FOR DIETARY SUPPLEMENTS Production and Process Control System: Requirements for... labels before use in the manufacture of a dietary supplement? Quality control operations for components...

  9. The design of control system of livestock feeding processing

    NASA Astrophysics Data System (ADS)

    Sihombing, Juna; Napitupulu, Humala L.; Hidayati, Juliza

    2018-03-01

    PT. XYZ is a company that produces animal feed. One type of animal feed produced is 105 ISA P. In its production process, PT. XYZ faced the problem of rejected feed from 2014 to June 2015, due to feed exceeding the quality standards of 13% moisture content and 3% ash content. Therefore, the researchers analyzed the relationship between the factors affecting quality and the extent of damage using regression and correlation, and determined the optimum value of each processing step. The analysis found that the variables affecting product quality are mixing time, steam conditioning temperature and cooling time. The variable most dominantly affecting the product moisture content is mixing time, with a correlation coefficient of 0.7959, and the variable most dominantly affecting the ash content of the product during processing is also mixing time, with a correlation coefficient of 0.8541. The proposed product processing control is to run the process with a mixing time of 235 seconds, a steam conditioning temperature of 87 °C and a cooling time of 192 seconds. The 105 ISA P product quality obtained using this design is 12.16% moisture content and 2.59% ash content.
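The correlation coefficients reported above are standard Pearson correlations between a process variable and a quality measure. A minimal sketch of the computation follows; the data pairs (mixing time vs. moisture content) are invented for illustration and are not the study's measurements.

```python
def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

# Hypothetical mixing times (s) vs. moisture content (%):
r = pearson_r([200, 215, 230, 245, 260], [11.8, 12.1, 12.2, 12.6, 12.9])
```

A value of r near +1, as reported for mixing time in the abstract, indicates a strong positive linear association between the process variable and the quality measure.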

  10. Generating Options for Active Risk Control (GO-ARC): introducing a novel technique.

    PubMed

    Card, Alan J; Ward, James R; Clarkson, P John

    2014-01-01

    After investing significant amounts of time and money in conducting formal risk assessments, such as root cause analysis (RCA) or failure mode and effects analysis (FMEA), healthcare workers are left to their own devices in generating high-quality risk control options. They often experience difficulty in doing so and tend toward an overreliance on administrative controls (the weakest category in the hierarchy of risk controls). This has important implications for patient safety and the cost effectiveness of risk management operations. This paper describes a before-and-after pilot study of the Generating Options for Active Risk Control (GO-ARC) technique, a novel tool to improve the quality of the risk control options generation process. Outcome measures were the quantity, quality (using the three-tiered hierarchy of risk controls), variety, and novelty of the risk controls generated. Use of the GO-ARC technique was associated with improvement on all measures. While this pilot study has some notable limitations, it appears that the GO-ARC technique improved the risk control options generation process. Further research is needed to confirm this finding. It is also important to note that improved risk control options are a necessary, but not sufficient, step toward the implementation of more robust risk controls. © 2013 National Association for Healthcare Quality.

  11. REDUCING WASTEWATER FROM CUCUMBER PICKLING PROCESS BY CONTROLLED CULTURE FERMENTATION

    EPA Science Inventory

    On a demonstration scale, the controlled culture fermentation process (CCF) developed by the U.S. Food Fermentation Laboratory was compared with the conventional natural fermentation process (NF) in regard to product quality and yield and volume and concentration of wastewaters. ...

  12. 42 CFR 480.107 - Limitations on redisclosure.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... (CONTINUED) QUALITY IMPROVEMENT ORGANIZATIONS ACQUISITION, PROTECTION, AND DISCLOSURE OF QUALITY IMPROVEMENT ORGANIZATION REVIEW INFORMATION Utilization and Quality Control Quality Improvement Organizations (QIOs... the Act or for CMS to process sanctions under section 1156 of the Act; (d) If the health care services...

  13. Incorporating Handling Qualities Analysis into Rotorcraft Conceptual Design

    NASA Technical Reports Server (NTRS)

    Lawrence, Ben

    2014-01-01

    This paper describes the initial development of a framework to incorporate handling qualities analyses into a rotorcraft conceptual design process. In particular, the paper describes how rotorcraft conceptual design level data can be used to generate flight dynamics models for handling qualities analyses. Also, methods are described that couple a basic stability augmentation system to the rotorcraft flight dynamics model to extend the analysis beyond that of the bare airframe. A methodology for calculating the handling qualities characteristics of the flight dynamics models and for comparing the results to ADS-33E criteria is described. Preliminary results from the application of the handling qualities analysis for variations in key rotorcraft design parameters of main rotor radius, blade chord, hub stiffness and flap moment of inertia are shown. Varying relationships, with counteracting trends for different handling qualities criteria and different flight speeds, are exhibited, with the action of the control system playing a complex part in the outcomes. Overall, the paper demonstrates how a broad array of technical issues across flight dynamics stability and control, simulation and modeling, control law design, and handling qualities testing and evaluation had to be confronted to implement even a moderately comprehensive handling qualities analysis of relatively low fidelity models. A key outstanding issue is how to 'close the loop' with an overall design process, and options for exploring how to feed handling qualities results back to a conceptual design process are proposed for future work.

  14. Modified SPC for short run test and measurement process in multi-stations

    NASA Astrophysics Data System (ADS)

    Koh, C. K.; Chin, J. F.; Kamaruddin, S.

    2018-03-01

    Due to short production runs and the measurement error inherent in electronic test and measurement (T&M) processes, continuous quality monitoring through real-time statistical process control (SPC) is challenging. Industry practice allows the installation of a guard band using measurement uncertainty to reduce the width of the acceptance limit, as an indirect way to compensate for measurement errors. This paper presents a new SPC model combining a modified guard band and control charts (Z-bar chart and W chart) for short runs in T&M processes at multiple stations. The proposed model standardizes the observed value with the measurement target (T) and the rationed measurement uncertainty (U). An S-factor (S_f) is introduced into the control limits to improve sensitivity in detecting small shifts. The model was embedded in an automated quality control system and verified with a case study in real industry.
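The two ideas named in the abstract, standardizing observations against a station target and measurement uncertainty, and guard-banding acceptance limits, can be sketched generically. This is not the authors' exact model; the helper names and the factor k are assumptions introduced for illustration.

```python
def z_statistic(x, target, uncertainty):
    """Standardize an observation as Z = (x - T) / U, so readings from
    stations with different targets can share one short-run chart."""
    return (x - target) / uncertainty

def guard_banded_limits(lsl, usl, uncertainty, k=1.0):
    """Shrink (LSL, USL) inward by k * U (guard banding), an indirect
    compensation for measurement error at the acceptance decision."""
    return lsl + k * uncertainty, usl - k * uncertainty
```

Standardization lets short runs from many stations be pooled on a single chart, while the guard band trades some yield for protection against falsely accepting out-of-spec units.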

  15. Standard Reference Specimens in Quality Control of Engineering Surfaces

    PubMed Central

    Song, J. F.; Vorburger, T. V.

    1991-01-01

    In the quality control of engineering surfaces, we aim to understand and maintain a good relationship between the manufacturing process and surface function. This is achieved by controlling the surface texture. The control process involves: 1) learning the functional parameters and their control values through controlled experiments or through a long history of production and use; 2) maintaining high accuracy and reproducibility with measurements not only of roughness calibration specimens but also of real engineering parts. In this paper, the characteristics, utilizations, and limitations of different classes of precision roughness calibration specimens are described. A measuring procedure of engineering surfaces, based on the calibration procedure of roughness specimens at NIST, is proposed. This procedure involves utilization of check specimens with waveform, wavelength, and other roughness parameters similar to functioning engineering surfaces. These check specimens would be certified under standardized reference measuring conditions, or by a reference instrument, and could be used for overall checking of the measuring procedure and for maintaining accuracy and agreement in engineering surface measurement. The concept of “surface texture design” is also suggested, which involves designing the engineering surface texture, the manufacturing process, and the quality control procedure to meet the optimal functional needs. PMID:28184115

  16. Causes of cine image quality deterioration in cardiac catheterization laboratories.

    PubMed

    Levin, D C; Dunham, L R; Stueve, R

    1983-10-01

    Deterioration of cineangiographic image quality can result from malfunctions or technical errors at a number of points along the cine imaging chain: generator and automatic brightness control, x-ray tube, x-ray beam geometry, image intensifier, optics, cine camera, cine film, film processing, and cine projector. Such malfunctions or errors can result in loss of image contrast, loss of spatial resolution, improper control of film optical density (brightness), or some combination thereof. While the electronic and photographic technology involved is complex, physicians who perform cardiac catheterization should be conversant with the problems and what can be done to solve them. Catheterization laboratory personnel have control over a number of factors that directly affect image quality, including radiation dose rate per cine frame, kilovoltage or pulse width (depending on type of automatic brightness control), cine run time, selection of small or large focal spot, proper object-intensifier distance and beam collimation, aperture of the cine camera lens, selection of cine film, processing temperature, processing immersion time, and selection of developer.

  17. 10 CFR 830.122 - Quality assurance criteria.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... control design interfaces. (4) Verify or validate the adequacy of design products using individuals or... quality problems. (2) Identify, control, and correct items, services, and processes that do not meet..., specify requirements, or establish design. (2) Specify, prepare, review, approve, and maintain records. (e...

  18. 10 CFR 830.122 - Quality assurance criteria.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... control design interfaces. (4) Verify or validate the adequacy of design products using individuals or... quality problems. (2) Identify, control, and correct items, services, and processes that do not meet..., specify requirements, or establish design. (2) Specify, prepare, review, approve, and maintain records. (e...

  19. 10 CFR 830.122 - Quality assurance criteria.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... control design interfaces. (4) Verify or validate the adequacy of design products using individuals or... quality problems. (2) Identify, control, and correct items, services, and processes that do not meet..., specify requirements, or establish design. (2) Specify, prepare, review, approve, and maintain records. (e...

  20. 10 CFR 830.122 - Quality assurance criteria.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... control design interfaces. (4) Verify or validate the adequacy of design products using individuals or... quality problems. (2) Identify, control, and correct items, services, and processes that do not meet..., specify requirements, or establish design. (2) Specify, prepare, review, approve, and maintain records. (e...

  1. 10 CFR 830.122 - Quality assurance criteria.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... control design interfaces. (4) Verify or validate the adequacy of design products using individuals or... quality problems. (2) Identify, control, and correct items, services, and processes that do not meet..., specify requirements, or establish design. (2) Specify, prepare, review, approve, and maintain records. (e...

  2. Implementation of quality by design toward processing of food products.

    PubMed

    Rathore, Anurag S; Kapoor, Gautam

    2017-05-28

    Quality by design (QbD) is a systematic approach that begins with predefined objectives and emphasizes product and process understanding and process control. It is based on principles of sound science and quality risk management. As the food processing industry continues to embrace in-line, online, and/or at-line sensors and real-time characterization for process monitoring and control, the existing gaps in our ability to monitor the multiple parameters/variables associated with the manufacturing process will be alleviated over time. Investments in developing tools and approaches that facilitate high-throughput analytical and process development, process analytical technology, design of experiments, risk analysis, knowledge management, and enhancement of process/product understanding will pave the way for operational and economic benefits later in the commercialization process and across other product pipelines. This article aims to achieve two major objectives: first, to review the progress made in recent years on QbD implementation in the processing of food products, and second, to present a case study that illustrates the benefits of such QbD implementation.

  3. Sigma Metrics Across the Total Testing Process.

    PubMed

    Charuruks, Navapun

    2017-03-01

    Laboratory quality control has been developed over several decades to ensure patient safety, broadening from a statistical quality control focus on the analytical phase to the total laboratory process. The sigma concept provides a convenient way to quantify the number of errors in the extra-analytical and analytical phases through the defects-per-million count and the sigma-metric equation. Participation in a sigma verification program can be a convenient way to monitor analytical performance for continuous quality improvement. Improvement in sigma-scale performance has been shown in our data. New tools and techniques for integration are needed. Copyright © 2016 Elsevier Inc. All rights reserved.
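    The defects-per-million and sigma-metric calculations mentioned in this abstract follow standard formulas; a hedged sketch with illustrative numbers (not taken from the article):

```python
from statistics import NormalDist

def sigma_from_tea(tea_pct: float, bias_pct: float, cv_pct: float) -> float:
    """Analytical-phase sigma metric: (allowable total error - bias) / CV."""
    return (tea_pct - bias_pct) / cv_pct

def sigma_from_dpm(dpm: float, shift: float = 1.5) -> float:
    """Convert a defects-per-million rate to a sigma level, applying the
    conventional 1.5-sigma long-term shift."""
    return NormalDist().inv_cdf(1 - dpm / 1_000_000) + shift

print(round(sigma_from_tea(10.0, 1.0, 1.5), 1))  # 6.0
print(round(sigma_from_dpm(3.4), 1))             # 6.0 (the classic 3.4 DPM)
```

    Inputs are percentages: allowable total error (TEa) from quality goals, and bias and CV from method validation or QC data.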

  4. Quality Control Pathways for Nucleus-Encoded Eukaryotic tRNA Biosynthesis and Subcellular Trafficking

    PubMed Central

    Huang, Hsiao-Yun

    2015-01-01

    tRNAs perform an essential role in translating the genetic code. They are long-lived RNAs that are generated via numerous posttranscriptional steps. Eukaryotic cells have evolved numerous layers of quality control mechanisms to ensure that the tRNAs are appropriately structured, processed, and modified. We describe the known tRNA quality control processes that check tRNAs and correct or destroy aberrant tRNAs. These mechanisms employ two types of exonucleases, CCA end addition, tRNA nuclear aminoacylation, and tRNA subcellular traffic. We arrange these processes in order of the steps that occur from generation of precursor tRNAs by RNA polymerase (Pol) III transcription to end maturation and modification in the nucleus to splicing and additional modifications in the cytoplasm. Finally, we discuss the tRNA retrograde pathway, which allows tRNA reimport into the nucleus for degradation or repair. PMID:25848089

  5. Monitoring Processes in Visual Search Enhanced by Professional Experience: The Case of Orange Quality-Control Workers

    PubMed Central

    Visalli, Antonino; Vallesi, Antonino

    2018-01-01

    Visual search tasks have often been used to investigate how cognitive processes change with expertise. Several studies have shown visual experts' advantages in detecting objects related to their expertise. Here, we tried to extend these findings by investigating whether professional search experience could boost top-down monitoring processes involved in visual search, independently of advantages specific to objects of expertise. To this aim, we recruited a group of quality-control workers employed in citrus farms. Given the specific features of this type of job, we expected that the extensive employment of monitoring mechanisms during orange selection could enhance these mechanisms even in search situations in which orange-related expertise is not suitable. To test this hypothesis, we compared performance of our experimental group and of a well-matched control group on a computerized visual search task. In one block the target was an orange (expertise target) while in the other block the target was a Smurfette doll (neutral target). The a priori hypothesis was to find an advantage for quality-controllers in those situations in which monitoring was especially involved, that is, when deciding the presence/absence of the target required a more extensive inspection of the search array. Results were consistent with our hypothesis. Quality-controllers were faster in those conditions that extensively required monitoring processes, specifically, the Smurfette-present and both target-absent conditions. No differences emerged in the orange-present condition, which appeared to rely mainly on bottom-up processes. These results suggest that top-down processes in visual search can be enhanced through immersive real-life experience beyond visual expertise advantages. PMID:29497392

  6. [Quality control of laser imagers].

    PubMed

    Winkelbauer, F; Ammann, M; Gerstner, N; Imhof, H

    1992-11-01

    Multiformat imagers based on laser systems are used for documentation in an increasing number of investigations. The specific problems of quality control are explained, and the constancy of film processing is investigated in imager systems of different configuration, with (Machine 1: 3M-Laser-Imager-Plus M952 with connected 3M film processor, 3M IRB film, X-ray chemical mixer 3M-XPM, 3M developer and fixer) or without (Machine 2: 3M-Laser-Imager-Plus M952 with separate DuPont-Cronex film processor, Kodak IR film, Kodak automixer, Kodak developer and fixer) a connected film-processing unit. In our checks based on DIN 6868 and ONORM S 5240, the equipment with the directly adapted film-processing unit showed film-processing constancy in accordance with DIN and ONORM; constancy checks as demanded by DIN 6868 could therefore be performed at longer intervals for this equipment. Systems with conventional darkroom processing show clearly increased fluctuation by comparison, and hence the demanded daily control is essential to guarantee appropriate reaction and constant quality of documentation.

  7. Opportunities and challenges of real-time release testing in biopharmaceutical manufacturing.

    PubMed

    Jiang, Mo; Severson, Kristen A; Love, John Christopher; Madden, Helena; Swann, Patrick; Zang, Li; Braatz, Richard D

    2017-11-01

    Real-time release testing (RTRT) is defined as "the ability to evaluate and ensure the quality of in-process and/or final drug product based on process data, which typically includes a valid combination of measured material attributes and process controls" (ICH Q8[R2]). This article discusses sensors (process analytical technology, PAT) and control strategies that enable RTRT for the spectrum of critical quality attributes (CQAs) in biopharmaceutical manufacturing. Case studies from the small-molecule and biologic pharmaceutical industry are described to demonstrate how RTRT can be facilitated by integrated manufacturing and multivariable control strategies to ensure the quality of products. RTRT can enable increased assurance of product safety, efficacy, and quality-with improved productivity including faster release and potentially decreased costs-all of which improve the value to patients. To implement a complete RTRT solution, biologic drug manufacturers need to consider the special attributes of their industry, particularly sterility and the measurement of viral and microbial contamination. Continued advances in on-line and in-line sensor technologies are key for the biopharmaceutical manufacturing industry to achieve the potential of RTRT. Related article: http://onlinelibrary.wiley.com/doi/10.1002/bit.26378/full. © 2017 Wiley Periodicals, Inc.

  8. 10 CFR 71.119 - Control of special processes.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 2 2012-01-01 2012-01-01 false Control of special processes. 71.119 Section 71.119 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) PACKAGING AND TRANSPORTATION OF RADIOACTIVE MATERIAL Quality... shall establish measures to assure that special processes, including welding, heat treating, and...

  9. 10 CFR 71.119 - Control of special processes.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 10 Energy 2 2013-01-01 2013-01-01 false Control of special processes. 71.119 Section 71.119 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) PACKAGING AND TRANSPORTATION OF RADIOACTIVE MATERIAL Quality... shall establish measures to assure that special processes, including welding, heat treating, and...

  10. 10 CFR 71.119 - Control of special processes.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 2 2014-01-01 2014-01-01 false Control of special processes. 71.119 Section 71.119 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) PACKAGING AND TRANSPORTATION OF RADIOACTIVE MATERIAL Quality... shall establish measures to assure that special processes, including welding, heat treating, and...

  11. 10 CFR 71.119 - Control of special processes.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 2 2011-01-01 2011-01-01 false Control of special processes. 71.119 Section 71.119 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) PACKAGING AND TRANSPORTATION OF RADIOACTIVE MATERIAL Quality... shall establish measures to assure that special processes, including welding, heat treating, and...

  12. 42 CFR 475.102 - Eligibility of physician-sponsored organizations.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... AND HUMAN SERVICES (CONTINUED) QUALITY IMPROVEMENT ORGANIZATIONS QUALITY IMPROVEMENT ORGANIZATIONS Utilization and Quality Control Quality Improvement Organizations § 475.102 Eligibility of physician-sponsored..., during the contract evaluation process, a set number of bonus points. [49 FR 7207, Feb. 27, 1984...

  13. Quality at a Glance

    DTIC Science & Technology

    1990-01-01

    This document contains summaries of fifteen of the well-known books which underlie the Total Quality Management philosophy. Members of the DCASR St Louis staff offer comments and opinions on how the authors have presented the quality concept in today's business environment. Keywords: TQM (Total Quality Management), Quality concepts, Statistical process control.

  14. Validation of gamma irradiator controls for quality and regulatory compliance

    NASA Astrophysics Data System (ADS)

    Harding, Rorry B.; Pinteric, Francis J. A.

    1995-09-01

    Since 1978 the U.S. Food and Drug Administration (FDA) has had both the legal authority and the Current Good Manufacturing Practice (CGMP) regulations in place to require irradiator owners who process medical devices to produce evidence of Irradiation Process Validation. One of the key components of Irradiation Process Validation is the validation of the irradiator controls. However, it is only recently that FDA audits have focused on this component of the process validation. What is Irradiator Control System Validation? What constitutes evidence of control? How do owners obtain evidence? What is the irradiator supplier's role in validation? How does the ISO 9000 Quality Standard relate to the FDA's CGMP requirement for evidence of Control System Validation? This paper presents answers to these questions based on the recent experiences of Nordion's engineering and product management staff who have worked with several US-based irradiator owners. This topic — Validation of Irradiator Controls — is a significant regulatory compliance and operations issue within the irradiator suppliers' and users' community.

  15. System-wide hybrid MPC-PID control of a continuous pharmaceutical tablet manufacturing process via direct compaction.

    PubMed

    Singh, Ravendra; Ierapetritou, Marianthi; Ramachandran, Rohit

    2013-11-01

    The next generation of QbD based pharmaceutical products will be manufactured through continuous processing. This will allow the integration of online/inline monitoring tools, coupled with an efficient advanced model-based feedback control system, to achieve precise control of process variables, so that the predefined product quality can be achieved consistently. The direct compaction process considered in this study is highly interactive and involves time delays for a number of process variables due to sensor placements, process equipment dimensions, and the flow characteristics of the solid material. A simple feedback regulatory control system (e.g., PI(D)) by itself may not be sufficient to achieve the tight process control that is mandated by regulatory authorities. The process presented herein comprises coupled dynamics involving slow and fast responses, indicating the requirement of a hybrid control scheme such as a combined MPC-PID control scheme. In this manuscript, an efficient system-wide hybrid control strategy for an integrated continuous pharmaceutical tablet manufacturing process via direct compaction has been designed. The designed control system is a hybrid scheme of MPC-PID control. An effective controller parameter tuning strategy involving an ITAE method coupled with an optimization strategy has been used for tuning of both MPC and PID parameters. The designed hybrid control system has been implemented in a first-principles model-based flowsheet that was simulated in gPROMS (Process System Enterprise). Results demonstrate enhanced performance of critical quality attributes (CQAs) under the hybrid control scheme compared to only PID or MPC control schemes, illustrating the potential of a hybrid control scheme in improving pharmaceutical manufacturing operations. Copyright © 2013 Elsevier B.V. All rights reserved.
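    The PID half of the hybrid scheme and the ITAE tuning criterion can be sketched in a few lines. This is a generic illustration on a hypothetical first-order plant with arbitrary gains, not the authors' gPROMS flowsheet model:

```python
# Discrete PID loop on a hypothetical first-order plant, scored with the
# ITAE criterion (integral of time-weighted absolute error) used for tuning.
def simulate_pid(kp, ki, kd, setpoint=1.0, dt=0.1, steps=300, tau=1.0):
    y, integ, prev_err, itae = 0.0, 0.0, setpoint, 0.0
    for k in range(steps):
        err = setpoint - y
        integ += err * dt
        deriv = (err - prev_err) / dt
        u = kp * err + ki * integ + kd * deriv   # PID control law
        prev_err = err
        y += dt * (-y + u) / tau                 # plant: tau * dy/dt = -y + u
        itae += (k * dt) * abs(err) * dt         # time-weighted error score
    return y, itae

y_final, score = simulate_pid(kp=2.0, ki=1.0, kd=0.1)
```

    A tuning routine would minimize `score` over (kp, ki, kd); in the hybrid scheme, an MPC layer replaces the PID law for the slow, strongly interacting loops.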

  16. Challenges in Special Steel Making

    NASA Astrophysics Data System (ADS)

    Balachandran, G.

    2018-02-01

    Special bar quality (SBQ) is a long steel product for which assured quality is delivered by the steel mill to its customer. The bars have enhanced tolerance to higher stress applications and are demanded for specialised component making. SBQ bars are sought by component-making processing units such as closed-die hot forging, hot extrusion, cold forging, machining, heat treatment and welding operations. The final component quality at the secondary processing units depends on the quality maintained at the steel maker's end along with the quality maintained at the fabricator's end; thus, quality control is ensured at every unit process stage. The ever-growing range of market segments catered to by the SBQ steel segment is reviewed. Steel mills need adequate infrastructure and technological capability to make these higher quality steels. Some of the critical stages of SBQ processing and the critical quality maintenance parameters at the steel mill are brought out.

  17. GEOSPATIAL QUALITY COUNCIL

    EPA Science Inventory

    Geospatial Science is increasingly becoming an important tool in making Agency decisions. Quality Control and Quality Assurance are required to be integrated during the planning, implementation and assessment of geospatial databases, processes and products. In order to ensure Age...

  18. Protein Quality Control and the Amyotrophic Lateral Sclerosis/Frontotemporal Dementia Continuum

    PubMed Central

    Shahheydari, Hamideh; Ragagnin, Audrey; Walker, Adam K.; Toth, Reka P.; Vidal, Marta; Jagaraj, Cyril J.; Perri, Emma R.; Konopka, Anna; Sultana, Jessica M.; Atkin, Julie D.

    2017-01-01

    Protein homeostasis, or proteostasis, has an important regulatory role in cellular function. Protein quality control mechanisms, including protein folding and protein degradation processes, have a crucial function in post-mitotic neurons. Cellular protein quality control relies on multiple strategies, including molecular chaperones, autophagy, the ubiquitin proteasome system, endoplasmic reticulum (ER)-associated degradation (ERAD) and the formation of stress granules (SGs), to regulate proteostasis. Neurodegenerative diseases are characterized by the presence of misfolded protein aggregates, implying that protein quality control mechanisms are dysfunctional in these conditions. Amyotrophic lateral sclerosis (ALS) and frontotemporal dementia (FTD) are neurodegenerative diseases that are now recognized to overlap clinically and pathologically, forming a continuous disease spectrum. In this review article, we detail the evidence for dysregulation of protein quality control mechanisms across the whole ALS-FTD continuum, by discussing the major proteins implicated in ALS and/or FTD. We also discuss possible ways in which protein quality mechanisms could be targeted therapeutically in these disorders and highlight promising protein quality control-based therapeutics for clinical trials. PMID:28539871

  19. The Status of the Quality Control in Acupuncture-Neuroimaging Studies

    PubMed Central

    Qiu, Ke; Jing, Miaomiao; Liu, Xiaoyan; Gao, Feifei; Liang, Fanrong; Zeng, Fang

    2016-01-01

    Using neuroimaging techniques to explore the central mechanism of acupuncture gains increasing attention, but the quality control of acupuncture-neuroimaging studies remains to be improved. We searched the PubMed database from 1995 to 2014. Original English-language articles with neuroimaging scans performed on human beings were included. Data relevant to quality control, including the author, sample size, characteristics of the participants, neuroimaging technology, and acupuncture intervention, were extracted and analyzed. Rigorous inclusion and exclusion criteria are an important guarantee of participant homogeneity. A standard operating process for acupuncture and stricter requirements for the acupuncturist play a significant role in quality control. More attention should be paid to quality control in future studies to improve the reproducibility and reliability of acupuncture-neuroimaging studies. PMID:27242911

  20. Chemometrics-based process analytical technology (PAT) tools: applications and adaptation in pharmaceutical and biopharmaceutical industries.

    PubMed

    Challa, Shruthi; Potumarthi, Ravichandra

    2013-01-01

    Process analytical technology (PAT) is used to monitor and control critical process parameters in raw materials and in-process products to maintain the critical quality attributes and build quality into the product. Process analytical technology can be successfully implemented in pharmaceutical and biopharmaceutical industries not only to impart quality into the products but also to prevent out-of-specification results and improve productivity. PAT implementation eliminates the drawbacks of traditional methods, which involve excessive sampling, and facilitates rapid testing through direct sampling without any destruction of the sample. However, to successfully adapt PAT tools into pharmaceutical and biopharmaceutical environments, thorough understanding of the process is needed along with mathematical and statistical tools to analyze the large multidimensional spectral data generated by PAT tools. Chemometrics is a chemical discipline that incorporates both statistical and mathematical methods to obtain and analyze relevant information from PAT spectral tools. Applications of commonly used PAT tools in combination with appropriate chemometric methods, along with their advantages and working principles, are discussed. Finally, the systematic application of PAT tools in a biopharmaceutical environment to control critical process parameters for achieving product quality is diagrammatically represented.
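    A minimal chemometric calibration of the kind applied to PAT spectral data can be sketched with classical least squares. The spectra below are simulated for illustration; a real application would typically use PLS or PCA on measured spectra:

```python
import numpy as np

# Simulated single-analyte spectra: concentration * pure band + noise.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 50)
pure = np.exp(-((x - 0.5) ** 2) / 0.02)        # Gaussian absorption band
conc = np.array([0.2, 0.5, 1.0, 1.5, 2.0])     # calibration concentrations
spectra = np.outer(conc, pure) + rng.normal(0, 0.01, (5, 50))

# Classical least squares: estimate the pure-component spectrum from the
# calibration set, then project a new sample onto it to predict its
# (hypothetical) concentration of 0.8.
k, *_ = np.linalg.lstsq(conc[:, None], spectra, rcond=None)
new_spectrum = 0.8 * pure + rng.normal(0, 0.01, 50)
pred = float(new_spectrum @ k[0] / (k[0] @ k[0]))
```

    In a PAT setting the same projection runs on every in-line spectrum, turning raw spectral data into a real-time estimate of a critical process parameter.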

  1. Sustainable improvement of animal health care by systematic quality risk management according to the HACCP concept.

    PubMed

    Noordhuizen, J P; Welpelo, H J

    1996-12-01

    This paper addresses the principles of the Hazard Analysis Critical Control Point (HACCP) concept as applied to animal health management strategy. Characteristics of the concept were analysed and compared with those of current animal health care strategies for disease risk identification and herd health management, insurance, and certification. HACCP is a hybrid strategy of quality control at both production process and product level. Animal health is considered a particular quality feature. We show that process control (expressed in terms of controlling both general and specific disease risk factors) and product control (expressed in terms of testing animals or animal products for specific disease agents) could form the basis for improving animal health. We conclude that HACCP provides ample opportunity for preventive health action and risk management at a relatively low cost in terms of labour, finance and documentation expenditure, at both the farm and sector level. Epidemiological field studies are currently needed to identify critical control points and to design HACCP procedures for livestock producers. In the long run, HACCP based animal health care can be further developed into a quality control systems approach to cover all aspects that are related, either directly or indirectly, to animal health.

  2. An In-Process Surface Roughness Recognition System in End Milling Operations

    ERIC Educational Resources Information Center

    Yang, Lieh-Dai; Chen, Joseph C.

    2004-01-01

    To develop an in-process quality control system, a sensor technique and a decision-making algorithm need to be applied during machining operations. Several sensor techniques have been used in the in-process prediction of quality characteristics in machining operations. For example, an accelerometer sensor can be used to monitor the vibration of…

  3. The Validity of Higher-Order Questions as a Process Indicator of Educational Quality

    ERIC Educational Resources Information Center

    Renaud, Robert D.; Murray, Harry G.

    2007-01-01

    One way to assess the quality of education in post-secondary institutions is through the use of performance indicators. Studies that have compared currently popular process indicators (e.g., library size, percentage of faculty with PhD) found that after controlling for incoming student ability, these process indicators tend to be weakly associated…

  4. Nuclear Technology Series. Course 23: Nuclear Chemical Processes.

    ERIC Educational Resources Information Center

    Center for Occupational Research and Development, Inc., Waco, TX.

    This technical specialty course is one of thirty-five courses designed for use by two-year postsecondary institutions in five nuclear technician curriculum areas: (1) radiation protection technician, (2) nuclear instrumentation and control technician, (3) nuclear materials processing technician, (4) nuclear quality-assurance/quality-control…

  5. Nuclear Technology Series. Course 33: Control Systems I.

    ERIC Educational Resources Information Center

    Center for Occupational Research and Development, Inc., Waco, TX.

    This technical specialty course is one of thirty-five courses designed for use by two-year postsecondary institutions in five nuclear technician curriculum areas: (1) radiation protection technician, (2) nuclear instrumentation and control technician, (3) nuclear materials processing technician, (4) nuclear quality-assurance/quality-control…

  6. Nuclear Technology Series. Course 34: Control Systems II.

    ERIC Educational Resources Information Center

    Center for Occupational Research and Development, Inc., Waco, TX.

    This technical specialty course is one of thirty-five courses designed for use by two-year postsecondary institutions in five nuclear technician curriculum areas: (1) radiation protection technician, (2) nuclear instrumentation and control technician, (3) nuclear materials processing technician, (4) nuclear quality-assurance/quality-control…

  7. 40 CFR 60.21 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... control equipment or process change; and (5) Final compliance. (i) Region means an air quality control... standard of performance for new stationary sources, but for which air quality criteria have not been issued... account the cost of such reduction) the Administrator has determined has been adequately demonstrated for...

  8. Make no mistake—errors can be controlled*

    PubMed Central

    Hinckley, C

    2003-01-01

    

 Traditional quality control methods identify "variation" as the enemy. However, the control of variation by itself can never achieve the remarkably low non-conformance rates of world class quality leaders. Because the control of variation does not achieve the highest levels of quality, an inordinate focus on these techniques obscures key quality improvement opportunities and results in unnecessary pain and suffering for patients, and embarrassment, litigation, and loss of revenue for healthcare providers. Recent experience has shown that mistakes are the most common cause of problems in health care as well as in other industrial environments. Excessive product and process complexity contributes to both excessive variation and unnecessary mistakes. The best methods for controlling variation, mistakes, and complexity are each a form of mistake proofing. Using these mistake proofing techniques, virtually every mistake and non-conformance can be controlled at a fraction of the cost of traditional quality control methods. PMID:14532368

  9. Application of QC_DR software for acceptance testing and routine quality control of direct digital radiography systems: initial experiences using the Italian Association of Physicist in Medicine quality control protocol.

    PubMed

    Nitrosi, Andrea; Bertolini, Marco; Borasi, Giovanni; Botti, Andrea; Barani, Adriana; Rivetti, Stefano; Pierotti, Luisa

    2009-12-01

    Ideally, medical x-ray imaging systems should be designed to deliver maximum image quality at an acceptable radiation risk to the patient. Quality assurance procedures are employed to ensure that these standards are maintained. A quality control protocol for direct digital radiography (DDR) systems is described and discussed. Software to automatically process and analyze the required images was developed. In this paper, the initial results obtained on equipment of different DDR manufacturers were reported. The protocol was developed to highlight even small discrepancies in standard operating performance.

  10. Quality Space and Launch Requirements, Addendum to AS9100C

    DTIC Science & Technology

    2015-05-08

    [Fragmentary front-matter excerpt: table of contents and acronym list] Section 8.9.1 covers Statistical Process Control (SPC). Acronyms defined include SMC (Space and Missile Systems Center), SME (Subject Matter Expert), SOW (Statement of Work), SPC (Statistical Process Control), SPO (System Program Office), SRP... The recoverable body text notes that out-of-control conditions can occur without any individual data point exceeding the control limits, and that control limits are developed using standard statistical methods or other approved...

  11. Applying Statistical Process Quality Control Methodology to Educational Settings.

    ERIC Educational Resources Information Center

    Blumberg, Carol Joyce

    A subset of Statistical Process Control (SPC) methodology known as control charting is introduced. SPC methodology is a collection of graphical and inferential statistics techniques used to study the progress of phenomena over time. The types of control charts covered are the X̄ (mean), R (range), X (individual observations), and MR (moving…

  12. Improving Service Delivery in a County Health Department WIC Clinic: An Application of Statistical Process Control Techniques

    PubMed Central

    Boe, Debra Thingstad; Parsons, Helen

    2009-01-01

    Local public health agencies are challenged to continually improve service delivery, yet they frequently operate with constrained resources. Quality improvement methods and techniques such as statistical process control are commonly used in other industries, and they have recently been proposed as a means of improving service delivery and performance in public health settings. We analyzed a quality improvement project undertaken at a local Special Supplemental Nutrition Program for Women, Infants, and Children (WIC) clinic to reduce waiting times and improve client satisfaction with a walk-in nutrition education service. We used statistical process control techniques to evaluate initial process performance, implement an intervention, and assess process improvements. We found that implementation of these techniques significantly reduced waiting time and improved clients' satisfaction with the WIC service. PMID:19608964
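    The statistical process control technique applied in this study can be illustrated with an individuals (XmR) chart; the waiting times below are made-up numbers, not the clinic's data:

```python
from statistics import mean

# Individuals (XmR) chart: limits = mean +/- 2.66 * average moving range,
# where 2.66 is the standard XmR chart constant. Times (minutes) are illustrative.
waits = [22, 25, 19, 28, 24, 21, 26, 23, 27, 20]
moving_ranges = [abs(b - a) for a, b in zip(waits, waits[1:])]
center = mean(waits)
ucl = center + 2.66 * mean(moving_ranges)   # upper control limit
lcl = center - 2.66 * mean(moving_ranges)   # lower control limit
out_of_control = [w for w in waits if not (lcl <= w <= ucl)]
print(round(center, 1), round(lcl, 1), round(ucl, 1), out_of_control)  # 23.5 10.5 36.5 []
```

    Points outside the limits, or non-random runs within them, would signal that an intervention changed the process rather than ordinary common-cause variation.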

  13. Quality evaluation and control of end cap welds in PHWR fuel elements by ultrasonic examination

    NASA Astrophysics Data System (ADS)

    Choi, M. S.; Yang, M. S.

    1991-02-01

    The current quality control procedure for nuclear fuel end cap welds depends mainly on destructive metallographic examination. A nondestructive examination technique, ultrasonic examination, has been developed to identify and evaluate weld discontinuities. A few interesting results of weld quality evaluation obtained by applying the developed ultrasonic examination technique to PHWR fuel welds are presented. In addition, the feasibility of weld quality control by ultrasonic examination is discussed. This study shows that ultrasonic examination is an effective and reliable method for detecting abnormal weld contours and weld discontinuities such as micro-fissures, cracks, upset splits and expulsion, and can be used as a quality control tool for the end cap welding process.

  14. Self-regulation and quality of life in high-functioning young adults with autism.

    PubMed

    Dijkhuis, Renee R; Ziermans, Tim B; Van Rijn, Sophie; Staal, Wouter G; Swaab, Hanna

    2017-10-01

    Autism is generally associated with poor functional outcome but little is known about predictors of quality of life, especially during early adulthood. This study was conducted to assess subjective quality of life during early adulthood in high-functioning autism spectrum disorder and its relation with self-regulating abilities. Individuals with high-functioning autism spectrum disorder who progressed into post-secondary higher education (N = 75) were compared to a typical peer control group (N = 28) based on behavioral self-report questionnaires. The results indicated that individuals with high-functioning autism spectrum disorder reported significantly lower subjective quality of life than typical controls (p < 0.001, effect size d = 1.84). In addition, individuals with high-functioning autism spectrum disorder reported more problems with emotion processing (p < 0.05, d = 0.79) and daily executive functioning (p < 0.001, d = 1.29) than controls. A higher level of executive functioning problems was related to lower quality of life in the high-functioning autism spectrum disorder group, but no significant relation between level of emotion processing and subjective quality of life became apparent in the regression analysis. Our findings show that even in high-functioning young adults with autism, executive functioning, emotion processing, and subjective quality of life are low compared to typically developing peers. Furthermore, these results emphasize the importance of targeting executive functioning problems in individuals with autism to improve subjective quality of life.

  15. Regulatory mechanisms of RNA function: emerging roles of DNA repair enzymes.

    PubMed

    Jobert, Laure; Nilsen, Hilde

    2014-07-01

    The acquisition of an appropriate set of chemical modifications is required in order to establish correct structure of RNA molecules, and essential for their function. Modification of RNA bases affects RNA maturation, RNA processing, RNA quality control, and protein translation. Some RNA modifications are directly involved in the regulation of these processes. RNA epigenetics is emerging as a mechanism to achieve dynamic regulation of RNA function. Other modifications may prevent or be a signal for degradation. All types of RNA species are subject to processing or degradation, and numerous cellular mechanisms are involved. Unexpectedly, several studies during the last decade have established a connection between DNA and RNA surveillance mechanisms in eukaryotes. Several proteins that respond to DNA damage, either to process or to signal the presence of damaged DNA, have been shown to participate in RNA quality control, turnover or processing. Some enzymes that repair DNA damage may also process modified RNA substrates. In this review, we give an overview of the DNA repair proteins that function in RNA metabolism. We also discuss the roles of two base excision repair enzymes, SMUG1 and APE1, in RNA quality control.

  16. Thermal Catalytic Oxidation of Airborne Contaminants by a Reactor Using Ultra-Short Channel Length, Monolithic Catalyst Substrates

    NASA Technical Reports Server (NTRS)

    Perry, J. L.; Tomes, K. M.; Tatara, J. D.

    2005-01-01

    Contaminated air, whether in a crewed spacecraft cabin or in terrestrial work and living spaces, is a pervasive problem affecting human health, performance, and well-being. The need for highly effective, economical air quality processes spans a wide range of terrestrial and space flight applications. Typically, air quality control relies on adsorption-based processes. Most industrial packed-bed adsorption processes use activated carbon. Once saturated, the carbon is either dumped or regenerated. In either case, the dumped carbon and concentrated waste streams constitute a hazardous waste that must be handled safely while minimizing environmental impact. Thermal catalytic oxidation processes designed to address these waste handling issues are moving to the forefront of air quality control and process gas decontamination. Careful consideration in designing the catalyst substrate and reactor can lead to more complete contaminant destruction and poisoning resistance. Maintenance improvements leading to reduced waste handling and process downtime can also be realized. Performance of a prototype thermal catalytic reactor based on an ultra-short channel length, monolith catalyst substrate design, under a variety of process flow and contaminant loading conditions, is discussed.

  17. Patient perspectives of telemedicine quality

    PubMed Central

    LeRouge, Cynthia M; Garfield, Monica J; Hevner, Alan R

    2015-01-01

    Background The purpose of this study was to explore the quality attributes required for effective telemedicine encounters from the perspective of the patient. Methods We used a multi-method (direct observation, focus groups, survey) field study to collect data from patients who had experienced telemedicine encounters. Multiple perspectives (researcher and provider) were used to interpret a rich set of data from both a research and a practice standpoint. Results The result of this field study is a taxonomy of quality attributes for telemedicine service encounters that prioritizes the attributes from the patient perspective. We identify opportunities to control the level of quality for each attribute (i.e., who is responsible for control of each attribute and when control can be exerted in relation to the encounter process). This analysis reveals that many quality attributes are in the hands of various stakeholders, and all attributes can be addressed proactively to some degree before the encounter begins. Conclusion Identification of the quality attributes important to a telemedicine encounter from a patient perspective enables one to better design telemedicine encounters. This preliminary work not only identifies such attributes, but also ascertains who is best able to address quality issues prior to an encounter. For practitioners, explicit representation of the quality attributes of technology-based systems and processes, and insight into controlling key attributes, are essential to implementation, utilization, management, and common understanding. PMID:25565781

  18. 75 FR 32858 - Approval and Promulgation of Air Quality Implementation Plans; Delaware; Control of Nitrogen...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-10

    ... Boilers and Process Heaters at Petroleum Refineries Correction In rule document 2010-13377 beginning on... limitations for Control [Insert page number any industrial boiler or Requirements. where the document process...

  19. Process analytical technologies (PAT) in freeze-drying of parenteral products.

    PubMed

    Patel, Sajal Manubhai; Pikal, Michael

    2009-01-01

    Quality by Design (QbD) aims at assuring quality through proper design and control, utilizing appropriate Process Analytical Technologies (PAT) to monitor critical process parameters during processing and ensure that the product meets the desired quality attributes. This review provides a comprehensive list of process monitoring devices that can be used to monitor critical process parameters, and focuses on a critical assessment of the viability of the PAT schemes proposed. R&D needs in PAT for freeze-drying are also addressed, with particular emphasis on batch techniques that can be used on all dryers independent of dryer scale.

  20. Langley Wind Tunnel Data Quality Assurance-Check Standard Results

    NASA Technical Reports Server (NTRS)

    Hemsch, Michael J.; Grubb, John P.; Krieger, William B.; Cler, Daniel L.

    2000-01-01

    A framework for statistical evaluation, control, and improvement of wind tunnel measurement processes is presented. The methodology is adapted from elements of the Measurement Assurance Plans developed by the National Bureau of Standards (now the National Institute of Standards and Technology) for standards and calibration laboratories. The present methodology is based on the notions of statistical quality control (SQC) together with check standard testing and a small number of customer repeat-run sets. The results of check standard and customer repeat-run sets are analyzed using the statistical control chart methods of Walter A. Shewhart, long familiar to the SQC community. Control chart results are presented for various measurement processes in five facilities at Langley Research Center. The processes include test section calibration, force and moment measurements with a balance, and instrument calibration.
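    Shewhart-style individuals control charts of the kind described above can be built directly from a sequence of check-standard measurements. A minimal sketch (function name and data are illustrative, not from the report): the center line is the sample mean, and the 3-sigma limits use the average moving range divided by the d2 bias constant 1.128:

```python
def i_chart_limits(x):
    """Center line and 3-sigma control limits for a Shewhart individuals (I) chart.

    Sigma is estimated from the average moving range of successive points
    (d2 = 1.128 for moving ranges of subgroup size 2).
    """
    n = len(x)
    center = sum(x) / n
    mr_bar = sum(abs(x[i] - x[i - 1]) for i in range(1, n)) / (n - 1)
    sigma = mr_bar / 1.128
    return center - 3 * sigma, center, center + 3 * sigma

# Hypothetical check-standard results from five repeat runs:
lcl, cl, ucl = i_chart_limits([10.0, 12.0, 11.0, 13.0, 11.0])
```

    A new check-standard result falling outside [lcl, ucl] would signal that the measurement process is out of statistical control and warrants investigation.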

  1. Nuclear Technology Series. Course 3: Principles of Process Instrumentation.

    ERIC Educational Resources Information Center

    Technical Education Research Center, Waco, TX.

    This technical specialty course is one of thirty-five courses designed for use by two-year postsecondary institutions in five nuclear technician curriculum areas: (1) radiation protection technician, (2) nuclear instrumentation and control technician, (3) nuclear materials processing technician, (4) nuclear quality-assurance/quality-control…

  2. 38 CFR 36.4348 - Servicer Appraisal Processing Program.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... requirement, routine reviews of SAPP cases will be made by VA staff based upon quality control procedures..., that its activities do not deviate from high standards of integrity. The quality control system must include frequent, periodic audits that specifically address the appraisal review activity. These audits...

  3. 38 CFR 36.4348 - Servicer Appraisal Processing Program.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... requirement, routine reviews of SAPP cases will be made by VA staff based upon quality control procedures..., that its activities do not deviate from high standards of integrity. The quality control system must include frequent, periodic audits that specifically address the appraisal review activity. These audits...

  4. 38 CFR 36.4348 - Servicer Appraisal Processing Program.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... requirement, routine reviews of SAPP cases will be made by VA staff based upon quality control procedures..., that its activities do not deviate from high standards of integrity. The quality control system must include frequent, periodic audits that specifically address the appraisal review activity. These audits...

  5. 38 CFR 36.4348 - Servicer Appraisal Processing Program.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... requirement, routine reviews of SAPP cases will be made by VA staff based upon quality control procedures..., that its activities do not deviate from high standards of integrity. The quality control system must include frequent, periodic audits that specifically address the appraisal review activity. These audits...

  6. Lessons Learned from the Implementation of Total Quality Management at the Naval Aviation Depot, North Island, California

    DTIC Science & Technology

    1988-12-01

    Kaoru Ishikawa recognized the potential of statistical process control during one of Dr. Deming's many instructional visits to Japan. He wrote the Guide...to Quality Control which has been utilized for both self-study and classroom training. In the Guide to Quality Control, Dr. Ishikawa describes...job data are essential for making a proper evaluation. (Ishikawa, p. 14) The gathering of data and its subsequent analysis are the foundation of

  7. Particulate organic matter quality influences nitrate retention and denitrification in stream sediments: evidence from a carbon burial experiment

    USGS Publications Warehouse

    Stelzer, Robert S.; Scott, J. Thad; Bartsch, Lynn; Parr, Thomas B.

    2014-01-01

    Organic carbon supply is linked to nitrogen transformation in ecosystems. However, the role of organic carbon quality in nitrogen processing is not as well understood. We determined how the quality of particulate organic carbon (POC) influenced nitrogen transformation in stream sediments by burying identical quantities of varying quality POC (northern red oak (Quercus rubra) leaves, red maple (Acer rubrum) leaves, red maple wood) in stream mesocosms and measuring the effects on nitrogen retention and denitrification compared to a control of combusted sand. We also determined how POC quality affected the quantity and quality of dissolved organic carbon (DOC) and dissolved oxygen concentration in groundwater. Nitrate and total dissolved nitrogen (TDN) retention were assessed by comparing solute concentrations and fluxes along groundwater flow paths in the mesocosms. Denitrification was measured by in situ changes in N2 concentrations (using MIMS) and by acetylene block incubations. POC quality was measured by C:N and lignin:N ratios and DOC quality was assessed by fluorescence excitation emission matrix spectroscopy. POC quality had strong effects on nitrogen processing. Leaf treatments had much higher nitrate retention, TDN retention and denitrification rates than the wood and control treatments and red maple leaf burial resulted in higher nitrate and TDN retention rates than burial of red oak leaves. Leaf, but not wood, burial drove pore water to severe hypoxia and leaf treatments had higher DOC production and different DOC chemical composition than the wood and control treatments. We think that POC quality affected nitrogen processing in the sediments by influencing the quantity and quality of DOC and redox conditions. Our results suggest that the type of organic carbon inputs can affect the rates of nitrogen transformation in stream ecosystems.

  8. In-situ quality monitoring during laser brazing

    NASA Astrophysics Data System (ADS)

    Ungers, Michael; Fecker, Daniel; Frank, Sascha; Donst, Dmitri; Märgner, Volker; Abels, Peter; Kaierle, Stefan

    Laser brazing of zinc coated steel is a widely established manufacturing process in the automotive sector, where high quality requirements must be fulfilled. The strength, impermeability, and surface appearance of the joint are particularly important for judging its quality. The development of an on-line quality control system is highly desired by the industry. This paper presents recent work on the development of such a system, which consists of two cameras operating in different spectral ranges. For the evaluation of the system, seam imperfections are created artificially during experiments. Finally, image processing algorithms for monitoring process parameters based on the captured images are presented.

  9. Quality Assurance and School Monitoring in Hong Kong

    ERIC Educational Resources Information Center

    Mok, Magdalena Mo Ching

    2007-01-01

    This study reports on the Hong Kong education quality assurance and school monitoring system. Three research questions were addressed: (1) Who controls the quality of school education in Hong Kong? (2) What strategies are used in the Hong Kong school education quality assurance process? (3) Agenda for Future Research on quality assurance and…

  10. Automatic Control of the Concrete Mixture Homogeneity in Cycling Mixers

    NASA Astrophysics Data System (ADS)

    Tikhonov, Anatoly Fedorovich; Drozdov, Anatoly

    2018-03-01

    The article describes the factors affecting concrete mixture quality that are related to the moisture content of aggregates, since the effectiveness of concrete mixture production is largely determined by the availability of quality management tools at all stages of the technological process. It is established that unaccounted-for moisture in aggregates adversely affects the homogeneity of the concrete mixture and, accordingly, the strength of building structures. A new control method and an automatic control system for concrete mixture homogeneity during the mixing of components are proposed, in which the kneading-and-mixing machinery operates under real-time automatic control of homogeneity. Theoretical underpinnings of homogeneity control are presented, based on changes in the frequency of vibrodynamic vibrations of the mixer body. The structure of the technical means of the automatic control system for regulating the water supply is determined by the change in concrete mixture homogeneity during continuous mixing of components. The following technical means for establishing automatic control were chosen: vibro-acoustic sensors, remote terminal units, electropneumatic control actuators, etc. To identify the quality indicator of automatic control, the system offers a structure flowchart with transfer functions that determine the operation of the automatic control system in transient dynamic mode.

  11. Some Improvements in H-PDLCs

    NASA Technical Reports Server (NTRS)

    Crawford, Gregory P.; Li, Liuliu

    2005-01-01

    Some improvements have been made in the formulation of holographically formed polymer-dispersed liquid crystals (H-PDLCs) and in the fabrication of devices made from these materials, with resulting improvements in performance. H-PDLCs are essentially volume Bragg gratings. Devices made from H-PDLCs function as electrically switchable reflective filters. Heretofore, it has been necessary to apply undesirably high drive voltages in order to switch H-PDLC devices. Many scientific papers on H-PDLCs and on the potential utility of H-PDLC devices for display and telecommunication applications have been published. However, until now, little has been published about improving quality control in synthesis of H-PDLCs and fabrication of H-PDLC devices to minimize (1) spatial nonuniformities within individual devices, (2) nonuniformities among nominally identical devices, and (3) variations in performance among nominally identical devices. The improvements reported here are results of a research effort directed partly toward solving these quality-control problems and partly toward reducing switching voltages. The quality-control improvements include incorporation of a number of process controls to create a relatively robust process, such that the H-PDLC devices fabricated in this process are more nearly uniform than were those fabricated in a prior laboratory-type process. The improved process includes ultrasonic mixing, ultrasonic cleaning, the use of a micro dispensing technique, and the use of a bubble press.

  12. Modeling in the quality by design environment: Regulatory requirements and recommendations for design space and control strategy appointment.

    PubMed

    Djuris, Jelena; Djuric, Zorica

    2017-11-30

    Mathematical models can be used as an integral part of the quality by design (QbD) concept throughout the product lifecycle for a variety of purposes, including appointment of the design space and control strategy, continual improvement, and risk assessment. Examples of different mathematical modeling techniques (mechanistic, empirical, and hybrid) in pharmaceutical development and process monitoring or control are provided in this review. In the QbD context, mathematical models are predominantly used to support design space and/or control strategies. Considering their impact on final product quality, models can be divided into the following categories: high-, medium-, and low-impact models. Although there are regulatory guidelines on the topic of modeling applications, review of QbD-based submissions containing modeling elements revealed concerns regarding the scale-dependency of design spaces and the verification of model predictions at commercial manufacturing scale, especially for real-time release (RTR) models. The authors provide a critical overview of good modeling practices and introduce concepts of multiple-unit, adaptive, and dynamic design space, multivariate specifications, and methods for process uncertainty analysis. An RTR specification based on a mathematical model, and different approaches to multivariate statistical process control supporting process analytical technologies, are also presented. Copyright © 2017 Elsevier B.V. All rights reserved.

  13. 75 FR 31711 - Approval and Promulgation of Air Quality Implementation Plans; Delaware; Control of Nitrogen...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-04

    ... Boilers and Process Heaters at Petroleum Refineries AGENCY: Environmental Protection Agency (EPA). ACTION... controlling nitrogen oxide (NOx) emissions from industrial boilers. This action is being taken under the Clean...--Control of Nitrogen Oxide Emissions from Industrial Boilers and Process Heaters at Petroleum Refineries in...

  14. Effective Process Design for the Production of HIC-Resistant Linepipe Steels

    NASA Astrophysics Data System (ADS)

    Nieto, J.; Elías, T.; López, G.; Campos, G.; López, F.; Garcia, R.; De, Amar K.

    2013-09-01

    Production of slabs for sour service applications requires stringent control of slab internal quality and secondary processing so as to guarantee resistance against hydrogen-induced cracking (HIC). The ArcelorMittal steelmaking facility at Lazaro Cardenas, Mexico, recently implemented key steelmaking and casting technologies for production of sound slabs with a clean centerline, catering to the growing API linepipe and offshore market for sour service applications. State-of-the-art steelmaking using residual-free direct-reduced iron and continuous casting facilities with dynamic soft reduction were introduced for the production of slabs with an ultra-clean centerline. Introduction of controlled cooling of slabs, keeping atomic hydrogen well below 2 ppm, has enabled production of slabs suitable for excellent HIC-resistant plate processing. Substantial tonnages of slabs were produced for API X52-X65 grade plates and pipes for sour service. Stringent quality control at each stage of steelmaking, casting, and slab inspection ensured slabs with excellent internal quality, allowing HIC resistance to be guaranteed in the final product (plates and pipes). Details of the production steps which resulted in successful HIC-resistant slab production are described in this article.

  15. Quality Space and Launch Requirements Addendum to AS9100C

    DTIC Science & Technology

    2015-03-05

    8.9.1 Statistical Process Control (SPC); 8.9.1.1 Out of Control...Systems Center; SME: Subject Matter Expert; SOW: Statement of Work; SPC: Statistical Process Control; SPO: System Program Office; SRP: Standard Repair...individual data exceeding the control limits. Control limits are developed using standard statistical methods or other approved techniques and are based on

  16. Guidelines for the processing and quality assurance of benthic invertebrate samples collected as part of the National Water-Quality Assessment Program

    USGS Publications Warehouse

    Cuffney, T.F.; Gurtz, M.E.; Meador, M.R.

    1993-01-01

    Benthic invertebrate samples are collected as part of the U.S. Geological Survey's National Water-Quality Assessment Program. This is a perennial, multidisciplinary program that integrates biological, physical, and chemical indicators of water quality to evaluate status and trends and to develop an understanding of the factors controlling observed water quality. The Program examines water quality in 60 study units (coupled ground- and surface-water systems) that encompass most of the conterminous United States and parts of Alaska and Hawaii. Study-unit teams collect and process qualitative and semi-quantitative invertebrate samples according to standardized procedures. These samples are processed (elutriated and subsampled) in the field to produce as many as four sample components: large-rare, main-body, elutriate, and split. Each sample component is preserved in 10-percent formalin, and two components, large-rare and main-body, are sent to contract laboratories for further processing. The large-rare component is composed of large invertebrates that are removed from the sample matrix during field processing and placed in one or more containers. The main-body sample component consists of the remaining sample materials (sediment, detritus, and invertebrates) and is subsampled in the field to achieve a volume of 750 milliliters or less. The remaining two sample components, elutriate and split, are used for quality-assurance and quality-control purposes. Contract laboratories are used to identify and quantify invertebrates from the large-rare and main-body sample components according to the procedures and guidelines specified within this document. These guidelines allow the use of subsampling techniques to reduce the volume of sample material processed and to facilitate identifications. These processing procedures and techniques may be modified if the modifications provide equal or greater levels of accuracy and precision. 
The intent of sample processing is to determine the quantity of each taxon present in the semi-quantitative samples or to list the taxa present in qualitative samples. The processing guidelines provide standardized laboratory forms, sample labels, detailed sample processing flow charts, standardized format for electronic data, quality-assurance procedures and checks, sample tracking standards, and target levels for taxonomic determinations. The contract laboratory (1) is responsible for identifications and quantifications, (2) constructs reference collections, (3) provides data in hard copy and electronic forms, (4) follows specified quality-assurance and quality-control procedures, and (5) returns all processed and unprocessed portions of the samples. The U.S. Geological Survey's Quality Management Group maintains a Biological Quality-Assurance Unit, located at the National Water-Quality Laboratory, Arvada, Colorado, to oversee the use of contract laboratories and ensure the quality of data obtained from these laboratories according to the guidelines established in this document. This unit establishes contract specifications, reviews contractor performance (timeliness, accuracy, and consistency), enters data into the National Water Information System-II data base, maintains in-house reference collections, deposits voucher specimens in outside museums, and interacts with taxonomic experts within and outside the U.S. Geological Survey. This unit also modifies the existing sample processing and quality-assurance guidelines, establishes criteria and testing procedures for qualifying potential contract laboratories, identifies qualified taxonomic experts, and establishes voucher collections.

  17. Choice of mathematical models for technological process of glass rod drawing

    NASA Astrophysics Data System (ADS)

    Alekseeva, L. B.

    2017-10-01

    The technological process of drawing glass rods (light guides) is considered. Automated control of the drawing process reduces to making decisions that ensure a given quality. The drawing process is treated as a control system comprising the drawing device (control device) and the optical fiber forming zone (controlled object). To study the processes occurring in the formation zone, mathematical models based on the fundamentals of continuum mechanics are proposed. To assess the influence of disturbances, a transfer function is derived from the wave equation. The regression equation obtained also adequately describes the drawing process.

  18. Framework for establishing records control in hospitals as an ISO 9001 requirement.

    PubMed

    Al-Qatawneh, Lina

    2017-02-13

    Purpose The purpose of this paper is to present the process followed to control records in a Jordanian private community hospital as an ISO 9001:2008 standard requirement. Design/methodology/approach Under the hospital quality council's supervision, the quality management and development office staff were responsible for designing, planning and implementing the quality management system (QMS) using the ISO 9001:2008 standard. A policy for records control was established. An action plan for establishing the records control was developed and implemented. On completion, a coding system for records was specified to be used by hospital staff. Finally, an internal audit was performed to verify conformity to the ISO 9001:2008 standard requirements. Findings Successful certification by a neutral body ascertained that the hospital's QMS conformed to the ISO 9001:2008 requirements. A framework was developed that describes the records controlling process, which can be used by staff in any healthcare organization wanting to achieve ISO 9001:2008 accreditation. Originality/value Given the increased interest among healthcare organizations to achieve the ISO 9001 certification, the proposed framework for establishing records control is developed and is expected to be a valuable management tool to improve and sustain healthcare quality.

  19. MO-A-16A-01: QA Procedures and Metrics: In Search of QA Usability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sathiaseelan, V; Thomadsen, B

    Radiation therapy has undergone considerable changes in the past two decades with a surge of new technology and treatment delivery methods. The complexity of radiation therapy treatments has increased, and there has been increased awareness and publicity about the associated risks. In response, there has been a proliferation of guidelines for medical physicists to adopt to ensure that treatments are delivered safely. Task Group recommendations are copious, and clinical physicists' hours are longer, stretched to various degrees between site planning and management, IT support, physics QA, and treatment planning responsibilities. Radiation oncology has many quality control practices in place to ensure the delivery of high-quality, safe treatments. Incident reporting systems have been developed to collect statistics about near-miss events at many radiation oncology centers. However, tools are lacking to assess the impact of these various control measures. A recent effort to address this shortcoming is the work of Ford et al (2012), who published a methodology for quality control quantification to measure the effectiveness of safety barriers. Over 4000 near-miss incidents reported from two academic radiation oncology clinics were analyzed using quality control quantification, and a profile of the most effective quality control measures (metrics) was identified. There is a critical need to identify QA metrics that help busy clinical physicists focus their limited time and resources most effectively in order to minimize or eliminate errors in radiation treatment delivery processes.
In this symposium, the usefulness of workflows and QA metrics to assure safe and high-quality patient care will be explored. Two presentations will be given: Quality Metrics and Risk Management with High Risk Radiation Oncology Procedures; and Strategies and Metrics for Quality Management in the TG-100 Era. Learning Objectives: Provide an overview of, and the need for, QA usability metrics, including how different cultures and practices affect the effectiveness of methods and metrics. Show examples of quality assurance workflows, such as statistical process control, that monitor the treatment planning and delivery process to identify errors. Learn to identify and prioritize risks and QA procedures in radiation oncology. Try to answer the question: can a quality assurance program aided by quality assurance metrics help minimize errors and ensure safe treatment delivery, and should such metrics be institution specific?

  20. Total Quality Management in a Knowledge Management Perspective.

    ERIC Educational Resources Information Center

    Johannsen, Carl Gustav

    2000-01-01

    Presents theoretical considerations on both similarities and differences between information management and knowledge management and presents a conceptual model of basic knowledge management processes. Discusses total quality management and quality control in the context of information management. (Author/LRW)

  1. Feasibility study of using statistical process control to customize quality assurance in proton therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rah, Jeong-Eun; Oh, Do Hoon; Shin, Dongho

    Purpose: To evaluate and improve the reliability of proton quality assurance (QA) processes and to provide an optimal customized tolerance level using the statistical process control (SPC) methodology. Methods: The authors investigated the consistency check of dose per monitor unit (D/MU) and range in proton beams to see whether it was within the tolerance level of the daily QA process. This study analyzed the difference between the measured and calculated ranges along the central axis to improve the patient-specific QA process in proton beams by using process capability indices. Results: The authors established a customized tolerance level of ±2% for D/MU and ±0.5 mm for beam range in the daily proton QA process. In the authors' analysis of the process capability indices, the patient-specific range measurements were capable of a specification limit of ±2% in clinical plans. Conclusions: SPC methodology is a useful tool for customizing optimal QA tolerance levels and improving the quality of proton machine maintenance, treatment delivery, and ultimately patient safety.
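    A process capability index of the general kind used in this study relates the observed process spread to the specification limits. As a hedged sketch (the numbers are invented, and the abstract does not state which index was used), the common Cpk index for a ±2% D/MU specification could be computed as:

```python
def cpk(mean, sigma, lsl, usl):
    """Cpk: distance from the process mean to the nearest spec limit, in 3-sigma units."""
    return min(usl - mean, mean - lsl) / (3 * sigma)

# Hypothetical daily D/MU deviations (%) checked against the ±2% tolerance:
print(round(cpk(mean=0.1, sigma=0.5, lsl=-2.0, usl=2.0), 4))  # prints 1.2667
```

    A capability index above about 1.33 is conventionally taken to mean the process comfortably meets its specification; values near or below 1 suggest the tolerance is too tight for the observed variability.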

  2. 40 CFR 98.284 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... accounting purposes including direct measurement weighing the petroleum coke fed into your process (by belt... used to ensure the accuracy of monthly petroleum coke consumption measurements. (c) For CO2 process... quality assurance and quality control of the supplier data, you must conduct an annual measurement of the...

  3. Issues of Teaching Metrology in Higher Education Institutions of Civil Engineering in Russia

    ERIC Educational Resources Information Center

    Pukharenko, Yurii Vladimirovich; Norin, Veniamin Aleksandrovich

    2017-01-01

    The work analyses the training process condition in teaching the discipline "Metrology, Standardization, Certification and Quality Control." It proves that the current educational standard regarding the instruction of the discipline "Metrology, Standardization, Certification and Quality Control" does not meet the needs of the…

  4. Nuclear Technology Series. Course 6: Instrumentation and Control of Reactors and Plant Systems.

    ERIC Educational Resources Information Center

    Center for Occupational Research and Development, Inc., Waco, TX.

    This technical specialty course is one of thirty-five courses designed for use by two-year postsecondary institutions in five nuclear technician curriculum areas: (1) radiation protection technician, (2) nuclear instrumentation and control technician, (3) nuclear materials processing technician, (4) nuclear quality-assurance/quality-control…

  5. A method of setting limits for the purpose of quality assurance

    NASA Astrophysics Data System (ADS)

    Sanghangthum, Taweap; Suriyapee, Sivalee; Kim, Gwe-Ya; Pawlicki, Todd

    2013-10-01

    The result from any quality assurance measurement needs to be checked against some limits for acceptability. There are two types of limits: those that define clinical acceptability (action limits) and those that are meant to serve as a warning that the measurement is close to the action limits (tolerance limits). Currently, there is no standard procedure to set these limits. In this work, we propose an operational procedure to set tolerance limits and action limits. The approach to establishing the limits is based on techniques of quality engineering using control charts and a process capability index. The method differs for tolerance limits and action limits, with action limits being categorized into those that are specified and those that are unspecified. The procedure is to first ensure process control using the I-MR control charts. Then, the tolerance limits are set equal to the control chart limits on the I chart. Action limits are determined using the Cpm process capability index, with the requirement that the process must be in control. The limits from the proposed procedure are compared to those of an existing, conventional method. Four examples are investigated: two of volumetric modulated arc therapy (VMAT) point dose quality assurance (QA) and two of routine linear accelerator output QA. The tolerance limits range from about 6% larger to 9% smaller than conventional action limits for the VMAT QA cases. For the linac output QA, tolerance limits are about 60% smaller than conventional action limits. The operational procedure described in this work is based on established quality management tools and will provide a systematic guide for setting up tolerance and action limits for different equipment and processes.
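    A minimal sketch of the two-step procedure — tolerance limits taken from the I-chart, an action-limit width derived from Cpm — is given below. The symmetric Cpm-based width formula is our assumption from the standard Cpm definition, not quoted from the paper, and the QA deviations are illustrative:

```python
import math
import statistics

def i_chart_tolerance_limits(x):
    """Tolerance limits set equal to the I-chart control limits."""
    mean = statistics.mean(x)
    mr_bar = statistics.mean(abs(a - b) for a, b in zip(x, x[1:]))
    sigma = mr_bar / 1.128  # d2 for a moving range of 2
    return mean - 3 * sigma, mean + 3 * sigma

def action_limit_width(x, target, cpm_required=1.0):
    """Action-limit width derived from the Cpm index, assuming the symmetric
    form width = 6 * Cpm * sqrt(sigma^2 + (mean - target)^2)."""
    mean, sigma = statistics.mean(x), statistics.stdev(x)
    return 6 * cpm_required * math.sqrt(sigma ** 2 + (mean - target) ** 2)

# Illustrative VMAT point-dose QA deviations in percent, target value 0.
qa = [0.5, -0.8, 1.1, 0.2, -0.4, 0.9, -1.0, 0.3, 0.6, -0.2]
tol_low, tol_high = i_chart_tolerance_limits(qa)
action_width = action_limit_width(qa, target=0.0)
```

    The Cpm term penalizes both spread and off-target drift, so a process that is precise but biased still yields wide action limits.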

  6. Opportunities for Applied Behavior Analysis in the Total Quality Movement.

    ERIC Educational Resources Information Center

    Redmon, William K.

    1992-01-01

    This paper identifies critical components of recent organizational quality improvement programs and specifies how applied behavior analysis can contribute to quality technology. Statistical Process Control and Total Quality Management approaches are compared, and behavior analysts are urged to build their research base and market behavior change…

  7. [Quality assurance and quality improvement. Personal experiences and intentions].

    PubMed

    Roche, B G; Sommer, C

    1995-01-01

    In May 1994 we were selected by the Swiss surgical association to carry out a study on quality in the USA. During our travels we visited three types of institutions: hospitals, the National Institute of Standards and Technology, and industry (Johnson & Johnson). We compared two types of quality programs: Quality Assurance (QA) and Continuous Quality Improvement (CQI). In traditional healthcare circles, QA is the process established to meet external regulatory requirements and to assure that patient care is consistent with established standards. In modern quality terms, QA outside of healthcare means designing a product or service, and controlling its production, so well that quality is inevitable. The central idea of W. Edwards Deming is that there is never improvement by inspection alone. He developed a theory based on 14 principles. Productive work is accomplished through processes, and understanding the variability of processes is a key to improving quality. Quality management sees each person in an organisation as part of one or more processes. The job of every worker is to receive the work of others, add value to that work, and supply it to the next person in the process. This is called the triple role of the worker: customer, processor, and supplier. The main source of quality defects is problems in the process. The old assumption is that quality fails when people do the right thing wrong; the new assumption is that, more often, quality failures arise when people do the wrong thing right. Exhortation, incentives and discipline of workers are unlikely to improve quality: if quality is failing when people do their jobs as designed, then exhorting them to do better is managerial nonsense. Modern quality theory is customer focused, with customers identified both internally and externally, and the modern approach to quality is thoroughly grounded in scientific and statistical thinking. As in medicine, the symptom is a defect in quality.
The therapist of the process must perform diagnostic tests, formulate hypotheses of cause, test those hypotheses, apply remedies, and assess the effect of the remedies. Total employee involvement is critical: power comes from enabling all employees to become involved in quality improvement. A great advantage of CQI is the prevention orientation of the concept. CQI promotes a collegial approach in which people learn how to work together to improve. CQI is a time-consuming procedure. During our travels we learned the definition of quality as customer satisfaction. Building a CQI concept takes time, but all employees become involved in quality improvement. By applying CQI we could eventually be able to do away with quality control programs.

  8. Effect of pay for performance to improve quality of maternal and child care in low- and middle-income countries: a systematic review.

    PubMed

    Das, Ashis; Gopalan, Saji S; Chandramohan, Daniel

    2016-04-14

    Pay for Performance (P4P) mechanisms for health facilities and providers are currently being tested in several low- and middle-income countries (LMICs) to improve maternal and child health (MCH). This paper reviews the existing evidence on the effect of P4P programs on the quality of MCH care in LMICs. A systematic review of the literature was conducted according to a registered protocol. MEDLINE, Web of Science, and Embase were searched using the key words maternal care, quality of care, antenatal care, emergency obstetric and neonatal care (EmONC) and child care. Of 4535 records retrieved, only eight papers met the inclusion criteria. The primary outcome of interest was quality of MCH care, disaggregated into structural quality, process quality and outcomes. Risk of bias across studies was assessed through a customized quality checklist. There were four controlled before-after intervention studies, three cluster randomized controlled trials and one case control study with post-intervention comparison of P4P programs for MCH care in Burundi, Democratic Republic of Congo, Egypt, the Philippines, and Rwanda. There is some evidence of a positive effect of P4P only on the process quality of MCH care. The effects of P4P on delivery, EmONC, postnatal care and under-five child care were not evaluated in these studies. There is weak evidence for P4P's positive effect on maternal and neonatal health outcomes and out-of-pocket expenses. P4P programs had a few negative effects on structural quality. P4P is effective in improving the process quality of antenatal care. However, further research is needed to understand P4P's impact on MCH and its causal pathways in LMICs. PROSPERO registration number CRD42014013077.

  9. Recent developments in smart freezing technology applied to fresh foods.

    PubMed

    Xu, Ji-Cheng; Zhang, Min; Mujumdar, Arun S; Adhikari, Benu

    2017-09-02

    Due to the increased awareness of consumers of the sensorial and nutritional quality of frozen foods, freezing technology has had to seek new and innovative technologies for better retaining the fresh-like quality of foods. In this article, we review recent developments in smart freezing technology applied to fresh foods. The application of these intelligent technologies and the associated underpinning concepts has greatly improved the quality of frozen foods and the efficiency of freezing. These technologies are able to automatically collect information in-line during freezing and help control the freezing process better. Smart freezing technology includes new and intelligent technologies and concepts applied to the pretreatment of the frozen product, freezing processes, cold chain logistics and warehouse management, enabling real-time monitoring of quality during the freezing process and improving product quality and freezing efficiency. We also provide a brief overview of several sensing technologies used to achieve automatic control of individual steps of the freezing process: computer vision, electronic nose, electronic tongue, digital simulation, confocal laser scanning, near infrared spectroscopy, nuclear magnetic resonance and ultrasound. Understanding the mechanisms of these new technologies will be helpful for applying them to improve the quality of frozen foods.

  10. An empirical evaluation of software quality assurance practices and challenges in a developing country: a comparison of Nigeria and Turkey.

    PubMed

    Sowunmi, Olaperi Yeside; Misra, Sanjay; Fernandez-Sanz, Luis; Crawford, Broderick; Soto, Ricardo

    2016-01-01

    The importance of quality assurance in the software development process cannot be overemphasized, because its adoption results in high reliability and easy maintenance of the software system and other software products. Software quality assurance includes different activities such as quality control, quality management, quality standards, quality planning, process standardization and improvement, amongst others. The aim of this work is to further investigate the software quality assurance practices of practitioners in Nigeria. While our previous work covered quality planning, adherence to standardized processes and the inherent challenges, this work has been extended to include quality control, software process improvement and membership of international quality standards organizations. It also makes a comparison based on a similar study carried out in Turkey. The goal is to generate more robust findings that can properly support decision making by the software community. A qualitative research approach, specifically the use of questionnaire research instruments, was applied to acquire data from software practitioners. In addition to the previous results, it was observed that quality assurance practices are quite neglected, and this may be the cause of low patronage. Moreover, software practitioners are aware neither of international standards organizations nor of the required process improvement techniques; as such, their claimed standards are not aligned with those of accredited bodies and are limited to their local experience and knowledge, which makes them questionable. The comparison with Turkey yielded similar findings, making the results typical of developing countries. The research instrument used was tested for internal consistency using Cronbach's alpha and proved reliable.
For the software industry in developing countries to grow strong and become a viable source of external revenue, software quality assurance practices have to be taken seriously, because their effect is evident in the final product. Moreover, quality frameworks and tools that require minimal time and cost are highly needed in these countries.

  11. Application of machine learning and expert systems to Statistical Process Control (SPC) chart interpretation

    NASA Technical Reports Server (NTRS)

    Shewhart, Mark

    1991-01-01

    Statistical Process Control (SPC) charts are one of several tools used in quality control. Other tools include flow charts, histograms, cause and effect diagrams, check sheets, Pareto diagrams, graphs, and scatter diagrams. A control chart is simply a graph which indicates process variation over time. The purpose of drawing a control chart is to detect any changes in the process signalled by abnormal points or patterns on the graph. The Artificial Intelligence Support Center (AISC) of the Acquisition Logistics Division has developed a hybrid machine learning expert system prototype which automates the process of constructing and interpreting control charts.
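    Two of the abnormal-point and pattern checks such a chart-interpretation system automates — a point beyond the 3-sigma limits, and a run of eight consecutive points on one side of the center line (one of the classic Western Electric rules) — can be sketched as follows; the readings are illustrative, not from the AISC prototype:

```python
import statistics

def spc_signals(x):
    """Detect two classic control-chart signals: any point beyond the
    3-sigma limits, and a run of eight consecutive points on one side of
    the center line (one of the Western Electric rules)."""
    mean, sigma = statistics.mean(x), statistics.stdev(x)
    beyond = [i for i, v in enumerate(x) if abs(v - mean) > 3 * sigma]
    side = [1 if v > mean else -1 for v in x]
    runs = [i for i in range(len(x) - 7)
            if all(s == side[i] for s in side[i:i + 8])]
    return beyond, runs

# A process that drifts upward after the eighth reading (illustrative data).
readings = [4.9, 5.0, 4.8, 4.9, 5.0, 4.9, 4.8, 5.0, 5.4, 5.5, 5.6, 5.5]
beyond, runs = spc_signals(readings)  # the first eight points form a low run
```

    A machine learning component, as in the prototype described above, would layer learned pattern classifiers on top of rule-based checks like these.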

  12. Guidelines for Risk-Based Changeover of Biopharma Multi-Product Facilities.

    PubMed

    Lynch, Rob; Barabani, David; Bellorado, Kathy; Canisius, Peter; Heathcote, Doug; Johnson, Alan; Wyman, Ned; Parry, Derek Willison

    2018-01-01

    In multi-product biopharma facilities, protection from product contamination due to the simultaneous manufacture of multiple products is paramount to assuring product quality. To that end, traditional changeover methods (elastomer change-out, full sampling, etc.) have been widely used within the industry and have been accepted by regulatory agencies. However, with the endorsement of Quality Risk Management (1), risk-based approaches may be applied to assess and continuously improve established changeover processes. All processes, including changeover, can be improved with investment (money/resources), parallel activities, equipment design improvements, and standardization. However, processes can also be improved by eliminating waste. For product changeover, waste is any activity not needed for the new process or that does not provide added assurance of the quality of the subsequent product. The application of a risk-based approach to changeover aligns with the principles of Quality Risk Management. Through the use of risk assessments, the appropriate changeover controls can be identified and controlled to assure that product quality is maintained. Likewise, risk assessments and risk-based approaches may be used to improve operational efficiency, reduce waste, and permit concurrent manufacturing of products. © PDA, Inc. 2018.

  13. Long story short: an introduction to the short-term and long-term Six Sigma quality and its importance in the laboratory medicine for the management of extra-analytical processes.

    PubMed

    Ialongo, Cristiano; Bernardini, Sergio

    2018-06-18

    There is a compelling need for quality tools that enable effective control of the extra-analytical phase. In this regard, Six Sigma offers a valid methodological and conceptual opportunity, and in recent times the International Federation of Clinical Chemistry and Laboratory Medicine has adopted it for indicating the performance requirements for non-analytical laboratory processes. However, Six Sigma implies a distinction between short-term and long-term quality that is based on the dynamics of the processes. These concepts are still not widespread or applied in the field of laboratory medicine, although they are of fundamental importance for exploiting the full potential of this methodology. This paper reviews the Six Sigma quality concepts and shows how they originated from Shewhart's control charts, with respect to which they are not an alternative but a complement. It also discusses the dynamic nature of processes and how it arises, concerning particularly the long-term dynamic mean variation, and explains why this leads to the fundamental distinction of quality mentioned above.
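    The short-term/long-term distinction rests on the conventional assumption that the process mean drifts by about 1.5 sigma in the long run. A minimal sketch of converting a long-term defect rate into a short-term sigma level under that convention:

```python
from statistics import NormalDist

def short_term_sigma_level(defects_per_million):
    """Convert a long-term defect rate (DPMO) into a short-term sigma level
    using the conventional 1.5-sigma long-term mean shift."""
    p = defects_per_million / 1_000_000
    z_long_term = NormalDist().inv_cdf(1 - p)
    return z_long_term + 1.5

# The textbook figure: 3.4 DPMO corresponds to roughly 6 sigma short-term.
level = short_term_sigma_level(3.4)
```

    Without the 1.5-sigma shift, 3.4 DPMO would correspond to only about 4.5 sigma, which is exactly the gap between short-term and long-term quality that the abstract emphasizes.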

  14. Mathematical modeling for resource and energy saving control of extruders in multi-assortment productions of polymeric films

    NASA Astrophysics Data System (ADS)

    Polosin, A. N.; Chistyakova, T. B.

    2018-05-01

    In this article, the authors describe mathematical modeling of polymer processing in extruders of various types used in extrusion and calender production of film materials. The method consists of the synthesis of a static model for calculating the throughput, energy consumption and extrudate quality indices of the extruder, together with a dynamic model for evaluating the polymer residence time in the extruder, on which the quality indices depend. The models are adjusted according to the extruder type (single-screw, reciprocating, twin-screw), its screw and head configuration, its operating temperature conditions, and the type of polymer processed. The models enable creating extruder screw configurations and determining control action values that produce extrudate of the required quality while satisfying throughput and energy consumption requirements. Model adequacy has been verified using data from the processing of polyolefins and polyvinylchloride in different extruders. A program complex based on these mathematical models has been developed to control extruders of various types and to ensure resource and energy saving in multi-assortment production of polymeric films. Using the program complex in the control system for the extrusion stage of polymeric film production improves film quality, reduces spoilage, and lessens the time required for production line change-over to a different throughput and film type assignment.

  15. Quality control of mRNP biogenesis: networking at the transcription site.

    PubMed

    Eberle, Andrea B; Visa, Neus

    2014-08-01

    Eukaryotic cells carry out quality control (QC) over the processes of RNA biogenesis to inactivate or eliminate defective transcripts, and to avoid their production. In the case of protein-coding transcripts, the quality controls can sense defects in the assembly of mRNA-protein complexes, in the processing of the precursor mRNAs, and in the sequence of open reading frames. Different types of defect are monitored by different specialized mechanisms. Some of them involve dedicated factors whose function is to identify faulty molecules and target them for degradation. Others are the result of a more subtle balance in the kinetics of opposing activities in the mRNA biogenesis pathway. One way or another, all such mechanisms hinder the expression of the defective mRNAs through processes as diverse as rapid degradation, nuclear retention and transcriptional silencing. Three major degradation systems are responsible for the destruction of the defective transcripts: the exosome, the 5'-3' exoribonucleases, and the nonsense-mediated mRNA decay (NMD) machinery. This review summarizes recent findings on the cotranscriptional quality control of mRNA biogenesis, and speculates that a protein-protein interaction network integrates multiple mRNA degradation systems with the transcription machinery. Copyright © 2014 Elsevier Ltd. All rights reserved.

  16. Continuous processing and the applications of online tools in pharmaceutical product manufacture: developments and examples.

    PubMed

    Ooi, Shing Ming; Sarkar, Srimanta; van Varenbergh, Griet; Schoeters, Kris; Heng, Paul Wan Sia

    2013-04-01

    Continuous processing and production in pharmaceutical manufacturing has received increased attention in recent years mainly due to the industries' pressing needs for more efficient, cost-effective processes and production, as well as regulatory facilitation. To achieve optimum product quality, the traditional trial-and-error method for the optimization of different process and formulation parameters is expensive and time consuming. Real-time evaluation and the control of product quality using an online process analyzer in continuous processing can provide high-quality production with very high-throughput at low unit cost. This review focuses on continuous processing and the application of different real-time monitoring tools used in the pharmaceutical industry for continuous processing from powder to tablets.

  17. "A manager in the minds of doctors:" a comparison of new modes of control in European hospitals.

    PubMed

    Kuhlmann, Ellen; Burau, Viola; Correia, Tiago; Lewandowski, Roman; Lionis, Christos; Noordegraaf, Mirko; Repullo, Jose

    2013-07-02

    Hospital governance increasingly combines management and professional self-governance. This article maps the newly emergent modes of control in a comparative perspective and aims to better understand the relationship between medicine and management as hybrid and context-dependent. Theoretically, we critically review approaches to the managerialism-professionalism relationship; methodologically, we expand cross-country comparison towards the meso-level of organisations; and empirically, the focus is on processes and actors in a range of European hospitals. The research is explorative and was carried out as part of the FP7 COST action IS0903 Medicine and Management, Working Group 2. Comprising seven European countries, the study focuses on doctors and public hospitals. We use a comparative case study design that primarily draws on expert information and document analysis as well as other secondary sources. The findings reveal that managerial control is not simply an external force but is increasingly integrated into medical professionalism. These processes of change are relevant in all countries but are shaped by organisational settings, and therefore create different patterns of control: (1) 'integrated' control, with high levels of coordination and coherent patterns for cost and quality controls; (2) 'partly integrated' control, with diversity of coordination at hospital and department level and between cost and quality controls; and (3) 'fragmented' control, with limited coordination and gaps between quality control, more strongly dominated by medicine, and cost control, dominated by management. Our comparison highlights how organisations matter and brings into perspective the crucial relevance of 'coordination' of medicine and management across the levels (hospital/department) and the substance (cost/quality-safety) of control.
Consequently, coordination may serve as a taxonomy of emergent modes of control, bringing new directions for cost-efficient and quality-effective hospital governance into perspective.

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Malkoske, Kyle; Nielsen, Michelle; Brown, Erika

    The Canadian Partnership for Quality Radiotherapy (CPQR) and the Canadian Organization of Medical Physicists' (COMP) Quality Assurance and Radiation Safety Advisory Committee (QARSAC) have worked together on the development of a suite of Technical Quality Control (TQC) Guidelines for radiation treatment equipment and technologies, which outline specific performance objectives and criteria that equipment should meet in order to assure an acceptable level of radiation treatment quality. Early community engagement and uptake survey data showed that 70% of Canadian centres are part of this process and that the data in the guideline documents reflect, and are influencing, the way Canadian radiation treatment centres run their technical quality control programs. As the TQC development framework matured as a cross-country initiative, guidance documents have been developed for many clinical technologies. Recently, new TQC documents have been initiated for Gamma Knife and CyberKnife technologies, with the entire communities within Canada involved in the review process. At the same time, QARSAC reviewed the suite as a whole for the first time and found that some tests and tolerances overlapped across multiple documents, as single tests could pertain to multiple quality control areas. The work to streamline the entire suite has improved its usability while keeping the integrity of the individual quality control areas. The suite will be published by the JACMP in the coming year.

  19. Automated Data Quality Assurance using OGC Sensor Web Enablement Frameworks for Marine Observatories

    NASA Astrophysics Data System (ADS)

    Toma, Daniel; Bghiel, Ikram; del Rio, Joaquin; Hidalgo, Alberto; Carreras, Normandino; Manuel, Antoni

    2014-05-01

    Over the past years, environmental sensors have continuously improved by becoming smaller, cheaper, and more intelligent, and many sensor networks are increasingly deployed to monitor our environment. But due to the large number of sensor manufacturers, accompanying protocols and data encodings, automated integration and data quality assurance of diverse sensors in observing systems is not straightforward, requiring development of data management code and tedious manual configuration. Over the past few years, however, it has been demonstrated that Open Geospatial Consortium (OGC) Sensor Web Enablement (SWE) frameworks can enable web services with fully described sensor systems, including data processing, sensor characteristics, and quality control tests and results. So far, the SWE framework does not describe how to integrate sensors on the fly with minimal human intervention: the data management software which enables access to sensors, data processing and quality control tests has to be implemented, and the results have to be manually mapped to the SWE models. In this contribution, we describe a Sensor Plug & Play infrastructure for the Sensor Web by combining (1) the OGC PUCK protocol, a simple standard embedded instrument protocol to store on and retrieve directly from the devices the declarative description of sensor characteristics and quality control tests, (2) an automatic mechanism for data processing and quality control tests underlying the Sensor Web, the Sensor Interface Descriptor (SID) concept, and (3) a model for the declarative description of sensors which serves as a generic data management mechanism, designed as a profile and extension of OGC SWE's SensorML standard. We implement and evaluate our approach by applying it to the OBSEA Observatory, demonstrating the ability to assess data quality for temperature, salinity, air pressure, and wind speed and direction observations off the coast of Garraf in north-eastern Spain.
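    The automated quality control tests that a declarative sensor description can specify are typically simple per-observation checks. A minimal sketch of two such tests, a gross range test and a spike (step) test, applied to an invented temperature series (the thresholds and data are illustrative, not from the OBSEA deployment):

```python
def run_qc(values, valid_range, max_step):
    """Apply two simple automated QC tests of the kind a declarative sensor
    description might specify: a gross range test and a spike (step) test.
    Returns one boolean flag per observation (True = passed both tests)."""
    lo, hi = valid_range
    range_ok = [lo <= v <= hi for v in values]
    step_ok = [True] + [abs(b - a) <= max_step for a, b in zip(values, values[1:])]
    return [r and s for r, s in zip(range_ok, step_ok)]

# Illustrative sea-surface temperature series (deg C) containing one spike.
sst = [14.2, 14.3, 14.1, 22.9, 14.2, 14.4]
flags = run_qc(sst, valid_range=(5.0, 30.0), max_step=2.0)
```

    Note that the spike value passes the range test but is caught by the step test, which is why declarative descriptions usually combine several tests per observed property.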

  20. Effect of sous vide processing on physicochemical, ultrastructural, microbial and sensory changes in vacuum packaged chicken sausages.

    PubMed

    Naveena, B M; Khansole, Panjab S; Shashi Kumar, M; Krishnaiah, N; Kulkarni, Vinayak V; Deepak, S J

    2017-01-01

    The processing of sous vide chicken sausages was optimized under vacuum packaging conditions with cooking at 100 ℃ for 30 min (SV30), 60 min (SV60) and 120 min (SV120), and compared with an aerobically cooked control at 100 ℃ for 30 min. Sous vide processing of chicken sausages (SV30) produced a higher (p < 0.05) cooking yield, Hunterlab a* values and sensory attributes without affecting proximate composition or shear force values relative to the control. The sodium dodecyl sulphate-polyacrylamide gel electrophoresis and scanning electron microscopy results revealed no significant changes in protein quality or emulsion ultrastructure due to SV30 processing relative to control sausages. Sous vide processing of chicken sausages enriched with rosemary diterpene phenols retained freshness and quality for up to 120 days of storage at 4 ± 1 ℃, whereas control sausages spoiled by day 20. Lipid oxidation and microbial growth remained below spoilage levels for all the SV-processed sausages throughout storage, and addition of the rosemary diterpene mixture at 0.02% v/w reduced microbial growth and improved (p < 0.05) the sensory attributes. Our results demonstrate that sous vide processing minimizes lipid oxidation and microbial growth in chicken sausages, with improved product quality and shelf-life at 4 ± 1 ℃. © The Author(s) 2016.

  1. PACS quality control and automatic problem notifier

    NASA Astrophysics Data System (ADS)

    Honeyman-Buck, Janice C.; Jones, Douglas; Frost, Meryll M.; Staab, Edward V.

    1997-05-01

    One side effect of installing a clinical PACS is that users become dependent upon the technology, and in some cases it can be very difficult to revert to a film-based system if components fail. System failures range from slow deterioration of function, as seen in the loss of monitor luminance, to sudden catastrophic loss of the entire PACS network. This paper describes the quality control procedures in place at the University of Florida and the automatic notification system that alerts PACS personnel when a failure has happened or is anticipated. The goal is to recover from a failure with a minimum of downtime and no data loss. Routine quality control is practiced on all aspects of PACS, from acquisition, through network routing and display, to archiving. Whenever possible, the system components perform self-checks and cross-platform checks for active processes, file system status, errors in log files, and system uptime. When an error is detected or an exception occurs, an automatic page with a diagnostic code is sent to a pager. Documentation on each code, troubleshooting procedures, and repairs is kept on an intranet server accessible only to people involved in maintaining the PACS. In addition to the automatic paging system for error conditions, acquisition is assured by an automatic fax report sent daily to all technologists acquiring PACS images, used as a cross-check that all studies are archived prior to being removed from the acquisition systems. Daily quality control is performed to assure that studies can be moved from each acquisition system and that contrast adjustment functions correctly. The results of selected quality control reports will be presented, the intranet documentation server will be described along with the automatic pager system, and monitor quality control reports will be described and the cost of quality control quantified.
As PACS is accepted as a clinical tool, the same standards of quality control must be established as are expected of other equipment used in the diagnostic process.
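    The self-checks and diagnostic-code paging described above can be sketched as follows. This is a hypothetical illustration, not the University of Florida implementation: the process name, diagnostic codes, and 90% disk threshold are all invented, and the pager is abstracted as a callable hook:

```python
import shutil
import subprocess

def health_checks():
    """Run self-checks of the kind described above: file-system capacity and
    a required archive process being alive. The process name and the 90%
    disk threshold are illustrative, not taken from the paper."""
    checks = {}
    usage = shutil.disk_usage("/")
    checks["disk"] = usage.used / usage.total < 0.90
    result = subprocess.run(["pgrep", "-f", "archive_daemon"],
                            capture_output=True)
    checks["archiver"] = result.returncode == 0
    return checks

def page_on_failure(checks, send_page):
    """Send one diagnostic code per failed check via the supplied pager hook."""
    codes = {"disk": "QC-01", "archiver": "QC-02"}
    for name, ok in checks.items():
        if not ok:
            send_page(codes[name])
```

    Keeping the diagnostic codes stable makes the intranet documentation lookup described in the abstract straightforward: each code maps to one troubleshooting page.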

  2. Adaptive Data Processing Technique for Lidar-Assisted Control to Bridge the Gap between Lidar Systems and Wind Turbines: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schlipf, David; Raach, Steffen; Haizmann, Florian

    2015-12-14

    This paper presents first steps toward an adaptive lidar data processing technique crucial for lidar-assisted control in wind turbines. The prediction time and the quality of the wind preview from lidar measurements depend on several factors and are not constant. If the data processing is not continually adjusted, the benefit of lidar-assisted control cannot be fully exploited, and harmful control action can even result. An online analysis of the lidar and turbine data is necessary to continually reassess the prediction time and lidar data quality. In this work, a structured process to develop an analysis tool for the prediction time and a new hardware setup for lidar-assisted control are presented. The tool consists of an online estimation of the rotor effective wind speed from lidar and turbine data and the implementation of an online cross-correlation to determine the time shift between both signals. Further, initial results are presented from an ongoing campaign in which this system was employed to provide lidar preview for feed-forward pitch control.
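    The cross-correlation step — finding the lag at which the lidar-based wind speed estimate best matches the turbine-based estimate — can be sketched as follows; the two series below are invented, with the lidar signal simply delayed by two samples:

```python
def best_time_shift(lidar, turbine, max_lag):
    """Estimate the lag (in samples) at which the lidar-based wind speed
    estimate best matches the turbine-based estimate, via the peak of the
    normalized cross-correlation."""
    def corr(a, b):
        n = len(a)
        ma, mb = sum(a) / n, sum(b) / n
        num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
        den = (sum((x - ma) ** 2 for x in a)
               * sum((y - mb) ** 2 for y in b)) ** 0.5
        return num / den if den else 0.0
    return max(range(max_lag + 1),
               key=lambda lag: corr(lidar[:len(lidar) - lag], turbine[lag:]))

# The wind previewed by the lidar reaches the rotor two samples later
# (illustrative series, not campaign data).
lidar = [0.0, 1.0, 2.0, 3.0, 2.0, 1.0, 0.0, 1.0, 2.0, 3.0]
turbine = [0.0, 0.0, 0.0, 1.0, 2.0, 3.0, 2.0, 1.0, 0.0, 1.0]
shift = best_time_shift(lidar, turbine, max_lag=4)
```

    In an online setting this computation would run over a sliding window so that the prediction time can be reassessed as wind conditions change, which is the adaptivity the paper argues for.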

  3. Benefits of a comprehensive quality program for cryopreserved PBMC covering 28 clinical trials sites utilizing an integrated, analytical web-based portal

    PubMed Central

    Ducar, Constance; Smith, Donna; Pinzon, Cris; Stirewalt, Michael; Cooper, Cristine; McElrath, M. Juliana; Hural, John

    2014-01-01

    The HIV Vaccine Trials Network (HVTN) is a global network of 28 clinical trial sites dedicated to identifying an effective HIV vaccine. Cryopreservation of high-quality peripheral blood mononuclear cells (PBMC) is critical for the assessment of vaccine-induced cellular immune functions. The HVTN PBMC Quality Management Program is designed to ensure viable PBMC are processed, stored and shipped for clinical trial assays from all HVTN clinical trial sites. The program has evolved by developing and incorporating best practices for laboratory and specimen quality and implementing automated, web-based tools. These tools allow the site-affiliated processing laboratories and the central Laboratory Operations Unit to rapidly collect, analyze and report PBMC quality data. The HVTN PBMC Quality Management Program includes five key components: 1) Laboratory Assessment, 2) PBMC Training and Certification, 3) Internal Quality Control, 4) External Quality Control (EQC), and 5) Assay Specimen Quality Control. Fresh PBMC processing data is uploaded from each clinical site processing laboratory to a central HVTN Statistical and Data Management Center database for access and analysis on a web portal. Samples are thawed at a central laboratory for assay or specimen quality control and sample quality data is uploaded directly to the database by the central laboratory. Four year cumulative data covering 23,477 blood draws reveals an average fresh PBMC yield of 1.45×106 ±0.48 cells per milliliter of useable whole blood. 95% of samples were within the acceptable range for fresh cell yield of 0.8–3.2×106 cells/ml of usable blood. Prior to full implementation of the HVTN PBMC Quality Management Program, the 2007 EQC evaluations from 10 international sites showed a mean day 2 thawed viability of 83.1% and recovery of 67.5%. 
Since then, four-year cumulative data covering 3,338 specimens used in immunologic assays show that 99.88% had acceptable viabilities (>66%) for use in cellular assays (mean, 91.46% ±4.5%), and 96.2% had acceptable recoveries (50%–130%), with a mean recovery of 85.8% ±19.12% of the originally cryopreserved cells. EQC testing revealed that since August 2009, failed recoveries dropped from 4.1% to 1.6% and failed viabilities dropped from 1.0% to 0.3%. The HVTN PBMC quality program provides laboratory assessment, training and tools for identifying problems, implementing corrective action and monitoring for improvements. These data support the benefits of implementing a comprehensive, web-based PBMC quality program for large clinical trial networks. PMID:24709391
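The acceptance criteria quoted above (fresh yield 0.8–3.2×10⁶ cells/ml, thawed viability >66%, recovery 50%–130%) can be sketched as a simple per-sample screen. The function name and return layout below are illustrative, not part of the HVTN system:

```python
# Hypothetical QC screen applying the acceptance ranges reported for the
# HVTN program. Only the thresholds come from the abstract; everything
# else is invented for illustration.

def pbmc_sample_passes(yield_per_ml, viability_pct, recovery_pct):
    """Return pass/fail flags for one cryopreserved PBMC sample."""
    return {
        "yield_ok": 0.8e6 <= yield_per_ml <= 3.2e6,   # cells/ml of usable blood
        "viability_ok": viability_pct > 66.0,         # thawed viability
        "recovery_ok": 50.0 <= recovery_pct <= 130.0, # of originally frozen cells
    }

flags = pbmc_sample_passes(1.45e6, 91.5, 85.8)
print(all(flags.values()))  # True: within all acceptance ranges
```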

  4. Self-regulation and quality of life in high-functioning young adults with autism

    PubMed Central

    Dijkhuis, Renee R; Ziermans, Tim B; Van Rijn, Sophie; Staal, Wouter G; Swaab, Hanna

    2016-01-01

    Background: Autism is generally associated with poor functional outcome but little is known about predictors of quality of life, especially during early adulthood. This study was conducted to assess subjective quality of life during early adulthood in high-functioning autism spectrum disorder and its relation with self-regulating abilities. Individuals with high-functioning autism spectrum disorder who progressed into post-secondary higher education (N = 75) were compared to a typical peer control group (N = 28) based on behavioral self-report questionnaires. The results indicated that individuals with high-functioning autism spectrum disorder reported significantly lower subjective quality of life than typical controls (p < 0.001, effect size (d) = 1.84). In addition, individuals with high-functioning autism spectrum disorder reported more problems with emotion processing (p < 0.05, effect size (d) = 0.79) and daily executive functioning (p < 0.001, effect size (d) = 1.29) than controls. A higher level of executive functioning problems was related to lower quality of life in the high-functioning autism spectrum disorder group, but no significant relation between level of emotion processing and subjective quality of life became apparent in the regression analysis. Our findings show that even in high-functioning young adults with autism, executive functioning, emotion processing, and subjective quality of life are low compared to typically developing peers. Furthermore, these results emphasize the importance of targeting executive functioning problems in individuals with autism to improve subjective quality of life. PMID:27407040

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lo, Chien -Chi; Chain, Patrick S. G.

    Background: Next generation sequencing (NGS) technologies, which parallelize the sequencing process and produce thousands to millions, or even hundreds of millions, of sequences in a single run, have revolutionized genomic and genetic research. Because of the vagaries of any platform's sequencing chemistry, the experimental processing, machine failure, and so on, the quality of sequencing reads is never perfect and often declines as the read is extended. These errors invariably affect downstream analysis and applications and should therefore be identified early on to mitigate any unforeseen effects. Results: Here we present a novel FastQ Quality Control Software (FaQCs) that can rapidly process large volumes of data, and which improves upon previous solutions to monitor the quality and remove poor quality data from sequencing runs. Both the speed of processing and the memory footprint of storing all required information have been optimized via algorithmic and parallel processing solutions. The trimmed output compared side-by-side with the original data is part of the automated PDF output. We show how this tool can help data analysis by providing a few examples, including an increased percentage of reads recruited to references, improved single nucleotide polymorphism identification, as well as de novo sequence assembly metrics. Conclusion: FaQCs combines several features of currently available applications into a single, user-friendly process, and includes additional unique capabilities such as filtering the PhiX control sequences, conversion of FASTQ formats, and multi-threading. The original data and trimmed summaries are reported within a variety of graphics and reports, providing a simple way to do data quality control and assurance.
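As a rough illustration of the kind of end-trimming such QC tools perform (a generic sketch, not FaQCs's actual algorithm), bases can be clipped from the 3' end while their Phred score, Sanger-encoded with ASCII offset 33, falls below a threshold:

```python
# Generic sketch of 3'-end quality trimming for a FASTQ read. Assumes
# Sanger/Illumina 1.8+ encoding (Phred+33); the threshold of Q20 is a
# common convention, not a value taken from the FaQCs paper.

def trim_3prime(seq, qual, min_q=20):
    """Trim low-quality bases from the 3' end of a read."""
    scores = [ord(c) - 33 for c in qual]  # decode Phred+33 qualities
    end = len(seq)
    while end > 0 and scores[end - 1] < min_q:
        end -= 1
    return seq[:end], qual[:end]

seq, qual = trim_3prime("ACGTACGT", "IIIIII##")  # 'I' = Q40, '#' = Q2
print(seq)  # ACGTAC
```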

  6. Near-infrared Spectroscopy in the Brewing Industry.

    PubMed

    Sileoni, Valeria; Marconi, Ombretta; Perretti, Giuseppe

    2015-01-01

    This article offers an exhaustive description of the use of Near-Infrared (NIR) Spectroscopy in the brewing industry. This technique is widely used for quality control testing of raw materials, intermediates, and finished products, as well as process monitoring during malting and brewing. In particular, most of the reviewed works focus on the assessment of barley properties, aimed at quickly selecting the best barley varieties in order to produce a high-quality malt leading to high-quality beer. Various works concerning the use of NIR in the evaluation of raw materials, such as barley, malt, hop, and yeast, are also summarized here. The implementation of NIR sensors for the control of malting and brewing processes is also highlighted, as well as the use of NIR for quality assessment of the final product.

  7. A QRM Discussion of Microbial Contamination of Non-sterile Drug Products, Using FDA and EMA Warning Letters Recorded between 2008 and 2016.

    PubMed

    Santos, Ana M C; Doria, Mara S; Meirinhos-Soares, Luís; Almeida, António J; Menezes, José C

    2018-01-01

    Microbial quality control of non-sterile drug products has been a concern to regulatory agencies and the pharmaceutical industry since the 1960s. Despite being an old challenge to companies, microbial contamination still affects a high number of manufacturers of non-sterile products. Consequences go well beyond the obvious direct costs related to batch rejections or product recalls, as human lives and a company's reputation are significantly impacted if such events occur. To better manage risk and establish effective mitigation strategies, it is necessary to understand the microbial hazards involved in non-sterile drug product manufacturing, be able to evaluate their potential impact on final product quality, and apply mitigation actions. Herein we discuss the most likely root causes involved in microbial contaminations referenced in warning letters issued by US health authorities and non-compliance reports issued by European health authorities over a period of several years. The quality risk management tools proposed were applied to the data gathered from those databases, and a generic risk ranking was provided by a panel of non-sterile drug product manufacturers that was assembled and given the opportunity to perform the risk assessments. That panel identified gaps and defined potential mitigation actions based on its own experience of the risks expected for its processes. Major findings clearly indicate that the manufacturers affected by the warning letters should focus their attention on process improvements and microbial control strategies, especially those related to microbial analysis and raw material quality control. Additionally, the warning letters reviewed frequently referred to failures in quality-related issues, which indicates that the quality commitment should be reinforced at most companies to avoid microbiological contaminations. 
LAY ABSTRACT: Microbial contamination of drug products affects the quality of non-sterile drug products produced by numerous manufacturers, representing a major risk to patients. It is necessary to understand the microbial hazards involved in the manufacturing process and evaluate their impact on final product quality so that effective prevention strategies can be implemented. A risk-based classification of the most likely root causes for microbial contamination found in the warning letters issued by the US Food and Drug Administration and the European Medicines Agency is proposed. To validate the likely root causes extracted from the warning letters, a subject matter expert panel made up of several manufacturers was formed and consulted. A quality risk management approach to assessing microbiological contamination of non-sterile drug products is proposed for the identification of microbial hazards involved in the manufacturing process. To enable ranking of microbial contamination risks, quality risk management metrics related to criticality and overall risk were applied. The results showed that manufacturers of non-sterile drug products should improve their microbial control strategy, with special attention to quality controls of raw materials, primary containers, and closures. Beyond that, they should invest in a more robust quality system and culture. As a start, manufacturers may consider investigating their specific microbiological risks, addressing their sites' own microbial ecology, type of manufacturing processes, and dosage form characteristics, as these may lead to increased contamination risks. Authorities should allow and enforce innovative, more comprehensive, and more effective approaches to in-process contamination monitoring and controls. © PDA, Inc. 2018.

  8. [IMPLEMENTATION OF A QUALITY MANAGEMENT SYSTEM IN A NUTRITION UNIT ACCORDING TO ISO 9001:2008].

    PubMed

    Velasco Gimeno, Cristina; Cuerda Compés, Cristina; Alonso Puerta, Alba; Frías Soriano, Laura; Camblor Álvarez, Miguel; Bretón Lesmes, Irene; Plá Mestre, Rosa; Izquierdo Membrilla, Isabel; García-Peris, Pilar

    2015-09-01

    The implementation of quality management systems (QMS) in the health sector has made great progress in recent years and remains a key tool for the management and improvement of the services provided to patients. To describe the process of implementing a quality management system (QMS) according to the standard ISO 9001:2008 in a Nutrition Unit. The implementation began in October 2012. The Nutrition Unit was supported by the hospital's Preventive Medicine and Quality Management Service (PMQM). Initially, training sessions on QMS and ISO standards were held for staff. A Quality Committee (QC) was established with representation of the medical and nursing staff. Weekly meetings took place between members of the QC and PMQM to define processes, procedures and quality indicators. These documents were followed up for two months after their validation. A total of 4 processes were identified and documented (Nutritional status assessment, Nutritional treatment, Monitoring of nutritional treatment, and Planning and control of oral feeding), along with 13 operating procedures describing all the activities of the Unit. The interactions among them were defined in the process map. Each process has associated specific quality indicators for measuring the state of the QMS and identifying opportunities for improvement. All the documents required by ISO 9001:2008 were developed: quality policy, quality objectives, quality manual, control of documents and records, internal audit, nonconformities, and corrective and preventive actions. The Unit was certified by AENOR in April 2013. The implementation of a QMS brings about a reorganization of the Unit's activities in order to meet customers' expectations. Documenting these activities ensures a better understanding of the organization, defines the responsibilities of all staff, and leads to better management of time and resources. The QMS also improves internal communication and is a motivational element. 
Exploring patient satisfaction and expectations makes it possible to include their views in the design of care processes. Copyright AULA MEDICA EDICIONES 2014. Published by AULA MEDICA. All rights reserved.

  9. Development of Process Analytical Technology (PAT) methods for controlled release pellet coating.

    PubMed

    Avalle, P; Pollitt, M J; Bradley, K; Cooper, B; Pearce, G; Djemai, A; Fitzpatrick, S

    2014-07-01

    This work focused on the control of the manufacturing process for a controlled release (CR) pellet product, within a Quality by Design (QbD) framework. The manufacturing process was Wurster coating: firstly layering active pharmaceutical ingredient (API) onto sugar pellet cores, and secondly applying a controlled release (CR) coating. For each of these two steps, development of a Process Analytical Technology (PAT) method is discussed, along with a novel application of automated microscopy as the reference method. Ultimately, PAT methods should link to product performance, and the two key Critical Quality Attributes (CQAs) for this CR product are assay and release rate, linked to the API and CR coating steps respectively. In this work, the link between near infra-red (NIR) spectra and those attributes was explored by chemometrics over the course of the coating process in a pilot scale industrial environment. Correlations were built between the NIR spectra and coating weight (for API amount), CR coating thickness and dissolution performance. These correlations allow the coating process to be monitored at-line and so give better control of product performance in line with QbD requirements. Copyright © 2014 Elsevier B.V. All rights reserved.

  10. How gamma radiation processing systems are benefiting from the latest advances in information technology

    NASA Astrophysics Data System (ADS)

    Gibson, Wayne H.; Levesque, Daniel

    2000-03-01

    This paper discusses how gamma irradiation plants are putting the latest advances in computer and information technology to use for better process control, cost savings, and strategic advantages. Some irradiator operations are gaining significant benefits by integrating computer technology and robotics with real-time information processing, multi-user databases, and communication networks. The paper reports on several irradiation facilities that are making good use of client/server LANs, user-friendly graphics interfaces, supervisory control and data acquisition (SCADA) systems, distributed I/O with real-time sensor devices, trending analysis, real-time product tracking, dynamic product scheduling, and automated dosimetry reading. These plants are lowering costs by fast and reliable reconciliation of dosimetry data, easier validation to GMP requirements, optimizing production flow, and faster release of sterilized products to market. There is a trend in the manufacturing sector towards total automation using "predictive process control". Real-time verification of process parameters "on-the-run" allows control parameters to be adjusted appropriately, before the process strays out of limits. Applying this technology to the gamma radiation process, control will be based on monitoring the key parameters such as time, and making adjustments during the process to optimize quality and throughput. Dosimetry results will be used as a quality control measurement rather than as a final monitor for the release of the product. Results are correlated with the irradiation process data to quickly and confidently reconcile variations. Ultimately, a parametric process control system utilizing responsive control, feedback and verification will not only increase productivity and process efficiency, but can also result in operating within tighter dose control set points.

  11. A DMAIC approach for process capability improvement an engine crankshaft manufacturing process

    NASA Astrophysics Data System (ADS)

    Sharma, G. V. S. S.; Rao, P. Srinivasa

    2014-05-01

    The define-measure-analyze-improve-control (DMAIC) approach is a five-phase, scientific approach for reducing deviations and improving the capability levels of manufacturing processes. The present work elaborates on the DMAIC approach applied to reducing the process variation of the stub-end-hole boring operation in the manufacture of crankshafts. This statistical process control study starts with selection of the critical-to-quality (CTQ) characteristic in the define phase. The next phase constitutes the collection of dimensional measurement data for the identified CTQ characteristic. This is followed by the analyze and improve phases, where various quality control tools such as the Ishikawa diagram, physical mechanism analysis, failure modes and effects analysis, and analysis of variance are applied. Finally, process monitoring charts are deployed at the workplace for regular monitoring and control of the concerned CTQ characteristic. By adopting the DMAIC approach, standard deviation was reduced from 0.003 to 0.002, the process potential capability index (Cp) improved from 1.29 to 2.02, and the process performance capability index (Cpk) improved from 0.32 to 1.45.
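The capability indices reported above follow the standard definitions Cp = (USL − LSL) / 6σ and Cpk = min(USL − μ, μ − LSL) / 3σ. A minimal sketch, with illustrative data and specification limits not taken from the paper:

```python
# Standard process capability indices: Cp compares specification width to
# process spread; Cpk additionally penalizes off-center processes. The
# data and spec limits below are invented for illustration.
import statistics

def capability(data, lsl, usl):
    """Return (Cp, Cpk) for a sample and lower/upper spec limits."""
    mu = statistics.mean(data)
    sigma = statistics.stdev(data)           # sample standard deviation
    cp = (usl - lsl) / (6 * sigma)
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)
    return cp, cpk

data = [9.9, 10.0, 10.1, 10.0, 10.0]         # measured dimension, e.g. mm
cp, cpk = capability(data, lsl=9.7, usl=10.3)
print(f"Cp={cp:.2f}, Cpk={cpk:.2f}")         # equal here: process is centered
```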

  12. Irregular analytical errors in diagnostic testing - a novel concept.

    PubMed

    Vogeser, Michael; Seger, Christoph

    2018-02-23

    In laboratory medicine, routine periodic analyses for internal and external quality control measurements interpreted by statistical methods are mandatory for batch clearance. Data analysis of these process-oriented measurements allows insight into random analytical variation and systematic calibration bias over time. However, in such a setting, an individual sample is not under individual quality control; the quality control measurements act only at the batch level. Many effects and interferences associated with an individual diagnostic sample can compromise any analyte, and it is obvious that a quality-control-sample-based approach to quality assurance is not sensitive to such errors. To address the potential causes and nature of such analytical interference in individual samples more systematically, we suggest the introduction of a new term, the irregular (individual) analytical error. Practically, this term can be applied in any analytical assay that is traceable to a reference measurement system. For an individual sample, an irregular analytical error is defined as an inaccuracy (the deviation from a reference measurement procedure result) of a test result so large that it cannot be explained by the measurement uncertainty of the routine assay operating within the accepted limitations of the associated process quality control measurements. The deviation can be defined as the linear combination of the process measurement uncertainty and the method bias for the reference measurement system. Such errors should be termed irregular analytical errors of the individual sample. The measurement result is compromised either by an irregular effect associated with the individual composition (matrix) of the sample or by a processing error in the analytical process associated with that single sample. 
Currently, the availability of reference measurement procedures is still highly limited, but LC-isotope-dilution mass spectrometry methods are increasingly used for pre-market validation of routine diagnostic assays (these tests also involve substantial sets of clinical validation samples). Based on this definition and terminology, we list recognized causes of irregular analytical error as a risk catalog for clinical chemistry in this article. These include reproducible individual analytical errors (e.g. caused by anti-reagent antibodies) and non-reproducible, sporadic errors (e.g. an incorrect pipetting volume caused by air bubbles in a sample), both of which can lead to inaccurate results and risks for patients.
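The definition above implies a simple flagging rule: a result is irregular when its deviation from the reference value exceeds what the routine assay's measurement uncertainty and method bias can explain. A minimal sketch, where the coverage factor k and the additive uncertainty model are assumptions, not part of the authors' proposal:

```python
# Hedged sketch of the flagging rule implied by the definition of an
# irregular analytical error. The coverage factor k=3 and the simple
# additive model (deviation corrected for bias, compared against
# k * process uncertainty) are illustrative assumptions.

def is_irregular(result, reference, u_process, bias, k=3):
    """Flag when |result - reference - bias| > k * u_process."""
    return abs(result - reference - bias) > k * u_process

print(is_irregular(result=120.0, reference=100.0, u_process=2.0, bias=1.0))
# True: a 19-unit unexplained deviation far exceeds 3 * 2.0
```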

  13. TQM: the essential concepts.

    PubMed

    Chambers, D W

    1998-01-01

    This is an introduction to the major concepts in total quality management, a loose collection of management approaches that focus on continuous improvement of processes, guided by routine data collection and adjustment of the processes. Customer focus and involvement of all members of an organization are also characteristics commonly found in TQM. The seventy-five-year history of the movement is sketched from its beginning in statistical work on quality assurance through the many improvements and redefinitions added by American and Japanese thinkers. Essential concepts covered include: control cycles, focus on the process rather than the defects, the GEAR model, importance of the customer, upstream quality, just-in-time, kaizen, and service quality.

  14. [Strategies and development of quality assurance and control in the ELSA-Brasil].

    PubMed

    Schmidt, Maria Inês; Griep, Rosane Härter; Passos, Valéria Maria; Luft, Vivian Cristine; Goulart, Alessandra Carvalho; Menezes, Greice Maria de Souza; Molina, Maria del Carmen Bisi; Vigo, Alvaro; Nunes, Maria Angélica

    2013-06-01

    The ELSA-Brasil (Estudo Longitudinal de Saúde do Adulto - Brazilian Longitudinal Study for Adult Health) is a cohort study composed of 15,105 adults followed up in order to assess the development of chronic diseases, especially diabetes and cardiovascular disease. Its size, multicenter nature and the diversity of measurements required effective and efficient mechanisms of quality assurance and control. The main quality assurance activities (those developed before data collection) were: careful selection of research instruments, centralized training and certification, pretesting and pilot studies, and preparation of operation manuals for the procedures. Quality control activities (developed during data collection and processing) were performed more intensively at the beginning, when routines had not been established yet. The main quality control activities were: periodic observation of technicians, test-retest studies, data monitoring, network of supervisors, and cross visits. Data that estimate the reliability of the obtained information attest that the quality goals have been achieved.

  15. [Comparative quality measurements part 3: funnel plots].

    PubMed

    Kottner, Jan; Lahmann, Nils

    2014-02-01

    Comparative quality measurements between organisations or institutions are common. Quality measures need to be standardised and risk-adjusted, and random error must be taken adequately into account. Rankings that ignore precision lead to flawed interpretations and encourage "gaming". Applying confidence intervals is one way to take chance variation into account. Funnel plots are modified control charts based on Statistical Process Control (SPC) theory: the quality measures are plotted against their sample size, and warning and control limits 2 or 3 standard deviations from the centre line are added. As group size increases, precision increases, so the control limits form a funnel. Data points within the control limits are considered to show common cause variation; data points outside them, special cause variation, which deserves scrutiny without resorting to spurious rankings. Funnel plots offer data-based information for evaluating institutional performance within quality management contexts.
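For a proportion-type indicator, the funnel's warning (2 SD) and control (3 SD) limits can be computed from the overall rate p0 and each institution's sample size n. A minimal sketch assuming a binomial standard error (the rate and group sizes are illustrative):

```python
# Funnel-plot limits for a proportion-type quality indicator: limits sit
# z standard errors around the overall rate p0 and narrow as the
# denominator n grows. A normal approximation to the binomial is assumed;
# exact limits would use the binomial quantile function instead.
import math

def funnel_limits(p0, n, z):
    """Lower/upper limit for group size n (z=2 warning, z=3 control)."""
    se = math.sqrt(p0 * (1 - p0) / n)
    return p0 - z * se, p0 + z * se

for n in (25, 100, 400):
    lo, hi = funnel_limits(0.10, n, z=3)
    print(f"n={n}: {lo:.3f} .. {hi:.3f}")  # the band narrows as n grows
```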

  16. A quality improvement project using statistical process control methods for type 2 diabetes control in a resource-limited setting.

    PubMed

    Flood, David; Douglas, Kate; Goldberg, Vera; Martinez, Boris; Garcia, Pablo; Arbour, MaryCatherine; Rohloff, Peter

    2017-08-01

    Quality improvement (QI) is a key strategy for improving diabetes care in low- and middle-income countries (LMICs). This study reports on a diabetes QI project in rural Guatemala whose primary aim was to improve glycemic control of a panel of adult diabetes patients. Formative research suggested multiple areas for programmatic improvement in ambulatory diabetes care. This project utilized the Model for Improvement and Agile Global Health, our organization's complementary healthcare implementation framework. A bundle of improvement activities were implemented at the home, clinic and institutional level. Control charts of mean hemoglobin A1C (HbA1C) and proportion of patients meeting target HbA1C showed improvement as special cause variation was identified 3 months after the intervention began. Control charts for secondary process measures offered insights into the value of different components of the intervention. Intensity of home-based diabetes education emerged as an important driver of panel glycemic control. Diabetes QI work is feasible in resource-limited settings in LMICs and can improve glycemic control. Statistical process control charts are a promising methodology for use with panels or registries of diabetes patients. © The Author 2017. Published by Oxford University Press in association with the International Society for Quality in Health Care. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com
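The special-cause signal that such control charts raise can be sketched as a point falling outside the baseline mean ± 3 SD; the baseline HbA1c values below are illustrative, not the study's data:

```python
# Sketch of the simplest special-cause rule on a control chart: a new
# point beyond 3 standard deviations of the baseline period. Real SPC
# practice adds run rules (e.g. 8 consecutive points on one side of the
# center line); the baseline values here are invented.
import statistics

def special_cause(baseline, new_value, k=3):
    """True if new_value lies outside mean(baseline) ± k * SD(baseline)."""
    mu = statistics.mean(baseline)
    sigma = statistics.stdev(baseline)
    return abs(new_value - mu) > k * sigma

baseline = [9.1, 9.3, 9.0, 9.2, 9.4, 9.1]   # mean HbA1c (%) per month
print(special_cause(baseline, 8.2))          # True: improvement signals
```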

  17. Quality-Focused Management.

    ERIC Educational Resources Information Center

    Needham, Robbie Lee

    1993-01-01

    Presents the quality-focused management (QFM) system and explains the departure QFM makes from established community college management practices. Describes the system's self-directed teams engaged in a continuous improvement process driven by customer demand and long-term commitment to quality and cost control. (13 references.) (MAB)

  18. Math Problems for Water Quality Control Personnel, Instructor's Manual. Second Edition.

    ERIC Educational Resources Information Center

    Delvecchio, Fred; Brutsch, Gloria

    This document is the instructor's manual for a course in mathematics for water quality control personnel. It is designed so a program may be designed for a specific facility. The problem structures are arranged alphabetically by treatment process. Charts, graphs and/or drawings representing familiar data forms contain the necessary information to…

  19. Math Problems for Water Quality Control Personnel, Student Workbook. Second Edition.

    ERIC Educational Resources Information Center

    Delvecchio, Fred; Brutsch, Gloria

    This document is the student workbook for a course in mathematics for water quality control personnel. This version contains complete problems, answers and references. Problems are arranged alphabetically by treatment process. Charts, graphs, and drawings represent data forms an operator might see in a plant containing information necessary for…

  20. Evaluation of statistical protocols for quality control of ecosystem carbon dioxide fluxes

    Treesearch

    Jorge F. Perez-Quezada; Nicanor Z. Saliendra; William E. Emmerich; Emilio A. Laca

    2007-01-01

    The process of quality control of micrometeorological and carbon dioxide (CO2) flux data can be subjective and may lack repeatability, which would undermine the results of many studies. Multivariate statistical methods and time series analysis were used together and independently to detect and replace outliers in CO2 flux...

  1. Development and demonstration of manufacturing processes for fabricating graphite/LARC-160 polyimide structural elements, part 4, paragraph B

    NASA Technical Reports Server (NTRS)

    1980-01-01

    A quality assurance program was developed which included specifications for Celion/LARC-160 polyimide materials and quality control of materials and processes. The effects of monomer and/or polymer variables and prepreg variables on the processability of Celion/LARC prepreg were included. Processes for fabricating laminates, honeycomb core panels, and chopped fiber moldings were developed. Specimens were fabricated and tests conducted to qualify the processes for fabrication of demonstration components.

  2. Statistical Process Control: A Quality Tool for a Venous Thromboembolic Disease Registry.

    PubMed

    Posadas-Martinez, Maria Lourdes; Rojas, Liliana Paloma; Vazquez, Fernando Javier; De Quiros, Fernan Bernaldo; Waisman, Gabriel Dario; Giunta, Diego Hernan

    2016-01-01

    We aim to describe Statistical Process Control as a quality tool for the Institutional Registry of Venous Thromboembolic Disease (IRTD), a registry developed in a community-care tertiary hospital in Buenos Aires, Argentina. The IRTD is a prospective cohort. The process of data acquisition began with the creation of a computerized alert generated whenever physicians requested an imaging or laboratory study to diagnose venous thromboembolism, which defined eligible patients. The process then followed a structured methodology for patient inclusion, evaluation, and subsequent data entry. To control this process, process performance indicators were designed to be measured monthly. These included the number of eligible patients, the number of included patients, median time to patient evaluation, and percentage of patients lost to evaluation. Control charts were graphed for each indicator. The registry was evaluated over 93 months, in which 25,757 patients were reported and 6,798 patients met inclusion criteria. The median time to evaluation was 20 hours (SD, 12) and 7.7% of the total was lost to evaluation. Each indicator presented trends over time, caused by structural changes and improvement cycles, and therefore the center line showed inflections. Statistical process control through process performance indicators allowed us to monitor the performance of the registry over time and to detect systematic problems. We postulate that this approach could be reproduced for other clinical registries.

  3. Integrating automated support for a software management cycle into the TAME system

    NASA Technical Reports Server (NTRS)

    Sunazuka, Toshihiko; Basili, Victor R.

    1989-01-01

    Software managers are interested in the quantitative management of software quality, cost and progress. An integrated software management methodology, which can be applied throughout the software life cycle for any number of purposes, is required. The TAME (Tailoring A Measurement Environment) methodology is based on the improvement paradigm and the goal/question/metric (GQM) paradigm. This methodology helps generate a software engineering process and measurement environment based on the project characteristics. The SQMAR (software quality measurement and assurance technology) is a software quality metric system and methodology applied to the development processes. It is based on the feed-forward control principle. Quality target setting is carried out before the plan-do-check-action activities are performed. These methodologies are integrated to realize goal-oriented measurement, process control and visual management. A metric setting procedure based on the GQM paradigm, a management system called the software management cycle (SMC), and its application to a case study based on NASA/SEL data are discussed. The expected effects of SMC are quality improvement, managerial cost reduction, accumulation and reuse of experience, and a highly visual management reporting system.

  4. Review of indicators for cross-sectoral optimization of nosocomial infection prophylaxis – a perspective from structurally- and process-oriented hygiene

    PubMed Central

    Hübner, Nils-Olaf; Fleßa, Steffen; Jakisch, Ralf; Assadian, Ojan; Kramer, Axel

    2012-01-01

    In the care of patients, the prevention of nosocomial infections is crucial. For it to be successful, cross-sectoral, interface-oriented hygiene quality management is necessary. The goal is to apply the HACCP (Hazard Assessment and Critical Control Points) concept to hospital hygiene, in order to create a multi-dimensional hygiene control system based on hygiene indicators that will overcome the limitations of a procedurally non-integrated and non-cross-sectoral view of hygiene. Three critical risk dimensions can be identified for the implementation of three-dimensional quality control of hygiene in clinical routine: the constitution of the person concerned, the surrounding physical structures and technical equipment, and the medical procedures. In these dimensions, the establishment of indicators and threshold values enables a comprehensive assessment of hygiene quality. Thus, the cross-sectoral evaluation of the quality of structure, processes and results is decisive for the success of integrated infection prophylaxis. This study lays the foundation for hygiene indicator requirements and develops initial concepts for evaluating quality management in hygiene. PMID:22558049

  5. Evaluation of pH monitoring as a method of processor control.

    PubMed

    Stears, J G; Gray, J E; Winkler, N T

    1979-01-01

    Sensitometry and pH values of the developer solution were compared in controlled experiments on over-replenishment, developer depletion, and fixer contamination, and on a daily quality control basis. The purpose of these comparisons was to evaluate the potential of pH monitoring as a method of processor control, or as a supplement to sensitometry as a method of quality control. Reasonable correlation was found between pH values and film density in two of the three experiments, but little or no correlation was found in the third experiment and on a day-to-day basis. The conclusion drawn from these comparisons is that pH monitoring has several limitations which render it unsuitable as a method of daily processor quality control, as either a primary or supplementary technique. Sensitometry takes into account all the variables encountered in film processing and is the clear method of choice for processor quality control.

  6. Phenol induced by irradiation does not impair sensory quality of fenugreek and papaya

    NASA Astrophysics Data System (ADS)

    Chatterjee, Suchandra; Variyar, Prasad S.; Sharma, Arun

    2013-11-01

    The effect of radiation processing on the sensory quality of fenugreek and papaya exposed to doses in the range of 2.5-10 kGy and 100 Gy-2.5 kGy, respectively, was investigated. Despite an increase in the phenol content of the volatile oil of these food products, the overall sensory quality of the irradiated and control samples was not significantly affected by radiation processing.

  7. A Quadrupole Dalton-based multi-attribute method for product characterization, process development, and quality control of therapeutic proteins.

    PubMed

    Xu, Weichen; Jimenez, Rod Brian; Mowery, Rachel; Luo, Haibin; Cao, Mingyan; Agarwal, Nitin; Ramos, Irina; Wang, Xiangyang; Wang, Jihong

    2017-10-01

    During manufacturing and storage, therapeutic proteins are subject to various post-translational modifications (PTMs), such as isomerization, deamidation, oxidation, disulfide bond modifications, and glycosylation. Certain PTMs may affect bioactivity, stability, or the pharmacokinetic and pharmacodynamic profile, and are therefore classified as potential critical quality attributes (pCQAs). Identifying, monitoring, and controlling these PTMs are usually key elements of the Quality by Design (QbD) approach. Traditionally, multiple analytical methods are utilized for these purposes, which is time-consuming and costly. In recent years, multi-attribute monitoring methods have been developed in the biopharmaceutical industry. However, these methods combine high-end mass spectrometry with complicated data analysis software, which can pose difficulties when implemented in a quality control (QC) environment. Here we report a multi-attribute method (MAM) using a Quadrupole Dalton (QDa) mass detector to selectively monitor and quantitate PTMs in a therapeutic monoclonal antibody. The result output from the QDa-based MAM is straightforward and automatic. Evaluation indicates that this method provides results comparable to the traditional assays. To ensure future application in the QC environment, the method was qualified according to the International Conference on Harmonization (ICH) guideline and applied in the characterization of drug substance and stability samples. The QDa-based MAM is shown to be an extremely useful tool for product and process characterization studies that facilitates understanding of process impact on multiple quality attributes, while being QC-friendly and cost-effective.

  8. Read Code Quality Assurance

    PubMed Central

    Schulz, Erich; Barrett, James W.; Price, Colin

    1998-01-01

    As controlled clinical vocabularies assume an increasing role in modern clinical information systems, so the issue of their quality demands greater attention. In order to meet the resulting stringent criteria for completeness and correctness, a quality assurance system comprising a database of more than 500 rules is being developed and applied to the Read Thesaurus. The authors discuss the requirement to apply quality assurance processes to their dynamic editing database in order to ensure the quality of exported products. Sources of errors include human, hardware, and software factors as well as new rules and transactions. The overall quality strategy includes prevention, detection, and correction of errors. The quality assurance process encompasses simple data specification, internal consistency, inspection procedures and, eventually, field testing. The quality assurance system is driven by a small number of tables and UNIX scripts, with “business rules” declared explicitly as Structured Query Language (SQL) statements. Concurrent authorship, client-server technology, and an initial failure to implement robust transaction control have all provided valuable lessons. The feedback loop for error management needs to be short. PMID:9670131

  9. Read Code quality assurance: from simple syntax to semantic stability.

    PubMed

    Schulz, E B; Barrett, J W; Price, C

    1998-01-01

    As controlled clinical vocabularies assume an increasing role in modern clinical information systems, so the issue of their quality demands greater attention. In order to meet the resulting stringent criteria for completeness and correctness, a quality assurance system comprising a database of more than 500 rules is being developed and applied to the Read Thesaurus. The authors discuss the requirement to apply quality assurance processes to their dynamic editing database in order to ensure the quality of exported products. Sources of errors include human, hardware, and software factors as well as new rules and transactions. The overall quality strategy includes prevention, detection, and correction of errors. The quality assurance process encompasses simple data specification, internal consistency, inspection procedures and, eventually, field testing. The quality assurance system is driven by a small number of tables and UNIX scripts, with "business rules" declared explicitly as Structured Query Language (SQL) statements. Concurrent authorship, client-server technology, and an initial failure to implement robust transaction control have all provided valuable lessons. The feedback loop for error management needs to be short.
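
    The idea of declaring "business rules" as SQL statements, as in the two Read Code records above, can be illustrated with an in-memory database. This is a hedged sketch: the table layout and the rule are invented for illustration and are not the actual Read Thesaurus schema.

    ```python
    import sqlite3

    # Hypothetical miniature of a clinical term dictionary; NOT the real
    # Read Thesaurus schema, just enough structure to run one rule against.
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE term (code TEXT PRIMARY KEY, preferred_name TEXT)")
    con.executemany("INSERT INTO term VALUES (?, ?)",
                    [("H33..", "Asthma"), ("G30..", ""), ("F45..", "Migraine")])

    # A quality rule declared explicitly as SQL: every term must carry a
    # non-empty preferred name. The rule is data, not program logic.
    rule_sql = "SELECT code FROM term WHERE preferred_name IS NULL OR preferred_name = ''"
    violations = [row[0] for row in con.execute(rule_sql)]
    ```

    Because each rule is a declarative query, a database of hundreds of rules can be run mechanically after every editing transaction, which is what keeps the feedback loop for error management short.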

  10. Self-assessment procedure using fuzzy sets

    NASA Astrophysics Data System (ADS)

    Mimi, Fotini

    2000-10-01

    Self-assessment processes, initiated by a company itself and carried out by its own people, are considered the starting point for a regular strategic or operative planning process to ensure continuous quality improvement. Their importance has increased with the growing relevance and acceptance of international quality awards such as the Malcolm Baldrige National Quality Award, the European Quality Award and the Deming Prize. Award winners in particular use systematic and regular self-assessment, not least because they must verify their quality and business results for at least three years. The Total Quality Model of the European Foundation for Quality Management (EFQM), used for the European Quality Award, is the basis for self-assessment in Europe. This paper presents a method for supporting self-assessment based on the methodology of fuzzy control systems, which provides an effective means of converting a linguistic approximation into an automatic control strategy. In particular, the elements of the Quality Model mentioned above are interpreted as linguistic variables, represented by fuzzy intervals of LR-type. The input data have a qualitative character based on empirical investigation and expert knowledge, and the base variables are therefore ordinally scaled. The aggregation process takes place on the basis of a hierarchical structure. Finally, to make the method more practical, a PC-based software system was developed and implemented.
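
    A triangular membership function is the simplest special case of an LR-type fuzzy interval, so the linguistic-variable idea above can be sketched as follows. The membership shape, the 0-10 scale, the criteria, and the weights are all illustrative assumptions, not the paper's actual model.

    ```python
    # Sketch: represent an ordinal self-assessment rating as a triangular
    # fuzzy number and aggregate two criteria by a weighted sum.
    # Shapes, criteria names, and weights are invented for illustration.

    def tri_membership(x, a, b, c):
        """Triangular membership function with support (a, c) and peak at b."""
        if x <= a or x >= c:
            return 0.0
        if x <= b:
            return (x - a) / (b - a)
        return (c - x) / (c - b)

    # Linguistic rating "good" on a 0-10 scale, peaking at 7.
    mu = tri_membership(7.0, 5.0, 7.0, 9.0)

    # Hierarchical aggregation step: weighted combination of two criterion
    # scores (here represented by the peaks of their fuzzy numbers).
    weights = {"leadership": 0.4, "processes": 0.6}
    peaks = {"leadership": 7.0, "processes": 6.0}
    aggregate = sum(weights[k] * peaks[k] for k in weights)
    ```

    In the full method the aggregation would operate on the fuzzy intervals themselves rather than on their peaks, climbing the hierarchy of model elements level by level.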

  11. The NCC project: A quality management perspective

    NASA Technical Reports Server (NTRS)

    Lee, Raymond H.

    1993-01-01

    The Network Control Center (NCC) Project introduced the concept of total quality management (TQM) in mid-1990. The CSC project team established a program which focused on continuous process improvement in software development methodology and consistent deliveries of high quality software products for the NCC. The vision of the TQM program was to produce error free software. Specific goals were established to allow continuing assessment of the progress toward meeting the overall quality objectives. The total quality environment, now a part of the NCC Project culture, has become the foundation for continuous process improvement and has resulted in the consistent delivery of quality software products over the last three years.

  12. SWiFT Software Quality Assurance Plan.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Berg, Jonathan Charles

    This document describes the software development practice areas and processes which contribute to the ability of SWiFT software developers to provide quality software. These processes are designed to satisfy the requirements set forth by the Sandia Software Quality Assurance Program (SSQAP). Approvals: the SWiFT Software Quality Assurance Plan (SAND2016-0765) was approved by the Department Manager, Dave Minster (6121); the SWiFT Site Lead, Jonathan White (6121); and the SWiFT Controls Engineer, Jonathan Berg (6121). Change history: Issue A, 2016/01/27, originated by Jon Berg (06121), initial release of the SWiFT Software Quality Assurance Plan.

  13. Implementation of Quality Management in Core Service Laboratories

    PubMed Central

    Creavalle, T.; Haque, K.; Raley, C.; Subleski, M.; Smith, M.W.; Hicks, B.

    2010-01-01

    CF-28 The Genetics and Genomics group of the Advanced Technology Program of SAIC-Frederick exists to bring innovative genomic expertise, tools and analysis to NCI and the scientific community. The Sequencing Facility (SF) provides next generation short read (Illumina) sequencing capacity to investigators using a streamlined production approach. The Laboratory of Molecular Technology (LMT) offers a wide range of genomics core services including microarray expression analysis, miRNA analysis, array comparative genome hybridization, long read (Roche) next generation sequencing, quantitative real time PCR, transgenic genotyping, Sanger sequencing, and clinical mutation detection services to investigators from across the NIH. As the technology supporting this genomic research becomes more complex, the need for basic quality processes within all aspects of the core service groups becomes critical. The Quality Management group works alongside members of these labs to establish or improve processes supporting operations control (equipment, reagent and materials management), process improvement (reengineering/optimization, automation, acceptance criteria for new technologies and tech transfer), and quality assurance and customer support (controlled documentation/SOPs, training, service deficiencies and continual improvement efforts). Implementation and expansion of quality programs within unregulated environments demonstrates SAIC-Frederick's dedication to providing the highest quality products and services to the NIH community.

  14. The use of a non-nuclear density gauge for monitoring the compaction process of asphalt pavement

    NASA Astrophysics Data System (ADS)

    Van den bergh, Wim; Vuye, Cedric; Kara, Patricia; Couscheir, Karolien; Blom, Johan; Van Bouwel, Philippe

    2017-09-01

    The mechanical performance of an asphalt pavement affects its durability and thus its carbon footprint. Many parameters contribute to a durable asphalt mix, e.g. material selection, an accurate mix design, and the road design in which the asphalt mix quality is quantified. The quality of the asphalt mix, through its mechanical properties, is also related to the degree of compaction. For high production rates in particular, the laying process at the construction site needs an effective method, currently lacking, to monitor and immediately adjust compaction quality before the mix cools and without damaging the layer. In this paper the use of a non-nuclear density gauge (PQI - Pavement Quality Indicator) is evaluated, based on a site at Brussels Airport. Considering the outcome of the present research, the PQI is recommended as a tool for continuous density measurement that allows immediate adjustment during compaction, reduces the amount of core drilling needed for quality control, and serves as an a posteriori pavement density test where coring is prohibited. The PQI could be recommended as part of the standard quality control process in the Flemish region.

  15. Results of a Regional Effort to Improve Warfarin Management.

    PubMed

    Rose, Adam J; Park, Angela; Gillespie, Christopher; Van Deusen Lukas, Carol; Ozonoff, Al; Petrakis, Beth Ann; Reisman, Joel I; Borzecki, Ann M; Benedict, Ashley J; Lukesh, William N; Schmoke, Timothy J; Jones, Ellen A; Morreale, Anthony P; Ourth, Heather L; Schlosser, James E; Mayo-Smith, Michael F; Allen, Arthur L; Witt, Daniel M; Helfrich, Christian D; McCullough, Megan B

    2017-05-01

    Improved anticoagulation control with warfarin reduces adverse events and represents a target for quality improvement. No previous study has described an effort to improve anticoagulation control across a health system. To describe the results of an effort to improve anticoagulation control in the New England region of the Veterans Health Administration (VA). Our intervention encompassed 8 VA sites managing warfarin for more than 5000 patients in New England (Veterans Integrated Service Network 1 [VISN 1]). We provided sites with a system to measure processes of care, along with targeted audit and feedback. We focused on processes of care associated with site-level anticoagulation control, including prompt follow-up after out-of-range international normalized ratio (INR) values, minimizing loss to follow-up, and use of guideline-concordant INR target ranges. We used a difference-in-differences (DID) model to examine changes in anticoagulation control, measured as percentage time in therapeutic range (TTR), as well as process measures and compared VISN 1 sites with 116 VA sites located outside VISN 1. VISN 1 sites improved on TTR, our main indicator of quality, from 66.4% to 69.2%, whereas sites outside VISN 1 improved from 65.9% to 66.4% (DID 2.3%, P < 0.001). Improvement in TTR correlated strongly with the extent of improvement on process-of-care measures, which varied widely across VISN 1 sites. A regional quality improvement initiative, using performance measurement with audit and feedback, improved TTR by 2.3% more than control sites, which is a clinically important difference. Improving relevant processes of care can improve outcomes for patients receiving warfarin.
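
    The difference-in-differences estimate reported above follows directly from the four percentages in the abstract:

    ```python
    # Difference-in-differences on the reported time-in-therapeutic-range
    # (TTR) values: improvement at intervention sites minus improvement at
    # comparison sites. Numbers are taken from the abstract above.
    visn1_before, visn1_after = 66.4, 69.2        # VISN 1 intervention sites
    control_before, control_after = 65.9, 66.4    # 116 comparison VA sites

    did = (visn1_after - visn1_before) - (control_after - control_before)
    ```

    The intervention sites improved by 2.8 points, the comparison sites by 0.5, so the net effect attributed to the initiative is 2.3 percentage points of TTR.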

  16. A novel frame-level constant-distortion bit allocation for smooth H.264/AVC video quality

    NASA Astrophysics Data System (ADS)

    Liu, Li; Zhuang, Xinhua

    2009-01-01

    It is known that quality fluctuation has a major negative effect on visual perception. In previous work, we introduced a constant-distortion bit allocation method [1] for the H.263+ encoder. However, the method in [1] cannot be adapted directly to the newest H.264/AVC encoder because of the well-known chicken-and-egg dilemma arising from the rate-distortion optimization (RDO) decision process. To solve this problem, we propose a new two-stage constant-distortion bit allocation (CDBA) algorithm with enhanced rate control for the H.264/AVC encoder. In stage 1, the algorithm performs the RD optimization process with a constant quantization parameter QP. Based on the prediction residual signals from stage 1 and the target distortion for smooth video quality, the frame-level bit target is allocated using a closed-form approximation of the rate-distortion relationship similar to [1], and a fast stage-2 encoding process is performed with enhanced basic-unit rate control. Experimental results show that, compared with the original rate control algorithm provided by the H.264/AVC reference software JM12.1, the proposed constant-distortion frame-level bit allocation scheme reduces quality fluctuation and delivers much smoother PSNR on all test sequences.
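
    The constant-distortion idea can be caricatured with a generic hyperbolic rate-distortion model, R = theta / D: hold the distortion target fixed across frames and let the bit budget vary with each frame's complexity. This is an illustrative simplification under an assumed model form, not the paper's actual closed-form approximation, and the numbers are invented.

    ```python
    # Illustrative frame-level bit allocation under a constant target
    # distortion, assuming a simple hyperbolic R-D model R = theta / D.
    # Model form and values are assumptions for illustration only.

    theta = [1200.0, 900.0, 1500.0]   # hypothetical per-frame residual complexity
    target_distortion = 30.0          # constant distortion target (smooth quality)

    bits = [t / target_distortion for t in theta]
    ```

    Complex frames receive proportionally more bits, so decoded quality, rather than bit rate, stays level from frame to frame; a conventional rate control would instead hold the bit budget roughly constant and let distortion fluctuate.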

  17. Six Sigma Quality Management System and Design of Risk-based Statistical Quality Control.

    PubMed

    Westgard, James O; Westgard, Sten A

    2017-03-01

    Six sigma concepts provide a quality management system (QMS) with many useful tools for managing quality in medical laboratories. This Six Sigma QMS is driven by the quality required for the intended use of a test. The most useful form for this quality requirement is the allowable total error. Calculation of a sigma-metric provides the best predictor of risk for an analytical examination process, as well as a design parameter for selecting the statistical quality control (SQC) procedure necessary to detect medically important errors. Simple point estimates of sigma at medical decision concentrations are sufficient for laboratory applications. Copyright © 2016 Elsevier Inc. All rights reserved.
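
    The sigma-metric mentioned above is the standard Westgard calculation from the allowable total error (TEa), the observed bias, and the imprecision (CV), all expressed in percent at the medical decision concentration. The numbers below are illustrative.

    ```python
    # Sigma-metric for an analytical examination process:
    #   sigma = (TEa - |bias|) / CV
    # with TEa, bias, and CV in percent. Example values are illustrative.

    def sigma_metric(tea_pct, bias_pct, cv_pct):
        return (tea_pct - abs(bias_pct)) / cv_pct

    sigma = sigma_metric(tea_pct=10.0, bias_pct=1.0, cv_pct=1.5)
    ```

    A method at six sigma or better tolerates simple SQC rules with few controls, whereas a low-sigma method needs multirule procedures and more control measurements per run.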

  18. The construction of control chart for PM10 functional data

    NASA Astrophysics Data System (ADS)

    Shaadan, Norshahida; Jemain, Abdul Aziz; Deni, Sayang Mohd

    2014-06-01

    In this paper, a statistical procedure to construct a control chart for monitoring air quality (PM10) using functional data is proposed. A set of daily indices that represent the daily PM10 curves was obtained using Functional Principal Component Analysis (FPCA). By means of an iterative charting procedure, a reference data set that represented a stable PM10 process was obtained. The data were then used as a reference for monitoring future data. The procedure was applied to a seven-year (2004-2010) period of recorded data from the Klang air quality monitoring station located in the Klang Valley region of Peninsular Malaysia. The study showed that the control chart provided a useful visualization tool for monitoring air quality and was capable of detecting abnormality in the process system. In the case of the Klang station, the results showed that, with reference to 2004-2008, the air quality (PM10) in 2010 was better than that in 2009.
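
    A simplified stand-in for the procedure above: treat each day's PM10 curve as a row of a matrix, take the first principal-component score as the daily index, and chart it with 3-sigma limits computed from an in-control reference period. The data are synthetic, and full FPCA (smoothing, multiple components) is reduced here to one SVD component.

    ```python
    import numpy as np

    # Synthetic "in-control" reference period: 60 days of hourly PM10 curves
    # sharing a diurnal shape, with day-to-day level variation plus noise.
    rng = np.random.default_rng(0)
    hours = 24
    base = 50.0 + 10.0 * np.sin(np.linspace(0.0, 2.0 * np.pi, hours))
    day_levels = rng.normal(0.0, 5.0, size=60)
    reference = base + day_levels[:, None] + rng.normal(0.0, 0.5, size=(60, hours))
    future_day = base + 40.0   # one clearly abnormal (elevated) day

    # First principal component of the centered reference curves.
    mean_curve = reference.mean(axis=0)
    _, _, vt = np.linalg.svd(reference - mean_curve, full_matrices=False)
    pc1 = vt[0]

    # Daily index = PC1 score; control limits from the reference scores.
    scores = (reference - mean_curve) @ pc1
    ucl = scores.mean() + 3.0 * scores.std(ddof=1)
    lcl = scores.mean() - 3.0 * scores.std(ddof=1)

    # Monitor a future day by projecting it onto the same component.
    future_score = (future_day - mean_curve) @ pc1
    out_of_control = not (lcl <= future_score <= ucl)
    ```

    The iterative step in the paper, removing out-of-control reference days and recomputing the limits until the reference set is stable, would simply wrap this computation in a loop.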

  19. Nuclear Technology Series. Course 31: Quality-Assurance Practices.

    ERIC Educational Resources Information Center

    Center for Occupational Research and Development, Inc., Waco, TX.

    This technical specialty course is one of thirty-five courses designed for use by two-year postsecondary institutions in five nuclear technician curriculum areas: (1) radiation protection technician, (2) nuclear instrumentation and control technician, (3) nuclear materials processing technician, (4) nuclear quality-assurance/quality-control…

  20. Small Steps, Big Reward: Quality Improvement through Pilot Groups.

    ERIC Educational Resources Information Center

    Bindl, Jim; Schuler, Jim

    1988-01-01

    Because of a need for quality improvement, Wisconsin Power and Light trained two six-person pilot groups in statistical process control, had them apply that knowledge to actual problems, and showed management the dollars-and-cents savings that come from quality improvement. (JOW)

  1. Data-quality measures for stakeholder-implemented watershed-monitoring programs

    USGS Publications Warehouse

    Greve, Adrienne I.

    2002-01-01

    Community-based watershed groups, many of which collect environmental data, have steadily increased in number over the last decade. The data generated by these programs are often underutilized due to uncertainty in the quality of data produced. The incorporation of data-quality measures into stakeholder monitoring programs lends statistical validity to data. Data-quality measures are divided into three steps: quality assurance, quality control, and quality assessment. The quality-assurance step attempts to control sources of error that cannot be directly quantified. This step is part of the design phase of a monitoring program and includes clearly defined, quantifiable objectives, sampling sites that meet the objectives, standardized protocols for sample collection, and standardized laboratory methods. Quality control (QC) is the collection of samples to assess the magnitude of error in a data set due to sampling, processing, transport, and analysis. In order to design a QC sampling program, a series of issues needs to be considered: (1) potential sources of error, (2) the type of QC samples, (3) inference space, (4) the number of QC samples, and (5) the distribution of the QC samples. Quality assessment is the process of evaluating quality-assurance measures and analyzing the QC data in order to interpret the environmental data. Quality assessment has two parts: one that is conducted on an ongoing basis as the monitoring program is running, and one that is conducted during the analysis of environmental data. The discussion of the data-quality measures is followed by an example of their application to a monitoring program in the Big Thompson River watershed of northern Colorado.
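
    One routine quality-assessment calculation in such programs is the relative percent difference (RPD) between a routine sample and its field duplicate, which quantifies error introduced by sampling, processing, transport, and analysis. The 20% acceptance threshold and the concentrations below are illustrative assumptions, not values from the report.

    ```python
    # Relative percent difference (RPD) between a routine sample and its
    # field duplicate: a standard QC check on overall precision.
    # The 20% acceptance threshold and the data values are assumptions.

    def rpd(primary, duplicate):
        return abs(primary - duplicate) / ((primary + duplicate) / 2.0) * 100.0

    value = rpd(primary=4.2, duplicate=3.8)   # e.g. nitrate, mg/L
    acceptable = value <= 20.0
    ```

    During ongoing quality assessment, RPDs that exceed the program's threshold flag sampling dates whose environmental data should be interpreted with caution or excluded.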

  2. The effects of RN staffing hours on nursing home quality: a two-stage model.

    PubMed

    Lee, Hyang Yuol; Blegen, Mary A; Harrington, Charlene

    2014-03-01

    Based on the structure-process-outcome approach, this study examined the association of registered nurse (RN) staffing hours with five quality indicators: two process measures (catheter use and antipsychotic drug use) and three outcome measures (pressure ulcers, urinary tract infections, and weight loss). We used data on resident assessments, RN staffing, organizational characteristics, and market factors to examine the quality of 195 nursing homes operating in Colorado, a rural state of the United States. Two-stage least squares regression models were performed to address the endogenous relationship between RN staffing and the outcome-related quality indicators, and ordinary least squares regression was used for the process-related ones. The analysis focused on the relationship of RN staffing to nursing home quality indicators, controlling for organizational characteristics, resources, resident case mix, and market factors, with clustering to control for geographical differences. Higher RN hours were associated with fewer pressure ulcers, but RN hours were not related to the other quality indicators. The findings show the importance of understanding the role of nurse staffing in nursing home care, as well as the significance of associated contextual factors for nursing home quality, even in a small rural state. Copyright © 2013 Elsevier Ltd. All rights reserved.
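
    Two-stage least squares handles the endogeneity noted above by first projecting the endogenous regressor onto instruments, then regressing the outcome on that projection. A minimal sketch with one endogenous regressor and one instrument, on synthetic data (not the study's nursing home data; the true effect here is set to 2.0 by construction):

    ```python
    import numpy as np

    # Synthetic setup: x (e.g. staffing hours) is endogenous because it shares
    # the unobserved confounder u with the outcome y; z is a valid instrument.
    rng = np.random.default_rng(1)
    n = 2000
    z = rng.normal(size=n)                        # instrument
    u = rng.normal(size=n)                        # unobserved confounder
    x = 0.8 * z + u + rng.normal(size=n)          # endogenous regressor
    y = 2.0 * x + 3.0 * u + rng.normal(size=n)    # true effect of x is 2.0

    def ols(X, target):
        return np.linalg.lstsq(X, target, rcond=None)[0]

    ones = np.ones(n)
    naive = ols(np.column_stack([ones, x]), y)    # biased upward by u

    # Stage 1: fit x on the instrument; Stage 2: regress y on the fitted x.
    x_hat = np.column_stack([ones, z]) @ ols(np.column_stack([ones, z]), x)
    beta = ols(np.column_stack([ones, x_hat]), y)  # 2SLS estimate of the effect
    ```

    The naive OLS slope absorbs the confounder and overstates the effect, while the 2SLS slope recovers something close to the true 2.0; the same logic is why the study instruments RN staffing before relating it to outcome indicators.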

  3. Shuttle Mission STS-50: Orbital Processing of High-Quality CdTe Compound Semiconductors Experiment: Final Flight Sample Characterization Report

    NASA Technical Reports Server (NTRS)

    Larson, David J.; Casagrande, Luis G.; DiMarzio, Don; Alexander, J. Iwan D.; Carlson, Fred; Lee, Taipo; Dudley, Michael; Raghathamachar, Balaji

    1998-01-01

    The Orbital Processing of High-Quality Doped and Alloyed CdTe Compound Semiconductors program was initiated to investigate, quantitatively, the influences of gravitationally dependent phenomena on the growth and quality of bulk compound semiconductors. The objective was to improve crystal quality (both structural and compositional) and to better understand and control the variables within the crystal growth production process. The empirical effort entailed the development of a terrestrial (one-g) experiment baseline for quantitative comparison with microgravity (mu-g) results. This effort was supported by the development of high-fidelity process models of heat transfer, fluid flow and solute redistribution, and thermo-mechanical stress occurring in the furnace, safety cartridge, ampoule, and crystal throughout the melting, seeding, crystal growth, and post-solidification processing. In addition, the sensitivity of the orbital experiments was analyzed with respect to the residual microgravity (mu-g) environment, both steady state and g-jitter. CdZnTe crystals were grown in one-g and in mu-g. Crystals processed terrestrially were grown at the NASA Ground Control Experiments Laboratory (GCEL) and at Grumman Aerospace Corporation (now Northrop Grumman Corporation). Two mu-g crystals were grown in the Crystal Growth Furnace (CGF) during the First United States Microgravity Laboratory Mission (USML-1), STS-50, June 24 - July 9, 1992.

  4. Bioreactor process parameter screening utilizing a Plackett-Burman design for a model monoclonal antibody.

    PubMed

    Agarabi, Cyrus D; Schiel, John E; Lute, Scott C; Chavez, Brittany K; Boyne, Michael T; Brorson, Kurt A; Khan, Mansoora; Read, Erik K

    2015-06-01

    Consistent high-quality antibody yield is a key goal for cell culture bioprocessing. This endpoint is typically achieved in commercial settings through product and process engineering of bioreactor parameters during development. When the process is complex and not optimized, small changes in composition and control may yield a finished product of less desirable quality. Therefore, changes proposed to currently validated processes usually require justification and are reported to the US FDA for approval. Recently, design-of-experiments-based approaches have been explored to rapidly and efficiently achieve this goal of optimized yield with a better understanding of product and process variables that affect a product's critical quality attributes. Here, we present a laboratory-scale model culture where we apply a Plackett-Burman screening design to parallel cultures to study the main effects of 11 process variables. This exercise allowed us to determine the relative importance of these variables and identify the most important factors to be further optimized in order to control both desirable and undesirable glycan profiles. We found engineering changes relating to culture temperature and nonessential amino acid supplementation significantly impacted glycan profiles associated with fucosylation, β-galactosylation, and sialylation. All of these are important for monoclonal antibody product quality. © 2015 Wiley Periodicals, Inc. and the American Pharmacists Association.
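
    A Plackett-Burman screening design like the one above studies 11 two-level factors in only 12 runs. The standard 12-run design is built by cyclically shifting a published generator row and appending an all-minus run; the construction below checks the two properties that make it a screening design, balance and orthogonality of main effects.

    ```python
    import numpy as np

    # Standard 12-run Plackett-Burman design for up to 11 two-level factors,
    # built from cyclic shifts of the published generator row.
    generator = np.array([1, 1, -1, 1, 1, 1, -1, -1, -1, 1, -1])
    rows = [np.roll(generator, i) for i in range(11)]
    design = np.vstack(rows + [-np.ones(11, dtype=int)])  # final all-minus run

    # Balance: every factor is run at +1 in exactly 6 of the 12 runs.
    balanced = bool((design.sum(axis=0) == 0).all())

    # Orthogonality: distinct factor columns are uncorrelated, so each main
    # effect is estimated independently of the others.
    orthogonal = np.array_equal(design.T @ design, 12 * np.eye(11, dtype=int))
    ```

    With 11 process variables mapped onto the 11 columns, each factor's main effect is just the difference between the mean response at its +1 runs and its -1 runs, which is what lets a screening study rank factors before committing to a full optimization.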

  5. Mindcontrol: A web application for brain segmentation quality control.

    PubMed

    Keshavan, Anisha; Datta, Esha; M McDonough, Ian; Madan, Christopher R; Jordan, Kesshi; Henry, Roland G

    2018-04-15

    Tissue classification plays a crucial role in the investigation of normal neural development, brain-behavior relationships, and the disease mechanisms of many psychiatric and neurological illnesses. Ensuring the accuracy of tissue classification is important for quality research and, in particular, the translation of imaging biomarkers to clinical practice. Assessment with the human eye is vital to correct various errors inherent to all currently available segmentation algorithms. Manual quality assurance becomes methodologically difficult at a large scale - a problem of increasing importance as the number of data sets is on the rise. To make this process more efficient, we have developed Mindcontrol, an open-source web application for the collaborative quality control of neuroimaging processing outputs. The Mindcontrol platform consists of a dashboard to organize data, descriptive visualizations to explore the data, an imaging viewer, and an in-browser annotation and editing toolbox for data curation and quality control. Mindcontrol is flexible and can be configured for the outputs of any software package in any data organization structure. Example configurations for three large, open-source datasets are presented: the 1000 Functional Connectomes Project (FCP), the Consortium for Reliability and Reproducibility (CoRR), and the Autism Brain Imaging Data Exchange (ABIDE) Collection. These demo applications link descriptive quality control metrics, regional brain volumes, and thickness scalars to a 3D imaging viewer and editing module, resulting in an easy-to-implement quality control protocol that can be scaled for any size and complexity of study. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

  6. Statistical Process Control Charts for Measuring and Monitoring Temporal Consistency of Ratings

    ERIC Educational Resources Information Center

    Omar, M. Hafidz

    2010-01-01

    Methods of statistical process control were briefly investigated in the field of educational measurement as early as 1999. However, only the use of a cumulative sum chart was explored. In this article other methods of statistical quality control are introduced and explored. In particular, methods in the form of Shewhart mean and standard deviation…
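
    Shewhart mean and standard deviation charts of the kind explored above can be sketched with the standard tabulated constants A3, B3, and B4 for subgroups of size 5; the rating data here are synthetic.

    ```python
    import numpy as np

    # Shewhart X-bar and S control charts for rating subgroups of size 5.
    # A3, B3, B4 are the standard chart constants for subgroup size n = 5;
    # the rating data are synthetic.
    A3, B3, B4 = 1.427, 0.0, 2.089

    rng = np.random.default_rng(2)
    subgroups = rng.normal(70.0, 3.0, size=(20, 5))   # 20 subgroups of 5 ratings

    xbar = subgroups.mean(axis=1)        # subgroup means
    s = subgroups.std(axis=1, ddof=1)    # subgroup standard deviations
    grand_mean, s_bar = xbar.mean(), s.mean()

    xbar_limits = (grand_mean - A3 * s_bar, grand_mean + A3 * s_bar)
    s_limits = (B3 * s_bar, B4 * s_bar)

    in_control = (
        all(xbar_limits[0] <= m <= xbar_limits[1] for m in xbar)
        and all(s_limits[0] <= v <= s_limits[1] for v in s)
    )
    ```

    In a rating context, a point outside the X-bar limits suggests a drift in rater severity over time, while a point outside the S limits suggests a change in rater consistency.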

  7. Information Quality in Regulatory Decision Making: Peer Review versus Good Laboratory Practice.

    PubMed

    McCarty, Lynn S; Borgert, Christopher J; Mihaich, Ellen M

    2012-07-01

    There is an ongoing discussion on the provenance of toxicity testing data regarding how best to ensure its validity and credibility. A central argument is whether journal peer-review procedures are superior to Good Laboratory Practice (GLP) standards employed for compliance with regulatory mandates. We sought to evaluate the rationale for regulatory decision making based on peer-review procedures versus GLP standards. We examined pertinent published literature regarding how scientific data quality and validity are evaluated for peer review, GLP compliance, and development of regulations. Some contend that peer review is a coherent, consistent evaluative procedure providing quality control for experimental data generation, analysis, and reporting sufficient to reliably establish relative merit, whereas GLP is seen as merely a tracking process designed to thwart investigator corruption. This view is not supported by published analyses pointing to subjectivity and variability in peer-review processes. Although GLP is not designed to establish relative merit, it is an internationally accepted quality assurance, quality control method for documenting experimental conduct and data. Neither process is completely sufficient for establishing relative scientific soundness. However, changes occurring both in peer-review processes and in regulatory guidance resulting in clearer, more transparent communication of scientific information point to an emerging convergence in ensuring information quality. The solution to determining relative merit lies in developing a well-documented, generally accepted weight-of-evidence scheme to evaluate both peer-reviewed and GLP information used in regulatory decision making where both merit and specific relevance inform the process.

  8. Quality risk management of top spray fluidized bed process for antihypertensive drug formulation with control strategy engendered by Box-behnken experimental design space.

    PubMed

    Mukharya, Amit; Patel, Paresh U; Shenoy, Dinesh; Chaudhary, Shivang

    2013-01-01

    Lacidipine (LCDP) is a very poorly soluble and highly biovariable calcium channel blocker used in the treatment of hypertension. To increase its apparent solubility and to reduce its biovariability, solid dispersion fluid bed processing technology was explored, as it produces highly dispersible granules with a characteristic porous structure that enhances dispersibility, wettability, blend uniformity (by dissolving and spraying a solution of actives), flowability, and compressibility of granules for tableting, and reduces variability through uniform distribution of the drug-binder solution on carrier molecules. The main object of this quality risk management (QRM) study is to provide a sophisticated "robust and rugged" fluidized bed process (FBP) for the preparation of LCDP tablets with the desired quality (stability) and performance (dissolution) by the quality by design (QbD) concept. The study principally focuses on a thorough mechanistic understanding of the FBP by which it is developed and scaled up, with knowledge of the critical risks involved in the manufacturing process analyzed by risk assessment tools such as qualitative Initial Risk-based Matrix Analysis (IRMA) and quantitative Failure Mode Effects Analysis (FMEA), to identify and rank parameters with potential to have an impact on In Process/Finished Product Critical Quality Attributes (IP/FP CQAs). These Critical Process Parameters (CPPs) were further refined by DoE and MVDA to develop a design space with Real Time Release Testing (RTRT), leading to implementation of a control strategy that achieves consistent finished product quality at lab scale itself, to prevent possible product failure at larger manufacturing scale.
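
    The FMEA ranking used above multiplies three ordinal scores into a risk priority number (RPN). A minimal sketch, with failure modes and scores invented for illustration (they are not the study's actual risk assessment):

    ```python
    # Failure Mode Effects Analysis (FMEA) risk ranking: each failure mode
    # gets RPN = severity * occurrence * detection, each scored 1-10.
    # The failure modes and scores below are invented for illustration.

    failure_modes = {
        "spray nozzle blockage":       {"severity": 8, "occurrence": 4, "detection": 3},
        "inlet air temperature drift": {"severity": 6, "occurrence": 5, "detection": 2},
        "binder solution variability": {"severity": 7, "occurrence": 3, "detection": 5},
    }

    rpn = {name: s["severity"] * s["occurrence"] * s["detection"]
           for name, s in failure_modes.items()}
    ranked = sorted(rpn, key=rpn.get, reverse=True)
    ```

    The highest-RPN modes are the ones carried forward as critical process parameters for DoE refinement, which is how the qualitative risk matrix feeds the quantitative design-space work.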

  9. Quality risk management of top spray fluidized bed process for antihypertensive drug formulation with control strategy engendered by Box-Behnken experimental design space

    PubMed Central

    Mukharya, Amit; Patel, Paresh U; Shenoy, Dinesh; Chaudhary, Shivang

    2013-01-01

    Introduction: Lacidipine (LCDP) is a very low soluble and highly biovariable calcium channel blocker used in the treatment of hypertension. To increase its apparent solubility and to reduce its biovariability, solid dispersion fluid bed processing technology was explored, as it produces highly dispersible granules with a characteristic porous structure that enhances dispersibility, wettability, blend uniformity (by dissolving and spraying a solution of actives), flowability and compressibility of granules for tableting and reducing variability by uniform drug-binder solution distribution on carrier molecules. Materials and Methods: The main objective of this quality risk management (QRM) study is to provide a sophisticated “robust and rugged” Fluidized Bed Process (FBP) for the preparation of LCDP tablets with desired quality (stability) and performance (dissolution) by quality by design (QbD) concept. Results and Conclusion: This study is principally focusing on thorough mechanistic understanding of the FBP by which it is developed and scaled up with a knowledge of the critical risks involved in manufacturing process analyzed by risk assessment tools such as qualitative Initial Risk-based Matrix Analysis (IRMA) and quantitative Failure Mode Effects Analysis (FMEA) to identify and rank parameters with potential to have an impact on In Process/Finished Product Critical Quality Attributes (IP/FP CQAs). These Critical Process Parameters (CPPs) were further refined by DoE and MVDA to develop design space with Real Time Release Testing (RTRT) that leads to implementation of a control strategy to achieve consistent finished product quality at lab scale itself to prevent possible product failure at larger manufacturing scale. PMID:23799202

  10. Automation of testing modules of the ELSY-TMK controller

    NASA Astrophysics Data System (ADS)

    Dolotov, A. E.; Dolotova, R. G.; Petuhov, D. V.; Potapova, A. P.

    2017-01-01

    Automation of production processes helps ensure high quality standards for released products and raises labour efficiency. This paper presents data on automating the test process for the ELSY-TMK controller [1]. The ELSY-TMK programmable logic controller is an effective modular platform for building automation systems for small and medium-scale industrial production. Its modern, functional communication standard and open environment make the logic controller a powerful tool for a wide spectrum of industrial automation applications. The algorithm tests controller modules by operating the switching system and external devices faster, and with higher quality, than manual testing can achieve.

  11. Many roads may lead to Rome: Selected features of quality control within environmental assessment systems in the US, NL, CA, and UK

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Günther, Markus, E-mail: markus.guenther@tu-berlin.de; Geißler, Gesa; Köppel, Johann

    As there is no one-and-only concept on how to precisely define and establish quality control (QC) or quality assurance (QA) in the making of environmental assessments (EA), this paper presents selected features of international approaches that address quality in EA systems in the USA, the Netherlands, Canada, and the United Kingdom. Based on explanative case studies, we highlight the embedding of specific quality control features within the EA systems, the objectives and processes, and relevant transparency challenges. Such features of QC/QA approaches can be considered in cases where substantial quality control and assurance efforts are still missing. Yet further research needs to be conducted on the efficacy of these approaches, which remains beyond the scope of this study. - Highlights: • We present four tools for quality control and assurance from different EA systems. • Approaches vary in institutional setting, objectives, procedures, and transparency. • Highlighted features might provide guidance in cases where QC/QA is still lacking.

  12. Lessons Learned on Quality (of) Standards

    NASA Astrophysics Data System (ADS)

    Gerlich, Rainer; Gerlich, Ralf

    2011-08-01

    Standards are used to describe and ensure the quality of products, services and processes throughout almost all branches of industry, including the field of software engineering. Contractors and suppliers are obligated by their customers and certification authorities to follow a certain set of standards during development. For example, a customer can more easily participate in and control the contractor's process when enforcing a standard process. However, as with any requirement, a standard may also impede the contractor or supplier in assuring the actual quality of the product, in the sense of fitness for the purpose intended by the customer. This is the case when a standard defines specific quality assurance activities requiring a considerable amount of effort while other more efficient but equivalent or even superior approaches are blocked. Then improvement of the ratio between cost and quality beyond minuscule advances is heavily impeded. While in some parts being too specific in defining the mechanisms of the enforced process, standards are sometimes too weak in defining the principles or goals of control of product quality. Therefore this paper addresses the following issues: (1) Which conclusions can be drawn on the quality and efficiency of a standard? (2) If and how is it possible to improve or evolve a standard? (3) How well does a standard guide a user towards high quality of the end product? One conclusion is that the analyzed standards do interfere with technological innovation, even though they leave a lot of freedom for concretization and are understood as technology-independent. Another conclusion is that standards are not only a matter of quality but also a matter of competitiveness of the industry, depending on resulting costs and time-to-market. When the costs induced by a standard are not adequate to the achievable quality, industry encounters a significant disadvantage.

  13. Process monitoring and visualization solutions for hot-melt extrusion: a review.

    PubMed

    Saerens, Lien; Vervaet, Chris; Remon, Jean Paul; De Beer, Thomas

    2014-02-01

    Hot-melt extrusion (HME) is applied as a continuous pharmaceutical manufacturing process for the production of a variety of dosage forms and formulations. To ensure the continuity of this process, the quality of the extrudates must be assessed continuously during manufacturing. The objective of this review is to provide an overview and evaluation of the available process analytical techniques which can be applied in hot-melt extrusion. Pharmaceutical extruders are equipped with traditional (univariate) process monitoring tools, observing barrel and die temperatures, throughput, screw speed, torque, drive amperage, melt pressure and melt temperature. The relevance of several spectroscopic process analytical techniques for monitoring and control of pharmaceutical HME has been explored recently. Nevertheless, many other sensors visualizing HME and measuring diverse critical product and process parameters with potential use in pharmaceutical extrusion are available, and were thoroughly studied in polymer extrusion. The implementation of process analytical tools in HME serves two purposes: (1) improving process understanding by monitoring and visualizing the material behaviour and (2) monitoring and analysing critical product and process parameters for process control, allowing to maintain a desired process state and guaranteeing the quality of the end product. This review is the first to provide an evaluation of the process analytical tools applied for pharmaceutical HME monitoring and control, and discusses techniques that have been used in polymer extrusion having potential for monitoring and control of pharmaceutical HME. © 2013 Royal Pharmaceutical Society.

  14. A Validity-Based Approach to Quality Control and Assurance of Automated Scoring

    ERIC Educational Resources Information Center

    Bejar, Isaac I.

    2011-01-01

    Automated scoring of constructed responses is already operational in several testing programmes. However, as the methodology matures and the demand for the utilisation of constructed responses increases, the volume of automated scoring is likely to increase at a fast pace. Quality assurance and control of the scoring process will likely be more…

  15. [Application of quality by design in granulation process for ginkgo leaf tablet (Ⅱ): identification of critical quality attributes].

    PubMed

    Xu, Bing; Cui, Xiang-Long; Yang, Chan; Wang, Xin; Shi, Xin-Yuan; Qiao, Yan-Jiang

    2017-03-01

    Quality by design (QbD) highlights the concept of "begin with the end": thoroughly understand the target product quality first, and then let it guide pharmaceutical process development and quality control throughout the whole manufacturing process. In this paper, the Ginkgo biloba granule intermediates were taken as the research object, and the requirements on the tensile strength of tablets were treated as the goal, to establish methods for identifying the granules' critical quality attributes (CQAs) and setting the CQAs' limits. Firstly, an orthogonal partial least squares (OPLS) model was adopted to build the relationship between the micromeritic properties of 29 batches of granules and the tensile strength of ginkgo leaf tablets, and the potential critical quality attributes (pCQAs) were thereby screened by variable importance in projection (VIP) indexes. Then, a series of OPLS models were rebuilt by removing pCQA variables one by one in order of increasing VIP value. The model performance results demonstrated that the calibration and predictive performance of the model showed no decreasing trend after variable reduction. In consideration of the results from variable selection, as well as the collinearity test and testability of the pCQAs, the median particle size (D₅₀) and the bulk density (Da) were identified as critical quality attributes (CQAs). The design space of the CQAs was developed based on a multiple linear regression model established between the CQAs (D₅₀ and Da) and the tensile strength. The control constraints of the CQAs were determined as 170 μm< D₅₀<500 μm and 0.30 g•cm⁻³

  16. GEOSPATIAL QA

    EPA Science Inventory

    Geospatial Science is increasingly becoming an important tool in making Agency decisions. Quality Control and Quality Assurance are required to be integrated during the planning, implementation and assessment of geospatial databases, processes and products. In order to ensure Age...

  17. ISO 9001 in a neonatal intensive care unit (NICU).

    PubMed

    Vitner, Gad; Nadir, Erez; Feldman, Michael; Yurman, Shmuel

    2011-01-01

    The aim of this paper is to present the process for approving and certifying a neonatal intensive care unit to ISO 9001 standards. The process started with the department head's decision to improve service quality before deciding to achieve ISO 9001 certification. Department processes were mapped and quality management mechanisms were developed. Process control and performance measurements were defined and implemented to monitor the daily work. A service satisfaction review was conducted to get feedback from families. In total, 28 processes and related work instructions were defined. Process yields showed service improvements. Family satisfaction improved. The paper is based on preparing only one neonatal intensive care unit for the ISO 9001 standard. The case study should act as an incentive for hospital managers aiming to improve service quality based on the ISO 9001 standard. ISO 9001 is becoming a recommended tool to improve clinical service quality.

  18. Combined Effects of Irrigation Regime, Genotype, and Harvest Stage Determine Tomato Fruit Quality and Aptitude for Processing into Puree.

    PubMed

    Arbex de Castro Vilas Boas, Alexandre; Page, David; Giovinazzo, Robert; Bertin, Nadia; Fanciullino, Anne-Laure

    2017-01-01

    Industry tomatoes are produced under a range of climatic conditions and practices which significantly impact the main quality traits of harvested fruits. However, the quality of tomato intended for processing is currently addressed on delivery through color and Brix only, whereas other traits are overlooked. Very few works have provided an integrated view of the management of tomato puree quality throughout the chain. To gain insights into pre- and post-harvest interactions, four genotypes, two water regimes, three maturity stages, and two processes were investigated. Field and glasshouse experiments were conducted near Avignon, France, from May to August 2016. Two irrigation regimes were applied: control plants were irrigated in order to match 100% of evapotranspiration (ETP); water deficit (WD) plants were irrigated as control plants until anthesis of the first flowers, then irrigation was reduced to 60% and 50% ETP in the field and glasshouse, respectively. Fruits were collected at three stages during ripening. Their color, fresh weight, dry matter content, and metabolite contents were determined before processing. Pericarp cell size was evaluated in glasshouse only. Two laboratory-scaled processing methods were applied before structural and biochemical analyses of the purees. Results outlined interactive effects between crop and process management. WD hardly reduced yield, but increased dry matter content in the field, in contrast to the glasshouse. The puree viscosity strongly depended on the genotype and the maturity stage, but it was disconnected from fruit dry matter content or Brix. The process impact on puree viscosity strongly depended on water supply during fruit production. Moreover, the lycopene content of fresh fruit may influence puree viscosity. This work opens new perspectives for managing puree quality in the field, showing that it was possible to reduce water supply without affecting yield and to improve puree quality.

  19. The patient's perspective in the Dutch National Technical Agreement on Telemedicine.

    PubMed

    Meijer, Wouter J

    2008-01-01

    In 2007, the Dutch National Technical Agreement (NTA) for Telemedicine was established. Telemedicine deals with care processes. The goals of Telemedicine were defined broadly, including quality of life in non-medical terms as seen from the patient's perspective: 1) independence; 2) self-reliance; 3) participation in society and social life; and 4) self-determination (autonomy through freedom of choice) for the care consumer and his environment. Quality aspects were defined at three levels. 1) The patient level: Telemedicine must be in line with the patient's needs. 2) The level of information provision: the patient's rights in information control were also defined in the NTA; the care consumer has ultimate control over his own data. The care consumer decides who, in which functional capacity within the care process, is entitled to access which data at which level (reading) and is entitled to process it in some way: making additions, changes or possibly deletions (writing). On request, the healthcare provider must allow the care consumer access to his own data as quickly as possible and/or provide a copy of (part of) the record. 3) The level of business processes: for example, it is important that the care process is designed on the basis of statutory requirements for the allocation and registration of the roles, rights and obligations of all actors concerned. For quality assurance, the processes must be defined on the basis of the function that they perform in the achievement of the goals (intended outcome), from the starting situation (input). The intended outcome means that the needs or requirements of the involved parties are fulfilled. The quality of the Telemedicine service must be assured in a cyclical and ongoing process. This can best be done by developing a quality management system based on indicators and criteria for quality.

  20. Preanalytical errors in medical laboratories: a review of the available methodologies of data collection and analysis.

    PubMed

    West, Jamie; Atherton, Jennifer; Costelloe, Seán J; Pourmahram, Ghazaleh; Stretton, Adam; Cornes, Michael

    2017-01-01

    Preanalytical errors have previously been shown to account for a significant proportion of errors in laboratory processes and to contribute to a number of patient safety risks. Accreditation against ISO 15189:2012 requires that laboratory Quality Management Systems consider the impact of preanalytical processes in areas such as the identification and control of non-conformances, continual improvement, internal audit and quality indicators. Previous studies have shown that there is a wide variation in the definition, repertoire and collection methods for preanalytical quality indicators. The International Federation of Clinical Chemistry Working Group on Laboratory Errors and Patient Safety has defined a number of quality indicators for the preanalytical stage, and the adoption of harmonized definitions will support interlaboratory comparisons and continual improvement. There are a variety of data collection methods, including audit, manual recording processes, incident reporting mechanisms and laboratory information systems. Quality management processes such as benchmarking, statistical process control, Pareto analysis and failure mode and effect analysis can be used to review data and should be incorporated into clinical governance mechanisms. In this paper, The Association for Clinical Biochemistry and Laboratory Medicine PreAnalytical Specialist Interest Group review the various data collection methods available. Our recommendation is the use of the laboratory information management systems as a recording mechanism for preanalytical errors, as this provides the easiest and most standardized mechanism of data capture.
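
    Pareto analysis, one of the quality management tools named above, separates the "vital few" error categories from the trivial many. A minimal sketch with hypothetical error counts:

```python
def pareto(counts, threshold=0.8):
    """Return the categories that cumulatively account for `threshold` of errors."""
    total = sum(counts.values())
    vital, cumulative = [], 0
    for category, n in sorted(counts.items(), key=lambda kv: kv[1], reverse=True):
        vital.append(category)
        cumulative += n
        if cumulative / total >= threshold:
            break
    return vital

# Hypothetical preanalytical error tallies for one month.
errors = {
    "haemolysed sample":   120,
    "insufficient volume":  60,
    "mislabelled tube":     15,
    "wrong container":       5,
}
print(pareto(errors))
```

    Improvement effort would then be focused on the categories returned, which here cover 90% of all recorded errors.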

  1. The loss of SMG1 causes defects in quality control pathways in Physcomitrella patens

    PubMed Central

    Lang, Daniel; Zimmer, Andreas D; Causier, Barry

    2018-01-01

    Abstract Nonsense-mediated mRNA decay (NMD) is important for RNA quality control and gene regulation in eukaryotes. NMD targets aberrant transcripts for decay and also directly influences the abundance of non-aberrant transcripts. In animals, the SMG1 kinase plays an essential role in NMD by phosphorylating the core NMD factor UPF1. Despite SMG1 being ubiquitous throughout the plant kingdom, little is known about its function, probably because SMG1 is atypically absent from the genome of the model plant, Arabidopsis thaliana. By combining our previously established SMG1 knockout in moss with transcriptome-wide analysis, we reveal the range of processes involving SMG1 in plants. Machine learning assisted analysis suggests that 32% of multi-isoform genes produce NMD-targeted transcripts and that splice junctions downstream of a stop codon act as the major determinant of NMD targeting. Furthermore, we suggest that SMG1 is involved in other quality control pathways, affecting DNA repair and the unfolded protein response, in addition to its role in mRNA quality control. Consistent with this, smg1 plants have increased susceptibility to DNA damage, but increased tolerance to unfolded protein inducing agents. The potential involvement of SMG1 in RNA, DNA and protein quality control has major implications for the study of these processes in plants. PMID:29596649

  2. Application of high-throughput mini-bioreactor system for systematic scale-down modeling, process characterization, and control strategy development.

    PubMed

    Janakiraman, Vijay; Kwiatkowski, Chris; Kshirsagar, Rashmi; Ryll, Thomas; Huang, Yao-Ming

    2015-01-01

    High-throughput systems and processes have typically been targeted for process development and optimization in the bioprocessing industry. For process characterization, bench scale bioreactors have been the system of choice. Due to the need for performing different process conditions for multiple process parameters, the process characterization studies typically span several months and are considered time and resource intensive. In this study, we have shown the application of a high-throughput mini-bioreactor system viz. the Advanced Microscale Bioreactor (ambr15(TM) ), to perform process characterization in less than a month and develop an input control strategy. As a pre-requisite to process characterization, a scale-down model was first developed in the ambr system (15 mL) using statistical multivariate analysis techniques that showed comparability with both manufacturing scale (15,000 L) and bench scale (5 L). Volumetric sparge rates were matched between ambr and manufacturing scale, and the ambr process matched the pCO2 profiles as well as several other process and product quality parameters. The scale-down model was used to perform the process characterization DoE study and product quality results were generated. Upon comparison with DoE data from the bench scale bioreactors, similar effects of process parameters on process yield and product quality were identified between the two systems. We used the ambr data for setting action limits for the critical controlled parameters (CCPs), which were comparable to those from bench scale bioreactor data. In other words, the current work shows that the ambr15(TM) system is capable of replacing the bench scale bioreactor system for routine process development and process characterization. © 2015 American Institute of Chemical Engineers.
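
    Matching volumetric sparge rates across scales, as described above, means holding gas flow per liquid volume (vvm) constant. A minimal sketch with assumed, not reported, flow figures:

```python
def scaled_gas_flow(flow_large_lpm, volume_large_l, volume_small_l):
    """Scale gas flow to hold vvm (gas volume per liquid volume per minute) constant."""
    vvm = flow_large_lpm / volume_large_l
    return vvm * volume_small_l

# Hypothetical numbers: 750 L/min of air into a 15,000 L bioreactor (0.05 vvm)
# translated to a 15 mL ambr vessel at the same vvm.
flow_small = scaled_gas_flow(750, 15000, 0.015)
print(f"{flow_small * 1000:.2f} mL/min")
```

    The same constant-vvm logic applies in reverse when scaling a process up from the mini-bioreactor to manufacturing scale.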

  3. The relationship between the content and the form of metaphorical statements.

    PubMed

    Xu, Xu

    2010-04-01

    Recent research suggests that the quality of a metaphorical topic-vehicle pairing should be the determinant to the choice of a proper grammatical form, nominal metaphor versus simile. Two studies examined the relationship between the quality of the content of a metaphorical statement and its grammatical form. Study 1 showed that the two grammatical forms did not differ in aptness when the quality of topic-vehicle pairs and the conventionality of vehicles, a factor associated with the quality of metaphorical expressions, were controlled. With an online comprehension measure, Study 2 found that high quality metaphorical pairings were easier to process than low quality metaphorical pairings in both the metaphor form and the simile form. For high quality metaphorical pairings, information related to both the topics and the vehicles was highly activated at an early stage of processing. The relations among factors involved in the interpretive process of metaphorical language are discussed.

  4. Decision Maker Perception of Information Quality: A Case Study of Military Command and Control

    ERIC Educational Resources Information Center

    Morgan, Grayson B.

    2013-01-01

    Decision maker perception of information quality cues from an "information system" (IS) and the process which creates such meta cueing, or data about cues, is a critical yet un-modeled component of "situation awareness" (SA). Examples of common information quality meta cueing for quality criteria include custom ring-tones for…

  5. Manufacturing Execution Systems: Examples of Performance Indicator and Operational Robustness Tools.

    PubMed

    Gendre, Yannick; Waridel, Gérard; Guyon, Myrtille; Demuth, Jean-François; Guelpa, Hervé; Humbert, Thierry

    Manufacturing Execution Systems (MES) are computerized systems used to measure production performance in terms of productivity, yield, and quality. The first part describes performance indicators, overall equipment effectiveness (OEE), process robustness tools, and statistical process control. The second part details some tools that help operators maintain process robustness and control by preventing deviations from target control charts. The MES was developed by Syngenta together with CIMO for automation.
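
    Overall equipment effectiveness, one of the indicators named above, is conventionally computed as the product of availability, performance and quality rates. A minimal sketch with hypothetical shift figures:

```python
def oee(availability, performance, quality):
    """Overall equipment effectiveness as the product of its three factors."""
    return availability * performance * quality

# Hypothetical shift: 90% uptime, 95% of ideal production rate,
# 98% of output right first time.
print(f"OEE = {oee(0.90, 0.95, 0.98):.3f}")
```

    Each factor is itself a ratio (e.g. availability = run time / planned production time), so an MES can derive all three from logged equipment events.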

  6. 14 CFR 21.143 - Quality control data requirements; prime manufacturer.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ..., purchased items, and parts and assemblies produced by manufacturers' suppliers including methods used to... special manufacturing processes involved, the means used to control the processes, the final test... procedure for recording review board decisions and disposing of rejected parts; (5) An outline of a system...

  7. Evaluating Statistical Process Control (SPC) techniques and computing the uncertainty of force calibrations

    NASA Technical Reports Server (NTRS)

    Navard, Sharon E.

    1989-01-01

    In recent years there has been a push within NASA to use statistical techniques to improve the quality of production. Two areas where statistics are used are in establishing product and process quality control of flight hardware and in evaluating the uncertainty of calibration of instruments. The Flight Systems Quality Engineering branch is responsible for developing and assuring the quality of all flight hardware; the statistical process control methods employed are reviewed and evaluated. The Measurement Standards and Calibration Laboratory performs the calibration of all instruments used on-site at JSC as well as those used by all off-site contractors. These calibrations must be performed in such a way as to be traceable to national standards maintained by the National Institute of Standards and Technology, and they must meet a four-to-one ratio of the instrument specifications to calibrating standard uncertainty. In some instances this ratio is not met, and in these cases it is desirable to compute the exact uncertainty of the calibration and determine ways of reducing it. A particular example where this problem is encountered is with a machine which does automatic calibrations of force. The process of force calibration using the United Force Machine is described in detail. The sources of error are identified and quantified when possible. Suggestions for improvement are made.
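
    The two themes of this record can be sketched together: three-sigma X-bar control limits for statistical process control, and the four-to-one ratio of instrument specification to calibration-standard uncertainty. The numbers below are illustrative, not from the report:

```python
import statistics

def xbar_limits(subgroup_means, sigma, n):
    """Three-sigma control limits for an X-bar chart with subgroups of size n."""
    center = statistics.fmean(subgroup_means)
    margin = 3 * sigma / n ** 0.5
    return center - margin, center, center + margin

def meets_tur(spec_tolerance, standard_uncertainty, ratio=4.0):
    """True when the test uncertainty ratio meets the required 4:1 minimum."""
    return spec_tolerance / standard_uncertainty >= ratio

# Hypothetical force readings (subgroups of 4) and calibration figures.
lcl, center, ucl = xbar_limits([10.1, 9.9, 10.0, 10.2, 9.8], sigma=0.3, n=4)
print(f"LCL={lcl:.2f}  CL={center:.2f}  UCL={ucl:.2f}")
print(meets_tur(spec_tolerance=0.4, standard_uncertainty=0.1))
```

    When the 4:1 ratio cannot be met, as in the force machine example above, the exact calibration uncertainty must be computed and reduced instead.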

  8. Groundwater hydrogeochemical characteristics in rehabilitated coalmine spoils

    NASA Astrophysics Data System (ADS)

    Gomo, M.; Masemola, E.

    2016-04-01

    The investigation aims to identify and describe hydrogeochemical processes controlling the evolution of groundwater chemistry in rehabilitated coalmine spoils and their overall influence on groundwater quality at a study area located in the Karoo basin of South Africa. A good understanding of the processes that control the evolution of the mine water quality is vital for the planning, application and management of post-mining remedial actions. The study utilises scatter plots, statistical analysis, PHREEQC hydrogeochemical modelling, stoichiometric reaction ratio analysis, and the expanded Durov diagram as complementary tools to interpret the groundwater chemistry data collected from monitoring boreholes from 1995 to 2014. Measured pH, ranging between 6 and 8 with an arithmetic mean of 7.32, shows that the groundwater system was characterised by circumneutral hydrogeochemical conditions throughout the monitoring period. Comparison of measured groundwater ion concentrations to theoretical reaction stoichiometry identifies dolomite-Acid Mine Drainage (AMD) neutralisation as the main hydrogeochemical process controlling the evolution of the groundwater chemistry. Hydrogeochemical modelling shows that the groundwater has temporal variations of calcite and dolomite saturation indices, characterised by alternating cycles of over-saturation and under-saturation driven by the release of sulphate, calcium and magnesium ions from the carbonate-AMD neutralisation process. Arithmetic mean concentrations of sulphate, calcium and magnesium are in the order of 762 mg/L, 141 mg/L and 108 mg/L. Calcium and magnesium ions contribute to very hard groundwater quality conditions. Classification based on total dissolved solids (TDS) shows the circumneutral water is of poor to unacceptable quality for drinking purposes. Despite its ability to prevent AMD formation and leaching of metals, the dolomite-AMD neutralisation process can still lead to problems of elevated TDS and hardness, which mines should be aware of when developing water quality management plans.
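
    The saturation index used in the PHREEQC modelling above is SI = log₁₀(IAP/Ksp). A simplified sketch for calcite, approximating activities by assumed concentrations (PHREEQC itself applies full ionic-strength corrections):

```python
import math

def saturation_index(iap, ksp):
    """SI = log10(IAP/Ksp): >0 over-saturated, <0 under-saturated, ~0 at equilibrium."""
    return math.log10(iap / ksp)

# Hypothetical calcite example: ion activities approximated by assumed
# concentrations, not measured values from the study.
KSP_CALCITE = 10 ** -8.48          # CaCO3 solubility product at 25 degrees C
a_ca, a_co3 = 3.5e-3, 1.2e-6       # assumed Ca2+ and CO3(2-) activities (mol/L)
si = saturation_index(a_ca * a_co3, KSP_CALCITE)
print(f"SI(calcite) = {si:.2f}")
```

    A slightly positive SI, as here, corresponds to the over-saturation phase of the alternating cycles described in the abstract.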

  9. Quality control in the recycling stream of PVC cable waste by hyperspectral imaging analysis

    NASA Astrophysics Data System (ADS)

    Luciani, Valentina; Serranti, Silvia; Bonifazi, Giuseppe; Rem, Peter

    2005-05-01

    In recent years, recycling has gained a key role in the manufacturing industry. The use of recycled materials in the production of new goods has the double advantage of saving energy and natural resources; moreover, from an economic point of view, recycled materials are in general cheaper than virgin ones. Despite these environmental and economic strengths, the use of recycled sources is still low compared to raw material consumption: in Europe only 10% of the market is covered by recycled products. One reason for this reticence in the use of secondary sources is the lack of an accurate quality certification system. The inputs of a recycling process are not always the same, which means that the output of a particular process can vary depending on the initial composition of the treated material. Usually, if a continuous quality control system is not present at the end of the process, the quality of the output material is assessed on the minimum certified characteristics. Solving this issue is crucial to expand the possible applications of recycled materials and to assign a price based on the real characteristics of the material. This paper explores the possibility of applying a quality control system based on hyperspectral imaging (HSI) technology working in the near infrared (NIR) range to the output of a separation process for PVC cable wastes. The analysed material was a residue fraction of a traditional separation process, further treated by magnetic density separation. Results show that PVC, PE, rubber and copper particles can be identified and classified adopting the NIR-HSI approach.
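
    Classifying pixel spectra against reference materials, as in the NIR-HSI approach above, can be illustrated with a spectral angle mapper, one common classifier for hyperspectral data (the record does not specify the classifier used; the spectra below are toy values, not real NIR signatures):

```python
import math

def spectral_angle(a, b):
    """Angle (radians) between two spectra; smaller means more similar."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return math.acos(max(-1.0, min(1.0, dot / norm)))

def classify(pixel, references):
    """Assign the pixel spectrum to the reference material with the smallest angle."""
    return min(references, key=lambda name: spectral_angle(pixel, references[name]))

# Toy 4-band reference spectra for two of the materials in the waste stream.
refs = {"PVC": [0.2, 0.8, 0.5, 0.1], "PE": [0.7, 0.3, 0.2, 0.6]}
print(classify([0.25, 0.75, 0.45, 0.12], refs))
```

    Because the angle ignores overall brightness, this metric is fairly robust to illumination differences across the imaged particles.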

  10. Effects of rigor status during high-pressure processing on the physical qualities of farm-raised abalone (Haliotis rufescens).

    PubMed

    Hughes, Brianna H; Greenberg, Neil J; Yang, Tom C; Skonberg, Denise I

    2015-01-01

    High-pressure processing (HPP) is used to increase meat safety and shelf-life, with conflicting quality effects depending on rigor status during HPP. In the seafood industry, HPP is used to shuck and pasteurize oysters, but its use on abalones has only been minimally evaluated and the effect of rigor status during HPP on abalone quality has not been reported. Farm-raised abalones (Haliotis rufescens) were divided into 12 HPP treatments and 1 unprocessed control treatment. Treatments were processed pre-rigor or post-rigor at 2 pressures (100 and 300 MPa) and 3 processing times (1, 3, and 5 min). The control was analyzed post-rigor. Uniform plugs were cut from adductor and foot meat for texture profile analysis, shear force, and color analysis. Subsamples were used for scanning electron microscopy of muscle ultrastructure. Texture profile analysis revealed that post-rigor processed abalone was significantly (P < 0.05) less firm and chewy than pre-rigor processed irrespective of muscle type, processing time, or pressure. L values increased with pressure to 68.9 at 300 MPa for pre-rigor processed foot, 73.8 for post-rigor processed foot, 90.9 for pre-rigor processed adductor, and 89.0 for post-rigor processed adductor. Scanning electron microscopy images showed fraying of collagen fibers in processed adductor, but did not show pressure-induced compaction of the foot myofibrils. Post-rigor processed abalone meat was more tender than pre-rigor processed meat, and post-rigor processed foot meat was lighter in color than pre-rigor processed foot meat, suggesting that waiting for rigor to resolve prior to processing abalones may improve consumer perceptions of quality and market value. © 2014 Institute of Food Technologists®

  11. A Process Analytical Technology (PAT) approach to control a new API manufacturing process: development, validation and implementation.

    PubMed

    Schaefer, Cédric; Clicq, David; Lecomte, Clémence; Merschaert, Alain; Norrant, Edith; Fotiadu, Frédéric

    2014-03-01

    Pharmaceutical companies are progressively adopting the Process Analytical Technology (PAT) and Quality-by-Design (QbD) concepts promoted by the regulatory agencies, aiming to build quality directly into the product by combining thorough scientific understanding with quality risk management. An analytical method based on near infrared (NIR) spectroscopy was developed as a PAT tool to control on-line an API (active pharmaceutical ingredient) manufacturing crystallization step, during which the API and residual solvent contents must be precisely determined to reach the predefined seeding point. An original methodology based on QbD principles was designed to conduct the development and validation of the NIR method and to ensure that it is fit for its intended use. On this basis, partial least squares (PLS) models were developed and optimized using chemometric methods. The method was fully validated according to the ICH Q2(R1) guideline, using the accuracy profile approach. The dosing ranges were established as 9.0-12.0% w/w for the API and 0.18-1.50% w/w for the residual methanol. Because the variability of the sampling method and of the reference method is by nature included in the variability obtained for the NIR method during the validation phase, a real-time process monitoring exercise was performed to prove its fitness for purpose. Implementing this in-process control (IPC) method on the industrial plant from the launch of the new API synthesis process will enable automatic control of the final crystallization step, ensuring a predefined quality level of the API. Several additional benefits are expected, including reduction of the process time and elimination of a rather difficult sampling step and tedious off-line analyses. © 2013 Published by Elsevier B.V.

  12. Quality control in the recycling stream of PVC from window frames by hyperspectral imaging

    NASA Astrophysics Data System (ADS)

    Luciani, Valentina; Serranti, Silvia; Bonifazi, Giuseppe; Di Maio, Francesco; Rem, Peter

    2013-05-01

    Polyvinyl chloride (PVC) is one of the most widely used thermoplastics with respect to worldwide polymer consumption. PVC is mainly used in the building and construction sector; products such as pipes, window frames, cable insulation, floors, coverings and roofing sheets are made from this material. In recent years, the problem of PVC waste disposal has gained increasing importance in public discussion. The quantity of used PVC items entering the waste stream has gradually increased as progressively greater numbers of PVC products approach the end of their useful economic lives. The quality of recycled PVC depends on the characteristics of the recycling process and the quality of the input waste. Not all PVC-containing waste streams have the same economic value, so a transparent relation between value and composition is required to decide whether the recycling process is cost effective for a particular waste stream. An objective and reliable quality control technique is needed in the recycling industry for monitoring both recycled flow streams and final products in the plant. In this work, a hyperspectral imaging technique in the near infrared (NIR) range (1000-1700 nm) was applied to identify unwanted plastic contaminants and rubber in PVC from window frame waste, in order to establish a quality control procedure for its recycling process. Results showed that PVC, PE and rubber can be identified using the NIR-HSI approach.

  13. Cost Models for MMC Manufacturing Processes

    NASA Technical Reports Server (NTRS)

    Elzey, Dana M.; Wadley, Haydn N. G.

    1996-01-01

    The quality cost modeling (QCM) tool is intended to be a relatively simple-to-use device for obtaining a first-order assessment of the quality-cost relationship for a given process-material combination. The QCM curve is a plot of cost versus quality (an index indicating microstructural quality), which is unique for a given process-material combination. The QCM curve indicates the tradeoff between cost and performance, thus enabling one to evaluate affordability. Additionally, the effect of changes in process design, raw materials, and process conditions on the cost-quality relationship can be evaluated. Such results might indicate the most efficient means to obtain improved quality at reduced cost by process design refinements, the implementation of sensors and models for closed loop process control, or improvement in the properties of raw materials being fed into the process. QCM also allows alternative processes for producing the same or similar material to be compared in terms of their potential for producing competitively priced, high quality material. Aside from demonstrating the usefulness of the QCM concept, this is one of the main foci of the present research program, namely to compare processes for making continuous fiber reinforced, metal matrix composites (MMC's). Two processes, low pressure plasma spray deposition and tape casting are considered for QCM development. This document consists of a detailed look at the design of the QCM approach, followed by discussion of the application of QCM to each of the selected MMC manufacturing processes along with results, comparison of processes, and finally, a summary of findings and recommendations.

  14. FDA 2011 process validation guidance: lifecycle compliance model.

    PubMed

    Campbell, Cliff

    2014-01-01

    This article has been written as a contribution to the industry's efforts in migrating from a document-driven to a data-driven compliance mindset. A combination of target product profile, control engineering, and general sum principle techniques is presented as the basis of a simple but scalable lifecycle compliance model in support of modernized process validation. Unit operations and significant variables occupy pole position within the model, documentation requirements being treated as a derivative or consequence of the modeling process. The quality system is repositioned as a subordinate of system quality, this being defined as the integral of related "system qualities". The article represents a structured interpretation of the U.S. Food and Drug Administration's 2011 Guidance for Industry on Process Validation and is based on the author's educational background and his manufacturing/consulting experience in the validation field. The U.S. Food and Drug Administration's Guidance for Industry on Process Validation (2011) provides a wide-ranging and rigorous outline of compliant drug manufacturing requirements relative to its 20th-century predecessor (1987). Its declared focus is patient safety, and it identifies three inter-related (and obvious) stages of the compliance lifecycle. Firstly, processes must be designed, both from a technical and quality perspective. Secondly, processes must be qualified, providing evidence that the manufacturing facility is fully "roadworthy" and fit for its intended purpose. Thirdly, processes must be verified, meaning that commercial batches must be monitored to ensure that processes remain in a state of control throughout their lifetime.

  15. Using Unified Modelling Language (UML) as a process-modelling technique for clinical-research process improvement.

    PubMed

    Kumarapeli, P; De Lusignan, S; Ellis, T; Jones, B

    2007-03-01

    The Primary Care Data Quality programme (PCDQ) is a quality-improvement programme which processes routinely collected general practice computer data. Patient data collected from a wide range of different brands of clinical computer systems are aggregated, processed, and fed back to practices in an educational context to improve the quality of care. Process modelling is a well-established approach used to gain understanding, enable systematic appraisal, and identify areas of improvement in a business process. Unified Modelling Language (UML) is a general-purpose modelling technique used for this purpose. We used UML to appraise the PCDQ process to see if the efficiency and predictability of the process could be improved. Activity analysis and thinking-aloud sessions were used to collect data to generate UML diagrams. The UML model highlighted the sequential nature of the current process as a barrier to efficiency gains. It also identified the uneven distribution of process controls, lack of symmetric communication channels, critical dependencies among processing stages, and failure to implement all the lessons learned in the piloting phase. It suggested that improved structured reporting at each stage - especially from the pilot phase - parallel processing of data, and correctly positioned process controls should improve the efficiency and predictability of research projects. Process modelling provided a rational basis for the critical appraisal of a clinical data processing system; its potential may be underutilized within health care.

  16. Use of artificial intelligence in the production of high quality minced meat

    NASA Astrophysics Data System (ADS)

    Kapovsky, B. R.; Pchelkina, V. A.; Plyasheshnik, P. I.; Dydykin, A. S.; Lazarev, A. A.

    2017-09-01

    A design for an automatic line for minced meat production, based on a new production technology using an innovative meat milling method, is proposed. This method achieves the necessary degree of comminution at the raw material preparation stage, intensifying production by making traditional meat comminution equipment unnecessary. To ensure consistent product quality, on-line automatic control of the technological process for minced meat production is envisaged. This control system has been developed using artificial intelligence methods and technologies: it is trainable during operation, adapts to changes in the characteristics of the processed raw material and to external impacts affecting system operation, and produces meat shavings with minimal dispersion of the typical particle size. The control system includes equipment for express analysis of the chemical composition of the minced meat and of its temperature after comminution. In this case, the minced meat production process can be controlled strictly as a function of time, which excludes subjective assessment of the degree of finished product readiness. This will allow finished meat products of consistent, targeted high quality to be produced.

  17. Dropwise additive manufacturing of pharmaceutical products for melt-based dosage forms.

    PubMed

    Içten, Elçin; Giridhar, Arun; Taylor, Lynne S; Nagy, Zoltan K; Reklaitis, Gintaras V

    2015-05-01

    The US Food and Drug Administration introduced the quality by design approach and process analytical technology guidance to encourage innovation and efficiency in pharmaceutical development, manufacturing, and quality assurance. As part of this renewed emphasis on the improvement of manufacturing, the pharmaceutical industry has begun to develop more efficient production processes with more intensive use of online measurement and sensing, real-time quality control, and process control tools. Here, we present dropwise additive manufacturing of pharmaceutical products (DAMPP) as an alternative to conventional pharmaceutical manufacturing methods. This mini-manufacturing process for the production of pharmaceuticals utilizes drop on demand printing technology for automated and controlled deposition of melt-based formulations onto edible substrates. The advantages of drop-on-demand technology, including reproducible production of small droplets, adjustable drop sizing, high placement accuracy, and flexible use of different formulations, enable production of individualized dosing even for low-dose and high-potency drugs. In this work, DAMPP is used to produce solid oral dosage forms from hot melts of an active pharmaceutical ingredient and a polymer. The dosage forms are analyzed to show the reproducibility of dosing and the dissolution behavior of different formulations. © 2015 Wiley Periodicals, Inc. and the American Pharmacists Association.

  18. Emerging structural insights into glycoprotein quality control coupled with N-glycan processing in the endoplasmic reticulum.

    PubMed

    Satoh, Tadashi; Yamaguchi, Takumi; Kato, Koichi

    2015-01-30

    In the endoplasmic reticulum (ER), the sugar chain is initially introduced onto newly synthesized proteins as a triantennary tetradecasaccharide (Glc3Man9GlcNAc2). The attached oligosaccharide chain is subjected to stepwise trimming by the actions of specific glucosidases and mannosidases. In these processes, the transiently expressed N-glycans, as processing intermediates, function as signals for the determination of glycoprotein fates, i.e., folding, transport, or degradation through interactions of a series of intracellular lectins. The monoglucosylated glycoforms are hallmarks of incompletely folded states of glycoproteins in this system, whereas the outer mannose trimming leads to ER-associated glycoprotein degradation. This review outlines the recently emerging evidence regarding the molecular and structural basis of this glycoprotein quality control system, which is regulated through dynamic interplay among intracellular lectins, glycosidases, and glycosyltransferase. Structural snapshots of carbohydrate-lectin interactions have been provided at the atomic level using X-ray crystallographic analyses. Conformational ensembles of uncomplexed triantennary high-mannose-type oligosaccharides have been characterized in a quantitative manner using molecular dynamics simulation in conjunction with nuclear magnetic resonance spectroscopy. These complementary views provide new insights into glycoprotein recognition in quality control coupled with N-glycan processing.

  19. Microstructural Influence on Mechanical Properties in Plasma Microwelding of Ti6Al4V Alloy

    NASA Astrophysics Data System (ADS)

    Baruah, M.; Bag, S.

    2016-11-01

    The complexity of joining Ti6Al4V alloy increases with reduction in sheet thickness. The present work puts emphasis on microplasma arc welding (MPAW) of 500-μm-thick Ti6Al4V alloy in butt joint configuration. Using controlled and regulated arc current, the MPAW process is specifically designed for joining thin sheet components over a wide range of process parameters. The weld quality is assessed by carefully controlling the process parameters and by reducing the formation of oxides. The combined effect of welding speed and current on the weld joint properties is evaluated for joining of Ti6Al4V alloy. Macro- and microstructural characterization of the weldment by optical microscopy, as well as analysis of mechanical properties by microtensile and microhardness tests, has been performed. The weld joint quality is affected by a specifically designed fixture that controls oxidation of the joint and introduces a high cooling rate; hence, the solidified microstructure of the welded specimen influences the mechanical properties of the joint. The butt joint of titanium alloy produced by MPAW at optimal process parameters is of very high quality, without any internal defects and with minimal residual distortion.

  20. A quality improvement project to improve the Medicare and Medicaid Services (CMS) sepsis bundle compliance rate in a large healthcare system.

    PubMed

    Raschke, Robert A; Groves, Robert H; Khurana, Hargobind S; Nikhanj, Nidhi; Utter, Ethel; Hartling, Didi; Stoffer, Brenda; Nunn, Kristina; Tryon, Shona; Bruner, Michelle; Calleja, Maria; Curry, Steven C

    2017-01-01

    Sepsis is a leading cause of mortality and morbidity in hospitalised patients. The Centers for Medicare and Medicaid Services (CMS) mandated that US hospitals report sepsis bundle compliance rate as a quality process measure in October 2015. The specific aim of our study was to improve the CMS sepsis bundle compliance rate from 30% to 40% across 20 acute care hospitals in our healthcare system within 1 year. The study included all adult inpatients with sepsis sampled according to CMS specifications from October 2015 to September 2016. The CMS sepsis bundle compliance rate was tracked monthly using statistical process control charting. A baseline rate of 28.5% with 99% control limits was established. We implemented multiple interventions including computerised decision support systems (CDSSs) to increase compliance with the most commonly missing bundle elements. Compliance reached 42% (99% statistical process control limits 18.4%-38.6%) as CDSS was implemented system-wide, but this improvement was not sustained after CMS changed specifications of the outcome measure. Difficulties encountered elucidate shortcomings of our study methodology and of the CMS sepsis bundle compliance rate as a quality process measure.
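The baseline rate and 99% control limits quoted above are consistent with a standard p-chart for a monitored proportion. A minimal sketch, with the monthly subgroup size n a hypothetical value chosen for illustration (the CMS sampling specification sets the real one):

```python
import numpy as np

def p_chart_limits(p_bar, n, z=2.576):
    """99% control limits for a proportion (normal approximation), subgroup size n."""
    half = z * np.sqrt(p_bar * (1 - p_bar) / n)
    return max(0.0, p_bar - half), min(1.0, p_bar + half)

# Baseline compliance rate from the study; n = 130 is a hypothetical monthly
# sample size that reproduces limits close to the 18.4%-38.6% reported.
p_bar, n = 0.285, 130
lcl, ucl = p_chart_limits(p_bar, n)
print(f"LCL={lcl:.3f}  UCL={ucl:.3f}")

# A month at 42% compliance exceeds the upper limit: a special-cause signal.
print(0.42 > ucl)
```

Plotting each month's rate against fixed limits like these is what lets the authors distinguish real improvement (the 42% point) from common-cause variation around the 28.5% baseline.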

  1. Use of near-infrared spectroscopy and multipoint measurements for quality control of pharmaceutical drug products.

    PubMed

    Boiret, Mathieu; Chauchard, Fabien

    2017-01-01

    Near-infrared (NIR) spectroscopy is a non-destructive analytical technique that enables better understanding and optimization of pharmaceutical processes and final drug products. Its in-line use is often limited by acquisition speed and sampling area. This work focuses on performing multipoint measurements at high acquisition speed at the end of the manufacturing process, on a conveyor belt system, to control both the distribution and the content of active pharmaceutical ingredient within final drug products, i.e., tablets. A specially designed probe with several collection fibers was developed for this study. By measuring spectral and spatial information, it provides physical and chemical knowledge of the final drug product. The NIR probe was installed on a conveyor belt system that enables the analysis of large numbers of tablets. This NIR multipoint measurement probe on a conveyor belt system provides an innovative method with the potential to serve as a new paradigm for ensuring drug product quality at the end of the manufacturing process, and as a new analytical method for a real-time release control strategy.

  2. ChiLin: a comprehensive ChIP-seq and DNase-seq quality control and analysis pipeline.

    PubMed

    Qin, Qian; Mei, Shenglin; Wu, Qiu; Sun, Hanfei; Li, Lewyn; Taing, Len; Chen, Sujun; Li, Fugen; Liu, Tao; Zang, Chongzhi; Xu, Han; Chen, Yiwen; Meyer, Clifford A; Zhang, Yong; Brown, Myles; Long, Henry W; Liu, X Shirley

    2016-10-03

    Transcription factor binding, histone modification, and chromatin accessibility studies are important approaches to understanding the biology of gene regulation. ChIP-seq and DNase-seq have become the standard techniques for studying protein-DNA interactions and chromatin accessibility, respectively, and comprehensive quality control (QC) and analysis tools are critical to extracting the most value from these assay types. Although many analysis and QC tools have been reported, few combine ChIP-seq and DNase-seq data analysis and quality control in a unified framework with a comprehensive and unbiased reference of data quality metrics. ChiLin is a computational pipeline that automates the quality control and data analysis of ChIP-seq and DNase-seq data. It is developed using a flexible and modular software framework that can be easily extended and modified. ChiLin is ideal for batch processing of many datasets and is well suited for large collaborative projects involving ChIP-seq and DNase-seq from different designs. ChiLin generates comprehensive quality control reports that include comparisons with historical data derived from over 23,677 public ChIP-seq and DNase-seq samples (11,265 datasets) from eight literature-based classified categories. To the best of our knowledge, this atlas represents the most comprehensive ChIP-seq and DNase-seq quality metric resource currently available. These historical metrics provide useful heuristic quality references for experiments across all commonly used assay types. Using representative datasets, we demonstrate the versatility of the pipeline by applying it to different assay types of ChIP-seq data. The pipeline software is available open source at https://github.com/cfce/chilin . ChiLin is a scalable and powerful tool to process large batches of ChIP-seq and DNase-seq datasets. The analysis output and quality metrics have been structured into user-friendly directories and reports, and the 23,677 profiles have been compiled into a comprehensive quality atlas with fine classification for users.

  3. Signal quality and Bayesian signal processing in neurofeedback based on real-time fMRI.

    PubMed

    Koush, Yury; Zvyagintsev, Mikhail; Dyck, Miriam; Mathiak, Krystyna A; Mathiak, Klaus

    2012-01-02

    Real-time fMRI allows analysis and visualization of brain activity online, i.e. within one repetition time. It can be used in neurofeedback applications where subjects attempt to control the activation level in a specified region of interest (ROI) of their brain. The signal derived from the ROI is contaminated with noise and artifacts, namely physiological noise from breathing and heartbeat, scanner drift, motion-related artifacts, and measurement noise. We developed a Bayesian approach to reduce noise and remove artifacts in real time using a modified Kalman filter. The system performs several signal processing operations: subtraction of constant and low-frequency signal components, spike removal, and signal smoothing. Quantitative feedback signal quality analysis was used to estimate the quality of the neurofeedback time series and the performance of the applied signal processing on different ROIs. The signal-to-noise ratio (SNR) across the entire time series and the group event-related SNR (eSNR) were significantly higher for the processed time series than for the raw data. The applied signal processing improved the t-statistic, increasing the significance of blood oxygen level-dependent (BOLD) signal changes. Accordingly, the contrast-to-noise ratio (CNR) of the feedback time series was improved as well. In addition, the data revealed an increase of localized self-control across feedback sessions. The new signal processing approach provided reliable neurofeedback, performed precise artifact removal, reduced noise, and required minimal manual adjustment of parameters. Advanced and fast online signal processing algorithms considerably increased the quality as well as the information content of the control signal, which in turn resulted in higher contingency in the neurofeedback loop. Copyright © 2011 Elsevier Inc. All rights reserved.
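The paper's modified Kalman filter is not specified in the abstract; the sketch below shows the general idea on synthetic data: a one-dimensional random-walk Kalman filter that smooths a noisy time series online and rejects spikes whose innovation exceeds a threshold, analogous to the spike-removal and smoothing steps described above.

```python
import numpy as np

def kalman_smooth(y, q=0.01, r=1.0, spike_z=4.0):
    """One-pass Kalman filter on a 1-D time series (random-walk state model).

    q: process variance, r: measurement variance; innovations larger than
    spike_z predicted standard deviations are treated as spikes and skipped.
    """
    x, p = y[0], 1.0
    out = np.empty_like(y, dtype=float)
    for t, z in enumerate(y):
        p = p + q                        # predict
        s = p + r                        # innovation variance
        if abs(z - x) > spike_z * np.sqrt(s):
            out[t] = x                   # spike: keep prediction, skip update
            continue
        k = p / s                        # Kalman gain
        x = x + k * (z - x)              # update
        p = (1 - k) * p
        out[t] = x
    return out

rng = np.random.default_rng(2)
t = np.arange(200)
signal = np.sin(2 * np.pi * t / 50)      # synthetic BOLD-like oscillation
noisy = signal + rng.normal(0, 0.3, t.size)
noisy[60] += 5.0                         # a measurement spike
filt = kalman_smooth(noisy, q=0.02, r=0.09)
print(np.abs(filt - signal).mean() < np.abs(noisy - signal).mean())
```

The real system additionally subtracts constant and low-frequency drift components, which a random-walk state model alone does not handle.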

  4. Defining process design space for a hydrophobic interaction chromatography (HIC) purification step: application of quality by design (QbD) principles.

    PubMed

    Jiang, Canping; Flansburg, Lisa; Ghose, Sanchayita; Jorjorian, Paul; Shukla, Abhinav A

    2010-12-15

    The concept of design space has been taking root under the quality by design paradigm as a foundation of in-process control strategies for biopharmaceutical manufacturing processes. This paper outlines the development of a design space for a hydrophobic interaction chromatography (HIC) process step. The design space included the impact of raw material lot-to-lot variability and variations in the feed stream from cell culture. A failure modes and effects analysis was employed as the basis for the process characterization exercise. During mapping of the process design space, the multi-dimensional combination of operational variables was studied to quantify the impact on process performance in terms of yield and product quality. Variability in resin hydrophobicity was found to have a significant influence on step yield and high-molecular-weight aggregate clearance through the HIC step. A robust operating window was identified for this process step that enabled a higher step yield while ensuring acceptable product quality. © 2010 Wiley Periodicals, Inc.

  5. Assessing Educational Processes Using Total-Quality-Management Measurement Tools.

    ERIC Educational Resources Information Center

    Macchia, Peter, Jr.

    1993-01-01

    Discussion of the use of Total Quality Management (TQM) assessment tools in educational settings highlights and gives examples of fishbone diagrams, or cause and effect charts; Pareto diagrams; control charts; histograms and check sheets; scatter diagrams; and flowcharts. Variation and quality are discussed in terms of continuous process…
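As a small illustration of one of the TQM tools listed above, a Pareto analysis ranks problem causes by frequency and accumulates their share of the total, highlighting the "vital few" causes worth addressing first. The defect log below is invented for the example:

```python
from collections import Counter

# Hypothetical defect log for an educational administration process.
defects = (["missing form"] * 42 + ["wrong code"] * 25 + ["late entry"] * 18
           + ["typo"] * 9 + ["other"] * 6)

counts = Counter(defects).most_common()      # sorted, most frequent first
total = sum(c for _, c in counts)
cum = 0
for cause, c in counts:
    cum += c
    print(f"{cause:<14} {c:>3}  {100 * cum / total:5.1f}% cumulative")
```

Here the top two causes account for 67% of all defects, which is exactly the kind of prioritization a Pareto diagram makes visible.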

  6. The Individualized Quality Control Plan - Coming Soon to Clinical Microbiology Laboratories Everywhere!

    PubMed

    Anderson, Nancy

    2015-11-15

    As of January 1, 2016, microbiology laboratories can choose to adopt a new quality control option, the Individualized Quality Control Plan (IQCP), under the Clinical Laboratory Improvement Amendments of 1988 (CLIA). This voluntary approach increases flexibility for meeting regulatory requirements and gives laboratories the opportunity to customize QC for their testing, in their unique environments, and by their testing personnel. IQCP is an all-inclusive approach to quality based on risk management to address potential errors in the total testing process. It includes three main steps: (1) performing a risk assessment, (2) developing a QC plan, and (3) monitoring the plan through quality assessment. Resources are available from the Centers for Medicare & Medicaid Services, the Centers for Disease Control and Prevention, the American Society for Microbiology, the Clinical and Laboratory Standards Institute, and accrediting organizations, such as the College of American Pathologists and the Joint Commission, to assist microbiology laboratories in implementing IQCP.

  7. The pharmaceutical vial capping process: Container closure systems, capping equipment, regulatory framework, and seal quality tests.

    PubMed

    Mathaes, Roman; Mahler, Hanns-Christian; Buettiker, Jean-Pierre; Roehl, Holger; Lam, Philippe; Brown, Helen; Luemkemann, Joerg; Adler, Michael; Huwyler, Joerg; Streubel, Alexander; Mohl, Silke

    2016-02-01

    Parenteral drug products are protected by appropriate primary packaging against environmental factors, including potential microbial contamination, during their shelf life. The most commonly used container closure system (CCS) configuration for parenteral drug products is the glass vial, sealed with a rubber stopper and an aluminum crimp cap. In combination with an adequately designed and controlled aseptic fill/finish process, a well-designed and characterized capping process is indispensable to ensure product quality and integrity and to minimize rejections during manufacturing. In this review, the health authority requirements and expectations related to container closure system quality and container closure integrity are summarized. The pharmaceutical vial, the rubber stopper, and the crimp cap are described. Different capping techniques are critically compared: the most common capping equipment, with a rotating capping plate, produces the lowest amount of particles. The strengths and challenges of methods to control the capping process are discussed. The residual seal force method can characterize the capping process independently of the capping equipment or CCS used. We analyze the root causes of several cosmetic defects associated with the vial capping process. Copyright © 2015 Elsevier B.V. All rights reserved.

  8. Relationship between influence function accuracy and polishing quality in magnetorheological finishing

    NASA Astrophysics Data System (ADS)

    Schinhaerl, Markus; Schneider, Florian; Rascher, Rolf; Vogt, Christian; Sperber, Peter

    2010-10-01

    Magnetorheological finishing is a typical commercial application of a computer-controlled polishing process in the manufacturing of precision optical surfaces. Precise knowledge of the material removal characteristic of the polishing tool (influence function) is essential for controlling the material removal on the workpiece surface by the dwell time method. Results from the testing series with magnetorheological finishing have shown that a deviation of only 5% between the actual material removal characteristic of the polishing tool and that represented by the influence function caused a considerable reduction in the polishing quality. The paper discusses reasons for inaccuracies in the influence function and the effects on the polishing quality. The generic results of this research serve for the development of improved polishing strategies, and may be used in alternative applications of computer-controlled polishing processes that quantify the material removal characteristic by influence functions.
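In the dwell time method, predicted material removal is the convolution of the influence function with the dwell time map, so a systematic error in the influence function propagates directly into the achieved surface. The 1-D sketch below, with an assumed Gaussian influence function and an invented dwell time map, reproduces the ~5% effect discussed above:

```python
import numpy as np

# 1-D sketch: predicted removal = influence function (tool footprint)
# convolved with the dwell time map along the scan path.
x = np.arange(0, 10, 0.05)
influence = np.exp(-((x - 5) / 0.3) ** 2)        # assumed Gaussian spot
influence /= influence.sum()                     # normalize removal per unit time
dwell = 1.0 + 0.5 * np.sin(2 * np.pi * x / 10)   # hypothetical dwell time map

predicted = np.convolve(dwell, influence, mode="same")

# If the real tool removes 5% more material than the influence function used
# for planning, the achieved surface misses the prediction by the same fraction.
achieved = np.convolve(dwell, 1.05 * influence, mode="same")
rel_err = np.max(np.abs(achieved - predicted)) / predicted.max()
print(f"peak figure error = {100 * rel_err:.1f}% of peak removal")
```

Because convolution is linear, the removal error scales one-to-one with the influence function error, which is why even a 5% characterization deviation measurably degrades the final figure.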

  9. 10 CFR 26.167 - Quality assurance and quality control.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... be designed, implemented, and reviewed to monitor the conduct of each step of the testing process. (b... sample and the error is determined to be technical or methodological, the licensee or other entity shall...

  10. 10 CFR 26.167 - Quality assurance and quality control.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... be designed, implemented, and reviewed to monitor the conduct of each step of the testing process. (b... sample and the error is determined to be technical or methodological, the licensee or other entity shall...

  11. 10 CFR 26.167 - Quality assurance and quality control.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... be designed, implemented, and reviewed to monitor the conduct of each step of the testing process. (b... sample and the error is determined to be technical or methodological, the licensee or other entity shall...

  12. 10 CFR 26.167 - Quality assurance and quality control.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... be designed, implemented, and reviewed to monitor the conduct of each step of the testing process. (b... sample and the error is determined to be technical or methodological, the licensee or other entity shall...

  13. 10 CFR 26.167 - Quality assurance and quality control.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... be designed, implemented, and reviewed to monitor the conduct of each step of the testing process. (b... sample and the error is determined to be technical or methodological, the licensee or other entity shall...

  14. Employee empowerment through team building and use of process control methods.

    PubMed

    Willems, S

    1998-02-01

    The article examines the use of statistical process control and performance improvement techniques in employee empowerment. The focus is how these techniques provide employees with information to improve their productivity and become involved in the decision-making process. Findings suggest that at one Mississippi hospital employee improvement has had a positive effect on employee productivity, morale, and quality of work.

  15. Manufacturing Research: Self-Directed Control

    DTIC Science & Technology

    1991-01-01

    reduce this sensitivity. SDO is performing Taguchi's parameter design… Statistical Process Control (SPC) techniques will be used to monitor the process… Florida, R.E. Krieger Pub. Co., 1988. Dehnad, Khosrow, Quality Control, Robust Design, and the Taguchi Method, Pacific Grove, California, Wadsworth… control system. This turns out to be a non-trivial exercise. A human operator can see an event occur (such as the vessel pressurizing above its setpoint

  16. Statistical Process Control in the Practice of Program Evaluation.

    ERIC Educational Resources Information Center

    Posavac, Emil J.

    1995-01-01

    A technique developed to monitor the quality of manufactured products, statistical process control (SPC), incorporates several features that may prove attractive to evaluators. This paper reviews the history of SPC, suggests how the approach can enrich program evaluation, and illustrates its use in a hospital-based example. (SLD)
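    The control-chart idea the abstract describes can be made concrete in a few lines (our illustration, not from the paper): an individuals chart with 3-sigma limits estimated from the average moving range (d2 = 1.128 for ranges of size 2). The data are hypothetical program-evaluation scores.

```python
# Illustrative SPC sketch: a Shewhart individuals (I) chart. Points beyond
# the 3-sigma control limits are flagged as "out of control". Function names
# and data are hypothetical, not from the cited paper.
def shewhart_limits(samples):
    """Estimate center line and 3-sigma limits from moving ranges."""
    n = len(samples)
    center = sum(samples) / n
    # Average moving range; d2 = 1.128 for subgroups of size 2.
    mr_bar = sum(abs(samples[i] - samples[i - 1]) for i in range(1, n)) / (n - 1)
    sigma = mr_bar / 1.128
    return center - 3 * sigma, center, center + 3 * sigma

def out_of_control(samples):
    lcl, _, ucl = shewhart_limits(samples)
    return [i for i, x in enumerate(samples) if x < lcl or x > ucl]

scores = [82, 85, 84, 83, 86, 85, 84, 97, 83, 85]  # hypothetical program metric
print(out_of_control(scores))  # → [7]: the spike at index 7 is flagged
```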

  17. 76 FR 31856 - Approval and Promulgation of Air Quality Implementation Plans; Pennsylvania; Adoption of Control...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-02

    ... Flat Wood Paneling Surface Coating Processes AGENCY: Environmental Protection Agency (EPA). ACTION... by EPA's Control Techniques Guidelines (CTG) standards for flat wood paneling surface coating processes. EPA is approving this revision concerning the adoption of the EPA CTG requirements for flat wood...

  18. Effects of Meteorological Data Quality on Snowpack Modeling

    NASA Astrophysics Data System (ADS)

    Havens, S.; Marks, D. G.; Robertson, M.; Hedrick, A. R.; Johnson, M.

    2017-12-01

    Detailed quality control of meteorological inputs is the most time-intensive component of running the distributed, physically-based iSnobal snow model, and the effect of data quality of the inputs on the model is unknown. The iSnobal model has been run operationally since WY2013, and is currently run in several basins in Idaho and California. The largest amount of user input during modeling is for the quality control of precipitation, temperature, relative humidity, solar radiation, wind speed and wind direction inputs. Precipitation inputs require detailed user input and are crucial to correctly model the snowpack mass. This research applies a range of quality control methods to meteorological input, from raw input with minimal cleaning, to complete user-applied quality control. The meteorological input cleaning generally falls into two categories. The first is global minimum/maximum and missing value correction that could be corrected and/or interpolated with automated processing. The second category is quality control for inputs that are not globally erroneous, yet are still unreasonable and generally indicate malfunctioning measurement equipment, such as temperature or relative humidity that remains constant, or does not correlate with daily trends observed at nearby stations. This research will determine how sensitive model outputs are to different levels of quality control and guide future operational applications.
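    The two categories of cleaning described above can be sketched as follows; this is an assumed illustration, not the iSnobal operational code, and the thresholds and function names are hypothetical. `qc_global` handles the automatable min/max and missing-value repair; `is_flatline` flags a sensor stuck at a constant value.

```python
# Hypothetical sketch of the two QC categories: (1) global min/max and
# missing-value screening with neighbor-based gap filling, and (2) a flatline
# check for a malfunctioning sensor reporting a constant value.
def qc_global(values, vmin, vmax, missing=-9999):
    """Null out missing/out-of-range values, then fill gaps from neighbors."""
    clean = [v if v != missing and vmin <= v <= vmax else None for v in values]
    for i, v in enumerate(clean):
        if v is None:
            left = next((clean[j] for j in range(i - 1, -1, -1)
                         if clean[j] is not None), None)
            right = next((clean[j] for j in range(i + 1, len(clean))
                          if clean[j] is not None), None)
            clean[i] = left if right is None else right if left is None else (left + right) / 2
    return clean

def is_flatline(values, min_unique=2):
    """Flag a sensor that reports a (suspiciously) constant value."""
    return len(set(values)) < min_unique

temps = [1.2, -9999, 2.0, 150.0, 2.4]       # deg C: one gap, one 150-degree spike
print(qc_global(temps, vmin=-60, vmax=60))  # gap and spike replaced by neighbor means
print(is_flatline([3.0, 3.0, 3.0, 3.0]))    # → True: stuck sensor
```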

  19. Benefits of a comprehensive quality program for cryopreserved PBMC covering 28 clinical trials sites utilizing an integrated, analytical web-based portal.

    PubMed

    Ducar, Constance; Smith, Donna; Pinzon, Cris; Stirewalt, Michael; Cooper, Cristine; McElrath, M Juliana; Hural, John

    2014-07-01

    The HIV Vaccine Trials Network (HVTN) is a global network of 28 clinical trial sites dedicated to identifying an effective HIV vaccine. Cryopreservation of high-quality peripheral blood mononuclear cells (PBMC) is critical for the assessment of vaccine-induced cellular immune functions. The HVTN PBMC Quality Management Program is designed to ensure that viable PBMC are processed, stored and shipped for clinical trial assays from all HVTN clinical trial sites. The program has evolved by developing and incorporating best practices for laboratory and specimen quality and implementing automated, web-based tools. These tools allow the site-affiliated processing laboratories and the central Laboratory Operations Unit to rapidly collect, analyze and report PBMC quality data. The HVTN PBMC Quality Management Program includes five key components: 1) Laboratory Assessment, 2) PBMC Training and Certification, 3) Internal Quality Control, 4) External Quality Control (EQC), and 5) Assay Specimen Quality Control. Fresh PBMC processing data is uploaded from each clinical site processing laboratory to a central HVTN Statistical and Data Management Center database for access and analysis on a web portal. Samples are thawed at a central laboratory for assay or specimen quality control and sample quality data is uploaded directly to the database by the central laboratory. Four year cumulative data covering 23,477 blood draws reveals an average fresh PBMC yield of 1.45×10⁶±0.48 cells per milliliter of usable whole blood. 95% of samples were within the acceptable range for fresh cell yield of 0.8-3.2×10⁶ cells/ml of usable blood. Prior to full implementation of the HVTN PBMC Quality Management Program, the 2007 EQC evaluations from 10 international sites showed a mean day 2 thawed viability of 83.1% and a recovery of 67.5%.
Since then, four year cumulative data covering 3338 specimens used in immunologic assays shows that 99.88% had acceptable viabilities (>66%) for use in cellular assays (mean, 91.46% ±4.5%), and 96.2% had acceptable recoveries (50%-130%) with a mean recovery of 85.8% ±19.12% of the originally cryopreserved cells. EQC testing revealed that since August 2009, failed recoveries dropped from 4.1% to 1.6% and failed viabilities dropped from 1.0% to 0.3%. The HVTN PBMC quality program provides for laboratory assessment, training and tools for identifying problems, implementing corrective action and monitoring for improvements. These data support the benefits of implementing a comprehensive, web-based PBMC quality program for large clinical trials networks. Copyright © 2014 Elsevier B.V. All rights reserved.
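    The acceptance ranges quoted in this record (fresh yield 0.8-3.2×10⁶ cells per ml of usable blood, thawed viability >66%, recovery 50%-130%) translate directly into a simple check; the function name is ours, not part of the HVTN portal.

```python
# Encodes the acceptance criteria stated in the abstract. The thresholds come
# from the record; the function itself is an illustrative sketch.
def pbmc_sample_ok(yield_per_ml, viability_pct, recovery_pct):
    return (0.8e6 <= yield_per_ml <= 3.2e6      # fresh yield, cells/ml
            and viability_pct > 66.0            # thawed viability, %
            and 50.0 <= recovery_pct <= 130.0)  # recovery, %

print(pbmc_sample_ok(1.45e6, 91.5, 85.8))  # → True  (near the reported means)
print(pbmc_sample_ok(1.45e6, 60.0, 85.8))  # → False (viability too low)
```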

  20. Taking advantage of ground data systems attributes to achieve quality results in testing software

    NASA Technical Reports Server (NTRS)

    Sigman, Clayton B.; Koslosky, John T.; Hageman, Barbara H.

    1994-01-01

    During the software development life cycle process, basic testing starts with the development team. At the end of the development process, an acceptance test is performed for the user to ensure that the deliverable is acceptable. Ideally, the delivery is an operational product with zero defects. However, the goal of zero defects is normally not achieved, only approached to varying degrees. With the emphasis on building low cost ground support systems while maintaining a quality product, a key element in the test process is simulator capability. This paper reviews the Transportable Payload Operations Control Center (TPOCC) Advanced Spacecraft Simulator (TASS) test tool that is used in the acceptance test process for unmanned satellite operations control centers. The TASS is designed to support the development, test and operational environments of the Goddard Space Flight Center (GSFC) operations control centers. The TASS uses the same basic architecture as the operations control center. This architecture is characterized by its use of distributed processing, industry standards, commercial off-the-shelf (COTS) hardware and software components, and reusable software. The TASS uses much of the same TPOCC architecture and reusable software that the operations control center developer uses. The TASS also makes use of reusable simulator software in the mission specific versions of the TASS. Very little new software needs to be developed, mainly mission specific telemetry communication and command processing software. By taking advantage of the ground data system attributes, successful software reuse for operational systems provides the opportunity to extend the reuse concept into the test area. Consistency in test approach is a major step in achieving quality results.

  1. Implementation of Cyber-Physical Production Systems for Quality Prediction and Operation Control in Metal Casting.

    PubMed

    Lee, JuneHyuck; Noh, Sang Do; Kim, Hyun-Jung; Kang, Yong-Shin

    2018-05-04

    The prediction of internal defects of metal casting immediately after the casting process saves unnecessary time and money by reducing the amount of inputs into the next stage, such as the machining process, and enables flexible scheduling. Cyber-physical production systems (CPPS) perfectly fulfill the aforementioned requirements. This study deals with the implementation of CPPS in a real factory to predict metal casting quality and control operations. First, a CPPS architecture framework for quality prediction and operation control in metal-casting production was designed. The framework describes collaboration among the internet of things (IoT), artificial intelligence, simulations, manufacturing execution systems, and advanced planning and scheduling systems. Subsequently, the implementation of the CPPS in actual plants is described. Temperature is a major factor that affects casting quality, and thus temperature sensors and IoT communication devices were attached to casting machines. The well-known NoSQL database HBase and the high-speed processing/analysis tool Spark are used as the IoT repository and for data pre-processing, respectively. Many machine learning algorithms, such as decision tree, random forest, artificial neural network, and support vector machine, were used for quality prediction and compared using R software. Finally, the operation of the entire system is demonstrated through a CPPS dashboard. In an era in which most CPPS-related studies are conducted on high-level abstract models, this study describes more specific architectural frameworks, use cases, usable software, and analytical methodologies. In addition, this study verifies the usefulness of CPPS by estimating quantitative effects. This is expected to contribute to the proliferation of CPPS in the industry.
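    As a deliberately tiny stand-in for the models the study compares (decision tree, random forest, ANN, SVM), here is a one-split decision stump over hypothetical temperature/defect pairs. It is our sketch, not the study's code, which used HBase, Spark, and R.

```python
# Hypothetical illustration: predict a casting defect from process temperature
# with a one-split decision stump (a depth-1 decision tree). A real CPPS
# pipeline would use full tree/forest/ANN/SVM implementations as in the study.
def train_stump(samples):
    """Choose the threshold t maximizing accuracy of the rule: defect iff temp > t."""
    best_t, best_acc = None, -1
    for t, _ in samples:
        acc = sum((temp > t) == defect for temp, defect in samples)
        if acc > best_acc:
            best_t, best_acc = t, acc
    return best_t

# (temperature in deg C, defect observed) -- made-up training pairs
data = [(680, False), (690, False), (700, False), (710, True), (720, True)]
t = train_stump(data)
print(t)                     # → 700: defects predicted above 700 deg C
print(705 > t, 695 > t)      # → True False
```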

  2. Evaluation of feedback interventions for improving the quality assurance of cancer screening in Japan: study design and report of the baseline survey.

    PubMed

    Machii, Ryoko; Saika, Kumiko; Higashi, Takahiro; Aoki, Ayako; Hamashima, Chisato; Saito, Hiroshi

    2012-02-01

    The importance of quality assurance in cancer screening has recently gained increasing attention in Japan. To evaluate and improve quality, checklists and process indicators have been developed. To explore effective methods of enhancing quality in cancer screening, we started a randomized controlled study of methods of evaluation and feedback for cancer control, running from 2009 to 2014. We randomly assigned 1270 municipal governments, equivalent to 71% of all Japanese municipal governments that performed screening programs, into three groups. The high-intensity intervention groups (n = 425) were individually evaluated using both checklist performance and process indicator values, while the low-intensity intervention groups (n = 421) were individually evaluated on the basis of only checklist performance. The control group (n = 424) received only a basic report that included the national average of checklist performance scores. We repeated the survey of each municipality's quality assurance activity performance using checklists and process indicators. In this paper, we report our study design and the results of the baseline survey. The checklist adherence rates were especially low for the checklist elements related to invitation of individuals, detailed monitoring of process indicators such as cancer detection rates according to screening histories, and appropriate selection of screening facilities. The screening rate and the percentage of examinees who underwent detailed examination tended to be lower for large cities compared with smaller cities for all cancer sites. The performance of the Japanese cancer screening program in 2009 was identified for the first time.

  3. Nonlinear Multiobjective MPC-Based Optimal Operation of a High Consistency Refining System in Papermaking

    DOE PAGES

    Li, Mingjie; Zhou, Ping; Wang, Hong; ...

    2017-09-19

    As one of the most important units in the papermaking industry, the high consistency (HC) refining system is confronted with challenges such as improving pulp quality, saving energy, and reducing emissions in its operation processes. In this correspondence, an optimal operation of the HC refining system is presented using nonlinear multiobjective model predictive control strategies that aim at a set-point tracking objective for pulp quality, an economic objective, and a specific energy (SE) consumption objective, respectively. First, a set of input and output data at different times is employed to construct the subprocess models of the state process model for the HC refining system; a Wiener-type model is then obtained by combining the mechanism model of Canadian Standard Freeness with the state process model, whose structures are determined based on the Akaike information criterion. Second, a multiobjective optimization strategy is proposed that simultaneously optimizes the set-point tracking objective for pulp quality and SE consumption, using the NSGA-II approach to obtain the Pareto optimal set. Furthermore, targeting the set-point tracking objective for pulp quality, the economic objective, and the SE consumption objective, the sequential quadratic programming method is utilized to produce the optimal predictive controllers. In conclusion, the simulation results demonstrate that the proposed methods give the HC refining system better set-point tracking performance for pulp quality when these predictive controllers are employed. In addition, when the optimal predictive controllers are oriented toward the comprehensive economic objective and the SE consumption objective, they significantly reduce energy consumption.
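    The NSGA-II step in this record yields a Pareto-optimal set trading pulp-quality tracking error against SE consumption. A minimal sketch of the final filtering step, keeping only non-dominated candidates, is shown below; the operating points and function name are hypothetical, not from the paper.

```python
# Illustrative Pareto-front filter for two minimization objectives:
# (pulp-quality tracking error, specific energy consumption).
def pareto_front(points):
    """Keep (tracking_error, energy) pairs not dominated on both objectives."""
    front = []
    for p in points:
        dominated = any(q[0] <= p[0] and q[1] <= p[1] and q != p for q in points)
        if not dominated:
            front.append(p)
    return front

# Hypothetical candidate operating points: (tracking error, kWh/ton)
candidates = [(0.10, 950), (0.08, 990), (0.12, 900), (0.10, 1000), (0.08, 950)]
print(sorted(pareto_front(candidates)))  # → [(0.08, 950), (0.12, 900)]
```

A real NSGA-II run would evolve a population toward this front rather than filter a fixed list, but the dominance test is the same.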

  4. The Ratios of Pre-emulsified Duck Skin for Optimized Processing of Restructured Ham.

    PubMed

    Shim, Jae-Yun; Kim, Tae-Kyung; Kim, Young-Boong; Jeon, Ki-Hong; Ahn, Kwang-Il; Paik, Hyun-Dong; Choi, Yun-Sang

    2018-02-01

    The purpose of this study was to investigate the quality of duck ham formulated with duck skin through the pre-emulsification process. The experiments to investigate the quality characteristics of duck ham were carried out to measure proximate composition, cooking loss, emulsion stability, pH, color, texture profile analysis, apparent viscosity, and sensory characteristics. Duck ham was prepared with various ratios of duck skin in pre-emulsion as follows: Control (duck skin 30%), T1 (duck skin 20% + pre-emulsified duck skin 10%), T2 (duck skin 15% + pre-emulsified duck skin 15%), T3 (duck skin 10% + pre-emulsified duck skin 20%), and T4 (pre-emulsified duck skin 30%). As the ratio of duck skin to pre-emulsified skin changed, the quality of duck ham in terms of moisture content, fat content, cooking loss, emulsion stability, lightness, textural analysis, apparent viscosity, and overall acceptability changed. The moisture content of T2 was the highest (p<0.05) and that of the control and T4 was the lowest (p<0.05). The fat content of the control was higher than that of all treatments (p<0.05). T2 had the lowest values for cooking loss, total expressible fluid, fat separation, hardness, springiness, and gumminess (p<0.05). The overall acceptability scores of all treatments with pre-emulsified skin were higher than that of the control (p<0.05). Therefore, the pre-emulsification process can improve the quality characteristics of duck ham, and a 1:1 ratio of duck skin to pre-emulsified skin was the proper ratio to improve these characteristics.

  5. Nonlinear Multiobjective MPC-Based Optimal Operation of a High Consistency Refining System in Papermaking

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Mingjie; Zhou, Ping; Wang, Hong

    As one of the most important units in the papermaking industry, the high consistency (HC) refining system is confronted with challenges such as improving pulp quality, saving energy, and reducing emissions in its operation processes. In this correspondence, an optimal operation of the HC refining system is presented using nonlinear multiobjective model predictive control strategies that aim at a set-point tracking objective for pulp quality, an economic objective, and a specific energy (SE) consumption objective, respectively. First, a set of input and output data at different times is employed to construct the subprocess models of the state process model for the HC refining system; a Wiener-type model is then obtained by combining the mechanism model of Canadian Standard Freeness with the state process model, whose structures are determined based on the Akaike information criterion. Second, a multiobjective optimization strategy is proposed that simultaneously optimizes the set-point tracking objective for pulp quality and SE consumption, using the NSGA-II approach to obtain the Pareto optimal set. Furthermore, targeting the set-point tracking objective for pulp quality, the economic objective, and the SE consumption objective, the sequential quadratic programming method is utilized to produce the optimal predictive controllers. In conclusion, the simulation results demonstrate that the proposed methods give the HC refining system better set-point tracking performance for pulp quality when these predictive controllers are employed. In addition, when the optimal predictive controllers are oriented toward the comprehensive economic objective and the SE consumption objective, they significantly reduce energy consumption.

  6. Method for Examination and Documentation of Basic Information and Metadata from Published Reports Relevant to the Study of Stormwater Runoff Quality

    USGS Publications Warehouse

    Dionne, Shannon G.; Granato, Gregory E.; Tana, Cameron K.

    1999-01-01

    A readily accessible archive of information that is valid, current, and technically defensible is needed to make informed highway-planning, design, and management decisions. The National Highway Runoff Water-Quality Data and Methodology Synthesis (NDAMS) is a cataloging and assessment of the documentation of information relevant to highway-runoff water quality available in published reports. The report-review process is based on the NDAMS review sheet, which was designed by the USGS with input from the FHWA, State transportation agencies, and the regulatory community. The report-review process is designed to determine the technical merit of the existing literature in terms of current requirements for data documentation, data quality, quality assurance and quality control (QA/QC), and technical issues that may affect the use of historical data. To facilitate the review process, the NDAMS review sheet is divided into 12 sections: (1) administrative review information, (2) investigation and report information, (3) temporal information, (4) location information, (5) water-quality-monitoring information, (6) sample-handling methods, (7) constituent information, (8) sampling focus and matrix, (9) flow monitoring methods, (10) field QA/QC, (11) laboratory, and (12) uncertainty/error analysis. This report describes the NDAMS report reviews and metadata documentation methods and provides an overview of the approach and of the quality-assurance and quality-control program used to implement the review process. Detailed information, including a glossary of relevant terms, a copy of the report-review sheets, and report-review instructions, is completely documented in a series of three appendixes included with this report. Therefore, the reviews are repeatable and the methods can be used by transportation research organizations to catalog new reports as they are published.

  7. Comparison of Grand Median and Cumulative Sum Control Charts on Shuttlecock Weight Variable in CV Marjoko Kompas dan Domas

    NASA Astrophysics Data System (ADS)

    Musdalifah, N.; Handajani, S. S.; Zukhronah, E.

    2017-06-01

    Competition between homogeneous companies forces each company to maintain production quality. To address this, companies control production with statistical quality control using control charts. The Shewhart control chart is used for normally distributed data, but production data often follow a non-normal distribution and exhibit small process shifts. The grand median control chart is a control chart for non-normally distributed data, while the cumulative sum (cusum) control chart is sensitive to small process shifts. The purpose of this research is to compare grand median and cusum control charts on the shuttlecock weight variable in CV Marjoko Kompas dan Domas by generating data that follow the actual distribution. The generated data are used to simulate the multiplier of standard deviation on the grand median and cusum control charts. The simulation is run to obtain an average run length (ARL) of 370. The grand median control chart detects ten points that are out of control, while the cusum control chart detects one point out of control. It can be concluded that the grand median control chart performs better than the cusum control chart.
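    The tabular (one-sided) cusum statistic compared in this record can be sketched as follows; the target weight, reference value k, and decision interval h below are illustrative choices, not the values used in the study.

```python
# Illustrative tabular cusum: accumulate deviations beyond a slack k from the
# target; signal when either one-sided sum crosses the decision interval h.
def cusum_alarms(samples, target, k, h):
    """Return indices where the upper or lower cusum crosses the h limit."""
    c_plus = c_minus = 0.0
    alarms = []
    for i, x in enumerate(samples):
        c_plus = max(0.0, c_plus + (x - target - k))
        c_minus = max(0.0, c_minus + (target - x - k))
        if c_plus > h or c_minus > h:
            alarms.append(i)
            c_plus = c_minus = 0.0   # restart after each signal
    return alarms

weights = [5.0, 5.1, 5.2, 5.2, 5.3, 5.2, 5.3]   # hypothetical shuttlecock weights, g
print(cusum_alarms(weights, target=5.0, k=0.05, h=0.5))  # → [4]: upward drift signalled
```

The sensitivity to small sustained shifts is visible here: no single observation is extreme, yet the accumulated deviation triggers an alarm.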

  8. Robotic operation of the Observatorio Astrofísico de Javalambre

    NASA Astrophysics Data System (ADS)

    Yanes-Díaz, A.; Antón, J. L.; Rueda-Teruel, S.; Guillén-Civera, L.; Bello, R.; Jiménez-Mejías, D.; Chueca, S.; Lasso-Cabrera, N. M.; Suárez, O.; Rueda-Teruel, F.; Cenarro, A. J.; Cristóbal-Hornillos, D.; Marín-Franch, A.; Luis-Simoes, R.; López-Alegre, G.; Rodríguez-Hernández, M. A. C.; Moles, M.; Ederoclite, A.; Varela, J.; Vázquez Ramió, H.; Díaz-Martí, M. C.; Iglesias-Marzoa, R.; Maicas, N.; Lamadrid, J. L.; López-Sainz, A.; Hernández-Fuertes, J.; Valdivielso, L.

    2015-05-01

    The Observatorio Astrofísico de Javalambre (OAJ) is a new astronomical facility located at the Sierra de Javalambre (Teruel, Spain) whose primary role will be to conduct all-sky astronomical surveys with two unprecedented telescopes of unusually large fields of view: the JST/T250, a 2.55 m telescope of 3 deg field of view, and the JAST/T80, an 83 cm telescope of 2 deg field of view. The CEFCA engineering team has been designing the OAJ control system as a global concept to manage, monitor, control and maintain all the observatory systems, including not only astronomical subsystems but also infrastructure and other facilities. Three main factors have been considered in the design of a global control system for the robotic OAJ: quality, reliability and efficiency. We propose a CIA (Control Integrated Architecture) design and OEE (Overall Equipment Effectiveness) as a key performance indicator in order to improve operation processes, minimize resources and obtain large cost reductions while maintaining quality requirements. Here we present the OAJ robotic control strategy to achieve maximum quality and efficiency for the observatory surveys, processes and operations, giving practical examples of our approach.

  9. ITC Guidelines on Quality Control in Scoring, Test Analysis, and Reporting of Test Scores

    ERIC Educational Resources Information Center

    Allalouf, Avi

    2014-01-01

    The Quality Control (QC) Guidelines are intended to increase the efficiency, precision, and accuracy of the scoring, analysis, and reporting process of testing. The QC Guidelines focus on large-scale testing operations where multiple forms of tests are created for use on set dates. However, they may also be used for a wide variety of other testing…

  10. Analysis and quality control of carbohydrates in therapeutic proteins with fluorescence HPLC

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Kun; Huang, Jian; Center for Informational Biology, University of Electronic Science and Technology of China, Chengdu 610054

    Conbercept is an Fc fusion protein with a very complicated carbohydrate profile that must be carefully monitored throughout the manufacturing process. Here, we introduce an optimized fluorescence-derivatization high-performance liquid chromatographic method for glycan mapping in conbercept. Compared with the conventional glycan analysis method, this method has much better resolution and higher reproducibility, making it excellent for product quality control.

  11. Application of advanced data collection and quality assurance methods in open prospective study - a case study of PONS project.

    PubMed

    Wawrzyniak, Zbigniew M; Paczesny, Daniel; Mańczuk, Marta; Zatoński, Witold A

    2011-01-01

    Large-scale epidemiologic studies can assess health indicators that differentiate social groups and important health outcomes, such as the incidence and mortality of cancer and cardiovascular disease, to establish a solid knowledge base for preventing causes of premature morbidity and mortality. This study presents new advanced methods of data collection and data management, with ongoing data quality control and security, to ensure high-quality assessment of health indicators in the large epidemiologic PONS study (The Polish-Norwegian Study). The material for the experiment is the data management design of the large-scale population study in Poland (PONS), and the managed processes are applied to establishing high-quality, solid knowledge. The functional requirements of PONS study data collection, supported by advanced web-based IT methods, are fulfilled by the IT system, yielding medical data of high quality along with data security, quality assessment, process control and evolution monitoring. Data from disparate, deployed sources of information are integrated into databases via software interfaces and archived by a multi-task secure server. The practical, implemented solution of modern database technologies and a remote software/hardware structure successfully supports the research of the large PONS study project. Follow-up control of the consistency and quality of data analysis and of the PONS sub-database processes shows excellent measurement properties, with data consistency of more than 99%. Through a tailored hardware/software application, the project shows the positive impact of Quality Assurance (QA) on the quality of analysis results and on effective data management within a shorter time. This efficiency ensures the quality of the epidemiological data and health indicators by eliminating common errors in research questionnaires and medical measurements.

  12. Yeast prions are useful for studying protein chaperones and protein quality control.

    PubMed

    Masison, Daniel C; Reidy, Michael

    2015-01-01

    Protein chaperones help proteins adopt and maintain native conformations and play vital roles in cellular processes where proteins are partially folded. They comprise a major part of the cellular protein quality control system that protects the integrity of the proteome. Many disorders are caused when proteins misfold despite this protection. Yeast prions are fibrous amyloid aggregates of misfolded proteins. The normal action of chaperones on yeast prions breaks the fibers into pieces, which results in prion replication. Because this process is necessary for propagation of yeast prions, even small differences in activity of many chaperones noticeably affect prion phenotypes. Several other factors involved in protein processing also influence formation, propagation or elimination of prions in yeast. Thus, in much the same way that the dependency of viruses on cellular functions has allowed us to learn much about cell biology, the dependency of yeast prions on chaperones presents a unique and sensitive way to monitor the functions and interactions of many components of the cell's protein quality control system. Our recent work illustrates the utility of this system for identifying and defining chaperone machinery interactions.

  13. Acoustic Sample Deposition MALDI-MS (ASD-MALDI-MS): A Novel Process Flow for Quality Control Screening of Compound Libraries.

    PubMed

    Chin, Jefferson; Wood, Elizabeth; Peters, Grace S; Drexler, Dieter M

    2016-02-01

    In the early stages of drug discovery, high-throughput screening (HTS) of compound libraries against pharmaceutical targets is a common method to identify potential lead molecules. For these HTS campaigns to be efficient and successful, continuous quality control of the compound collection is necessary and crucial. However, the large number of compound samples and the limited sample amount pose unique challenges. Presented here is a proof-of-concept study for a novel process flow for the quality control screening of small-molecule compound libraries that consumes only minimal amounts of samples and affords compound-specific molecular data. This process employs an acoustic sample deposition (ASD) technique for the offline sample preparation by depositing nanoliter volumes in an array format onto microscope glass slides followed by matrix-assisted laser desorption/ionization mass spectrometric (MALDI-MS) analysis. An initial study of a 384-compound array employing the ASD-MALDI-MS workflow resulted in a 75% first-pass positive identification rate with an analysis time of <1 s per sample. © 2015 Society for Laboratory Automation and Screening.

  14. Managing the travel model process : small and medium-sized MPOs. Instructor guide.

    DOT National Transportation Integrated Search

    2013-09-01

    The learning objectives of this course were to: explain fundamental travel model concepts; describe the model development process; identify key inputs and describe the quality control process; and identify and manage resources.

  15. Managing the travel model process : small and medium-sized MPOs. Participant handbook.

    DOT National Transportation Integrated Search

    2013-09-01

    The learning objectives of this course were to: explain fundamental travel model concepts; describe the model development process; identify key inputs and describe the quality control process; and identify and manage resources.

  16. Embedding Quality Culture in Higher Education in Ghana: Quality Control and Assessment in Emerging Private Universities

    ERIC Educational Resources Information Center

    Ntim, Stephen

    2014-01-01

    High quality provision has been one of the key aims of the current reforms in higher educational institutions across the globe since the beginning of the century and the millennium. Consequently this has led to the increasing demand for quality assurance (QA). This report identifies those institutional processes and structures that support the…

  17. The principles of ultrasound and its application in freezing related processes of food materials: A review.

    PubMed

    Cheng, Xinfeng; Zhang, Min; Xu, Baoguo; Adhikari, Benu; Sun, Jincai

    2015-11-01

    Ultrasonic processing is a novel and promising technology in the food industry. The propagation of ultrasound in a medium generates various physical and chemical effects, and these effects have been harnessed to improve the efficiency of various food processing operations. Ultrasound has also been used in food quality control as a diagnostic technology. This article provides an overview of recent developments related to the application of ultrasound in low-temperature and closely related processes such as freezing, thawing, freeze concentration and freeze drying. The applications of high-intensity ultrasound to improve the efficiency of the freezing process, to control the size and size distribution of ice crystals and to improve the quality of frozen foods are discussed in considerable detail. The use of low-intensity ultrasound to monitor the ice content and the progress of the freezing process is also highlighted. Copyright © 2015 Elsevier B.V. All rights reserved.

  18. Data Quality Control of the French Permanent Broadband Network in the RESIF Framework.

    NASA Astrophysics Data System (ADS)

    Grunberg, M.; Lambotte, S.; Engels, F.

    2014-12-01

    In the framework of the RESIF (Réseau Sismologique et géodésique Français) project, a new information system is being set up to improve the management and distribution of high-quality data from the different elements of RESIF. Within this information system, EOST (in Strasbourg) is in charge of collecting real-time permanent broadband seismic waveforms and performing quality control on these data. The real-time and validated data sets are pushed to the French National Distribution Center (Isterre/Grenoble) to make them publicly available. Furthermore, EOST hosts the BCSF-ReNaSS, in charge of the French metropolitan seismic bulletin, which allows it to benefit from high-end quality control based on the national and worldwide seismicity. Here we present the real-time seismic data flow from the stations of the French National Broad Band Network to EOST, and then the data quality control procedures that were recently installed, including some new developments. The data quality control consists of applying a variety of processes to check the consistency of the whole system from the stations to the data center. This allows us to verify that instruments and data transmission are operating correctly. Moreover, time quality is critical for most scientific data applications. To face this challenge and check the consistency of polarities and amplitudes, we deployed several high-end processes, including a noise-correlation procedure to check timing accuracy (instrumental time errors result in a time shift of the whole cross-correlation, clearly distinct from shifts due to changes in medium physical properties) and a systematic comparison of synthetic and real data for teleseismic earthquakes of magnitude larger than 6.5 to detect timing errors as well as polarity and amplitude problems.
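    The timing check described in this record relies on an instrumental clock error shifting the whole cross-correlation by a constant lag. A toy pure-Python version (our sketch, not the EOST implementation) recovers such a shift from short synthetic traces:

```python
# Illustrative timing-error check: a station clock error shifts the
# cross-correlation peak by a constant lag. Brute-force search over candidate
# lags recovers that shift. Traces are synthetic and very short.
def best_lag(a, b, max_lag):
    """Lag (in samples) that maximizes the cross-correlation of a and b."""
    def corr(lag):
        return sum(a[i] * b[i + lag] for i in range(len(a))
                   if 0 <= i + lag < len(b))
    return max(range(-max_lag, max_lag + 1), key=corr)

trace = [0, 1, 3, 1, 0, -1, -2, -1, 0, 0, 0, 0]
shifted = [0, 0, 0] + trace[:-3]            # station clock 3 samples late
print(best_lag(trace, shifted, max_lag=5))  # → 3
```

In the operational setting the same idea is applied to noise correlations between station pairs, where a drifting clock appears as a slowly varying lag of the whole correlation function.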

  19. “A manager in the minds of doctors:” a comparison of new modes of control in European hospitals

    PubMed Central

    2013-01-01

Background Hospital governance increasingly combines management and professional self-governance. This article maps the newly emergent modes of control in a comparative perspective and aims to better understand the relationship between medicine and management as hybrid and context-dependent. Theoretically, we critically review approaches to the managerialism-professionalism relationship; methodologically, we expand cross-country comparison towards the meso-level of organisations; and empirically, the focus is on processes and actors in a range of European hospitals. Methods The research is explorative and was carried out as part of the FP7 COST action IS0903 Medicine and Management, Working Group 2. Comprising seven European countries, the focus is on doctors and public hospitals. We use a comparative case study design that primarily draws on expert information and document analysis as well as other secondary sources. Results The findings reveal that managerial control is not simply an external force but is increasingly integrated into medical professionalism. These processes of change are relevant in all countries but shaped by organisational settings, and therefore create different patterns of control: (1) ‘integrated’ control with high levels of coordination and coherent patterns for cost and quality controls; (2) ‘partly integrated’ control with diverse coordination at hospital and department level and between cost and quality controls; and (3) ‘fragmented’ control with limited coordination and gaps between quality control, more strongly dominated by medicine, and cost control, dominated by management. Conclusions Our comparison highlights how organisations matter and brings into perspective the crucial relevance of ‘coordination’ of medicine and management across the levels (hospital/department) and the substance (cost/quality-safety) of control.
Consequently, coordination may serve as a taxonomy of emergent modes of control, thus bringing new directions for cost-efficient and quality-effective hospital governance into perspective. PMID:23819578

  20. Quality Leadership and Quality Control

    PubMed Central

    Badrick, Tony

    2003-01-01

Different quality control rules detect different analytical errors with varying levels of efficiency depending on the type of error present, its prevalence and the number of observations. The efficiency of a rule can be gauged by inspection of a power function graph. Control rules are only part of a process, not an end in themselves; just as important are the trouble-shooting systems employed when a failure occurs. 'Average of patient normals' may develop as a useful adjunct to conventional serum-based quality control programmes. Acceptable error can be based on various criteria; biological variation is probably the most sensible. Once determined, acceptable error can be used as limits in quality control rule systems. A key aspect of an organisation is leadership, which links the various components of the quality system. Leadership is difficult to characterise, but its key aspects include trust, setting an example, developing staff and, critically, setting the vision for the organisation. Organisations also have internal characteristics such as the degree of formalisation, centralisation, and complexity. Medical organisations can have internal tensions because of the dichotomy between the bureaucratic and the shadow medical structures. PMID:18568046
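The power function mentioned above can also be estimated by simulation. The sketch below is a toy Monte Carlo (the rule choice, run length and trial count are illustrative): it estimates the probability that the common 1-3s rule (reject if any control observation falls outside mean ± 3 SD) flags a run, as a function of the size of a systematic error.

```python
import random

def p_reject_13s(bias_sd, n_obs=2, trials=20000, seed=1):
    """Monte Carlo estimate of the power of the 1-3s control rule:
    probability that at least one of n_obs control observations
    falls outside +/-3 SD when a systematic error of bias_sd
    (in SD units) is present."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        if any(abs(rng.gauss(bias_sd, 1.0)) > 3.0 for _ in range(n_obs)):
            hits += 1
    return hits / trials

# Power rises with the size of the systematic error:
print(p_reject_13s(0.0))  # false-rejection rate, near 0.005 for two observations
print(p_reject_13s(2.0))  # moderate shift, detected only sometimes
print(p_reject_13s(4.0))  # large shift, detected almost always
```

Plotting `p_reject_13s` over a range of bias values reproduces the familiar power function graph the abstract refers to.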

  1. Offline modeling for product quality prediction of mineral processing using modeling error PDF shaping and entropy minimization.

    PubMed

    Ding, Jinliang; Chai, Tianyou; Wang, Hong

    2011-03-01

This paper presents a novel offline modeling approach for product quality prediction in mineral processing, which consists of a number of unit processes in series. The prediction of the product quality of the whole mineral process (i.e., the mixed concentrate grade) plays an important role, and the establishment of its predictive model is a key issue for plantwide optimization. For this purpose, a hybrid modeling approach for mixed concentrate grade prediction is proposed, which consists of a linear model and a nonlinear model. The least-squares support vector machine is adopted to establish the nonlinear model. The inputs of the predictive model are the performance indices of each unit process, while the output is the mixed concentrate grade. In this paper, the model parameter selection is transformed into the shape control of the probability density function (PDF) of the modeling error. In this context, both PDF-control-based and minimum-entropy-based model parameter selection approaches are proposed. Indeed, this is the first time that the PDF shape control idea is used to deal with system modeling, where the key idea is to tune model parameters so that either the modeling error PDF is controlled to follow a target PDF or the modeling error entropy is minimized. The experimental results using real plant data and a comparison of the two approaches are discussed. The results show the effectiveness of the proposed approaches.
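The minimum-entropy idea can be sketched as follows. This is a simplified stand-in for the paper's method: an invented linear model and a histogram entropy estimator replace the authors' LS-SVM and PDF-shaping formulation, but the selection principle is the same, namely that well-tuned parameters concentrate the modeling-error distribution and so minimize its entropy.

```python
import numpy as np

def residual_entropy(errors, bins=30):
    """Histogram estimate of the Shannon entropy of the modeling-error
    distribution; a narrower, more peaked error PDF scores lower."""
    p, edges = np.histogram(errors, bins=bins, density=True)
    w = np.diff(edges)
    mask = p > 0
    return -np.sum(p[mask] * np.log(p[mask]) * w[mask])

def fit_by_entropy(x, y, candidates):
    """Pick the slope whose residuals have minimum entropy
    (a toy analogue of minimum-entropy parameter tuning)."""
    return min(candidates, key=lambda a: residual_entropy(y - a * x))

rng = np.random.default_rng(0)
x = rng.uniform(0, 10, 500)
y = 2.5 * x + rng.normal(0, 0.3, 500)   # true slope 2.5, small noise
grid = np.linspace(1.0, 4.0, 61)
print(fit_by_entropy(x, y, grid))        # lands close to the true slope
```

Any slope away from 2.5 leaves an x-dependent trend in the residuals, spreading their PDF and raising the entropy, which is what drives the selection.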

  2. Design of optimal buffer layers for CuInGaSe2 thin-film solar cells(Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Lordi, Vincenzo; Varley, Joel B.; He, Xiaoqing; Rockett, Angus A.; Bailey, Jeff; Zapalac, Geordie H.; Mackie, Neil; Poplavskyy, Dmitry; Bayman, Atiye

    2016-09-01

Optimizing the buffer layer in manufactured thin-film PV is essential to maximize device efficiency. Here, we describe a combined synthesis, characterization, and theory effort to design optimal buffers based on the (Cd,Zn)(O,S) alloy system for CIGS devices. Optimization of buffer composition and absorber/buffer interface properties in light of several competing requirements for maximum device efficiency was performed, along with process variations to control the film and interface quality. The most relevant buffer properties controlling performance include band gap, conduction band offset with the absorber, dopability, interface quality, and film crystallinity. Control of an all-PVD deposition process enabled variation of buffer composition, crystallinity, doping, and quality of the absorber/buffer interface. Analytical electron microscopy was used to characterize the film composition and morphology, while hybrid density functional theory was used to predict optimal compositions and growth parameters based on computed material properties. Process variations were developed to produce layers with controlled crystallinity, varying from amorphous to fully epitaxial, depending primarily on oxygen content. Elemental intermixing between buffer and absorber, particularly involving Cd and Cu, is also controlled and significantly affects device performance. Secondary phase formation at the interface is observed for some conditions and may be detrimental depending on the morphology. Theoretical calculations suggest optimal composition ranges for the buffer based on a suite of computed properties and drive process optimizations connected with observed film properties. Prepared by LLNL under Contract DE-AC52-07NA27344.

  3. Analysis of batch-related influences on injection molding processes viewed in the context of electro plating quality demands

    NASA Astrophysics Data System (ADS)

    Siepmann, Jens P.; Wortberg, Johannes; Heinzler, Felix A.

    2016-03-01

The injection molding process is strongly influenced by the viscosity of the material, which changes from one material batch to another. The initial condition of the material, in addition to the processing parameters, defines the process and product quality. A high percentage of technical polymers processed by injection molding is refined in a follow-up production step, for example electroplating. Processing optimized for electroplating often requires avoiding high shear stresses by using low injection speed and pressure conditions. Therefore, differences in the viscosity of material charges arise especially in the quality-related low-shear-rate region. These differences and their quality-related influences can be investigated by detailed rheological analysis and process simulation based on adapted material models. Differences in viscosity between batches can be detected by measurements with high-pressure capillary rheometers or, for low shear rates, oscillatory rheometers. A combination of both measurement techniques is made possible by the Cox-Merz relation. The detected differences in the rheological behavior of the two charges are summarized in two material model approaches and added to the simulation. In this paper, the results of processing simulations with standard filling parameters are presented for two ABS charges. Part-quality-defining quantities such as temperature, pressure and shear stress are investigated, and the influence of charge variations is pointed out with respect to electroplating quality demands. Furthermore, the results of simulations with a new quality-related process control are presented and compared to the standard processing.
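The Cox-Merz combination mentioned above rests on the empirical identity η(γ̇) ≈ |η*(ω)| evaluated at γ̇ = ω, which lets low-shear-rate oscillatory data extend a high-shear-rate capillary flow curve. A minimal sketch (the moduli values below are invented for illustration):

```python
import math

def complex_viscosity(G_storage, G_loss, omega):
    """|eta*| = sqrt(G'^2 + G''^2) / omega from oscillatory data (SI units)."""
    return math.hypot(G_storage, G_loss) / omega

def cox_merz_point(G_storage, G_loss, omega):
    """Cox-Merz rule: treat |eta*(omega)| as the steady-shear viscosity
    eta(gamma_dot) at gamma_dot = omega, so an oscillatory measurement
    contributes a point to the flow curve at low shear rates, where
    batch-to-batch differences matter for electroplating-grade parts."""
    return omega, complex_viscosity(G_storage, G_loss, omega)

gamma_dot, eta = cox_merz_point(G_storage=120.0, G_loss=900.0, omega=1.0)
print(gamma_dot, eta)  # viscosity point at gamma_dot = 1 1/s
```

Repeating this over the measured frequency range and concatenating with the capillary data yields a single flow curve per batch, from which low-shear-rate batch differences can be read off directly.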

  4. Ion mobility spectrometry for food quality and safety.

    PubMed

    Vautz, W; Zimmermann, D; Hartmann, M; Baumbach, J I; Nolte, J; Jung, J

    2006-11-01

Ion mobility spectrometry is known to be a fast and sensitive technique for the detection of trace substances, and it is increasingly in demand not only for protection against explosives and chemical warfare agents, but also for new applications in medical diagnosis and process control. Generally, a gas-phase sample is ionized with the help of ultraviolet light, β-radiation or partial discharges. The ions move in a weak electric field towards a detector. During their drift they collide with a drift gas flowing in the opposite direction and are therefore slowed down depending on their size, shape and charge. As a result, different ions reach the detector at different drift times, which are characteristic of the ions considered. The number of ions reaching the detector is a measure of the concentration of the analyte. The method enables the identification and quantification of analytes with high sensitivity (ng l(-1) range). The selectivity can be increased further - as necessary for the analysis of complex mixtures - using pre-separation techniques such as gas chromatography or multi-capillary columns. No pre-concentration of the sample is necessary. These characteristics of the method are preserved even in air at up to 100% relative humidity.
The suitability of the method for applications in the field of food quality and safety - including storage, process and quality control as well as the characterization of foodstuffs - was investigated in recent years for a number of representative examples, which are summarized in the following, together with new studies: (1) the detection of metabolites from bacteria for the identification and control of their growth; (2) process control in food production - beer fermentation being an example; (3) the detection of the metabolites of mould for process control during cheese production, for quality control of raw materials or for the control of storage conditions; (4) the quality control of packaging materials during the production of polymeric materials; and (5) the characterization of products - wine being an example. The challenges of such applications are operation in humid air, fast on-line analysis of complex mixtures, high sensitivity - detection limits have to be, for example, in the range of the odour limits - and, in some cases, the necessity of mobile instrumentation. It can be shown that ion mobility spectrometry is well capable of meeting those challenges for many applications.
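The drift-time physics described above can be made concrete with the standard reduced-mobility normalization: the mobility K = L / (t_d · E) is scaled to standard temperature and pressure so that the resulting K0 is characteristic of the ion rather than of the instrument conditions. The tube dimensions and drift time below are illustrative, not values from the study.

```python
def reduced_mobility(drift_time_s, drift_length_m, field_V_per_m,
                     temp_K, pressure_hPa):
    """Reduced ion mobility K0 (cm^2 V^-1 s^-1) from an IMS drift time.
    K = L / (t_d * E), then normalized to 273.15 K and 1013.25 hPa,
    which is what makes drift times characteristic of a given ion."""
    K = drift_length_m / (drift_time_s * field_V_per_m)   # m^2 V^-1 s^-1
    K0 = K * (273.15 / temp_K) * (pressure_hPa / 1013.25)
    return K0 * 1e4   # convert m^2/(V s) to the customary cm^2/(V s)

# Illustrative values: 12 cm drift tube, 330 V/cm field, ambient conditions
print(round(reduced_mobility(16.0e-3, 0.12, 33000.0, 298.0, 1013.25), 2))
```

Comparing measured K0 values against reference values for known analytes is how identification is done; the intensity at that drift time then quantifies the concentration.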

  5. Electron-beam lithography with character projection technique for high-throughput exposure with line-edge quality control

    NASA Astrophysics Data System (ADS)

    Ikeno, Rimon; Maruyama, Satoshi; Mita, Yoshio; Ikeda, Makoto; Asada, Kunihiro

    2016-07-01

    The high throughput of character projection (CP) electron-beam (EB) lithography makes it a promising technique for low-to-medium volume device fabrication with regularly arranged layouts, such as for standard-cell logics and memory arrays. However, non-VLSI applications such as MEMS and MOEMS may not be able to fully utilize the benefits of the CP method due to the wide variety of layout figures including curved and oblique edges. In addition, the stepwise shapes that appear because of the EB exposure process often result in intolerable edge roughness, which degrades device performances. In this study, we propose a general EB lithography methodology for such applications utilizing a combination of the CP and variable-shaped beam methods. In the process of layout data conversion with CP character instantiation, several control parameters were optimized to minimize the shot count, improve the edge quality, and enhance the overall device performance. We have demonstrated EB shot reduction and edge-quality improvement with our methodology by using a leading-edge EB exposure tool, ADVANTEST F7000S-VD02, and a high-resolution hydrogen silsesquioxane resist. Atomic force microscope observations were used to analyze the resist edge profiles' quality to determine the influence of the control parameters used in the data conversion process.

  6. Data Assimilation Experiments using Quality Controlled AIRS Version 5 Temperature Soundings

    NASA Technical Reports Server (NTRS)

    Susskind, Joel

    2008-01-01

The AIRS Science Team Version 5 retrieval algorithm has been finalized and is now operational at the Goddard DAAC in the processing (and reprocessing) of all AIRS data. Version 5 contains accurate case-by-case error estimates for most derived products, which are also used for quality control. We have conducted forecast impact experiments assimilating AIRS quality-controlled temperature profiles using the NASA GEOS-5 data assimilation system, consisting of the NCEP GSI analysis coupled with the NASA FVGCM. Assimilation of quality-controlled temperature profiles resulted in significantly improved forecast skill in both the Northern Hemisphere and Southern Hemisphere extra-tropics, compared to that obtained from analyses in which all data used operationally by NCEP, except for AIRS data, are assimilated. Experiments using different quality control thresholds for assimilation of AIRS temperature retrievals showed that a medium quality control threshold performed better than a tighter threshold, which provided better overall sounding accuracy, or a looser threshold, which provided better spatial coverage of accepted soundings. We are conducting more experiments to further optimize this balance of spatial coverage and sounding accuracy from the data assimilation perspective. In all cases, temperature soundings were assimilated well below cloud level in partially cloudy cases. The positive impact of assimilating AIRS-derived atmospheric temperatures all but vanished when only AIRS stratospheric temperatures were assimilated. Forecast skill resulting from assimilation of AIRS radiances uncontaminated by clouds, instead of AIRS temperature soundings, was only slightly better than that resulting from assimilation of only stratospheric AIRS temperatures. This reduction in forecast skill is most likely the result of a significant loss of tropospheric information when only AIRS radiances unaffected by clouds are used in the data assimilation process.
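The coverage-versus-accuracy tradeoff behind the threshold experiments can be illustrated with a toy simulation. All numbers and the error model below are invented for illustration, not AIRS statistics: each sounding carries a case-by-case error estimate, and accepting only soundings below a threshold trades spatial coverage against the accuracy of the accepted set.

```python
import random

def coverage_and_rmse(threshold, n=50000, seed=7):
    """Toy model of quality-controlled retrievals: each sounding has a
    predicted error estimate, and the actual error scales with it.
    Accepting only soundings whose estimate is below `threshold`
    trades coverage against accuracy (all numbers illustrative)."""
    rng = random.Random(seed)
    kept, sq = 0, 0.0
    for _ in range(n):
        est = rng.uniform(0.2, 3.0)        # predicted error estimate (K)
        if est <= threshold:
            err = rng.gauss(0.0, est)      # actual error tracks the estimate
            kept += 1
            sq += err * err
    return kept / n, (sq / kept) ** 0.5

for thr in (0.8, 1.5, 2.5):
    cov, rmse = coverage_and_rmse(thr)
    print(f"threshold {thr}: coverage {cov:.2f}, RMSE {rmse:.2f} K")
```

Tightening the threshold improves the RMSE of the accepted soundings while shrinking coverage, which is exactly the balance the medium threshold in the study was found to strike best.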

  7. Radiation dosimetry for quality control of food preservation and disinfestation

    NASA Astrophysics Data System (ADS)

    McLaughlin, W. L.; Miller, A.; Uribe, R. M.

In the use of x and gamma rays and scanned electron beams to extend the shelf life of food by delaying sprouting and ripening, killing microbes, and controlling insect populations, quality assurance is provided by standardized radiation dosimetry. By strategic placement of calibrated dosimeters that are sufficiently stable and reproducible, it is possible to monitor minimum and maximum absorbed-dose levels and dose uniformity for a given processed foodstuff. The dosimetry procedure is especially important in the commissioning of a process and in adjusting process parameters (e.g. conveyor speed) to meet changes that occur in product and source parameters (e.g. bulk density and radiation spectrum). Routine dosimetry methods and certain corrections of dosimetry data may be selected for the radiations used in typical food processes.
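The min/max dose monitoring described above can be sketched as follows. The limits and dosimeter readings are illustrative, not values from the paper; the check is that the whole product receives at least the minimum effective dose without exceeding the maximum allowed, with the ratio Dmax/Dmin as the usual uniformity figure of merit.

```python
def dose_uniformity(doses_kGy, d_min_req, d_max_lim):
    """Check mapped dosimeter readings against process limits:
    every location must receive at least the minimum effective dose
    without exceeding the maximum limit; Dmax/Dmin summarizes
    how uniformly the product was irradiated."""
    d_min, d_max = min(doses_kGy), max(doses_kGy)
    return {
        "Dmin": d_min,
        "Dmax": d_max,
        "uniformity_ratio": d_max / d_min,
        "within_limits": d_min >= d_min_req and d_max <= d_max_lim,
    }

# Illustrative limits for an insect-disinfestation process window
print(dose_uniformity([2.1, 2.6, 3.4, 2.9], d_min_req=2.0, d_max_lim=10.0))
```

During commissioning, this calculation is repeated while varying process parameters such as conveyor speed until the dose map stays inside the window for the worst-case product loading.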

  8. NASA. Langley Research Center dry powder towpreg system

    NASA Technical Reports Server (NTRS)

    Baucom, Robert M.; Marchello, Joseph M.

    1990-01-01

    Dry powder polymer impregnated carbon fiber tows were produced for preform weaving and composite materials molding applications. In the process, fluidized powder is deposited on spread tow bundles and melted on the fibers by radiant heating to adhere the polymer to the fiber. Unit design theory and operating correlations were developed to provide the basis for scale up of the process to commercial operation. Special features of the operation are the pneumatic tow spreader, fluidized bed, resin feeder, and quality control system. Bench scale experiments, at tow speeds up to 50 cm/sec, demonstrated that process variables can be controlled to produce weavable LARC-TPI carbon fiber towpreg. The towpreg made by the dry powder process was formed into unidirectional fiber moldings and was woven and molded into preform material of good quality.

  9. [Study thought of pharmaceutical preparations quality standards by dynamic quality control technology].

    PubMed

    Yu, Dan-Hong; Mao, Chen-Mei; Lv, Cheng-Zhe; Jin, Hui-Zhen; Yao, Xin; Jia, Xiao-Bin

    2014-07-01

Pharmaceutical preparations, particularly the "secret recipe" traditional Chinese medicine preparations of medical institutions, are a product of China's medical and health industry, and they are also an important means by which medical institutions compete. Although pharmaceutical preparations in medical institutions have advantages and distinctive characteristics compared with those of drug research institutes and pharmaceutical companies, their quality standards have not reached the desired level over the years. As is well known, the quality of pharmaceutical preparations is important to ensure efficacy, especially as the public pays more attention to drug safety and effectiveness and the country places greater emphasis on the status of pharmaceutical preparations. In view of this, we aim to improve the grade, stability, and clinical efficacy of pharmaceutical preparations by means of advanced equipment, testing instruments and dynamic process quality control technology. Finally, we hope to provide new ideas for the quality control of pharmaceutical preparations.

  10. Quality Attribute Techniques Framework

    NASA Astrophysics Data System (ADS)

    Chiam, Yin Kia; Zhu, Liming; Staples, Mark

    The quality of software is achieved during its development. Development teams use various techniques to investigate, evaluate and control potential quality problems in their systems. These “Quality Attribute Techniques” target specific product qualities such as safety or security. This paper proposes a framework to capture important characteristics of these techniques. The framework is intended to support process tailoring, by facilitating the selection of techniques for inclusion into process models that target specific product qualities. We use risk management as a theory to accommodate techniques for many product qualities and lifecycle phases. Safety techniques have motivated the framework, and safety and performance techniques have been used to evaluate the framework. The evaluation demonstrates the ability of quality risk management to cover the development lifecycle and to accommodate two different product qualities. We identify advantages and limitations of the framework, and discuss future research on the framework.

  11. [Service quality in health care: the application of the results of marketing research].

    PubMed

    Verheggen, F W; Harteloh, P P

    1993-01-01

This paper deals with quality assurance in health care and its relation to quality assurance in trade and industry. We present the service quality model--a model of quality from marketing research--and discuss how it can be applied to health care. Traditional quality assurance appears to have serious flaws. It lacks a general theory of the sources of hazards in the complex process of patient care and tends to stagnate, for no real improvement takes place. Departing from this criticism, modern quality assurance in health care is marked by: defining quality in a preferential sense as "fitness for use"; the use of theories and models from trade and industry (process control); an emphasis on analyzing the process, instead of merely inspecting it; use of the Deming problem-solving cycle (plan, do, check, act); and improvement of the process of care by altering the perceptions of the parties involved. We describe the application of this method in the University Hospital Maastricht, The Netherlands. The successful application of this model requires a favorable corporate culture and motivated health care workers. This model provides a useful framework to improve on the traditional approach to quality assurance in health care.

  12. Inhibitory Control Mediates the Association between Perceived Stress and Secure Relationship Quality.

    PubMed

    Herd, Toria; Li, Mengjiao; Maciejewski, Dominique; Lee, Jacob; Deater-Deckard, Kirby; King-Casas, Brooks; Kim-Spoon, Jungmeen

    2018-01-01

    Past research has demonstrated negative associations between exposure to stressors and quality of interpersonal relationships among children and adolescents. Nevertheless, underlying mechanisms of this association remain unclear. Chronic stress has been shown to disrupt prefrontal functioning in the brain, including inhibitory control abilities, and evidence is accumulating that inhibitory control may play an important role in secure interpersonal relationship quality, including peer problems and social competence. In this prospective longitudinal study, we examine whether changes in inhibitory control, measured at both behavioral and neural levels, mediate the association between stress and changes in secure relationship quality with parents and peers. The sample included 167 adolescents (53% males) who were first recruited at age 13 or 14 years and assessed annually three times. Adolescents' inhibitory control was measured by their behavioral performance and brain activities, and adolescents self-reported perceived stress levels and relationship quality with mothers, fathers, and peers. Results suggest that behavioral inhibitory control mediates the association between perceived stress and adolescent's secure relationship quality with their mothers and fathers, but not their peers. In contrast, given that stress was not significantly correlated with neural inhibitory control, we did not further test the mediation path. Our results highlight the role of inhibitory control as a process through which stressful life experiences are related to impaired secure relationship quality between adolescents and their mothers and fathers.

  13. Inhibitory Control Mediates the Association between Perceived Stress and Secure Relationship Quality

    PubMed Central

    Herd, Toria; Li, Mengjiao; Maciejewski, Dominique; Lee, Jacob; Deater-Deckard, Kirby; King-Casas, Brooks; Kim-Spoon, Jungmeen

    2018-01-01

    Past research has demonstrated negative associations between exposure to stressors and quality of interpersonal relationships among children and adolescents. Nevertheless, underlying mechanisms of this association remain unclear. Chronic stress has been shown to disrupt prefrontal functioning in the brain, including inhibitory control abilities, and evidence is accumulating that inhibitory control may play an important role in secure interpersonal relationship quality, including peer problems and social competence. In this prospective longitudinal study, we examine whether changes in inhibitory control, measured at both behavioral and neural levels, mediate the association between stress and changes in secure relationship quality with parents and peers. The sample included 167 adolescents (53% males) who were first recruited at age 13 or 14 years and assessed annually three times. Adolescents’ inhibitory control was measured by their behavioral performance and brain activities, and adolescents self-reported perceived stress levels and relationship quality with mothers, fathers, and peers. Results suggest that behavioral inhibitory control mediates the association between perceived stress and adolescent’s secure relationship quality with their mothers and fathers, but not their peers. In contrast, given that stress was not significantly correlated with neural inhibitory control, we did not further test the mediation path. Our results highlight the role of inhibitory control as a process through which stressful life experiences are related to impaired secure relationship quality between adolescents and their mothers and fathers. PMID:29535664

  14. The automated system for technological process of spacecraft's waveguide paths soldering

    NASA Astrophysics Data System (ADS)

    Tynchenko, V. S.; Murygin, A. V.; Emilova, O. A.; Bocharov, A. N.; Laptenok, V. D.

    2016-11-01

The paper addresses the automated process control of the induction soldering of spacecraft waveguide paths. The peculiarities of the induction soldering process are analyzed, and the need to automate the information-control system is identified. The developed automated system controls the product heating process by varying the power supplied to the inductor on the basis of information about the soldering-zone temperature, stabilizing the temperature in a narrow range above the melting point of the solder but below the melting point of the waveguide. Automating the soldering process in this way improves the quality of the waveguides and eliminates burn-throughs. The article shows a block diagram of the software system, which consists of five modules, and describes its main algorithm. The operation of the automated waveguide-soldering system is also described, explaining the basic functions and limitations of the system. The developed software allows configuring the measurement equipment, setting and changing parameters of the soldering process, and viewing graphs of the temperatures recorded by the system. Results of experimental studies are presented that demonstrate the high quality of the soldering process control and the system's applicability to such automation tasks.
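The temperature-stabilization logic described above can be sketched as a simple proportional control loop. All temperatures, the gain, the power limit, and the first-order plant model below are illustrative assumptions, not the system's actual parameters; the point is only the structure: hold the joint above the solder melting point but below the waveguide damage temperature by modulating inductor power.

```python
def inductor_power(temp_C, solder_melt_C=220.0, waveguide_limit_C=250.0,
                   p_max_W=600.0, gain=25.0):
    """Proportional control sketch for induction soldering: drive the
    joint toward a setpoint midway between the solder melting point
    and the waveguide damage temperature, clamping the command to
    the actuator's range (all numeric values are illustrative)."""
    setpoint = (solder_melt_C + waveguide_limit_C) / 2.0   # 235 C
    p = gain * (setpoint - temp_C)
    return min(max(p, 0.0), p_max_W)

# Simulate heating a joint with a crude first-order thermal model
temp = 25.0
for _ in range(600):
    p = inductor_power(temp)
    temp += 0.05 * (0.02 * p - 0.01 * (temp - 25.0))
print(round(temp, 1))  # settles in the safe band between 220 and 250 C
```

A real controller would likely add integral action to remove the proportional droop below the setpoint, plus over-temperature interlocks; the clamp models the inductor's finite power.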

  15. Implementation of statistical process control for proteomic experiments via LC MS/MS.

    PubMed

    Bereman, Michael S; Johnson, Richard; Bollinger, James; Boss, Yuval; Shulman, Nick; MacLean, Brendan; Hoofnagle, Andrew N; MacCoss, Michael J

    2014-04-01

Statistical process control (SPC) is a robust set of tools that aids in the visualization, detection, and identification of assignable causes of variation in any process that creates products, services, or information. A tool termed Statistical Process Control in Proteomics (SProCoP) has been developed, which implements aspects of SPC (e.g., control charts and Pareto analysis) in the Skyline proteomics software. It monitors five quality control metrics in a shotgun or targeted proteomic workflow; none of these metrics require peptide identification. The source code, written in the R statistical language, runs directly from the Skyline interface, which supports the use of raw data files from several of the mass spectrometry vendors. It provides real-time evaluation of chromatographic performance (e.g., retention time reproducibility, peak asymmetry, and resolution) and mass spectrometric performance (targeted peptide ion intensity and mass measurement accuracy for high-resolving-power instruments) via control charts. Thresholds are experiment- and instrument-specific and are determined empirically from user-defined quality control standards, which enables the separation of random noise and systematic error. Finally, Pareto analysis provides a summary of performance metrics and guides the user to metrics with high variance. The utility of these charts for evaluating proteomic experiments is illustrated in two case studies.
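The empirical control-chart thresholds described above can be sketched as follows. This is a minimal Shewhart-style illustration in Python, not the actual SProCoP R code, and the retention-time values are invented: limits are derived from user-defined QC standard runs, and later observations falling outside them are flagged for troubleshooting.

```python
import statistics

def control_limits(qc_values, k=3.0):
    """Shewhart-style chart limits from QC standard runs:
    mean +/- k standard deviations, computed empirically so the
    thresholds are specific to this experiment and instrument."""
    mu = statistics.fmean(qc_values)
    sd = statistics.stdev(qc_values)
    return mu - k * sd, mu + k * sd

def out_of_control(series, limits):
    """Indices of observations outside the control limits,
    i.e. runs showing systematic error rather than random noise."""
    lo, hi = limits
    return [i for i, v in enumerate(series) if not lo <= v <= hi]

# Invented retention times (min) from QC standard injections
baseline = [10.1, 9.8, 10.0, 10.3, 9.9, 10.2, 10.0, 9.7]
limits = control_limits(baseline)
print(out_of_control([10.0, 10.1, 11.2, 9.9], limits))  # -> [2]
```

The same pattern applies to any of the five metrics (peak asymmetry, resolution, ion intensity, mass accuracy); a Pareto chart would then rank the metrics by how often they are flagged.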

  16. Statistical process control: a practical application for hospitals.

    PubMed

    VanderVeen, L M

    1992-01-01

    A six-step plan based on using statistics was designed to improve quality in the central processing and distribution department of a 223-bed hospital in Oakland, CA. This article describes how the plan was implemented sequentially, starting with the crucial first step of obtaining administrative support. The QI project succeeded in overcoming beginners' fear of statistics and in training both managers and staff to use inspection checklists, Pareto charts, cause-and-effect diagrams, and control charts. The best outcome of the program was the increased commitment to quality improvement by the members of the department.

  17. Quality-assurance plan for the analysis of fluvial sediment by the U.S. Geological Survey Kentucky Water Science Center Sediment Laboratory

    USGS Publications Warehouse

    Shreve, Elizabeth A.; Downs, Aimee C.

    2005-01-01

    This report describes laboratory procedures used by the U.S. Geological Survey Kentucky Water Science Center Sediment Laboratory for the processing and analysis of fluvial-sediment samples for concentration of sand and finer material. The report details the processing of a sediment sample through the laboratory from receiving the sediment sample, through the analytical process, to compiling results of the requested analysis. Procedures for preserving sample integrity, calibrating and maintaining of laboratory and field instruments and equipment, analyzing samples, internal quality assurance and quality control, and validity of the sediment-analysis results also are described. The report includes a list of references cited and a glossary of sediment and quality-assurance terms.

  18. Some Inspection Methods for Quality Control and In-service Inspection of GLARE

    NASA Astrophysics Data System (ADS)

    Sinke, J.

    2003-07-01

    Quality control of materials and structures is an important issue, also for GLARE. During the manufacturing stage the processes and materials should be monitored and checked frequently in order to obtain a qualified product. During the operation of the aircraft, frequent monitoring and inspections are performed to maintain the quality at a prescribed level. Therefore, in-service inspection methods are applied, and when necessary repair activities are conducted. For the quality control of the GLARE panels and components during manufacturing, the C-scan method proves to be an effective tool. For in-service inspection the Eddy Current Method is one of the suitable options. In this paper a brief overview is presented of both methods and their application on GLARE products.

  19. Development of risk-based air quality management strategies under impacts of climate change.

    PubMed

    Liao, Kuo-Jen; Amar, Praveen; Tagaris, Efthimios; Russell, Armistead G

    2012-05-01

Climate change is forecast to adversely affect air quality through perturbations in meteorological conditions, photochemical reactions, and precursor emissions. To protect the environment and human health from air pollution, there is increasing recognition of the need to develop effective air quality management strategies under the impacts of climate change. This paper presents a framework for developing risk-based air quality management strategies that can help policy makers improve their decision-making processes in response to current and future climate change, about 30-50 years from now. Development of air quality management strategies under the impacts of climate change is fundamentally a risk assessment and risk management process involving four steps: (1) assessment of the impacts of climate change and associated uncertainties; (2) determination of air quality targets; (3) selection of potential air quality management options; and (4) identification of preferred air quality management strategies that minimize control costs, maximize benefits, or limit the adverse effects of climate change on air quality when considering the scarcity of resources. The main challenge relates to the level of uncertainty associated with climate change forecasts and advancements in future control measures, since these will significantly affect the risk assessment results and the development of effective air quality management plans. The concept presented in this paper can help decision makers respond appropriately to climate change, since it provides an integrated approach to climate risk assessment and management when developing air quality management strategies. Development of climate-responsive air quality management strategies is fundamentally a risk assessment and risk management process. The risk assessment process includes quantification of climate change impacts on air quality and associated uncertainties. 
Risk management for air quality under the impacts of climate change includes determination of air quality targets, selection of potential management options, and identification of effective air quality management strategies through decision-making models. The risk-based decision-making framework can also be applied to develop climate-responsive management strategies for other environmental dimensions and to assess the costs and benefits of future environmental management policies.

  20. Implementation of hazard analysis and critical control point (HACCP) in dried anchovy production process

    NASA Astrophysics Data System (ADS)

    Citraresmi, A. D. P.; Wahyuni, E. E.

    2018-03-01

    The aim of this study was to examine the implementation of Hazard Analysis and Critical Control Point (HACCP) for identification and prevention of potential hazards in the production process of dried anchovy at PT. Kelola Mina Laut (KML), Lobuk unit, Sumenep. Cold storage is needed at each anchovy processing step in order to maintain the product's physical and chemical condition. In addition, a quality assurance system should be implemented to maintain product quality. The research was conducted using a survey method, following the whole anchovy production process from raw material receiving to final product packaging. Data were analysed descriptively. HACCP implementation at PT. KML, Lobuk unit, Sumenep was conducted by applying Prerequisite Programs (PRP) and a preparation stage consisting of the 5 initial steps and 7 principles of HACCP. The results showed that CCPs were found in the boiling step, with the significant hazard of Listeria monocytogenes, and in the final sorting step, with the significant hazard of foreign material contamination of the product. Corrective actions taken were controlling the boiling temperature at 100-105°C for 3-5 minutes and training the sorting-process employees.
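    The critical limits identified above (boiling at 100-105°C for 3-5 minutes) can be expressed as a simple CCP monitoring check. The function names and reading format below are illustrative, not part of the study.

```python
# Hypothetical CCP check for the boiling step described above.
# Critical limits from the abstract: 100-105 °C for 3-5 minutes.

def boiling_ccp_ok(temp_c: float, minutes: float) -> bool:
    """Return True if the boiling step stays within its critical limits."""
    return 100.0 <= temp_c <= 105.0 and 3.0 <= minutes <= 5.0

def corrective_action_needed(readings):
    """Flag any (temperature, time) reading that violates the critical limits."""
    return [r for r in readings if not boiling_ccp_ok(*r)]

readings = [(102.0, 4.0), (98.5, 4.0), (103.0, 6.0)]
print(corrective_action_needed(readings))  # the two out-of-limit readings
```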

  1. Improving the Quality of E-Learning: Lessons from the eMM

    ERIC Educational Resources Information Center

    Marshall, S.

    2012-01-01

    The quality of e-learning can be defined in many different ways, reflecting different stakeholders and the complexity of the systems and processes used in higher education. These different conceptions of quality can be mutually contradictory and, while politically significant, may also be beyond the direct control or influence of institutional…

  2. Using a statistical process control chart during the quality assessment of cancer registry data.

    PubMed

    Myles, Zachary M; German, Robert R; Wilson, Reda J; Wu, Manxia

    2011-01-01

    Statistical process control (SPC) charts may be used to detect acute variations in data while simultaneously evaluating unforeseen aberrations that may warrant further investigation by the data user. Using cancer stage data captured by the Summary Stage 2000 (SS2000) variable, we present a brief report highlighting the utility of the SPC chart during the quality assessment of cancer registry data. Using a county-level caseload for the diagnosis period 2001-2004 (n=25,648), we found the overall variation of the SS2000 variable to be in control during the diagnosis years 2001 and 2002, to exceed the lower control limit (LCL) in 2003, and to exceed the upper control limit (UCL) in 2004; in situ/localized stages were in control throughout the diagnosis period, regional stage exceeded the UCL in 2004, and distant stage exceeded the LCL in 2001 and the UCL in 2004. Our application of the SPC chart to cancer registry data illustrates that it may serve as a readily available and timely tool for identifying areas of concern during the data collection and quality assessment of central cancer registry data.
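    The control-chart logic described here can be illustrated with generic 3-sigma limits for proportions (a p-chart). The formulas below are textbook SPC, not the authors' exact computation, and the data are synthetic.

```python
# Generic 3-sigma p-chart limits, as commonly used in SPC;
# an illustrative sketch, not the registry's exact method.
import math

def p_chart_limits(counts, totals):
    """Center line and per-subgroup 3-sigma control limits for a proportion."""
    p_bar = sum(counts) / sum(totals)
    limits = []
    for n in totals:
        sigma = math.sqrt(p_bar * (1 - p_bar) / n)
        limits.append((max(0.0, p_bar - 3 * sigma), min(1.0, p_bar + 3 * sigma)))
    return p_bar, limits

def out_of_control(counts, totals):
    """Return indices of subgroups whose proportion falls outside its limits."""
    _, limits = p_chart_limits(counts, totals)
    flags = []
    for i, (c, n) in enumerate(zip(counts, totals)):
        lcl, ucl = limits[i]
        if not (lcl <= c / n <= ucl):
            flags.append(i)
    return flags

# Yearly counts of a stage category out of yearly caseloads (synthetic).
print(out_of_control([50, 52, 48, 90], [1000, 1000, 1000, 1000]))
```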

  3. ECG compression using non-recursive wavelet transform with quality control

    NASA Astrophysics Data System (ADS)

    Liu, Je-Hung; Hung, King-Chu; Wu, Tsung-Ching

    2016-09-01

    While wavelet-based electrocardiogram (ECG) data compression using scalar quantisation (SQ) yields excellent compression performance, a wavelet SQ scheme must select a set of multilevel quantisers for each quantisation process. Because quantisation is a many-to-one mapping, such a scheme is not conducive to reconstruction error control. To address this problem, this paper presents a single-variable-control SQ scheme able to guarantee the reconstruction quality of wavelet-based ECG data compression. Based on the reversible round-off non-recursive discrete periodised wavelet transform (RRO-NRDPWT), the SQ scheme is derived with a three-stage design process: the first stage uses a genetic algorithm (GA) to obtain a high compression ratio (CR), the second applies quadratic curve fitting for linear distortion control, and the third uses fuzzy decision-making to minimise data dependency effects and select the optimal SQ. Two databases, the Physikalisch-Technische Bundesanstalt (PTB) and Massachusetts Institute of Technology (MIT) arrhythmia databases, are used to evaluate quality control performance. Experimental results show that the design method guarantees a high-compression-performance SQ scheme with statistically linear distortion. This property can be independent of training data and can facilitate rapid error control.
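    The many-to-one mapping at the heart of the problem can be seen in a minimal uniform scalar quantiser; this sketch is generic and unrelated to the RRO-NRDPWT design itself. For midpoint rounding, the reconstruction error of each sample is bounded by half the step size, which is the kind of per-sample distortion control the paper seeks at the level of the whole coder.

```python
# Minimal uniform scalar quantiser: quantise() is many-to-one, so the
# original sample cannot be recovered exactly, only to within step / 2.
def quantise(signal, step):
    return [round(x / step) for x in signal]

def dequantise(indices, step):
    return [i * step for i in indices]

signal = [0.12, -0.49, 1.03, 0.77]
step = 0.25
recon = dequantise(quantise(signal, step), step)
errors = [abs(a - b) for a, b in zip(signal, recon)]
# Each per-sample error is bounded by step / 2.
assert all(e <= step / 2 for e in errors)
print(recon)
```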

  4. Welding and joining techniques.

    PubMed

    Chipperfield, F A; Dunkerton, S B

    2001-05-01

    There is a welding solution for most applications. As products must meet more stringent requirements or require more flexible processes to aid design or reduce cost, further improvements or totally new processes are likely to be developed. Quality control aspects are also becoming more important to meet regulation, and monitoring and control of welding processes and the standardised testing of joints will meet some if not all of these requirements.

  5. The drug regulatory and review process in Guyana.

    PubMed

    Woo-Ming, R B

    1993-01-01

    After the old "Sale of Food and Drugs" Ordinance, Cap. 144 was repealed, the new Food and Drugs Act was enacted in 1971. This Act has considerable flexibility and gives the Minister extensive authority to make Regulations for carrying out its purposes and provisions. The Act controls the manufacture, importation, sale, advertising, labeling, packaging, and distribution of drug samples, and the testing of drugs. The Act also controls raw materials and finished drug products at the point of entry into the country, with a single agency coordinating both the inspection and analytical services. Developing countries could ensure the procurement of safe, good-quality, and effective drugs and devices merely by enacting a similar Food and Drugs Act. Rapid assessment of drug safety, quality and efficacy is achieved through Guyana's participation in the WHO Certification Scheme on the Quality of Pharmaceutical Products moving in International Commerce. This certification scheme is highly commendable, especially for third-world countries. The Food and Drug Regulations (1977) have several unique features for drug, cosmetic and device control, and they allow for a system of centralized control with limited staff to enforce the legislation. In summary, enforcement of legislative control of imported pharmaceuticals and product evaluation can be considered strong points in the drug regulatory and review process in Guyana. A cautious attitude is observed so as to ensure the efficacy, safety, and quality of drugs entering the market. This drug regulatory and review process is recommended for implementation by third-world countries with outdated drug legislation. (ABSTRACT TRUNCATED AT 250 WORDS)

  6. The relationship between context, structure, and processes with outcomes of 6 regional diabetes networks in Europe.

    PubMed

    Mahdavi, Mahdi; Vissers, Jan; Elkhuizen, Sylvia; van Dijk, Mattees; Vanhala, Antero; Karampli, Eleftheria; Faubel, Raquel; Forte, Paul; Coroian, Elena; van de Klundert, Joris

    2018-01-01

    While health service provisioning for the chronic condition Type 2 Diabetes (T2D) often involves a network of organisations and professionals, most evidence on the relationships between the structures and processes of service provisioning and the outcomes considers single organisations or solo practitioners. Extending Donabedian's Structure-Process-Outcome (SPO) model, we investigate how differences in quality of life, effective coverage of diabetes, and service satisfaction are associated with differences in the structures, processes, and context of T2D services in six regions in Finland, Germany, Greece, the Netherlands, Spain, and the UK. Data collection consisted of: a) systematic modelling of provider networks' structures and processes, and b) a cross-sectional survey of patient-reported outcomes and other information. The survey yielded data from 1459 T2D patients during 2011-2012. Stepwise linear regression models were used to identify how the cumulative proportion of variance in quality of life and service satisfaction relates to differences in context, structure and process. The selected context, structure and process variables are based on Donabedian's SPO model, a service quality research instrument (SERVQUAL), and previous organisation- and professional-level evidence. Additional analysis explores a possible bidirectional relation between outcomes and processes. The regression models explain 44% of the variance in service satisfaction, mostly through structure and process variables (such as human resource use and the SERVQUAL dimensions). The models explained 23% of the variance in quality of life between the networks, much of which is related to contextual variables. Our results suggest that effectiveness of A1c control is negatively correlated with process variables such as total hours of care provided per year and cost of services per year.
While the selected structure and process variables explain much of the variance in service satisfaction, this is less the case for quality of life. Moreover, it appears that the effect of the clinical outcome A1c control on processes is stronger than the other way around, as poorer control seems to relate to more service use and higher cost. The standardized operational models used in this research form a basis for expanding the network-level evidence base for effective T2D service provisioning.
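    The stepwise approach above can be sketched as a simple forward-selection loop: at each step, add the predictor that most increases R-squared, stopping when the gain is negligible. This is a generic illustration on synthetic data, not the study's model or variables.

```python
# Forward stepwise selection by R^2 gain (illustrative sketch).
import numpy as np

def r_squared(X, y):
    """R^2 of an ordinary least-squares fit with an intercept."""
    A = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ beta
    ss_res = float(resid @ resid)
    ss_tot = float(((y - y.mean()) ** 2).sum())
    return 1.0 - ss_res / ss_tot

def forward_select(X, y, min_gain=0.01):
    """Greedily add columns of X while R^2 improves by at least min_gain."""
    chosen, best = [], 0.0
    while len(chosen) < X.shape[1]:
        gains = {j: r_squared(X[:, chosen + [j]], y) - best
                 for j in range(X.shape[1]) if j not in chosen}
        j_best = max(gains, key=gains.get)
        if gains[j_best] < min_gain:
            break
        chosen.append(j_best)
        best += gains[j_best]
    return chosen, best
```

    With synthetic data where only two of three predictors matter, the loop selects exactly those two and stops, mirroring how stepwise regression reports a cumulative proportion of explained variance.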

  7. The relationship between context, structure, and processes with outcomes of 6 regional diabetes networks in Europe

    PubMed Central

    Elkhuizen, Sylvia; van Dijk, Mattees; Vanhala, Antero; Karampli, Eleftheria; Faubel, Raquel; Forte, Paul; Coroian, Elena

    2018-01-01

    Background While health service provisioning for the chronic condition Type 2 Diabetes (T2D) often involves a network of organisations and professionals, most evidence on the relationships between the structures and processes of service provisioning and the outcomes considers single organisations or solo practitioners. Extending Donabedian’s Structure-Process-Outcome (SPO) model, we investigate how differences in quality of life, effective coverage of diabetes, and service satisfaction are associated with differences in the structures, processes, and context of T2D services in six regions in Finland, Germany, Greece, the Netherlands, Spain, and the UK. Methods Data collection consisted of: a) systematic modelling of provider networks’ structures and processes, and b) a cross-sectional survey of patient-reported outcomes and other information. The survey yielded data from 1459 T2D patients during 2011–2012. Stepwise linear regression models were used to identify how the cumulative proportion of variance in quality of life and service satisfaction relates to differences in context, structure and process. The selected context, structure and process variables are based on Donabedian’s SPO model, a service quality research instrument (SERVQUAL), and previous organisation- and professional-level evidence. Additional analysis explores a possible bidirectional relation between outcomes and processes. Results The regression models explain 44% of the variance in service satisfaction, mostly through structure and process variables (such as human resource use and the SERVQUAL dimensions). The models explained 23% of the variance in quality of life between the networks, much of which is related to contextual variables. Our results suggest that effectiveness of A1c control is negatively correlated with process variables such as total hours of care provided per year and cost of services per year.
Conclusions While the selected structure and process variables explain much of the variance in service satisfaction, this is less the case for quality of life. Moreover, it appears that the effect of the clinical outcome A1c control on processes is stronger than the other way around, as poorer control seems to relate to more service use and higher cost. The standardized operational models used in this research form a basis for expanding the network-level evidence base for effective T2D service provisioning. PMID:29447220

  8. Economic-Oriented Stochastic Optimization in Advanced Process Control of Chemical Processes

    PubMed Central

    Dobos, László; Király, András; Abonyi, János

    2012-01-01

    Finding the optimal operating region of chemical processes is an inevitable step toward improving economic performance. Usually the optimal operating region is situated close to process constraints related to product quality or process safety requirements. Higher profit can be realized only by assuring a relatively low frequency of violation of these constraints. A multilevel stochastic optimization framework is proposed to determine the optimal setpoint values of control loops with respect to predetermined risk levels, uncertainties, and costs of violation of process constraints. The proposed framework is realized as direct search-type optimization of Monte-Carlo simulation of the controlled process. The concept is illustrated throughout by a well-known benchmark problem related to the control of a linear dynamical system and the model predictive control of a more complex nonlinear polymerization process. PMID:23213298
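    The Monte-Carlo-based direct search described above can be sketched as follows. The toy process model (setpoint plus Gaussian disturbance), the constraint limit, and the risk level are all illustrative assumptions, not the benchmark problems from the paper.

```python
# Direct search over Monte-Carlo simulations: pick the highest setpoint
# (a stand-in for profit) whose constraint-violation probability stays
# within a predetermined risk level. All numbers are illustrative.
import random

def violation_probability(setpoint, limit=1.0, noise=0.1, n=5000, seed=42):
    """Estimate P(output > limit) when output = setpoint + Gaussian noise."""
    rng = random.Random(seed)
    hits = sum(1 for _ in range(n) if setpoint + rng.gauss(0.0, noise) > limit)
    return hits / n

def best_setpoint(candidates, risk_level=0.05):
    """Return the highest candidate setpoint within the allowed risk level."""
    feasible = [s for s in candidates if violation_probability(s) <= risk_level]
    return max(feasible) if feasible else None

candidates = [0.70, 0.75, 0.80, 0.85, 0.90, 0.95]
print(best_setpoint(candidates))
```

    The pattern matches the abstract's point: profit pushes the setpoint toward the constraint, while the Monte-Carlo risk estimate caps how close it may get.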

  9. 40 CFR 63.10001 - Affirmative defense for exceedence of emission limit during malfunction.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... unavoidable failure of air pollution control and monitoring equipment, process equipment, or a process to..., proper design or better operation and maintenance practices; and (iii) Did not stem from any activity or... ambient air quality, the environment and human health; and (6) All emissions monitoring and control...

  10. Using standard treatment protocols to manage costs and quality of hospital services.

    PubMed

    Meyer, J W; Feingold, M G

    1993-06-01

    The current health care environment has made it critically important that hospital costs and quality be managed in an integrated fashion. Promised health care reforms are expected to make cost reduction and quality enhancement only more important. Traditional methods of hospital cost and quality control have largely been replaced by such approaches as practice parameters, outcomes measurement, clinical indicators, clinical paths, benchmarking, patient-centered care, and a focus on patient selection criteria. This Special Report describes an integrated process for strategically managing costs and quality simultaneously, incorporating key elements of many important new quality and cost control tools. By using a multidisciplinary group process to develop standard treatment protocols (STPs), hospitals and their medical staffs address the most important services provided within major product lines. Using both clinical and financial data, groups of physicians, nurses, department managers, financial analysts, and administrators redesign key patterns of care within their hospital, incorporating the best practices of their own and other institutions. The outcome of this process is a new, standardized set of clinical guidelines that reduce unnecessary variation in care, eliminate redundant interventions, establish clear lines of communication for all caregivers, and reduce the cost of each stay. The hospital, medical staff, and patients benefit from the improved opportunities for managed care contracting, more efficient hospital systems, consensus-based quality measures, and reductions in the cost of care. STPs offer a workable and worthwhile approach to positioning the hospital of the 1990s for operational efficiency and cost and quality competitiveness.

  11. Mitochondrial quality control and communications with the nucleus are important in maintaining mitochondrial function and cell health

    PubMed Central

    Kotiadis, Vassilios N.; Duchen, Michael R.; Osellame, Laura D.

    2014-01-01

    Background The maintenance of cell metabolism and homeostasis is a fundamental characteristic of living organisms. In eukaryotes, mitochondria are the cornerstone of these life supporting processes, playing leading roles in a host of core cellular functions, including energy transduction, metabolic and calcium signalling, and supporting roles in a number of biosynthetic pathways. The possession of a discrete mitochondrial genome dictates that the maintenance of mitochondrial ‘fitness’ requires quality control mechanisms which involve close communication with the nucleus. Scope of review This review explores the synergistic mechanisms that control mitochondrial quality and function and ensure cellular bioenergetic homeostasis. These include antioxidant defence mechanisms that protect against oxidative damage caused by reactive oxygen species, while regulating signals transduced through such free radicals. Protein homeostasis controls import, folding, and degradation of proteins underpinned by mechanisms that regulate bioenergetic capacity through the mitochondrial unfolded protein response. Autophagic machinery is recruited for mitochondrial turnover through the process of mitophagy. Mitochondria also communicate with the nucleus to exact specific transcriptional responses through retrograde signalling pathways. Major conclusions The outcome of mitochondrial quality control is not only reliant on the efficient operation of the core homeostatic mechanisms but also in the effective interaction of mitochondria with other cellular components, namely the nucleus. General significance Understanding mitochondrial quality control and the interactions between the organelle and the nucleus will be crucial in developing therapies for the plethora of diseases in which the pathophysiology is determined by mitochondrial dysfunction. This article is part of a Special Issue entitled Frontiers of Mitochondrial Research. PMID:24211250

  12. Manufacturing Bms/Iso System Review

    NASA Technical Reports Server (NTRS)

    Gomez, Yazmin

    2004-01-01

    The Quality Management System (QMS) is one that recognizes the need to continuously change and improve an organization's products and services as determined by system feedback and corresponding management decisions. The purpose of a Quality Management System is to minimize quality variability in an organization's products and services. The optimal Quality Management System balances the need for an organization to maintain flexibility in the products and services it provides with the need to exercise an appropriate level of discipline and control over the processes used to provide them. The goal of a Quality Management System is to ensure the quality of products and services while consistently meeting or exceeding customer expectations. The GRC Business Management System (BMS) is the foundation of the Center's ISO 9001:2000 registered quality system. ISO 9001 is a quality system model developed by the International Organization for Standardization. The BMS supports and promotes the Glenn Research Center Quality Policy, helping to ensure customer satisfaction while also meeting quality standards. My assignment this summer is to examine the manufacturing processes used to develop research hardware, which in most cases is one-of-a-kind hardware made with nonconventional equipment and materials. Based on my observations of the hardware development processes, I will determine the best way to meet customer requirements while achieving GRC quality standards. The purpose of my task is to review the manufacturing processes, identify opportunities to optimize their efficiency, and establish a plan for implementation and continuous improvement.

  13. GCE Data Toolbox for MATLAB - a software framework for automating environmental data processing, quality control and documentation

    NASA Astrophysics Data System (ADS)

    Sheldon, W.; Chamblee, J.; Cary, R. H.

    2013-12-01

    Environmental scientists are under increasing pressure from funding agencies and journal publishers to release quality-controlled data in a timely manner, as well as to produce comprehensive metadata for submitting data to long-term archives (e.g. DataONE, Dryad and BCO-DMO). At the same time, the volume of digital data that researchers collect and manage is increasing rapidly due to advances in high frequency electronic data collection from flux towers, instrumented moorings and sensor networks. However, few pre-built software tools are available to meet these data management needs, and those tools that do exist typically focus on part of the data management lifecycle or one class of data. The GCE Data Toolbox has proven to be both a generalized and effective software solution for environmental data management in the Long Term Ecological Research Network (LTER). This open source MATLAB software library, developed by the Georgia Coastal Ecosystems LTER program, integrates metadata capture, creation and management with data processing, quality control and analysis to support the entire data lifecycle. Raw data can be imported directly from common data logger formats (e.g. SeaBird, Campbell Scientific, YSI, Hobo), as well as delimited text files, MATLAB files and relational database queries. Basic metadata are derived from the data source itself (e.g. parsed from file headers) and by value inspection, and then augmented using editable metadata templates containing boilerplate documentation, attribute descriptors, code definitions and quality control rules. Data and metadata content, quality control rules and qualifier flags are then managed together in a robust data structure that supports database functionality and ensures data validity throughout processing. 
A growing suite of metadata-aware editing, quality control, analysis and synthesis tools is provided with the software to support managing data using graphical forms and command-line functions, as well as developing automated workflows for unattended processing. Finalized data and structured metadata can be exported in a wide variety of text and MATLAB formats or uploaded to a relational database for long-term archiving and distribution. The GCE Data Toolbox can be used as a complete, light-weight solution for environmental data and metadata management, but it can also be used in conjunction with other cyber infrastructure to provide a more comprehensive solution. For example, newly acquired data can be retrieved from a Data Turbine or Campbell LoggerNet Database server for quality control and processing, then transformed to CUAHSI Observations Data Model format and uploaded to a HydroServer for distribution through the CUAHSI Hydrologic Information System. The GCE Data Toolbox can also be leveraged in analytical workflows developed using Kepler or other systems that support MATLAB integration or tool chaining. This software can therefore be leveraged in many ways to help researchers manage, analyze and distribute the data they collect.
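    The metadata-driven quality control rules described above can be sketched as follows. The toolbox itself is MATLAB-based, so this Python fragment only illustrates the idea of template-defined range rules producing qualifier flags; the rule fields, flag codes, and data are assumptions.

```python
# Illustrative metadata-driven QC: each rule comes from a template and
# assigns a qualifier flag to values that fail it ('' means pass).
def apply_qc_rules(values, rules):
    """Return one flag string per value from simple range rules."""
    flags = []
    for v in values:
        flag = ""
        for rule in rules:
            if rule["kind"] == "range" and not (rule["min"] <= v <= rule["max"]):
                flag = rule["flag"]
                break
        flags.append(flag)
    return flags

# Water temperature readings with a sensor spike, checked against one rule.
temps = [18.2, 18.4, 55.0, 18.3]
rules = [{"kind": "range", "min": -5.0, "max": 45.0, "flag": "Q"}]
print(apply_qc_rules(temps, rules))  # spike flagged 'Q'
```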

  14. 42 CFR 431.806 - State plan requirements.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... processing assessment system. Except in a State that has an approved Medicaid Management Information System... Medicaid quality control claims processing assessment system that meets the requirements of §§ 431.830...

  15. Quality Control Analysis of Selected Aspects of Programs Administered by the Bureau of Student Financial Assistance. Task 1 and Quality Control Sample; Error-Prone Modeling Analysis Plan.

    ERIC Educational Resources Information Center

    Saavedra, Pedro; And Others

    Parameters and procedures for developing an error-prone model (EPM) to predict financial aid applicants who are likely to misreport on Basic Educational Opportunity Grant (BEOG) applications are introduced. Specifications to adapt these general parameters to secondary data analysis of the Validation, Edits, and Applications Processing Systems…

  16. The Automation of Nowcast Model Assessment Processes

    DTIC Science & Technology

    2016-09-01

    that will automate real-time WRE-N model simulations, collect and quality-control check weather observations for assimilation and verification, and...domains centered near White Sands Missile Range, New Mexico, where the Meteorological Sensor Array (MSA) will be located. The MSA will provide...observations and performing quality-control checks for the pre-forecast data assimilation period. 2. Run the WRE-N model to generate model forecast data

  17. [Application of bioassay in quality control of Chinese materia medica-taking Radix Isatidis as an example].

    PubMed

    Yan, Dan; Ren, Yongshen; Luo, Jiaoyang; Li, Hanbing; Feng, Xue; Xiao, Xiaohe

    2010-10-01

    Bioassay, which constructs quality characteristics consistent with Chinese medical science, is the core mode and method for the quality control of Chinese materia medica. Taking the bioassay of Radix Isatidis as an example, this article introduces the contribution, status and application of bioassay in the quality control of Chinese materia medica, and explains two key issues (the selection of the reference substance and of the measurement method) in the process of establishing a bioassay. The article is intended to provide a practical reference for the development and improvement of bioassays for Chinese materia medica.

  18. Zinc thermal spray coatings for reinforced concrete: An AWS process standard

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sulit, R.A.

    Zinc and aluminum thermal spray coatings (TSC) have been used for lining concrete weirs in Great Britain since the 1950s to maintain the dimensions of the weir for flow control while reducing wear and erosion of the concrete surfaces. This paper reports the development and content of the ANSI/AWS C2.20-XX standard for the application of Zn TSC on concrete using flame and arc spray processes. The standard is formatted as an industrial process instruction: job description; safety; feedstock materials; equipment; a step-by-step method for surface preparation and thermal spraying; quality control; repair and maintenance of Zn TSC on concrete; and a Job Control Record. Job planning, training and certification requirements are presented for Zn TSC inspectors and thermal spray operators. Four annexes are included in the standard: (a) historical summary of Zn TSC on concrete; (b) sample job control record; (c) thermal spray operator qualification; and (d) portable adhesion testing for Zn TSC on concrete. The standard is based on the current literature and industrial equipment, processes, and practices.

  19. Automating PACS quality control with the Vanderbilt image processing enterprise resource

    NASA Astrophysics Data System (ADS)

    Esparza, Michael L.; Welch, E. Brian; Landman, Bennett A.

    2012-02-01

    Precise image acquisition is an integral part of modern patient care and medical imaging research. Periodic quality control using standardized protocols and phantoms ensures that scanners are operating according to specifications, yet such procedures do not ensure that individual datasets are free from corruption, for example due to patient motion, transient interference, or physiological variability. If unacceptable artifacts are noticed during scanning, a technologist can repeat a procedure. Yet substantial delays may be incurred if a problematic scan is not noticed until a radiologist reads the scans or an automated algorithm fails. Given the scores of slices in typical three-dimensional scans and the wide variety of potential use cases, a technologist cannot practically be expected to inspect all images. In large-scale research, automated pipeline systems have had great success in achieving high throughput. However, clinical and institutional workflows are largely based on DICOM and PACS technologies; these systems are not readily compatible with research systems due to security and privacy restrictions. Hence, quantitative quality control has been relegated to individual investigators and too often neglected. Herein, we propose a scalable system, the Vanderbilt Image Processing Enterprise Resource (VIPER), to integrate modular quality control and image analysis routines with a standard PACS configuration. This server unifies image processing routines at an institutional level and provides a simple interface so that investigators can collaborate to deploy new analysis technologies. VIPER integrates with high performance computing environments and has successfully analyzed all standard scans from our institutional research center over the course of the last 18 months.

  20. Interpreting the handling qualities of aircraft with stability and control augmentation

    NASA Technical Reports Server (NTRS)

    Hodgkinson, J.; Potsdam, E. H.; Smith, R. E.

    1990-01-01

    The general process of designing an aircraft for good flying qualities is first discussed. Lessons learned are pointed out, with piloted evaluation emerging as a crucial element. Two sources of rating variability in performing these evaluations are then discussed. First, the finite endpoints of the Cooper-Harper scale do not bias parametric statistical analyses unduly. Second, the wording of the scale does introduce some scatter. Phase lags generated by augmentation systems, as represented by equivalent time delays, often cause poor flying qualities. An analysis is introduced which allows a designer to relate any level of time delay to a probability of loss of aircraft control. This view of time delays should, it is hoped, allow better visibility of the time delays in the design process.

  1. [Study of continuous quality improvement for clinical laboratory processes via the platform of Hospital Group].

    PubMed

    Song, Wenqi; Shen, Ying; Peng, Xiaoxia; Tian, Jian; Wang, Hui; Xu, Lili; Nie, Xiaolu; Ni, Xin

    2015-05-26

    The program of continuous quality improvement in clinical laboratory processes for complete blood count (CBC) was launched via the platform of Beijing Children's Hospital Group in order to improve the quality of pediatric clinical laboratories. Fifteen children's hospitals of Beijing Children's Hospital group were investigated using the method of Chinese adapted continuous quality improvement with PDCA (Plan-Do-Check-Action). The questionnaire survey and inter-laboratory comparison was conducted to find the existing problems, to analyze reasons, to set forth quality targets and to put them into practice. Then, targeted training was conducted to 15 children's hospitals and the second questionnaire survey, self examinations by the clinical laboratories was performed. At the same time, the Group's online internal quality control platform was established. Overall effects of the program were evaluated so that lay a foundation for the next stage of PDCA. Both quality of control system documents and CBC internal quality control scheme for all of clinical laboratories were improved through this program. In addition, standardization of performance verification was also improved, especially with the comparable verification rate of precision and internal laboratory results up to 100%. In terms of instrument calibration and mandatory diagnostic rates, only three out of the 15 hospitals (20%) failed to pass muster in 2014 from 46.67% (seven out of the 15 hospitals) in 2013. The abnormal data of intraday precision variance coefficients of the five CBC indicator parameters (WBC, RBC, Hb, Plt and Hct) of all the 15 laboratories accounted for 1.2% (2/165) in 2014, a marked decrease from 9.6% (14/145) in 2013. While the number of the hospitals using only one horizontal quality control object for daily quality control has dropped to three from five. The 15 hospitals organized a total of 263 times of training in 2014 from 101 times in 2013, up 160%. 
The quality improvement program for clinical laboratories launched via the Hospital Group platform produced remarkable improvements and can promote the joint development of the pediatric clinical laboratory discipline across all member hospitals; the experience is recommended for wider rollout.
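
    The intraday precision figures above are coefficients of variation (CV). How a laboratory might compute and flag them can be sketched as follows (the replicate values and the 5% acceptance limit are illustrative assumptions, not values from the study):

    ```python
    from statistics import mean, stdev

    def cv_percent(values):
        """Coefficient of variation in percent: 100 * SD / mean."""
        return 100.0 * stdev(values) / mean(values)

    # Illustrative intraday QC replicates for two CBC parameters (made-up data).
    runs = {
        "WBC": [6.1, 6.0, 6.2, 6.1, 5.9],   # 10^9/L
        "Hb":  [118, 131, 112, 125, 139],   # g/L, deliberately noisy
    }

    CV_LIMIT = 5.0  # hypothetical acceptance limit, percent

    flags = {name: cv_percent(vals) > CV_LIMIT for name, vals in runs.items()}
    for name, vals in runs.items():
        print(f"{name}: CV = {cv_percent(vals):.2f}% -> {'FAIL' if flags[name] else 'PASS'}")
    ```

    In practice each laboratory would track such CVs per analyzer and per QC level; here the point is only the arithmetic behind "abnormal precision data".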

  2. Developing a workstation-based, real-time simulation for rapid handling qualities evaluations during design

    NASA Technical Reports Server (NTRS)

    Anderson, Frederick; Biezad, Daniel J.

    1994-01-01

    This paper describes the Rapid Aircraft DynamIcs AssessmeNt (RADIAN) project, an integration of the AirCraft SYNThesis (ACSYNT) design code with the USAF DATCOM code, which estimates stability derivatives. Both of these codes are available to universities. These programs are linked to flight simulation and flight controller synthesis tools, and the resulting design is evaluated on a graphics workstation. The entire process reduces the preliminary design time by an order of magnitude and provides an initial handling qualities evaluation of the design coupled to a control law. The integrated design process is applicable both to conventional aircraft taken from current textbooks and to unconventional designs emphasizing agility and propulsive control of attitude. The interactive and concurrent nature of the design process has been well received by industry and by design engineers at NASA. The process is being implemented in the design curriculum and is being used by students, who view it as a significant advance over prior methods.

  3. Quality status display for a vibration welding process

    DOEpatents

    Spicer, John Patrick; Abell, Jeffrey A.; Wincek, Michael Anthony; Chakraborty, Debejyo; Bracey, Jennifer; Wang, Hui; Tavora, Peter W.; Davis, Jeffrey S.; Hutchinson, Daniel C.; Reardon, Ronald L.; Utz, Shawn

    2017-03-28

    A system includes a host machine and a status projector. The host machine is in electrical communication with a collection of sensors and with a welding controller that generates control signals for controlling a welding horn. The host machine is configured to execute a method to process the sensory and control signals and to predict the quality status of a weld formed using the welding horn, including identifying any suspect welds. The host machine then activates the status projector to illuminate the suspect welds, either directly on the welds using a laser projector or on a surface of the workpiece in proximity to the welds. In a particular embodiment, the system and method may be used in the ultrasonic welding of battery tabs of a multi-cell battery pack. The welding horn and welding controller may also be part of the system.

  4. Water quality management library. 2. edition

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eckenfelder, W.W.; Malina, J.F.; Patterson, J.W.

    1998-12-31

    A series of ten books offered in conjunction with Water Quality International, the Biennial Conference and Exposition of the International Association on Water Pollution Research and Control (IAWPRC). Volume 1, Activated Sludge Process, Design and Control, 2nd edition, 1998; Volume 2, Upgrading Wastewater Treatment Plants, 2nd edition, 1998; Volume 3, Toxicity Reduction, 2nd edition, 1998; Volume 4, Municipal Sewage Sludge Management, 2nd edition, 1998; Volume 5, Design and Retrofit of Wastewater Treatment Plants for Biological Nutrient Removal, 1st edition, 1992; Volume 6, Dynamics and Control of the Activated Sludge Process, 2nd edition, 1998; Volume 7, Design of Anaerobic Processes for the Treatment of Industrial and Municipal Wastes, 1st edition, 1992; Volume 8, Groundwater Remediation, 1st edition, 1992; Volume 9, Nonpoint Pollution and Urban Stormwater Management, 1st edition, 1995; Volume 10, Wastewater Reclamation and Reuse, 1st edition, 1998.

  5. Topographic and hydrographic survey data for the São Francisco River near Torrinha, Bahia, Brazil, 2014

    USGS Publications Warehouse

    Fosness, Ryan L.; Dietsch, Benjamin J.

    2015-10-21

    This report presents the surveying techniques and data-processing methods used to collect, process, and disseminate topographic and hydrographic data. All standard and non-standard data-collection and data-processing methods and techniques are documented. Additional discussion describes the quality-assurance and quality-control elements used in this study, along with the limitations of the data for the Torrinha-Itacoatiara study reach. The topographic and hydrographic geospatial data are published along with associated metadata.

  6. The Brazilian Air Force Uniform Distribution Process: Using Lean Thinking, Statistical Process Control and Theory of Constraints to Address Improvement Opportunities

    DTIC Science & Technology

    2015-03-26

    ...universal definition” (Evans & Lindsay, 1996). Heizer and Render (2010) argue that several definitions of this term are user-based, meaning that quality... “for example, really good ice cream has high butterfat levels” (Heizer & Render, 2010). Garvin, in his Competing in Eight Dimensions of Quality... (Montgomery, 2005). As for definition purposes, the concept adopted by this research was provided by Heizer and Render (2010), for whom Statistical Process...

  7. 10 CFR 72.158 - Control of special processes.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... NUCLEAR FUEL, HIGH-LEVEL RADIOACTIVE WASTE, AND REACTOR-RELATED GREATER THAN CLASS C WASTE Quality..., and applicant for a CoC shall establish measures to ensure that special processes, including welding...

  8. 10 CFR 72.158 - Control of special processes.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... NUCLEAR FUEL, HIGH-LEVEL RADIOACTIVE WASTE, AND REACTOR-RELATED GREATER THAN CLASS C WASTE Quality..., and applicant for a CoC shall establish measures to ensure that special processes, including welding...

  9. 10 CFR 72.158 - Control of special processes.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... NUCLEAR FUEL, HIGH-LEVEL RADIOACTIVE WASTE, AND REACTOR-RELATED GREATER THAN CLASS C WASTE Quality..., and applicant for a CoC shall establish measures to ensure that special processes, including welding...

  10. 10 CFR 72.158 - Control of special processes.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... NUCLEAR FUEL, HIGH-LEVEL RADIOACTIVE WASTE, AND REACTOR-RELATED GREATER THAN CLASS C WASTE Quality..., and applicant for a CoC shall establish measures to ensure that special processes, including welding...

  11. Machine Learning: A Crucial Tool for Sensor Design

    PubMed Central

    Zhao, Weixiang; Bhushan, Abhinav; Santamaria, Anthony D.; Simon, Melinda G.; Davis, Cristina E.

    2009-01-01

    Sensors have been widely used for disease diagnosis, environmental quality monitoring, food quality control, industrial process analysis and control, and other related fields. As a key tool for sensor data analysis, machine learning is becoming a core part of novel sensor design. This paper divides a complete machine learning process into three steps: data pre-treatment; feature extraction and dimension reduction; and system modeling. For each step, it reviews the widely used methods and discusses the principles and the key issues that affect modeling results. After reviewing potential problems in machine learning processes, the paper summarizes current algorithms in this field and provides some feasible directions for future studies. PMID:20191110
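
    The three-step decomposition described in the abstract maps naturally onto a modeling pipeline. A minimal sketch using scikit-learn on synthetic sensor data (the data, component count, and classifier choice are illustrative assumptions, not taken from the review):

    ```python
    import numpy as np
    from sklearn.pipeline import Pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.decomposition import PCA
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    # Synthetic "sensor array": 50 noisy channels driven by 5 latent process variables.
    Z = rng.normal(size=(200, 5))                 # latent variables
    W = rng.normal(size=(5, 50))                  # channel loadings
    X = Z @ W + 0.1 * rng.normal(size=(200, 50))  # observed channels
    y = (Z[:, 0] > 0).astype(int)                 # class depends on one latent factor

    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    model = Pipeline([
        ("pretreat", StandardScaler()),      # step 1: data pre-treatment
        ("reduce", PCA(n_components=5)),     # step 2: dimension reduction
        ("classify", LogisticRegression()),  # step 3: system modeling
    ])
    model.fit(X_train, y_train)
    acc = model.score(X_test, y_test)
    print(f"held-out accuracy: {acc:.2f}")
    ```

    Because the channels are low-rank plus noise, PCA recovers the latent subspace and the classifier works on a few informative components rather than 50 correlated ones, which is the usual motivation for the dimension-reduction step.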

  12. Recent National Transonic Facility Test Process Improvements (Invited)

    NASA Technical Reports Server (NTRS)

    Kilgore, W. A.; Balakrishna, S.; Bobbitt, C. W., Jr.; Adcock, J. B.

    2001-01-01

    This paper describes the results of two recent process improvements; drag feed-forward Mach number control and simultaneous force/moment and pressure testing, at the National Transonic Facility. These improvements have reduced the duration and cost of testing. The drag feed-forward Mach number control reduces the Mach number settling time by using measured model drag in the Mach number control algorithm. Simultaneous force/moment and pressure testing allows simultaneous collection of force/moment and pressure data without sacrificing data quality thereby reducing the overall testing time. Both improvements can be implemented at any wind tunnel. Additionally the NTF is working to develop and implement continuous pitch as a testing option as an additional method to reduce costs and maintain data quality.
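
    The drag feed-forward idea can be illustrated with a toy loop: the measured model drag enters the controller as a feed-forward term that cancels the disturbance before it shows up as Mach error, instead of waiting for feedback to react. A deliberately simplified sketch (first-order tunnel dynamics and all gains are invented; this is not the NTF algorithm):

    ```python
    def simulate(feedforward, steps=400, dt=0.05):
        """Toy tunnel: dM/dt = -a*M + b*u - c*drag. Returns peak |Mach error|
        after a step change in model drag (e.g. a pitch sweep)."""
        a, b, c = 1.0, 1.0, 0.5
        M_set, M = 0.80, 0.80
        integ, peak = 0.0, 0.0
        for k in range(steps):
            drag = 1.0 if k > 50 else 0.0        # drag step disturbance
            err = M_set - M
            if k > 50:
                peak = max(peak, abs(err))
            integ += err * dt
            u = 2.0 * err + 1.0 * integ          # PI feedback on Mach error
            u += (a / b) * M_set                 # steady-state hold term
            if feedforward:
                u += (c / b) * drag              # cancel measured drag directly
            M += dt * (-a * M + b * u - c * drag)
        return peak

    e_ff = simulate(feedforward=True)
    e_fb = simulate(feedforward=False)
    print(f"peak |Mach error|: feed-forward {e_ff:.4f} vs feedback-only {e_fb:.4f}")
    ```

    With an exact feed-forward term the Mach number never leaves the set point, while pure feedback must first see an error before correcting it; the same effect is what shortens Mach settling time in the tunnel.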

  14. Case Studies in Modelling, Control in Food Processes.

    PubMed

    Glassey, J; Barone, A; Montague, G A; Sabou, V

    This chapter discusses the importance of modelling and control in increasing food process efficiency and ensuring product quality. Various approaches to both modelling and control in food processing are set in the context of the specific challenges in this industrial sector and latest developments in each area are discussed. Three industrial case studies are used to demonstrate the benefits of advanced measurement, modelling and control in food processes. The first case study illustrates the use of knowledge elicitation from expert operators in the process for the manufacture of potato chips (French fries) and the consequent improvements in process control to increase the consistency of the resulting product. The second case study highlights the economic benefits of tighter control of an important process parameter, moisture content, in potato crisp (chips) manufacture. The final case study describes the use of NIR spectroscopy in ensuring effective mixing of dry multicomponent mixtures and pastes. Practical implementation tips and infrastructure requirements are also discussed.

  15. Control Systems Engineering in Continuous Pharmaceutical Manufacturing May 20-21, 2014 Continuous Manufacturing Symposium.

    PubMed

    Myerson, Allan S; Krumme, Markus; Nasr, Moheb; Thomas, Hayden; Braatz, Richard D

    2015-03-01

    This white paper provides a perspective of the challenges, research needs, and future directions for control systems engineering in continuous pharmaceutical processing. The main motivation for writing this paper is to facilitate the development and deployment of control systems technologies so as to ensure quality of the drug product. Although the main focus is on small-molecule pharmaceutical products, most of the same statements apply to biological drug products. An introduction to continuous manufacturing and control systems is followed by a discussion of the current status and technical needs in process monitoring and control, systems integration, and risk analysis. Some key points are that: (1) the desired objective in continuous manufacturing should be the satisfaction of all critical quality attributes (CQAs), not for all variables to operate at steady-state values; (2) the design of start-up and shutdown procedures can significantly affect the economic operation of a continuous manufacturing process; (3) the traceability of material as it moves through the manufacturing facility is an important consideration that can at least in part be addressed using residence time distributions; and (4) the control systems technologies must assure quality in the presence of disturbances, dynamics, uncertainties, nonlinearities, and constraints. Direct measurement, first-principles and empirical model-based predictions, and design space approaches are described for ensuring that CQA specifications are met. Ways are discussed for universities, regulatory bodies, and industry to facilitate working around or through barriers to the development of control systems engineering technologies for continuous drug manufacturing. Industry and regulatory bodies should work with federal agencies to create federal funding mechanisms to attract faculty to this area. 
Universities should hire faculty interested in developing first-principles models and control systems technologies for drug manufacturing that are easily transportable to industry. Industry can facilitate the move to continuous manufacturing by working with universities on the conception of new continuous pharmaceutical manufacturing process unit operations that have the potential to make major improvements in product quality, controllability, or reduced capital and/or operating costs. Regulatory bodies should: (1) ensure that regulations and regulatory practices promote, and do not derail, the development and implementation of continuous manufacturing and control systems engineering approaches; (2) ensure that the individuals who approve specific regulatory filings are sufficiently trained to make good decisions regarding control systems approaches; (3) provide regulatory clarity and eliminate or reduce regulatory risks; (4) financially support the development of high-quality training materials for use by undergraduate students, graduate students, industrial employees, and regulatory staff; (5) enhance the training of their own technical staff by financially supporting joint research projects with universities on continuous pharmaceutical manufacturing processes and the associated control systems engineering theory, numerical algorithms, and software; and (6) strongly encourage the federal agencies that support research to fund these research areas. © 2014 Wiley Periodicals, Inc. and the American Pharmacists Association.

  17. MASQOT: a method for cDNA microarray spot quality control

    PubMed Central

    Bylesjö, Max; Eriksson, Daniel; Sjödin, Andreas; Sjöström, Michael; Jansson, Stefan; Antti, Henrik; Trygg, Johan

    2005-01-01

    Background cDNA microarray technology has emerged as a major player in the parallel detection of biomolecules, but it still suffers from fundamental technical problems. Identifying and removing unreliable data is crucial to avoid misleading analysis results. Visual assessment of spot quality is still a common procedure, despite the time-consuming work of manually inspecting spots in the range of hundreds of thousands or more. Results A novel methodology for cDNA microarray spot quality control is outlined. Multivariate discriminant analysis was used to assess spot quality based on existing and novel descriptors. The presented methodology displays high reproducibility and was found superior to the other evaluated methodologies at identifying unreliable data. Conclusion The proposed methodology generates non-discrete values of spot quality, which can be utilized as weights in subsequent analysis procedures or used to discard spots of undesired quality via the suggested threshold values. The MASQOT approach provides a consistent assessment of spot quality and can be considered an alternative to the labor-intensive manual quality assessment process. PMID:16223442
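
    The multivariate discriminant analysis described above can be sketched generically: train a linear discriminant on labeled spot descriptors and use the posterior probability as a continuous quality score. The descriptors, class structure and threshold below are invented for illustration and are not the MASQOT feature set:

    ```python
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    rng = np.random.default_rng(1)
    # Synthetic spot descriptors: [signal-to-noise, circularity, background variance]
    good = rng.normal([10.0, 0.9, 0.1], [1.5, 0.05, 0.05], size=(100, 3))
    bad  = rng.normal([ 3.0, 0.5, 0.6], [1.5, 0.15, 0.20], size=(100, 3))
    X = np.vstack([good, bad])
    y = np.array([1] * 100 + [0] * 100)  # 1 = reliable spot

    lda = LinearDiscriminantAnalysis().fit(X, y)

    # Non-discrete quality score in [0, 1]: posterior probability of "reliable".
    scores = lda.predict_proba(X)[:, 1]
    keep = scores > 0.5                  # hypothetical discard threshold
    print(f"spots kept: {keep.sum()} / {len(keep)}")
    ```

    The continuous score, rather than the hard keep/discard decision, is what allows downstream analyses to weight spots by quality as the abstract suggests.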

  18. Application of the quality by design approach to the drug substance manufacturing process of an Fc fusion protein: towards a global multi-step design space.

    PubMed

    Eon-duval, Alex; Valax, Pascal; Solacroup, Thomas; Broly, Hervé; Gleixner, Ralf; Strat, Claire L E; Sutter, James

    2012-10-01

    The article describes how Quality by Design principles can be applied to the drug substance manufacturing process of an Fc fusion protein. First, the quality attributes of the product were evaluated for their potential impact on safety and efficacy using risk management tools. Similarly, process parameters that have a potential impact on critical quality attributes (CQAs) were also identified through a risk assessment. Critical process parameters were then evaluated for their impact on CQAs, individually and in interaction with each other, using multivariate design of experiment techniques during the process characterisation phase. The global multi-step Design Space, defining operational limits for the entire drug substance manufacturing process so as to ensure that the drug substance quality targets are met, was devised using predictive statistical models developed during the characterisation study. The validity of the global multi-step Design Space was then confirmed by performing the entire process, from cell bank thawing to final drug substance, at its limits during the robustness study: the quality of the final drug substance produced under different conditions was verified against predefined targets. An adaptive strategy was devised whereby the Design Space can be adjusted to the quality of the input material to ensure reliable drug substance quality. Finally, all the data obtained during the process described above, together with data generated during additional validation studies as well as manufacturing data, were used to define the control strategy for the drug substance manufacturing process using a risk assessment methodology. Copyright © 2012 Wiley-Liss, Inc.
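
    The multivariate design-of-experiment characterisation mentioned above typically starts from a factorial design over the critical process parameters. A generic sketch of a two-level full-factorial design (parameter names and levels are invented for illustration, not taken from the study):

    ```python
    from itertools import product

    # Hypothetical critical process parameters with low/high levels.
    factors = {
        "pH":          (6.8, 7.2),
        "temperature": (35.0, 37.0),   # degrees C
        "feed_rate":   (0.5, 1.5),     # relative units
    }

    # 2^3 full-factorial design: every combination of low/high levels,
    # which allows main effects and two-factor interactions to be estimated.
    design = [dict(zip(factors, levels)) for levels in product(*factors.values())]

    for i, run in enumerate(design, 1):
        print(f"run {i}: {run}")
    print(f"{len(design)} runs")
    ```

    Real characterisation studies usually add center points and replicates, and fit a regression model to the measured CQAs to delimit the Design Space; the factorial grid is only the starting layout.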

  19. What Does it Mean to Publish Data in Earth System Science Data Journal?

    NASA Astrophysics Data System (ADS)

    Carlson, D.; Pfeiffenberger, H.

    2015-12-01

    The availability of more than 120 data sets in ESSD represents an unprecedented effort by providers, data centers and ESSD. ESSD data sets and their accompanying data descriptions undergo rigorous review. The data sets reside at any of more than 20 cooperating data centers. The ESSD publication process depends on, but challenges, the concepts of digital object identification and exacerbates the varied interpretations of the phrase 'data publication'. ESSD adopts the digital object identifier (DOI). Key questions apply to DOIs and other identifiers. How will persistent identifiers point accurately to distributed or replicated data? How should data centers and data publishers use identifier technologies to ensure authenticity and integrity? Should metadata associated with identifiers distinguish among raw, quality-controlled and derived data processing levels, or indicate license or copyright status? Data centers publish data sets according to internal metadata standards but without indicators of quality control. Publication in this sense indicates availability. National data portals compile, serve and publish data products as a service to national researchers and, often, to meet national requirements. Publication in this second case indicates availability in a national context; the data themselves may still reside at separate data centers. Data journals such as ESSD or Scientific Data publish peer-reviewed, quality-controlled data sets. These data sets almost always reside at a separate data center; the journal and the center maintain explicit identifier linkages. Data journals add quality to the feature of availability. A single data set processed through these layers will generate three independent DOIs, but the DOIs will provide little information about availability or quality. Could the data world learn from the URL world and consider additions? Suffixes? Could we use our experience with processing levels or data maturity to propose and agree on such extensions?

  20. Did a quality improvement collaborative make stroke care better? A cluster randomized trial

    PubMed Central

    2014-01-01

    Background Stroke can result in death and long-term disability. Fast and high-quality care can reduce the impact of stroke, but UK national audit data have demonstrated variability in compliance with recommended processes of care. Though quality improvement collaboratives (QICs) are widely used, whether a QIC could improve the reliability of stroke care was unknown. Methods Twenty-four NHS hospitals in the Northwest of England were randomly allocated either to participate in Stroke 90:10, a QIC based on the Breakthrough Series (BTS) model, or to a control group giving normal care. The QIC focused on nine processes of quality care for stroke already used in the national stroke audit, grouped into two distinct care bundles: one relating to early-hours care and one relating to rehabilitation following stroke. Using an interrupted time series design and difference-in-difference analysis, we aimed to determine whether hospitals participating in the QIC improved more than the control group on bundle compliance. Results Data were available from nine intervention hospitals (3,533 patients) and nine control hospitals (3,059 patients). Hospitals in the QIC showed a modest improvement from baseline in the odds of average compliance, equivalent to a relative improvement of 10.9% (95% CI 1.3%, 20.6%) in the Early Hours Bundle and 11.2% (95% CI 1.4%, 21.5%) in the Rehabilitation Bundle. Secondary analysis suggested that some specific processes were more sensitive to an intervention effect. Conclusions Some aspects of stroke care improved during the QIC, but the effects were modest and further improvement is needed. The extent to which a BTS QIC can improve the quality of stroke care remains uncertain; some aspects of care may respond better to collaboratives than others. Trial registration ISRCTN13893902. PMID:24690267
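
    The headline result is a difference-in-difference estimate on the odds of bundle compliance. The arithmetic of such an estimate can be sketched as follows (the counts are invented, and the trial's actual analysis used an interrupted time series model rather than raw 2x2 tables):

    ```python
    def odds(compliant, total):
        """Odds of compliance: compliant / non-compliant."""
        return compliant / (total - compliant)

    # Hypothetical (compliant, total) patient counts, before and after the QIC.
    qic_before, qic_after = (600, 1000), (700, 1000)
    ctl_before, ctl_after = (610, 1000), (620, 1000)

    # Change in odds within each arm, then the ratio of those changes:
    # the control arm's change stands in for the secular trend.
    qic_or = odds(*qic_after) / odds(*qic_before)
    ctl_or = odds(*ctl_after) / odds(*ctl_before)
    did_or = qic_or / ctl_or
    print(f"difference-in-difference odds ratio: {did_or:.2f}")
    ```

    A ratio above 1 means the intervention arm's odds of compliance grew faster than the control arm's, which is the quantity behind the reported relative improvements.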
