Sample records for quality control processes

  1. Process-based quality for thermal spray via feedback control

    NASA Astrophysics Data System (ADS)

    Dykhuizen, R. C.; Neiser, R. A.

    2006-09-01

    Quality control of a thermal spray system manufacturing process is difficult due to the many input variables that need to be controlled. Great care must be taken to ensure that the process remains constant to obtain a consistent quality of the parts. Control is greatly complicated by the fact that measurement of particle velocities and temperatures is a noisy stochastic process. This article illustrates the application of quality control concepts to a wire flame spray process. A central feature of the real-time control system is an automatic feedback control scheme that provides fine adjustments to ensure that uncontrolled variations are accommodated. It is shown how the control vectors can be constructed from simple process maps to independently control particle velocity and temperature. This control scheme is shown to perform well in a real production environment. We also demonstrate that slight variations in the feed wire curvature can greatly influence the process. Finally, the geometry of the spray system and sensor must remain constant for the best reproducibility.
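
The decoupling idea above, constructing control vectors from a simple process map so that particle velocity and temperature can be corrected independently, can be sketched as a linear feedback step. The sensitivity matrix, setpoints and gain below are invented for illustration and are not the authors' calibration.

```python
# Hypothetical sketch: a 2x2 process map (sensitivity matrix) is inverted to
# turn (velocity, temperature) errors into independent actuator corrections.
# All numbers are illustrative, not taken from the paper.
S = [[2.0, 0.5],   # d(velocity)/d(air flow), d(velocity)/d(wire current)
     [0.1, 3.0]]   # d(temperature)/d(air flow), d(temperature)/d(wire current)

def invert_2x2(m):
    det = m[0][0] * m[1][1] - m[0][1] * m[1][0]
    return [[m[1][1] / det, -m[0][1] / det],
            [-m[1][0] / det, m[0][0] / det]]

def control_step(setpoint, measured, gain=0.5):
    """Return (d_air, d_current) nudging the process toward the setpoint."""
    err = [s - m for s, m in zip(setpoint, measured)]
    s_inv = invert_2x2(S)
    return [gain * sum(s_inv[i][j] * err[j] for j in range(2))
            for i in range(2)]

# Velocity 10 m/s too low, temperature 50 K too high:
d_air, d_current = control_step(setpoint=(150.0, 2000.0),
                                measured=(140.0, 2050.0))
```

Keeping the gain well below 1 spreads the correction over several control cycles, which tolerates the noisy particle measurements the abstract mentions.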

  2. The process of managerial control in quality improvement initiatives.

    PubMed

    Slovensky, D J; Fottler, M D

    1994-11-01

    The fundamental intent of strategic management is to position an organization within its market to exploit organizational competencies and strengths to gain competitive advantage. Competitive advantage may be achieved through such strategies as low cost, high quality, or unique services or products. For health care organizations accredited by the Joint Commission on Accreditation of Healthcare Organizations, continually improving both processes and outcomes of organizational performance--quality improvement--in all operational areas of the organization is a mandated strategy. Defining and measuring quality and controlling the quality improvement strategy remain problematic. The article discusses the nature and processes of managerial control, some potential measures of quality, and related information needs.

  3. Quality control process improvement of flexible printed circuit board by FMEA

    NASA Astrophysics Data System (ADS)

    Krasaephol, Siwaporn; Chutima, Parames

    2018-02-01

    This research focuses on quality control process improvement for Flexible Printed Circuit Boards (FPCBs), centred on model 7-Flex, using the Failure Mode and Effect Analysis (FMEA) method to decrease the proportion of defective finished goods found at final inspection. Because defects are caught only at final inspection, defective units may escape to customers. The problem stems from a quality control process that cannot filter out defective products in-process, since there is no In-Process Quality Control (IPQC) or sampling inspection. Therefore, the quality control process was improved by setting inspection gates and IPQC points at the critical processes in order to filter out defective products. The critical processes were identified by the FMEA method. IPQC detects defective products and reduces the chance that defective finished goods escape to customers. Reducing the proportion of defective finished goods also decreases scrap cost, because finished goods incur higher scrap cost than work in process. Moreover, defective products found in-process can reveal abnormal processes, so engineers and operators can solve problems in a timely manner. The improved quality control was implemented on the 7-Flex production lines from July 2017 to September 2017. The results show decreases in the average proportion of defective finished goods and in the average Customer Manufacturers Lot Reject Rate (%LRR of CMs) of 4.5% and 4.1%, respectively. Furthermore, the cost saving from this quality control process equals 100K Baht.
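
FMEA itself reduces to a simple ranking computation: each failure mode is scored for severity (S), occurrence (O) and detection (D) on 1-10 scales, and the Risk Priority Number RPN = S x O x D identifies which processes deserve an IPQC gate. The failure modes and scores below are invented placeholders, not the study's actual FMEA.

```python
# Toy FMEA ranking: (name, severity, occurrence, detection), all invented.
failure_modes = [
    ("lamination misalignment", 7, 6, 4),
    ("etching over-etch",       8, 3, 3),
    ("coverlay bubble",         5, 7, 5),
]

# RPN = S * O * D; the highest-RPN modes get in-process inspection gates.
ranked = sorted(((s * o * d, name) for name, s, o, d in failure_modes),
                reverse=True)
for rpn, name in ranked:
    print(rpn, name)
```

Here "coverlay bubble" tops the list (RPN 175) despite its moderate severity, which is exactly the kind of non-obvious prioritization FMEA is meant to surface.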

  4. Pharmaceutical quality by design: product and process development, understanding, and control.

    PubMed

    Yu, Lawrence X

    2008-04-01

    The purpose of this paper is to discuss pharmaceutical Quality by Design (QbD) and describe how it can be used to ensure pharmaceutical quality. QbD was described and some of its elements identified. Process parameters and quality attributes were identified for each unit operation during the manufacture of solid oral dosage forms. The use of QbD was contrasted with the evaluation of product quality by testing alone. QbD is a systematic approach to pharmaceutical development. It means designing and developing formulations and manufacturing processes to ensure predefined product quality. Some of the QbD elements include: defining the target product quality profile; designing the product and manufacturing processes; identifying critical quality attributes, process parameters, and sources of variability; and controlling manufacturing processes to produce consistent quality over time. Using QbD, pharmaceutical quality is assured by understanding and controlling formulation and manufacturing variables, while product testing confirms product quality. Implementation of QbD will enable transformation of the chemistry, manufacturing, and controls (CMC) review of abbreviated new drug applications (ANDAs) into a science-based pharmaceutical quality assessment.

  5. A real time quality control application for animal production by image processing.

    PubMed

    Sungur, Cemil; Özkan, Halil

    2015-11-01

    Standards of hygiene and health are of major importance in food production, and quality control has become obligatory in this field. Thanks to rapidly developing technologies, automatic and safe quality control of food production is now possible. For this purpose, image-processing-based quality control systems used in industrial applications are employed to analyze the quality of food products. In this study, quality control of chicken (Gallus domesticus) eggs was achieved using a real-time image-processing technique. In order to execute the quality control processes, a conveying mechanism was used. Eggs passing on a conveyor belt were continuously photographed in real time by cameras located above the belt. The images obtained were processed by various methods and techniques. Using digital instrumentation, the volume of the eggs was measured, broken/cracked eggs were separated, and dirty eggs were identified. In accordance with international standards for classifying egg quality, the class of the separated eggs was determined through a fuzzy implication model. According to tests carried out on thousands of eggs, a quality control process with an accuracy of 98% was achieved. © 2014 Society of Chemical Industry.
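
The grading step can be imitated with a toy fuzzy classifier: a defect score derived from image processing is mapped to membership values, and the strongest class wins. Membership shapes and thresholds here are invented; the paper's actual fuzzy implication model is not given in the abstract.

```python
# Toy fuzzy grading sketch with invented membership functions.
def tri(x, a, b, c):
    """Triangular fuzzy membership on [a, c] peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def grade(defect_score):
    """defect_score in [0, 1] from image processing; higher = worse."""
    memberships = {
        "class A": tri(defect_score, -0.1, 0.0, 0.4),
        "class B": tri(defect_score, 0.2, 0.5, 0.8),
        "reject":  tri(defect_score, 0.6, 1.0, 1.1),
    }
    return max(memberships, key=memberships.get)

print(grade(0.1))  # low defect score falls in the "class A" membership
```

A real system would combine several scores (crack, dirt, volume) with fuzzy implication rules rather than a single max over memberships.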

  6. [Quality process control system of Chinese medicine preparation based on "holistic view"].

    PubMed

    Wang, Ya-Qi; Jiao, Jiao-Jiao; Wu, Zhen-Feng; Zheng, Qin; Yang, Ming

    2018-01-01

    "High quality, safety and effectiveness" are the primary principles for the pharmaceutical research and development process in China. Product quality relies not only on inspection methods, but also on design and development, process control and standardized management; ultimately, quality depends on the level of process control. In this paper, the history and current development of quality control of traditional Chinese medicine (TCM) preparations are reviewed systematically. Drawing on the development model of international drug quality control and common misunderstandings in the quality control of TCM preparations, the reasons affecting the homogeneity of TCM preparations are analyzed and summarized. In line with TCM characteristics, efforts were made to control the diversity of TCM and turn "unstable" TCM into "stable" Chinese patent medicines, and the concepts of a "holistic view" and "QbD (quality by design)" were put forward, so as to create a "holistic, modular, data-driven, standardized" model as the core of a TCM preparation quality process control model. Scientific studies should conform to the actual production of TCM preparations and support advanced equipment and technology upgrades, so that scientific research achievements are thoroughly applied in Chinese patent medicines and the clustered application and transformation of TCM pharmaceutical technology are promoted, thereby improving the quality and effectiveness of the TCM industry and realizing green development. Copyright© by the Chinese Pharmaceutical Association.

  7. Statistical Process Control: Going to the Limit for Quality.

    ERIC Educational Resources Information Center

    Training, 1987

    1987-01-01

    Defines the concept of statistical process control, a quality control method used especially in manufacturing. Generally, concept users set specific standard levels that must be met. Makes the point that although employees work directly with the method, management is responsible for its success within the plant. (CH)
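
The core of statistical process control is simple to state in code: limits are set from an in-control baseline at the mean plus or minus three standard deviations, and later points outside those limits signal special-cause variation. A minimal sketch with invented data:

```python
# Minimal Shewhart-style control limits from an in-control baseline.
from statistics import mean, pstdev

baseline = [10.1, 9.8, 10.0, 10.2, 9.9, 10.1, 10.0, 9.9]  # invented data
m, s = mean(baseline), pstdev(baseline)
ucl, lcl = m + 3 * s, m - 3 * s   # upper and lower control limits

# New production measurements are judged against the fixed limits.
new_points = [10.0, 10.3, 13.5]
flags = [x for x in new_points if not lcl <= x <= ucl]
print(flags)  # only the 13.5 point falls outside the limits
```

Note that the limits come from process data, not from specification limits; as the abstract stresses, management owns the standard levels that the chart is compared against.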

  8. [Development of whole process quality control and management system of traditional Chinese medicine decoction pieces based on traditional Chinese medicine quality tree].

    PubMed

    Yu, Wen-Kang; Dong, Ling; Pei, Wen-Xuan; Sun, Zhi-Rong; Dai, Jun-Dong; Wang, Yun

    2017-12-01

    The whole-process quality control and management of traditional Chinese medicine (TCM) decoction pieces is a systems engineering task, involving the growing-base environment, seeds and seedlings, harvesting, processing and many other steps, so accurate identification of the factors in the TCM production process that may induce quality risk, together with reasonable quality control measures, is very important. At present, the concept of quality risk is mainly addressed at the level of management and regulations; there is no comprehensive analysis of the possible risks in the quality control process of TCM decoction pieces, nor a summary of effective quality control schemes. A whole-process quality control and management system for TCM decoction pieces based on the TCM quality tree was proposed in this study. The system effectively combines the process-analysis method of the TCM quality tree with quality risk management, and can help managers make real-time decisions while realizing whole-process quality control of TCM. By providing a personalized web interface, the system realizes user-oriented information feedback and makes it convenient for users to predict, evaluate and control the quality of TCM. In application, the system can identify quality-related factors such as growing-base environment, cultivation and pieces processing, extend and modify the existing scientific workflow according to each enterprise's own production conditions, and provide different enterprises with their own quality systems, achieving personalized service. As a new quality management model, this work can provide a reference for improving the quality of Chinese medicine production and for quality standardization. Copyright© by the Chinese Pharmaceutical Association.

  9. Modelling and control for laser based welding processes: modern methods of process control to improve quality of laser-based joining methods

    NASA Astrophysics Data System (ADS)

    Zäh, Ralf-Kilian; Mosbach, Benedikt; Hollwich, Jan; Faupel, Benedikt

    2017-02-01

    To ensure the competitiveness of manufacturing companies it is indispensable to optimize their manufacturing processes. Slight variations of process parameters and machine settings should have only marginal effects on product quality; therefore, the largest possible processing window is required. Such parameters include, for example, the movement of the laser beam across the component in laser keyhole welding. That is why it is necessary to keep the formation of weld seams within specified limits. Today, the quality of laser welding processes is ensured by using post-process methods, like ultrasonic inspection, or special in-process methods. These in-process systems achieve only a simple evaluation showing whether the weld seam is acceptable or not. Furthermore, in-process systems provide no feedback for changing control variables such as laser speed or laser power. In this paper the research group presents current results in the research fields of online monitoring, online control and model predictive control of laser welding processes to increase product quality. To record the characteristics of the welding process, tested online methods are used during the process. Based on the measurement data, a state-space model is identified which includes all the control variables of the system. Using simulation tools, a model predictive controller (MPC) is designed for the model and integrated into an NI real-time system.
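
The modeling chain described here (identify a state-space model from measurements, then close the loop on it) can be illustrated with the smallest possible case, a scalar discrete-time model under proportional feedback. The coefficients and reference below are invented, not an identified welding model.

```python
# Scalar discrete-time state-space sketch: x[k+1] = a*x[k] + b*u[k].
# Coefficients, gain and reference are illustrative only.
a, b = 0.9, 0.2        # assumed identified model coefficients
ref = 1.0              # desired (normalized) weld-depth state
x, gain = 0.0, 2.0     # initial state and proportional gain

for _ in range(50):
    u = gain * (ref - x)   # feedback on the control input (e.g. laser power)
    x = a * x + b * u      # simulate one step of the identified model

print(round(x, 3))  # settles at 0.8, short of the reference of 1.0
```

The persistent offset from the reference is a known weakness of pure proportional control and one motivation for model predictive control, which instead optimizes a whole input sequence against the identified model at every step.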

  10. Interrupted Time Series Versus Statistical Process Control in Quality Improvement Projects.

    PubMed

    Andersson Hagiwara, Magnus; Andersson Gäre, Boel; Elg, Mattias

    2016-01-01

    To measure the effect of quality improvement interventions, it is appropriate to use analysis methods that measure data over time. Examples of such methods include statistical process control analysis and interrupted time series with segmented regression analysis. This article compares the use of statistical process control analysis and interrupted time series with segmented regression analysis for evaluating the longitudinal effects of quality improvement interventions, using an example study on an evaluation of a computerized decision support system.
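
A bare-bones version of the interrupted-time-series idea: fit separate least-squares lines to the pre- and post-intervention segments and estimate the level change at the intervention point. The data are synthetic; a real segmented regression would fit a single model with level-change and slope-change terms and check for autocorrelation.

```python
# Two-segment least-squares fit around an intervention at t = cut.
def ols(ts, ys):
    """Ordinary least squares for y = intercept + slope * t."""
    n = len(ts)
    tbar, ybar = sum(ts) / n, sum(ys) / n
    slope = (sum((t - tbar) * (y - ybar) for t, y in zip(ts, ys))
             / sum((t - tbar) ** 2 for t in ts))
    return ybar - slope * tbar, slope   # (intercept, slope)

t = list(range(12))
y = [5.0, 5.2, 5.1, 5.3, 5.2, 5.4,   # pre-intervention (synthetic)
     3.9, 3.8, 3.7, 3.9, 3.6, 3.7]   # post-intervention (synthetic)
cut = 6

b0, b1 = ols(t[:cut], y[:cut])       # pre-intervention line
a0, a1 = ols(t[cut:], y[cut:])       # post-intervention line
level_change = (a0 + a1 * cut) - (b0 + b1 * cut)
print(round(level_change, 2))        # immediate drop at the intervention
```

The same data could also be plotted on a control chart; the two approaches differ mainly in how they model the secular trend and how they define a "signal".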

  11. [A strategy of constructing the technological system for quality control of Chinese medicine based on process control and management].

    PubMed

    Cheng, Yi-Yu; Qian, Zhong-Zhi; Zhang, Bo-Li

    2017-01-01

    The current situation, bottleneck problems and severe challenges in the quality control technology of Chinese medicine (CM) are briefly described. It is proposed to change the situation in drug regulation in which post-production testing is the main means of control and process control is neglected, to reverse the underdevelopment of process control and management technology for pharmaceutical manufacturing, and to reconstruct the technological system for quality control of CM products. A regulation and technology system based on process control and management for controlling CM quality should be established to solve, at the root, weighty practical problems of the CM industry, including backward quality control technology, weak quality risk control measures, and the poor reputation of product quality. In this way, the obstacles arising from the poor controllability of CM product quality could be removed. Focusing on the difficult problems and weak links in the technical field of CM quality control, it is proposed to build a CMC (Chemistry, Manufacturing and Controls) regulation for CM products with Chinese characteristics and to promote its international recognition as soon as possible. A CMC technical framework was designed that is clinical efficacy-oriented, manufacturing manner-centered and process control-focused. To address the clinical characteristics of traditional Chinese medicine (TCM) and the production features of CM manufacturing, it is suggested to establish quality control engineering for CM manufacturing by integrating pharmaceutical analysis, TCM chemistry, TCM pharmacology, pharmaceutical engineering, control engineering, management engineering and other disciplines. Further, a theoretical model of quality control engineering for CM manufacturing and a methodology of digital pharmaceutical engineering are proposed, together with a technology pathway for promoting CM standards and realizing the strategic goal of CM internationalization.

  12. The utilization of six sigma and statistical process control techniques in surgical quality improvement.

    PubMed

    Sedlack, Jeffrey D

    2010-01-01

    Surgeons have been slow to incorporate industrial reliability techniques. Process control methods were applied to surgeon waiting time between cases and to length of stay (LOS) after colon surgery. Waiting times between surgeries were evaluated by auditing the operating room records of a single hospital over a 1-month period. The medical records of 628 patients undergoing colon surgery over a 5-year period were reviewed. The average surgeon wait time between cases was 53 min, and the busiest surgeon spent 29.5 hr in 1 month waiting between surgeries. Process control charting demonstrated poor overall control of the room turnover process. Average LOS after colon resection also demonstrated very poor control. Mean LOS was 10 days. Weibull's conditional analysis revealed a conditional LOS of 9.83 days. Serious process management problems were identified in both analyses. These process issues are expensive and adversely affect the quality of service offered by the institution. Process control mechanisms were suggested or implemented to improve these surgical processes. Industrial reliability and quality management tools can easily and effectively identify process control problems that occur in surgical services. © 2010 National Association for Healthcare Quality.

  13. A Controlled Agitation Process for Improving Quality of Canned Green Beans during Agitation Thermal Processing.

    PubMed

    Singh, Anika; Pratap Singh, Anubhav; Ramaswamy, Hosahalli S

    2016-06-01

    This work introduces the concept of a controlled agitation thermal process to reduce quality damage in liquid-particulate products during agitation thermal processing. Reciprocating agitation thermal processing (RA-TP) was used as the agitation thermal process. In order to reduce the impact of agitation, a new concept of stopping agitation after sufficient development of the cold-spot temperature was proposed. Green beans were processed in No. 2 (307×409) cans filled with liquids of various consistencies (0% to 2% CMC) at various frequencies (1 to 3 Hz) of RA-TP using a full-factorial design, and heat penetration results were collected. The corresponding operator's process time to impart a 10-min process lethality (Fo) and the agitation time (AT) were calculated from the heat penetration results. Accordingly, products were processed again, stopping agitation according to 3 agitation regimes: full-time agitation, equilibration-time agitation, and partial-time agitation. Processed products were photographed and tested for visual quality, color, texture, breakage of green beans, turbidity, and percentage of insoluble solids in the can liquid. Results showed that stopping agitation after sufficient development of the cold-spot temperature is an effective way of reducing the product damage caused by agitation (for example, breakage of beans and leaching into the liquid). Agitation until a one-log temperature difference gave the best color, texture and visual product quality for the low-viscosity liquid-particulate mixture, and extended agitation until equilibration time was best for high-viscosity products. Thus, it was shown that a controlled agitation thermal process is more effective in obtaining high product quality than a regular agitation thermal process. © 2016 Institute of Food Technologists®

  14. [Investigation on production process quality control of traditional Chinese medicine--Banlangen granule as an example].

    PubMed

    Tan, Manrong; Yan, Dan; Qiu, Lingling; Chen, Longhu; Yan, Yan; Jin, Cheng; Li, Hanbing; Xiao, Xiaohe

    2012-04-01

    In the quality management of herbal medicines, intermediates and finished products, current methodologies suffer from a "short board" (weakest-link) effect. Based on the concept of process control, new strategies and methods for production process quality control were established, taking into account the actual production of traditional Chinese medicine and the characteristics of Chinese medicine. Banlangen granule, an effective and widely used product, was taken as a practical example. Character identification, determination of index components, chemical fingerprinting and biometric technology were used in turn to assess the quality of Banlangen herbal medicines, the intermediates (water extraction and alcohol precipitation) and the finished product. With the transfer rate of chemical information and biological potency as indicators, the effectiveness and transmission of the above assessment and control methods were investigated. Ultimately, process quality control methods for Banlangen granule, based on chemical composition analysis and biometric analysis, were established. These can not only address the current situation in which many manufacturers produce Banlangen granule of varying quality, but also ensure and enhance its clinical efficacy, and they provide a foundation for the construction of quality control for the traditional Chinese medicine production process.

  15. [Feedforward control strategy and its application in quality improvement of ethanol precipitation process of danhong injection].

    PubMed

    Yan, Bin-Jun; Guo, Zheng-Tai; Qu, Hai-Bin; Zhao, Bu-Chang; Zhao, Tao

    2013-06-01

    In this work, a feedforward control strategy based on the concept of quality by design was established for the manufacturing process of traditional Chinese medicine, to reduce the impact of quality variation in raw materials on the drug. The ethanol precipitation process of Danhong injection was taken as an application case for the method. A Box-Behnken design of experiments was conducted, and mathematical models relating the attributes of the concentrate, the process parameters and the quality of the supernatants produced were established. An optimization model was then built to calculate the best process parameters from the attributes of the concentrate. The quality of the supernatants produced by ethanol precipitation with optimized and non-optimized process parameters was compared. The results showed that using the feedforward control strategy for process parameter optimization can control the quality of the supernatants effectively. The proposed feedforward control strategy can enhance the batch-to-batch consistency of the supernatants produced by ethanol precipitation.

  16. Image-guided radiotherapy quality control: Statistical process control using image similarity metrics.

    PubMed

    Shiraishi, Satomi; Grams, Michael P; Fong de Los Santos, Luis E

    2018-05-01

    The purpose of this study was to demonstrate an objective quality control framework for the image review process. A total of 927 cone-beam computed tomography (CBCT) registrations were retrospectively analyzed for 33 bilateral head and neck cancer patients who received definitive radiotherapy. Two registration tracking volumes (RTVs) - cervical spine (C-spine) and mandible - were defined, within which a similarity metric was calculated and used as a registration quality tracking metric over the course of treatment. First, sensitivity to large misregistrations was analyzed for normalized cross-correlation (NCC) and mutual information (MI) in the context of statistical analysis. The distribution of metrics was obtained for displacements that varied according to a normal distribution with a standard deviation of σ = 2 mm, and the detectability of displacements greater than 5 mm was investigated. Then, similarity metric control charts were created using a statistical process control (SPC) framework to objectively monitor the image registration and review process. Patient-specific control charts were created using NCC values from the first five fractions to set a patient-specific process capability limit. Population control charts were created using the average of the first five NCC values for all patients in the study. For each patient, the similarity metrics were calculated as a function of unidirectional translation, referred to as the effective displacement. Patient-specific action limits corresponding to 5 mm effective displacements were defined. Furthermore, the effective displacements of the ten registrations with the lowest similarity metrics were compared with the three-degree-of-freedom (3DoF) couch displacement required to align the anatomical landmarks. Normalized cross-correlation identified suboptimal registrations more effectively than MI within the framework of SPC. Deviations greater than 5 mm were detected at 2.8σ and 2.1σ from the mean for NCC and MI, respectively.
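
Normalized cross-correlation, the metric tracked here, is straightforward to compute on two equally sized patches; values near 1 suggest good alignment. The patches below are tiny invented intensity lists, not CBCT data.

```python
# Minimal NCC between two flattened, equally sized image patches.
from statistics import mean, pstdev

def ncc(a, b):
    """Normalized cross-correlation in [-1, 1]; 1 = perfect linear match."""
    ma, mb = mean(a), mean(b)
    sa, sb = pstdev(a), pstdev(b)
    n = len(a)
    return sum((x - ma) * (y - mb) for x, y in zip(a, b)) / (n * sa * sb)

ref  = [10, 20, 30, 40, 50, 60]   # reference patch (invented intensities)
good = [12, 22, 31, 41, 52, 61]   # nearly aligned patch
bad  = [60, 10, 40, 20, 50, 30]   # misregistered patch

print(round(ncc(ref, good), 3), round(ncc(ref, bad), 3))
```

On a control chart, the "good" value would sit near the patient-specific centerline while the "bad" value would fall below the action limit, prompting manual review.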

  17. 42 CFR 431.830 - Basic elements of the Medicaid quality control (MQC) claims processing assessment system.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 4 2010-10-01 2010-10-01 false Basic elements of the Medicaid quality control (MQC... & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL ASSISTANCE PROGRAMS STATE ORGANIZATION AND GENERAL ADMINISTRATION Quality Control Medicaid Quality Control (mqc) Claims Processing...

  18. Quality Control through Design and Process: Gambrel Roof Truss Challenge

    ERIC Educational Resources Information Center

    Ward, Dell; Jones, James

    2011-01-01

    Customers determine whether a product fulfills their needs or satisfies them. "Quality control", then, is the process of finding out what the customer wants, along with designing, producing, delivering, and servicing the product--and ultimately satisfying the customer's expectations. For many years, people considered a product to be of good…

  19. [Discussion on research thinking of traditional Chinese medicine standardization system based on whole process quality control].

    PubMed

    Dong, Ling; Sun, Yu; Pei, Wen-Xuan; Dai, Jun-Dong; Wang, Zi-Yu; Pan, Meng; Chen, Jiang-Peng; Wang, Yun

    2017-12-01

    The concept of "quality by design" indicates that good design over the whole life cycle of pharmaceutical production enables a drug to meet the expected quality requirements. Aiming at the existing problems of the traditional Chinese medicine (TCM) industry, a TCM standardization system is put forward in this paper at the national strategic level, guided by quality control ideas from the international manufacturing industry and with consideration of the TCM industry's own characteristics and development status. The core of this strategy is to establish five interrelated systems: a multi-indicator system based on the tri-indicator system; a quality standard and specification system for TCM herbal materials and decoction pieces; a quality traceability system; a data monitoring system based on whole-process quality control; and a whole-process quality management system for TCM, achieving systematic and scientific study of the whole process in the TCM industry through a "top-level design, stepwise implementation, system integration" workflow. This article analyzes the correlations between the quality standards of all links, establishes standard operating procedures for each link and for the whole process, and constructs a high-standard overall quality management system for TCM industry chains, in order to provide a demonstration for the establishment of a TCM whole-process quality control system and a systematic reference and basis for the standardization strategy of the TCM industry. Copyright© by the Chinese Pharmaceutical Association.

  20. Assessing the influence of component processing and donor characteristics on quality of red cell concentrates using quality control data.

    PubMed

    Jordan, A; Chen, D; Yi, Q-L; Kanias, T; Gladwin, M T; Acker, J P

    2016-07-01

    Quality control (QC) data collected by blood services are used to monitor production and to ensure compliance with regulatory standards. We demonstrate how analysis of quality control data can be used to highlight the sources of variability within red cell concentrates (RCCs). We merged Canadian Blood Services QC data with manufacturing and donor records for 28 227 RCC between June 2011 and October 2014. Units were categorized based on processing method, bag manufacturer, donor age and donor sex, then assessed based on product characteristics: haemolysis and haemoglobin levels, unit volume, leucocyte count and haematocrit. Buffy-coat method (top/bottom)-processed units exhibited lower haemolysis than units processed using the whole-blood filtration method (top/top). Units from female donors exhibited lower haemolysis than male donations. Processing method influenced unit volume and the ratio of additive solution to residual plasma. Stored red blood cell characteristics are influenced by prestorage processing and donor factors. Understanding the relationship between processing, donors and RCC quality will help blood services to ensure the safety of transfused products. © 2016 International Society of Blood Transfusion.

  1. Statistical Methods for Quality Control of Steel Coils Manufacturing Process using Generalized Linear Models

    NASA Astrophysics Data System (ADS)

    García-Díaz, J. Carlos

    2009-11-01

    Fault detection and diagnosis is an important problem in process engineering, as process equipment is subject to malfunctions during operation. Galvanized steel is a value-added product that furnishes effective performance by combining the corrosion resistance of zinc with the strength and formability of steel. Fault detection and diagnosis is an important problem in continuous hot-dip galvanizing, and the increasingly stringent quality requirements of the automotive industry have demanded ongoing efforts in process control to make the process more robust. When faults occur, they change the relationships among the observed variables. This work compares different statistical regression models proposed in the literature for estimating the quality of galvanized steel coils on the basis of short time histories. Data for 26 batches were available. Five variables were selected for monitoring the process: the steel strip velocity, four bath temperatures and the bath level. The entire data set, consisting of 48 galvanized steel coils, was divided into two sets: a training set of 25 conforming coils and a second set of 23 nonconforming coils. Logistic regression is a modeling tool in which the dependent variable is categorical, and in most applications binary. The results show that logistic generalized linear models provide good estimates of coil quality and can be useful for quality control in the manufacturing process.
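
Logistic regression for a binary conforming/nonconforming label can be written from scratch in a few lines. The sketch below uses one synthetic standardized predictor in place of the five monitored variables and plain gradient descent, so it illustrates only the model form, not the paper's fitted GLM.

```python
# From-scratch logistic regression on synthetic, standardized data.
import math

xs = [-2.0, -1.5, -1.0, -0.5, 0.5, 1.0, 1.5, 2.0]  # invented process variable
ys = [0, 0, 0, 0, 1, 1, 1, 1]                      # 1 = nonconforming coil

w, b, lr = 0.0, 0.0, 0.5
for _ in range(2000):
    gw = gb = 0.0
    for x, y in zip(xs, ys):
        p = 1 / (1 + math.exp(-(w * x + b)))  # sigmoid link of the GLM
        gw += (p - y) * x                     # gradient of the log-loss
        gb += (p - y)
    w -= lr * gw / len(xs)
    b -= lr * gb / len(xs)

def prob_nonconforming(x):
    return 1 / (1 + math.exp(-(w * x + b)))

print(round(prob_nonconforming(-2.0), 2), round(prob_nonconforming(2.0), 2))
```

A categorical outcome with more than two quality classes would use the multinomial extension of the same link function.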

  2. Automating PACS quality control with the Vanderbilt image processing enterprise resource

    NASA Astrophysics Data System (ADS)

    Esparza, Michael L.; Welch, E. Brian; Landman, Bennett A.

    2012-02-01

    Precise image acquisition is an integral part of modern patient care and medical imaging research. Periodic quality control using standardized protocols and phantoms ensures that scanners are operating according to specifications, yet such procedures do not ensure that individual datasets are free from corruption, for example due to patient motion, transient interference, or physiological variability. If unacceptable artifacts are noticed during scanning, a technologist can repeat a procedure. Yet substantial delays may be incurred if a problematic scan is not noticed until a radiologist reads the scans or an automated algorithm fails. Given the scores of slices in typical three-dimensional scans and the wide variety of potential use cases, a technologist cannot practically be expected to inspect all images. In large-scale research, automated pipeline systems have had great success in achieving high throughput. However, clinical and institutional workflows are largely based on DICOM and PACS technologies, and these systems are not readily compatible with research systems due to security and privacy restrictions. Hence, quantitative quality control has been relegated to individual investigators and too often neglected. Herein, we propose a scalable system, the Vanderbilt Image Processing Enterprise Resource (VIPER), to integrate modular quality control and image analysis routines with a standard PACS configuration. This server unifies image processing routines at an institutional level and provides a simple interface so that investigators can collaborate to deploy new analysis technologies. VIPER integrates with high-performance computing environments and has successfully analyzed all standard scans from our institutional research center over the course of the last 18 months.

  3. Measuring and improving the quality of postoperative epidural analgesia for major abdominal surgery using statistical process control charts.

    PubMed

    Duncan, Fiona; Haigh, Carol

    2013-10-01

    To explore and improve the quality of continuous epidural analgesia for pain relief using Statistical Process Control tools. Measuring the quality of pain management interventions is complex. Intermittent audits do not accurately capture the results of quality improvement initiatives. The failure rate for one intervention, epidural analgesia, is approximately 30% in everyday practice, so it is an important area for improvement. Continuous measurement and analysis are required to understand the multiple factors involved in providing effective pain relief. The study design was process control and quality improvement. Routine prospective data collection started in 2006. Patients were asked about their pain and the side effects of treatment. Statistical Process Control methods were applied for continuous data analysis. A multidisciplinary group worked together to identify reasons for variation in the data and instigated ideas for improvement. The key measure for improvement was a reduction in the percentage of patients with an epidural in severe pain. The baseline control charts illustrated the recorded variation in the rate of several processes and outcomes for 293 surgical patients. The mean pain score (VNRS) was four. There was no special-cause variation when data were stratified by surgeon, clinical area or patients who had experienced pain before surgery. Fifty-seven per cent of patients were hypotensive on the first day after surgery. We were able to demonstrate a significant improvement in the failure rate of epidurals as the project continued with quality improvement interventions. Statistical Process Control is a useful tool for measuring and improving the quality of pain management. The application of Statistical Process Control methods offers the potential to learn more about the process of change and outcomes in an Acute Pain Service, both locally and nationally. We have been able to develop measures for improvement and benchmarking in routine care.
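The key measure above, the percentage of patients with an epidural in severe pain, is the kind of attribute data typically tracked on a p-chart. A minimal sketch of the p-chart calculation, using made-up monthly figures rather than the study's data:

```python
import math

# Hypothetical monthly audit data: epidural patients (n) and how many of them
# reported severe pain (x). Figures are illustrative, not the study's data.
n = [30, 28, 32, 31, 29, 30, 33, 27]
x = [8, 7, 9, 10, 6, 8, 4, 3]

p = [xi / ni for xi, ni in zip(x, n)]
p_bar = sum(x) / sum(n)  # centre line: overall proportion in severe pain

# Per-month 3-sigma p-chart limits; they vary because subgroup sizes differ.
limits = []
for ni in n:
    s = math.sqrt(p_bar * (1 - p_bar) / ni)
    limits.append((max(0.0, p_bar - 3 * s), min(1.0, p_bar + 3 * s)))

for month, (pi, (lcl, ucl)) in enumerate(zip(p, limits), start=1):
    status = "special cause" if not (lcl <= pi <= ucl) else "common cause"
    print(f"month {month}: p={pi:.3f} limits=[{lcl:.3f}, {ucl:.3f}] {status}")
```

Points falling outside the limits signal special-cause variation worth investigating in the multidisciplinary review meetings the study describes.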

  4. Statistical Process Control: A Quality Tool for a Venous Thromboembolic Disease Registry.

    PubMed

    Posadas-Martinez, Maria Lourdes; Rojas, Liliana Paloma; Vazquez, Fernando Javier; De Quiros, Fernan Bernaldo; Waisman, Gabriel Dario; Giunta, Diego Hernan

    2016-01-01

    We aim to describe Statistical Process Control as a quality tool for the Institutional Registry of Venous Thromboembolic Disease (IRTD), a registry developed in a community-care tertiary hospital in Buenos Aires, Argentina. The IRTD is a prospective cohort. The process of data acquisition began with the creation of a computerized alert generated whenever physicians requested an imaging or laboratory study to diagnose venous thromboembolism, which defined the eligible patients. The process then followed a structured methodology for patient inclusion, evaluation, and subsequent data entry. To control this process, process performance indicators were designed to be measured monthly. These included the number of eligible patients, the number of included patients, the median time to patient evaluation, and the percentage of patients lost to evaluation. Control charts were graphed for each indicator. The registry was evaluated over 93 months, during which 25,757 patients were reported and 6,798 patients met the inclusion criteria. The median time to evaluation was 20 hours (SD, 12) and 7.7% of patients were lost to evaluation. Each indicator presented trends over time, caused by structural changes and improvement cycles, and therefore the central limit suffered inflexions. Statistical process control through process performance indicators allowed us to monitor the performance of the registry over time and to detect systematic problems. We postulate that this approach could be reproduced for other clinical registries.

  5. New approach for quality control in manufacturing process

    NASA Astrophysics Data System (ADS)

    Hanzah, Muhammad Radhi bin; Rahim, Wan Mohd Faizal Wan Abd; Khor, C. Y.; Ishak, Muhammad Ikman; Rosli, M. U.; Jamalludin, Mohd Riduan; Zakaria, M. S.; Nawi, M. A. M.

    2017-09-01

    This study was conducted solely from a theoretical standpoint, and further research is needed to demonstrate the approach in a real-world setting. The investigation included two industrial visits, with interviews and meetings with authorized staff from each participating organization. The review is divided into two sections. In addition to direct observation, small group discussions were held with the staff of both organizations, and the review was carried out on the basis of the details gathered. The aim of the study is to improve the assessment method at the quality control station in order to minimize the escape of defects to the next customer, and to investigate the underlying factors in the current quality control system so that a new method to improve it can be implemented. Quality is an essential attribute to build in so that customers' needs are met. After an extensive theoretical review, the best solution to be implemented at the quality gate (QG) station to prevent defect outflow to the next customer is a photoelectric sensor: it is affordable, easy to maintain, and highly sensitive in detecting defective items at the QG station.

  6. [Study on "multi-dimensional structure and process dynamics quality control system" of Danshen infusion solution based on component structure theory].

    PubMed

    Feng, Liang; Zhang, Ming-Hua; Gu, Jun-Fei; Wang, Gui-You; Zhao, Zi-Yu; Jia, Xiao-Bin

    2013-11-01

    As traditional Chinese medicine (TCM) preparation products feature complex compounds and multiple preparation processes, the implementation of quality control in line with the characteristics of TCM preparation products provides a firm guarantee for their clinical efficacy and safety. Danshen infusion solution is a preparation commonly used in the clinic, but its quality control is restricted to indexes of the finished product, which cannot guarantee its inherent quality. Our study group has proposed a "multi-dimensional structure and process dynamics quality control system" on the basis of "component structure theory", for the purpose of controlling the quality of Danshen infusion solution at multiple levels and in multiple links, from the efficacy-related material basis and the safety-related material basis to the characteristics of the dosage form and the preparation process. In this article, we bring forth new ideas and models for the quality control of TCM preparation products.

  7. Quality Control of High-Dose-Rate Brachytherapy: Treatment Delivery Analysis Using Statistical Process Control

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Able, Charles M., E-mail: cable@wfubmc.edu; Bright, Megan; Frizzell, Bart

    Purpose: Statistical process control (SPC) is a quality control method used to ensure that a process is well controlled and operates with little variation. This study determined whether SPC was a viable technique for evaluating the proper operation of a high-dose-rate (HDR) brachytherapy treatment delivery system. Methods and Materials: A surrogate prostate patient was developed using Vyse ordnance gelatin. A total of 10 metal oxide semiconductor field-effect transistors (MOSFETs) were placed from prostate base to apex. Computed tomography guidance was used to accurately position the first detector in each train at the base. The plan consisted of 12 needles with 129 dwell positions delivering a prescribed peripheral dose of 200 cGy. Sixteen accurate treatment trials were delivered as planned. Subsequently, a number of treatments were delivered with errors introduced, including wrong patient, wrong source calibration, wrong connection sequence, single needle displaced inferiorly 5 mm, and entire implant displaced 2 mm and 4 mm inferiorly. Two process behavior charts (PBC), an individual and a moving range chart, were developed for each dosimeter location. Results: There were 4 false positives resulting from 160 measurements from 16 accurately delivered treatments. For the inaccurately delivered treatments, the PBC indicated that measurements made at the periphery and apex (regions of high-dose gradient) were much more sensitive to treatment delivery errors. All errors introduced were correctly identified by either the individual or the moving range PBC in the apex region. Measurements at the urethra and base were less sensitive to errors. Conclusions: SPC is a viable method for assessing the quality of HDR treatment delivery. Further development is necessary to determine the most effective dose sampling, to ensure reproducible evaluation of treatment delivery accuracy.
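The individuals and moving-range charts the study built per dosimeter location can be sketched with the standard XmR limit formulas. The MOSFET readings below are invented illustrations, not the study's measurements:

```python
# Sketch of the two process behaviour charts (individuals X and moving range mR)
# described above. Doses are invented MOSFET readings in cGy for 16 accurately
# delivered treatments.
doses = [201.2, 198.7, 202.5, 199.9, 200.4, 197.8, 203.1, 200.6,
         199.2, 201.8, 198.9, 200.1, 202.2, 199.5, 200.9, 198.4]

x_bar = sum(doses) / len(doses)
mr = [abs(b - a) for a, b in zip(doses, doses[1:])]  # successive differences
mr_bar = sum(mr) / len(mr)

# Standard XmR chart constants for subgroups of size 2: d2 = 1.128, D4 = 3.267.
x_ucl = x_bar + 3 * mr_bar / 1.128
x_lcl = x_bar - 3 * mr_bar / 1.128
mr_ucl = 3.267 * mr_bar

x_signals = [d for d in doses if not (x_lcl <= d <= x_ucl)]
mr_signals = [r for r in mr if r > mr_ucl]
print(f"X chart: CL={x_bar:.2f} cGy, limits [{x_lcl:.2f}, {x_ucl:.2f}], "
      f"{len(x_signals)} signal(s)")
print(f"mR chart: CL={mr_bar:.2f} cGy, UCL={mr_ucl:.2f}, {len(mr_signals)} signal(s)")
```

A treatment delivered with an introduced error would then show up as a point beyond the X-chart limits or an unusually large moving range.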

  8. Quality Leadership and Quality Control

    PubMed Central

    Badrick, Tony

    2003-01-01

    Different quality control rules detect different analytical errors with varying levels of efficiency, depending on the type of error present, its prevalence and the number of observations. The efficiency of a rule can be gauged by inspection of a power function graph. Control rules are only part of a process, not an end in themselves; just as important are the trouble-shooting systems employed when a failure occurs. 'Average of patient normals' may develop as a useful adjunct to conventional serum-based quality control programmes. Acceptable error can be based on various criteria; biological variation is probably the most sensible. Once determined, acceptable error can be used to set limits in quality control rule systems. A key aspect of an organisation is leadership, which links the various components of the quality system. Leadership is difficult to characterise, but its key aspects include trust, setting an example, developing staff and, critically, setting the vision for the organisation. Organisations also have internal characteristics such as the degree of formalisation, centralisation, and complexity. Medical organisations can have internal tensions because of the dichotomy between the bureaucratic and the shadow medical structures. PMID:18568046

  9. Biowaste home composting: experimental process monitoring and quality control.

    PubMed

    Tatàno, Fabio; Pagliaro, Giacomo; Di Giovanni, Paolo; Floriani, Enrico; Mangani, Filippo

    2015-04-01

    Because home composting is a prevention option in managing biowaste at local levels, the objective of the present study was to contribute to the knowledge of the process evolution and compost quality that can be expected and obtained, respectively, in this decentralized option. In this study, organized as the research portion of a provincial project on home composting in the territory of Pesaro-Urbino (Central Italy), four experimental composters were first initiated and temporally monitored. Second, two small sub-sets of selected provincial composters (directly operated by households involved in the project) underwent quality control on their compost products at two different temporal steps. The monitored experimental composters showed overall decreasing profiles versus composting time for moisture, organic carbon, and C/N, as well as overall increasing profiles for electrical conductivity and total nitrogen, which represented qualitative indications of progress in the process. Comparative evaluations of the monitored experimental composters also suggested some interactions in home composting, i.e., high C/N ratios limiting organic matter decomposition rates and final humification levels; high moisture contents restricting the internal temperature regime; nearly horizontal phosphorus and potassium evolutions contributing to limit the rates of increase in electrical conductivity; and prolonged biowaste additions contributing to limit the rate of decrease in moisture. The measures of parametric data variability in the two sub-sets of controlled provincial composters showed decreased variability in moisture, organic carbon, and C/N from the seventh to fifteenth month of home composting, as well as increased variability in electrical conductivity, total nitrogen, and humification rate, which could be considered compatible with the respective nature of decreasing and increasing parameters during composting. The modeled parametric kinetics in the monitored experimental

  10. [Quality control in anesthesiology].

    PubMed

    Muñoz-Ramón, J M

    1995-03-01

    The process of quality control and auditing in anesthesiology allows us to evaluate the care given by a service and solve the problems that are detected. Quality control is a basic element of caregiving and is only secondarily an area of academic research; it is therefore a meaningless effort if the information does not serve to improve departmental procedures. Quality assurance procedures assume certain infrastructural requirements and an initial period of implementation and adjustment. The main objectives of quality control are the reduction of morbidity and mortality due to anesthesia, assurance of the availability and proper management of resources and, finally, the well-being and safety of the patient.

  11. [Quality control in herbal supplements].

    PubMed

    Oelker, Luisa

    2005-01-01

    The quality and safety of food and herbal supplements are the result of a combination of different elements, such as good manufacturing practice and process control. The process control must be active and able to identify and correct all possible hazards. The main and most widely used instrument is the hazard analysis and critical control point (HACCP) system, whose correct application can guarantee the safety of the product. Herbal supplements need, in addition to standard quality control, a set of checks to assure the harmlessness and safety of the plants used.

  12. [Application of quality by design in granulation process for Ginkgo leaf tablet (Ⅲ): process control strategy based on design space].

    PubMed

    Cui, Xiang-Long; Xu, Bing; Sun, Fei; Dai, Sheng-Yun; Shi, Xin-Yuan; Qiao, Yan-Jiang

    2017-03-01

    In this paper, under the guidance of the quality by design (QbD) concept, a control strategy for the high shear wet granulation process of the ginkgo leaf tablet based on the design space was established to improve process controllability and product quality consistency. The median granule size (D50) and bulk density (Da) of granules were identified as critical quality attributes (CQAs), and potential critical process parameters (pCPPs) were determined by failure modes and effects analysis (FMEA). A Plackett-Burman experimental design was used to screen the pCPPs, and the results demonstrated that the binder amount, the wet massing time and the wet mixing impeller speed were critical process parameters (CPPs). The design space of the high shear wet granulation process was developed within the pCPP range based on the Box-Behnken design and quadratic polynomial regression models. ANOVA showed that the P-values of the models were less than 0.05 and the P-values of the lack-of-fit tests were more than 0.1, indicating that the relationship between CQAs and CPPs could be well described by the mathematical models. D50 could be controlled within 170 to 500 μm, and the bulk density could be controlled within 0.30 to 0.44 g·cm⁻³ by using any CPP combination within the scope of the design space. Besides, granules produced by process parameters within the design space region could also meet the requirement of tensile strength of the ginkgo leaf tablet. Copyright© by the Chinese Pharmaceutical Association.
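The design-space step, fitting a quadratic polynomial to a Box-Behnken-style design and then checking a candidate operating point against the CQA acceptance range, can be sketched as follows. The design matrix, responses and candidate point are invented for illustration; only the 170-500 μm target for D50 comes from the abstract:

```python
import numpy as np

# Coded levels (-1, 0, +1) for binder amount (x1), wet massing time (x2),
# impeller speed (x3): a small Box-Behnken-style layout with 3 centre points.
X_raw = np.array([
    [-1, -1, 0], [1, -1, 0], [-1, 1, 0], [1, 1, 0],
    [-1, 0, -1], [1, 0, -1], [-1, 0, 1], [1, 0, 1],
    [0, -1, -1], [0, 1, -1], [0, -1, 1], [0, 1, 1],
    [0, 0, 0], [0, 0, 0], [0, 0, 0],
])
y = np.array([210, 340, 260, 430, 190, 320, 250, 410,
              230, 300, 280, 360, 310, 305, 315.0])  # invented D50 values, um

def quad_terms(x):
    """Full quadratic model: intercept, linear, interaction, squared terms."""
    x1, x2, x3 = x
    return [1, x1, x2, x3, x1*x2, x1*x3, x2*x3, x1**2, x2**2, x3**2]

A = np.array([quad_terms(x) for x in X_raw])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)  # least-squares fit

def predict(x):
    return float(np.array(quad_terms(x)) @ coef)

# Design-space check: is the predicted D50 inside the 170-500 um target?
candidate = (0.5, -0.5, 0.2)
d50_pred = predict(candidate)
verdict = "inside" if 170 <= d50_pred <= 500 else "outside"
print(f"predicted D50 = {d50_pred:.0f} um, {verdict} the acceptance range")
```

In the actual study the design space is the region of CPP combinations where all fitted CQA models simultaneously satisfy their acceptance ranges; the sketch shows the check for a single response.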

  13. Quality by control: Towards model predictive control of mammalian cell culture bioprocesses.

    PubMed

    Sommeregger, Wolfgang; Sissolak, Bernhard; Kandra, Kulwant; von Stosch, Moritz; Mayer, Martin; Striedner, Gerald

    2017-07-01

    The industrial production of complex biopharmaceuticals using recombinant mammalian cell lines is still mainly built on a quality by testing approach, which is represented by fixed process conditions and extensive testing of the end-product. In 2004 the FDA launched the process analytical technology initiative, aiming to guide the industry towards advanced process monitoring and better understanding of how critical process parameters affect the critical quality attributes. Implementation of process analytical technology into the bio-production process enables moving from the quality by testing to a more flexible quality by design approach. The application of advanced sensor systems in combination with mathematical modelling techniques offers enhanced process understanding, allows on-line prediction of critical quality attributes and subsequently real-time product quality control. In this review opportunities and unsolved issues on the road to a successful quality by design and dynamic control implementation are discussed. A major focus is directed on the preconditions for the application of model predictive control for mammalian cell culture bioprocesses. Design of experiments providing information about the process dynamics upon parameter change, dynamic process models, on-line process state predictions and powerful software environments seem to be a prerequisite for quality by control realization. © 2017 The Authors. Biotechnology Journal published by WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  14. Network-based production quality control

    NASA Astrophysics Data System (ADS)

    Kwon, Yongjin; Tseng, Bill; Chiou, Richard

    2007-09-01

    This study investigates the feasibility of remote quality control using a host of advanced automation equipment with Internet accessibility. The recent emphasis on product quality and reduction of waste stems from the dynamic, globalized and customer-driven market, which brings opportunities and threats to companies, depending on their response speed and production strategies. Current trends in industry also include the wide spread of distributed manufacturing systems, where design, production, and management facilities are geographically dispersed. This situation mandates not only accessibility to remotely located production equipment for monitoring and control, but also efficient means of responding to a changing environment to counter process variations and diverse customer demands. To compete under such an environment, companies are striving to achieve 100%, sensor-based, automated inspection for zero-defect manufacturing. In this study, the Internet-based quality control scheme is referred to as "E-Quality for Manufacturing", or "EQM" for short. By definition, EQM refers to a holistic approach to designing and embedding efficient quality control functions in the context of network-integrated manufacturing systems. Such a system lets designers located far away from the production facility monitor, control and adjust the quality inspection processes as the production design evolves.

  15. 33 CFR 385.21 - Quality control.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 33 Navigation and Navigable Waters 3 2011-07-01 2011-07-01 false Quality control. 385.21 Section... Processes § 385.21 Quality control. (a) The Corps of Engineers and the non-Federal sponsor shall prepare a quality control plan, in accordance with applicable Corps of Engineers regulations, for each product that...

  16. 33 CFR 385.21 - Quality control.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 33 Navigation and Navigable Waters 3 2013-07-01 2013-07-01 false Quality control. 385.21 Section... Processes § 385.21 Quality control. (a) The Corps of Engineers and the non-Federal sponsor shall prepare a quality control plan, in accordance with applicable Corps of Engineers regulations, for each product that...

  17. 33 CFR 385.21 - Quality control.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 33 Navigation and Navigable Waters 3 2010-07-01 2010-07-01 false Quality control. 385.21 Section... Processes § 385.21 Quality control. (a) The Corps of Engineers and the non-Federal sponsor shall prepare a quality control plan, in accordance with applicable Corps of Engineers regulations, for each product that...

  18. Instrument Quality Control.

    PubMed

    Jayakody, Chatura; Hull-Ryde, Emily A

    2016-01-01

    Well-defined quality control (QC) processes are used to determine whether a certain procedure or action conforms to a widely accepted standard and/or set of guidelines, and are important components of any laboratory quality assurance program (Popa-Burke et al., J Biomol Screen 14: 1017-1030, 2009). In this chapter, we describe QC procedures useful for monitoring the accuracy and precision of laboratory instrumentation, most notably automated liquid dispensers. Two techniques, gravimetric QC and photometric QC, are highlighted in this chapter. When used together, these simple techniques provide a robust process for evaluating liquid handler accuracy and precision, and critically underpin high-quality research programs.
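The gravimetric QC technique mentioned above reduces to weighing replicate dispenses, converting mass to volume via the liquid's density, and reporting accuracy and precision. A minimal sketch with illustrative numbers (the target volume, density and masses are assumptions, not values from the chapter):

```python
import statistics

# Gravimetric QC sketch: weigh replicate dispenses from a liquid handler,
# convert mass to volume via liquid density, then report accuracy (% of
# target) and precision (CV%). All figures below are illustrative.
target_ul = 50.0
density_mg_per_ul = 0.9982        # water at ~20 C (1 g/mL == 1 mg/uL)
masses_mg = [49.6, 50.1, 49.9, 50.3, 49.7, 50.0, 49.8, 50.2]

volumes_ul = [m / density_mg_per_ul for m in masses_mg]
mean_v = statistics.mean(volumes_ul)
cv_pct = 100 * statistics.stdev(volumes_ul) / mean_v   # precision
accuracy_pct = 100 * mean_v / target_ul                # accuracy

print(f"mean volume {mean_v:.2f} uL, accuracy {accuracy_pct:.1f}%, CV {cv_pct:.2f}%")
```

Photometric QC complements this by dispensing a dye and reading absorbance, which catches per-tip errors that a bulk weighing cannot resolve.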

  19. 7 CFR 58.928 - Quality control tests.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 3 2011-01-01 2011-01-01 false Quality control tests. 58.928 Section 58.928... Procedures § 58.928 Quality control tests. All dairy products and other ingredients shall be subject to inspection for quality and condition throughout each processing operation. Quality control tests shall be...

  20. Scater: pre-processing, quality control, normalization and visualization of single-cell RNA-seq data in R.

    PubMed

    McCarthy, Davis J; Campbell, Kieran R; Lun, Aaron T L; Wills, Quin F

    2017-04-15

    Single-cell RNA sequencing (scRNA-seq) is increasingly used to study gene expression at the level of individual cells. However, preparing raw sequence data for further analysis is not a straightforward process. Biases, artifacts and other sources of unwanted variation are present in the data, requiring substantial time and effort to be spent on pre-processing, quality control (QC) and normalization. We have developed the R/Bioconductor package scater to facilitate rigorous pre-processing, quality control, normalization and visualization of scRNA-seq data. The package provides a convenient, flexible workflow to process raw sequencing reads into a high-quality expression dataset ready for downstream analysis. scater provides a rich suite of plotting tools for single-cell data and a flexible data structure that is compatible with existing tools and can be used as infrastructure for future software development. The open-source code, along with installation instructions, vignettes and case studies, is available through Bioconductor at http://bioconductor.org/packages/scater . davis@ebi.ac.uk. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press.
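scater itself is an R/Bioconductor package, but the cell-level QC it performs, computing per-cell metrics such as library size and genes detected and then flagging outliers on a log scale by a median-absolute-deviation rule, can be illustrated language-neutrally. A sketch in Python with toy data (the counts matrix and the 3-MAD threshold are assumptions, loosely mirroring scater's isOutlier-style filtering):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy counts matrix (genes x cells); real data would come from a count loader.
counts = rng.poisson(1.0, size=(200, 50))
counts[:, :3] = 0                      # three "empty" cells to be flagged

lib_size = counts.sum(axis=0)          # total counts per cell
n_genes = (counts > 0).sum(axis=0)     # genes detected per cell

def mad_outlier_low(x, n_mads=3.0):
    """Flag cells unusually LOW on a log scale (below median - n_mads * MAD),
    mirroring the kind of threshold scater's isOutlier() applies."""
    logx = np.log1p(x.astype(float))
    med = np.median(logx)
    mad = np.median(np.abs(logx - med))
    return logx < med - n_mads * mad

bad = mad_outlier_low(lib_size) | mad_outlier_low(n_genes)
filtered = counts[:, ~bad]
print(f"kept {filtered.shape[1]} of {counts.shape[1]} cells")
```

A MAD-based rule adapts the cut-off to each dataset rather than imposing a fixed count threshold, which is why this style of filter is common in scRNA-seq QC.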

  1. Control by quality: proposition of a typology.

    PubMed

    Pujo, P; Pillet, M

    The application of quality tools and methods in industrial management has always had a fundamental impact on the control of production. It influences the behavior of the actors concerned, while introducing the necessary notions and formalizations, especially for production systems with little or no automation, which constitute a large part of industrial activity. Several quality approaches are applied in the workshop and are implemented at the level of the control. In this paper, the authors present a typology of the various approaches that have successively influenced control, such as statistical process control, quality assurance, and continuous improvement. First, the authors present a parallel between production control and the quality organizational structure. They note the duality between control, which is aimed at increasing productivity, and quality, which aims to satisfy the needs of the customer. They also note the hierarchical organizational structure of these two systems of management with, at each level, the notion of a feedback loop. This notion is fundamental to any kind of decision making. The paper is organized around the operational, tactical, and strategic levels, describing for each level the main methods and tools for control by quality. The overview of these tools and methods starts at the operational level, with statistical process control, the Taguchi technique, and the "six sigma" approach. At the tactical level, we find a quality system approach, with a documented description of the procedures introduced in the firm. The management system can refer here to Quality Assurance, Total Productive Maintenance, or Management by Total Quality. Formalizing, through procedures, the decision rules that govern process control enhances the validity of these rules, improving their reliability and consolidating them. All this counterbalances the human, intrinsically fluctuating, behavior of the control

  2. Quality control education in the community college

    NASA Technical Reports Server (NTRS)

    Greene, J. Griffen; Wilson, Steve

    1966-01-01

    This paper describes the Quality Control Program at Daytona Beach Junior College, including course descriptions. The program in quality control required communication between the college and the American Society for Quality Control (ASQC). The college has established machinery for certifying the learning process, and the society is the source both of teachers who are competent in the technical field and of the employers of the program's graduates. The associate degree in quality control does not have a fixed program that can serve all needs, any more than all engineering degrees have identical programs. The main ideas common to all quality control programs are the concept of economic control of a repetitive process and the concept of developing individuals' potential so that they are needed and productive.

  3. 7 CFR 58.335 - Quality control tests.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 3 2011-01-01 2011-01-01 false Quality control tests. 58.335 Section 58.335... Procedures § 58.335 Quality control tests. All milk, cream and related products are subject to inspection for quality and condition throughout each processing operation. Quality control tests shall be made on flow...

  4. Severe postpartum haemorrhage after vaginal delivery: a statistical process control chart to report seven years of continuous quality improvement.

    PubMed

    Dupont, Corinne; Occelli, Pauline; Deneux-Tharaux, Catherine; Touzet, Sandrine; Duclos, Antoine; Bouvier-Colle, Marie-Hélène; Rudigoz, René-Charles; Huissoud, Cyril

    2014-07-01

    To use statistical process control charts to describe trends in the prevalence of severe postpartum haemorrhage after vaginal delivery. This assessment was performed 7 years after we initiated a continuous quality improvement programme that began with regular criteria-based audits. Observational descriptive study in a French maternity unit in the Rhône-Alpes region. Quarterly clinical audit meetings were held to analyse all cases of severe postpartum haemorrhage after vaginal delivery and to provide feedback on the quality of care with statistical process control tools. The primary outcomes were the prevalence of severe PPH after vaginal delivery and its quarterly monitoring with a control chart. The secondary outcomes included the global quality of care for women with severe postpartum haemorrhage, including the performance rate of each recommended procedure. Differences in these variables between 2005 and 2012 were tested. From 2005 to 2012, the prevalence of severe postpartum haemorrhage declined significantly, from 1.2% to 0.6% of vaginal deliveries (p<0.001). Since 2010, the quarterly rate of severe PPH has not exceeded the upper control limit, that is, it has not been out of statistical control. The proportion of cases managed consistently with the guidelines increased for all of their main components. The continuous quality improvement effort began seven years ago and used, among other tools, statistical process control charts. During this period, the prevalence of severe postpartum haemorrhage after vaginal delivery was reduced by 50%. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  5. [Establishment and application of "multi-dimensional structure and process dynamic quality control technology system" in preparation products of traditional Chinese medicine (I)].

    PubMed

    Gu, Jun-Fei; Feng, Liang; Zhang, Ming-Hua; Wu, Chan; Jia, Xiao-Bin

    2013-11-01

    Safety is an important component of the quality control of traditional Chinese medicine (TCM) preparation products, as well as an important guarantee for their clinical application. Currently, the quality control of TCMs in the Chinese Pharmacopoeia mostly focuses on compounds indicative of TCM efficacy. TCM preparations involve multiple links, from raw materials to products, and each procedure may affect the safety of the preparation. We summarize and analyse the factors affecting safety during the preparation of TCM products, and then expound the important role of the "multi-dimensional structure and process dynamic quality control technology system" in the quality and safety of TCM preparations. Because the product quality of TCM preparations is closely related to their safety, control over the safety-related material basis is an important component of the product quality control of TCM preparations. The implementation of quality control over the dynamic process of TCM preparation, from raw materials to products, and the improvement of TCM quality and safety control at the microcosmic level help lay a firm foundation for the modernization of TCM preparations.

  6. A case study: application of statistical process control tool for determining process capability and sigma level.

    PubMed

    Chopra, Vikram; Bairagi, Mukesh; Trivedi, P; Nagar, Mona

    2012-01-01

    Statistical process control is the application of statistical methods to the measurement and analysis of process variation. Various regulatory documents, such as the Validation Guidance for Industry (2011), International Conference on Harmonisation ICH Q10 (2009), the Health Canada guidelines (2009), the Health Science Authority, Singapore: Guidance for Product Quality Review (2008), and International Organization for Standardization ISO-9000:2005, provide regulatory support for the application of statistical process control for better process control and understanding. In this study, risk assessment, normal probability distributions, control charts, and capability charts are employed for the selection of critical quality attributes and the determination of normal probability distribution, statistical stability, and capability of the production processes, respectively. The objective of this study is to determine tablet production process quality in the form of sigma process capability. By interpreting data and graph trends, the forecasting of critical quality attributes, sigma process capability, and stability of the process were studied. The overall study contributes to an assessment of the process at the sigma level with respect to out-of-specification attributes produced. Finally, the study points to an area where the application of quality improvement and quality risk assessment principles makes the achievement of six-sigma-capable processes possible. Statistical process control is the most advantageous tool for determining the quality of any production process. This tool is new to the pharmaceutical tablet production process. In the case of pharmaceutical tablet production processes, the quality control parameters act as quality assessment parameters. Application of risk assessment allows the selection of critical quality attributes from among the quality control parameters. Sequential application of normality distributions, control charts, and capability analyses provides a valid statistical
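The capability analysis the study describes boils down to estimating Cp and Cpk from the process mean and standard deviation against the specification limits, and expressing the result as a sigma level. A minimal sketch with invented tablet-weight data and specification limits:

```python
import statistics

# Illustrative tablet-weight data (mg) from a hypothetical in-control process;
# the specification limits are made up for the sketch.
weights = [249.1, 250.4, 248.8, 251.2, 250.0, 249.5, 250.9, 248.9,
           250.2, 249.8, 251.0, 249.3, 250.6, 249.9, 250.1, 250.3]
LSL, USL = 245.0, 255.0

mu = statistics.mean(weights)
sigma = statistics.stdev(weights)        # sample standard deviation

cp = (USL - LSL) / (6 * sigma)           # potential capability (spread only)
cpk = min(USL - mu, mu - LSL) / (3 * sigma)  # actual capability (with centring)

# Short-term sigma level is 3 * Cpk; six-sigma practice often adds an
# empirical 1.5-sigma shift when quoting long-term performance.
sigma_level = 3 * cpk
print(f"Cp={cp:.2f}  Cpk={cpk:.2f}  ~{sigma_level:.1f}-sigma process")
```

Capability indices are only meaningful once the control charts show the process is statistically stable, which is why the study applies them in that order.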

  7. Application of the suggestion system in the improvement of the production process and product quality control

    NASA Astrophysics Data System (ADS)

    Gołaś, H.; Mazur, A.; Gruszka, J.; Szafer, P.

    2016-08-01

    This elaboration is a case study; the research was carried out at Alco-Mot Ltd., a company of 120 employees that specializes in the production of lead poles for industrial and traction batteries using gravity casting. The elements embedded in the cast are manufactured on a machining centre, which provides stability of the process and of the product dimensions, as well as a very short production time. As a result of observation and analysis, the authors have developed a concept for the implementation of a dynamic suggestion system in ALCO-MOT, including, among other things, a standard for actions in the implementation of the suggestion system and clear guidelines for processing and presenting the activities undertaken between the establishment of the concept (suggestions) and the benefits analysis after the proposed solutions have been implemented. The authors also present how suggestions proposed by ALCO-MOT staff contributed to the improvement of the production and quality control processes. Employees offered more than 30 suggestions, of which more than half are now being implemented, with further actions being prepared for implementation. The authors present the results of improvements in, for example, tool replacement time and scrap reduction, show how kaizen can improve the production and quality control processes, and compare how these processes looked before and after the implementation of employee suggestions.

  8. Monitoring Processes in Visual Search Enhanced by Professional Experience: The Case of Orange Quality-Control Workers

    PubMed Central

    Visalli, Antonino; Vallesi, Antonino

    2018-01-01

    Visual search tasks have often been used to investigate how cognitive processes change with expertise. Several studies have shown visual experts' advantages in detecting objects related to their expertise. Here, we tried to extend these findings by investigating whether professional search experience could boost top-down monitoring processes involved in visual search, independently of advantages specific to objects of expertise. To this aim, we recruited a group of quality-control workers employed in citrus farms. Given the specific features of this type of job, we expected that the extensive employment of monitoring mechanisms during orange selection could enhance these mechanisms even in search situations in which orange-related expertise is not suitable. To test this hypothesis, we compared performance of our experimental group and of a well-matched control group on a computerized visual search task. In one block the target was an orange (expertise target) while in the other block the target was a Smurfette doll (neutral target). The a priori hypothesis was to find an advantage for quality-controllers in those situations in which monitoring was especially involved, that is, when deciding the presence/absence of the target required a more extensive inspection of the search array. Results were consistent with our hypothesis. Quality-controllers were faster in those conditions that extensively required monitoring processes, specifically, the Smurfette-present and both target-absent conditions. No differences emerged in the orange-present condition, which turned out to rely mainly on bottom-up processes. These results suggest that top-down processes in visual search can be enhanced through immersive real-life experience beyond visual expertise advantages. PMID:29497392

  9. A system identification approach for developing model predictive controllers of antibody quality attributes in cell culture processes

    PubMed Central

    Schmitt, John; Beller, Justin; Russell, Brian; Quach, Anthony; Hermann, Elizabeth; Lyon, David; Breit, Jeffrey

    2017-01-01

    As the biopharmaceutical industry evolves to include more diverse protein formats and processes, more robust control of Critical Quality Attributes (CQAs) is needed to maintain processing flexibility without compromising quality. Active control of CQAs has been demonstrated using model predictive control techniques, which allow development of processes which are robust against disturbances associated with raw material variability and other potentially flexible operating conditions. Wide adoption of model predictive control in biopharmaceutical cell culture processes has been hampered, however, in part due to the large amount of data and expertise required to make a predictive model of controlled CQAs, a requirement for model predictive control. Here we developed a highly automated, perfusion apparatus to systematically and efficiently generate predictive models using application of system identification approaches. We successfully created a predictive model of %galactosylation using data obtained by manipulating galactose concentration in the perfusion apparatus in serialized step change experiments. We then demonstrated the use of the model in a model predictive controller in a simulated control scenario to successfully achieve a %galactosylation set point in a simulated fed‐batch culture. The automated model identification approach demonstrated here can potentially be generalized to many CQAs, and could be a more efficient, faster, and highly automated alternative to batch experiments for developing predictive models in cell culture processes, and allow the wider adoption of model predictive control in biopharmaceutical processes. © 2017 The Authors Biotechnology Progress published by Wiley Periodicals, Inc. on behalf of American Institute of Chemical Engineers Biotechnol. Prog., 33:1647–1661, 2017 PMID:28786215

  10. Feasibility study of using statistical process control to customized quality assurance in proton therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rah, Jeong-Eun; Oh, Do Hoon; Shin, Dongho

    Purpose: To evaluate and improve the reliability of proton quality assurance (QA) processes and to provide an optimal customized tolerance level using the statistical process control (SPC) methodology. Methods: The authors investigated the consistency check of dose per monitor unit (D/MU) and range in proton beams to see whether it was within the tolerance level of the daily QA process. This study analyzed the difference between the measured and calculated ranges along the central axis to improve the patient-specific QA process in proton beams by using process capability indices. Results: The authors established a customized tolerance level of ±2% for D/MU and ±0.5 mm for beam range in the daily proton QA process. In the authors’ analysis of the process capability indices, the patient-specific range measurements were capable of a specification limit of ±2% in clinical plans. Conclusions: SPC methodology is a useful tool for customizing the optimal QA tolerance levels and improving the quality of proton machine maintenance, treatment delivery, and ultimately patient safety.

  11. Feasibility study of using statistical process control to customized quality assurance in proton therapy.

    PubMed

    Rah, Jeong-Eun; Shin, Dongho; Oh, Do Hoon; Kim, Tae Hyun; Kim, Gwe-Ya

    2014-09-01

    To evaluate and improve the reliability of proton quality assurance (QA) processes and to provide an optimal customized tolerance level using the statistical process control (SPC) methodology. The authors investigated the consistency check of dose per monitor unit (D/MU) and range in proton beams to see whether it was within the tolerance level of the daily QA process. This study analyzed the difference between the measured and calculated ranges along the central axis to improve the patient-specific QA process in proton beams by using process capability indices. The authors established a customized tolerance level of ±2% for D/MU and ±0.5 mm for beam range in the daily proton QA process. In the authors' analysis of the process capability indices, the patient-specific range measurements were capable of a specification limit of ±2% in clinical plans. SPC methodology is a useful tool for customizing the optimal QA tolerance levels and improving the quality of proton machine maintenance, treatment delivery, and ultimately patient safety.
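The customized daily-QA tolerances reported here (±2% for D/MU, ±0.5 mm for beam range) reduce to a simple acceptance check; the function and example deviations below are illustrative, not taken from the study:

```python
def daily_qa_pass(dmu_dev_pct, range_dev_mm, dmu_tol=2.0, range_tol=0.5):
    """Check a daily proton QA measurement against customized tolerances:
    D/MU deviation within +/-2 % and beam-range deviation within +/-0.5 mm."""
    return abs(dmu_dev_pct) <= dmu_tol and abs(range_dev_mm) <= range_tol

print(daily_qa_pass(1.2, 0.3))   # within both tolerances
print(daily_qa_pass(2.5, 0.1))   # D/MU out of tolerance
```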

  12. A quality improvement project using statistical process control methods for type 2 diabetes control in a resource-limited setting.

    PubMed

    Flood, David; Douglas, Kate; Goldberg, Vera; Martinez, Boris; Garcia, Pablo; Arbour, MaryCatherine; Rohloff, Peter

    2017-08-01

    Quality improvement (QI) is a key strategy for improving diabetes care in low- and middle-income countries (LMICs). This study reports on a diabetes QI project in rural Guatemala whose primary aim was to improve glycemic control of a panel of adult diabetes patients. Formative research suggested multiple areas for programmatic improvement in ambulatory diabetes care. This project utilized the Model for Improvement and Agile Global Health, our organization's complementary healthcare implementation framework. A bundle of improvement activities was implemented at the home, clinic and institutional levels. Control charts of mean hemoglobin A1C (HbA1C) and proportion of patients meeting target HbA1C showed improvement as special cause variation was identified 3 months after the intervention began. Control charts for secondary process measures offered insights into the value of different components of the intervention. Intensity of home-based diabetes education emerged as an important driver of panel glycemic control. Diabetes QI work is feasible in resource-limited settings in LMICs and can improve glycemic control. Statistical process control charts are a promising methodology for use with panels or registries of diabetes patients. © The Author 2017. Published by Oxford University Press in association with the International Society for Quality in Health Care. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com
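The special-cause detection that such control charts provide can be illustrated with a minimal Shewhart-style individuals chart: points beyond 3 standard deviations of a baseline period are flagged. The monthly HbA1c values below are hypothetical, not the project's data, and sigma is estimated from the baseline standard deviation for simplicity:

```python
import statistics

def special_cause_points(baseline, monitored):
    """Flag monitored points falling outside 3-sigma control limits
    derived from a baseline period (Shewhart individuals-chart style)."""
    centre = statistics.mean(baseline)
    sd = statistics.stdev(baseline)
    ucl, lcl = centre + 3 * sd, centre - 3 * sd
    return [x for x in monitored if x > ucl or x < lcl]

# Hypothetical monthly mean HbA1c (%) for a diabetes panel.
baseline = [9.1, 9.3, 9.0, 9.2, 9.4, 9.1]
after_intervention = [9.0, 8.8, 8.2, 7.9]
print(special_cause_points(baseline, after_intervention))
```

Only the last two points fall below the lower control limit, signalling special-cause improvement rather than common-cause noise.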

  13. TQM (Total Quality Management) SPARC (Special Process Action Review Committees) Handbook

    DTIC Science & Technology

    1989-08-01

    This document describes the techniques used to support and guide the Special Process Action Review Committees in accomplishing their goals for Total Quality Management (TQM). It includes concepts and definitions, checklists, sample formats, and assessment criteria. Keywords: Continuous process improvement; Logistics information; Process analysis; Quality control; Quality assurance; Total Quality Management; Statistical processes; Management planning and control; Management training; Management information systems.

  14. A system identification approach for developing model predictive controllers of antibody quality attributes in cell culture processes.

    PubMed

    Downey, Brandon; Schmitt, John; Beller, Justin; Russell, Brian; Quach, Anthony; Hermann, Elizabeth; Lyon, David; Breit, Jeffrey

    2017-11-01

    As the biopharmaceutical industry evolves to include more diverse protein formats and processes, more robust control of Critical Quality Attributes (CQAs) is needed to maintain processing flexibility without compromising quality. Active control of CQAs has been demonstrated using model predictive control techniques, which allow development of processes which are robust against disturbances associated with raw material variability and other potentially flexible operating conditions. Wide adoption of model predictive control in biopharmaceutical cell culture processes has been hampered, however, in part due to the large amount of data and expertise required to make a predictive model of controlled CQAs, a requirement for model predictive control. Here we developed a highly automated, perfusion apparatus to systematically and efficiently generate predictive models using application of system identification approaches. We successfully created a predictive model of %galactosylation using data obtained by manipulating galactose concentration in the perfusion apparatus in serialized step change experiments. We then demonstrated the use of the model in a model predictive controller in a simulated control scenario to successfully achieve a %galactosylation set point in a simulated fed-batch culture. The automated model identification approach demonstrated here can potentially be generalized to many CQAs, and could be a more efficient, faster, and highly automated alternative to batch experiments for developing predictive models in cell culture processes, and allow the wider adoption of model predictive control in biopharmaceutical processes. © 2017 The Authors Biotechnology Progress published by Wiley Periodicals, Inc. on behalf of American Institute of Chemical Engineers Biotechnol. Prog., 33:1647-1661, 2017.
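A minimal version of the system-identification step can be sketched as a least-squares fit of a first-order ARX model to serialized step-change data. This is a toy, noise-free analogue of the approach; the real galactose-to-%galactosylation dynamics in the study are more complex:

```python
def identify_first_order(u, y):
    """Least-squares fit of y[k+1] = a*y[k] + b*u[k], a minimal ARX(1,1)
    system-identification step of the kind used before building an MPC model."""
    # Accumulate the 2x2 normal equations for the two-parameter regression.
    s_yy = sum(yk * yk for yk in y[:-1])
    s_yu = sum(yk * uk for yk, uk in zip(y[:-1], u[:-1]))
    s_uu = sum(uk * uk for uk in u[:-1])
    r_y = sum(y[k + 1] * y[k] for k in range(len(y) - 1))
    r_u = sum(y[k + 1] * u[k] for k in range(len(y) - 1))
    det = s_yy * s_uu - s_yu * s_yu
    a = (r_y * s_uu - r_u * s_yu) / det
    b = (r_u * s_yy - r_y * s_yu) / det
    return a, b

# Simulate step changes in an input (e.g. galactose concentration) driving a
# quality attribute, then recover the model coefficients from the step data.
true_a, true_b = 0.8, 0.5
u = [0.0] * 5 + [1.0] * 10 + [2.0] * 10
y = [0.0]
for k in range(len(u) - 1):
    y.append(true_a * y[-1] + true_b * u[k])
a, b = identify_first_order(u, y)
print(round(a, 3), round(b, 3))
```

Because the simulated data are noise-free, the fit recovers the true coefficients exactly; with real perfusion data, noise handling and model-order selection would matter.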

  15. Image processing system performance prediction and product quality evaluation

    NASA Technical Reports Server (NTRS)

    Stein, E. K.; Hammill, H. B. (Principal Investigator)

    1976-01-01

    The author has identified the following significant results. A new technique for image processing system performance prediction and product quality evaluation was developed. It was entirely objective, quantitative, and general, and should prove useful in system design and quality control. The technique and its application to determination of quality control procedures for the Earth Resources Technology Satellite NASA Data Processing Facility are described.

  16. Developments in Signature Process Control

    NASA Astrophysics Data System (ADS)

    Keller, L. B.; Dominski, Marty

    1993-01-01

    Developments in the adaptive process control technique known as Signature Process Control for Advanced Composites (SPCC) are described. This computer control method for autoclave processing of composites was used to develop an optimum cure cycle for AFR 700B polyimide and for an experimental poly-isoimide. An improved process cycle was developed for Avimid N polyimide. The potential for extending the SPCC technique to pre-preg quality control, press molding, pultrusion and RTM is briefly discussed.

  17. A method for evaluating treatment quality using in vivo EPID dosimetry and statistical process control in radiation therapy.

    PubMed

    Fuangrod, Todsaporn; Greer, Peter B; Simpson, John; Zwan, Benjamin J; Middleton, Richard H

    2017-03-13

    Purpose Due to increasing complexity, modern radiotherapy techniques require comprehensive quality assurance (QA) programmes that, to date, generally focus on the pre-treatment stage. The purpose of this paper is to provide a method for individual patient treatment QA evaluation and identification of a "quality gap" for continuous quality improvement. Design/methodology/approach Statistical process control (SPC) was applied to evaluate treatment delivery using in vivo electronic portal imaging device (EPID) dosimetry. A moving range control chart was constructed to monitor individual patient treatment performance based on a control limit generated from initial data of 90 intensity-modulated radiotherapy (IMRT) and ten volumetric-modulated arc therapy (VMAT) patient deliveries. A process capability index was used to evaluate the continuing treatment quality based on three quality classes: treatment type-specific, treatment linac-specific, and body site-specific. Findings The determined control limits were 62.5 and 70.0 per cent of the χ pass-rate for IMRT and VMAT deliveries, respectively. In total, 14 patients were selected for a pilot study, the results of which showed that about 1 per cent of all treatments contained errors relating to unexpected anatomical changes between treatment fractions. Both rectum and pelvis cancer treatments demonstrated process capability indices less than 1, indicating the potential for quality improvement, and hence may benefit from further assessment. Research limitations/implications The study relied on the application of in vivo EPID dosimetry for patients treated at the specific centre. The sampling of patients for generating the control limits was limited to 100 patients. Whilst the quantitative results are specific to the clinical techniques and equipment used, the described method is generally applicable to IMRT and VMAT treatment QA. Whilst more work is required to determine the level of clinical significance, the
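The moving-range (X-MR) chart construction used in such monitoring can be sketched as follows; the χ pass-rates below are hypothetical, and 2.66 is the standard X-MR constant (3/d2 for subgroups of size 2):

```python
def individuals_chart_limits(pass_rates):
    """Centre line and lower control limit for an individuals (X-MR) chart
    built from per-delivery pass-rates, used to flag aberrant treatments."""
    mrs = [abs(b - a) for a, b in zip(pass_rates, pass_rates[1:])]
    mr_bar = sum(mrs) / len(mrs)                 # mean moving range
    centre = sum(pass_rates) / len(pass_rates)   # centre line
    lcl = centre - 2.66 * mr_bar                 # 2.66 = 3 / d2 for n = 2
    return centre, lcl

# Hypothetical chi pass-rates (%) for successive treatment deliveries.
rates = [91.0, 88.5, 92.3, 90.1, 89.4, 93.0, 87.8, 91.6]
centre, lcl = individuals_chart_limits(rates)
print(round(centre, 2), round(lcl, 2))
```

A delivery whose pass-rate falls below the computed LCL would be flagged for investigation, analogous to the per-treatment limits in the abstract.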

  18. [Application progress on near infrared spectroscopy in quality control and process monitoring of traditional Chinese medicine].

    PubMed

    Li, Wenlong; Qu, Haibin

    2017-01-25

    The industry of traditional Chinese medicine (TCM) encounters problems such as quality fluctuation of raw materials and unstandardized production processes. Near infrared (NIR) spectroscopy is widely used in the quality control of TCM because of its rich information content and its fast, nondestructive character. The main applications include quantitative analysis of Chinese medicinal materials, intermediates and Chinese patent medicines; authentication of TCM species, origins and manufacturers; and monitoring and control of the extraction, alcohol precipitation, column chromatography and blending processes. This article reviews the progress of NIR spectroscopy applications in the TCM field. In view of the problems existing in current applications, the article proposes that standardization of NIR analysis methods should be developed according to the specific characteristics of TCM, which will promote the application of NIR technology in the TCM industry.

  19. Medical Image Processing Server applied to Quality Control of Nuclear Medicine.

    NASA Astrophysics Data System (ADS)

    Vergara, C.; Graffigna, J. P.; Marino, E.; Omati, S.; Holleywell, P.

    2016-04-01

    This paper is framed within the area of medical image processing and aims to present the process of installation, configuration and implementation of a medical image processing server (MIPS) at the Fundación Escuela de Medicina Nuclear (FUESMEN), located in Mendoza, Argentina. It was developed in the Gabinete de Tecnologia Médica (GA.TE.ME), Facultad de Ingeniería, Universidad Nacional de San Juan. MIPS is a software system that, using the DICOM standard, can receive medical imaging studies from different modalities or viewing stations, execute algorithms on them, and return the results to other devices. To achieve these objectives, preliminary tests were conducted in the laboratory, and the tools were then installed remotely in a clinical environment. Once the suitable algorithms were defined, the appropriate protocols for setting up and using them in different services were established. Finally, it is important to highlight the implementation and training provided at FUESMEN, using nuclear medicine quality control processes. Results of the implementation are presented in this work.

  20. A multiple objective optimization approach to quality control

    NASA Technical Reports Server (NTRS)

    Seaman, Christopher Michael

    1991-01-01

    The use of product quality as the performance criteria for manufacturing system control is explored. The goal in manufacturing, for economic reasons, is to optimize product quality. The problem is that since quality is a rather nebulous product characteristic, there is seldom an analytic function that can be used as a measure. Therefore standard control approaches, such as optimal control, cannot readily be applied. A second problem with optimizing product quality is that it is typically measured along many dimensions: there are many aspects of quality which must be optimized simultaneously. Very often these different aspects are incommensurate and competing. The concept of optimality must now include accepting tradeoffs among the different quality characteristics. These problems are addressed using multiple objective optimization. It is shown that the quality control problem can be defined as a multiple objective optimization problem. A controller structure is defined using this as the basis. Then, an algorithm is presented which can be used by an operator to interactively find the best operating point. Essentially, the algorithm uses process data to provide the operator with two pieces of information: (1) if it is possible to simultaneously improve all quality criteria, then determine what changes to the process input or controller parameters should be made to do this; and (2) if it is not possible to improve all criteria, and the current operating point is not a desirable one, select a criteria in which a tradeoff should be made, and make input changes to improve all other criteria. The process is not operating at an optimal point in any sense if no tradeoff has to be made to move to a new operating point. This algorithm ensures that operating points are optimal in some sense and provides the operator with information about tradeoffs when seeking the best operating point.
The multiobjective algorithm was implemented in two different injection molding scenarios
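The core test in such an interactive algorithm, whether every quality criterion can be improved simultaneously so that no tradeoff is needed, can be sketched as a Pareto-improvement check. The criteria values below are hypothetical scores for a molded part:

```python
def pareto_improvement(current, candidate, maximize=True):
    """True if `candidate` improves every quality criterion over `current`,
    i.e. a simultaneous improvement exists and no tradeoff is needed."""
    better = (lambda a, b: b > a) if maximize else (lambda a, b: b < a)
    return all(better(c, d) for c, d in zip(current, candidate))

# Hypothetical quality criteria for a molded part: surface-finish score and
# dimensional-accuracy score (both higher-is-better).
print(pareto_improvement((0.70, 0.80), (0.75, 0.85)))  # all criteria improve
print(pareto_improvement((0.70, 0.80), (0.75, 0.78)))  # a tradeoff is required
```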

  1. RNA-SeQC: RNA-seq metrics for quality control and process optimization.

    PubMed

    DeLuca, David S; Levin, Joshua Z; Sivachenko, Andrey; Fennell, Timothy; Nazaire, Marc-Danie; Williams, Chris; Reich, Michael; Winckler, Wendy; Getz, Gad

    2012-06-01

    RNA-seq, the application of next-generation sequencing to RNA, provides transcriptome-wide characterization of cellular activity. Assessment of sequencing performance and library quality is critical to the interpretation of RNA-seq data, yet few tools exist to address this issue. We introduce RNA-SeQC, a program which provides key measures of data quality. These metrics include yield, alignment and duplication rates; GC bias, rRNA content, regions of alignment (exon, intron and intergenic), continuity of coverage, 3'/5' bias and count of detectable transcripts, among others. The software provides multi-sample evaluation of library construction protocols, input materials and other experimental parameters. The modularity of the software enables pipeline integration and the routine monitoring of key measures of data quality such as the number of alignable reads, duplication rates and rRNA contamination. RNA-SeQC allows investigators to make informed decisions about sample inclusion in downstream analysis. In summary, RNA-SeQC provides quality control measures critical to experiment design, process optimization and downstream computational analysis. See www.genepattern.org to run online, or www.broadinstitute.org/rna-seqc/ for a command line tool.
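Two of the listed metrics, GC content and duplication rate, are simple enough to illustrate directly; the toy reads below are hypothetical, and RNA-SeQC itself computes these from aligned BAM files rather than raw strings:

```python
from collections import Counter

def gc_content(reads):
    """Fraction of G/C bases across all reads."""
    bases = "".join(reads)
    return (bases.count("G") + bases.count("C")) / len(bases)

def duplication_rate(reads):
    """Fraction of reads that duplicate an earlier identical read."""
    counts = Counter(reads)
    return sum(c - 1 for c in counts.values()) / len(reads)

reads = ["ACGTGC", "ACGTGC", "TTAACG", "GGGCCC"]
print(round(gc_content(reads), 3), duplication_rate(reads))
```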

  2. Statistical process management: An essential element of quality improvement

    NASA Astrophysics Data System (ADS)

    Buckner, M. R.

    Successful quality improvement requires a balanced program involving the three elements that control quality: organization, people and technology. The focus of the SPC/SPM User's Group is to advance the technology component of Total Quality by networking within the Group and by providing an outreach within Westinghouse to foster the appropriate use of statistical techniques to achieve Total Quality. SPM encompasses the disciplines by which a process is measured against its intrinsic design capability, in the face of measurement noise and other obscuring variability. SPM tools facilitate decisions about the process that generated the data. SPM deals typically with manufacturing processes, but with some flexibility of definition and technique it accommodates many administrative processes as well. The techniques of SPM are those of Statistical Process Control, Statistical Quality Control, Measurement Control, and Experimental Design. In addition, techniques such as job and task analysis, and concurrent engineering are important elements of systematic planning and analysis that are needed early in the design process to ensure success. The SPC/SPM User's Group is endeavoring to achieve its objectives by sharing successes that have occurred within members' own Westinghouse departments as well as within other US and foreign industry. In addition, failures are reviewed to establish lessons learned in order to improve future applications. In broader terms, the Group is interested in making SPM the accepted way of doing business within Westinghouse.

  3. Using a statistical process control chart during the quality assessment of cancer registry data.

    PubMed

    Myles, Zachary M; German, Robert R; Wilson, Reda J; Wu, Manxia

    2011-01-01

    Statistical process control (SPC) charts may be used to detect acute variations in the data while simultaneously evaluating unforeseen aberrations that may warrant further investigation by the data user. Using cancer stage data captured by the Summary Stage 2000 (SS2000) variable, we sought to present a brief report highlighting the utility of the SPC chart during the quality assessment of cancer registry data. Using a county-level caseload for the diagnosis period of 2001-2004 (n=25,648), we found the overall variation of the SS2000 variable to be in control during the 2001 and 2002 diagnosis years; it exceeded the lower control limit (LCL) in 2003 and the upper control limit (UCL) in 2004. In situ/localized stages were in control throughout the diagnosis period, regional stage exceeded the UCL in 2004, and distant stage exceeded the LCL in 2001 and the UCL in 2004. Our application of the SPC chart with cancer registry data illustrates that the SPC chart may serve as a readily available and timely tool for identifying areas of concern during the data collection and quality assessment of central cancer registry data.
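The chart behaviour described, yearly stage proportions crossing the LCL or UCL, can be sketched with a standard p-chart. The proportions and caseload below are hypothetical, chosen only to mirror the pattern in the abstract:

```python
import math

def p_chart_limits(p_bar, n):
    """3-sigma control limits for a proportion chart (p-chart), e.g. the
    yearly proportion of cases coded to a given summary stage."""
    sigma = math.sqrt(p_bar * (1 - p_bar) / n)
    return max(0.0, p_bar - 3 * sigma), min(1.0, p_bar + 3 * sigma)

# Hypothetical: 20% distant-stage overall, ~6,400 cases per diagnosis year.
lcl, ucl = p_chart_limits(0.20, 6400)
print(round(lcl, 3), round(ucl, 3))
for year, p in [(2001, 0.183), (2002, 0.199), (2003, 0.202), (2004, 0.218)]:
    print(year, "out of control" if not lcl <= p <= ucl else "in control")
```

With these numbers, 2001 falls below the LCL and 2004 above the UCL, the same signal pattern the registry study reports for distant stage.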

  4. Statistical quality control through overall vibration analysis

    NASA Astrophysics Data System (ADS)

    Carnero, M. a. Carmen; González-Palma, Rafael; Almorza, David; Mayorga, Pedro; López-Escobar, Carlos

    2010-05-01

    The present study introduces the concept of statistical quality control in automotive wheel bearings manufacturing processes. Defects on products under analysis can have a direct influence on passengers' safety and comfort. At present, the use of vibration analysis on machine tools for quality control purposes is not very extensive in manufacturing facilities. Noise and vibration are common quality problems in bearings. These failure modes likely occur under certain operating conditions and do not require high vibration amplitudes but relate to certain vibration frequencies. The vibration frequencies are affected by the type of surface problems (chattering) of ball races that are generated through grinding processes. The purpose of this paper is to identify grinding process variables that affect the quality of bearings by using statistical principles in the field of machine tools. In addition, the quality results of the finished parts under different combinations of process variables are evaluated. This paper intends to establish the foundations to predict the quality of the products through the analysis of self-induced vibrations during the contact between the grinding wheel and the parts. To achieve this goal, the overall self-induced vibration readings under different combinations of process variables are analysed using statistical tools. The analysis of data and design of experiments follows a classical approach, considering all potential interactions between variables. The analysis of data is conducted through analysis of variance (ANOVA) for data sets that meet normality and homoscedasticity criteria. This paper utilizes different statistical tools to support the conclusions, such as the chi-squared, Shapiro-Wilk, symmetry, kurtosis, Cochran, Bartlett, Hartley, and Kruskal-Wallis tests. The analysis presented is the starting point to extend the use of predictive techniques (vibration analysis) for quality control.
This paper demonstrates the existence
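The ANOVA step at the heart of such an analysis can be illustrated with a one-way F statistic comparing overall vibration readings under two process settings. The settings and readings below are hypothetical, not the study's data:

```python
import statistics

def one_way_anova_F(groups):
    """F statistic for a one-way ANOVA: ratio of between-group to
    within-group mean squares for readings grouped by process setting."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    ss_between = sum(len(g) * (statistics.mean(g) - grand) ** 2 for g in groups)
    ss_within = sum(sum((x - statistics.mean(g)) ** 2 for x in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical overall-vibration readings (mm/s) for two wheel-speed settings.
low_speed = [0.51, 0.48, 0.53, 0.50, 0.49]
high_speed = [0.61, 0.64, 0.60, 0.63, 0.62]
f = one_way_anova_F([low_speed, high_speed])
print(round(f, 1))
```

A large F relative to the appropriate F-distribution critical value indicates the process variable has a real effect on vibration; normality and homoscedasticity should be checked first, as the abstract notes.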

  5. 21 CFR 111.105 - What must quality control personnel do?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 2 2010-04-01 2010-04-01 false What must quality control personnel do? 111.105..., LABELING, OR HOLDING OPERATIONS FOR DIETARY SUPPLEMENTS Production and Process Control System: Requirements for Quality Control § 111.105 What must quality control personnel do? Quality control personnel must...

  6. A laboratory evaluation of four quality control devices for radiographic processing.

    PubMed

    Rushton, V E; Horner, K

    1994-08-01

    Quality assurance programmes for radiographic processing traditionally employ expensive sensitometric and densitometric techniques. However, cheap and simple devices for monitoring radiographic processing are available. The aim of this study was to compare four such devices in terms of their ability to detect variations in radiographic density of clinical significance. Three of the devices are commercially available, while the fourth is easily manufactured from waste materials. Ideal bitewing exposure times were selected for four different kilovoltage/film speed combinations. Phantom bitewing radiographs, exposed using these exposure times, were processed using a variety of times and developer temperatures to simulate variations in radiographic quality due to inadequate processing conditions. Test films, produced using the four monitoring devices, were exposed and processed under identical conditions. The phantom bitewings were judged to have 'acceptable' quality when the optical density of that part of the film not showing calcified structures was within ±0.5 of that of the film processed under optimal conditions. The efficacy of the monitoring devices in indicating the adequacy of processing was assessed by comparing their readings with those made from the phantom bitewings. None of the monitoring devices was ideal for all the kilovoltage/film speed combinations tested, but the homemade device proved to be the most generally effective. We conclude that guidelines to dentists on radiographic quality assurance should include reference to and details of this simple device.
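The acceptability criterion used in the study, background optical density within ±0.5 of the optimally processed film, reduces to a one-line check; the density values below are illustrative:

```python
def film_acceptable(od_test, od_optimal, tol=0.5):
    """Acceptability rule from the study: background optical density
    within +/-0.5 of the optimally processed film."""
    return abs(od_test - od_optimal) <= tol

print(film_acceptable(1.3, 1.0))  # within 0.5 of optimal
print(film_acceptable(1.6, 1.0))  # drifted beyond the tolerance
```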

  7. GCE Data Toolbox for MATLAB - a software framework for automating environmental data processing, quality control and documentation

    NASA Astrophysics Data System (ADS)

    Sheldon, W.; Chamblee, J.; Cary, R. H.

    2013-12-01

    Environmental scientists are under increasing pressure from funding agencies and journal publishers to release quality-controlled data in a timely manner, as well as to produce comprehensive metadata for submitting data to long-term archives (e.g. DataONE, Dryad and BCO-DMO). At the same time, the volume of digital data that researchers collect and manage is increasing rapidly due to advances in high frequency electronic data collection from flux towers, instrumented moorings and sensor networks. However, few pre-built software tools are available to meet these data management needs, and those tools that do exist typically focus on part of the data management lifecycle or one class of data. The GCE Data Toolbox has proven to be both a generalized and effective software solution for environmental data management in the Long Term Ecological Research Network (LTER). This open source MATLAB software library, developed by the Georgia Coastal Ecosystems LTER program, integrates metadata capture, creation and management with data processing, quality control and analysis to support the entire data lifecycle. Raw data can be imported directly from common data logger formats (e.g. SeaBird, Campbell Scientific, YSI, Hobo), as well as delimited text files, MATLAB files and relational database queries. Basic metadata are derived from the data source itself (e.g. parsed from file headers) and by value inspection, and then augmented using editable metadata templates containing boilerplate documentation, attribute descriptors, code definitions and quality control rules. Data and metadata content, quality control rules and qualifier flags are then managed together in a robust data structure that supports database functionality and ensures data validity throughout processing. A growing suite of metadata-aware editing, quality control, analysis and synthesis tools are provided with the software to support managing data using graphical forms and command-line functions, as well as
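The rule-based flagging such QC frameworks apply can be sketched generically. The toolbox itself is MATLAB; this Python sketch with a hypothetical range rule only illustrates the idea of attaching a qualifier flag to each value:

```python
def apply_qc_rules(values, rules):
    """Apply simple range-based QC rules to a data column, returning one
    qualifier flag per value ('' = ok), in the spirit of the rule-based
    flagging used by environmental data QC frameworks."""
    flags = []
    for v in values:
        flag = ""
        for name, (lo, hi) in rules.items():
            if not (lo <= v <= hi):
                flag = name  # record the first rule the value violates
                break
        flags.append(flag)
    return flags

# Hypothetical water-temperature series (deg C) with one sensor spike.
temps = [18.2, 18.4, 45.0, 18.3]
rules = {"Q_range": (-5.0, 40.0)}
print(apply_qc_rules(temps, rules))
```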

  8. Optimising UAV topographic surveys processed with structure-from-motion: Ground control quality, quantity and bundle adjustment

    NASA Astrophysics Data System (ADS)

    James, M. R.; Robson, S.; d'Oleire-Oltmanns, S.; Niethammer, U.

    2017-03-01

    Structure-from-motion (SfM) algorithms greatly facilitate the production of detailed topographic models from photographs collected using unmanned aerial vehicles (UAVs). However, the survey quality achieved in published geomorphological studies is highly variable, and sufficient processing details are rarely provided to fully understand the causes of this variability. To address this, we show how survey quality and consistency can be improved through a deeper consideration of the underlying photogrammetric methods. We demonstrate the sensitivity of digital elevation models (DEMs) to processing settings that have not been discussed in the geomorphological literature, yet are a critical part of survey georeferencing, and are responsible for balancing the contributions of tie and control points. We provide a Monte Carlo approach to enable geomorphologists to (1) carefully consider sources of survey error and hence increase the accuracy of SfM-based DEMs and (2) minimise the associated field effort by robust determination of suitable lower-density deployments of ground control. By identifying appropriate processing settings and highlighting photogrammetric issues such as over-parameterisation during camera self-calibration, processing artefacts are reduced and the spatial variability of error minimised. We demonstrate such DEM improvements with a commonly used SfM-based software package (PhotoScan), which we augment with semi-automated and automated identification of ground control points (GCPs) in images, and apply to two contrasting case studies: an erosion gully survey (Taroudant, Morocco) and an active landslide survey (Super-Sauze, France). In the gully survey, refined processing settings eliminated step-like artefacts of up to 50 mm in amplitude, and overall DEM variability with GCP selection improved from 37 to 16 mm. In the much more challenging landslide case study, our processing halved planimetric error to 0.1 m, effectively doubling the frequency at which changes in
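
    The resampling logic behind such a Monte Carlo approach can be caricatured in a few lines: repeatedly withhold a random subset of ground control and summarise the error at the withheld check points. The sketch below uses synthetic residuals and performs no real bundle adjustment; it illustrates only the subset-and-check idea, not the paper's photogrammetric processing:

```python
import random
import statistics

random.seed(1)
# Synthetic vertical residuals (m) at 20 candidate GCP locations
residuals = [random.gauss(0.0, 0.05) for _ in range(20)]

def check_rmse(check_pts):
    """Root-mean-square error over the withheld check points."""
    return (sum(r * r for r in check_pts) / len(check_pts)) ** 0.5

def monte_carlo_rmse(residuals, n_control, trials=500):
    """Spread of check-point RMSE when only n_control GCPs are used."""
    rmses = []
    for _ in range(trials):
        control = set(random.sample(range(len(residuals)), n_control))
        check = [r for i, r in enumerate(residuals) if i not in control]
        rmses.append(check_rmse(check))
    return statistics.mean(rmses), statistics.stdev(rmses)

mean_rmse, sd_rmse = monte_carlo_rmse(residuals, n_control=5)
print(round(mean_rmse, 3), round(sd_rmse, 3))
```

    Repeating this for several values of `n_control` indicates how sparse a GCP deployment can become before check-point error grows unacceptably.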

  9. 21 CFR 111.117 - What quality control operations are required for equipment, instruments, and controls?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 2 2010-04-01 2010-04-01 false What quality control operations are required for equipment, instruments, and controls? 111.117 Section 111.117 Food and Drugs FOOD AND DRUG ADMINISTRATION... and Process Control System: Requirements for Quality Control § 111.117 What quality control operations...

  10. [Quality control of laser imagers].

    PubMed

    Winkelbauer, F; Ammann, M; Gerstner, N; Imhof, H

    1992-11-01

    Multiformat imagers based on laser systems are used for documentation in an increasing number of investigations. The specific problems of quality control are explained, and the constancy of film processing is investigated in imager systems of different configuration, with (Machine 1: 3M Laser Imager Plus M952 with connected 3M film processor, 3M IRB film, 3M-XPM X-ray chemical mixer, 3M developer and fixer) or without (Machine 2: 3M Laser Imager Plus M952 with separate DuPont Cronex film processor, Kodak IR film, Kodak automixer, Kodak developer and fixer) a directly connected film processing unit. In our checks based on DIN 6868 and ONORM S 5240, the system with the directly connected film processing unit showed film-processing constancy in accordance with DIN and ONORM. The film-constancy checks demanded by DIN 6868 could therefore be performed at longer intervals for such equipment. By comparison, systems with conventional darkroom processing show clearly increased fluctuation; the daily checks demanded by the standards are therefore essential to guarantee timely corrective action and constant documentation quality.

  11. [Analysis and countermeasure for quality risk in process of traditional Chinese medicine preparations].

    PubMed

    Yang, Ming; Yang, Yuan-Zhen; Wang, Ya-Qi; Wu, Zhen-Feng; Wang, Xue-Cheng; Luo, Jing

    2017-03-01

    Product quality relies not only on testing methods but also on design and development, production control, and the management of every aspect of manufacturing and logistics. Quality is built in at the level of process control. It is therefore very important to accurately identify the factors that may introduce quality risk in the production process and to establish corresponding quality control measures. This article systematically analyzes the sources of quality risk at each stage of the production process for traditional Chinese medicine preparations, discusses ways and methods of identifying such risks, and provides references for perfecting whole-process quality management of traditional Chinese medicine preparations. Copyright© by the Chinese Pharmaceutical Association.

  12. Emerging structural insights into glycoprotein quality control coupled with N-glycan processing in the endoplasmic reticulum.

    PubMed

    Satoh, Tadashi; Yamaguchi, Takumi; Kato, Koichi

    2015-01-30

    In the endoplasmic reticulum (ER), the sugar chain is initially introduced onto newly synthesized proteins as a triantennary tetradecasaccharide (Glc3Man9GlcNAc2). The attached oligosaccharide chain is subjected to stepwise trimming by the actions of specific glucosidases and mannosidases. In these processes, the transiently expressed N-glycans, as processing intermediates, function as signals for the determination of glycoprotein fates, i.e., folding, transport, or degradation, through interactions with a series of intracellular lectins. The monoglucosylated glycoforms are hallmarks of incompletely folded states of glycoproteins in this system, whereas the outer mannose trimming leads to ER-associated glycoprotein degradation. This review outlines the recently emerging evidence regarding the molecular and structural basis of this glycoprotein quality control system, which is regulated through dynamic interplay among intracellular lectins, glycosidases, and glycosyltransferase. Structural snapshots of carbohydrate-lectin interactions have been provided at the atomic level using X-ray crystallographic analyses. Conformational ensembles of uncomplexed triantennary high-mannose-type oligosaccharides have been characterized in a quantitative manner using molecular dynamics simulation in conjunction with nuclear magnetic resonance spectroscopy. These complementary views provide new insights into glycoprotein recognition in quality control coupled with N-glycan processing.

  13. Artificial Intelligence Approach to Support Statistical Quality Control Teaching

    ERIC Educational Resources Information Center

    Reis, Marcelo Menezes; Paladini, Edson Pacheco; Khator, Suresh; Sommer, Willy Arno

    2006-01-01

    Statistical quality control--SQC (consisting of Statistical Process Control, Process Capability Studies, Acceptance Sampling and Design of Experiments) is a very important tool to obtain, maintain and improve the Quality level of goods and services produced by an organization. Despite its importance, and the fact that it is taught in technical and…

  14. Nuclear Technology Series. Course 14: Introduction to Quality Assurance/Quality Control.

    ERIC Educational Resources Information Center

    Technical Education Research Center, Waco, TX.

    This technical specialty course is one of thirty-five courses designed for use by two-year postsecondary institutions in five nuclear technician curriculum areas: (1) radiation protection technician, (2) nuclear instrumentation and control technician, (3) nuclear materials processing technician, (4) nuclear quality-assurance/quality-control…

  15. Advances in Process Control.

    ERIC Educational Resources Information Center

    Morrison, David L.; And Others

    1982-01-01

    Advances in electronics and computer science have enabled industries (pulp/paper, iron/steel, petroleum/chemical) to attain better control of their processes with resulting increases in quality, productivity, profitability, and compliance with government regulations. (JN)

  16. A whole process quality control system for energy measuring instruments inspection based on IOT technology

    NASA Astrophysics Data System (ADS)

    Yin, Bo; Liu, Li; Wang, Jiahan; Li, Xiran; Liu, Zhenbo; Li, Dewei; Wang, Jun; Liu, Lu; Wu, Jun; Xu, Tingting; Cui, He

    2017-10-01

    Electric energy measurement is fundamental work in the power industry: accurate measurements play a vital role in protecting the economic interests of both parties to a power supply contract, and the standardized management of metering laboratories at all levels directly affects the fairness of measurement. Currently, metering laboratories generally use one-dimensional bar codes as the recognition object, advance the testing process through manual management, and require human input of most test data to generate reports. This process carries many problems and potential risks: data cannot be saved completely, the status of an inspection cannot be traced, the inspection process is not fully controllable, and so on. To meet a provincial metrology center's requirements for whole-process management of performance tests of power measuring appliances, we used large-capacity RF tags as the process-management information medium and developed a general measurement experiment management system. We formulated a standardized full performance test process, improved the raw-data recording mode of the experimental process, developed an automatic warehouse inventory device, and established a strict system for sample transfer and storage. The system ensures that all raw inspection data can be traced, achieves full life-cycle control of samples, and significantly improves the quality control level and the effectiveness of inspection work.

  17. Quality status display for a vibration welding process

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spicer, John Patrick; Abell, Jeffrey A.; Wincek, Michael Anthony

    A method includes receiving, during a vibration welding process, a set of sensory signals from a collection of sensors positioned with respect to a work piece during formation of a weld on or within the work piece. The method also includes receiving control signals from a welding controller during the process, with the control signals causing the welding horn to vibrate at a calibrated frequency, and processing the received sensory and control signals using a host machine. Additionally, the method includes displaying a predicted weld quality status on a surface of the work piece using a status projector. The method may also include identifying and displaying a quality status of suspect welds. The laser projector may project a laser beam directly onto or immediately adjacent to the suspect welds, e.g., as a red, green, blue laser or a gas laser having a switched color filter.

  18. Direct infusion mass spectrometry metabolomics dataset: a benchmark for data processing and quality control

    PubMed Central

    Kirwan, Jennifer A; Weber, Ralf J M; Broadhurst, David I; Viant, Mark R

    2014-01-01

    Direct-infusion mass spectrometry (DIMS) metabolomics is an important approach for characterising molecular responses of organisms to disease, drugs and the environment. Increasingly large-scale metabolomics studies are being conducted, necessitating improvements in both bioanalytical and computational workflows to maintain data quality. This dataset represents a systematic evaluation of the reproducibility of a multi-batch DIMS metabolomics study of cardiac tissue extracts. It comprises twenty biological samples (cow vs. sheep) that were analysed repeatedly, in 8 batches across 7 days, together with a concurrent set of quality control (QC) samples. Data are presented from each step of the workflow and are available in MetaboLights. The strength of the dataset is that intra- and inter-batch variation can be corrected using QC spectra and the quality of this correction assessed independently using the repeatedly-measured biological samples. Originally designed to test the efficacy of a batch-correction algorithm, it will enable others to evaluate novel data processing algorithms. Furthermore, this dataset serves as a benchmark for DIMS metabolomics, derived using best-practice workflows and rigorous quality assessment. PMID:25977770
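
    The QC-based batch correction such a dataset is designed to test can be sketched in its simplest form: scale each feature's sample intensities within a batch by that batch's median QC intensity. This is a deliberately simplified stand-in for the published algorithm, with invented intensities:

```python
from statistics import median

def batch_correct(samples, qcs):
    """Divide one feature's sample intensities by the batch's median QC value."""
    factor = median(qcs)
    return [s / factor for s in samples]

# Two batches of the same feature; QC drift doubles in batch 2
batch1 = batch_correct([100.0, 110.0], qcs=[50.0, 50.0, 50.0])
batch2 = batch_correct([210.0, 190.0], qcs=[100.0, 100.0, 100.0])
print(batch1, batch2)  # → [2.0, 2.2] [2.1, 1.9]
```

    After correction the two batches are on a common QC-relative scale, so the repeatedly measured biological samples can be used to verify that inter-batch variation has actually been removed.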

  19. Defining the best quality-control systems by design and inspection.

    PubMed

    Hinckley, C M

    1997-05-01

    Not all of the many approaches to quality control are equally effective. Nonconformities in laboratory testing are caused basically by excessive process variation and mistakes. Statistical quality control can effectively control process variation, but it cannot detect or prevent most mistakes. Because mistakes or blunders are frequently the dominant source of nonconformities, we conclude that statistical quality control by itself is not effective. I explore the 100% inspection methods essential for controlling mistakes. Unlike the inspection techniques that Deming described as ineffective, the new "source" inspection methods can detect mistakes and enable corrections before nonconformities are generated, achieving the highest degree of quality at a fraction of the cost of traditional methods. Key relationships between task complexity and nonconformity rates are also described, along with cultural changes that are essential for implementing the best quality-control practices.

  20. Quality Risk Management: Putting GMP Controls First.

    PubMed

    O'Donnell, Kevin; Greene, Anne; Zwitkovits, Michael; Calnan, Nuala

    2012-01-01

    This paper presents a practical way in which current approaches to quality risk management (QRM) may be improved, such that they better support qualification, validation programs, and change control proposals at manufacturing sites. The paper is focused on the treatment of good manufacturing practice (GMP) controls during QRM exercises. It specifically addresses why it is important to evaluate and classify such controls in terms of how they affect the severity, probability of occurrence, and detection ratings that may be assigned to potential failure modes or negative events. It also presents a QRM process that is designed to directly link the outputs of risk assessments and risk control activities with qualification and validation protocols in the GMP environment. This paper concerns the need for improvement in the use of risk-based principles and tools when working to ensure that the manufacturing processes used to produce medicines, and their related equipment, are appropriate. Manufacturing processes need to be validated (or proven) to demonstrate that they can produce a medicine of the required quality. The items of equipment used in such processes need to be qualified, in order to prove that they are fit for their intended use. Quality risk management (QRM) tools can be used to support such qualification and validation activities, but their use should be science-based and subject to as little subjectivity and uncertainty as possible. When changes are proposed to manufacturing processes, equipment, or related activities, they also need careful evaluation to ensure that any risks present are managed effectively. This paper presents a practical approach to how QRM may be improved so that it better supports qualification, validation programs, and change control proposals in a more scientific way. This improved approach is based on the treatment of what are called good manufacturing process (GMP) controls during those QRM exercises. 
A GMP control can be considered
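
    The severity, occurrence, and detection ratings discussed above are commonly combined, FMEA-style, into a risk priority number used to rank failure modes. A minimal sketch with invented failure modes and ratings (the paper's own QRM process may weight or classify controls differently):

```python
def rpn(severity, occurrence, detection):
    """Risk priority number: each rating on a 1-10 scale,
    with higher detection meaning harder to detect."""
    return severity * occurrence * detection

# Hypothetical failure modes for a manufacturing step: (S, O, D)
failure_modes = {
    "filter integrity breach": (9, 2, 3),   # RPN = 54
    "mix time out of range":   (5, 4, 2),   # RPN = 40
}
ranked = sorted(failure_modes, key=lambda k: rpn(*failure_modes[k]), reverse=True)
print(ranked[0])  # → 'filter integrity breach'
```

    Classifying GMP controls by which of the three ratings they reduce, as the paper proposes, then links risk-control outputs directly to the qualification and validation work that mitigates them.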

  1. An introduction to statistical process control in research proteomics.

    PubMed

    Bramwell, David

    2013-12-16

    Statistical process control is a well-established and respected method which provides a general-purpose, consistent framework for monitoring and improving the quality of a process. It is routinely used in many industries where the quality of final products is critical and is often required in clinical diagnostic laboratories [1,2]. To date, the methodology has been little utilised in research proteomics. It has been shown to be capable of delivering quantitative QC procedures for qualitative clinical assays [3], making it an ideal methodology to apply to this area of biological research. This article aims to introduce statistical process control as an objective strategy for quality control and to show how it could benefit proteomics researchers and enhance the quality of the results they generate. We demonstrate that rules which provide basic quality control are easy to derive and implement and could have a major impact on data quality for many studies. Statistical process control is a powerful tool for investigating and improving proteomics research work-flows. The process of characterising measurement systems and defining control rules forces the exploration of key questions that can lead to significant improvements in performance. This work asserts that QC is essential to proteomics discovery experiments. Every experimenter must know the current capabilities of their measurement system and have an objective means for tracking and ensuring that performance. Proteomic analysis work-flows are complicated and multi-variate. QC is critical for clinical chemistry measurements and huge strides have been made in ensuring the quality and validity of results in clinical biochemistry labs. This work introduces some of these QC concepts and works to bridge their use from single-analyte QC to applications in multi-analyte systems. This article is part of a Special Issue entitled: Standardization and Quality Control in Proteomics. Copyright © 2013 The Author. 
Published by Elsevier
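
    The "easy to derive and implement" control rules mentioned above usually start from 3-sigma limits estimated on an in-control reference run. A minimal sketch with invented QC measurements (e.g. a spiked-peptide peak-area ratio), illustrating only the basic Shewhart rule:

```python
from statistics import mean, stdev

def control_limits(reference):
    """3-sigma limits estimated from an in-control reference run."""
    m, s = mean(reference), stdev(reference)
    return m - 3 * s, m + 3 * s

def out_of_control(values, limits):
    """Indices of points breaking the limits (the basic Shewhart rule)."""
    lo, hi = limits
    return [i for i, v in enumerate(values) if not lo <= v <= hi]

reference = [10.1, 9.9, 10.0, 10.2, 9.8, 10.0, 10.1, 9.9]
limits = control_limits(reference)
print(out_of_control([10.0, 10.3, 12.5, 9.9], limits))  # → [2]
```

    Additional run rules (e.g. several consecutive points on one side of the mean) can be layered on the same statistics to catch drifts that never break the 3-sigma limits.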

  2. Quality management of manufacturing process based on manufacturing execution system

    NASA Astrophysics Data System (ADS)

    Zhang, Jian; Jiang, Yang; Jiang, Weizhuo

    2017-04-01

    Quality control elements in the manufacturing process are elaborated, and an approach to quality management of the manufacturing process based on a manufacturing execution system (MES) is discussed. Finally, the functions of an MES for a microcircuit production line are introduced.

  3. CRN5EXP: Expert system for statistical quality control

    NASA Technical Reports Server (NTRS)

    Hentea, Mariana

    1991-01-01

    The purpose of the Expert System CRN5EXP is to assist in checking the quality of the coils at two very important mills in a steel plant: hot rolling and cold rolling. The system interprets the statistical quality control charts, and diagnoses and predicts the quality of the steel. Measurements of process control variables are recorded in a database, and sample statistics such as the mean and the range are computed and plotted on a control chart. The chart is analyzed for patterns using the C Language Integrated Production System (CLIPS) and a forward chaining technique to reach a conclusion about the causes of defects and to take management measures for the improvement of quality control techniques. The Expert System combines the certainty factors associated with the process control variables to predict the quality of the steel. The paper presents the approach to extracting data from the database, the rationale for combining certainty factors, and the architecture and use of the Expert System. However, the interpretation of control chart patterns requires the human expert's knowledge, which lends itself to expression as expert system rules.
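
    The subgroup statistics the system plots, means and ranges, and the control limits derived from them can be sketched as follows. The chart constants are the standard Shewhart values for subgroups of four; the coil data and the `xbar_r_limits` helper are invented for illustration:

```python
from statistics import mean

# Standard Shewhart chart constants for subgroups of size 4
A2, D3, D4 = 0.729, 0.0, 2.282

def xbar_r_limits(subgroups):
    """Control limits for the X-bar and R charts from subgroup data."""
    xbars = [mean(g) for g in subgroups]
    ranges = [max(g) - min(g) for g in subgroups]
    xbarbar, rbar = mean(xbars), mean(ranges)
    return {
        "xbar": (xbarbar - A2 * rbar, xbarbar + A2 * rbar),
        "range": (D3 * rbar, D4 * rbar),
    }

# Invented coil-width samples (mm): three subgroups of four coils each
coil_widths = [[2.0, 2.1, 1.9, 2.0], [2.1, 2.0, 2.0, 1.9], [2.0, 2.2, 2.1, 2.1]]
limits = xbar_r_limits(coil_widths)
print(limits["range"])
```

    Pattern rules (runs, trends, cycles) of the kind CLIPS encodes then operate on the sequence of plotted points relative to these limits.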

  4. Assessing Subjectivity in Sensor Data Post Processing via a Controlled Experiment

    NASA Astrophysics Data System (ADS)

    Jones, A. S.; Horsburgh, J. S.; Eiriksson, D.

    2017-12-01

    Environmental data collected by in situ sensors must be reviewed to verify validity, and conducting quality control often requires making edits in post processing to generate approved datasets. This process involves decisions by technicians, data managers, or data users on how to handle problematic data. Options include: removing data from a series, retaining data with annotations, and altering data based on algorithms related to adjacent data points or the patterns of data at other locations or of other variables. Ideally, given the same dataset and the same quality control guidelines, multiple data quality control technicians would make the same decisions in data post processing. However, despite the development and implementation of guidelines aimed to ensure consistent quality control procedures, we have faced ambiguity when performing post processing, and we have noticed inconsistencies in the practices of individuals performing quality control post processing. Technicians with the same level of training and using the same input datasets may produce different results, affecting the overall quality and comparability of finished data products. Different results may also be produced by technicians that do not have the same level of training. In order to assess the effect of subjective decision making by the individual technician on the end data product, we designed an experiment where multiple users performed quality control post processing on the same datasets using a consistent set of guidelines, field notes, and tools. We also assessed the effect of technician experience and training by conducting the same procedures with a group of novices unfamiliar with the data and the quality control process and compared their results to those generated by a group of more experienced technicians. 
In this presentation, we report our observations of the degree of subjectivity in sensor data post processing, assessing and quantifying the impacts of individual technician as
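
    One simple way to quantify the subjectivity such an experiment exposes, though not necessarily the metric the authors used, is pairwise percent agreement between technicians' point-by-point decisions. A sketch with invented labels:

```python
from itertools import combinations

def percent_agreement(labels_by_tech):
    """Mean fraction of points on which each pair of technicians agrees."""
    pairs = list(combinations(labels_by_tech, 2))
    agree = sum(
        sum(x == y for x, y in zip(a, b)) / len(a) for a, b in pairs
    )
    return agree / len(pairs)

# Hypothetical keep/flag/delete decisions for four data points
tech_labels = [
    ["keep", "flag", "delete", "keep"],   # technician 1
    ["keep", "flag", "flag", "keep"],     # technician 2
    ["keep", "delete", "delete", "keep"], # technician 3
]
print(round(percent_agreement(tech_labels), 3))  # → 0.667
```

    Chance-corrected statistics such as Cohen's or Fleiss' kappa would be a natural refinement when comparing the novice and experienced groups.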

  5. [Statistical process control applied to intensity modulated radiotherapy pretreatment controls with portal dosimetry].

    PubMed

    Villani, N; Gérard, K; Marchesi, V; Huger, S; François, P; Noël, A

    2010-06-01

    The first purpose of this study was to illustrate the contribution of statistical process control to better security in intensity modulated radiotherapy (IMRT) treatments. This improvement is possible by controlling the dose delivery process, which is characterized by pretreatment quality control results. It was therefore necessary to put portal dosimetry measurements under statistical control (the ionisation chamber measurements were already monitored using statistical process control tools). The second objective was to determine whether portal dosimetry could be substituted for the ionisation chamber in order to optimize the time devoted to pretreatment quality control. At the Alexis-Vautrin center, pretreatment quality controls in IMRT for prostate and head and neck treatments were performed for each beam of each patient. These controls were made with an ionisation chamber, which is the reference detector for absolute dose measurement, and with portal dosimetry for verification of the dose distribution. Statistical process control is a statistical analysis method, originating in industry, used to control and improve the quality of the studied process. It uses graphic tools, such as control charts, to follow the process and warn the operator in case of failure, and quantitative tools to evaluate the ability of the process to respect guidelines: the capability study. The study was performed on 450 head and neck beams and 100 prostate beams. Control charts of the mean and standard deviation, showing both slow, weak drifts and strong, fast ones, were established and revealed a special cause that had been introduced (a manual shift of the leaf gap of the multileaf collimator). The correlation between the dose measured at one point by the EPID and by the ionisation chamber was evaluated at more than 97%, and cases of disagreement between the two measurements were identified. The study demonstrated the feasibility of reducing the time devoted to
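
    A generic capability calculation of the kind used in such studies compares the tolerance width (here +/- 4% dose deviation) with six standard deviations of the process. The sketch below is a simplified Cp-style index with invented deviations, not the study's specific C(pc) definition:

```python
from statistics import stdev

def capability(deviations_pct, tol=4.0):
    """Cp-style index: tolerance width / (6 * process standard deviation)."""
    return (2 * tol) / (6 * stdev(deviations_pct))

# Invented pretreatment QC deviations (%) between measured and planned dose
devs = [0.5, -0.3, 0.2, -0.1, 0.4, 0.0, -0.2, 0.1]
cp = capability(devs)
print(cp > 1.33)  # → True: the process fits comfortably inside the tolerances
```

    An index well above 1 means the process spread is small relative to the clinical tolerances, which is the justification for monitoring with control charts rather than tightening the measurement itself.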

  6. Multileaf collimator performance monitoring and improvement using semiautomated quality control testing and statistical process control.

    PubMed

    Létourneau, Daniel; Wang, An; Amin, Md Nurul; Pearce, Jim; McNiven, Andrea; Keller, Harald; Norrlinger, Bernhard; Jaffray, David A

    2014-12-01

    High-quality radiation therapy using highly conformal dose distributions and image-guided techniques requires optimum machine delivery performance. In this work, a monitoring system for multileaf collimator (MLC) performance, integrating semiautomated MLC quality control (QC) tests and statistical process control tools, was developed. The MLC performance monitoring system was used for almost a year on two commercially available MLC models. Control charts were used to establish MLC performance and assess test frequency required to achieve a given level of performance. MLC-related interlocks and servicing events were recorded during the monitoring period and were investigated as indicators of MLC performance variations. The QC test developed as part of the MLC performance monitoring system uses 2D megavoltage images (acquired using an electronic portal imaging device) of 23 fields to determine the location of the leaves with respect to the radiation isocenter. The precision of the MLC performance monitoring QC test and the MLC itself was assessed by detecting the MLC leaf positions on 127 megavoltage images of a static field. After initial calibration, the MLC performance monitoring QC test was performed 3-4 times/week over a period of 10-11 months to monitor positional accuracy of individual leaves for two different MLC models. Analysis of test results was performed using individuals control charts per leaf with control limits computed based on the measurements as well as two sets of specifications of ± 0.5 and ± 1 mm. Out-of-specification and out-of-control leaves were automatically flagged by the monitoring system and reviewed monthly by physicists. MLC-related interlocks reported by the linear accelerator and servicing events were recorded to help identify potential causes of nonrandom MLC leaf positioning variations. The precision of the MLC performance monitoring QC test and the MLC itself was within ± 0.22 mm for most MLC leaves and the majority of the

  7. Multileaf collimator performance monitoring and improvement using semiautomated quality control testing and statistical process control

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Létourneau, Daniel, E-mail: daniel.letourneau@rmp.uh.on.ca; McNiven, Andrea; Keller, Harald

    2014-12-15

    Purpose: High-quality radiation therapy using highly conformal dose distributions and image-guided techniques requires optimum machine delivery performance. In this work, a monitoring system for multileaf collimator (MLC) performance, integrating semiautomated MLC quality control (QC) tests and statistical process control tools, was developed. The MLC performance monitoring system was used for almost a year on two commercially available MLC models. Control charts were used to establish MLC performance and assess test frequency required to achieve a given level of performance. MLC-related interlocks and servicing events were recorded during the monitoring period and were investigated as indicators of MLC performance variations. Methods: The QC test developed as part of the MLC performance monitoring system uses 2D megavoltage images (acquired using an electronic portal imaging device) of 23 fields to determine the location of the leaves with respect to the radiation isocenter. The precision of the MLC performance monitoring QC test and the MLC itself was assessed by detecting the MLC leaf positions on 127 megavoltage images of a static field. After initial calibration, the MLC performance monitoring QC test was performed 3–4 times/week over a period of 10–11 months to monitor positional accuracy of individual leaves for two different MLC models. Analysis of test results was performed using individuals control charts per leaf with control limits computed based on the measurements as well as two sets of specifications of ±0.5 and ±1 mm. Out-of-specification and out-of-control leaves were automatically flagged by the monitoring system and reviewed monthly by physicists. MLC-related interlocks reported by the linear accelerator and servicing events were recorded to help identify potential causes of nonrandom MLC leaf positioning variations. Results: The precision of the MLC performance monitoring QC test and the MLC itself was within ±0.22 mm for most MLC

  8. Off-Line Quality Control In Integrated Circuit Fabrication Using Experimental Design

    NASA Astrophysics Data System (ADS)

    Phadke, M. S.; Kackar, R. N.; Speeney, D. V.; Grieco, M. J.

    1987-04-01

    Off-line quality control is a systematic method of optimizing production processes and product designs. It is widely used in Japan to produce high quality products at low cost. The method was introduced to us by Professor Genichi Taguchi, a Deming-award winner and a former Director of the Japanese Academy of Quality. In this paper we will i) describe the off-line quality control method, and ii) document our efforts to optimize the process for forming contact windows in 3.5 μm CMOS circuits fabricated in the Murray Hill Integrated Circuit Design Capability Laboratory. In the fabrication of integrated circuits it is critically important to produce contact windows of size very near the target dimension. Windows which are too small or too large lead to loss of yield. The off-line quality control method has improved both the process quality and productivity. The variance of the window size has been reduced by a factor of four. Also, processing time for window photolithography has been substantially reduced. The key steps of off-line quality control are: i) Identify important manipulatable process factors and their potential working levels. ii) Perform fractional factorial experiments on the process using orthogonal array designs. iii) Analyze the resulting data to determine the optimum operating levels of the factors. Both the process mean and the process variance are considered in this analysis. iv) Conduct an additional experiment to verify that the new factor levels indeed give an improvement.
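
    Step iii) above typically averages a Taguchi signal-to-noise ratio per factor level, so that both mean and variance enter the choice of operating level. A toy two-level illustration with invented data, not the actual window-photolithography experiment:

```python
from math import log10
from statistics import mean, stdev

def sn_nominal_best(replicates):
    """Taguchi 'nominal-the-best' S/N ratio in dB: larger is better."""
    m, s = mean(replicates), stdev(replicates)
    return 10 * log10((m * m) / (s * s))

# Each run: (level of hypothetical factor A, replicate window sizes in microns)
runs = [
    (1, [3.4, 3.5, 3.6]),
    (1, [3.3, 3.5, 3.7]),
    (2, [3.5, 3.5, 3.6]),
    (2, [3.4, 3.5, 3.5]),
]
for level in (1, 2):
    sn = mean(sn_nominal_best(r) for lvl, r in runs if lvl == level)
    print(f"factor A level {level}: S/N = {sn:.1f} dB")
```

    In this invented example level 2 gives the higher average S/N ratio, i.e. less window-size variation around the same mean, so it would be carried into the confirmation experiment of step iv).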

  9. A Quadrupole Dalton-based multi-attribute method for product characterization, process development, and quality control of therapeutic proteins.

    PubMed

    Xu, Weichen; Jimenez, Rod Brian; Mowery, Rachel; Luo, Haibin; Cao, Mingyan; Agarwal, Nitin; Ramos, Irina; Wang, Xiangyang; Wang, Jihong

    2017-10-01

    During manufacturing and storage, therapeutic proteins are subject to various post-translational modifications (PTMs), such as isomerization, deamidation, oxidation, disulfide bond modifications and glycosylation. Certain PTMs may affect bioactivity, stability or the pharmacokinetic and pharmacodynamic profile and are therefore classified as potential critical quality attributes (pCQAs). Identifying, monitoring and controlling these PTMs are usually key elements of the Quality by Design (QbD) approach. Traditionally, multiple analytical methods are utilized for these purposes, which is time consuming and costly. In recent years, multi-attribute monitoring methods have been developed in the biopharmaceutical industry. However, these methods combine high-end mass spectrometry with complicated data analysis software, which can pose difficulties when implemented in a quality control (QC) environment. Here we report a multi-attribute method (MAM) using a Quadrupole Dalton (QDa) mass detector to selectively monitor and quantitate PTMs in a therapeutic monoclonal antibody. The result output from the QDa-based MAM is straightforward and automatic. Evaluation results indicate that this method provides results comparable to the traditional assays. To support future application in the QC environment, this method was qualified according to the International Conference on Harmonization (ICH) guideline and applied in the characterization of drug substance and stability samples. The QDa-based MAM is shown to be an extremely useful tool for product and process characterization studies that facilitates understanding of process impact on multiple quality attributes, while being QC friendly and cost-effective.
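
    At its core, the quantitation in such a multi-attribute method reduces to the relative abundance of modified versus unmodified peptide signal. A minimal sketch with invented extracted-ion peak areas; `ptm_percent` is a hypothetical helper, not the QDa software's API:

```python
def ptm_percent(modified_area, unmodified_area):
    """Relative abundance (%) of the modified form of one peptide."""
    return 100.0 * modified_area / (modified_area + unmodified_area)

# e.g. deamidated vs. native form of one tryptic peptide
print(round(ptm_percent(2.0e5, 3.8e6), 2))  # → 5.0
```

    Tracking such percentages per attribute across process and stability samples is what lets one method replace several traditional single-attribute assays.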

  10. A comprehensive analysis of the IMRT dose delivery process using statistical process control (SPC).

    PubMed

    Gérard, Karine; Grandhaye, Jean-Pierre; Marchesi, Vincent; Kafrouni, Hanna; Husson, François; Aletti, Pierre

    2009-04-01

    The aim of this study is to introduce tools that improve the security of each IMRT patient treatment by determining action levels for the dose delivery process. To achieve this, the patient-specific quality control results obtained with an ionization chamber, which characterize the dose delivery process, were retrospectively analyzed using a method borrowed from industry: statistical process control (SPC). The analysis consists of four principal, well-structured steps. The authors first quantified the short-term variability of ionization chamber measurements relative to the clinical tolerances used in the cancer center (+/- 4% deviation between the calculated and measured doses) by calculating a control process capability (C(pc)) index. The C(pc) index was found to be greater than 4, which implies that the observed variability of the dose delivery process is not biased by the short-term variability of the measurement. The authors then demonstrated, using a normality test, that the quality control results could be approximated by a normal distribution with two parameters (mean and standard deviation). Finally, the authors used two complementary tools, control charts and performance indices, to thoroughly analyze the IMRT dose delivery process. Control charts monitor the process over time, using statistical control limits to distinguish random (natural) variations from significant changes in the process, whereas performance indices quantify the ability of the process, at a given moment, to produce data within the clinical tolerances. The authors retrospectively showed that the analysis of three selected control charts (individual value, moving-range, and EWMA control charts) allowed efficient drift detection of the dose delivery process for prostate and head-and-neck treatments before the quality controls fell outside the clinical tolerances.
Therefore, when analyzed in real time, during quality controls, they should improve the
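    The individual-value, moving-range, and EWMA charts named in the record above can be sketched in a few lines. This is a minimal illustration, not the paper's method: the dose-deviation values, the Shewhart constants for subgroups of two (d2 = 1.128, D4 = 3.267), and the EWMA smoothing constant lambda = 0.2 are all assumptions chosen for the example.

```python
import statistics

def control_limits(values):
    """Individual (I) and moving-range (MR) chart limits.
    Sigma is estimated from the mean moving range (d2 = 1.128 for n = 2)."""
    mr = [abs(b - a) for a, b in zip(values, values[1:])]
    mr_bar = statistics.mean(mr)
    x_bar = statistics.mean(values)
    sigma = mr_bar / 1.128                       # short-term sigma estimate
    return ((x_bar - 3 * sigma, x_bar + 3 * sigma),   # I-chart LCL, UCL
            (0.0, 3.267 * mr_bar))                    # MR-chart LCL, UCL (D4 = 3.267)

def ewma(values, lam=0.2):
    """Exponentially weighted moving average series, useful for drift detection."""
    z, out = values[0], []
    for x in values:
        z = lam * x + (1 - lam) * z
        out.append(z)
    return out

# Hypothetical % deviations between measured and calculated dose
deviations = [0.5, -0.2, 0.1, 0.4, -0.1]
limits_i, limits_mr = control_limits(deviations)
trend = ewma(deviations)
```

A point outside `limits_i`, or a sustained EWMA excursion, would signal a process drift before individual QA results exceed the clinical tolerance.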

  11. User-friendly solutions for microarray quality control and pre-processing on ArrayAnalysis.org

    PubMed Central

    Eijssen, Lars M. T.; Jaillard, Magali; Adriaens, Michiel E.; Gaj, Stan; de Groot, Philip J.; Müller, Michael; Evelo, Chris T.

    2013-01-01

    Quality control (QC) is crucial for any scientific method producing data. Applying adequate QC introduces new challenges in the genomics field, where large amounts of data are produced with complex technologies. For DNA microarrays, specific algorithms for QC and pre-processing, including normalization, have been developed by the scientific community, especially for expression chips of the Affymetrix platform. Many of these have been implemented in the statistical scripting language R and are available from the Bioconductor repository. However, their application is hampered by the lack of integrative tools usable by researchers of any experience level. To fill this gap, we developed a freely available tool for QC and pre-processing of Affymetrix gene expression results, extending, integrating, and harmonizing the functionality of Bioconductor packages. The tool can be easily accessed through a wizard-like web portal at http://www.arrayanalysis.org or downloaded for local use in R. The portal provides extensive documentation, including user guides, interpretation help with real output illustrations, and detailed technical documentation. It assists newcomers to the field in performing state-of-the-art QC and pre-processing, while offering data analysts an integral open-source package. Providing the scientific community with this easily accessible tool will allow data quality to be improved and will promote the reuse and adoption of standards. PMID:23620278

  12. Software Quality Assurance and Controls Standard

    DTIC Science & Technology

    2010-04-27

    Software Quality Assurance and Controls Standard. Sue Carroll, Principal Software Quality Analyst, SAS; John Walz, VP Technology. What is in a Software Life Cycle (SLC) process? What is in a SQA process? Where are SQA controls? What is the SQA standards history? What is changing in SQA?

  13. Applying Statistical Process Quality Control Methodology to Educational Settings.

    ERIC Educational Resources Information Center

    Blumberg, Carol Joyce

    A subset of Statistical Process Control (SPC) methodology known as control charting is introduced. SPC methodology is a collection of graphical and inferential statistics techniques used to study the progress of phenomena over time. The types of control charts covered are the X-bar (mean), R (range), X (individual observations), and MR (moving…
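    The X-bar and R charts the record lists can be computed directly from subgroup data. A minimal sketch, using the standard Shewhart constants for subgroups of five (A2 = 0.577, D3 = 0, D4 = 2.114); the subgroup values below are invented for illustration:

```python
def xbar_r_limits(subgroups, A2=0.577, D3=0.0, D4=2.114):
    """Control limits for X-bar and R charts (constants assume subgroup size 5)."""
    means = [sum(s) / len(s) for s in subgroups]
    ranges = [max(s) - min(s) for s in subgroups]
    xbar = sum(means) / len(means)        # grand mean  -> X-bar centerline
    rbar = sum(ranges) / len(ranges)      # mean range  -> R-chart centerline
    return ((xbar - A2 * rbar, xbar + A2 * rbar),   # X-bar chart LCL, UCL
            (D3 * rbar, D4 * rbar))                 # R chart LCL, UCL

data = [[9, 10, 11, 10, 10], [10, 10, 12, 9, 10], [11, 10, 10, 10, 9]]
(x_lcl, x_ucl), (r_lcl, r_ucl) = xbar_r_limits(data)
```

Subgroup means plotted against the X-bar limits track process centering; subgroup ranges against the R limits track process spread.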

  14. PACS quality control and automatic problem notifier

    NASA Astrophysics Data System (ADS)

    Honeyman-Buck, Janice C.; Jones, Douglas; Frost, Meryll M.; Staab, Edward V.

    1997-05-01

    One side effect of installing a clinical PACS is that users become dependent upon the technology, and in some cases it can be very difficult to revert to a film-based system if components fail. System failures range from slow deterioration of function, as seen in the loss of monitor luminance, to sudden catastrophic loss of the entire PACS network. This paper describes the quality control procedures in place at the University of Florida and the automatic notification system that alerts PACS personnel when a failure has happened or is anticipated. The goal is to recover from a failure with a minimum of downtime and no data loss. Routine quality control is practiced on all aspects of PACS, from acquisition, through network routing, through display, and including archiving. Whenever possible, the system components perform self-checks and cross-platform checks for active processes, file system status, errors in log files, and system uptime. When an error is detected or an exception occurs, an automatic page is sent to a pager with a diagnostic code. Documentation on each code, troubleshooting procedures, and repairs is kept on an intranet server accessible only to people involved in maintaining the PACS. In addition to the automatic paging system for error conditions, acquisition is assured by an automatic fax report sent daily to all technologists acquiring PACS images, used as a cross-check that all studies are archived before being removed from the acquisition systems. Daily quality control is performed to assure that studies can be moved from each acquisition system and that contrast adjustment functions correctly. The results of selected quality control reports will be presented. The intranet documentation server will be described along with the automatic pager system. Monitor quality control reports will be described, and the cost of quality control will be quantified.
As PACS is accepted as a clinical tool, the same standards of quality control must be established

  15. Importance of implementing an analytical quality control system in a core laboratory.

    PubMed

    Marques-Garcia, F; Garcia-Codesal, M F; Caro-Narros, M R; Contreras-SanFeliciano, T

    2015-01-01

    The aim of the clinical laboratory is to provide useful information for the screening, diagnosis, and monitoring of disease. The laboratory should ensure the quality of the extra-analytical and analytical processes, based on set criteria. To do this, it develops and implements a system of internal quality control designed to detect errors, and compares its data with those of other laboratories through external quality control. In this way it has a tool to detect whether the objectives set are being met and, in case of errors, to take corrective actions and ensure the reliability of the results. This article sets out to describe the design and implementation of an internal quality control protocol, as well as its periodic assessment (at 6-month intervals) to determine compliance with pre-determined specifications (Stockholm Consensus(1)). A total of 40 biochemical and 15 immunochemical methods were evaluated using three different control materials. Next, a standard operating procedure was planned to develop a system of internal quality control that included calculating the error of the analytical process, setting quality specifications, and verifying compliance. The quality control data were then statistically described as means, standard deviations, and coefficients of variation, as well as systematic, random, and total errors. The quality specifications were then fixed, and the operational rules to apply in the analytical process were calculated. Finally, our data were compared with those of other laboratories through an external quality assurance program. The development of an analytical quality control system is a highly structured process. It should be designed to detect errors that compromise the stability of the analytical process. The laboratory should review its quality indicators (systematic, random, and total error) at regular intervals, in order to ensure that they meet pre-determined specifications and, if not, apply the appropriate corrective actions.
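    The systematic, random, and total error figures described above can be computed from repeated measurements of a control material. A minimal sketch, assuming the common convention TE = |bias| + 1.65 * CV and invented assay values; the function name and data are illustrative, not from the article:

```python
import statistics

def qc_performance(measured, target):
    """Bias (systematic error, %), CV (random error, %), and total error,
    using TE = |bias| + 1.65 * CV (one common convention, assumed here)."""
    mean = statistics.mean(measured)
    sd = statistics.stdev(measured)
    cv = 100 * sd / mean                       # random error as coefficient of variation
    bias = 100 * (mean - target) / target      # systematic error vs the assigned value
    return {"cv": cv, "bias": bias, "total_error": abs(bias) + 1.65 * cv}

# Hypothetical repeated control-material results against an assigned value of 100
perf = qc_performance([98.0, 101.0, 99.0, 102.0, 100.0], target=100.0)
```

Comparing `perf["total_error"]` against the allowable total error from the chosen quality specification tells the laboratory whether the method is in compliance.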

  16. Proteomics Quality Control: Quality Control Software for MaxQuant Results.

    PubMed

    Bielow, Chris; Mastrobuoni, Guido; Kempa, Stefan

    2016-03-04

    Mass spectrometry-based proteomics coupled to liquid chromatography has matured into an automated, high-throughput technology, producing data on the scale of multiple gigabytes per instrument per day. Consequently, automated quality control (QC) and quality analysis (QA) capable of detecting measurement bias, verifying consistency, and avoiding propagation of error are paramount for instrument operators and scientists in charge of downstream analysis. We have developed an R-based QC pipeline called Proteomics Quality Control (PTXQC) for bottom-up LC-MS data generated by the MaxQuant software pipeline. PTXQC creates a QC report containing a comprehensive and powerful set of QC metrics, augmented with automated scoring functions. The automated scores are collated to create an overview heatmap at the beginning of the report, giving valuable guidance to nonspecialists as well. Our software supports a wide range of experimental designs, including stable isotope labeling by amino acids in cell culture (SILAC), tandem mass tags (TMT), and label-free data. Furthermore, we introduce new metrics to score MaxQuant's Match-between-runs (MBR) functionality, by which peptide identifications can be transferred across Raw files based on accurate retention time and m/z. Last but not least, PTXQC is easy to install and use, and represents the first QC software capable of processing MaxQuant result tables. PTXQC is freely available at https://github.com/cbielow/PTXQC.

  17. Measuring health care process quality with software quality measures.

    PubMed

    Yildiz, Ozkan; Demirörs, Onur

    2012-01-01

    Existing quality models focus on specific diseases, clinics, or clinical areas. Although they contain structure, process, or output type measures, no model measures the quality of health care processes comprehensively. In addition, because overall process quality is not measured, hospitals cannot compare the quality of their processes internally or externally. To address these problems, a new model was developed from software quality measures. We adapted the ISO/IEC 9126 software quality standard for health care processes, then added measurable elements from the JCIAS (Joint Commission International Accreditation Standards for Hospitals) to the model scope to unify functional requirements. Measurement results for the assessment (diagnosing) process are provided in this paper. After application, it was concluded that the model identifies weak and strong aspects of the processes, gives a more detailed picture of process quality, and provides quantifiable information that hospitals can use to compare their processes with those of multiple organizations.

  18. A comprehensive analysis of the IMRT dose delivery process using statistical process control (SPC)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gerard, Karine; Grandhaye, Jean-Pierre; Marchesi, Vincent

    The aim of this study is to introduce tools that improve the security of each IMRT patient treatment by determining action levels for the dose delivery process. To achieve this, the patient-specific quality control results obtained with an ionization chamber, which characterize the dose delivery process, were retrospectively analyzed using a method borrowed from industry: statistical process control (SPC). The analysis consists of four principal, well-structured steps. The authors first quantified the short-term variability of ionization chamber measurements relative to the clinical tolerances used in the cancer center (±4% deviation between the calculated and measured doses) by calculating a control process capability (C(pc)) index. The C(pc) index was found to be greater than 4, which implies that the observed variability of the dose delivery process is not biased by the short-term variability of the measurement. The authors then demonstrated, using a normality test, that the quality control results could be approximated by a normal distribution with two parameters (mean and standard deviation). Finally, the authors used two complementary tools, control charts and performance indices, to thoroughly analyze the IMRT dose delivery process. Control charts monitor the process over time, using statistical control limits to distinguish random (natural) variations from significant changes in the process, whereas performance indices quantify the ability of the process, at a given moment, to produce data within the clinical tolerances. The authors retrospectively showed that the analysis of three selected control charts (individual value, moving-range, and EWMA control charts) allowed efficient drift detection of the dose delivery process for prostate and head-and-neck treatments before the quality controls fell outside the clinical tolerances. Therefore, when analyzed in real time, during quality controls, they

  19. Multivariate statistical process control in product quality review assessment - A case study.

    PubMed

    Kharbach, M; Cherrah, Y; Vander Heyden, Y; Bouklouze, A

    2017-11-01

    According to the Food and Drug Administration and the European Good Manufacturing Practices (GMP) guidelines, the Annual Product Review (APR) is a mandatory GMP requirement. It consists of evaluating a large collection of qualitative and quantitative data in order to verify the consistency of an existing process. According to the Code of Federal Regulations (21 CFR 211.180), all finished products should be reviewed annually against quality standards to determine the need for any change in the specification or manufacturing of drug products. Conventional Statistical Process Control (SPC) evaluates the pharmaceutical production process by examining the effect of only a single factor at a time using a Shewhart chart, neglecting the interactions between variables. To overcome this issue, Multivariate Statistical Process Control (MSPC) can be used. Our case study concerns an APR assessment in which 164 historical batches containing six active ingredients, manufactured in Morocco, were collected during one year. Each batch was checked by assaying the six active ingredients by High Performance Liquid Chromatography according to European Pharmacopoeia monographs. The data matrix was evaluated both by SPC and MSPC. The SPC indicated that all batches were under control, while the MSPC, based on Principal Component Analysis (PCA), showed four batches (autoscaled data) and seven batches (robust-scaled data) outside the Hotelling T² 95% ellipse. An improvement in process capability is also observed when the most extreme batches are excluded. MSPC can be used for monitoring subtle changes in the manufacturing process during an APR assessment. Copyright © 2017 Académie Nationale de Pharmacie. Published by Elsevier Masson SAS. All rights reserved.
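    The PCA-based Hotelling T² screening described above can be sketched with NumPy alone. This is an illustrative toy, not the study's procedure: the batch data are synthetic, two principal components are kept arbitrarily, and an empirical 95th-percentile limit stands in for the F-distribution-based limit normally used.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(100.0, 2.0, size=(164, 6))   # synthetic assays: 164 batches x 6 actives
X[150] += 8.0                               # inject one out-of-trend batch

Xs = (X - X.mean(axis=0)) / X.std(axis=0)   # autoscaling (zero mean, unit variance)
U, S, Vt = np.linalg.svd(Xs, full_matrices=False)
scores = Xs @ Vt[:2].T                      # scores on the first two PCs
t2 = ((scores / scores.std(axis=0)) ** 2).sum(axis=1)   # Hotelling T^2 over 2 PCs
limit = np.quantile(t2, 0.95)               # empirical 95% limit (F-based in practice)
flagged = np.where(t2 > limit)[0]           # batches outside the 95% boundary
```

Univariate charts on each active ingredient would likely miss a correlated shift of this kind, which is the argument for MSPC in the record above.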

  20. Quality data collection and management technology of aerospace complex product assembly process

    NASA Astrophysics Data System (ADS)

    Weng, Gang; Liu, Jianhua; He, Yongxi; Zhuang, Cunbo

    2017-04-01

    Aiming at solving problems of difficult management and poor traceability for discrete assembly process quality data, a data collection and management method is proposed which take the assembly process and BOM as the core. Data collection method base on workflow technology, data model base on BOM and quality traceability of assembly process is included in the method. Finally, assembly process quality data management system is developed and effective control and management of quality information for complex product assembly process is realized.

  1. Statistical process control analysis for patient quality assurance of intensity modulated radiation therapy

    NASA Astrophysics Data System (ADS)

    Lee, Rena; Kim, Kyubo; Cho, Samju; Lim, Sangwook; Lee, Suk; Shim, Jang Bo; Huh, Hyun Do; Lee, Sang Hoon; Ahn, Sohyun

    2017-11-01

    This study applied statistical process control to set and verify quality assurance (QA) tolerance standards suited to our hospital's characteristics, with criteria applied to all treatment sites in the analysis. The gamma test for delivery quality assurance (DQA) was based on a 3%/3 mm criterion. Head-and-neck, breast, and prostate cases treated with intensity modulated radiation therapy (IMRT) or volumetric modulated arc therapy (VMAT) were selected for the analysis of the QA treatment sites. The analysis used 73 and 68 head-and-neck data sets, and 49 prostate and 152 breast data sets, measured with MapCHECK and ArcCHECK, respectively. The Cp values of the head-and-neck and prostate QA were above 1.0, and the Cpml values were 1.53 and 1.71, respectively, close to the target value of 100%. The Cpml value of breast IMRT QA was 1.67, with data values close to the target value of 95%, but the Cp value was 0.90, which means that the data values are widely distributed. The Cp and Cpml of breast VMAT QA were 1.07 and 2.10, respectively. This suggests that the VMAT QA has better process capability than the IMRT QA. Consequently, more attention should be paid to planning and pre-treatment QA for breast radiotherapy.
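    Capability indices like the Cp and Cpml quoted above can be computed from gamma pass rates against a lower specification limit. A sketch under stated assumptions: the one-sided Taguchi-type formula Cpml = (mean - LSL) / (3 * sqrt(var + (mean - target)^2)) is one common formulation, and the pass-rate values, limit, and target below are invented, not the study's data.

```python
import math
import statistics

def cpl(values, lsl):
    """One-sided spread-only index against a lower spec limit: (mean - LSL) / (3 * sd)."""
    return (statistics.mean(values) - lsl) / (3 * statistics.pstdev(values))

def cpml(values, lsl, target):
    """One-sided Taguchi-type capability index: penalizes both spread and
    off-target centering via sqrt(var + (mean - target)^2)."""
    mu = statistics.mean(values)
    var = statistics.pvariance(values)
    return (mu - lsl) / (3 * math.sqrt(var + (mu - target) ** 2))

rates = [99.0, 98.5, 99.5, 98.0, 99.2]       # gamma pass rates (%), hypothetical
c = cpml(rates, lsl=95.0, target=100.0)
```

Because `cpml` adds the centering term to the denominator, it is always at most the spread-only index for the same data, which is why a process can look capable on spread yet score poorly when it sits away from its target.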

  2. Quality Control in Clinical Laboratory Samples

    DTIC Science & Technology

    2015-01-01

    is able to find and correct flaws in the analytical processes of a lab before potentially incorrect patient results are released. According to... verifies that the results produced are accurate and precise. Clinical labs use management of documentation, as well as incorporation of a continuous improvement process, to streamline the overall quality control process. QC samples are expected to be identical and tested identically to patient

  3. 21 CFR 111.65 - What are the requirements for quality control operations?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 2 2010-04-01 2010-04-01 false What are the requirements for quality control... Process Control System § 111.65 What are the requirements for quality control operations? You must implement quality control operations in your manufacturing, packaging, labeling, and holding operations for...

  4. A system framework of inter-enterprise machining quality control based on fractal theory

    NASA Astrophysics Data System (ADS)

    Zhao, Liping; Qin, Yongtao; Yao, Yiyong; Yan, Peng

    2014-03-01

    In order to meet the quality control requirements of dynamic and complicated product machining processes among enterprises, a system framework for inter-enterprise machining quality control based on fractal theory was proposed. Within this framework, the fractal characteristics of the inter-enterprise machining quality control function were analysed, and a model of inter-enterprise machining quality control was constructed from the nature of fractal structures. Furthermore, a goal-driven strategy for inter-enterprise quality control and a dynamic organisation strategy for inter-enterprise quality improvement were derived from a characteristic analysis of this model. In addition, an architecture for inter-enterprise machining quality control based on fractal theory was established by means of Web services. Finally, a case study is presented. The results showed that the proposed method is feasible and can provide guidance for quality control, and support for product reliability, in inter-enterprise machining processes.

  5. Economic-Oriented Stochastic Optimization in Advanced Process Control of Chemical Processes

    PubMed Central

    Dobos, László; Király, András; Abonyi, János

    2012-01-01

    Finding the optimal operating region of chemical processes is an inevitable step toward improving economic performance. Usually the optimal operating region is situated close to process constraints related to product quality or process safety requirements. Higher profit can be realized only by assuring a relatively low frequency of violation of these constraints. A multilevel stochastic optimization framework is proposed to determine the optimal setpoint values of control loops with respect to predetermined risk levels, uncertainties, and costs of violation of process constraints. The proposed framework is realized as direct search-type optimization of Monte-Carlo simulation of the controlled process. The concept is illustrated throughout by a well-known benchmark problem related to the control of a linear dynamical system and the model predictive control of a more complex nonlinear polymerization process. PMID:23213298
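    The framework above, direct search over setpoints with Monte Carlo simulation of the controlled process, can be sketched in miniature. Everything specific here is an illustrative assumption rather than the paper's benchmark: the Gaussian disturbance model, the constraint limit of 10, the 5% risk level, and the candidate setpoints.

```python
import random

def violation_prob(setpoint, n=2000, sigma=0.5, limit=10.0):
    """Monte Carlo estimate of P(quality variable > limit) at a given setpoint,
    assuming Gaussian disturbances around the setpoint (an illustrative model)."""
    rnd = random.Random(42)                      # fixed seed for a repeatable estimate
    hits = sum(rnd.gauss(setpoint, sigma) > limit for _ in range(n))
    return hits / n

def best_setpoint(candidates, risk=0.05, limit=10.0):
    """Direct search: pick the highest setpoint (a proxy for profit) whose
    estimated constraint-violation probability stays within the risk level."""
    feasible = [s for s in candidates if violation_prob(s, limit=limit) <= risk]
    return max(feasible)

sp = best_setpoint([8.0, 8.5, 9.0, 9.5, 9.9])
```

The result illustrates the trade-off the abstract describes: the optimum sits close to the constraint, but only as close as the accepted risk of violation allows.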

  6. Food safety and nutritional quality for the prevention of non communicable diseases: the Nutrient, hazard Analysis and Critical Control Point process (NACCP).

    PubMed

    Di Renzo, Laura; Colica, Carmen; Carraro, Alberto; Cenci Goga, Beniamino; Marsella, Luigi Tonino; Botta, Roberto; Colombo, Maria Laura; Gratteri, Santo; Chang, Ting Fa Margherita; Droli, Maurizio; Sarlo, Francesca; De Lorenzo, Antonino

    2015-04-23

    The important role of food and nutrition in public health is increasingly recognized as crucial for its potential impact on health-related quality of life and on the economy, at both the societal and the individual level. The prevalence of non-communicable diseases calls for a reformulation of our view of food. The Hazard Analysis and Critical Control Point (HACCP) system, first implemented in the EU with Directive 43/93/CEE and later replaced by Regulation CE 178/2002 and Regulation CE 852/2004, is the internationally agreed approach to food safety control. Our aim is to develop a new procedure for the assessment of the Nutrient, hazard Analysis and Critical Control Point (NACCP) process, for total quality management (TQM), and to optimize nutritional levels. NACCP is based on four general principles: i) guarantee of health maintenance; ii) evaluation and assurance of the nutritional quality of food and TQM; iii) correct information for consumers; iv) assurance of an ethical profit. There are three stages in the application of the NACCP process: 1) application of NACCP for quality principles; 2) application of NACCP for health principles; 3) implementation of the NACCP process. The actions are: 1) identification of nutritional markers, which must remain intact throughout the food supply chain; 2) identification of critical control points, which must be monitored in order to minimize the likelihood of a reduction in quality; 3) establishment of critical limits to maintain adequate levels of nutrients; 4) establishment and implementation of effective monitoring procedures for critical control points; 5) establishment of corrective actions; 6) identification of metabolic biomarkers; 7) evaluation of the effects of food intake through the application of specific clinical trials; 8) establishment of procedures for consumer information; 9) implementation of the Health Claim Regulation EU 1924/2006; 10) start of a training program. We calculate the risk assessment as follows

  7. Statistical process control methods allow the analysis and improvement of anesthesia care.

    PubMed

    Fasting, Sigurd; Gisvold, Sven E

    2003-10-01

    Quality aspects of the anesthetic process are reflected in the rate of intraoperative adverse events. The purpose of this report is to illustrate how the quality of the anesthesia process can be analyzed using statistical process control methods, and exemplify how this analysis can be used for quality improvement. We prospectively recorded anesthesia-related data from all anesthetics for five years. The data included intraoperative adverse events, which were graded into four levels, according to severity. We selected four adverse events, representing important quality and safety aspects, for statistical process control analysis. These were: inadequate regional anesthesia, difficult emergence from general anesthesia, intubation difficulties and drug errors. We analyzed the underlying process using 'p-charts' for statistical process control. In 65,170 anesthetics we recorded adverse events in 18.3%; mostly of lesser severity. Control charts were used to define statistically the predictable normal variation in problem rate, and then used as a basis for analysis of the selected problems with the following results: Inadequate plexus anesthesia: stable process, but unacceptably high failure rate; Difficult emergence: unstable process, because of quality improvement efforts; Intubation difficulties: stable process, rate acceptable; Medication errors: methodology not suited because of low rate of errors. By applying statistical process control methods to the analysis of adverse events, we have exemplified how this allows us to determine if a process is stable, whether an intervention is required, and if quality improvement efforts have the desired effect.
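    The p-charts used in the record above have a standard construction: 3-sigma limits around the pooled event rate, widening for smaller subgroups. A minimal sketch; the adverse-event counts and case volumes below are hypothetical, not the study's data.

```python
import math

def p_chart_limits(events, sample_sizes):
    """3-sigma p-chart for an event rate: returns the pooled rate and
    per-subgroup (LCL, UCL) pairs, clipped to the [0, 1] range."""
    p_bar = sum(events) / sum(sample_sizes)
    limits = []
    for n in sample_sizes:
        se = math.sqrt(p_bar * (1 - p_bar) / n)
        limits.append((max(0.0, p_bar - 3 * se), min(1.0, p_bar + 3 * se)))
    return p_bar, limits

# Hypothetical counts of difficult emergences per period of 100 anesthetics
events = [12, 9, 14, 22, 11]
cases = [100, 100, 100, 100, 100]
p_bar, limits = p_chart_limits(events, cases)
```

A period whose observed rate falls outside its (LCL, UCL) pair indicates a special cause, which is exactly the stable-versus-unstable distinction the authors draw for each adverse-event type.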

  8. Principles and Practices for Quality Assurance and Quality Control

    USGS Publications Warehouse

    Jones, Berwyn E.

    1999-01-01

    Quality assurance and quality control are vital parts of highway runoff water-quality monitoring projects. To be effective, project quality assurance must address all aspects of the project, including project management responsibilities and resources, data quality objectives, sampling and analysis plans, data-collection protocols, data quality-control plans, data-assessment procedures and requirements, and project outputs. Quality control ensures that the data quality objectives are achieved as planned. The historical development and current state of the art of quality assurance and quality control concepts described in this report can be applied to evaluation of data from prior projects.

  9. Optimising UAV topographic surveys processed with structure-from-motion: Ground control quality, quantity and bundle adjustment

    NASA Astrophysics Data System (ADS)

    James, Mike R.; Robson, Stuart; d'Oleire-Oltmanns, Sebastian; Niethammer, Uwe

    2016-04-01

    Structure-from-motion (SfM) algorithms are greatly facilitating the production of detailed topographic models based on images collected by unmanned aerial vehicles (UAVs). However, SfM-based software does not generally provide the rigorous photogrammetric analysis required to fully understand survey quality. Consequently, error related to problems in control point data, or in the distribution of control points, can remain undiscovered. Even if these errors are not large in magnitude, they can be systematic, and thus have strong implications for the use of products such as digital elevation models (DEMs) and orthophotos. Here, we develop a Monte Carlo approach to (1) improve the accuracy of products when SfM-based processing is used and (2) reduce the associated field effort by identifying suitable lower-density deployments of ground control points. The method highlights over-parameterisation during camera self-calibration and provides enhanced insight into control point performance when rigorous error metrics are not available. Processing was implemented using commonly used SfM-based software (Agisoft PhotoScan), which we augment with semi-automated and automated GCP image measurement. We apply the Monte Carlo method to two contrasting case studies: an erosion gully survey (Taurodont, Morocco) carried out with a fixed-wing UAV, and an active landslide survey (Super-Sauze, France) acquired using a manually controlled quadcopter. The results highlight the differences in the control requirements for the two sites, and we explore the implications for future surveys. We illustrate DEM sensitivity to critical processing parameters and show how the use of appropriate parameter values increases DEM repeatability and reduces the spatial variability of error due to processing artefacts.
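    The core Monte Carlo idea, repeatedly subsampling ground control points and measuring error at the withheld check points, can be illustrated on a toy problem. Everything here is a stand-in: a least-squares plane replaces the full SfM bundle adjustment, and the control point coordinates are synthetic.

```python
import numpy as np

rng = np.random.default_rng(1)
xy = rng.uniform(0, 100, size=(40, 2))                   # synthetic GCP locations (m)
z = 0.02 * xy[:, 0] - 0.01 * xy[:, 1] + 5.0 \
    + rng.normal(0, 0.03, 40)                            # planar terrain + 3 cm noise

def check_rmse(k, trials=200):
    """Monte Carlo: fit a plane to k randomly chosen GCPs, then report the
    mean RMSE at the withheld check points over many random draws."""
    errs = []
    for _ in range(trials):
        idx = rng.permutation(40)
        fit, chk = idx[:k], idx[k:]
        A = np.c_[xy[fit], np.ones(k)]                   # plane model: ax + by + c
        coef, *_ = np.linalg.lstsq(A, z[fit], rcond=None)
        pred = np.c_[xy[chk], np.ones(len(chk))] @ coef
        errs.append(np.sqrt(np.mean((pred - z[chk]) ** 2)))
    return float(np.mean(errs))
```

Plotting `check_rmse(k)` against k shows where adding further control points stops paying off, which is the field-effort question the survey-design analysis above addresses on real SfM products.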

  10. Root Cause Analysis of Quality Defects Using HPLC-MS Fingerprint Knowledgebase for Batch-to-batch Quality Control of Herbal Drugs.

    PubMed

    Yan, Binjun; Fang, Zhonghua; Shen, Lijuan; Qu, Haibin

    2015-01-01

    The batch-to-batch quality consistency of herbal drugs has always been an important issue. To propose a methodology for batch-to-batch quality control based on HPLC-MS fingerprints and a process knowledgebase. The extraction process of Compound E-jiao Oral Liquid was taken as a case study. After establishing the HPLC-MS fingerprint analysis method, the fingerprints of the extract solutions produced under normal and abnormal operation conditions were obtained. Multivariate statistical models were built for fault detection, and a discriminant analysis model was built using the probabilistic discriminant partial-least-squares method for fault diagnosis. Based on multivariate statistical analysis, process knowledge was acquired and the cause-effect relationship between process deviations and quality defects was revealed. The quality defects were detected successfully by multivariate statistical control charts, and the types of process deviations were diagnosed correctly by discriminant analysis. This work has demonstrated the benefits of combining HPLC-MS fingerprints, process knowledge, and multivariate analysis for the quality control of herbal drugs. Copyright © 2015 John Wiley & Sons, Ltd.

  11. Advanced process control framework initiative

    NASA Astrophysics Data System (ADS)

    Hill, Tom; Nettles, Steve

    1997-01-01

    The semiconductor industry, one of the world's most fiercely competitive industries, is driven by increasingly complex process technologies and global competition to improve cycle time, quality, and process flexibility. Due to the complexity of these problems, current process control techniques are generally nonautomated, time-consuming, reactive, nonadaptive, and focused on individual fabrication tools and processes. As the semiconductor industry moves into higher-density processes, radical new approaches are required. To address the need for advanced factory-level process control in this environment, Honeywell, Advanced Micro Devices (AMD), and SEMATECH formed the Advanced Process Control Framework Initiative (APCFI) joint research project. The project defines and demonstrates an Advanced Process Control (APC) approach based on SEMATECH's Computer Integrated Manufacturing (CIM) Framework. Its scope includes the coordination of Manufacturing Execution Systems, process control tools, and wafer fabrication equipment to provide the necessary process control capabilities. Moreover, it takes advantage of the CIM Framework to integrate and coordinate applications from other suppliers that provide services necessary for the overall system to function. This presentation discusses the key concept of model-based process control that differentiates the APC Framework. This major improvement over current methods enables new systematic process control by linking the knowledge of key process settings to desired product characteristics that reside in models created with commercial model development tools. The unique framework-based approach facilitates integration of commercial tools and reuse of their data by tying them together in an object-based structure. The presentation also explores the perspective of each organization's involvement in the APCFI project.
Each has complementary goals and expertise to contribute; Honeywell represents the supplier viewpoint, AMD represents the user

  12. Using Statistical Process Control to Enhance Student Progression

    ERIC Educational Resources Information Center

    Hanna, Mark D.; Raichura, Nilesh; Bernardes, Ednilson

    2012-01-01

    Public interest in educational outcomes has markedly increased in the most recent decade; however, quality management and statistical process control have not deeply penetrated the management of academic institutions. This paper presents results of an attempt to use Statistical Process Control (SPC) to identify a key impediment to continuous…

  13. Quality Control Barriers in Adapting "Metro-Centric" Education to Regional Needs

    ERIC Educational Resources Information Center

    Nagy, Judy; Robinson, Susan R.

    2013-01-01

    The massification and globalization of higher education, combined with the widespread adoption of processes underpinning accreditation and quality control of university programs, have tended to result in learning contexts that are increasingly narrowly conceived and tightly controlled. Underlying many quality control measures is a "one size…

  14. 21 CFR 111.135 - What quality control operations are required for product complaints?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 2 2010-04-01 2010-04-01 false What quality control operations are required for... MANUFACTURING, PACKAGING, LABELING, OR HOLDING OPERATIONS FOR DIETARY SUPPLEMENTS Production and Process Control System: Requirements for Quality Control § 111.135 What quality control operations are required for...

  15. Quality Control Study of the GSL Reinsurance System. Final Report.

    ERIC Educational Resources Information Center

    Advanced Technology, Inc., Reston, VA.

    A quality control plan for the U.S. Department of Education's Guaranteed Student Loan (GSL) reinsurance process was developed. To identify existing errors, systems documentation and past analyses of the reinsurance system were analyzed, and interviews were conducted. Corrective actions were proposed, and a quality control checklist was developed…

  16. [Post-marketing reevaluation for potential quality risk and quality control in clinical application of traditional Chinese medicines].

    PubMed

    Li, Hong-jiao; He, Li-yun; Liu, Bao-yan

    2015-06-01

    Effective quality control in clinical practice is an essential guarantee of the authenticity and scientific validity of study findings. The post-marketing reevaluation of traditional Chinese medicines (TCMs) focuses on the efficacy, adverse reactions, combined medication, and effective dose of marketed drugs through expanded clinical trials, and requires a larger sample size and a wider range of patients, which increases the difficulty of quality control in clinical practice. Drawing on experience with quality control in clinical practice for the post-marketing reevaluation of Kangbingdu oral for cold, the researchers in this study reviewed the study purpose, project, scheme design, and clinical practice process from an overall point of view, analyzed the characteristics of post-marketing reevaluation for TCMs and the associated quality control risks, designed the quality control contents around quality-impacting factors, defined key review contents, and summarized precautions for clinical practice, with the aim of improving the efficiency of quality control in clinical practice. This study can serve as a reference for clinical units and quality control personnel in the post-marketing reevaluation of TCMs.

  17. 21 CFR 111.110 - What quality control operations are required for laboratory operations associated with the...

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... laboratory operations associated with the production and process control system? 111.110 Section 111.110 Food... OPERATIONS FOR DIETARY SUPPLEMENTS Production and Process Control System: Requirements for Quality Control... production and process control system? Quality control operations for laboratory operations associated with...

  18. Manufacturing Squares: An Integrative Statistical Process Control Exercise

    ERIC Educational Resources Information Center

    Coy, Steven P.

    2016-01-01

    In the exercise, students in a junior-level operations management class are asked to manufacture a simple product. Given product specifications, they must design a production process, create roles and design jobs for each team member, and develop a statistical process control plan that efficiently and effectively controls quality during…

  19. Initiating statistical process control to improve quality outcomes in colorectal surgery.

    PubMed

    Keller, Deborah S; Stulberg, Jonah J; Lawrence, Justin K; Samia, Hoda; Delaney, Conor P

    2015-12-01

    Unexpected variations in postoperative length of stay (LOS) negatively impact resources and patient outcomes. Statistical process control (SPC) measures performance, evaluates productivity, and modifies processes for optimal performance. The goal of this study was to initiate SPC to identify LOS outliers and evaluate its feasibility to improve outcomes in colorectal surgery. Review of a prospective database identified colorectal procedures performed by a single surgeon. Patients were grouped into elective and emergent categories and then stratified by laparoscopic and open approaches. All followed a standardized enhanced recovery protocol. SPC was applied to identify outliers and evaluate causes within each group. A total of 1294 cases were analyzed--83% elective (n = 1074) and 17% emergent (n = 220). Emergent cases were 70.5% open and 29.5% laparoscopic; elective cases were 36.8% open and 63.2% laparoscopic. All groups had a wide range in LOS. LOS outliers ranged from 8.6% (elective laparoscopic) to 10.8% (emergent laparoscopic). Evaluation of outliers demonstrated patient characteristics of higher ASA scores, longer operating times, ICU requirement, and temporary nursing at discharge. Outliers had higher postoperative complication rates in elective open (57.1 vs. 20.0%) and elective lap groups (77.6 vs. 26.1%). Outliers also had higher readmission rates for emergent open (11.4 vs. 5.4%), emergent lap (14.3 vs. 9.2%), and elective lap (32.8 vs. 6.9%). Elective open outliers did not follow trends of longer LOS or higher reoperation rates. SPC is feasible and promising for improving colorectal surgery outcomes. SPC identified patient and process characteristics associated with increased LOS. SPC may allow real-time outlier identification, during quality improvement efforts, and reevaluation of outcomes after introducing process change. SPC has clinical implications for improving patient outcomes and resource utilization.

  20. QUALITY CONTROLS FOR PCR

    EPA Science Inventory

    The purpose of this presentation is to present an overview of the quality control (QC) sections of a draft EPA document entitled, "Quality Assurance/Quality Control Guidance for Laboratories Performing PCR Analyses on Environmental Samples." This document has been prepared by th...

  1. [Study thought of pharmaceutical preparations quality standards by dynamic quality control technology].

    PubMed

    Yu, Dan-Hong; Mao, Chen-Mei; Lv, Cheng-Zhe; Jin, Hui-Zhen; Yao, Xin; Jia, Xiao-Bin

    2014-07-01

    Pharmaceutical preparations, particularly "secret recipe" traditional Chinese medicine preparations in medical institutions, are a product of China's medical and health industry and an important means of competition among medical institutions. Although pharmaceutical preparations in medical institutions have advantages and distinctive characteristics compared with those of drug institutes and pharmaceutical companies, their quality standards have not reached the desired level over the years. As is well known, the quality of pharmaceutical preparations is important for ensuring efficacy, especially in an environment where the public pays more attention to drug safety and effectiveness and the country places increasing emphasis on the state of pharmaceutical preparations. In view of this, we aim to improve the grade, stability, and clinical efficacy of pharmaceutical preparations through advanced equipment, testing instruments, and process dynamic quality control technology. Finally, we hope to provide new ideas for the quality control of pharmaceutical preparations.

  2. Internal quality control: planning and implementation strategies.

    PubMed

    Westgard, James O

    2003-11-01

    The first essential in setting up internal quality control (IQC) of a test procedure in the clinical laboratory is to select the proper IQC procedure to implement, i.e. choosing the statistical criteria or control rules, and the number of control measurements, according to the quality required for the test and the observed performance of the method. Then the right IQC procedure must be properly implemented. This review focuses on strategies for planning and implementing IQC procedures in order to improve the quality of the IQC. A quantitative planning process is described that can be implemented with graphical tools such as power function or critical-error graphs and charts of operating specifications. Finally, a total QC strategy is formulated to minimize cost and maximize quality. A general strategy for IQC implementation is recommended that employs a three-stage design in which the first stage provides high error detection, the second stage low false rejection and the third stage prescribes the length of the analytical run, making use of an algorithm involving the average of normal patients' data.
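
    The control-rule selection discussed above can be illustrated with a minimal sketch. The code below is not from the review; it assumes hypothetical control measurements compared against an established mean and SD, and implements two widely used Westgard-style statistical criteria (1-3s and 2-2s) as an example of how an IQC procedure rejects an analytical run.

```python
def z_scores(results, mean, sd):
    """Convert raw control measurements to z-scores against the
    control material's established mean and standard deviation."""
    return [(x - mean) / sd for x in results]

def rule_1_3s(z):
    """Reject if any single control result exceeds +/-3 SD."""
    return any(abs(v) > 3 for v in z)

def rule_2_2s(z):
    """Reject if two consecutive results exceed 2 SD on the same side."""
    return any((a > 2 and b > 2) or (a < -2 and b < -2)
               for a, b in zip(z, z[1:]))

def run_accepted(results, mean, sd):
    """Accept the analytical run only if no rejection rule fires."""
    z = z_scores(results, mean, sd)
    return not (rule_1_3s(z) or rule_2_2s(z))
```

    For example, with an established mean of 100 and SD of 2, the run [101, 99, 103] is accepted, while [107, 99] violates 1-3s and [105, 105] violates 2-2s.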

  3. Acoustic Sample Deposition MALDI-MS (ASD-MALDI-MS): A Novel Process Flow for Quality Control Screening of Compound Libraries.

    PubMed

    Chin, Jefferson; Wood, Elizabeth; Peters, Grace S; Drexler, Dieter M

    2016-02-01

    In the early stages of drug discovery, high-throughput screening (HTS) of compound libraries against pharmaceutical targets is a common method to identify potential lead molecules. For these HTS campaigns to be efficient and successful, continuous quality control of the compound collection is necessary and crucial. However, the large number of compound samples and the limited sample amount pose unique challenges. Presented here is a proof-of-concept study for a novel process flow for the quality control screening of small-molecule compound libraries that consumes only minimal amounts of samples and affords compound-specific molecular data. This process employs an acoustic sample deposition (ASD) technique for the offline sample preparation by depositing nanoliter volumes in an array format onto microscope glass slides followed by matrix-assisted laser desorption/ionization mass spectrometric (MALDI-MS) analysis. An initial study of a 384-compound array employing the ASD-MALDI-MS workflow resulted in a 75% first-pass positive identification rate with an analysis time of <1 s per sample. © 2015 Society for Laboratory Automation and Screening.

  4. Manufacture and quality control of interconnecting wire harnesses, Volume 1

    NASA Technical Reports Server (NTRS)

    1972-01-01

    A standard is presented for the manufacture, installation, and quality control of eight types of interconnecting wire harnesses. The processes, process controls, and inspection and test requirements reflected are based on acknowledgment of harness design requirements, acknowledgment of harness installation requirements, identification of the various parts, materials, etc., utilized in harness manufacture, and formulation of a typical manufacturing flow diagram identifying each manufacturing and quality control process, operation, inspection, and test. The document covers interconnecting wire harnesses defined in the design standard, including type 1, enclosed in fluorocarbon elastomer convolute tubing; type 2, enclosed in TFE convolute tubing lined with fiberglass braid; type 3, enclosed in TFE convolute tubing; and type 5, a combination of types 3 and 4. Knowledge gained through experience on the Saturn 5 program, coupled with recent advances in techniques, materials, and processes, was incorporated.

  5. [On-site quality control of acupuncture randomized controlled trial: design of content and checklist of quality control based on PICOST].

    PubMed

    Li, Hong-Jiao; He, Li-Yun; Liu, Zhi-Shun; Sun, Ya-Nan; Yan, Shi-Yan; Liu, Jia; Zhao, Ye; Liu, Bao-Yan

    2014-02-01

    To effectively guarantee the quality of randomized controlled trials (RCTs) of acupuncture and to develop a reasonable content and checklist for on-site quality control, the factors influencing the quality of acupuncture RCTs are analyzed, giving overall consideration to the scientific soundness of the quality control content and the feasibility of on-site implementation. Based on the content and checklist of on-site quality control in the National 11th Five-Year Plan projects Optimization of Comprehensive Treatment Plan for TCM in Prevention and Treatment of Serious Disease and Clinical Assessment on Generic Technology and Quality Control Research, it is proposed that on-site quality control of acupuncture RCTs should be conducted with PICOST (patient, intervention, comparison, outcome, site and time) at its core, with particular attention to the quality control of interveners' skills and blinded outcome assessment, and a checklist of on-site quality control is developed to provide a reference for the groups undertaking the project.

  6. Multifractal Properties of Process Control Variables

    NASA Astrophysics Data System (ADS)

    Domański, Paweł D.

    2017-06-01

    A control system is an inevitable element of any industrial installation, and its quality significantly affects overall process performance. Assessing whether a control system needs improvement requires relevant and constructive measures. Various methods exist, such as time-domain measures, minimum variance, Gaussian and non-Gaussian statistical factors, and fractal and entropy indexes. The majority of approaches use time series of control variables and are able to cover many phenomena, but process complexities and human interventions cause effects that are hardly visible to standard measures. It is shown that signals originating from industrial installations have multifractal properties, and such an analysis may extend the standard approach to further observations. The work is based on industrial and simulation data. The analysis delivers additional insight into the properties of the control system and the process, helping to discover internal dependencies and human factors that are otherwise hard to detect.

  7. The Status of the Quality Control in Acupuncture-Neuroimaging Studies

    PubMed Central

    Qiu, Ke; Jing, Miaomiao; Liu, Xiaoyan; Gao, Feifei; Liang, Fanrong; Zeng, Fang

    2016-01-01

    Using neuroimaging techniques to explore the central mechanism of acupuncture gains increasing attention, but the quality control of acupuncture-neuroimaging study remains to be improved. We searched the PubMed Database during 1995 to 2014. The original English articles with neuroimaging scan performed on human beings were included. The data involved quality control including the author, sample size, characteristics of the participant, neuroimaging technology, and acupuncture intervention were extracted and analyzed. The rigorous inclusion and exclusion criteria are important guaranty for the participants' homogeneity. The standard operation process of acupuncture and the stricter requirement for acupuncturist play significant role in quality control. More attention should be paid to the quality control in future studies to improve the reproducibility and reliability of the acupuncture-neuroimaging studies. PMID:27242911

  8. Data quality control in eco-environmental monitoring

    NASA Astrophysics Data System (ADS)

    Lu, Chunyan; Wang, Jing

    2007-11-01

    With the development of science and technology, a number of environmental issues, such as sustainable development, climate change, environmental pollution, and land degradation, have become serious, and greater attention has been paid to environmental protection. The government gradually launched eco-environmental construction projects. In 1999, China began to carry out the Grain-for-Green project in the west to improve the eco-environment, and it had some good effects, but some questions still cannot be answered: How are the new grass and forest developing? Where are they? What should be done in the future? To answer these questions, the government began to monitor the eco-environment based on remote sensing technology. Geographic information can be obtained in a timely manner, but the issue of uncertainty has become increasingly recognized, and this uncertainty affects the reliability of applications using the data. This article analyzes the process of eco-environmental monitoring and the uncertainty of geographic information, and discusses methods of data quality control. SPOT5 panchromatic and multi-spectral data from 2003 (2002) were used, combined with land use survey data at a scale of 1:10,000, topographic data at a scale of 1:10,000, and the local Grain-for-Green project map; social and economic data were also collected. Eco-environmental monitoring is a process consisting of several steps, such as image geometric correction, image matching, and information extraction. Based on visual and automated methods, information on land converted from cultivated land to grass and forest was obtained by comparing the information from remote sensing data with the land survey data and the local Grain-for-Green project data, combined with field survey. According to this process, the uncertainty at each step was analyzed; positional, attribute, and thematic uncertainty were evident. 
    Positional uncertainty mainly derived from image geometric correction…

  9. Six Sigma Quality Management System and Design of Risk-based Statistical Quality Control.

    PubMed

    Westgard, James O; Westgard, Sten A

    2017-03-01

    Six sigma concepts provide a quality management system (QMS) with many useful tools for managing quality in medical laboratories. This Six Sigma QMS is driven by the quality required for the intended use of a test. The most useful form for this quality requirement is the allowable total error. Calculation of a sigma-metric provides the best predictor of risk for an analytical examination process, as well as a design parameter for selecting the statistical quality control (SQC) procedure necessary to detect medically important errors. Simple point estimates of sigma at medical decision concentrations are sufficient for laboratory applications. Copyright © 2016 Elsevier Inc. All rights reserved.
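
    The sigma-metric mentioned above has a standard form: sigma = (TEa - |bias|) / CV, with allowable total error (TEa), bias, and coefficient of variation (CV) all expressed in percent at the medical decision concentration. A minimal sketch with illustrative values (not taken from the article):

```python
def sigma_metric(tea_pct, bias_pct, cv_pct):
    """Sigma-metric = (allowable total error - |bias|) / CV,
    with all three quantities in percent at the same concentration."""
    return (tea_pct - abs(bias_pct)) / cv_pct

# Illustrative: TEa 10%, bias 1%, CV 1.5% -> sigma = 6.0,
# i.e., "world-class" performance on the six sigma scale.
sigma = sigma_metric(10, 1, 1.5)
```

    A higher sigma means a lower risk of medically important errors and permits simpler, less frequent SQC; a sigma below about 3 indicates a process that is difficult to control by SQC alone.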

  10. Double shell tanks (DST) chemistry control data quality objectives

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    BANNING, D.L.

    2001-10-09

    One of the main functions of the River Protection Project is to store the Hanford Site tank waste until the Waste Treatment Plant (WTP) is ready to receive and process the waste. Waste from the older single-shell tanks is being transferred to the newer double-shell tanks (DSTs). Therefore, the integrity of the DSTs must be maintained until the waste from all tanks has been retrieved and transferred to the WTP. To help maintain the integrity of the DSTs over the life of the project, specific chemistry limits have been established to control corrosion of the DSTs. These waste chemistry limits are presented in the Technical Safety Requirements (TSR) document HNF-SD-WM-TSR-006, Sec. 5.15, Rev 2B (CHG 2001). In order to control the chemistry in the DSTs, the Chemistry Control Program will require analyses of the tank waste. This document describes the Data Quality Objectives (DQO) process undertaken to ensure appropriate data will be collected to control the waste chemistry in the DSTs. The DQO process was implemented in accordance with Data Quality Objectives for Sampling and Analyses, HNF-IP-0842, Rev. 1b, Vol. IV, Section 4.16 (Banning 2001), and the U.S. Environmental Protection Agency EPA QA/G4, Guidance for the Data Quality Objectives Process (EPA 1994), with some modifications to accommodate project- or tank-specific requirements and constraints.

  11. Statistical process control: separating signal from noise in emergency department operations.

    PubMed

    Pimentel, Laura; Barrueto, Fermin

    2015-05-01

    Statistical process control (SPC) is a visually appealing and statistically rigorous methodology very suitable to the analysis of emergency department (ED) operations. We demonstrate that the control chart is the primary tool of SPC; it is constructed by plotting data measuring the key quality indicators of operational processes in rationally ordered subgroups such as units of time. Control limits are calculated using formulas reflecting the variation in the data points from one another and from the mean. SPC allows managers to determine whether operational processes are controlled and predictable. We review why the moving range chart is most appropriate for use in the complex ED milieu, how to apply SPC to ED operations, and how to determine when performance improvement is needed. SPC is an excellent tool for operational analysis and quality improvement for these reasons: 1) control charts make large data sets intuitively coherent by integrating statistical and visual descriptions; 2) SPC provides analysis of process stability and capability rather than simple comparison with a benchmark; 3) SPC allows distinction between special cause variation (signal), indicating an unstable process requiring action, and common cause variation (noise), reflecting a stable process; and 4) SPC keeps the focus of quality improvement on process rather than individual performance. Because data have no meaning apart from their context, and every process generates information that can be used to improve it, we contend that SPC should be seriously considered for driving quality improvement in emergency medicine. Copyright © 2015 Elsevier Inc. All rights reserved.
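
    As a simplified sketch (not the authors' analysis), the individuals/moving-range chart described above sets control limits at the mean plus or minus 2.66 times the average moving range, the conventional constant for moving ranges of two consecutive points:

```python
def xmr_limits(data):
    """Individuals-chart (XmR) limits: mean +/- 2.66 * average moving
    range, where moving ranges are |x[i+1] - x[i]|."""
    mean = sum(data) / len(data)
    moving_ranges = [abs(b - a) for a, b in zip(data, data[1:])]
    mr_bar = sum(moving_ranges) / len(moving_ranges)
    return mean, mean - 2.66 * mr_bar, mean + 2.66 * mr_bar

def special_cause_points(data):
    """Indices of points outside the limits: special cause variation
    (signal) rather than common cause variation (noise)."""
    mean, lcl, ucl = xmr_limits(data)
    return [i for i, x in enumerate(data) if not lcl <= x <= ucl]
```

    On a hypothetical series of daily throughput values such as [10, 11, 10, 12, 11, 10, 11, 12, 10, 30, 11, 10], only the spike at index 9 falls outside the limits, flagging it for investigation while the remaining variation is treated as noise.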

  12. Relational-database model for improving quality assurance and process control in a composite manufacturing environment

    NASA Astrophysics Data System (ADS)

    Gentry, Jeffery D.

    2000-05-01

    A relational database is a powerful tool for collecting and analyzing the vast amounts of interrelated data associated with the manufacture of composite materials. A relational database contains many individual database tables that store data related in some fashion. Manufacturing process variables as well as quality assurance measurements can be collected and stored in database tables indexed according to lot numbers, part type, or individual serial numbers. Relationships between the manufacturing process and product quality can then be correlated over a wide range of product types and process variations. This paper presents details on how relational databases are used to collect, store, and analyze process variables and quality assurance data associated with the manufacture of advanced composite materials. Important considerations are covered, including how the various types of data are organized and how relationships between the data are defined. Employing relational database techniques to establish correlative relationships between process variables and quality assurance measurements is then explored. Finally, the benefits of database techniques such as data warehousing, data mining, and web-based client/server architectures are discussed in the context of composite material manufacturing.
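
    The lot-indexed table design described above can be sketched with SQLite. The schema below is hypothetical (table and column names are invented for illustration, not taken from the paper); it shows process variables and quality assurance measurements stored in separate tables and correlated by joining on a shared lot identifier:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Hypothetical schema: one table per data category, indexed by lot_id.
cur.executescript("""
CREATE TABLE lot (lot_id TEXT PRIMARY KEY, part_type TEXT);
CREATE TABLE process_var (lot_id TEXT REFERENCES lot, name TEXT, value REAL);
CREATE TABLE qa_measurement (lot_id TEXT REFERENCES lot, metric TEXT, value REAL);
""")

cur.execute("INSERT INTO lot VALUES ('L001', 'wing-skin')")
cur.executemany("INSERT INTO process_var VALUES (?, ?, ?)",
                [("L001", "cure_temp_C", 180.0),
                 ("L001", "cure_time_min", 90.0)])
cur.execute("INSERT INTO qa_measurement VALUES ('L001', 'void_content_pct', 0.8)")

# Correlate process variables with quality outcomes via the lot_id key.
rows = cur.execute("""
    SELECT p.name, p.value, q.metric, q.value
    FROM process_var p JOIN qa_measurement q USING (lot_id)
    WHERE p.lot_id = 'L001'
""").fetchall()
```

    Each row pairs a process variable with a quality measurement for the same lot, which is the basic building block for the correlation analyses the paper describes.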

  13. Mindcontrol: A web application for brain segmentation quality control.

    PubMed

    Keshavan, Anisha; Datta, Esha; M McDonough, Ian; Madan, Christopher R; Jordan, Kesshi; Henry, Roland G

    2018-04-15

    Tissue classification plays a crucial role in the investigation of normal neural development, brain-behavior relationships, and the disease mechanisms of many psychiatric and neurological illnesses. Ensuring the accuracy of tissue classification is important for quality research and, in particular, the translation of imaging biomarkers to clinical practice. Assessment with the human eye is vital to correct various errors inherent to all currently available segmentation algorithms. Manual quality assurance becomes methodologically difficult at a large scale - a problem of increasing importance as the number of data sets is on the rise. To make this process more efficient, we have developed Mindcontrol, an open-source web application for the collaborative quality control of neuroimaging processing outputs. The Mindcontrol platform consists of a dashboard to organize data, descriptive visualizations to explore the data, an imaging viewer, and an in-browser annotation and editing toolbox for data curation and quality control. Mindcontrol is flexible and can be configured for the outputs of any software package in any data organization structure. Example configurations for three large, open-source datasets are presented: the 1000 Functional Connectomes Project (FCP), the Consortium for Reliability and Reproducibility (CoRR), and the Autism Brain Imaging Data Exchange (ABIDE) Collection. These demo applications link descriptive quality control metrics, regional brain volumes, and thickness scalars to a 3D imaging viewer and editing module, resulting in an easy-to-implement quality control protocol that can be scaled for any size and complexity of study. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

  14. MapReduce Based Parallel Bayesian Network for Manufacturing Quality Control

    NASA Astrophysics Data System (ADS)

    Zheng, Mao-Kuan; Ming, Xin-Guo; Zhang, Xian-Yu; Li, Guo-Ming

    2017-09-01

    The increasing complexity of industrial products and manufacturing processes has challenged conventional statistics-based quality management approaches under the circumstances of dynamic production. An integrated Bayesian network and big data analytics approach for manufacturing process quality analysis and control is proposed. Based on the Hadoop distributed architecture and the MapReduce parallel computing model, quality-related data of large volume and variety generated during the manufacturing process can be handled. Artificial intelligence algorithms, including Bayesian network learning, classification, and reasoning, are embedded into the Reduce process. Relying on the ability of the Bayesian network to deal with dynamic and uncertain problems and the parallel computing power of MapReduce, Bayesian networks of impact factors on quality are built based on prior probability distributions and modified with posterior probability distributions. A case study on hull segment manufacturing precision management for ship and offshore platform building shows that computing speed accelerates almost directly proportionally to the increase in computing nodes. It is also proved that the proposed model is feasible for locating and reasoning about root causes, forecasting manufacturing outcomes, and intelligent decision-making for precision problem solving. The integration of big data analytics and the Bayesian network method offers a whole new perspective on manufacturing quality control.
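
    The map/reduce pattern underlying the approach can be shown with a toy single-machine sketch (a stand-in for the paper's Hadoop deployment; the record layout is invented). Each map call emits key/value pairs and the reduce step merges partial counts, which parallelizes because the merge is associative:

```python
from collections import Counter
from functools import reduce

# Toy inspection records: (workstation, defect_category) pairs.
records = [
    ("WS1", "porosity"), ("WS1", "crack"), ("WS2", "porosity"),
    ("WS1", "porosity"), ("WS2", "misalignment"),
]

def map_phase(record):
    """Map step: emit ((workstation, defect), 1) as a partial count."""
    return Counter({record: 1})

def reduce_phase(a, b):
    """Reduce step: merge partial counts; associative, so partial
    results from different nodes can be combined in any order."""
    return a + b

counts = reduce(reduce_phase, map(map_phase, records), Counter())
```

    In the paper's setting the reduce step additionally runs Bayesian network learning and reasoning over the aggregated quality data; the sketch above only illustrates the data flow, not those algorithms.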

  15. Implementation of quality by design toward processing of food products.

    PubMed

    Rathore, Anurag S; Kapoor, Gautam

    2017-05-28

    Quality by design (QbD) is a systematic approach that begins with predefined objectives and emphasizes product and process understanding and process control. It is an approach based on principles of sound science and quality risk management. As the food processing industry continues to embrace the idea of in-line, online, and/or at-line sensors and real-time characterization for process monitoring and control, the existing gaps with regard to our ability to monitor multiple parameters/variables associated with the manufacturing process will be alleviated over time. Investments made for development of tools and approaches that facilitate high-throughput analytical and process development, process analytical technology, design of experiments, risk analysis, knowledge management, and enhancement of process/product understanding would pave way for operational and economic benefits later in the commercialization process and across other product pipelines. This article aims to achieve two major objectives. First, to review the progress that has been made in the recent years on the topic of QbD implementation in processing of food products and second, present a case study that illustrates benefits of such QbD implementation.

  16. Quality control troubleshooting tools for the mill floor

    Treesearch

    John Dramm

    2000-01-01

    Statistical Process Control (SPC) provides effective tools for improving process quality in the forest products industry resulting in reduced costs and improved productivity. Implementing SPC helps identify and locate problems that occur in wood products manufacturing. SPC tools achieve their real value when applied on the mill floor for monitoring and troubleshooting...

  17. Can a combination of average of normals and "real time" External Quality Assurance replace Internal Quality Control?

    PubMed

    Badrick, Tony; Graham, Peter

    2018-03-28

    Internal Quality Control and External Quality Assurance are separate but related processes that have developed independently in laboratory medicine over many years. They have different sample frequencies, statistical interpretations and immediacy. Both processes have evolved absorbing new understandings of the concept of laboratory error, sample material matrix and assay capability. However, we do not believe at the coalface that either process has led to much improvement in patient outcomes recently. It is the increasing reliability and automation of analytical platforms along with improved stability of reagents that has reduced systematic and random error, which in turn has minimised the risk of running less frequent IQC. We suggest that it is time to rethink the role of both these processes and unite them into a single approach using an Average of Normals model supported by more frequent External Quality Assurance samples. This new paradigm may lead to less confusion for laboratory staff and quicker responses to and identification of out of control situations.
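
    The Average of Normals idea proposed above can be sketched minimally: patient results within the reference interval are pooled into moving windows, and a sustained deviation of the window mean from its target signals a systematic shift. The reference interval, window size, and control limit below are illustrative assumptions, not values from the article:

```python
def average_of_normals(results, low, high, window=20):
    """Moving averages of patient results inside the reference
    interval [low, high]; results outside it are excluded."""
    normals = [x for x in results if low <= x <= high]
    return [sum(normals[i:i + window]) / window
            for i in range(len(normals) - window + 1)]

def aon_alarm(averages, target, limit):
    """Flag a systematic shift when any window mean drifts from the
    target by more than the chosen control limit."""
    return any(abs(m - target) > limit for m in averages)
```

    For example, a run of sodium-like results steady at 140 raises no alarm, whereas a batch shifted to 146 eventually pushes a window mean past a limit of 2 and flags an out-of-control condition.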

  18. Sampling the food processing environment: taking up the cudgel for preventive quality management in food processing environments.

    PubMed

    Wagner, Martin; Stessl, Beatrix

    2014-01-01

    The Listeria monitoring program for Austrian cheese factories was established in 1988. The basic idea is to control the introduction of L. monocytogenes into the food processing environment, preventing the pathogen from contaminating the food under processing. The Austrian Listeria monitoring program comprises four levels of investigation, dealing with routine monitoring of samples and consequences of finding a positive sample. Preventive quality control concepts attempt to detect a foodborne hazard along the food processing chain, prior to food delivery, retailing, and consumption. The implementation of a preventive food safety concept provokes a deepened insight by the manufacturers into problems concerning food safety. The development of preventive quality assurance strategies contributes to the national food safety status and protects public health.

  19. Commercial jet fuel quality control

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Strauss, K.H.

    1995-05-01

    The paper discusses the purpose of jet fuel quality control between the refinery and the aircraft. It describes fixed equipment, including various types of filters, and the usefulness and limitations of this equipment. Test equipment is reviewed as are various surveillance procedures. These include the Air Transport Association specification ATA 103, the FAA Advisory Circular 150/5230-4, the International Air Transport Association Guidance Material for Fuel Quality Control and Fuelling Service and the Guidelines for Quality Control at Jointly Operated Fuel Systems. Some past and current quality control problems are briefly mentioned.

  20. Standard Reference Specimens in Quality Control of Engineering Surfaces

    PubMed Central

    Song, J. F.; Vorburger, T. V.

    1991-01-01

    In the quality control of engineering surfaces, we aim to understand and maintain a good relationship between the manufacturing process and surface function. This is achieved by controlling the surface texture. The control process involves: 1) learning the functional parameters and their control values through controlled experiments or through a long history of production and use; 2) maintaining high accuracy and reproducibility with measurements not only of roughness calibration specimens but also of real engineering parts. In this paper, the characteristics, utilizations, and limitations of different classes of precision roughness calibration specimens are described. A measuring procedure of engineering surfaces, based on the calibration procedure of roughness specimens at NIST, is proposed. This procedure involves utilization of check specimens with waveform, wavelength, and other roughness parameters similar to functioning engineering surfaces. These check specimens would be certified under standardized reference measuring conditions, or by a reference instrument, and could be used for overall checking of the measuring procedure and for maintaining accuracy and agreement in engineering surface measurement. The concept of “surface texture design” is also suggested, which involves designing the engineering surface texture, the manufacturing process, and the quality control procedure to meet the optimal functional needs. PMID:28184115
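Roughness parameters such as those certified on check specimens can be computed from a digitized profile; a simplified Python sketch of the common Ra and Rq definitions (real standards such as ISO 4287 also prescribe filtering and sampling lengths):

```python
import math

def roughness_params(profile):
    """Compute Ra (arithmetic mean deviation) and Rq (root-mean-square
    deviation) of a digitized profile after removing the mean line.
    A simplified sketch omitting filtering and sampling-length rules."""
    n = len(profile)
    mean = sum(profile) / n
    deviations = [z - mean for z in profile]
    ra = sum(abs(d) for d in deviations) / n
    rq = math.sqrt(sum(d * d for d in deviations) / n)
    return ra, rq
```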

  1. Factors controlling groundwater quality in the Yeonjegu District of Busan City, Korea, using the hydrogeochemical processes and fuzzy GIS.

    PubMed

    Venkatramanan, Senapathi; Chung, Sang Yong; Selvam, Sekar; Lee, Seung Yeop; Elzain, Hussam Eldin

    2017-10-01

    Hydrogeochemical processes and fuzzy GIS techniques were used to evaluate the groundwater quality in the Yeonjegu district of Busan Metropolitan City, Korea. The highest concentrations of major ions were mainly related to the local geology. Seawater intrusion into the river water and municipal contaminants were secondary sources of groundwater contamination in the study area. Factor analysis indicated contamination sources of mineral dissolution from the host rocks and domestic influences. The Gibbs plot showed that the major ions were derived from rock weathering. Piper's trilinear diagram classified the groundwater into five types, CaHCO3, NaHCO3, NaCl, CaCl2, and CaSO4, in that order. The ionic relationships and the mineral saturation indices indicated that evaporation, dissolution, and precipitation processes controlled the groundwater chemistry. The fuzzy GIS map showed that highly contaminated groundwater occurred in the northeastern and central parts and that groundwater of medium quality appeared in most parts of the study area. This suggested that the groundwater quality of the study area was influenced by local geology, seawater intrusion, and municipal contaminants. This research clearly demonstrated that geochemical analyses and the fuzzy GIS method are very useful for identifying contaminant sources and locating good-quality groundwater.
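The mineral saturation index used in this kind of analysis follows the standard definition SI = log10(IAP/Ksp); a one-function Python sketch (generic, not the authors' code):

```python
import math

def saturation_index(iap, ksp):
    """Mineral saturation index SI = log10(IAP / Ksp), where IAP is the
    ion activity product and Ksp the solubility product: negative values
    favour dissolution, positive values favour precipitation."""
    return math.log10(iap / ksp)
```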

  2. Process control systems: integrated for future process technologies

    NASA Astrophysics Data System (ADS)

    Botros, Youssry; Hajj, Hazem M.

    2003-06-01

    Process Control Systems (PCS) are becoming more crucial to the success of Integrated Circuit makers due to their direct impact on product quality, cost, and Fab output. The primary objective of PCS is to minimize variability by detecting and correcting non-optimal performance. Current PCS implementations are disparate: each PCS application is designed, deployed, and supported separately, and each targets a specific area of control such as equipment performance, wafer manufacturing, or process health monitoring. With Intel entering the nanometer technology era, tighter process specifications are required for higher yields and lower cost. This requires the areas of control to be tightly coupled and integrated to achieve optimal performance, which can be achieved via consistent design and deployment of an integrated PCS. PCS integration will result in several benefits such as leveraging commonalities, avoiding redundancy, and facilitating sharing between implementations. This paper addresses PCS implementations and focuses on the benefits and requirements of the integrated PCS. The Intel integrated PCS architecture is then presented and its components briefly discussed. Finally, industry direction and efforts to standardize PCS interfaces that enable PCS integration are presented.

  3. 10 CFR 71.119 - Control of special processes.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 2 2012-01-01 2012-01-01 false Control of special processes. 71.119 Section 71.119 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) PACKAGING AND TRANSPORTATION OF RADIOACTIVE MATERIAL Quality... shall establish measures to assure that special processes, including welding, heat treating, and...

  4. 10 CFR 71.119 - Control of special processes.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 10 Energy 2 2013-01-01 2013-01-01 false Control of special processes. 71.119 Section 71.119 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) PACKAGING AND TRANSPORTATION OF RADIOACTIVE MATERIAL Quality... shall establish measures to assure that special processes, including welding, heat treating, and...

  5. 10 CFR 71.119 - Control of special processes.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 2 2014-01-01 2014-01-01 false Control of special processes. 71.119 Section 71.119 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) PACKAGING AND TRANSPORTATION OF RADIOACTIVE MATERIAL Quality... shall establish measures to assure that special processes, including welding, heat treating, and...

  6. 10 CFR 71.119 - Control of special processes.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 2 2011-01-01 2011-01-01 false Control of special processes. 71.119 Section 71.119 Energy NUCLEAR REGULATORY COMMISSION (CONTINUED) PACKAGING AND TRANSPORTATION OF RADIOACTIVE MATERIAL Quality... shall establish measures to assure that special processes, including welding, heat treating, and...

  7. Quality Control of the Print with the Application of Statistical Methods

    NASA Astrophysics Data System (ADS)

    Simonenko, K. V.; Bulatova, G. S.; Antropova, L. B.; Varepo, L. G.

    2018-04-01

    The basis for standardizing the offset printing process is the control of print quality indicators. This problem can be approached in various ways, the most important of which are statistical methods; their practical implementation for managing print-process quality is highly relevant and is the subject of this paper. The possibility of using a control chart to identify the causes of optical-density deviations for a triad of inks in offset printing is shown.
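A control chart of this kind flags optical-density readings that fall outside limits derived from an in-control baseline; a generic Python sketch, not the authors' implementation:

```python
import statistics

def control_limits(baseline, sigma_mult=3.0):
    """Shewhart-style limits from an in-control baseline of optical-
    density readings: centerline +/- 3 sigma. (Generic SPC sketch;
    textbook individuals charts estimate sigma from moving ranges,
    the sample SD is used here for brevity.)"""
    center = statistics.fmean(baseline)
    spread = sigma_mult * statistics.stdev(baseline)
    return center - spread, center, center + spread

def out_of_control(readings, lcl, ucl):
    """Indices of readings falling outside the control limits."""
    return [i for i, x in enumerate(readings) if not lcl <= x <= ucl]
```

Points outside the limits indicate special-cause variation whose root cause (ink feed, dampening, plate wear) must then be sought on press.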

  8. Process performance and product quality in an integrated continuous antibody production process.

    PubMed

    Karst, Daniel J; Steinebach, Fabian; Soos, Miroslav; Morbidelli, Massimo

    2017-02-01

    Continuous manufacturing is currently being seriously considered in the biopharmaceutical industry as the possible new paradigm for producing therapeutic proteins, due to production cost and product quality related benefits. In this study, a monoclonal antibody producing CHO cell line was cultured in perfusion mode and connected to a continuous affinity capture step. The reliable and stable integration of the two systems was enabled by suitable control loops, regulating the continuous volumetric flow and adapting the operating conditions of the capture process. For the latter, an at-line HPLC measurement of the harvest concentration subsequent to the bioreactor was combined with a mechanistic model of the capture chromatographic unit. Thereby, optimal buffer consumption and productivity throughout the process was realized while always maintaining a yield above the target value of 99%. Stable operation was achieved at three consecutive viable cell density set points (20, 60, and 40 × 10^6 cells/mL), together with consistent product quality in terms of aggregates, fragments, charge isoforms, and N-linked glycosylation. In addition, different values for these product quality attributes such as N-linked glycosylation, charge variants, and aggregate content were measured at the different steady states. As expected, the amount of released DNA and HCP was significantly reduced by the capture step for all considered upstream operating conditions. This study is exemplary for the potential of enhancing product quality control and modulation by integrated continuous manufacturing. Biotechnol. Bioeng. 2017;114: 298-307. © 2016 Wiley Periodicals, Inc.

  9. Material quality development during the automated tow placement process

    NASA Astrophysics Data System (ADS)

    Tierney, John Joseph

    Automated tow placement (ATP) of thermoplastic composites builds on the existing industrial base for equipment, robotics and kinematic placement of material with the aim of further cost reduction by eliminating the autoclave entirely. During ATP processing, thermoplastic composite tows are deposited on a preconsolidated substrate at rates ranging from 10--100mm/s and consolidated using the localized application of heat and pressure by a tow placement head mounted on a robot. The process is highly non-isothermal subjecting the material to multiple heating and cooling rates approaching 1000°C/sec. The requirement for the ATP process is to achieve the same quality in seconds (low void content, full translation of mechanical properties and degree of bonding and minimal warpage) as the autoclave process achieves in hours. The scientific challenge was to first understand and then model the relationships between processing, material response, microstructure and quality. The important phenomena affecting quality investigated in this study include a steady state heat transfer simulation, consolidation and deconsolidation (void dynamics), intimate contact and polymer interdiffusion (degree of bonding/mechanical properties) and residual stress and warpage (crystallization and viscoelastic response). A fundamental understanding of the role of materials related to these mechanisms and their relationship to final quality is developed and applied towards a method of process control and optimization.

  10. Image processing and Quality Control for the first 10,000 brain imaging datasets from UK Biobank.

    PubMed

    Alfaro-Almagro, Fidel; Jenkinson, Mark; Bangerter, Neal K; Andersson, Jesper L R; Griffanti, Ludovica; Douaud, Gwenaëlle; Sotiropoulos, Stamatios N; Jbabdi, Saad; Hernandez-Fernandez, Moises; Vallee, Emmanuel; Vidaurre, Diego; Webster, Matthew; McCarthy, Paul; Rorden, Christopher; Daducci, Alessandro; Alexander, Daniel C; Zhang, Hui; Dragonu, Iulius; Matthews, Paul M; Miller, Karla L; Smith, Stephen M

    2018-02-01

    UK Biobank is a large-scale prospective epidemiological study with all data accessible to researchers worldwide. It is currently in the process of bringing back 100,000 of the original participants for brain, heart and body MRI, carotid ultrasound and low-dose bone/fat x-ray. The brain imaging component covers 6 modalities (T1, T2 FLAIR, susceptibility weighted MRI, Resting fMRI, Task fMRI and Diffusion MRI). Raw and processed data from the first 10,000 imaged subjects has recently been released for general research access. To help convert this data into useful summary information we have developed an automated processing and QC (Quality Control) pipeline that is available for use by other researchers. In this paper we describe the pipeline in detail, following a brief overview of UK Biobank brain imaging and the acquisition protocol. We also describe several quantitative investigations carried out as part of the development of both the imaging protocol and the processing pipeline. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

  11. Plan delivery quality assurance for CyberKnife: Statistical process control analysis of 350 film-based patient-specific QAs.

    PubMed

    Bellec, J; Delaby, N; Jouyaux, F; Perdrieux, M; Bouvier, J; Sorel, S; Henry, O; Lafond, C

    2017-07-01

    Robotic radiosurgery requires plan delivery quality assurance (DQA) but there has never been a published comprehensive analysis of a patient-specific DQA process in a clinic. We proposed to evaluate 350 consecutive film-based patient-specific DQAs using statistical process control. We evaluated the performance of the process to propose achievable tolerance criteria for DQA validation and we sought to identify suboptimal DQA using control charts. DQAs were performed on a CyberKnife-M6 using Gafchromic-EBT3 films. The signal-to-dose conversion was performed using a multichannel-correction and a scanning protocol that combined measurement and calibration in a single scan. The DQA analysis comprised a gamma-index analysis at 3%/1.5mm and a separate evaluation of spatial and dosimetric accuracy of the plan delivery. Each parameter was plotted on a control chart and control limits were calculated. A capability index (Cpm) was calculated to evaluate the ability of the process to produce results within specifications. The analysis of capability showed that a gamma pass rate of 85% at 3%/1.5mm was highly achievable as acceptance criteria for DQA validation using a film-based protocol (Cpm>1.33). 3.4% of DQA were outside a control limit of 88% for gamma pass-rate. The analysis of the out-of-control DQA helped identify a dosimetric error in our institute for a specific treatment type. We have defined initial tolerance criteria for DQA validations. We have shown that the implementation of a film-based patient-specific DQA protocol with the use of control charts is an effective method to improve patient treatment safety on CyberKnife. Copyright © 2017 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
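The capability index Cpm reported above can be computed from DQA results; a Python sketch of the textbook two-sided Taguchi form (the paper's one-sided pass-rate specification would use the analogous one-sided variant):

```python
import math
import statistics

def cpm(samples, lsl, usl, target):
    """Taguchi capability index Cpm = (USL - LSL) / (6 * tau), where
    tau**2 = sigma**2 + (mean - target)**2 penalizes off-target runs
    as well as spread. Cpm > 1.33 is a common 'capable' threshold."""
    mu = statistics.fmean(samples)
    sigma = statistics.pstdev(samples)
    tau = math.sqrt(sigma ** 2 + (mu - target) ** 2)
    return (usl - lsl) / (6 * tau)
```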

  12. Printing quality control automation

    NASA Astrophysics Data System (ADS)

    Trapeznikova, O. V.

    2018-04-01

    One of the most important problems in standardizing the offset printing process is controlling print quality and automating that control. To solve this problem, software has been developed that takes into account the specifics of the printing system components and their behavior during the printing process. To characterize the distribution of ink on the printed substrate, the deviation of the ink-layer thickness on the sheet from a nominal surface is proposed. Constructing surface projections of the color-gamut bodies from geometric data makes it possible to visualize the color-reproduction gamut of printing systems across brightness ranges and specific color sectors, providing a qualitative comparison of systems by their reproduction of individual colors over varying ranges of brightness.

  13. Predictive displays for a process-control schematic interface.

    PubMed

    Yin, Shanqing; Wickens, Christopher D; Helander, Martin; Laberge, Jason C

    2015-02-01

    Our objective was to examine the extent to which increasing precision of predictive (rate of change) information in process control will improve performance on a simulated process-control task. Predictive displays have been found to be useful in process control (as well as aviation and maritime industries). However, authors of prior research have not examined the extent to which predictive value is increased by increasing predictor resolution, nor has such research tied potential improvements to changes in process control strategy. Fifty nonprofessional participants each controlled a simulated chemical mixture process (honey mixer simulation) that simulated the operations found in process control. Participants in each of five groups controlled with either no predictor or a predictor ranging in the resolution of prediction of the process. Increasing detail resolution generally increased the benefit of prediction over the control condition although not monotonically so. The best overall performance, combining quality and predictive ability, was obtained by the display of intermediate resolution. The two displays with the lowest resolution were clearly inferior. Predictors with higher resolution are of value but may trade off enhanced sensitivity to variable change (lower-resolution discrete state predictor) with smoother control action (higher-resolution continuous predictors). The research provides guidelines to the process-control industry regarding displays that can most improve operator performance.

  14. Information integration and diagnosis analysis of equipment status and production quality for machining process

    NASA Astrophysics Data System (ADS)

    Zan, Tao; Wang, Min; Hu, Jianzhong

    2010-12-01

    Multi-sensor machining-status monitoring acquires and analyzes machining-process information to diagnose abnormalities and give fault warnings. Statistical quality control is normally used to distinguish abnormal fluctuations from normal ones by statistical methods. This paper compares the advantages and disadvantages of the two methods and argues for the necessity and feasibility of their integration and fusion. An approach is then presented that integrates multi-sensor status monitoring with statistical process control, based on artificial intelligence, internet, and database techniques. Using virtual instrument techniques, the authors developed the machining quality assurance system MoniSysOnline, which has been used to monitor a grinding process. By analyzing the quality data and acoustic emission (AE) signals from the wheel-dressing process, the cause of machining-quality fluctuation was identified. The experimental results indicate that the approach is suitable for monitoring and analyzing machining processes.

  15. Case Studies in Modelling, Control in Food Processes.

    PubMed

    Glassey, J; Barone, A; Montague, G A; Sabou, V

    This chapter discusses the importance of modelling and control in increasing food process efficiency and ensuring product quality. Various approaches to both modelling and control in food processing are set in the context of the specific challenges in this industrial sector and latest developments in each area are discussed. Three industrial case studies are used to demonstrate the benefits of advanced measurement, modelling and control in food processes. The first case study illustrates the use of knowledge elicitation from expert operators in the process for the manufacture of potato chips (French fries) and the consequent improvements in process control to increase the consistency of the resulting product. The second case study highlights the economic benefits of tighter control of an important process parameter, moisture content, in potato crisp (chips) manufacture. The final case study describes the use of NIR spectroscopy in ensuring effective mixing of dry multicomponent mixtures and pastes. Practical implementation tips and infrastructure requirements are also discussed.
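Tighter control of a parameter such as moisture content typically rests on feedback from an online measurement; a deliberately minimal proportional-control sketch in Python (illustrative only, not taken from the case studies; real dryer control involves process dynamics, delays, and PI/PID tuning):

```python
def proportional_controller(setpoint, gain):
    """Return a feedback rule whose corrective action is proportional
    to the error from the setpoint, e.g. for crisp moisture content.
    Purely illustrative; setpoint and gain values are assumptions."""
    def adjust(measured):
        # Corrective action: gain times (setpoint - measured)
        return gain * (setpoint - measured)
    return adjust
```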

  16. Statistical process control for electron beam monitoring.

    PubMed

    López-Tarjuelo, Juan; Luquero-Llopis, Naika; García-Mollá, Rafael; Quirós-Higueras, Juan David; Bouché-Babiloni, Ana; Juan-Senabre, Xavier Jordi; de Marco-Blancas, Noelia; Ferrer-Albiach, Carlos; Santos-Serra, Agustín

    2015-07-01

    To assess the electron beam monitoring statistical process control (SPC) in linear accelerator (linac) daily quality control. We present a long-term record of our measurements and evaluate which SPC-led conditions are feasible for maintaining control. We retrieved our linac beam calibration, symmetry, and flatness daily records for all electron beam energies from January 2008 to December 2013, and retrospectively studied how SPC could have been applied and which of its features could be used in the future. A set of adjustment interventions designed to maintain these parameters under control was also simulated. All phase I data was under control. The dose plots were characterized by rising trends followed by steep drops caused by our attempts to re-center the linac beam calibration. Where flatness and symmetry trends were detected they were less-well defined. The process capability ratios ranged from 1.6 to 9.3 at a 2% specification level. Simulated interventions ranged from 2% to 34% of the total number of measurement sessions. We also noted that if prospective SPC had been applied it would have met quality control specifications. SPC can be used to assess the inherent variability of our electron beam monitoring system. It can also indicate whether a process is capable of maintaining electron parameters under control with respect to established specifications by using a daily checking device, but this is not practical unless a method to establish direct feedback from the device to the linac can be devised. Copyright © 2015 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.

  17. Development of Process Control Methodology for Tracking the Quality and Safety of Pain, Agitation, and Sedation Management in Critical Care Units.

    PubMed

    Walsh, Timothy S; Kydonaki, Kalliopi; Lee, Robert J; Everingham, Kirsty; Antonelli, Jean; Harkness, Ronald T; Cole, Stephen; Quasim, Tara; Ruddy, James; McDougall, Marcia; Davidson, Alan; Rutherford, John; Richards, Jonathan; Weir, Christopher J

    2016-03-01

    To develop sedation, pain, and agitation quality measures using process control methodology and evaluate their properties in clinical practice. A Sedation Quality Assessment Tool was developed and validated to capture data for 12-hour periods of nursing care. Domains included pain/discomfort and sedation-agitation behaviors; sedative, analgesic, and neuromuscular blocking drug administration; ventilation status; and conditions potentially justifying deep sedation. Predefined sedation-related adverse events were recorded daily. Using an iterative process, algorithms were developed to describe the proportion of care periods with poor limb relaxation, poor ventilator synchronization, unnecessary deep sedation, agitation, and an overall optimum sedation metric. Proportion charts described processes over time (2 monthly intervals) for each ICU. The numbers of patients treated between sedation-related adverse events were described with G charts. Automated algorithms generated charts for 12 months of sequential data. Mean values for each process were calculated, and variation within and between ICUs explored qualitatively. Eight Scottish ICUs over a 12-month period. Mechanically ventilated patients. None. The Sedation Quality Assessment Tool agitation-sedation domains correlated with the Richmond Sedation Agitation Scale score (Spearman ρ = 0.75) and were reliable in clinician-clinician (weighted kappa; κ = 0.66) and clinician-researcher (κ = 0.82) comparisons. The limb movement domain had fair correlation with Behavioral Pain Scale (ρ = 0.24) and was reliable in clinician-clinician (κ = 0.58) and clinician-researcher (κ = 0.45) comparisons. Ventilator synchronization correlated with Behavioral Pain Scale (ρ = 0.54), and reliability in clinician-clinician (κ = 0.29) and clinician-researcher (κ = 0.42) comparisons was fair-moderate. Eight hundred twenty-five patients were enrolled (range, 59-235 across ICUs), providing 12,385 care periods for evaluation (range 655
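The proportion charts described here follow standard p-chart construction; a generic Python sketch (not the study's actual algorithm):

```python
import math

def p_chart_limits(fail_counts, sample_sizes):
    """Proportion-chart (p-chart) limits for the fraction of care
    periods failing a quality metric: pbar +/- 3*sqrt(pbar*(1-pbar)/n)
    per subgroup, clipped to [0, 1]. A generic SPC sketch."""
    pbar = sum(fail_counts) / sum(sample_sizes)
    limits = []
    for n in sample_sizes:
        half = 3.0 * math.sqrt(pbar * (1.0 - pbar) / n)
        limits.append((max(0.0, pbar - half), min(1.0, pbar + half)))
    return pbar, limits
```

Subgroups whose observed proportion falls outside their limits point to care periods worth investigating rather than random variation.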

  18. Tacit Quality Leadership: Operationalized Quality Perceptions as a Source of Influence in the American Higher Education Accreditation Process

    ERIC Educational Resources Information Center

    Saurbier, Ann L.

    2013-01-01

    American post-secondary education faces unprecedented challenges in the dynamic 21st century environment. An appreciation of the higher education accreditation process, as a quality control mechanism, therefore may be seen as a significant priority. When American higher education is viewed systemically, the perceptions of quality held and…

  19. 40 CFR 75.21 - Quality assurance and quality control requirements.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 16 2010-07-01 2010-07-01 false Quality assurance and quality control... PROGRAMS (CONTINUED) CONTINUOUS EMISSION MONITORING Operation and Maintenance Requirements § 75.21 Quality assurance and quality control requirements. (a) Continuous emission monitoring systems. The owner or...

  20. Quality Control by Artificial Vision

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lam, Edmond Y.; Gleason, Shaun Scott; Niel, Kurt S.

    2010-01-01

    Computational technology has fundamentally changed many aspects of our lives. One clear evidence is the development of artificial-vision systems, which have effectively automated many manual tasks ranging from quality inspection to quantitative assessment. In many cases, these machine-vision systems are even preferred over manual ones due to their repeatability and high precision. Such advantages come from significant research efforts in advancing sensor technology, illumination, computational hardware, and image-processing algorithms. Similar to the Special Section on Quality Control by Artificial Vision published two years ago in Volume 17, Issue 3 of the Journal of Electronic Imaging, the present one invited papers relevant to fundamental technology improvements to foster quality control by artificial vision, and fine-tuned the technology for specific applications. We aim to balance both theoretical and applied work pertinent to this special section theme. Consequently, we have seven high-quality papers resulting from the stringent peer-reviewing process in place at the Journal of Electronic Imaging. Some of the papers contain extended treatment of the authors' work presented at the SPIE Image Processing: Machine Vision Applications conference and the International Conference on Quality Control by Artificial Vision. On the broad application side, Liu et al. propose an unsupervised texture image segmentation scheme. Using a multilayer data condensation spectral clustering algorithm together with wavelet transform, they demonstrate the effectiveness of their approach on both texture and synthetic aperture radar images. A problem related to image segmentation is image extraction. For this, O'Leary et al. investigate the theory of polynomial moments and show how these moments can be compared to classical filters. They also show how to use the discrete polynomial-basis functions for the extraction of 3-D embossed digits, demonstrating superiority over Fourier

  1. Process and control systems for composites manufacturing

    NASA Technical Reports Server (NTRS)

    Tsiang, T. H.; Wanamaker, John L.

    1992-01-01

    A precise control of composite material processing would not only improve part quality, but would also directly reduce the overall manufacturing cost. The development and incorporation of sensors will help to generate real-time information on material processing relationships and equipment characteristics. In the present work, thermocouple, pressure transducer, and dielectrometer technologies were investigated. The monitoring sensors were integrated with the computerized control system in three non-autoclave fabrication techniques: hot-press, self-contained tool (self-heating and pressurizing), and pressure vessel. The sensors were implemented in the parts and tools.

  2. 40 CFR 51.359 - Quality control.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 2 2013-07-01 2013-07-01 false Quality control. 51.359 Section 51.359....359 Quality control. Quality control measures shall insure that emission testing equipment is calibrated and maintained properly, and that inspection, calibration records, and control charts are...

  3. 40 CFR 51.359 - Quality control.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 2 2010-07-01 2010-07-01 false Quality control. 51.359 Section 51.359....359 Quality control. Quality control measures shall insure that emission testing equipment is calibrated and maintained properly, and that inspection, calibration records, and control charts are...

  4. 40 CFR 51.359 - Quality control.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 2 2011-07-01 2011-07-01 false Quality control. 51.359 Section 51.359....359 Quality control. Quality control measures shall insure that emission testing equipment is calibrated and maintained properly, and that inspection, calibration records, and control charts are...

  5. 40 CFR 51.359 - Quality control.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 2 2012-07-01 2012-07-01 false Quality control. 51.359 Section 51.359....359 Quality control. Quality control measures shall insure that emission testing equipment is calibrated and maintained properly, and that inspection, calibration records, and control charts are...

  6. Quality control in gastrointestinal surgery.

    PubMed

    Ramírez-Barba, Ector Jaime; Arenas-Moya, Diego; Vázquez-Guerrero, Arturo

    2011-01-01

    We analyzed the Mexican legal framework, identifying the vectors that characterize quality and control in gastrointestinal surgery. Quality is contemplated in the health protection rights determined according to the Mexican Constitution, established in the general health law and included as a specific goal in the actual National Development Plan and Health Sector Plan. Quality control implies planning, verification and application of corrective measures. Mexico has implemented several quality strategies such as certification of hospitals and regulatory agreements by the General Salubrity Council, creation of the National Health Quality Committee, generation of Clinical Practice Guidelines and the Certification of Medical Specialties, among others. Quality control in gastrointestinal surgery must begin at the time of medical education and continue during professional activities of surgeons, encouraging multidisciplinary teamwork, knowledge, abilities, attitudes, values and skills that promote homogeneous, safe and quality health services for the Mexican population.

  7. The ESA FRM4DOAS project: Towards a quality-controlled MAXDOAS Centralized Processing System

    NASA Astrophysics Data System (ADS)

    Hendrick, Francois; Fayt, Caroline; Friess, Udo; Kreher, Karin; Piters, Ankie; Richter, Andreas; Wagner, Thomas; Cede, Alexander; Spinei, Elena; von Bismarck, Jonas; Fehr, Thorsten; Van Roozendael, Michel

    2017-04-01

    The Fiducial Reference Measurements for Ground-Based DOAS Air-Quality Observations (FRM4DOAS) is a two-year project funded by the European Space Agency (ESA). Started in July 2016, FRM4DOAS aims at further harmonizing MAXDOAS measurements and data sets, through (1) the specification of best practices for instrument operation, (2) the selection of state-of-the art retrieval algorithms, procedures, and settings, (3) the demonstration of a centralised rapid-delivery (6-24h latency) processing system for MAXDOAS instruments to be operated within the international Network for the Detection of Atmospheric Composition Change (NDACC). The project also links with the Pandonia initiative. In a first phase, the system concentrates on the development of 3 key products: NO2 vertical profiles, total O3 and tropospheric HCHO profiles, which will be retrieved at 11 MAXDOAS pilot stations. The system will also be tested and validated on data from the CINDI-2 campaign, and designed to allow further extension after commissioning. These activities will help and guarantee that homogenous, fully traceable, and quality-controlled datasets are generated from reference ground-based UV-vis instruments, which will play a crucial role in the validation of future ESA/Copernicus Sentinel satellite missions S-5P, S-4, and S-5.

  8. Protein Quality Control and the Amyotrophic Lateral Sclerosis/Frontotemporal Dementia Continuum

    PubMed Central

    Shahheydari, Hamideh; Ragagnin, Audrey; Walker, Adam K.; Toth, Reka P.; Vidal, Marta; Jagaraj, Cyril J.; Perri, Emma R.; Konopka, Anna; Sultana, Jessica M.; Atkin, Julie D.

    2017-01-01

    Protein homeostasis, or proteostasis, has an important regulatory role in cellular function. Protein quality control mechanisms, including protein folding and protein degradation processes, have a crucial function in post-mitotic neurons. Cellular protein quality control relies on multiple strategies, including molecular chaperones, autophagy, the ubiquitin proteasome system, endoplasmic reticulum (ER)-associated degradation (ERAD) and the formation of stress granules (SGs), to regulate proteostasis. Neurodegenerative diseases are characterized by the presence of misfolded protein aggregates, implying that protein quality control mechanisms are dysfunctional in these conditions. Amyotrophic lateral sclerosis (ALS) and frontotemporal dementia (FTD) are neurodegenerative diseases that are now recognized to overlap clinically and pathologically, forming a continuous disease spectrum. In this review article, we detail the evidence for dysregulation of protein quality control mechanisms across the whole ALS-FTD continuum, by discussing the major proteins implicated in ALS and/or FTD. We also discuss possible ways in which protein quality mechanisms could be targeted therapeutically in these disorders and highlight promising protein quality control-based therapeutics for clinical trials. PMID:28539871

  9. Simulation of textile manufacturing processes for planning, scheduling, and quality control purposes

    NASA Astrophysics Data System (ADS)

    Cropper, A. E.; Wang, Z.

    1995-08-01

    Simulation, as a management information tool, has been applied to engineering manufacture and assembly operations. The application of the principles to textile manufacturing (fiber to fabric) is discussed. The particular problems and solutions in applying the simulation software package to the yarn production processes are discussed, with an indication of how the software achieves the production schedule. The system appears to have application in planning, scheduling, and quality assurance, the latter resulting from the traceability made possible by a process that involves mixing and splitting of material.

  10. 78 FR 69103 - 30-Day Notice of Proposed Information Collection: Quality Control for Rental Assistance Subsidy...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-18

    ... Information Collection: Quality Control for Rental Assistance Subsidy Determinations AGENCY: Office of the... Collection Title of Information Collection: Quality Control for Rental Assistance Subsidy Determinations. OMB... Quality Control process involves selecting a nationally representative sample of assisted households to...

  11. Interim results of quality-control sampling of surface water for the Upper Colorado River National Water-Quality Assessment Study Unit, water years 1995-96

    USGS Publications Warehouse

    Spahr, N.E.; Boulger, R.W.

    1997-01-01

    Quality-control samples provide part of the information needed to estimate the bias and variability that result from sample collection, processing, and analysis. Quality-control samples of surface water collected for the Upper Colorado River National Water-Quality Assessment study unit for water years 1995-96 are presented and analyzed in this report. The types of quality-control samples collected include pre-processing split replicates, concurrent replicates, sequential replicates, post-processing split replicates, and field blanks. Analysis of the pre-processing split replicates, concurrent replicates, sequential replicates, and post-processing split replicates is based on differences between analytical results of the environmental samples and analytical results of the quality-control samples. Results of these comparisons indicate that variability introduced by sample collection, processing, and handling is low and will not affect interpretation of the environmental data. The differences for most water-quality constituents are on the order of plus or minus 1 or 2 lowest rounding units. A lowest rounding unit is equivalent to the magnitude of the least significant figure reported for analytical results. The use of lowest rounding units avoids some of the difficulty in comparing differences between pairs of samples when concentrations span orders of magnitude, and it provides a measure of the practical significance of the effect of variability. Analysis of field-blank quality-control samples indicates that, with the exception of chloride and silica, no systematic contamination of samples is apparent. Chloride contamination probably was the result of incomplete rinsing of the dilute cleaning solution from the outlet ports of the Dekaport sample splitter. Silica contamination seems to have been introduced by the blank water. Sampling and processing procedures for water year 1997 have been modified as a result of these analyses.
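The "lowest rounding unit" comparison described above can be sketched in a few lines of Python. The helper names and the treatment of trailing integer zeros are assumptions for illustration, not the USGS implementation:

```python
from decimal import Decimal

def lowest_rounding_unit(reported: str) -> Decimal:
    """Magnitude of the least significant figure of a reported analytical
    result, e.g. '0.12' -> 0.01 and '140' -> 10 (trailing integer zeros
    are treated as rounding placeholders, an assumption of this sketch)."""
    d = Decimal(reported)
    exp = d.as_tuple().exponent
    if exp < 0:                               # digits after the decimal point
        return Decimal(1).scaleb(exp)
    digits = str(abs(int(d)))
    trailing = len(digits) - len(digits.rstrip("0")) if int(d) else 0
    return Decimal(1).scaleb(trailing)

def difference_in_lrus(environmental: str, replicate: str) -> Decimal:
    """Difference between an environmental sample and its replicate pair,
    expressed in lowest rounding units of the less precise value."""
    lru = max(lowest_rounding_unit(environmental), lowest_rounding_unit(replicate))
    return (Decimal(environmental) - Decimal(replicate)) / lru
```

Passing reported values as strings (rather than floats) preserves the reported precision, which is exactly what the lowest-rounding-unit comparison depends on.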

  12. 40 CFR 85.2233 - Steady state test equipment calibrations, adjustments, and quality control-EPA 91.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... compensated for automatically and statistical process control demonstrates equal or better quality control... calibrations, adjustments, and quality control-EPA 91. 85.2233 Section 85.2233 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) CONTROL OF AIR POLLUTION FROM MOBILE...

  13. Process air quality data

    NASA Technical Reports Server (NTRS)

    Butler, C. M.; Hogge, J. E.

    1978-01-01

    Air quality sampling was conducted. Data for air quality parameters, recorded on written forms, punched cards, or magnetic tape, are available for 1972 through 1975. Computer software was developed to (1) calculate several daily statistical measures of location, (2) plot time histories of data or the calculated daily statistics, (3) calculate simple correlation coefficients, and (4) plot scatter diagrams. Computer software was developed for processing air quality data to include time series analysis and goodness-of-fit tests. Computer software was developed to (1) calculate a larger number of daily statistical measures of location, and a number of daily, monthly, and yearly measures of location, dispersion, skewness, and kurtosis, (2) decompose the extended time series model, and (3) perform some goodness-of-fit tests. The computer program is described, documented, and illustrated by examples. Recommendations are made for continuation of the development of research on processing air quality data.
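The daily statistical measures of location and the simple correlation coefficients described above can be sketched as follows; this is a minimal Python stand-in for the report's software, with illustrative function names:

```python
from statistics import mean, median, pstdev

def daily_statistics(hourly_readings):
    """Daily measures of location and dispersion for one pollutant's
    hourly readings."""
    return {
        "mean": mean(hourly_readings),
        "median": median(hourly_readings),
        "min": min(hourly_readings),
        "max": max(hourly_readings),
        "stdev": pstdev(hourly_readings),
    }

def pearson_r(xs, ys):
    """Simple (Pearson) correlation coefficient between two series."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5
```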

  14. General aviation fuel quality control

    NASA Technical Reports Server (NTRS)

    Poitz, H.

    1983-01-01

    Quality control measures for aviation gasoline, and some of the differences between quality control on avgas and mogas are discussed. One thing to keep in mind is that with motor gasoline you can always pull off to the side of the road. It's not so easy to do in an airplane. Consequently, there are reasons for having the tight specifications and the tight quality control measures on avgas as compared to motor gasoline.

  15. Real-time control of combined surface water quantity and quality: polder flushing.

    PubMed

    Xu, M; van Overloop, P J; van de Giesen, N C; Stelling, G S

    2010-01-01

    In open water systems, keeping both water depths and water quality at specified values is critical for maintaining a 'healthy' water system. Many systems still require manual operation, at least for water quality management. When applying real-time control, both quantity and quality standards need to be met. In this paper, an artificial polder flushing case is studied. Model Predictive Control (MPC) is developed to control the system. In addition to MPC, a 'forward estimation' procedure is used to acquire water quality predictions for the simplified model used in MPC optimization. In order to illustrate the advantages of MPC, classical control [Proportional-Integral control (PI)] has been developed for comparison in the test case. The results show that both algorithms are able to control the polder flushing process, but MPC is more efficient in functionality and control flexibility.
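The classical PI baseline used for comparison in the paper can be illustrated with a toy single-storage polder model. The mass balance, gains, and all numerical values below are illustrative assumptions, not the authors' model:

```python
def simulate_pi_flushing(steps=200, dt=60.0, area=1.0e4, inflow=0.5,
                         setpoint=1.0, h0=1.2, kp=10.0, ki=0.01):
    """Discrete PI control of water depth in a one-storage polder model.
    State: depth h [m]; control: gate outflow q_out [m^3/s].
    Returns the final depth after `steps` control intervals of `dt` seconds."""
    h, integral = h0, 0.0
    for _ in range(steps):
        error = h - setpoint                  # positive -> water too deep
        integral += error * dt
        # PI law plus inflow feedforward; a real gate cannot pump backwards,
        # so the commanded outflow is clamped at zero
        q_out = max(0.0, kp * error + ki * integral + inflow)
        h += (inflow - q_out) * dt / area     # simple mass balance
    return h
```

With these illustrative gains the controller regulates the depth to the setpoint; an MPC scheme, as the paper argues, would additionally optimize quality (e.g. flushing) objectives over a prediction horizon.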

  16. 40 CFR 51.359 - Quality control.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 2 2014-07-01 2014-07-01 false Quality control. 51.359 Section 51.359 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS REQUIREMENTS FOR... to assure test accuracy. Computer control of quality assurance checks and quality control charts...

  17. Quality risk management of top spray fluidized bed process for antihypertensive drug formulation with control strategy engendered by Box-Behnken experimental design space.

    PubMed

    Mukharya, Amit; Patel, Paresh U; Shenoy, Dinesh; Chaudhary, Shivang

    2013-01-01

    Lacidipine (LCDP) is a poorly soluble and highly biovariable calcium channel blocker used in the treatment of hypertension. To increase its apparent solubility and to reduce its biovariability, solid dispersion fluid bed processing technology was explored, as it produces highly dispersible granules with a characteristic porous structure that enhances the dispersibility, wettability, blend uniformity (by dissolving and spraying a solution of actives), flowability, and compressibility of granules for tableting, and reduces variability through uniform distribution of the drug-binder solution on carrier molecules. The main objective of this quality risk management (QRM) study is to provide a robust and rugged Fluidized Bed Process (FBP) for the preparation of LCDP tablets with the desired quality (stability) and performance (dissolution) by the quality by design (QbD) concept. The study focuses principally on a thorough mechanistic understanding of the FBP by which it is developed and scaled up, with knowledge of the critical risks involved in the manufacturing process analyzed by risk assessment tools such as qualitative Initial Risk-based Matrix Analysis (IRMA) and quantitative Failure Mode Effects Analysis (FMEA) to identify and rank parameters with the potential to have an impact on In-Process/Finished Product Critical Quality Attributes (IP/FP CQAs). These Critical Process Parameters (CPPs) were further refined by DoE and MVDA to develop a design space with Real Time Release Testing (RTRT), leading to the implementation of a control strategy that achieves consistent finished product quality at lab scale and prevents possible product failure at larger manufacturing scale.

  18. Process safety improvement--quality and target zero.

    PubMed

    Van Scyoc, Karl

    2008-11-15

    Process safety practitioners have adopted quality management principles in design of process safety management systems with positive effect, yet achieving safety objectives sometimes remain a distant target. Companies regularly apply tools and methods which have roots in quality and productivity improvement. The "plan, do, check, act" improvement loop, statistical analysis of incidents (non-conformities), and performance trending popularized by Dr. Deming are now commonly used in the context of process safety. Significant advancements in HSE performance are reported after applying methods viewed as fundamental for quality management. In pursuit of continual process safety improvement, the paper examines various quality improvement methods, and explores how methods intended for product quality can be additionally applied to continual improvement of process safety. Methods such as Kaizen, Poke yoke, and TRIZ, while long established for quality improvement, are quite unfamiliar in the process safety arena. These methods are discussed for application in improving both process safety leadership and field work team performance. Practical ways to advance process safety, based on the methods, are given.

  19. 7 CFR 981.42 - Quality control.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 7 Agriculture 8 2012-01-01 2012-01-01 false Quality control. 981.42 Section 981.42 Agriculture Regulations of the Department of Agriculture (Continued) AGRICULTURAL MARKETING SERVICE (Marketing Agreements... Regulating Handling Quality Control § 981.42 Quality control. (a) Incoming. Except as provided in this...

  20. 7 CFR 981.42 - Quality control.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 8 2013-01-01 2013-01-01 false Quality control. 981.42 Section 981.42 Agriculture Regulations of the Department of Agriculture (Continued) AGRICULTURAL MARKETING SERVICE (MARKETING AGREEMENTS... Regulating Handling Quality Control § 981.42 Quality control. (a) Incoming. Except as provided in this...

  1. 7 CFR 981.42 - Quality control.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 8 2011-01-01 2011-01-01 false Quality control. 981.42 Section 981.42 Agriculture Regulations of the Department of Agriculture (Continued) AGRICULTURAL MARKETING SERVICE (Marketing Agreements... Regulating Handling Quality Control § 981.42 Quality control. (a) Incoming. Except as provided in this...

  2. 7 CFR 981.42 - Quality control.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 8 2010-01-01 2010-01-01 false Quality control. 981.42 Section 981.42 Agriculture Regulations of the Department of Agriculture (Continued) AGRICULTURAL MARKETING SERVICE (Marketing Agreements... Regulating Handling Quality Control § 981.42 Quality control. (a) Incoming. Except as provided in this...

  3. Production system with process quality control: modelling and application

    NASA Astrophysics Data System (ADS)

    Tsou, Jia-Chi

    2010-07-01

    Over the past decade, there has been a great deal of research dedicated to the study of quality and the economics of production. In this article, we develop a dynamic model based on the assumptions of a traditional economic production quantity (EPQ) model. Taguchi's loss function is used to evaluate the cost of poor quality in the dynamic production system. A practical case from the automotive industry, which uses the Six-sigma DMAIC methodology, is discussed to verify the proposed model. This study shows that there is an optimal value of quality investment that makes the production system reach a reasonable quality level and minimises the production cost. Based on our model, management can adjust its investment in quality improvement to generate considerable financial return.
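Taguchi's quadratic loss, and the trade-off between quality investment and cost of poor quality that the article studies, can be sketched as follows. The relation between investment and process spread below is a hypothetical illustration, not the article's model:

```python
def expected_taguchi_loss(k, mean, std, target):
    """Expected quadratic (Taguchi) loss per unit:
    E[k*(y - target)^2] = k*(std^2 + (mean - target)^2)."""
    return k * (std ** 2 + (mean - target) ** 2)

def total_unit_cost(investment, base_std, k, mean, target, alpha=0.5):
    """Illustrative trade-off: quality investment shrinks the process
    spread (std = base_std / (1 + alpha*investment)); total cost is the
    investment plus the expected Taguchi loss."""
    std = base_std / (1.0 + alpha * investment)
    return investment + expected_taguchi_loss(k, mean, std, target)
```

Scanning investment levels with these toy parameters shows an interior minimum: spending nothing leaves a large quality loss, while over-investing costs more than it saves, mirroring the article's conclusion that an optimal quality investment exists.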

  4. 7 CFR 930.44 - Quality control.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 8 2011-01-01 2011-01-01 false Quality control. 930.44 Section 930.44 Agriculture Regulations of the Department of Agriculture (Continued) AGRICULTURAL MARKETING SERVICE (Marketing Agreements... Control § 930.44 Quality control. (a) Quality standards. The Board may establish, with the approval of the...

  5. 7 CFR 930.44 - Quality control.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 8 2010-01-01 2010-01-01 false Quality control. 930.44 Section 930.44 Agriculture Regulations of the Department of Agriculture (Continued) AGRICULTURAL MARKETING SERVICE (Marketing Agreements... Control § 930.44 Quality control. (a) Quality standards. The Board may establish, with the approval of the...

  6. 7 CFR 930.44 - Quality control.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 7 Agriculture 8 2012-01-01 2012-01-01 false Quality control. 930.44 Section 930.44 Agriculture Regulations of the Department of Agriculture (Continued) AGRICULTURAL MARKETING SERVICE (Marketing Agreements... Control § 930.44 Quality control. (a) Quality standards. The Board may establish, with the approval of the...

  7. 7 CFR 930.44 - Quality control.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 8 2013-01-01 2013-01-01 false Quality control. 930.44 Section 930.44 Agriculture Regulations of the Department of Agriculture (Continued) AGRICULTURAL MARKETING SERVICE (MARKETING AGREEMENTS... Control § 930.44 Quality control. (a) Quality standards. The Board may establish, with the approval of the...

  8. Project T.E.A.M. (Technical Education Advancement Modules). Advanced Statistical Process Control.

    ERIC Educational Resources Information Center

    Dunlap, Dale

    This instructional guide, one of a series developed by the Technical Education Advancement Modules (TEAM) project, is a 20-hour advanced statistical process control (SPC) and quality improvement course designed to develop the following competencies: (1) understanding quality systems; (2) knowing the process; (3) solving quality problems; and (4)…
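The SPC competencies listed above rest on a simple computation: control limits around the process mean. A minimal 3-sigma individuals-chart sketch (illustrative, not part of the TEAM module):

```python
def control_limits(samples):
    """Shewhart-style 3-sigma control limits for individual measurements.
    Returns (LCL, centre line, UCL)."""
    n = len(samples)
    mean = sum(samples) / n
    var = sum((x - mean) ** 2 for x in samples) / (n - 1)  # sample variance
    sigma = var ** 0.5
    return mean - 3 * sigma, mean, mean + 3 * sigma

def out_of_control(samples, lcl, ucl):
    """Points falling outside the control limits (rule 1 of the
    Western Electric rules)."""
    return [x for x in samples if x < lcl or x > ucl]
```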

  9. Process Analytical Technology (PAT): batch-to-batch reproducibility of fermentation processes by robust process operational design and control.

    PubMed

    Gnoth, S; Jenzsch, M; Simutis, R; Lübbert, A

    2007-10-31

    The Process Analytical Technology (PAT) initiative of the FDA is a response to the growing discrepancy between what is now possible in the supervision and control of pharmaceutical production processes and what is actually applied in industrial manufacturing. For many years, rigid approval practices based on standard operating procedures more or less inhibited adaptation of production reactors to the state of the art. PAT now paves the way for continuous process and product improvements through improved process supervision based on knowledge-based data analysis, "Quality-by-Design" concepts, and, finally, feedback control. Examples of up-to-date implementations of this concept are presented. They are taken from one key group of processes in recombinant pharmaceutical protein manufacturing: the cultivation of genetically modified Escherichia coli bacteria.

  10. Guidelines for the processing and quality assurance of benthic invertebrate samples collected as part of the National Water-Quality Assessment Program

    USGS Publications Warehouse

    Cuffney, T.F.; Gurtz, M.E.; Meador, M.R.

    1993-01-01

    Benthic invertebrate samples are collected as part of the U.S. Geological Survey's National Water-Quality Assessment Program. This is a perennial, multidisciplinary program that integrates biological, physical, and chemical indicators of water quality to evaluate status and trends and to develop an understanding of the factors controlling observed water quality. The Program examines water quality in 60 study units (coupled ground- and surface-water systems) that encompass most of the conterminous United States and parts of Alaska and Hawaii. Study-unit teams collect and process qualitative and semi-quantitative invertebrate samples according to standardized procedures. These samples are processed (elutriated and subsampled) in the field to produce as many as four sample components: large-rare, main-body, elutriate, and split. Each sample component is preserved in 10-percent formalin, and two components, large-rare and main-body, are sent to contract laboratories for further processing. The large-rare component is composed of large invertebrates that are removed from the sample matrix during field processing and placed in one or more containers. The main-body sample component consists of the remaining sample materials (sediment, detritus, and invertebrates) and is subsampled in the field to achieve a volume of 750 milliliters or less. The remaining two sample components, elutriate and split, are used for quality-assurance and quality-control purposes. Contract laboratories are used to identify and quantify invertebrates from the large-rare and main-body sample components according to the procedures and guidelines specified within this document. These guidelines allow the use of subsampling techniques to reduce the volume of sample material processed and to facilitate identifications. These processing procedures and techniques may be modified if the modifications provide equal or greater levels of accuracy and precision. The intent of sample processing is to

  11. Harmonisation Initiatives of Copernicus Data Quality Control

    NASA Astrophysics Data System (ADS)

    Vescovi, F. D.; Lankester, T.; Coleman, E.; Ottavianelli, G.

    2015-04-01

    The Copernicus Space Component Data Access system (CSCDA) incorporates data contributions from a wide range of satellite missions. Through EO data handling and distribution, CSCDA serves a set of Copernicus Services related to Land, Marine and Atmosphere Monitoring, Emergency Management and Security and Climate Change. The quality of the delivered EO products is the responsibility of each contributing mission, and the Copernicus data Quality Control (CQC) service supports and complements such data quality control activities. The mission of the CQC is to provide a service of quality assessment on the provided imagery, to support the investigation related to product quality anomalies, and to guarantee harmonisation and traceability of the quality information. In terms of product quality control, the CQC carries out analysis of representative sample products for each contributing mission as well as coordinating data quality investigation related to issues found or raised by Copernicus users. Results from the product analysis are systematically collected and the derived quality reports stored in a searchable database. The CQC service can be seen as a privileged focal point with unique comparison capacities across the data providers. The comparison among products from different missions suggests the need for a strong, common effort of harmonisation. Technical terms, definitions, metadata, file formats, processing levels, algorithms, cal/val procedures etc. are far from being homogeneous, and this may generate inconsistencies and confusion among users of EO data. The CSCDA CQC team plays a significant role in promoting harmonisation initiatives across the numerous contributing missions, so that a common effort can achieve optimal complementarity and compatibility among the EO data from multiple data providers. This effort is done in coordination with important initiatives already working towards these goals (e.g. INSPIRE directive, CEOS initiatives, OGC standards, QA4EO).

  12. 14 CFR 21.139 - Quality control.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 14 Aeronautics and Space 1 2010-01-01 2010-01-01 false Quality control. 21.139 Section 21.139... PROCEDURES FOR PRODUCTS AND PARTS Production Certificates § 21.139 Quality control. The applicant must show that he has established and can maintain a quality control system for any product, for which he...

  13. 14 CFR 21.139 - Quality control.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 14 Aeronautics and Space 1 2011-01-01 2011-01-01 false Quality control. 21.139 Section 21.139... PROCEDURES FOR PRODUCTS AND PARTS Production Certificates § 21.139 Quality control. The applicant must show that he has established and can maintain a quality control system for any product, for which he...

  14. 40 CFR 136.7 - Quality assurance and quality control.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... quality control elements, where applicable, into the laboratory's documented standard operating procedure... quality control elements must be clearly documented in the written standard operating procedure for each... Methods contains QA/QC procedures in the Part 1000 section of the Standard Methods Compendium. The...

  15. 40 CFR 136.7 - Quality assurance and quality control.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... quality control elements, where applicable, into the laboratory's documented standard operating procedure... quality control elements must be clearly documented in the written standard operating procedure for each... Methods contains QA/QC procedures in the Part 1000 section of the Standard Methods Compendium. The...

  16. 40 CFR 136.7 - Quality assurance and quality control.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... quality control elements, where applicable, into the laboratory's documented standard operating procedure... quality control elements must be clearly documented in the written standard operating procedure for each... Methods contains QA/QC procedures in the Part 1000 section of the Standard Methods Compendium. The...

  17. Method for enhanced control of welding processes

    DOEpatents

    Sheaffer, Donald A.; Renzi, Ronald F.; Tung, David M.; Schroder, Kevin

    2000-01-01

    Method and system for producing high quality welds in welding processes, in general, and gas tungsten arc (GTA) welding, in particular, by controlling weld penetration. Light emitted from a weld pool is collected from the backside of a workpiece by optical means during welding and transmitted to a digital video camera for further processing, after the emitted light is first passed through a short-wavelength pass filter to remove infrared radiation. By filtering out the infrared component of the light emitted from the backside weld pool image, the present invention provides for the accurate determination of the weld pool boundary. Data from the digital camera is fed to an imaging board which focuses on a 100×100-pixel portion of the image. The board performs a thresholding operation and provides this information to a digital signal processor to compute the backside weld pool dimensions and area. This information is used by a control system, in a dynamic feedback mode, to automatically adjust appropriate parameters of a welding system, such as the welding current, to control weld penetration and thus create a uniform weld bead and high quality weld.
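The thresholding-and-area step of this control loop can be sketched as follows; the per-pixel calibration constant and the proportional adjustment rule are illustrative assumptions, not the patent's algorithm:

```python
def weld_pool_area(image, threshold, pixel_area_mm2=0.01):
    """Threshold a grayscale backside weld-pool image (2-D list of pixel
    intensities) and estimate the pool area as the bright-pixel count
    times an assumed per-pixel calibration area."""
    bright = sum(1 for row in image for px in row if px >= threshold)
    return bright * pixel_area_mm2

def current_adjustment(measured_area, target_area, gain=0.5):
    """Toy proportional feedback on welding current: reduce the current
    when the pool (hence penetration) is too large, raise it when too
    small."""
    return gain * (target_area - measured_area)
```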

  18. A quality-refinement process for medical imaging applications.

    PubMed

    Neuhaus, J; Maleike, D; Nolden, M; Kenngott, H-G; Meinzer, H-P; Wolf, I

    2009-01-01

    To introduce and evaluate a process for the refinement of software quality that is suitable for research groups. In order not to constrain researchers too much, the quality improvement process has to be designed carefully. The scope of this paper is to present and evaluate a process for advancing quality aspects of existing research prototypes in order to make them ready for initial clinical studies. The proposed process is tailored to research environments and is therefore more lightweight than traditional quality management processes: it focuses on the quality criteria that are important at the given stage of the software life cycle, and it emphasizes tools that automate aspects of the process. To evaluate the additional effort that comes along with the process, it was applied, as an example, to eight prototypical software modules for medical image processing. The introduced process was applied to improve the quality of all prototypes so that they could be successfully used in clinical studies. The quality refinement required an average of 13 person-days of additional effort per project. Overall, 107 bugs were found and resolved by applying the process. Careful selection of quality criteria and the use of automated process tools lead to a lightweight quality refinement process, suitable for scientific research groups, that can be applied to ensure a successful transfer of technical software prototypes into clinical research workflows.

  19. Rapid evaluation and quality control of next generation sequencing data with FaQCs.

    PubMed

    Lo, Chien-Chi; Chain, Patrick S G

    2014-11-19

    Next generation sequencing (NGS) technologies that parallelize the sequencing process and produce thousands to millions, or even hundreds of millions of sequences in a single sequencing run, have revolutionized genomic and genetic research. Because of the vagaries of any platform's sequencing chemistry, the experimental processing, machine failure, and so on, the quality of sequencing reads is never perfect, and often declines as the read is extended. These errors invariably affect downstream analysis/application and should therefore be identified early on to mitigate any unforeseen effects. Here we present a novel FastQ Quality Control Software (FaQCs) that can rapidly process large volumes of data, and which improves upon previous solutions to monitor the quality and remove poor quality data from sequencing runs. Both the speed of processing and the memory footprint of storing all required information have been optimized via algorithmic and parallel processing solutions. The trimmed output compared side-by-side with the original data is part of the automated PDF output. We show how this tool can help data analysis by providing a few examples, including an increased percentage of reads recruited to references, improved single nucleotide polymorphism identification as well as de novo sequence assembly metrics. FaQCs combines several features of currently available applications into a single, user-friendly process, and includes additional unique capabilities such as filtering the PhiX control sequences, conversion of FASTQ formats, and multi-threading. The original data and trimmed summaries are reported within a variety of graphics and reports, providing a simple way to do data quality control and assurance.
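Basic FASTQ quality handling of the kind FaQCs automates can be illustrated with a simple Phred+33 decode and 3' end-trim. This is a generic sketch, not FaQCs' own (more elaborate, windowed) trimming algorithm:

```python
def phred_scores(qual_line, offset=33):
    """Decode an ASCII-encoded FASTQ quality string into Phred scores
    (Phred+33 encoding assumed)."""
    return [ord(c) - offset for c in qual_line]

def trim_3prime(seq, qual_line, min_q=20):
    """Trim low-quality bases from the 3' end until a base with quality
    >= min_q is reached; returns the trimmed sequence and quality
    string."""
    scores = phred_scores(qual_line)
    end = len(seq)
    while end > 0 and scores[end - 1] < min_q:
        end -= 1
    return seq[:end], qual_line[:end]
```

This reflects the abstract's observation that read quality often declines as the read is extended, which is why 3'-end trimming is the canonical first QC step.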

  20. Study on Quality Standard of Processed Curcuma Longa Radix

    PubMed Central

    Zhao, Yongfeng; Quan, Liang; Zhou, Haiting; Cao, Dong; Li, Wenbing; Yang, Zhuo

    2017-01-01

    To control the quality of Curcuma Longa Radix by establishing quality standards, this paper adds determinations of extract content and volatile oil content. Curcumin was selected as the internal marker, and the relative correction factors (RCFs) of demethoxycurcumin and bisdemethoxycurcumin were established by high-performance liquid chromatography (HPLC). The contents of the multiple components were calculated based on their RCFs. The rationality and feasibility of the methods were evaluated by comparing the quantitative results of the external standard method (ESM) and quantitative analysis of multicomponents by a single marker (QAMS). Ethanol extract content ranged from 9.749 to 15.644%, with a mean value of 13.473%. Volatile oil content ranged from 0.45 to 0.90 mL/100 g, with a mean value of 0.66 mL/100 g. The method is accurate and feasible and could provide a reference for further comprehensive and effective control of the quality standard of Curcuma Longa Radix and its processed products. PMID:29375640
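The single-marker (QAMS) calculation described above can be sketched as follows, using one common convention for the relative correction factor (RCF); the convention and all numbers are illustrative assumptions, not the paper's values:

```python
def relative_correction_factor(a_marker, c_marker, a_i, c_i):
    """RCF of component i relative to the internal marker, determined
    from a calibration run with known concentrations:
    f_i = (A_marker * C_i) / (A_i * C_marker)."""
    return (a_marker * c_i) / (a_i * c_marker)

def qams_content(a_marker, c_marker, a_i, rcf_i):
    """Content of component i in a sample from its peak area, the
    marker's peak area and concentration, and the pre-established RCF:
    C_i = f_i * C_marker * A_i / A_marker."""
    return rcf_i * c_marker * a_i / a_marker
```

With this convention the round trip is self-consistent: an RCF fitted from a calibration run reproduces the known concentration when plugged back into `qams_content`, so only the marker needs a reference standard at analysis time.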

  1. Quality risk management of top spray fluidized bed process for antihypertensive drug formulation with control strategy engendered by Box-Behnken experimental design space

    PubMed Central

    Mukharya, Amit; Patel, Paresh U; Shenoy, Dinesh; Chaudhary, Shivang

    2013-01-01

    Introduction: Lacidipine (LCDP) is a poorly soluble and highly biovariable calcium channel blocker used in the treatment of hypertension. To increase its apparent solubility and to reduce its biovariability, solid dispersion fluid bed processing technology was explored, as it produces highly dispersible granules with a characteristic porous structure that enhances the dispersibility, wettability, blend uniformity (by dissolving and spraying a solution of actives), flowability, and compressibility of granules for tableting, and reduces variability through uniform distribution of the drug-binder solution on carrier molecules. Materials and Methods: The main objective of this quality risk management (QRM) study is to provide a robust and rugged Fluidized Bed Process (FBP) for the preparation of LCDP tablets with the desired quality (stability) and performance (dissolution) by the quality by design (QbD) concept. Results and Conclusion: The study focuses principally on a thorough mechanistic understanding of the FBP by which it is developed and scaled up, with knowledge of the critical risks involved in the manufacturing process analyzed by risk assessment tools such as qualitative Initial Risk-based Matrix Analysis (IRMA) and quantitative Failure Mode Effects Analysis (FMEA) to identify and rank parameters with the potential to have an impact on In-Process/Finished Product Critical Quality Attributes (IP/FP CQAs). These Critical Process Parameters (CPPs) were further refined by DoE and MVDA to develop a design space with Real Time Release Testing (RTRT), leading to the implementation of a control strategy that achieves consistent finished product quality at lab scale and prevents possible product failure at larger manufacturing scale. PMID:23799202

  2. The Use of Logistics in the Quality Parameters Control System of Material Flow

    ERIC Educational Resources Information Center

    Karpova, Natalia P.; Toymentseva, Irina A.; Shvetsova, Elena V.; Chichkina, Vera D.; Chubarkova, Elena V.

    2016-01-01

    The relevance of the research problem is conditioned on the need to justify the use of the logistics methodologies in the quality parameters control process of material flows. The goal of the article is to develop theoretical principles and practical recommendations for logistical system control in material flows quality parameters. A leading…

  3. [Application of quality by design in granulation process for ginkgo leaf tablet (Ⅱ): identification of critical quality attributes].

    PubMed

    Xu, Bing; Cui, Xiang-Long; Yang, Chan; Wang, Xin; Shi, Xin-Yuan; Qiao, Yan-Jiang

    2017-03-01

    Quality by design (QbD) highlights the concept of "begin with the end", which means to thoroughly understand the target product quality first, and then let it guide pharmaceutical process development and quality control throughout the whole manufacturing process. In this paper, the Ginkgo biloba granule intermediates were taken as the research object, and the tensile strength requirements of the tablets were treated as the goal in establishing methods for identifying the granules' critical quality attributes (CQAs) and setting CQA limits. Firstly, an orthogonal partial least squares (OPLS) model was adopted to relate the micromeritic properties of 29 batches of granules to the tensile strength of ginkgo leaf tablets, and potential critical quality attributes (pCQAs) were screened by variable importance in projection (VIP) indexes. Then, a series of OPLS models were rebuilt by removing pCQA variables one by one in order of increasing VIP value. The model performance results demonstrated that the calibration and predictive performance of the model showed no decreasing trend after variable reduction. In consideration of the variable selection results as well as the collinearity test and the testability of the pCQAs, the median particle size (D₅₀) and the bulk density (Da) were identified as critical quality attributes (CQAs). The design space of the CQAs was developed from a multiple linear regression model established between the CQAs (D₅₀ and Da) and the tensile strength. The control constraints of the CQAs were determined as 170 μm < D₅₀ < 500 μm and Da > 0.30 g•cm⁻³, which could be used for controlling and optimizing the wet granulation process of the ginkgo leaf tablet. Copyright© by the Chinese Pharmaceutical Association.

  4. A new hyperspectral imaging based device for quality control in plastic recycling

    NASA Astrophysics Data System (ADS)

    Bonifazi, G.; D'Agostini, M.; Dall'Ava, A.; Serranti, S.; Turioni, F.

    2013-05-01

    The quality control of contamination levels in recycled plastics streams has been identified as a key factor for increasing the value of the recycled material by both the plastic recycling and compounding industries. Existing quality control methods for the detection of plastic and non-plastic contaminants in plastic waste streams at different stages of the industrial process (e.g. feed, intermediate and final products) are currently based on the manual collection of a sample from the stream and on subsequent off-line laboratory analyses. The results of such analyses are usually available hours, or sometimes even days, after the material has been processed. The laboratory analyses are time-consuming and expensive (in terms of equipment cost and maintenance as well as labour cost). Therefore, a fast on-line assessment to monitor the plastic waste feed streams and to characterize the composition of the different plastic products is fundamental to increase the value of secondary plastics. The paper describes and evaluates the development of an HSI-based device and of the related software architectures and processing algorithms for quality assessment of plastics in recycling plants, with particular reference to polyolefins (PO). NIR-HSI sensing devices coupled with multivariate data analysis methods were demonstrated to be an objective, rapid and non-destructive technique that can be used for on-line quality and process control in the recycling of POs. In particular, the adoption of the previously mentioned integrated hardware and software architectures can provide a solution to one of the major problems of the recycling industry: the lack of an accurate quality certification of materials obtained by recycling processes. These results could therefore assist in developing strategies to certify the composition of recycled PO products.

  5. 75 FR 17942 - Notice of Proposed Information Collection for Public Comment on the Quality Control for Rental...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-08

    ... Information Collection for Public Comment on the Quality Control for Rental Assistance Subsidy Determinations... respondent burden (e.g., permitting electronic submission of responses). Title of Proposal: Quality Control... covered by the Public Housing and Section 8 housing subsidies. The Quality Control process involves...

  6. MASQOT: a method for cDNA microarray spot quality control

    PubMed Central

    Bylesjö, Max; Eriksson, Daniel; Sjödin, Andreas; Sjöström, Michael; Jansson, Stefan; Antti, Henrik; Trygg, Johan

    2005-01-01

    Background cDNA microarray technology has emerged as a major player in the parallel detection of biomolecules, but still suffers from fundamental technical problems. Identifying and removing unreliable data is crucial to prevent the risk of receiving illusive analysis results. Visual assessment of spot quality is still a common procedure, despite the time-consuming work of manually inspecting spots in the range of hundreds of thousands or more. Results A novel methodology for cDNA microarray spot quality control is outlined. Multivariate discriminant analysis was used to assess spot quality based on existing and novel descriptors. The presented methodology displays high reproducibility and was found superior in identifying unreliable data compared to other evaluated methodologies. Conclusion The proposed methodology for cDNA microarray spot quality control generates non-discrete values of spot quality which can be utilized as weights in subsequent analysis procedures as well as to discard spots of undesired quality using the suggested threshold values. The MASQOT approach provides a consistent assessment of spot quality and can be considered an alternative to the labor-intensive manual quality assessment process. PMID:16223442
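    The idea of a non-discrete spot quality score from multivariate discriminant analysis can be sketched generically; the descriptors, their distributions, and the classifier below are illustrative assumptions, not MASQOT's actual descriptors or model.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(2)
# Hypothetical spot descriptors: signal-to-noise, circularity, bleed (names are illustrative)
good = rng.normal(loc=[8.0, 0.9, 0.1], scale=0.5, size=(200, 3))
bad = rng.normal(loc=[3.0, 0.5, 0.8], scale=0.5, size=(200, 3))
X = np.vstack([good, bad])
y = np.array([1] * 200 + [0] * 200)    # 1 = reliable spot, 0 = unreliable

lda = LinearDiscriminantAnalysis().fit(X, y)
# Posterior probability of the "good" class: a continuous quality score in [0, 1]
score = lda.predict_proba(X)[:, 1]
weights = score                        # usable as weights in downstream analyses
keep = score > 0.5                     # or thresholded to discard unreliable spots
```

    The continuous score is what allows weighting rather than a hard keep/discard decision.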

  7. System for verifiable CT radiation dose optimization based on image quality. Part II. Process control system.

    PubMed

    Larson, David B; Malarik, Remo J; Hall, Seth M; Podberesky, Daniel J

    2013-10-01

    To evaluate the effect of an automated computed tomography (CT) radiation dose optimization and process control system on the consistency of estimated image noise and size-specific dose estimates (SSDEs) of radiation in CT examinations of the chest, abdomen, and pelvis. This quality improvement project was determined not to constitute human subject research. An automated system was developed to analyze each examination immediately after completion, and to report individual axial-image-level and study-level summary data for patient size, image noise, and SSDE. The system acquired data for 4 months beginning October 1, 2011. Protocol changes were made by using parameters recommended by the prediction application, and 3 months of additional data were acquired. Preimplementation and postimplementation mean image noise and SSDE were compared by using unpaired t tests and F tests. Common-cause variation was differentiated from special-cause variation by using a statistical process control individual chart. A total of 817 CT examinations, 490 acquired before and 327 acquired after the initial protocol changes, were included in the study. Mean patient age and water-equivalent diameter were 12.0 years and 23.0 cm, respectively. The difference between actual and target noise increased from -1.4 to 0.3 HU (P < .01) and the standard deviation decreased from 3.9 to 1.6 HU (P < .01). Mean SSDE decreased from 11.9 to 7.5 mGy, a 37% reduction (P < .01). The process control chart identified several special causes of variation. Implementation of an automated CT radiation dose optimization system led to verifiable simultaneous decrease in image noise variation and SSDE. The automated nature of the system provides the opportunity for consistent CT radiation dose optimization on a broad scale. © RSNA, 2013.
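    The individuals (X) control chart used above to separate common-cause from special-cause variation can be sketched as follows. The limits use the standard moving-range estimate (mean moving range divided by d₂ = 1.128); the noise values are synthetic, not the study's data.

```python
import numpy as np

def individuals_chart(x):
    """Individuals (X) chart: centre line, 3-sigma limits, and special-cause flags."""
    x = np.asarray(x, dtype=float)
    mr = np.abs(np.diff(x))            # moving ranges of consecutive points
    center = x.mean()
    sigma_est = mr.mean() / 1.128      # d2 constant for subgroups of size 2
    ucl = center + 3 * sigma_est
    lcl = center - 3 * sigma_est
    special = (x > ucl) | (x < lcl)    # points outside limits: special-cause variation
    return center, lcl, ucl, special

# Synthetic image-noise values (HU) with one out-of-control examination
noise_hu = [10.0, 10.2, 9.9, 10.1, 10.0, 9.8, 10.1, 10.2,
            9.9, 10.0, 15.0, 10.1, 9.9, 10.0, 10.2]
center, lcl, ucl, special = individuals_chart(noise_hu)
```

    Flagged points would then be investigated for special causes before changing protocol parameters.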

  8. REDUCING WASTEWATER FROM CUCUMBER PICKLING PROCESS BY CONTROLLED CULTURE FERMENTATION

    EPA Science Inventory

    On a demonstration scale, the controlled culture fermentation process (CCF) developed by the U.S. Food Fermentation Laboratory was compared with the conventional natural fermentation process (NF) in regard to product quality and yield and volume and concentration of wastewaters. ...

  9. QUALITY CONTROL OF PHARMACEUTICALS.

    PubMed

    LEVI, L; WALKER, G C; PUGSLEY, L I

    1964-10-10

    Quality control is an essential operation of the pharmaceutical industry. Drugs must be marketed as safe and therapeutically active formulations whose performance is consistent and predictable. New and better medicinal agents are being produced at an accelerated rate. At the same time more exacting and sophisticated analytical methods are being developed for their evaluation. Requirements governing the quality control of pharmaceuticals in accordance with the Canadian Food and Drugs Act are cited and discussed.

  10. The design of control system of livestock feeding processing

    NASA Astrophysics Data System (ADS)

    Sihombing, Juna; Napitupulu, Humala L.; Hidayati, Juliza

    2018-03-01

    PT. XYZ is a company that produces animal feed. One type of feed produced is 105 ISA P. In its production process, PT. XYZ faced the problem of rejected feed from 2014 to June 2015 because the feed exceeded the quality standards of 13% moisture content and 3% ash content. The researchers therefore analyzed the relationships between the factors affecting quality and the extent of the defects by using regression and correlation, and determined the optimum value for each processing stage. The analysis found that the variables affecting product quality are mixing time, steam conditioning temperature and cooling time. The variable most strongly associated with product moisture content is mixing time, with a correlation coefficient of 0.7959, and the variable most strongly associated with the ash content of the product during processing is also mixing time, with a correlation coefficient of 0.8541. The proposed process control design is to run the process with a mixing time of 235 seconds, a steam conditioning temperature of 87 °C and a cooling time of 192 seconds. The quality of 105 ISA P product obtained by using this design is 12.16% moisture content and 2.59% ash content.
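    The regression-and-correlation step described above can be sketched with synthetic data; the linear response and its coefficients below are illustrative assumptions, not the study's fitted values.

```python
import numpy as np

rng = np.random.default_rng(3)
mixing_time = rng.uniform(180, 300, size=50)                     # seconds (synthetic)
moisture = 0.02 * mixing_time + rng.normal(scale=0.4, size=50)   # hypothetical linear response

# Correlation coefficient identifies the dominant variable
r = np.corrcoef(mixing_time, moisture)[0, 1]

# Simple linear regression gives a process model for choosing a set point
slope, intercept = np.polyfit(mixing_time, moisture, 1)
predicted_at_235 = slope * 235 + intercept                       # e.g. the proposed set point
```

    Repeating this per factor (steam temperature, cooling time) and per response (moisture, ash) reproduces the study's analysis pattern.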

  11. Expert database system for quality control

    NASA Astrophysics Data System (ADS)

    Wang, Anne J.; Li, Zhi-Cheng

    1993-09-01

    There are more competitors today. Markets are not homogeneous; they are fragmented into increasingly focused niches requiring greater flexibility in the product mix, shorter manufacturing production runs and, above all, higher quality. In this paper the authors identify a real-time expert system as a way to improve plantwide quality management. The quality control expert database system (QCEDS), by integrating the knowledge of experts in operations, quality management and computer systems, uses all information relevant to quality management (facts as well as rules) to determine if a product meets quality standards. Keywords: expert system, quality control, database

  12. QCloud: A cloud-based quality control system for mass spectrometry-based proteomics laboratories

    PubMed Central

    Chiva, Cristina; Olivella, Roger; Borràs, Eva; Espadas, Guadalupe; Pastor, Olga; Solé, Amanda

    2018-01-01

    The increasing number of biomedical and translational applications in mass spectrometry-based proteomics poses new analytical challenges and raises the need for automated quality control systems. Despite previous efforts to set standard file formats, data processing workflows and key evaluation parameters for quality control, automated quality control systems are not yet widespread among proteomics laboratories, which limits the acquisition of high-quality results, inter-laboratory comparisons and the assessment of variability of instrumental platforms. Here we present QCloud, a cloud-based system to support proteomics laboratories in daily quality assessment using a user-friendly interface, easy setup, automated data processing and archiving, and unbiased instrument evaluation. QCloud supports the most common targeted and untargeted proteomics workflows, it accepts data formats from different vendors and it enables the annotation of acquired data and reporting incidences. A complete version of the QCloud system has successfully been developed and it is now open to the proteomics community (http://qcloud.crg.eu). QCloud system is an open source project, publicly available under a Creative Commons License Attribution-ShareAlike 4.0. PMID:29324744

  13. Groundwater-quality and quality-control data for two monitoring wells near Pavillion, Wyoming, April and May 2012

    USGS Publications Warehouse

    Wright, Peter R.; McMahon, Peter B.; Mueller, David K.; Clark, Melanie L.

    2012-01-01

    In June 2010, the U.S. Environmental Protection Agency installed two deep monitoring wells (MW01 and MW02) near Pavillion, Wyoming, to study groundwater quality. During April and May 2012, the U.S. Geological Survey, in cooperation with the Wyoming Department of Environmental Quality, collected groundwater-quality data and quality-control data from monitoring well MW01 and, following well redevelopment, quality-control data for monitoring well MW02. Two groundwater-quality samples were collected from well MW01: one sample was collected after purging about 1.5 borehole volumes, and a second sample was collected after purging 3 borehole volumes. Both samples were collected and processed using methods designed to minimize atmospheric contamination or changes to water chemistry. Groundwater-quality samples were analyzed for field water-quality properties (water temperature, pH, specific conductance, dissolved oxygen, oxidation potential); inorganic constituents including naturally occurring radioactive compounds (radon, radium-226 and radium-228); organic constituents; dissolved gases; stable isotopes of methane, water, and dissolved inorganic carbon; and environmental tracers (carbon-14, chlorofluorocarbons, sulfur hexafluoride, tritium, helium, neon, argon, krypton, xenon, and the ratio of helium-3 to helium-4). Quality-control sample results associated with well MW01 were evaluated to determine the extent to which environmental sample analytical results were affected by bias and to evaluate the variability inherent to sample collection and laboratory analyses. Field documentation, environmental data, and quality-control data for activities that occurred at the two monitoring wells during April and May 2012 are presented.

  14. Quality Control of Pharmaceuticals

    PubMed Central

    Levi, Leo; Walker, George C.; Pugsley, L. I.

    1964-01-01

    Quality control is an essential operation of the pharmaceutical industry. Drugs must be marketed as safe and therapeutically active formulations whose performance is consistent and predictable. New and better medicinal agents are being produced at an accelerated rate. At the same time more exacting and sophisticated analytical methods are being developed for their evaluation. Requirements governing the quality control of pharmaceuticals in accordance with the Canadian Food and Drugs Act are cited and discussed. PMID:14199105

  15. Adopting Quality Assurance Technology in Customer-Vendor Relationships: A Case Study of How Interorganizational Relationships Influence the Process

    NASA Astrophysics Data System (ADS)

    Heeager, Lise Tordrup; Tjørnehøj, Gitte

    Quality assurance technology is a formal control mechanism aiming at increasing the quality of the product exchanged between vendors and customers. Studies of the adoption of this technology in the field of system development rarely focus on the role of the relationship between the customer and vendor in the process. We have studied how the process of adopting quality assurance technology by a small Danish IT vendor developing pharmacy software for a customer in the public sector was influenced by the relationship with the customer. The case study showed that the adoption process was shaped to a high degree by the relationship and vice versa. The prior high level of trust and mutual knowledge helped the parties negotiate mutually feasible solutions throughout the adoption process. We thus advise enhancing trust-building processes to strengthen the relationships and to balance formal control and social control to increase the likelihood of a successful outcome of the adoption of quality assurance technology in a customer-vendor relationship.

  16. Austrian Daily Climate Data Rescue and Quality Control

    NASA Astrophysics Data System (ADS)

    Jurkovic, A.; Lipa, W.; Adler, S.; Albenberger, J.; Lechner, W.; Swietli, R.; Vossberg, I.; Zehetner, S.

    2010-09-01

    Checked climate datasets are a "conditio sine qua non" for all projects that are relevant for environment and climate. In the framework of climate change studies and analyses it is essential to work with quality-controlled, trustworthy data. Furthermore, these datasets are used as input for various simulation models. For investigations of extreme events, like strong precipitation periods, drought periods and similar ones, we need climate data in high temporal resolution (at least daily). Because of the historical background - during the Second World War the majority of our climate sheets were sent to Berlin, where the historical sheets were destroyed by a bomb attack and important information was lost - only a few climate sheets, mostly duplicates, from before 1939 are available and stored in our climate data archive. In 1970 the Central Institute for Meteorology and Geodynamics in Vienna made a first attempt to digitize climate data by means of punch cards. With the introduction of a routine climate data quality control in 1984 we can speak of thoroughly checked daily data (finally checked data, quality flag 6). Our group has been working on the digitization and quality control of the historical data for the period 1872 to 1983 for 18 years. Since 2007 it has been possible to intensify this work in the framework of an internal project, Austrian Climate Data Rescue and Quality Control. The aim of this initiative was, and still is, to supply daily data in an outstandingly good and uniform quality, so this project is a kind of pre-project for all scientific projects working with daily data. In addition to the routine quality checks (running since 1984) using the commercial Bull software, we are testing our data with additional open-source software, namely ProClim.db. By the use of this spatial and statistical test procedure, the elements air temperature and precipitation, for several sites in Carinthia, could…

  17. Process perspective on image quality evaluation

    NASA Astrophysics Data System (ADS)

    Leisti, Tuomas; Halonen, Raisa; Kokkonen, Anna; Weckman, Hanna; Mettänen, Marja; Lensu, Lasse; Ritala, Risto; Oittinen, Pirkko; Nyman, Göte

    2008-01-01

    The psychological complexity of multivariate image quality evaluation makes it difficult to develop general image quality metrics. Quality evaluation includes several mental processes, and ignoring these processes and using only a few test images can lead to biased results. Using a qualitative/quantitative (Interpretation Based Quality, IBQ) methodology, we examined the process of pair-wise comparison in a setting where the quality of images printed by a laser printer on different paper grades was evaluated. The test image consisted of a picture of a table covered with several objects. Three other images were also used: photographs of a woman, a cityscape and a countryside. In addition to the pair-wise comparisons, observers (N=10) were interviewed about the subjective quality attributes they used in making their quality decisions. An examination of the individual pair-wise comparisons revealed serious inconsistencies in observers' evaluations of the test image content, but not of the other contents. The qualitative analysis showed that this inconsistency was due to the observers' focus of attention. The lack of an easily recognizable context in the test image may have contributed to this inconsistency. To obtain reliable knowledge of the effect of image context or attention on subjective image quality, a qualitative methodology is needed.

  18. Using clinical indicators to facilitate quality improvement via the accreditation process: an adaptive study into the control relationship.

    PubMed

    Chuang, Sheuwen; Howley, Peter P; Hancock, Stephen

    2013-07-01

    The aim of the study was to determine accreditation surveyors' and hospitals' use and perceived usefulness of clinical indicator reports and the potential to establish the control relationship between the accreditation and reporting systems. The control relationship refers to instructional directives, arising from appropriately designed methods and efforts towards using clinical indicators, which provide a directed moderating, balancing and best outcome for the connected systems. Web-based questionnaire survey. Australian Council on Healthcare Standards' (ACHS) accreditation and clinical indicator programmes. Seventy-three of 306 surveyors responded. Half used the reports always or most of the time. Five key messages were revealed: (i) report use was related to availability before the on-site investigation; (ii) report use was associated with the use of non-ACHS reports; (iii) a clinical indicator set's perceived usefulness was associated with its reporting volume across hospitals; (iv) simpler measures and visual summaries in reports were rated the most useful; (v) reports were deemed suitable for the quality and safety objectives of the key groups of interested parties (hospitals' senior executive and management officers, clinicians, quality managers and surveyors). Establishing the control relationship between the reporting and accreditation systems is a realistic prospect. Redesigning processes to ensure reports are available in pre-survey packages, together with refined education of surveyors and hospitals on how to better utilize the reports, will support the relationship. Additional studies on the systems-theory-based model of the accreditation and reporting system are warranted to establish the control relationship, building integrated system-wide relationships with sustainable and improved outcomes.

  19. Quality control of mRNP biogenesis: networking at the transcription site.

    PubMed

    Eberle, Andrea B; Visa, Neus

    2014-08-01

    Eukaryotic cells carry out quality control (QC) over the processes of RNA biogenesis to inactivate or eliminate defective transcripts, and to avoid their production. In the case of protein-coding transcripts, the quality controls can sense defects in the assembly of mRNA-protein complexes, in the processing of the precursor mRNAs, and in the sequence of open reading frames. Different types of defect are monitored by different specialized mechanisms. Some of them involve dedicated factors whose function is to identify faulty molecules and target them for degradation. Others are the result of a more subtle balance in the kinetics of opposing activities in the mRNA biogenesis pathway. One way or another, all such mechanisms hinder the expression of the defective mRNAs through processes as diverse as rapid degradation, nuclear retention and transcriptional silencing. Three major degradation systems are responsible for the destruction of the defective transcripts: the exosome, the 5'-3' exoribonucleases, and the nonsense-mediated mRNA decay (NMD) machinery. This review summarizes recent findings on the cotranscriptional quality control of mRNA biogenesis, and speculates that a protein-protein interaction network integrates multiple mRNA degradation systems with the transcription machinery. Copyright © 2014 Elsevier Ltd. All rights reserved.

  20. Application of Statistical Quality Control Techniques to Detonator Fabrication: Feasibility Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jones, J. Frank

    1971-05-20

    A feasibility study was performed on the use of process control techniques which might reduce the need for duplicate inspection by production inspection and quality control inspection. Two active detonator fabrication programs were selected for the study. Inspection areas accounting for the greatest percentage of total inspection costs were selected by applying "Pareto's Principle of Maldistribution." Data from these areas were then gathered and analyzed by a process capability study.
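    The Pareto selection step described above can be sketched as follows; the inspection areas and cost figures are hypothetical, chosen only to illustrate picking the "vital few" areas that dominate total inspection cost.

```python
def pareto_select(costs, share=0.8):
    """Return the areas that together account for at least `share` of total cost."""
    total = sum(costs.values())
    selected, cum = [], 0.0
    for area, cost in sorted(costs.items(), key=lambda kv: kv[1], reverse=True):
        selected.append(area)
        cum += cost
        if cum / total >= share:
            break
    return selected

# Hypothetical cost breakdown by inspection area
inspection_costs = {
    "bridgewire weld": 4200, "header seal": 2600, "powder fill": 1900,
    "continuity test": 600, "marking": 300, "packaging": 150,
}
vital_few = pareto_select(inspection_costs)
```

    Only the selected areas would then receive the detailed process capability study.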

  1. 30 CFR 28.31 - Quality control plans; contents.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Quality control plans; contents. 28.31 Section... PROTECTION FOR TRAILING CABLES IN COAL MINES Quality Control § 28.31 Quality control plans; contents. (a) Each quality control plan shall contain provisions for the management of quality, including: (1...

  2. Towards an integrated quality control procedure for eddy-covariance data

    NASA Astrophysics Data System (ADS)

    Vitale, Domenico; Papale, Dario

    2017-04-01

    The eddy-covariance technique is nowadays the most reliable and direct way to calculate the main fluxes of sensible and latent heat and of net ecosystem exchange, the last being the difference between the CO2 assimilated by photosynthetic activity and that released to the atmosphere through ecosystem respiration. Despite improvements in the accuracy of measurement instruments and in software, the eddy-covariance technique is not suitable under conditions that are non-ideal with respect to the instrument characteristics and the physical assumptions behind the technique, mainly the requirement of well-developed, stationary turbulence. Under such conditions the calculated fluxes are unreliable and need to be flagged and discarded. In order to detect these unavoidable "bad" fluxes and build datasets of the highest quality, several tests applied both to high-frequency (10-20 Hz) raw data and to half-hourly time series have been developed in past years. Nevertheless, there is an increasing need for a standardized quality control procedure suitable not only for the analysis of long-term data but also for near-real-time data processing. In this paper, we review established quality assessment procedures and present an innovative quality control strategy that integrates the existing consolidated procedures with robust, advanced statistical tests more suitable for the analysis of time series data. The performance of the proposed quality control strategy is evaluated on both simulated and EC data distributed by the ICOS research infrastructure. It is concluded that the proposed strategy is able to flag and exclude unrealistic fluxes while being reproducible and retaining the largest possible amount of high-quality data.
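    One widely used raw-data test of the kind mentioned above is the stationarity check in the style of Foken and Wichura: the covariance over the whole averaging interval is compared with the mean covariance of its subintervals. The sketch below uses synthetic 10 Hz data and is not the procedure proposed in the paper.

```python
import numpy as np

def stationarity_ok(w, c, n_sub=6, threshold=0.30):
    """Flag a flux interval as stationary if the relative deviation between
    whole-interval covariance and mean subinterval covariance is small."""
    cov_whole = np.cov(w, c, bias=True)[0, 1]
    pairs = np.array_split(np.column_stack([w, c]), n_sub)
    cov_sub = np.mean([np.cov(p[:, 0], p[:, 1], bias=True)[0, 1] for p in pairs])
    rel_dev = abs(cov_sub - cov_whole) / abs(cov_whole)
    return rel_dev <= threshold, rel_dev

rng = np.random.default_rng(4)
n = 18000                                      # 30 min of 10 Hz data
w = rng.normal(size=n)                         # vertical wind fluctuations (synthetic)
c_stat = 0.6 * w + 0.8 * rng.normal(size=n)    # stationary scalar signal

trend = np.linspace(0.0, 2.0, n)               # common drift -> non-stationary interval
w_trend = rng.normal(size=n) + trend
c_trend = rng.normal(size=n) + trend

ok_stat, dev_stat = stationarity_ok(w, c_stat)
ok_ns, dev_ns = stationarity_ok(w_trend, c_trend)
```

    The drifting pair fails because the whole-interval covariance is inflated by the shared trend while each subinterval, detrended by its own mean, is not.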

  3. [Highly quality-controlled radiation therapy].

    PubMed

    Shirato, Hiroki

    2005-04-01

    Advanced radiation therapy for intracranial disease has focused on set-up accuracy for the past 15 years. However, quality control in the prescribed dose is actually as important as the tumor set-up in radiation therapy. Because of the complexity of the three-dimensional radiation treatment planning system in recent years, the highly quality-controlled prescription of the dose has now been reappraised as the mainstream to improve the treatment outcome of radiation therapy for intracranial disease. The Japanese Committee for Quality Control of Radiation Therapy has developed fundamental requirements such as a QC committee in each hospital, a medical physicist, dosimetrists (QC members), and an external audit.

  4. Air Quality Management Process Cycle

    EPA Pesticide Factsheets

    Air quality management comprises the activities a regulatory authority undertakes to protect human health and the environment from the harmful effects of air pollution. The process of managing air quality can be illustrated as a cycle of inter-related elements.

  5. ECG compression using non-recursive wavelet transform with quality control

    NASA Astrophysics Data System (ADS)

    Liu, Je-Hung; Hung, King-Chu; Wu, Tsung-Ching

    2016-09-01

    While wavelet-based electrocardiogram (ECG) data compression using scalar quantisation (SQ) yields excellent compression performance, an SQ scheme must select a set of multilevel quantisers for each quantisation process. Because of the many-to-one nature of this mapping, the scheme is not conducive to reconstruction error control. To address this problem, this paper presents a single-variable-control SQ scheme able to guarantee the reconstruction quality of wavelet-based ECG data compression. Based on the reversible round-off non-recursive discrete periodised wavelet transform (RRO-NRDPWT), the SQ scheme is derived with a three-stage design process: a genetic algorithm (GA) first maximises the compression ratio (CR), a quadratic curve fitting then provides linear distortion control, and a fuzzy decision-making stage finally minimises the data-dependency effect and selects the optimal SQ. Two databases, the Physikalisch-Technische Bundesanstalt (PTB) and the Massachusetts Institute of Technology (MIT) arrhythmia databases, are used to evaluate quality control performance. Experimental results show that the design method guarantees a high-compression-performance SQ scheme with statistically linear distortion. This property can be independent of training data and can facilitate rapid error control.
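    The RRO-NRDPWT and the GA/fuzzy stages are specific to the paper; the sketch below only illustrates the core idea of controlling reconstruction quality through a single quantiser variable, using a one-level Haar transform, the percentage RMS difference (PRD) distortion measure, and bisection on the quantiser step.

```python
import numpy as np

def haar_fwd(x):
    """One-level orthonormal Haar DWT: approximation and detail coefficients."""
    a = (x[0::2] + x[1::2]) / np.sqrt(2.0)
    d = (x[0::2] - x[1::2]) / np.sqrt(2.0)
    return a, d

def haar_inv(a, d):
    x = np.empty(a.size * 2)
    x[0::2] = (a + d) / np.sqrt(2.0)
    x[1::2] = (a - d) / np.sqrt(2.0)
    return x

def prd(x, xr):
    """Percentage RMS difference, the usual ECG distortion measure."""
    return 100.0 * np.linalg.norm(x - xr) / np.linalg.norm(x - x.mean())

def quantise_to_quality(x, prd_target=2.0, iters=50):
    """Bisect on a single quantiser step q so the reconstruction meets prd_target."""
    a, d = haar_fwd(x)
    lo, hi = 1e-9, float(np.abs(np.concatenate([a, d])).max())
    for _ in range(iters):
        q = 0.5 * (lo + hi)
        xr = haar_inv(np.round(a / q) * q, np.round(d / q) * q)
        if prd(x, xr) <= prd_target:
            lo = q            # quality met: try a coarser, more compressive step
        else:
            hi = q
    xr = haar_inv(np.round(a / lo) * lo, np.round(d / lo) * lo)
    return lo, xr

t = np.linspace(0, 1, 1024)
ecg_like = np.sin(2 * np.pi * 5 * t) + 0.3 * np.sin(2 * np.pi * 25 * t)  # synthetic signal
q, rec = quantise_to_quality(ecg_like, prd_target=2.0)
```

    Because distortion is controlled through one variable, the guaranteed-quality step can be found by a simple monotone search rather than per-level quantiser selection.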

  6. Adopting software quality measures for healthcare processes.

    PubMed

    Yildiz, Ozkan; Demirörs, Onur

    2009-01-01

    In this study, we investigated the adoptability of software quality measures for healthcare process measurement. Quality measures of ISO/IEC 9126 are redefined from a process perspective to build a generic healthcare process quality measurement model. The case study research method is used, and the model is applied to a public hospital's Entry to Care process. After the application, weak and strong aspects of the process can easily be observed. Access auditability, fault removal, completeness of documentation, and machine utilization are weak aspects, and these aspects are candidates for process improvement. On the other hand, functional completeness, fault ratio, input validity checking, response time, and throughput time are the strong aspects of the process.

  7. 30 CFR 74.6 - Quality control.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 30 Mineral Resources 1 2011-07-01 2011-07-01 false Quality control. 74.6 Section 74.6 Mineral... of the CMDPSU will be maintained in production through adequate quality control procedures, MSHA and... DUST SAMPLING DEVICES Approval Requirements for Coal Mine Dust Personal Sampler Unit § 74.6 Quality...

  8. Internal quality control: best practice.

    PubMed

    Kinns, Helen; Pitkin, Sarah; Housley, David; Freedman, Danielle B

    2013-12-01

    There is a wide variation in laboratory practice with regard to implementation and review of internal quality control (IQC). A poor approach can lead to a spectrum of scenarios, from validation of incorrect patient results to over-investigation of falsely rejected analytical runs. This article will provide a practical approach for the routine clinical biochemistry laboratory to introduce an efficient quality control system that will optimise error detection and reduce the rate of false rejection. Each stage of the IQC system is considered, from selection of IQC material to selection of IQC rules, and finally the appropriate action to follow when a rejection signal has been obtained. The main objective of IQC is to ensure day-to-day consistency of an analytical process and thus help to determine whether patient results are reliable enough to be released. The required quality and assay performance vary between analytes, as does the definition of a clinically significant error. Unfortunately, many laboratories currently decide what is clinically significant at the troubleshooting stage. Assay-specific IQC systems will reduce the number of inappropriate sample-run rejections compared with the blanket use of one IQC rule. In practice, only three or four different IQC rules are required for the whole of the routine biochemistry repertoire, as assays are assigned into groups based on performance. The tools to categorise performance and assign IQC rules based on that performance are presented. Although significant investment of time and education is required prior to implementation, laboratories have shown that such systems achieve considerable reductions in cost and labour.
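    The rule selection described above builds on standard multi-rule ("Westgard") logic. As a hedged sketch, not the article's specific tooling, the following applies two common rules, 1-3s and 2-2s, to a run of control results; the target mean, SD, and results are invented for illustration.

```python
# Hedged sketch of two common multi-rule ("Westgard") IQC rules; the target
# mean, SD, and control results below are invented for illustration.

def westgard_flags(values, mean, sd):
    """Return (index, rule) pairs where a run of IQC results violates
    the 1-3s or 2-2s rule."""
    z = [(v - mean) / sd for v in values]
    flags = []
    for i, score in enumerate(z):
        if abs(score) > 3:                    # 1-3s: one result beyond 3 SD
            flags.append((i, "1-3s"))
        if i > 0 and abs(score) > 2 and abs(z[i - 1]) > 2 and score * z[i - 1] > 0:
            flags.append((i, "2-2s"))         # 2-2s: two consecutive beyond 2 SD, same side
    return flags

# Control results against a target mean of 100 and SD of 2:
print(westgard_flags([100.5, 99.8, 106.5, 104.2, 104.5], mean=100.0, sd=2.0))
# → [(2, '1-3s'), (3, '2-2s'), (4, '2-2s')]
```

    Choosing which of these rules to enable per assay group, rather than applying one blanket rule, is the assay-specific approach the article advocates.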

  9. [Role of medical information processing for quality assurance in obstetrics].

    PubMed

    Selbmann, H K

    1983-06-01

    The paradigm of problem-oriented assurance of the professional quality of medical care is a kind of "control loop" consisting of the following 5 steps: routine observation, identification of the problem, analysis of the problem, translation of problem solutions into daily practice, and control as to whether the problem has been solved or eliminated. Medical data processing, which involves documentation, electronic data processing, and statistics, can make substantial contributions, especially to the steps of observation, identification of the problem, and follow-up control. Perinatal data collection, which has already been introduced in 6 Länder of the Federal Republic of Germany, has supplied ample proof of this. These operations were conducted under the heading "internal clinical quality assurance with external aid". The clinics that participated in this programme were given the necessary aid in self-observation (questionnaires, clinical statistics), and they were also given comparative data to help them identify problems (clinical profiles, etc.). It is entirely left to the responsibility of the clinics themselves -- voluntary cooperation and guaranteed anonymity being a matter of course -- to draw their own conclusions from the collected data and to translate these into clinical everyday practice.

  10. Quality transitivity and traceability system of herbal medicine products based on quality markers.

    PubMed

    Liu, Changxiao; Guo, De-An; Liu, Liang

    2018-05-15

    Because a variety of factors affect herb quality, the existing quality management model is unable to evaluate process control. The development of the concept of the "quality marker" (Q-marker) lays the basis for establishing an independent process quality control system for herbal products. To ensure the highest degree of safety, effectiveness, and quality process control of herbal products, this work aims to establish a quality transitivity and traceability system of quality and process control from raw materials to finished herbal products. Based on the key issues and challenges of quality assessment, the current status of quality and process controls from raw materials to the herbal medicinal products listed in the Pharmacopoeia was analyzed, and research models covering the discovery and identification of Q-markers and the analysis and quality management of risk evaluation were designed. The authors introduce several new technologies and methodologies, such as DNA barcoding, chromatographic technologies, fingerprint analysis, chemical markers, bio-responses, and risk management, as solutions for quality process control. The quality and process control models for herbal medicinal products were proposed, and the transitivity and traceability system from raw materials to the finished products was constructed to improve herbal quality across the entire supply and production chain. The transitivity and traceability system has been established based on quality markers, especially on how to control the production process under Good Engineering Practices, as well as how to implement risk management for quality and process control in herbal medicine production. Copyright © 2018 Elsevier GmbH. All rights reserved.

  11. Process control charts in infection prevention: Make it simple to make it happen.

    PubMed

    Wiemken, Timothy L; Furmanek, Stephen P; Carrico, Ruth M; Mattingly, William A; Persaud, Annuradha K; Guinn, Brian E; Kelley, Robert R; Ramirez, Julio A

    2017-03-01

    Quality improvement is central to Infection Prevention and Control (IPC) programs. Challenges may occur when applying quality improvement methodologies like process control charts, often because typical infection preventionists have had limited exposure to them. Because of this, our team created an open-source database with a process control chart generator for IPC programs. The objectives of this report are to outline the development of the application and to demonstrate its use with simulated data. We used Research Electronic Data Capture (REDCap Consortium, Vanderbilt University, Nashville, TN), R (R Foundation for Statistical Computing, Vienna, Austria), and R Studio Shiny (R Foundation for Statistical Computing) to create an open-source data collection system with automated process control chart generation. We used simulated data to test and visualize both in-control and out-of-control processes for commonly used metrics in IPC programs. The R code for implementing the control charts and the Shiny application can be found on our Web site (https://github.com/ul-research-support/spcapp). Screen captures of the workflow and simulated data indicating both common cause and special cause variation are provided. Process control charts can be easily developed based on individual facility needs using freely available software. By providing our work free to all interested parties, we hope that others will be able to harness the power and ease of use of the application for improving the quality of care and patient safety in their facilities. Copyright © 2017 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Elsevier Inc. All rights reserved.
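    As an illustration of the kind of chart such a tool generates (a sketch, not the authors' REDCap/Shiny code, which is in R), the following computes the center line and 3-sigma limits for a u-chart, a control chart commonly used for rates per unit of exposure such as infections per 1000 device-days; the counts and exposures are invented.

```python
import math

# Hedged sketch of u-chart limits (events per unit of exposure), as often
# used in infection prevention; illustrative Python, not the authors' app.

def u_chart_limits(counts, exposures):
    """Return the center line and per-point 3-sigma control limits."""
    ubar = sum(counts) / sum(exposures)       # center line (pooled rate)
    limits = []
    for n in exposures:
        sigma = math.sqrt(ubar / n)           # standard error varies with n
        limits.append((max(0.0, ubar - 3 * sigma), ubar + 3 * sigma))
    return ubar, limits

# Invented monthly infection counts and device-day denominators:
ubar, limits = u_chart_limits([2, 3, 1, 4], [100, 120, 90, 110])
```

    A point whose observed rate (count divided by its exposure) falls outside its limits signals special cause variation, the situation an IPC program investigates.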

  12. Distributed sensor architecture for intelligent control that supports quality of control and quality of service.

    PubMed

    Poza-Lujan, Jose-Luis; Posadas-Yagüe, Juan-Luis; Simó-Ten, José-Enrique; Simarro, Raúl; Benet, Ginés

    2015-02-25

    This paper is part of a study of intelligent architectures for distributed control and communications systems. The study focuses on optimizing control systems by evaluating the performance of middleware through quality of service (QoS) parameters and the optimization of control using Quality of Control (QoC) parameters. The main aim of this work is to study, design, develop, and evaluate a distributed control architecture based on the Data-Distribution Service for Real-Time Systems (DDS) communication standard as proposed by the Object Management Group (OMG). As a result of the study, an architecture called Frame-Sensor-Adapter to Control (FSACtrl) has been developed. FSACtrl provides a model to implement an intelligent distributed Event-Based Control (EBC) system with support to measure QoS and QoC parameters. The novelty consists of using, simultaneously, the measured QoS and QoC parameters to make decisions about the control action with a new method called the Event Based Quality Integral Cycle. To validate the architecture, the first five Braitenberg vehicles have been implemented using the FSACtrl architecture. The experimental outcomes demonstrate the convenience of using QoS and QoC parameters jointly in distributed control systems.

  13. Distributed Sensor Architecture for Intelligent Control that Supports Quality of Control and Quality of Service

    PubMed Central

    Poza-Lujan, Jose-Luis; Posadas-Yagüe, Juan-Luis; Simó-Ten, José-Enrique; Simarro, Raúl; Benet, Ginés

    2015-01-01

    This paper is part of a study of intelligent architectures for distributed control and communications systems. The study focuses on optimizing control systems by evaluating the performance of middleware through quality of service (QoS) parameters and the optimization of control using Quality of Control (QoC) parameters. The main aim of this work is to study, design, develop, and evaluate a distributed control architecture based on the Data-Distribution Service for Real-Time Systems (DDS) communication standard as proposed by the Object Management Group (OMG). As a result of the study, an architecture called Frame-Sensor-Adapter to Control (FSACtrl) has been developed. FSACtrl provides a model to implement an intelligent distributed Event-Based Control (EBC) system with support to measure QoS and QoC parameters. The novelty consists of using, simultaneously, the measured QoS and QoC parameters to make decisions about the control action with a new method called the Event Based Quality Integral Cycle. To validate the architecture, the first five Braitenberg vehicles have been implemented using the FSACtrl architecture. The experimental outcomes demonstrate the convenience of using QoS and QoC parameters jointly in distributed control systems. PMID:25723145

  14. Multicapillary SDS-gel electrophoresis for the analysis of fluorescently labeled mAb preparations: a high throughput quality control process for the production of QuantiPlasma and PlasmaScan mAb libraries.

    PubMed

    Székely, Andrea; Szekrényes, Akos; Kerékgyártó, Márta; Balogh, Attila; Kádas, János; Lázár, József; Guttman, András; Kurucz, István; Takács, László

    2014-08-01

    Molecular heterogeneity of mAb preparations is the result of various co- and post-translational modifications and of contaminants related to the production process. Changes in molecular composition result in alterations of functional performance; therefore, quality control and validation of therapeutic or diagnostic protein products is essential. A special case is the consistent production of mAb libraries (QuantiPlasma™ and PlasmaScan™) for proteome profiling, quality control of which represents a challenge because of the high number of mAbs (>1000). Here, we devise a generally applicable multicapillary SDS-gel electrophoresis process for the analysis of fluorescently labeled mAb preparations for the high-throughput quality control of mAbs of the QuantiPlasma™ and PlasmaScan™ libraries. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  15. Quality and Control of Water Vapor Winds

    NASA Technical Reports Server (NTRS)

    Jedlovec, Gary J.; Atkinson, Robert J.

    1996-01-01

    Water vapor imagery from geostationary satellites such as GOES, Meteosat, and GMS provides synoptic views of dynamical events on a continual basis. Because the imagery represents a non-linear combination of mid- and upper-tropospheric thermodynamic parameters (three-dimensional variations in temperature and humidity), video loops of these image products provide enlightening views of regional flow fields, the movement of tropical and extratropical storm systems, the transfer of moisture between hemispheres and from the tropics to the mid-latitudes, and the dominance of high pressure systems over particular regions of the Earth. Despite the obvious larger scale features, the water vapor imagery contains significant image variability down to the single 8 km GOES pixel. These features can be quantitatively identified and tracked from one time to the next using various image processing techniques. Merrill et al. (1991), Hayden and Schmidt (1992), and Laurent (1993) have documented the operational procedures and capabilities of NOAA and ESOC to produce cloud and water vapor winds. These techniques employ standard correlation and template matching approaches to wind tracking and use qualitative and quantitative procedures to eliminate bad wind vectors from the wind data set. Techniques have also been developed to improve the quality of the operational winds through robust editing procedures (Hayden and Veldon 1991). These quality and control approaches have limitations, are often subjective, and constrain wind variability to be consistent with model-derived wind fields. This paper describes research focused on the refinement of objective quality and control parameters for water vapor wind vector data sets. New quality and control measures are developed and employed to provide a more robust wind data set for climate analysis, data assimilation studies, as well as operational weather forecasting. The parameters are applicable to cloud-tracked winds as well with minor
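    The correlation and template-matching approach mentioned above can be sketched in one dimension. This is a generic normalized cross-correlation tracker with invented data, not NOAA's or ESOC's operational code; it assumes no window of the signal is constant (a constant window would zero the denominator).

```python
import math

# Hedged 1-D sketch of correlation-based template matching, the idea behind
# the wind-tracking techniques cited above; data are invented.

def ncc(a, b):
    """Normalized cross-correlation of two equal-length sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = (math.sqrt(sum((x - ma) ** 2 for x in a))
           * math.sqrt(sum((y - mb) ** 2 for y in b)))
    return num / den

def best_shift(template, signal):
    """Return the offset where the template correlates best with the signal,
    the 1-D analogue of tracking a feature between successive images."""
    scores = [ncc(template, signal[i:i + len(template)])
              for i in range(len(signal) - len(template) + 1)]
    return max(range(len(scores)), key=scores.__getitem__)
```

    In the operational 2-D case the displacement of the best-matching window between successive images, divided by the time interval, yields the wind vector; the quality-control measures discussed in the paper then decide whether that vector is kept.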

  16. Automated quality control in a file-based broadcasting workflow

    NASA Astrophysics Data System (ADS)

    Zhang, Lina

    2014-04-01

    Benefiting from the development of information and internet technologies, television broadcasting is transforming from inefficient tape-based production and distribution to integrated file-based workflows. However, no matter how many changes have taken place, successful broadcasting still depends on the ability to deliver a consistent, high-quality signal to the audience. After the transition from tape to file, traditional methods of manual quality control (QC) become inadequate, subjective, and inefficient. Based on China Central Television's fully file-based workflow at its new site, this paper introduces an automated quality control test system for accurate detection of hidden defects in media content. It discusses the system framework and workflow control when automated QC is added. It puts forward a QC criterion and presents QC software that follows this criterion. It also reports experiments on QC speed using parallel processing and distributed computing. The performance of the test system shows that the adoption of automated QC can make production effective and efficient, and help the station achieve a competitive advantage in the media market.

  17. 23 CFR 1340.8 - Quality control.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... OBSERVATIONAL SURVEYS OF SEAT BELT USE Survey Design Requirements § 1340.8 Quality control. (a) Quality control... control monitors involved in seat belt use surveys shall have received training in data collection...) Statistical review. Survey results shall be reviewed and approved by a survey statistician, i.e., a person...

  18. 23 CFR 1340.8 - Quality control.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... OBSERVATIONAL SURVEYS OF SEAT BELT USE Survey Design Requirements § 1340.8 Quality control. (a) Quality control... control monitors involved in seat belt use surveys shall have received training in data collection...) Statistical review. Survey results shall be reviewed and approved by a survey statistician, i.e., a person...

  19. 23 CFR 1340.8 - Quality control.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... OBSERVATIONAL SURVEYS OF SEAT BELT USE Survey Design Requirements § 1340.8 Quality control. (a) Quality control... control monitors involved in seat belt use surveys shall have received training in data collection...) Statistical review. Survey results shall be reviewed and approved by a survey statistician, i.e., a person...

  20. [Application of traditional Chinese medicine reference standards in quality control of Chinese herbal pieces].

    PubMed

    Lu, Tu-Lin; Li, Jin-Ci; Yu, Jiang-Yong; Cai, Bao-Chang; Mao, Chun-Qin; Yin, Fang-Zhou

    2014-01-01

    Traditional Chinese medicine (TCM) reference standards plays an important role in the quality control of Chinese herbal pieces. This paper overviewed the development of TCM reference standards. By analyzing the 2010 edition of Chinese pharmacopoeia, the application of TCM reference standards in the quality control of Chinese herbal pieces was summarized, and the problems exiting in the system were put forward. In the process of improving the quality control level of Chinese herbal pieces, various kinds of advanced methods and technology should be used to research the characteristic reference standards of Chinese herbal pieces, more and more reasonable reference standards should be introduced in the quality control system of Chinese herbal pieces. This article discussed the solutions in the aspect of TCM reference standards, and future development of quality control on Chinese herbal pieces is prospected.

  1. Statistical quality control charts for liver transplant process indicators: evaluation of a single-center experience.

    PubMed

    Varona, M A; Soriano, A; Aguirre-Jaime, A; Barrera, M A; Medina, M L; Bañon, N; Mendez, S; Lopez, E; Portero, J; Dominguez, D; Gonzalez, A

    2012-01-01

    Liver transplantation, the best option for many end-stage liver diseases, is indicated for more candidates than donor availability can meet. In this situation, this demanding treatment must achieve excellence, accessibility, and patient satisfaction to be ethical, scientific, and efficient. The current consensus on quality measurement promoted by the Sociedad Española de Trasplante Hepático (SETH) seeks to define criteria, indicators, and standards for liver transplantation in Spain. Following this recommendation, the Canary Islands liver program has studied its experience. We separated the 411 cadaveric transplants performed in the last 15 years into 2 groups: the first 100 and the remaining 311. The 8 criteria of SETH 2010 were correctly fulfilled. For most indicators, the outcomes were favorable, with actuarial survival at 1, 3, 5, and 10 years of 84%, 79%, 76%, and 65%, respectively, and excellent results in retransplant rates (early 0.56% and long-term 5.9%), primary nonfunction rate (0.43%), waiting list mortality (13.34%), and patient satisfaction (91.5%). On the other hand, some mortality indicators were worse, such as perioperative, postoperative, and early mortality with normal graft function, and the reoperation rate. After analyzing the series with statistical quality control charts, we observed an improvement in all indicators, even in the apparently worst one, early mortality with normal graft function, in a stable program. Such results helped us to discover specific areas in which to improve the program. The application of quality measurement, as the SETH consensus recommends, has shown in our study that, despite being a time-consuming process, it is a useful tool. Copyright © 2012 Elsevier Inc. All rights reserved.

  2. 42 CFR 84.41 - Quality control plans; contents.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 42 Public Health 1 2014-10-01 2014-10-01 false Quality control plans; contents. 84.41 Section 84... AND HEALTH RESEARCH AND RELATED ACTIVITIES APPROVAL OF RESPIRATORY PROTECTIVE DEVICES Quality Control § 84.41 Quality control plans; contents. (a) Each quality control plan shall contain provisions for the...

  3. 42 CFR 84.41 - Quality control plans; contents.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 42 Public Health 1 2012-10-01 2012-10-01 false Quality control plans; contents. 84.41 Section 84... AND HEALTH RESEARCH AND RELATED ACTIVITIES APPROVAL OF RESPIRATORY PROTECTIVE DEVICES Quality Control § 84.41 Quality control plans; contents. (a) Each quality control plan shall contain provisions for the...

  4. 42 CFR 84.41 - Quality control plans; contents.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 42 Public Health 1 2013-10-01 2013-10-01 false Quality control plans; contents. 84.41 Section 84... AND HEALTH RESEARCH AND RELATED ACTIVITIES APPROVAL OF RESPIRATORY PROTECTIVE DEVICES Quality Control § 84.41 Quality control plans; contents. (a) Each quality control plan shall contain provisions for the...

  5. 42 CFR 84.41 - Quality control plans; contents.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 1 2010-10-01 2010-10-01 false Quality control plans; contents. 84.41 Section 84... AND HEALTH RESEARCH AND RELATED ACTIVITIES APPROVAL OF RESPIRATORY PROTECTIVE DEVICES Quality Control § 84.41 Quality control plans; contents. (a) Each quality control plan shall contain provisions for the...

  6. 42 CFR 84.41 - Quality control plans; contents.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 42 Public Health 1 2011-10-01 2011-10-01 false Quality control plans; contents. 84.41 Section 84... AND HEALTH RESEARCH AND RELATED ACTIVITIES APPROVAL OF RESPIRATORY PROTECTIVE DEVICES Quality Control § 84.41 Quality control plans; contents. (a) Each quality control plan shall contain provisions for the...

  7. Flight-Test Validation and Flying Qualities Evaluation of a Rotorcraft UAV Flight Control System

    NASA Technical Reports Server (NTRS)

    Mettler, Bernard; Tuschler, Mark B.; Kanade, Takeo

    2000-01-01

    This paper presents a process of design, flight-test validation, and flying qualities evaluation of a flight control system for a rotorcraft-based unmanned aerial vehicle (RUAV). The keystone of this process is an accurate flight-dynamic model of the aircraft, derived by using system identification modeling. The model captures the most relevant dynamic features of our unmanned rotorcraft, and explicitly accounts for the presence of a stabilizer bar. Using the identified model, we were able to determine the performance margins of our original control system and identify limiting factors. The performance limitations were addressed and the attitude control system was optimized for three different performance levels: slow, medium, and fast. The optimized control laws will be implemented in our RUAV. We will first determine the validity of our control design approach by flight-test validating our optimized controllers. Subsequently, we will fly a series of maneuvers with the three optimized controllers to determine the level of flying qualities that can be attained. The outcomes enable us to draw important conclusions on the flying qualities requirements for small-scale RUAVs.

  8. Rapid evaluation and quality control of next generation sequencing data with FaQCs

    DOE PAGES

    Lo, Chien -Chi; Chain, Patrick S. G.

    2014-12-01

    Background: Next generation sequencing (NGS) technologies that parallelize the sequencing process and produce thousands to millions, or even hundreds of millions, of sequences in a single sequencing run have revolutionized genomic and genetic research. Because of the vagaries of any platform's sequencing chemistry, the experimental processing, machine failure, and so on, the quality of sequencing reads is never perfect, and often declines as the read is extended. These errors invariably affect downstream analysis/application and should therefore be identified early on to mitigate any unforeseen effects. Results: Here we present a novel FastQ Quality Control Software (FaQCs) that can rapidly process large volumes of data, and which improves upon previous solutions to monitor the quality and remove poor quality data from sequencing runs. Both the speed of processing and the memory footprint of storing all required information have been optimized via algorithmic and parallel processing solutions. The trimmed output compared side-by-side with the original data is part of the automated PDF output. We show how this tool can help data analysis by providing a few examples, including an increased percentage of reads recruited to references, improved single nucleotide polymorphism identification as well as de novo sequence assembly metrics. Conclusion: FaQCs combines several features of currently available applications into a single, user-friendly process, and includes additional unique capabilities such as filtering the PhiX control sequences, conversion of FASTQ formats, and multi-threading. The original data and trimmed summaries are reported within a variety of graphics and reports, providing a simple way to do data quality control and assurance.
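    As a hedged illustration of read quality trimming in general (not FaQCs' actual algorithm or defaults), the sketch below trims a read from the 3' end until the mean Phred quality of the remaining bases meets a threshold; the read, quality scores, and threshold are invented.

```python
# Hedged sketch of 3'-end quality trimming (illustrative only; FaQCs'
# trimming logic and default parameters may differ). Reads are trimmed
# from the 3' end, where quality typically declines, until the mean
# Phred quality of the remaining bases meets the threshold.

def trim_read(seq, quals, min_mean_q=20):
    """Return the trimmed sequence and its quality list."""
    end = len(seq)
    while end > 0 and sum(quals[:end]) / end < min_mean_q:
        end -= 1                      # drop the 3'-most base and re-check
    return seq[:end], quals[:end]

# Invented read whose 3' end degrades in quality:
trimmed, _ = trim_read("ACGTACGT", [30, 30, 30, 30, 30, 5, 5, 5], min_mean_q=25)
```

    Real trimmers use more refined criteria (e.g. sliding windows or running-sum algorithms) precisely because a whole-read mean can mask a short low-quality tail.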

  9. 14 CFR 145.211 - Quality control system.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 14 Aeronautics and Space 3 2011-01-01 2011-01-01 false Quality control system. 145.211 Section 145...) SCHOOLS AND OTHER CERTIFICATED AGENCIES REPAIR STATIONS Operating Rules § 145.211 Quality control system. (a) A certificated repair station must establish and maintain a quality control system acceptable to...

  10. 18 CFR 12.40 - Quality control programs.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 18 Conservation of Power and Water Resources 1 2011-04-01 2011-04-01 false Quality control... PROJECT WORKS Other Responsibilities of Applicant or Licensee § 12.40 Quality control programs. (a... meeting any requirements or standards set by the Regional Engineer. If a quality control program is...

  11. 18 CFR 12.40 - Quality control programs.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 18 Conservation of Power and Water Resources 1 2010-04-01 2010-04-01 false Quality control... PROJECT WORKS Other Responsibilities of Applicant or Licensee § 12.40 Quality control programs. (a... meeting any requirements or standards set by the Regional Engineer. If a quality control program is...

  12. 14 CFR 145.211 - Quality control system.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 14 Aeronautics and Space 3 2010-01-01 2010-01-01 false Quality control system. 145.211 Section 145...) SCHOOLS AND OTHER CERTIFICATED AGENCIES REPAIR STATIONS Operating Rules § 145.211 Quality control system. (a) A certificated repair station must establish and maintain a quality control system acceptable to...

  13. A Taguchi approach on optimal process control parameters for HDPE pipe extrusion process

    NASA Astrophysics Data System (ADS)

    Sharma, G. V. S. S.; Rao, R. Umamaheswara; Rao, P. Srinivasa

    2017-06-01

    High-density polyethylene (HDPE) pipes find versatile applicability for transportation of water, sewage and slurry from one place to another. Hence, these pipes undergo tremendous pressure from the fluid carried. The present work entails the optimization of the withstanding pressure of HDPE pipes using the Taguchi technique. The traditional heuristic methodology stresses a trial-and-error approach and relies heavily upon the accumulated experience of the process engineers for determining the optimal process control parameters. This results in the setting of less-than-optimal values. Hence, there arises a need to determine optimal process control parameters for the pipe extrusion process that can ensure robust pipe quality and process reliability. In the proposed optimization strategy, design of experiments (DoE) is conducted wherein different control parameter combinations are analyzed by considering multiple setting levels of each control parameter. The concept of the signal-to-noise ratio (S/N ratio) is applied, and ultimately optimum values of the process control parameters are obtained: a pushing zone temperature of 166 °C, a dimmer speed of 8 rpm, and a die head temperature of 192 °C. A confirmation experimental run is also conducted to verify the analysis, and the results proved to be in synchronization with the main experimental findings, with the withstanding pressure showing a significant improvement from 0.60 to 1.004 MPa.
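    The larger-the-better S/N ratio used in such Taguchi studies (withstanding pressure should be as high as possible) is S/N = -10·log10((1/n)·Σ 1/yᵢ²). A minimal sketch with invented replicate measurements:

```python
import math

# Hedged sketch of the larger-the-better Taguchi S/N ratio,
#   S/N = -10 * log10((1/n) * sum(1 / y_i**2)),
# applied to invented replicate measurements of withstanding pressure (MPa).
# The parameter setting with the highest S/N is preferred.

def sn_larger_is_better(ys):
    """S/N ratio in dB for a larger-the-better response."""
    n = len(ys)
    return -10.0 * math.log10(sum(1.0 / y ** 2 for y in ys) / n)

# Replicates that are higher and more consistent score a higher S/N:
sn_low = sn_larger_is_better([0.58, 0.60, 0.62])
sn_high = sn_larger_is_better([1.00, 1.004, 1.01])
```

    In a full Taguchi analysis, the S/N ratio is computed for every row of the orthogonal array, and the level of each control factor with the highest average S/N is selected.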

  14. Inhibitory Control Mediates the Association between Perceived Stress and Secure Relationship Quality.

    PubMed

    Herd, Toria; Li, Mengjiao; Maciejewski, Dominique; Lee, Jacob; Deater-Deckard, Kirby; King-Casas, Brooks; Kim-Spoon, Jungmeen

    2018-01-01

    Past research has demonstrated negative associations between exposure to stressors and quality of interpersonal relationships among children and adolescents. Nevertheless, underlying mechanisms of this association remain unclear. Chronic stress has been shown to disrupt prefrontal functioning in the brain, including inhibitory control abilities, and evidence is accumulating that inhibitory control may play an important role in secure interpersonal relationship quality, including peer problems and social competence. In this prospective longitudinal study, we examine whether changes in inhibitory control, measured at both behavioral and neural levels, mediate the association between stress and changes in secure relationship quality with parents and peers. The sample included 167 adolescents (53% males) who were first recruited at age 13 or 14 years and assessed annually three times. Adolescents' inhibitory control was measured by their behavioral performance and brain activities, and adolescents self-reported perceived stress levels and relationship quality with mothers, fathers, and peers. Results suggest that behavioral inhibitory control mediates the association between perceived stress and adolescents' secure relationship quality with their mothers and fathers, but not their peers. In contrast, given that stress was not significantly correlated with neural inhibitory control, we did not further test the mediation path. Our results highlight the role of inhibitory control as a process through which stressful life experiences are related to impaired secure relationship quality between adolescents and their mothers and fathers.

  15. Inhibitory Control Mediates the Association between Perceived Stress and Secure Relationship Quality

    PubMed Central

    Herd, Toria; Li, Mengjiao; Maciejewski, Dominique; Lee, Jacob; Deater-Deckard, Kirby; King-Casas, Brooks; Kim-Spoon, Jungmeen

    2018-01-01

    Past research has demonstrated negative associations between exposure to stressors and quality of interpersonal relationships among children and adolescents. Nevertheless, underlying mechanisms of this association remain unclear. Chronic stress has been shown to disrupt prefrontal functioning in the brain, including inhibitory control abilities, and evidence is accumulating that inhibitory control may play an important role in secure interpersonal relationship quality, including peer problems and social competence. In this prospective longitudinal study, we examine whether changes in inhibitory control, measured at both behavioral and neural levels, mediate the association between stress and changes in secure relationship quality with parents and peers. The sample included 167 adolescents (53% males) who were first recruited at age 13 or 14 years and assessed annually three times. Adolescents’ inhibitory control was measured by their behavioral performance and brain activities, and adolescents self-reported perceived stress levels and relationship quality with mothers, fathers, and peers. Results suggest that behavioral inhibitory control mediates the association between perceived stress and adolescents’ secure relationship quality with their mothers and fathers, but not their peers. In contrast, given that stress was not significantly correlated with neural inhibitory control, we did not further test the mediation path. Our results highlight the role of inhibitory control as a process through which stressful life experiences are related to impaired secure relationship quality between adolescents and their mothers and fathers. PMID:29535664

  16. Quality Management and Control of Low Pressure Cast Aluminum Alloy

    NASA Astrophysics Data System (ADS)

    Zhang, Dianxi; Zhang, Yanbo; Yang, Xiufan; Chen, Zhaosong; Jiang, Zelan

    2018-01-01

    This paper briefly reviews the history of low pressure casting, summarizes its major production processes, and introduces the quality management and control of low pressure cast aluminum alloy. The main processes include preparation of raw materials, melting, refining, physical and chemical analysis, K-mode inspection, sand core and mold preparation, heat treatment, and so on.

  17. Quality Control and Peer Review of Data Sets: Mapping Data Archiving Processes to Data Publication Requirements

    NASA Astrophysics Data System (ADS)

    Mayernik, M. S.; Daniels, M.; Eaker, C.; Strand, G.; Williams, S. F.; Worley, S. J.

    2012-12-01

    Data sets exist within scientific research and knowledge networks as both technical and non-technical entities. Establishing the quality of data sets is a multi-faceted task that encompasses many automated and manual processes. Data sets have always been essential for science research, but now need to be more visible as first-class scholarly objects at national, international, and local levels. Many initiatives are establishing procedures to publish and curate data sets, as well as to promote professional rewards for researchers who collect, create, manage, and preserve data sets. Traditionally, research quality has been assessed by peer review of textual publications, e.g. journal articles, conference proceedings, and books. Citation indices then provide standard measures of productivity used to reward individuals for their peer-reviewed work. Whether a similar peer review process is appropriate for assessing and ensuring the quality of data sets remains an open question. How does the traditional process of peer review apply to data sets? This presentation will describe current work being done at the National Center for Atmospheric Research (NCAR) in the context of the Peer REview for Publication & Accreditation of Research Data in the Earth sciences (PREPARDE) project. PREPARDE is assessing practices and processes for data peer review, with the goal of developing recommendations. NCAR data management teams perform various kinds of quality assessment and review of data sets prior to making them publicly available. The presentation will investigate how notions of peer review relate to the types of data review already in place at NCAR. We highlight the data set characteristics and management/archiving processes that challenge the traditional peer review process by using a number of questions as probes, including: Who is qualified to review data sets? What formal and informal documentation is necessary to allow someone outside of a research team to review a data set?

  18. HDP for the Neutralized pH Value Control in the Clarifying Process of Sugar Cane Juice

    NASA Astrophysics Data System (ADS)

    Lin, Xiaofeng; Yang, Jiaran

    2009-05-01

    Neutralizing the pH value of sugar cane juice is a key operation in the clarifying process and an important factor influencing the output and quality of white sugar. On the one hand, keeping the neutralized pH value within a required range is vital for acquiring high-quality purified juice, reducing energy consumption, and raising sucrose recovery. On the other hand, neutralization is a complicated physical-chemical process with strong non-linearity, time-varying behavior, large time delays, and multiple inputs, so no fully satisfactory control solution has been available. In this paper, a neural network model of the clarifying process of sugar juice is first established from 1200 groups of real-time sample data gathered in a sugar factory. Then, the HDP (Heuristic Dynamic Programming) method is used to optimize and control the neutralized pH value in the clarifying process. Simulation results indicate that this method has a good control effect, building a solid foundation for stabilizing the clarifying process, enhancing the quality of the purified juice and, ultimately, the quality of the white sugar.
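
    The HDP idea described above can be sketched in simplified form: a critic estimates the cost-to-go, and the controller picks the dosing action minimizing immediate cost plus the discounted critic estimate. The first-order pH model, target value, and all constants below are illustrative assumptions rather than values from the paper, and the trained neural-network critic is replaced by the immediate-cost function.

    ```python
    # Simplified HDP-style control sketch (illustrative model, not the paper's).
    TARGET_PH = 7.1          # desired neutralized pH (assumed target)
    GAMMA = 0.9              # discount factor

    def plant(ph, dose):
        """Hypothetical one-step pH response to a lime-dosing adjustment."""
        return ph + 0.5 * dose - 0.05 * (ph - 7.0)

    def cost(ph):
        """Quadratic deviation from the target pH."""
        return (ph - TARGET_PH) ** 2

    def hdp_control(ph, critic, actions):
        """Choose the dose minimizing cost(next) + GAMMA * critic(next)."""
        return min(actions,
                   key=lambda a: cost(plant(ph, a)) + GAMMA * critic(plant(ph, a)))

    # A crude stand-in critic: the immediate cost (a trained neural network
    # would approximate the full cost-to-go in real HDP).
    critic = cost
    actions = [x / 10 for x in range(-10, 11)]   # candidate dosing adjustments

    ph = 5.5                 # raw juice starts acidic
    for _ in range(30):
        ph = plant(ph, hdp_control(ph, critic, actions))
    ```

    With this surrogate model the closed loop drives the pH from 5.5 into the neighborhood of the target within a few steps.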

  19. Assessing the structure of non-routine decision processes in Airline Operations Control.

    PubMed

    Richters, Floor; Schraagen, Jan Maarten; Heerkens, Hans

    2016-03-01

    Unfamiliar severe disruptions challenge Airline Operations Control professionals most, as their expertise is stretched to its limits. This study elicited the structure of Airline Operations Control professionals' decision process during unfamiliar disruptions by mapping three macrocognitive activities onto the decision ladder: sensemaking, option evaluation, and action planning. The relationship between this structure and decision quality was measured. A simulated task was staged, from which think-aloud protocols were obtained. Results show that the general decision process structure resembles that of experts working under routine conditions, both in the general structure of the macrocognitive activities and in the rule-based approach used to identify options and actions. Surprisingly, high quality of decision outcomes was found to relate to the use of rule-based strategies. This implies that successful professionals are capable of dealing with unfamiliar problems by reframing them into familiar ones, rather than engaging in knowledge-based processing. Practitioner Summary: We examined the macrocognitive structure of Airline Operations Control professionals' decision process during a simulated unfamiliar disruption in relation to decision quality. Results suggest that successful professionals are capable of dealing with unfamiliar problems by reframing them into familiar ones, rather than engaging in knowledge-based processing.

  20. Total Quality Management Implementation Strategy: Directorate of Quality Assurance

    DTIC Science & Technology

    1989-05-01

    Total Quality Control; Harrington, H. James: The Improvement Process; Imai, Masaaki: Kaizen; Ishikawa, Kaoru: What is Total Quality Control?; Ishikawa, Kaoru: Statistical Quality Control; Juran, J. M.: Managerial Breakthrough; Juran, J. M.: Quality Control Handbook; Mizuno (Ed.): Managing for Quality Improvements

  1. Data Assimilation Experiments using Quality Controlled AIRS Version 5 Temperature Soundings

    NASA Technical Reports Server (NTRS)

    Susskind, Joel

    2008-01-01

    The AIRS Science Team Version 5 retrieval algorithm has been finalized and is now operational at the Goddard DAAC in the processing (and reprocessing) of all AIRS data. Version 5 contains accurate case-by-case error estimates for most derived products, which are also used for quality control. We have conducted forecast impact experiments assimilating AIRS quality controlled temperature profiles using the NASA GEOS-5 data assimilation system, consisting of the NCEP GSI analysis coupled with the NASA FVGCM. Assimilation of quality controlled temperature profiles resulted in significantly improved forecast skill in both the Northern Hemisphere and Southern Hemisphere Extra-Tropics, compared to that obtained from analyses when all data used operationally by NCEP except for AIRS data are assimilated. Experiments using different quality control thresholds for assimilation of AIRS temperature retrievals showed that a medium threshold performed better than either a tighter threshold, which provided better overall sounding accuracy, or a looser threshold, which provided better spatial coverage of accepted soundings. We are conducting more experiments to further optimize this balance of spatial coverage and sounding accuracy from the data assimilation perspective. In all cases, temperature soundings were assimilated well below cloud level in partially cloudy cases. The positive impact of assimilating AIRS derived atmospheric temperatures all but vanished when only AIRS stratospheric temperatures were assimilated. Forecast skill resulting from assimilation of AIRS radiances uncontaminated by clouds, instead of AIRS temperature soundings, was only slightly better than that resulting from assimilation of only stratospheric AIRS temperatures. This reduction in forecast skill is most likely the result of a significant loss of tropospheric information when only AIRS radiances unaffected by clouds are used in the data assimilation process.
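
    The coverage-versus-accuracy tradeoff described above can be illustrated with a toy quality-control filter: each retrieval carries a case-by-case error estimate, and a threshold decides which soundings are accepted for assimilation. The sounding values and error estimates below are synthetic, not AIRS data.

    ```python
    # Toy QC-threshold filter (synthetic data, illustrative only).
    soundings = [
        # (retrieved_temperature_K, error_estimate_K)
        (251.2, 0.4), (248.9, 1.1), (250.5, 0.7), (253.0, 2.3),
        (249.7, 0.9), (252.1, 1.8), (250.0, 0.3), (251.5, 1.4),
    ]

    def accept(soundings, threshold_k):
        """Keep only soundings whose error estimate passes the QC threshold."""
        return [(t, e) for t, e in soundings if e <= threshold_k]

    tight = accept(soundings, 0.8)   # better accuracy, less spatial coverage
    loose = accept(soundings, 2.0)   # more coverage, worse mean accuracy

    def mean_err(s):
        return sum(e for _, e in s) / len(s)
    ```

    A tighter threshold yields fewer accepted soundings with a lower mean error estimate; the paper's "medium" threshold sits between these extremes.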

  2. Quality Control in construction.

    DTIC Science & Technology

    1984-01-01

    behavioral scientists. In 1962, Dr. Kaoru Ishikawa gave shape to the form of training which featured intradepartmental groups of ten or so workers seated...and Japanese circles bears closer scrutiny. 4.3.1 Japanese Ingredients of Quality The founder of quality circles, Dr. Kaoru Ishikawa, gives six...around a table; hence the name Quality Control Circle. Dr. Ishikawa was an engineering professor at Tokyo University, and the circles were

  3. Quality status display for a vibration welding process

    DOEpatents

    Spicer, John Patrick; Abell, Jeffrey A.; Wincek, Michael Anthony; Chakraborty, Debejyo; Bracey, Jennifer; Wang, Hui; Tavora, Peter W.; Davis, Jeffrey S.; Hutchinson, Daniel C.; Reardon, Ronald L.; Utz, Shawn

    2017-03-28

    A system includes a host machine and a status projector. The host machine is in electrical communication with a collection of sensors and with a welding controller that generates control signals for controlling the welding horn. The host machine is configured to execute a method to thereby process the sensory and control signals, as well as predict a quality status of a weld that is formed using the welding horn, including identifying any suspect welds. The host machine then activates the status projector to illuminate the suspect welds. This may occur directly on the welds using a laser projector, or on a surface of the work piece in proximity to the welds. The system and method may be used in the ultrasonic welding of battery tabs of a multi-cell battery pack in a particular embodiment. The welding horn and welding controller may also be part of the system.

  4. 30 CFR 74.6 - Quality control.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 30 Mineral Resources 1 2013-07-01 2013-07-01 false Quality control. 74.6 Section 74.6 Mineral... control. The applicant shall describe the way in which each lot of components will be sampled and tested... of the CMDPSU will be maintained in production through adequate quality control procedures, MSHA and...

  5. 30 CFR 74.6 - Quality control.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 30 Mineral Resources 1 2014-07-01 2014-07-01 false Quality control. 74.6 Section 74.6 Mineral... control. The applicant shall describe the way in which each lot of components will be sampled and tested... of the CMDPSU will be maintained in production through adequate quality control procedures, MSHA and...

  6. 30 CFR 74.6 - Quality control.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 30 Mineral Resources 1 2012-07-01 2012-07-01 false Quality control. 74.6 Section 74.6 Mineral... control. The applicant shall describe the way in which each lot of components will be sampled and tested... of the CMDPSU will be maintained in production through adequate quality control procedures, MSHA and...

  7. Level of structural quality and process quality in rural preschool classrooms

    PubMed Central

    Hartman, Suzanne C.; Warash, Barbara G.; Curtis, Reagan; Hirst, Jessica Day

    2017-01-01

    Preschool classrooms with varying levels of structural quality requirements across the state of West Virginia were investigated for differences in measured structural and process quality. Quality was measured using group size, child-to-teacher/staff ratio, teacher education, and the Early Childhood Environment Rating Scale-Revised (ECERS-R; Harms, T., Clifford, R. M., & Cryer, D. (2005). The early childhood environment rating scale-revised. New York, NY: Teachers College Press). Thirty-six classrooms with fewer structural quality requirements and 136 with more structural quality requirements were measured. There were significant differences between classroom types: classrooms with more structural quality requirements had significantly higher teacher education levels and higher environmental rating scores on the ECERS-R subscales of Space and Furnishings, Activities, and Program Structure. Results support previous research that stricter structural state regulations are correlated with higher measured structural and process quality in preschool classrooms. Implications for preschool state quality standards are discussed. PMID:29056814

  8. A Process Analytical Technology (PAT) approach to control a new API manufacturing process: development, validation and implementation.

    PubMed

    Schaefer, Cédric; Clicq, David; Lecomte, Clémence; Merschaert, Alain; Norrant, Edith; Fotiadu, Frédéric

    2014-03-01

    Pharmaceutical companies are progressively adopting the Process Analytical Technology (PAT) and Quality-by-Design (QbD) concepts promoted by the regulatory agencies, aiming to build quality directly into the product by combining thorough scientific understanding with quality risk management. An analytical method based on near infrared (NIR) spectroscopy was developed as a PAT tool to control on-line an API (active pharmaceutical ingredient) manufacturing crystallization step, during which the API and residual solvent contents need to be precisely determined to reach the predefined seeding point. An original methodology based on the QbD principles was designed to conduct the development and validation of the NIR method and to ensure that it is fit for its intended use. On this basis, partial least squares (PLS) models were developed and optimized using chemometric methods. The method was fully validated according to the ICH Q2(R1) guideline, using the accuracy profile approach. The dosing ranges were evaluated at 9.0-12.0% w/w for the API and 0.18-1.50% w/w for the residual methanol. As the variability of the sampling method and the reference method is by nature included in the variability obtained for the NIR method during the validation phase, a real-time process monitoring exercise was performed to prove its fitness for purpose. The implementation of this in-process control (IPC) method on the industrial plant from the launch of the new API synthesis process will enable automatic control of the final crystallization step in order to ensure a predefined quality level of the API. Several valuable benefits are also expected, including reduction of the process time and elimination of a rather difficult sampling step and tedious off-line analyses. © 2013 Published by Elsevier B.V.
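
    The calibration idea behind such an NIR method can be sketched in miniature. The paper builds multivariate PLS models on full spectra; as a deliberately simplified stand-in, this fits a one-variable least-squares calibration mapping absorbance at a single wavelength to API content. All numbers are invented for illustration.

    ```python
    # Univariate least-squares calibration sketch (stand-in for PLS; synthetic data).
    pairs = [  # (absorbance at one wavelength, API % w/w by reference method)
        (0.112, 9.0), (0.135, 9.8), (0.151, 10.4), (0.170, 11.1), (0.192, 12.0),
    ]

    n = len(pairs)
    sx = sum(x for x, _ in pairs)
    sy = sum(y for _, y in pairs)
    sxx = sum(x * x for x, _ in pairs)
    sxy = sum(x * y for x, y in pairs)

    # Ordinary least-squares slope and intercept from the normal equations.
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    intercept = (sy - slope * sx) / n

    def predict(absorbance):
        """Predict API % w/w from a new absorbance reading."""
        return intercept + slope * absorbance
    ```

    In the real method the same fit-then-predict logic runs over many wavelengths and latent variables, and the accuracy profile quantifies how closely `predict` tracks the reference method across the 9.0-12.0% w/w range.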

  9. Quality Assurance and Quality Control, Part 2.

    PubMed

    Akers, Michael J

    2015-01-01

    The tragedy surrounding the New England Compounding Center and contaminated steroid syringe preparations clearly points out what can happen if quality-assurance and quality-control procedures are not strictly practiced in the compounding of sterile preparations. This article is part 2 of a two-part article on requirements to comply with United States Pharmacopeia general chapters <797> and <1163> with respect to quality assurance of compounded sterile preparations. Part 1 covered documentation requirements, inspection procedures, compounding accuracy checks, and part of a discussion on bacterial endotoxin testing. Part 2 covers sterility testing, the completion of the discussion of bacterial endotoxin testing from part 1, a brief discussion of United States Pharmacopeia <1163>, and advances in pharmaceutical quality systems.

  10. Guidance for Efficient Small Animal Imaging Quality Control.

    PubMed

    Osborne, Dustin R; Kuntner, Claudia; Berr, Stuart; Stout, David

    2017-08-01

    Routine quality control is a critical aspect of properly maintaining high-performance small animal imaging instrumentation. A robust quality control program helps produce more reliable data both for academic purposes and as proof of system performance for contract imaging work. For preclinical imaging laboratories, the combination of costs and available resources often limits their ability to produce efficient and effective quality control programs. This work presents a series of simplified quality control procedures that are accessible to a wide range of preclinical imaging laboratories. Our intent is to provide minimum guidelines for routine quality control that can assist preclinical imaging specialists in setting up an appropriate quality control program for their facility.

  11. 42 CFR 84.256 - Quality control requirements.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 42 Public Health 1 2013-10-01 2013-10-01 false Quality control requirements. 84.256 Section 84.256... § 84.256 Quality control requirements. (a) In addition to the construction and performance requirements specified in §§ 84.251, 84.252, 84.253, 84.254, and 84.255, the quality control requirements in paragraphs...

  12. 42 CFR 84.256 - Quality control requirements.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 42 Public Health 1 2012-10-01 2012-10-01 false Quality control requirements. 84.256 Section 84.256... § 84.256 Quality control requirements. (a) In addition to the construction and performance requirements specified in §§ 84.251, 84.252, 84.253, 84.254, and 84.255, the quality control requirements in paragraphs...

  13. 42 CFR 84.256 - Quality control requirements.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 1 2010-10-01 2010-10-01 false Quality control requirements. 84.256 Section 84.256... § 84.256 Quality control requirements. (a) In addition to the construction and performance requirements specified in §§ 84.251, 84.252, 84.253, 84.254, and 84.255, the quality control requirements in paragraphs...

  14. 42 CFR 84.256 - Quality control requirements.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 42 Public Health 1 2011-10-01 2011-10-01 false Quality control requirements. 84.256 Section 84.256... § 84.256 Quality control requirements. (a) In addition to the construction and performance requirements specified in §§ 84.251, 84.252, 84.253, 84.254, and 84.255, the quality control requirements in paragraphs...

  15. 42 CFR 84.256 - Quality control requirements.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 42 Public Health 1 2014-10-01 2014-10-01 false Quality control requirements. 84.256 Section 84.256... § 84.256 Quality control requirements. (a) In addition to the construction and performance requirements specified in §§ 84.251, 84.252, 84.253, 84.254, and 84.255, the quality control requirements in paragraphs...

  16. [Strategies and development of quality assurance and control in the ELSA-Brasil].

    PubMed

    Schmidt, Maria Inês; Griep, Rosane Härter; Passos, Valéria Maria; Luft, Vivian Cristine; Goulart, Alessandra Carvalho; Menezes, Greice Maria de Souza; Molina, Maria del Carmen Bisi; Vigo, Alvaro; Nunes, Maria Angélica

    2013-06-01

    The ELSA-Brasil (Estudo Longitudinal de Saúde do Adulto - Brazilian Longitudinal Study for Adult Health) is a cohort study composed of 15,105 adults followed up in order to assess the development of chronic diseases, especially diabetes and cardiovascular disease. Its size, multicenter nature and the diversity of measurements required effective and efficient mechanisms of quality assurance and control. The main quality assurance activities (those developed before data collection) were: careful selection of research instruments, centralized training and certification, pretesting and pilot studies, and preparation of operation manuals for the procedures. Quality control activities (developed during data collection and processing) were performed more intensively at the beginning, when routines had not been established yet. The main quality control activities were: periodic observation of technicians, test-retest studies, data monitoring, network of supervisors, and cross visits. Data that estimate the reliability of the obtained information attest that the quality goals have been achieved.
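
    The test-retest studies listed among the quality control activities compare repeated measurements of the same participants; a Pearson correlation is one common (though not the only) reliability summary for such data. The paired values below are synthetic, not ELSA-Brasil measurements.

    ```python
    # Test-retest reliability sketch using a Pearson correlation (synthetic data).
    def pearson_r(xs, ys):
        """Pearson product-moment correlation between two paired series."""
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        vx = sum((x - mx) ** 2 for x in xs)
        vy = sum((y - my) ** 2 for y in ys)
        return cov / (vx * vy) ** 0.5

    visit1 = [120, 135, 128, 140, 118]   # e.g. systolic blood pressure, first visit
    visit2 = [122, 133, 130, 138, 119]   # repeat measurement of the same subjects
    r = pearson_r(visit1, visit2)
    ```

    A correlation near 1 indicates that the repeated measurements track each other closely, one piece of evidence that a quality goal has been met.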

  17. Scheduling algorithms for automatic control systems for technological processes

    NASA Astrophysics Data System (ADS)

    Chernigovskiy, A. S.; Tsarev, R. Yu; Kapulin, D. V.

    2017-01-01

    The wide use of automatic process control systems built on high-performance platforms containing a number of computers (processors) creates opportunities for fast, high-quality production that increases the competitiveness of an enterprise. Exact and fast calculations, control computation, and processing of big data arrays all demand a high level of productivity together with minimal times for data handling and result delivery. To achieve the best times, it is necessary not only to use computing resources optimally, but also to design and develop the software so that the time gain is maximal. For this purpose, task (job or operation) scheduling techniques for multi-machine/multiprocessor systems are applied. This paper considers some basic task scheduling methods for multi-machine process control systems, brings their advantages and disadvantages to light, and offers some considerations for their use in developing software for automatic process control systems.
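
    As one concrete example of the multi-machine scheduling methods such surveys cover (the abstract does not name the specific algorithms), a minimal Longest-Processing-Time-first (LPT) list scheduler can be sketched as follows; the task durations are arbitrary examples.

    ```python
    import heapq

    # LPT list scheduling: assign each task, longest first, to the machine
    # that is currently least loaded. The makespan is the maximum load.
    def lpt_schedule(durations, machines):
        loads = [0.0] * machines
        heap = [(0.0, m) for m in range(machines)]   # (load, machine) min-heap
        assignment = {}
        for task, d in sorted(enumerate(durations), key=lambda t: -t[1]):
            load, m = heapq.heappop(heap)            # least-loaded machine
            assignment[task] = m
            heapq.heappush(heap, (load + d, m))
            loads[m] = load + d
        return loads, assignment

    loads, assignment = lpt_schedule([7, 5, 4, 3, 3, 2], machines=2)
    ```

    For this instance LPT balances the two machines perfectly (12 time units each); in general LPT is a heuristic whose makespan is provably within 4/3 of optimal.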

  18. Remote Sensing Image Quality Assessment Experiment with Post-Processing

    NASA Astrophysics Data System (ADS)

    Jiang, W.; Chen, S.; Wang, X.; Huang, Q.; Shi, H.; Man, Y.

    2018-04-01

    This paper briefly describes an experiment assessing the influence of post-processing on image quality. The experiment includes three steps: physical simulation, image processing, and image quality assessment. The physical simulation models a sampled imaging system in the laboratory; the imaging system parameters are measured, and the digital images serving as image processing input are produced by this imaging system with those same parameters. The gathered optically sampled images are then subjected to three digital image processes: calibration pre-processing, lossy compression with different compression ratios, and image post-processing with different kernels. The image quality assessment method is just noticeable difference (JND) subjective assessment based on ISO 20462; through subjective assessment of the gathered and processed images, the influence of the different imaging parameters and of post-processing on image quality can be determined. The six JND subjective assessment data sets validate each other. The main conclusions are: image post-processing can improve image quality; it can do so even in combination with lossy compression, although image quality improves less at higher compression ratios than at lower ones; and, with the proposed post-processing method, image quality is better when the camera MTF lies within a small range.
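
    A hedged sketch of what "post-processing with different kernels" can mean in practice: a 3x3 convolution pass over a grayscale image. The sharpening kernel and the clamping choices below are generic assumptions, not the authors' specific method.

    ```python
    # Generic 3x3 convolution post-processing sketch (illustrative kernel).
    KERNEL = [[ 0, -1,  0],
              [-1,  5, -1],
              [ 0, -1,  0]]   # common sharpening kernel (assumed, not the paper's)

    def convolve3x3(img, kernel):
        """Apply the kernel to interior pixels, clamping results to 0..255."""
        h, w = len(img), len(img[0])
        out = [row[:] for row in img]       # borders are left unchanged
        for y in range(1, h - 1):
            for x in range(1, w - 1):
                s = sum(kernel[j][i] * img[y + j - 1][x + i - 1]
                        for j in range(3) for i in range(3))
                out[y][x] = max(0, min(255, s))
        return out

    flat = [[100] * 5 for _ in range(5)]    # uniform region: sharpening is a no-op
    bump = [row[:] for row in flat]
    bump[2][2] = 120                        # one brighter pixel
    sharpened = convolve3x3(bump, KERNEL)
    ```

    On a uniform region the kernel leaves pixels unchanged, while local contrast (the bump) is amplified; this is the kind of effect the JND observers judge.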

  19. Validation of gamma irradiator controls for quality and regulatory compliance

    NASA Astrophysics Data System (ADS)

    Harding, Rorry B.; Pinteric, Francis J. A.

    1995-09-01

    Since 1978 the U.S. Food and Drug Administration (FDA) has had both the legal authority and the Current Good Manufacturing Practice (CGMP) regulations in place to require irradiator owners who process medical devices to produce evidence of Irradiation Process Validation. One of the key components of Irradiation Process Validation is the validation of the irradiator controls. However, it is only recently that FDA audits have focused on this component of the process validation. What is Irradiator Control System Validation? What constitutes evidence of control? How do owners obtain evidence? What is the irradiator supplier's role in validation? How does the ISO 9000 Quality Standard relate to the FDA's CGMP requirement for evidence of Control System Validation? This paper presents answers to these questions based on the recent experiences of Nordion's engineering and product management staff who have worked with several US-based irradiator owners. This topic — Validation of Irradiator Controls — is a significant regulatory compliance and operations issue within the irradiator suppliers' and users' community.

  20. ATAD control goals through the analysis of process variables and evaluation of quality, production and cost.

    PubMed

    Nájera, S; Gil-Martínez, M; Zambrano, J A

    2015-01-01

    The aim of this paper is to establish and quantify different operational goals and control strategies in autothermal thermophilic aerobic digestion (ATAD). This technology appears as an alternative to conventional sludge digestion systems. During the batch-mode reaction, high temperatures promote sludge stabilization and pasteurization. The digester temperature is usually the only online, robust, measurable variable. The average temperature can be regulated by manipulating both the air injection and the sludge retention time. An improved performance of diverse biochemical variables can be achieved through proper manipulation of these inputs. However, a better quality of treated sludge usually implies major operating costs or a lower production rate. Thus, quality, production and cost indices are defined to quantify the outcomes of the treatment. Based on these, tradeoff control strategies are proposed and illustrated through some examples. This paper's results are relevant to guide plant operators, to design automatic control systems and to compare or evaluate the control performance on ATAD systems.
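
    The quality/production/cost tradeoff the paper quantifies can be illustrated with a toy calculation: a longer sludge retention time (SRT) improves stabilization but lowers throughput and raises aeration cost. The index formulas, weights, and numbers below are invented for illustration, not the paper's definitions.

    ```python
    # Toy ATAD operating-point tradeoff (all formulas and constants assumed).
    def indices(srt_days, air_flow):
        quality = min(1.0, 0.1 * srt_days)   # stabilization proxy, saturating
        production = 1.0 / srt_days          # batches per day per digester volume
        cost = 0.5 * air_flow * srt_days     # aeration energy proxy
        return quality, production, cost

    def tradeoff(srt_days, air_flow, w_q=1.0, w_p=1.0, w_c=0.05):
        """Weighted score: reward quality and production, penalize cost."""
        q, p, c = indices(srt_days, air_flow)
        return w_q * q + w_p * p - w_c * c

    # Pick the SRT with the best weighted score among a few candidates.
    best_srt = max([4, 6, 8, 10, 12], key=lambda s: tradeoff(s, air_flow=2.0))
    ```

    With these weights the score peaks at an interior SRT: beyond the point where quality saturates, extra retention time only adds cost.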

  1. Quality by design for herbal drugs: a feedforward control strategy and an approach to define the acceptable ranges of critical quality attributes.

    PubMed

    Yan, Binjun; Li, Yao; Guo, Zhengtai; Qu, Haibin

    2014-01-01

    The concept of quality by design (QbD) has been widely accepted and applied in the pharmaceutical manufacturing industry. There are still two key issues to be addressed in the implementation of QbD for herbal drugs: the quality variation of herbal raw materials, and the difficulty of defining the acceptable ranges of critical quality attributes (CQAs). This work proposes a feedforward control strategy for the first issue and a method for defining the acceptable ranges of CQAs for the second. In a case study of the ethanol precipitation process of Danshen (Radix Salvia miltiorrhiza) injection, regression models linking input material attributes and process parameters to CQAs were built first, and an optimisation model for calculating the best process parameters according to the input materials was established. Then, the feasible material space was defined and the acceptable ranges of CQAs for the preceding process were determined. Satisfactory regression models were built, with cross-validated regression coefficients (Q(2)) all above 91%. The feedforward control strategy was applied successfully to compensate for the quality variation of the input materials and was able to control the CQAs within 90-110% of the desired values. In addition, the feasible material space for the ethanol precipitation process was built successfully, showing the acceptable ranges of the CQAs for the concentration process. The proposed methodology can help to promote the implementation of QbD for herbal drugs. Copyright © 2013 John Wiley & Sons, Ltd.
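
    The feedforward idea can be sketched minimally: a fitted regression model predicts a CQA from an input-material attribute and a process parameter, and the controller solves for the parameter that keeps the CQA on target for whatever material arrives. The model form, coefficients, and parameter names below are invented stand-ins for the paper's PLS-style models.

    ```python
    # Feedforward parameter selection sketch (hypothetical model and numbers).
    TARGET_CQA = 100.0   # desired CQA value, as % of specification

    def predicted_cqa(material_attr, ethanol_ratio):
        """Hypothetical fitted regression model linking inputs to the CQA."""
        return 60.0 + 25.0 * ethanol_ratio - 8.0 * material_attr

    def feedforward(material_attr, candidates):
        """Choose the process parameter minimizing predicted deviation from target."""
        return min(candidates,
                   key=lambda r: abs(predicted_cqa(material_attr, r) - TARGET_CQA))

    candidates = [x / 100 for x in range(100, 301)]   # ethanol ratio grid 1.00..3.00

    # Different input-material batches get different compensating parameters.
    r_low = feedforward(0.5, candidates)
    r_high = feedforward(1.5, candidates)
    ```

    The controller raises the ethanol ratio for the higher-attribute batch so that both batches land on the same predicted CQA, which is exactly the compensation the paper's strategy performs (with richer models).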

  2. In-process and post-process measurements of drill wear for control of the drilling process

    NASA Astrophysics Data System (ADS)

    Liu, Tien-I.; Liu, George; Gao, Zhiyu

    2011-12-01

    Optical inspection was used in this research for the post-process measurement of drill wear, using a precision toolmaker's microscope. An indirect index, cutting force, is used for in-process drill wear measurement. Using in-process measurements to estimate drill wear for control purposes can decrease the operation cost and enhance product quality and safety. The challenge is to correlate the in-process cutting force measurements with the post-process optical inspection of drill wear. To find the most important feature, the energy principle was used in this research: only the cutting force feature showing the highest sensitivity to drill wear is selected. The best feature selected is the peak of torque in the drilling process. Neuro-fuzzy systems were used for correlation purposes. The Adaptive-Network-Based Fuzzy Inference System (ANFIS) can construct fuzzy rules with membership functions to generate an input-output mapping. A 1x6 ANFIS architecture with product-of-sigmoid membership functions can measure the drill wear in-process with an error as low as 0.15%. This is extremely important for control of the drilling process. Furthermore, the measurement of drill wear was performed under different drilling conditions, showing that ANFIS has the capability of generalization.
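
    The paper feeds the peak-torque feature of each drilling cycle into an ANFIS model. As a minimal stand-in, the sketch below extracts the same feature and applies a hypothetical linear calibration from peak torque to flank wear; the signal values and calibration coefficients are invented, and the ANFIS mapping is deliberately replaced by a linear one.

    ```python
    # Drill-wear estimation sketch: peak-torque feature plus a hypothetical
    # linear calibration standing in for the paper's ANFIS model.
    def peak_torque(torque_signal):
        """Feature selected in the paper: the peak of torque over one cycle."""
        return max(torque_signal)

    def estimate_wear(peak, slope=0.05, offset=-0.04):
        """Hypothetical calibration (N*m -> mm); the paper uses ANFIS instead."""
        return slope * peak + offset

    cycle = [0.8, 1.9, 2.7, 3.4, 3.1, 2.2, 1.0]   # synthetic torque trace (N*m)
    wear_mm = estimate_wear(peak_torque(cycle))
    ```

    In the real system this per-cycle wear estimate would feed the drilling-process controller in place of stopping to measure the drill under a microscope.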

  3. Development of Process Analytical Technology (PAT) methods for controlled release pellet coating.

    PubMed

    Avalle, P; Pollitt, M J; Bradley, K; Cooper, B; Pearce, G; Djemai, A; Fitzpatrick, S

    2014-07-01

    This work focused on the control of the manufacturing process for a controlled release (CR) pellet product, within a Quality by Design (QbD) framework. The manufacturing process was Wurster coating: firstly layering active pharmaceutical ingredient (API) onto sugar pellet cores and secondly a controlled release (CR) coating. For each of these two steps, development of a Process Analytical Technology (PAT) method is discussed and also a novel application of automated microscopy as the reference method. Ultimately, PAT methods should link to product performance and the two key Critical Quality Attributes (CQAs) for this CR product are assay and release rate, linked to the API and CR coating steps respectively. In this work, the link between near infra-red (NIR) spectra and those attributes was explored by chemometrics over the course of the coating process in a pilot scale industrial environment. Correlations were built between the NIR spectra and coating weight (for API amount), CR coating thickness and dissolution performance. These correlations allow the coating process to be monitored at-line and so better control of the product performance in line with QbD requirements. Copyright © 2014 Elsevier B.V. All rights reserved.

  4. [Practice report: the process-based indicator dashboard. Visualising quality assurance results in standardised processes].

    PubMed

    Petzold, Thomas; Hertzschuch, Diana; Elchlep, Frank; Eberlein-Gonska, Maria

    2014-01-01

    Process management (PM) is a valuable method for the systematic analysis and structural optimisation of the quality and safety of clinical treatment. PM requires high motivation and willingness to implement changes from both employees and management. Defining quality indicators is required to systematically measure the quality of the specified processes. One way to present comparable quality results is to use the quality indicators of the external quality assurance in accordance with Sect. 137 SGB V, a method which the Federal Joint Committee (GBA) and the institutions commissioned by the GBA have employed and consistently enhanced for more than ten years. Information on the quality of inpatient treatment is available for 30 defined subjects throughout Germany. Combining specified processes with quality indicators is beneficial for informing employees: a process-based indicator dashboard provides essential information about the treatment process, which can be used for process analysis. Through continuous review of these indicator results, deviations can be detected and errors remedied quickly. If due consideration is given to these indicators, they can also be used for benchmarking to identify potential process improvements. Copyright © 2014. Published by Elsevier GmbH.

  5. The Individualized Quality Control Plan - Coming Soon to Clinical Microbiology Laboratories Everywhere!

    PubMed

    Anderson, Nancy

    2015-11-15

    As of January 1, 2016, microbiology laboratories can choose to adopt a new quality control option, the Individualized Quality Control Plan (IQCP), under the Clinical Laboratory Improvement Amendments of 1988 (CLIA). This voluntary approach increases flexibility for meeting regulatory requirements and gives laboratories the opportunity to customize QC for their testing, their unique environments, and their testing personnel. IQCP is an all-inclusive approach to quality based on risk management to address potential errors in the total testing process. It includes three main steps: (1) performing a risk assessment, (2) developing a QC plan, and (3) monitoring the plan through quality assessment. Resources are available from the Centers for Medicare & Medicaid Services, Centers for Disease Control and Prevention, American Society for Microbiology, Clinical and Laboratory Standards Institute, and accrediting organizations, such as the College of American Pathologists and Joint Commission, to assist microbiology laboratories in implementing IQCP.

  6. 21 CFR 106.1 - Status and applicability of the quality control procedures regulation.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... AND HUMAN SERVICES (CONTINUED) FOOD FOR HUMAN CONSUMPTION INFANT FORMULA QUALITY CONTROL PROCEDURES... infant formula meets the safety, quality, and nutrient requirements of section 412 of the act and the..., processing, and packaging of an infant formula shall render such formula adulterated under section 412(a)(1...

  7. 21 CFR 106.1 - Status and applicability of the quality control procedures regulation.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... AND HUMAN SERVICES (CONTINUED) FOOD FOR HUMAN CONSUMPTION INFANT FORMULA QUALITY CONTROL PROCEDURES... infant formula meets the safety, quality, and nutrient requirements of section 412 of the act and the..., processing, and packaging of an infant formula shall render such formula adulterated under section 412(a)(1...

  8. Human Connectome Project Informatics: quality control, database services, and data visualization

    PubMed Central

    Marcus, Daniel S.; Harms, Michael P.; Snyder, Abraham Z.; Jenkinson, Mark; Wilson, J Anthony; Glasser, Matthew F.; Barch, Deanna M.; Archie, Kevin A.; Burgess, Gregory C.; Ramaratnam, Mohana; Hodge, Michael; Horton, William; Herrick, Rick; Olsen, Timothy; McKay, Michael; House, Matthew; Hileman, Michael; Reid, Erin; Harwell, John; Coalson, Timothy; Schindler, Jon; Elam, Jennifer S.; Curtiss, Sandra W.; Van Essen, David C.

    2013-01-01

    The Human Connectome Project (HCP) has developed protocols, standard operating and quality control procedures, and a suite of informatics tools to enable high throughput data collection, data sharing, automated data processing and analysis, and data mining and visualization. Quality control procedures include methods to maintain data collection consistency over time, to measure head motion, and to establish quantitative modality-specific overall quality assessments. Database services developed as customizations of the XNAT imaging informatics platform support both internal daily operations and open access data sharing. The Connectome Workbench visualization environment enables user interaction with HCP data and is increasingly integrated with the HCP's database services. Here we describe the current state of these procedures and tools and their application in the ongoing HCP study. PMID:23707591

  9. ITS data quality control and the calculation of mobility performance measures

    DOT National Transportation Integrated Search

    2000-09-01

    This report describes the results of research on the use of intelligent transportation system (ITS) data in calculating mobility performance measures for ITS operations. The report also describes a data quality control process developed for the Trans...

  10. 7 CFR 58.928 - Quality control tests.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... quality test shall be made to determine product stability. ... 7 Agriculture 3 2013-01-01 2013-01-01 false Quality control tests. 58.928 Section 58.928... Procedures § 58.928 Quality control tests. All dairy products and other ingredients shall be subject to...

  11. 7 CFR 58.928 - Quality control tests.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... quality test shall be made to determine product stability. ... 7 Agriculture 3 2012-01-01 2012-01-01 false Quality control tests. 58.928 Section 58.928... Procedures § 58.928 Quality control tests. All dairy products and other ingredients shall be subject to...

  12. Effects of processing conditions on mammographic image quality.

    PubMed

    Braeuning, M P; Cooper, H W; O'Brien, S; Burns, C B; Washburn, D B; Schell, M J; Pisano, E D

    1999-08-01

    Any given mammographic film will exhibit changes in sensitometric response and image resolution as processing variables are altered. Developer type, immersion time, and temperature have been shown to affect the contrast of the mammographic image and thus lesion visibility. The authors evaluated the effect of altering processing variables, including film type, developer type, and immersion time, on the visibility of masses, fibrils, and specks in a standard mammographic phantom. Images of a phantom obtained with two screen types (Kodak Min-R and Fuji) and five film types (Kodak Min-R M, Min-R E, and Min-R H; Fuji UM-MA HC; and DuPont Microvision-C) were processed with five different developer chemicals (Autex SE, DuPont HSD, Kodak RP, Picker 3-7-90, and White Mountain) at four different immersion times (24, 30, 36, and 46 seconds). Processor chemical activity was monitored with sensitometric strips, and developer temperatures were continuously measured. The film images were reviewed by two board-certified radiologists and two physicists with expertise in mammography quality control and were scored based on the visibility of calcifications, masses, and fibrils. Although the differences in the absolute scores were not large, the Kodak Min-R M and Fuji films exhibited the highest scores, and images developed in White Mountain and Autex chemicals exhibited the highest scores. For any film, several processing chemicals may be used to produce images of similar quality. Extended processing may no longer be necessary.

  13. Intelligent process control of fiber chemical vapor deposition

    NASA Astrophysics Data System (ADS)

    Jones, John Gregory

    Chemical Vapor Deposition (CVD) is a widely used process for the application of thin films. In this case, CVD is being used to apply a thin film interface coating to single-crystal monofilament sapphire (Al2O3) fibers for use in Ceramic Matrix Composites (CMCs). The hot-wall reactor operates at near atmospheric pressure, which is maintained using a venturi pump system. Inert gas seals obviate the need for a sealed system. A liquid precursor delivery system has been implemented to provide precise stoichiometry control. Neural networks have been implemented to create real-time process description models, trained on data generated by a Navier-Stokes finite difference model of the process. Automation of the process to include full computer control and data logging capability is also presented. In situ sensors including a quadrupole mass spectrometer, thermocouples, a laser scanner, and a Raman spectrometer have been implemented to determine the gas phase reactants and coating quality. A fuzzy logic controller has been developed to regulate either the gas phase or the in situ temperature of the reactor using oxygen flow rate as an actuator. Scanning electron microscope (SEM) images of various samples are shown. The hierarchical structure upon which the control system is based is also presented.
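    The fuzzy logic control idea in this abstract can be sketched as a tiny rule-based controller that maps temperature error onto fuzzy sets and defuzzifies to an oxygen flow adjustment. The membership functions, rule outputs, and units below are invented for illustration and are not taken from the reactor described above.

```python
# Sketch of a single-input fuzzy controller: temperature error in,
# oxygen flow adjustment out. Shapes and numbers are hypothetical.

def tri(x, a, b, c):
    """Triangular membership function peaking at b, zero outside [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_flow_adjustment(error_c):
    """error_c = setpoint minus measured temperature, in deg C."""
    # Rule base: too hot -> cut flow, on target -> hold, too cold -> raise flow.
    rules = [
        (tri(error_c, -100, -50, 0), -1.0),  # "too hot": cut O2 flow (slpm)
        (tri(error_c, -50, 0, 50), 0.0),     # "on target": hold
        (tri(error_c, 0, 50, 100), +1.0),    # "too cold": raise O2 flow
    ]
    # Weighted-average (centroid-style) defuzzification.
    num = sum(w * out for w, out in rules)
    den = sum(w for w, _ in rules)
    return num / den if den else 0.0

print(fuzzy_flow_adjustment(25.0))
```

In a real loop this output would be applied each control cycle, with the error recomputed from the in situ thermocouple or spectrometer reading.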

  14. Online sensing and control of oil in process wastewater

    NASA Astrophysics Data System (ADS)

    Khomchenko, Irina B.; Soukhomlinoff, Alexander D.; Mitchell, T. F.; Selenow, Alexander E.

    2002-02-01

    Industrial processes that must eliminate high concentrations of oil from their waste streams find it extremely difficult to measure and control the water purification process. Most oil separation processes involve chemical separation using highly corrosive caustics, acids, surfactants, and emulsifiers. The output of this chemical treatment process includes highly adhesive tar-like globules, emulsified and surface oils, and other emulsified chemicals, in addition to suspended solids. The level of oil/hydrocarbon concentration in the wastewater process may fluctuate from 1 ppm to 10,000 ppm, depending upon the specifications of the industry and the level of water quality control. The authors have developed a sensing technology which provides the accuracy of scatter/absorption sensing in a contactless environment by combining these methodologies with reflective measurement. The sensitivity of the sensor may be modified by changing the fluid level control in the flow cell, allowing for a broad range of accurate measurement from 1 ppm to 10,000 ppm. Because this sensing system has been designed to work in a highly invasive environment, it can be placed close to the process source to allow for accurate real-time measurement and control.

  15. 7 CFR 981.442 - Quality control.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 7 Agriculture 8 2012-01-01 2012-01-01 false Quality control. 981.442 Section 981.442 Agriculture Regulations of the Department of Agriculture (Continued) AGRICULTURAL MARKETING SERVICE (Marketing Agreements... Administrative Rules and Regulations § 981.442 Quality control. (a) Incoming. Pursuant to § 981.42(a), the...

  16. 7 CFR 981.442 - Quality control.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 8 2013-01-01 2013-01-01 false Quality control. 981.442 Section 981.442 Agriculture Regulations of the Department of Agriculture (Continued) AGRICULTURAL MARKETING SERVICE (MARKETING AGREEMENTS... Administrative Rules and Regulations § 981.442 Quality control. (a) Incoming. Pursuant to § 981.42(a), the...

  17. 7 CFR 981.442 - Quality control.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 8 2011-01-01 2011-01-01 false Quality control. 981.442 Section 981.442 Agriculture Regulations of the Department of Agriculture (Continued) AGRICULTURAL MARKETING SERVICE (Marketing Agreements... Administrative Rules and Regulations § 981.442 Quality control. (a) Incoming. Pursuant to § 981.42(a), the...

  18. The Validity of Higher-Order Questions as a Process Indicator of Educational Quality

    ERIC Educational Resources Information Center

    Renaud, Robert D.; Murray, Harry G.

    2007-01-01

    One way to assess the quality of education in post-secondary institutions is through the use of performance indicators. Studies that have compared currently popular process indicators (e.g., library size, percentage of faculty with PhD) found that after controlling for incoming student ability, these process indicators tend to be weakly associated…

  19. Using Statistical Process Control to Drive Improvement in Neonatal Care: A Practical Introduction to Control Charts.

    PubMed

    Gupta, Munish; Kaplan, Heather C

    2017-09-01

    Quality improvement (QI) is based on measuring performance over time, and variation in data measured over time must be understood to guide change and make optimal improvements. Common cause variation is natural variation owing to factors inherent to any process; special cause variation is unnatural variation owing to external factors. Statistical process control methods, and particularly control charts, are robust tools for understanding data over time and identifying common and special cause variation. This review provides a practical introduction to the use of control charts in health care QI, with a focus on neonatology. Copyright © 2017 Elsevier Inc. All rights reserved.
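    The distinction between common and special cause variation described above is usually operationalized with an individuals (I) chart: estimate common-cause spread from the average moving range, set 3-sigma limits, and flag points outside them. A minimal sketch, with invented weekly rate data:

```python
# Individuals (I) chart sketch: center line, 3-sigma limits from the
# average moving range, and detection of special-cause points.
# The data are invented for illustration.

def i_chart_limits(data):
    """Return (center, lcl, ucl) using the standard I-chart constant 2.66."""
    center = sum(data) / len(data)
    moving_ranges = [abs(b - a) for a, b in zip(data, data[1:])]
    mr_bar = sum(moving_ranges) / len(moving_ranges)
    spread = 2.66 * mr_bar  # 2.66 = 3 / d2, with d2 = 1.128 for n = 2
    return center, center - spread, center + spread

def special_cause_points(data):
    """Indices of points outside the control limits."""
    center, lcl, ucl = i_chart_limits(data)
    return [i for i, x in enumerate(data) if x < lcl or x > ucl]

# Hypothetical weekly rates: a stable process, then one outlying week.
rates = [4.1, 3.9, 4.3, 4.0, 4.2, 3.8, 4.1, 7.5, 4.0, 4.2]
print(special_cause_points(rates))
```

Points inside the limits reflect common cause variation and call for no reaction; a flagged point signals a special cause worth investigating before changing the process.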

  20. Flight Dynamics Mission Support and Quality Assurance Process

    NASA Technical Reports Server (NTRS)

    Oh, InHwan

    1996-01-01

    This paper summarizes the Computer Sciences Corporation Flight Dynamics Operation (FDO) quality assurance approach to supporting the National Aeronautics and Space Administration Goddard Space Flight Center Flight Dynamics Support Branch. Historically, a strong need has existed for systematic quality assurance using methods that account for the unique nature and environment of satellite Flight Dynamics mission support. Over the past few years FDO has developed and implemented proactive quality assurance processes applied to each of the six phases of the Flight Dynamics mission support life cycle: systems and operations concept, system requirements and specifications, software development support, operations planning and training, launch support, and on-orbit mission operations. Rather than performing quality assurance as a final step after work is completed, quality assurance has been built in as work progresses in the form of process assurance. Process assurance activities occur throughout the Flight Dynamics mission support life cycle. The FDO Product Assurance Office developed process checklists for prephase process reviews, mission team orientations, in-progress reviews, and end-of-phase audits. This paper outlines the evolving history of FDO quality assurance approaches, discusses the tailoring of Computer Sciences Corporation's process assurance cycle procedures, describes some of the quality assurance approaches that have been or are being developed, and presents some of the successful results.

  1. A conceptual study of automatic and semi-automatic quality assurance techniques for ground image processing

    NASA Technical Reports Server (NTRS)

    1983-01-01

    This report summarizes the results of a study conducted by Engineering and Economics Research (EER), Inc. under NASA Contract Number NAS5-27513. The study involved the development of preliminary concepts for automatic and semiautomatic quality assurance (QA) techniques for ground image processing. A distinction is made between quality assessment and the more comprehensive quality assurance which includes decision making and system feedback control in response to quality assessment.

  2. Contamination control methods for gases used in the microlithography process

    NASA Astrophysics Data System (ADS)

    Rabellino, Larry; Applegarth, Chuck; Vergani, Giorgio

    2002-07-01

    Sensitivity to contamination continues to increase as the technology shrinks from 365 nm I-line lamp illumination to 13.4 nm Extreme Ultraviolet laser-activated plasma. Gas-borne impurities can be readily distributed within the system, remaining both suspended in the gas and attached to critical surfaces. Effects from a variety of contaminants, some well characterized and others not, remain a continuing obstacle for stepper manufacturers and users. Impurities like oxygen, moisture, and hydrocarbons at parts-per-billion levels can absorb light, reducing the light intensity and subsequently reducing the consistency of the process. Moisture, sulfur compounds, ammonia, acid compounds, and organic compounds such as hydrocarbons can deposit on lens or mirror surfaces, affecting image quality. Regular lens replacement or removal for cleaning is a costly option, and in-situ cleaning processes must be carefully managed to avoid recontamination of the system. The contamination can come from outside the controlled environment (local gas supply, piping system, and leaks) or from the materials moving into the controlled environment; or contamination may be generated inside the controlled environment as a result of the process itself. The release of amines can occur as a result of the degassing of the photoresists. For the manufacturer and user of stepper equipment, the challenge is not in predictable contamination, but in the variable or unpredictable contamination in the process. One type of unpredictable contamination may be variation in the environmental conditions when producing the nitrogen gas and Clean Dry Air (CDA). Variation in the CDA, nitrogen, and xenon may range from parts per billion to parts per million. The risk due to uncontrolled or unmonitored variation in gas quality can be directly related to product defects. Global location can significantly affect the gas quality, due to the ambient air quality (for nitrogen and CDA), production methods, gas handling equipment

  3. Contributions of CCLM to advances in quality control.

    PubMed

    Kazmierczak, Steven C

    2013-01-01

    The discipline of laboratory medicine is relatively young when considered in the context of the history of medicine itself. The history of quality control, within the context of laboratory medicine, is likewise brief but rich. Laboratory quality control continues to evolve along with advances in automation, measurement techniques, and information technology. Clinical Chemistry and Laboratory Medicine (CCLM) has played a key role in helping disseminate information about the proper use and utility of quality control. Publication of important advances in quality control techniques and dissemination of guidelines on laboratory quality control have undoubtedly helped readers of this journal keep up to date on the most recent developments in this field.

  4. Integrating Microscopic Analysis into Existing Quality Assurance Processes

    NASA Astrophysics Data System (ADS)

    Frühberger, Peter; Stephan, Thomas; Beyerer, Jürgen

    When technical goods, like mainboards and other electronic components, are produced, quality assurance (QA) is very important. To achieve this goal, different optical microscopes can be used to analyze a variety of specimens and to gain comprehensive information by combining the acquired sensor data. In many industrial processes, cameras are used to examine these technical goods. Such cameras can analyze complete boards at once and offer a high level of accuracy when used for completeness checks. When small defects, e.g. solder joints, need to be examined in detail, those wide-area cameras are limited, and microscopes with large magnification must be used to analyze the critical areas. But microscopes alone cannot fulfill this task within a limited time schedule, because microscopic analysis of complete motherboards of a certain size is time-consuming. Microscopes are limited in their depth of field and depth of focus, which is why additional components like XY moving tables must be used to examine the complete surface. Yet today's industrial production quality standards require 100% inspection of the soldered components within a given time schedule. This level of quality, while keeping inspection time low, can only be achieved by combining multiple inspection devices in an optimized manner. This paper presents results and methods for combining industrial cameras with microscopy using a classifier-based approach, with the intent of keeping already deployed QA processes in place while extending them to increase the quality level of the produced technical goods and maintain high throughput.

  5. Design and application of process control charting methodologies to gamma irradiation practices

    NASA Astrophysics Data System (ADS)

    Saylor, M. C.; Connaghan, J. P.; Yeadon, S. C.; Herring, C. M.; Jordan, T. M.

    2002-12-01

    The relationship between the contract irradiation facility and the customer has historically been based upon a "PASS/FAIL" approach, with little or no quality metrics used to gauge the control of the irradiation process. Application of process control charts, designed in coordination with mathematical simulation of routine radiation processing, can provide a basis for understanding irradiation events. By using tools that simulate the physical rules associated with the irradiation process, end-users can explore process-related boundaries and the effects of process changes. Consequently, the relationship between contractor and customer can evolve based on the derived knowledge. The resulting level of mutual understanding of the irradiation process and its control benefits both the customer and the contract operation, and provides necessary assurances to regulators. In this article we examine the complementary nature of theoretical (point kernel) and experimental (dosimetric) process evaluation, and the resulting by-product of improved understanding, communication, and control generated through the implementation of effective process control charting strategies.

  6. Statistical Process Control in the Practice of Program Evaluation.

    ERIC Educational Resources Information Center

    Posavac, Emil J.

    1995-01-01

    A technique developed to monitor the quality of manufactured products, statistical process control (SPC), incorporates several features that may prove attractive to evaluators. This paper reviews the history of SPC, suggests how the approach can enrich program evaluation, and illustrates its use in a hospital-based example. (SLD)

  7. Verification and quality control of routine hematology analyzers.

    PubMed

    Vis, J Y; Huisman, A

    2016-05-01

    Verification of hematology analyzers (automated blood cell counters) is mandatory before new hematology analyzers may be used in routine clinical care. The verification process consists of several items, among them precision, accuracy, comparability, carryover, background, and linearity throughout the expected range of results. Yet which standard should be met, or which verification limit should be used, is at the discretion of the laboratory specialist. This paper offers practical guidance on verification and quality control of automated hematology analyzers and provides an expert opinion on the performance standard that should be met by the contemporary generation of hematology analyzers. To that end, (i) the state-of-the-art performance of hematology analyzers for complete blood count parameters is summarized, (ii) considerations, challenges, and pitfalls concerning the development of a verification plan are discussed, (iii) guidance is given regarding the establishment of reference intervals, and (iv) different methods of quality control of hematology analyzers are reviewed. © 2016 John Wiley & Sons Ltd.

  8. Statistical process control analysis for patient-specific IMRT and VMAT QA.

    PubMed

    Sanghangthum, Taweap; Suriyapee, Sivalee; Srisatit, Somyot; Pawlicki, Todd

    2013-05-01

    This work applied statistical process control to establish the control limits of the % gamma pass of patient-specific intensity modulated radiotherapy (IMRT) and volumetric modulated arc therapy (VMAT) quality assurance (QA), and to evaluate the efficiency of the QA process by using the process capability index (Cpml). A total of 278 IMRT QA plans in nasopharyngeal carcinoma were measured with MapCHECK, while 159 VMAT QA plans were undertaken with ArcCHECK. Six-megavolt beams with nine fields were used for the IMRT plans, and 2.5 arcs were used to generate the VMAT plans. The gamma (3%/3 mm) criteria were used to evaluate the QA plans. The % gamma passes were plotted on a control chart, and the first 50 data points were employed to calculate the control limits. The Cpml was calculated to evaluate the capability of the IMRT/VMAT QA process. The results showed higher systematic errors in IMRT QA than in VMAT QA due to the more complicated setup used in IMRT QA. The variation of random errors was also larger in IMRT QA than in VMAT QA because the VMAT plan has more continuity of dose distribution. The average % gamma pass was 93.7% ± 3.7% for IMRT and 96.7% ± 2.2% for VMAT. The Cpml value was 1.60 for IMRT QA and 1.99 for VMAT QA, which implies that the VMAT QA process was more accurate than the IMRT QA process. Our lower control limit for the % gamma pass of IMRT is 85.0%, while the limit for VMAT is 90%. Both the IMRT and VMAT QA processes are of good quality because their Cpml values are higher than 1.0.
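    A hedged sketch of the two computations this abstract describes: estimating a lower control limit from an initial run of % gamma pass results, and computing a one-sided capability index. The Cpml formula used below, Cpml = (mean - LSL) / (3 * sqrt(variance + (mean - target)^2)), is an assumed Cpm-style lower index, not necessarily the paper's exact definition, and the data are synthetic rather than the study's measurements.

```python
# Control limit and capability sketch for % gamma pass QA results.
# Baseline data are synthetic; the Cpml definition is an assumption.
import math
import random

random.seed(1)
# Hypothetical first 50 % gamma pass results centered near 96%, capped at 100%.
baseline = [min(100.0, random.gauss(96.0, 1.5)) for _ in range(50)]

mean = sum(baseline) / len(baseline)
var = sum((x - mean) ** 2 for x in baseline) / (len(baseline) - 1)
lcl = mean - 3 * math.sqrt(var)  # simple 3-sigma lower control limit

def cpml(data, lsl, target=100.0):
    """Assumed Cpm-style one-sided (lower) capability index."""
    m = sum(data) / len(data)
    v = sum((x - m) ** 2 for x in data) / (len(data) - 1)
    return (m - lsl) / (3 * math.sqrt(v + (m - target) ** 2))

print(round(lcl, 1), round(cpml(baseline, lsl=90.0), 2))
```

New QA results falling below the lower control limit would prompt investigation, while the capability index summarizes how comfortably the process sits above the specification limit relative to its spread and its distance from the 100% target.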

  9. 7 CFR 58.733 - Quality control tests.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 3 2011-01-01 2011-01-01 false Quality control tests. 58.733 Section 58.733... Procedures § 58.733 Quality control tests. (a) Chemical analyses. The following chemical analyses shall be... pasteurization by means of the phosphatase test, as well as any other tests necessary to assure good quality...

  10. 7 CFR 58.733 - Quality control tests.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 3 2010-01-01 2010-01-01 false Quality control tests. 58.733 Section 58.733... Procedures § 58.733 Quality control tests. (a) Chemical analyses. The following chemical analyses shall be... pasteurization by means of the phosphatase test, as well as any other tests necessary to assure good quality...

  11. Establishing a Quality Control System for Stem Cell-Based Medicinal Products in China

    PubMed Central

    2015-01-01

    Stem cell-based medicinal products (SCMPs) are emerging as novel therapeutic products. The success of their development depends on the existence of an effective quality control system, which comprises quality control technologies, standards, reference materials, guidelines, and the associated management system, in accordance with regulatory requirements across the product lifespan. However, a worldwide, effective quality control system specific to SCMPs is still far from established, partially due to the limited understanding of stem cell science and the lack of quality control technologies for accurately assessing the safety and biological effectiveness of SCMPs before clinical use. Even so, based on the existing regulations and current stem cell science and technology, initial actions toward the goal of establishing such a system have been taken, as exemplified by the recent development of new "interim guidelines" governing quality control during the development of SCMPs and by new development of the associated quality control technologies in China. In this review, we first briefly introduce the major institutions involved in the regulation of cell substrates and therapeutic cell products in China and the existing regulatory documents and technical guidelines used as critical references for developing the new interim guidelines. Focusing only on nonhematopoietic stem cells, we then discuss the principal quality attributes of SCMPs as well as our thinking on proper testing approaches to be established, with relevant evaluation technologies, to ensure all quality requirements of SCMPs along different manufacturing processes and development stages. At the end, some regulatory and technical challenges are also discussed, with the conclusion that combined efforts should be taken to promote stem cell regulatory science and establish an effective quality control system for SCMPs. PMID:25471126

  12. Quality-control design for surface-water sampling in the National Water-Quality Network

    USGS Publications Warehouse

    Riskin, Melissa L.; Reutter, David C.; Martin, Jeffrey D.; Mueller, David K.

    2018-04-10

    The data-quality objectives for samples collected at surface-water sites in the National Water-Quality Network include estimating the extent to which contamination, matrix effects, and measurement variability affect interpretation of environmental conditions. Quality-control samples provide insight into how well the samples collected at surface-water sites represent the true environmental conditions. Quality-control samples used in this program include field blanks, replicates, and field matrix spikes. This report describes the design for collection of these quality-control samples and the data management needed to properly identify these samples in the U.S. Geological Survey’s national database.

  13. Advanced strategies for quality control of Chinese medicines.

    PubMed

    Zhao, Jing; Ma, Shuang-Cheng; Li, Shao-Ping

    2018-01-05

    Quality control is always the critical issue for Chinese medicines (CMs), with their worldwide increasing use. Unlike western medicines, CMs are usually considered to owe their therapeutic effects to multiple constituents. Therefore, quality control of CMs is a challenge. In 2011, the strategies for quantification in quality control of CMs, related to markers, reference compounds, and approaches, were reviewed (Li, et al., J. Pharm. Biomed. Anal., 2011, 55, 802-809). Since then, some new strategies have been proposed in these fields; the review of strategies for quality control of CMs should therefore be updated to improve the safety and efficacy of CMs. Herein, novel strategies related to quality marker discovery, reference compound development, and advanced approaches (focused on glyco-analysis) for quality control during 2011-2016 are summarized and discussed. Copyright © 2017 Elsevier B.V. All rights reserved.

  14. 30 CFR 28.30 - Quality control plans; filing requirements.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 30 Mineral Resources 1 2011-07-01 2011-07-01 false Quality control plans; filing requirements. 28... PROTECTION FOR TRAILING CABLES IN COAL MINES Quality Control § 28.30 Quality control plans; filing... part, each applicant shall file with MSHA a proposed quality control plan which shall be designed to...

  15. 30 CFR 28.30 - Quality control plans; filing requirements.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Quality control plans; filing requirements. 28... PROTECTION FOR TRAILING CABLES IN COAL MINES Quality Control § 28.30 Quality control plans; filing... part, each applicant shall file with MSHA a proposed quality control plan which shall be designed to...

  16. 30 CFR 28.30 - Quality control plans; filing requirements.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... PROTECTION FOR TRAILING CABLES IN COAL MINES Quality Control § 28.30 Quality control plans; filing... 30 Mineral Resources 1 2013-07-01 2013-07-01 false Quality control plans; filing requirements. 28... part, each applicant shall file with MSHA a proposed quality control plan which shall be designed to...

  17. 30 CFR 28.30 - Quality control plans; filing requirements.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... PROTECTION FOR TRAILING CABLES IN COAL MINES Quality Control § 28.30 Quality control plans; filing... 30 Mineral Resources 1 2012-07-01 2012-07-01 false Quality control plans; filing requirements. 28... part, each applicant shall file with MSHA a proposed quality control plan which shall be designed to...

  18. 30 CFR 28.30 - Quality control plans; filing requirements.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... PROTECTION FOR TRAILING CABLES IN COAL MINES Quality Control § 28.30 Quality control plans; filing... 30 Mineral Resources 1 2014-07-01 2014-07-01 false Quality control plans; filing requirements. 28... part, each applicant shall file with MSHA a proposed quality control plan which shall be designed to...

  19. Statistical process control: a practical application for hospitals.

    PubMed

    VanderVeen, L M

    1992-01-01

    A six-step plan based on using statistics was designed to improve quality in the central processing and distribution department of a 223-bed hospital in Oakland, CA. This article describes how the plan was implemented sequentially, starting with the crucial first step of obtaining administrative support. The QI project succeeded in overcoming beginners' fear of statistics and in training both managers and staff to use inspection checklists, Pareto charts, cause-and-effect diagrams, and control charts. The best outcome of the program was the increased commitment to quality improvement by the members of the department.

  20. 21 CFR 864.8625 - Hematology quality control mixture.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Hematology quality control mixture. 864.8625 Section 864.8625 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES... quality control mixture. (a) Identification. A hematology quality control mixture is a device used to...

  1. Cellular Strategies of Protein Quality Control

    PubMed Central

    Chen, Bryan; Retzlaff, Marco; Roos, Thomas; Frydman, Judith

    2011-01-01

    Eukaryotic cells must contend with a continuous stream of misfolded proteins that compromise the cellular protein homeostasis balance and jeopardize cell viability. An elaborate network of molecular chaperones and protein degradation factors continually monitor and maintain the integrity of the proteome. Cellular protein quality control relies on three distinct yet interconnected strategies whereby misfolded proteins can either be refolded, degraded, or delivered to distinct quality control compartments that sequester potentially harmful misfolded species. Molecular chaperones play a critical role in determining the fate of misfolded proteins in the cell. Here, we discuss the spatial and temporal organization of cellular quality control strategies and their implications for human diseases linked to protein misfolding and aggregation. PMID:21746797

  2. 21 CFR 211.22 - Responsibilities of quality control unit.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 4 2011-04-01 2011-04-01 false Responsibilities of quality control unit. 211.22... Personnel § 211.22 Responsibilities of quality control unit. (a) There shall be a quality control unit that... have been fully investigated. The quality control unit shall be responsible for approving or rejecting...

  3. 21 CFR 211.22 - Responsibilities of quality control unit.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 4 2010-04-01 2010-04-01 false Responsibilities of quality control unit. 211.22... Personnel § 211.22 Responsibilities of quality control unit. (a) There shall be a quality control unit that... have been fully investigated. The quality control unit shall be responsible for approving or rejecting...

  4. 40 CFR 81.112 - Charleston Intrastate Air Quality Control Region.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Quality Control Regions § 81.112 Charleston Intrastate Air Quality Control Region. The Charleston Intrastate Air Quality Control Region (South Carolina) consists of the territorial area encompassed by the... Quality Control Region: Region 1. 81.107Greenwood Intrastate Air Quality Control Region: Region 2. 81...

  5. 40 CFR 81.112 - Charleston Intrastate Air Quality Control Region.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Quality Control Regions § 81.112 Charleston Intrastate Air Quality Control Region. The Charleston Intrastate Air Quality Control Region (South Carolina) consists of the territorial area encompassed by the... Quality Control Region: Region 1. 81.107Greenwood Intrastate Air Quality Control Region: Region 2. 81...

  6. 7 CFR 275.21 - Quality control review reports.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 4 2011-01-01 2011-01-01 false Quality control review reports. 275.21 Section 275.21... Reporting on Program Performance § 275.21 Quality control review reports. (a) General. Each State agency shall submit reports on the performance of quality control reviews in accordance with the requirements...

  7. 40 CFR 81.88 - Billings Intrastate Air Quality Control Region.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Quality Control Regions § 81.88 Billings Intrastate Air Quality Control Region. The Metropolitan Billings Intrastate Air Quality Control Region (Montana) has been renamed the Billings Intrastate Air Quality Control... to by Montana authorities as follows: Sec. 481.168Great Falls Intrastate Air Quality Control Region...

  8. 40 CFR 81.88 - Billings Intrastate Air Quality Control Region.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Quality Control Regions § 81.88 Billings Intrastate Air Quality Control Region. The Metropolitan Billings Intrastate Air Quality Control Region (Montana) has been renamed the Billings Intrastate Air Quality Control... to by Montana authorities as follows: Sec. 481.168Great Falls Intrastate Air Quality Control Region...

  9. Application of machine learning and expert systems to Statistical Process Control (SPC) chart interpretation

    NASA Technical Reports Server (NTRS)

    Shewhart, Mark

    1991-01-01

    Statistical Process Control (SPC) charts are one of several tools used in quality control. Other tools include flow charts, histograms, cause and effect diagrams, check sheets, Pareto diagrams, graphs, and scatter diagrams. A control chart is simply a graph which indicates process variation over time. The purpose of drawing a control chart is to detect any changes in the process signalled by abnormal points or patterns on the graph. The Artificial Intelligence Support Center (AISC) of the Acquisition Logistics Division has developed a hybrid machine learning expert system prototype which automates the process of constructing and interpreting control charts.
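The control chart described above, which flags abnormal points, can be sketched generically. The classic Shewhart individuals-chart rule below flags any observation more than three standard deviations from the process mean; this is a minimal illustration, not the AISC hybrid machine learning system itself, and the function names and data are hypothetical:

```python
# Minimal sketch of Shewhart control-chart logic: flag points outside
# mean +/- 3 sigma. Illustrative only; not the AISC prototype.

def control_limits(samples):
    """Return (mean, lower, upper) 3-sigma control limits for a sequence."""
    n = len(samples)
    mean = sum(samples) / n
    variance = sum((x - mean) ** 2 for x in samples) / n
    sigma = variance ** 0.5
    return mean, mean - 3 * sigma, mean + 3 * sigma

def out_of_control(samples):
    """Indices of points falling outside the 3-sigma control limits."""
    _, lo, hi = control_limits(samples)
    return [i for i, x in enumerate(samples) if x < lo or x > hi]
```

A pattern-recognition system such as the one in the abstract would add further rules (runs, trends, cycles) on top of this single-point test.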

  10. Quality-Assurance/Quality-Control Manual for Collection and Analysis of Water-Quality Data in the Ohio District, US Geological Survey

    USGS Publications Warehouse

    Francy, D.S.; Jones, A.L.; Myers, Donna N.; Rowe, G.L.; Eberle, Michael; Sarver, K.M.

    1998-01-01

The U.S. Geological Survey (USGS), Water Resources Division (WRD), requires that quality-assurance/quality-control (QA/QC) activities be included in any sampling and analysis program. Operational QA/QC procedures address local needs while incorporating national policies. Therefore, specific technical policies were established for all activities associated with water-quality projects being done by the Ohio District. The policies described in this report provide Ohio District personnel, cooperating agencies, and others with a reference manual on QA/QC procedures that are followed in collecting and analyzing water-quality samples and reporting water-quality information in the Ohio District. The project chief, project support staff, District Water-Quality Specialist, and District Laboratory Coordinator are all involved in planning and implementing QA/QC activities at the district level. The District Chief and other district-level managers provide oversight, and the Regional Water-Quality Specialist, Office of Water Quality (USGS headquarters), and the Branch of Quality Systems within the Office of Water Quality create national QA/QC policies and provide assistance to District personnel. In the literature, the quality of all measurement data is expressed in terms of precision, variability, bias, accuracy, completeness, representativeness, and comparability. In the Ohio District, bias and variability will be used to describe quality-control data generated from samples in the field and laboratory. Each project chief must plan for implementation and financing of QA/QC activities necessary to achieve data-quality objectives. At least 15 percent of the total project effort must be directed toward QA/QC activities. Of this total, 5-10 percent will be used for collection and analysis of quality-control samples. This is an absolute minimum, and more may be required based on project objectives. Proper techniques must be followed in the collection and processing of surface

  11. Synthesis of high-quality libraries of long (150mer) oligonucleotides by a novel depurination controlled process

    PubMed Central

    LeProust, Emily M.; Peck, Bill J.; Spirin, Konstantin; McCuen, Heather Brummel; Moore, Bridget; Namsaraev, Eugeni; Caruthers, Marvin H.

    2010-01-01

    We have achieved the ability to synthesize thousands of unique, long oligonucleotides (150mers) in fmol amounts using parallel synthesis of DNA on microarrays. The sequence accuracy of the oligonucleotides in such large-scale syntheses has been limited by the yields and side reactions of the DNA synthesis process used. While there has been significant demand for libraries of long oligos (150mer and more), the yields in conventional DNA synthesis and the associated side reactions have previously limited the availability of oligonucleotide pools to lengths <100 nt. Using novel array based depurination assays, we show that the depurination side reaction is the limiting factor for the synthesis of libraries of long oligonucleotides on Agilent Technologies’ SurePrint® DNA microarray platform. We also demonstrate how depurination can be controlled and reduced by a novel detritylation process to enable the synthesis of high quality, long (150mer) oligonucleotide libraries and we report the characterization of synthesis efficiency for such libraries. Oligonucleotide libraries prepared with this method have changed the economics and availability of several existing applications (e.g. targeted resequencing, preparation of shRNA libraries, site-directed mutagenesis), and have the potential to enable even more novel applications (e.g. high-complexity synthetic biology). PMID:20308161

  12. Quality evaluation and control of end cap welds in PHWR fuel elements by ultrasonic examination

    NASA Astrophysics Data System (ADS)

    Choi, M. S.; Yang, M. S.

    1991-02-01

The current quality control procedure for nuclear fuel end cap welds is mainly dependent on destructive metallographic examination. A nondestructive examination technique, i.e., ultrasonic examination, has been developed to identify and evaluate weld discontinuities. A few interesting results of the weld quality evaluation by applying the developed ultrasonic examination technique to PHWR fuel welds are presented. In addition, the feasibility of weld quality control by ultrasonic examination is discussed. This study shows that ultrasonic examination is an effective and reliable method for detecting abnormal weld contours and weld discontinuities such as micro-fissures, cracks, upset splits, and expulsion, and can be used as a quality control tool for the end cap welding process.

  13. 7 CFR 58.141 - Alternate quality control program.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 3 2011-01-01 2011-01-01 false Alternate quality control program. 58.141 Section 58... Service 1 Quality Specifications for Raw Milk § 58.141 Alternate quality control program. When a plant has in operation an acceptable quality program, at the producer level, which is approved by the...

  14. 21 CFR 111.105 - What must quality control personnel do?

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 2 2011-04-01 2011-04-01 false What must quality control personnel do? 111.105... for Quality Control § 111.105 What must quality control personnel do? Quality control personnel must... manufacturing record. To do so, quality control personnel must perform operations that include: (a) Approving or...

  15. Theoretical approach to society-wide environmental quality control

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ayano, K.

    1982-01-01

The study outlines the basis for a theory of societal control of environmental quality in the US based on the concepts and philosophy of company-wide quality control which has developed in Japan as a cross-disciplinary approach to problem-solving in the industrial realm. The basic concepts are: 1) every member of society, as a producer of environmental products and services for future generations, in principle has the responsibility to control the quality of his output; 2) environment quality is the quality of life, or the fitness of use of environment for humans; and 3) societal control is any activity necessary for quality production of environmental products and services continuously or in the long run. A motivator-hygiene theory of environmental quality is identified, and a proposal is made that the policy provision must be formulated differently between those aimed at hygiene factors of environmental quality and those aimed at motivators, the former in a collectivistic manner, the latter as an individual problem. The concept of societal cost of environmental quality is introduced. Based on the motivator-hygiene theory of environmental quality, the collectivistic and individual approaches are differentiated and discussed.

  16. Implementation of statistical process control for proteomic experiments via LC MS/MS.

    PubMed

    Bereman, Michael S; Johnson, Richard; Bollinger, James; Boss, Yuval; Shulman, Nick; MacLean, Brendan; Hoofnagle, Andrew N; MacCoss, Michael J

    2014-04-01

Statistical process control (SPC) is a robust set of tools that aids in the visualization, detection, and identification of assignable causes of variation in any process that creates products, services, or information. A tool termed Statistical Process Control in Proteomics (SProCoP) has been developed that implements aspects of SPC (e.g., control charts and Pareto analysis) in the Skyline proteomics software. It monitors five quality control metrics in a shotgun or targeted proteomic workflow. None of these metrics require peptide identification. The source code, written in the R statistical language, runs directly from the Skyline interface, which supports the use of raw data files from several of the mass spectrometry vendors. It provides real-time evaluation of the chromatographic performance (e.g., retention time reproducibility, peak asymmetry, and resolution) and mass spectrometric performance (targeted peptide ion intensity and mass measurement accuracy for high-resolving-power instruments) via control charts. Thresholds are experiment- and instrument-specific and are determined empirically from user-defined quality control standards that enable the separation of random noise and systematic error. Finally, Pareto analysis provides a summary of performance metrics and guides the user to metrics with high variance. The utility of these charts to evaluate proteomic experiments is illustrated in two case studies.
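The Pareto analysis element described above can be illustrated generically: rank metrics by their variance and report each metric's cumulative share of the total, so the user sees which few metrics dominate. The sketch below is a minimal illustration with hypothetical metric names, not SProCoP's actual R implementation or metric set:

```python
# Minimal Pareto-analysis sketch: rank QC metrics by variance and
# attach each metric's cumulative fraction of total variance.
# Metric names are hypothetical, not SProCoP's.

def pareto_rank(metric_variances):
    """Sort {metric: variance} descending; return (name, variance, cumulative_share) tuples."""
    total = sum(metric_variances.values())
    ranked = sorted(metric_variances.items(), key=lambda kv: kv[1], reverse=True)
    out, cumulative = [], 0.0
    for name, variance in ranked:
        cumulative += variance
        out.append((name, variance, cumulative / total))
    return out
```

With this ranking, attention goes first to the metrics at the top of the list, which account for most of the observed variability.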

  17. Some Inspection Methods for Quality Control and In-service Inspection of GLARE

    NASA Astrophysics Data System (ADS)

    Sinke, J.

    2003-07-01

Quality control of materials and structures is an important issue, including for GLARE. During the manufacturing stage, the processes and materials should be monitored and checked frequently in order to obtain a qualified product. During the operation of the aircraft, frequent monitoring and inspections are performed to maintain the quality at a prescribed level. Therefore, in-service inspection methods are applied and, when necessary, repair activities are conducted. For the quality control of GLARE panels and components during manufacturing, the C-scan method proves to be an effective tool. For in-service inspection, the Eddy Current Method is one of the suitable options. In this paper a brief overview is presented of both methods and their application to GLARE products.

  18. Quality changes of pomegranate arils throughout shelf life affected by deficit irrigation and pre-processing storage.

    PubMed

    Peña-Estévez, María E; Artés-Hernández, Francisco; Artés, Francisco; Aguayo, Encarna; Martínez-Hernández, Ginés Benito; Galindo, Alejandro; Gómez, Perla A

    2016-10-15

This study investigated the influence of sustained deficit irrigation (SDI, 78% less water supply than the reference evapotranspiration, ET0) compared to a control (100% ET0) on the physicochemical and sensory qualities and health-promoting compounds of pomegranate arils stored for 14 days at 5°C. Prior to processing, the fruits were stored for 0, 30, 60 or 90 days at 5°C. The effect of the pre-processing storage duration was also examined. Physicochemical and sensory qualities were maintained during the storage period. Arils from SDI fruit had lower punicalagin-α and ellagic acid losses than the control (13% vs 50%). However, the anthocyanin content decreased during the shelf life (72%) regardless of the treatment. Ascorbic acid decreased slightly. Arils from SDI showed a smaller loss in the glucose/fructose ratio (19%) than the control (35%). In general, arils from SDI showed better quality and health attributes during the shelf life than did the control samples. Copyright © 2016 Elsevier Ltd. All rights reserved.

  19. Effects of wireless packet loss in industrial process control systems.

    PubMed

    Liu, Yongkang; Candell, Richard; Moayeri, Nader

    2017-05-01

    Timely and reliable sensing and actuation control are essential in networked control. This depends on not only the precision/quality of the sensors and actuators used but also on how well the communications links between the field instruments and the controller have been designed. Wireless networking offers simple deployment, reconfigurability, scalability, and reduced operational expenditure, and is easier to upgrade than wired solutions. However, the adoption of wireless networking has been slow in industrial process control due to the stochastic and less than 100% reliable nature of wireless communications and lack of a model to evaluate the effects of such communications imperfections on the overall control performance. In this paper, we study how control performance is affected by wireless link quality, which in turn is adversely affected by severe propagation loss in harsh industrial environments, co-channel interference, and unintended interference from other devices. We select the Tennessee Eastman Challenge Model (TE) for our study. A decentralized process control system, first proposed by N. Ricker, is adopted that employs 41 sensors and 12 actuators to manage the production process in the TE plant. We consider the scenario where wireless links are used to periodically transmit essential sensor measurement data, such as pressure, temperature and chemical composition to the controller as well as control commands to manipulate the actuators according to predetermined setpoints. We consider two models for packet loss in the wireless links, namely, an independent and identically distributed (IID) packet loss model and the two-state Gilbert-Elliot (GE) channel model. While the former is a random loss model, the latter can model bursty losses. With each channel model, the performance of the simulated decentralized controller using wireless links is compared with the one using wired links providing instant and 100% reliable communications. 
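The two-state Gilbert-Elliot model mentioned in the abstract can be sketched as a simple Markov chain that alternates between a Good state with a low loss probability and a Bad state with a high one, which produces the bursty losses the authors contrast with the IID model. The transition and loss probabilities below are illustrative assumptions, not parameters from the paper:

```python
import random

# Sketch of a two-state Gilbert-Elliot packet-loss channel.
# All probabilities are illustrative defaults, not values from the study.

def simulate_ge_losses(n, p_good_to_bad=0.05, p_bad_to_good=0.3,
                       loss_good=0.01, loss_bad=0.5, seed=0):
    """Return a list of n booleans: True where the packet is lost."""
    rng = random.Random(seed)
    state_bad = False
    losses = []
    for _ in range(n):
        # Markov state transition between Good and Bad channel states
        if state_bad:
            if rng.random() < p_bad_to_good:
                state_bad = False
        elif rng.random() < p_good_to_bad:
            state_bad = True
        # Bernoulli loss draw with the current state's loss probability
        p_loss = loss_bad if state_bad else loss_good
        losses.append(rng.random() < p_loss)
    return losses
```

An IID loss model is the degenerate case where both states share one loss probability; comparing controller performance under the two traces is the kind of experiment the abstract describes.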

  20. Effects of Wireless Packet Loss in Industrial Process Control Systems

    PubMed Central

    Liu, Yongkang; Candell, Richard; Moayeri, Nader

    2017-01-01

    Timely and reliable sensing and actuation control are essential in networked control. This depends on not only the precision/quality of the sensors and actuators used but also on how well the communications links between the field instruments and the controller have been designed. Wireless networking offers simple deployment, reconfigurability, scalability, and reduced operational expenditure, and is easier to upgrade than wired solutions. However, the adoption of wireless networking has been slow in industrial process control due to the stochastic and less than 100 % reliable nature of wireless communications and lack of a model to evaluate the effects of such communications imperfections on the overall control performance. In this paper, we study how control performance is affected by wireless link quality, which in turn is adversely affected by severe propagation loss in harsh industrial environments, co-channel interference, and unintended interference from other devices. We select the Tennessee Eastman Challenge Model (TE) for our study. A decentralized process control system, first proposed by N. Ricker, is adopted that employs 41 sensors and 12 actuators to manage the production process in the TE plant. We consider the scenario where wireless links are used to periodically transmit essential sensor measurement data, such as pressure, temperature and chemical composition to the controller as well as control commands to manipulate the actuators according to predetermined setpoints. We consider two models for packet loss in the wireless links, namely, an independent and identically distributed (IID) packet loss model and the two-state Gilbert-Elliot (GE) channel model. While the former is a random loss model, the latter can model bursty losses. With each channel model, the performance of the simulated decentralized controller using wireless links is compared with the one using wired links providing instant and 100 % reliable communications. 

  1. Real-time assessment of critical quality attributes of a continuous granulation process.

    PubMed

    Fonteyne, Margot; Vercruysse, Jurgen; Díaz, Damián Córdoba; Gildemyn, Delphine; Vervaet, Chris; Remon, Jean Paul; De Beer, Thomas

    2013-02-01

There is an intention to shift pharmaceutical manufacturing of solid dosage forms from traditional batch production towards continuous production. The currently applied conventional quality control systems, based on sampling and time-consuming off-line analyses in analytical laboratories, would annul the advantages of continuous processing. It is clear that real-time quality assessment and control is indispensable for continuous production. This manuscript evaluates strengths and weaknesses of several complementary Process Analytical Technology (PAT) tools implemented in a continuous wet granulation process, which is part of a fully continuous from-powder-to-tablet production line. The use of Raman and NIR spectroscopy and a particle size distribution analyzer is evaluated for the real-time monitoring of critical parameters during the continuous wet agglomeration of an anhydrous theophylline-lactose blend. The solid state characteristics and particle size of the granules were analyzed in real time, and the critical process parameters influencing these granule characteristics were identified. The temperature of the granulator barrel, the amount of granulation liquid added and, to a lesser extent, the powder feed rate were the parameters influencing the solid state of the active pharmaceutical ingredient (API). A higher barrel temperature and a higher powder feed rate resulted in larger granules.

  2. Quality Control in Higher Education.

    ERIC Educational Resources Information Center

    Hogarth, Charles P.

    The status of quality control in U.S. higher education is discussed with an overview of the functions and structure of public and private colleges and universities. The book is divided into seven chapters: (1) outside controls (accrediting groups, governmental groups and other groups); (2) structure (board of control, president, organization); (3)…

  3. A Model of Risk Analysis in Analytical Methodology for Biopharmaceutical Quality Control.

    PubMed

    Andrade, Cleyton Lage; Herrera, Miguel Angel De La O; Lemes, Elezer Monte Blanco

    2018-01-01

One key quality control parameter for biopharmaceutical products is the analysis of residual cellular DNA. To determine small amounts of DNA (around 100 pg) that may be in a biologically derived drug substance, an analytical method should be sensitive, robust, reliable, and accurate. In principle, three techniques have the ability to measure residual cellular DNA: radioactive dot-blot, a type of hybridization; threshold analysis; and quantitative polymerase chain reaction. Quality risk management is a systematic process for evaluating, controlling, and reporting risks that may affect method capabilities, and it supports a scientific and practical approach to decision making. This paper evaluates, by quality risk management, an alternative approach to assessing the performance risks associated with quality control methods used with biopharmaceuticals, using the hazard analysis and critical control points tool. This tool provides the possibility to find the steps in an analytical procedure with higher impact on method performance. By applying these principles to DNA analysis methods, we conclude that the radioactive dot-blot assay has the largest number of critical control points, followed by quantitative polymerase chain reaction, and threshold analysis. From the analysis of hazards (i.e., points of method failure) and the associated method procedure critical control points, we conclude that the analytical methodology with the lowest risk for performance failure for residual cellular DNA testing is quantitative polymerase chain reaction. LAY ABSTRACT: In order to mitigate the risk of adverse events by residual cellular DNA that is not completely cleared from downstream production processes, regulatory agencies have required the industry to guarantee a very low level of DNA in biologically derived pharmaceutical products. The technique historically used was radioactive blot hybridization. However, the technique is a challenging method to implement in a quality

  4. Implementation of Quality Assurance and Quality Control Measures in the National Phenology Database

    NASA Astrophysics Data System (ADS)

    Gerst, K.; Rosemartin, A.; Denny, E. G.; Marsh, L.; Barnett, L.

    2015-12-01

    The USA National Phenology Network (USA-NPN; www.usanpn.org) serves science and society by promoting a broad understanding of plant and animal phenology and the relationships among phenological patterns and environmental change. The National Phenology Database has over 5.5 million observation records for plants and animals for the period 1954-2015. These data have been used in a number of science, conservation and resource management applications, including national assessments of historical and potential future trends in phenology, regional assessments of spatio-temporal variation in organismal activity, and local monitoring for invasive species detection. Customizable data downloads are freely available, and data are accompanied by FGDC-compliant metadata, data-use and data-attribution policies, and vetted documented methodologies and protocols. The USA-NPN has implemented a number of measures to ensure both quality assurance and quality control. Here we describe the resources that have been developed so that incoming data submitted by both citizen and professional scientists are reliable; these include training materials, such as a botanical primer and species profiles. We also describe a number of automated quality control processes applied to incoming data streams to optimize data output quality. 
Existing and planned quality control measures for output of raw and derived data include: (1) Validation of site locations, including latitude, longitude, and elevation; (2) Flagging of records that conflict for a given date for an individual plant; (3) Flagging where species occur outside known ranges; (4) Flagging of records when phenophases occur outside of the plausible order for a species; (5) Flagging of records when intensity measures do not follow a plausible progression for a phenophase; (6) Flagging of records when a phenophase occurs outside of the plausible season, and (7) Quantification of precision and uncertainty for estimation of phenological metrics
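Flagging rules like items (1)-(6) above amount to simple range and plausibility checks applied to each incoming record. The sketch below illustrates a few of them; the field names, schema, and ranges are hypothetical, not the actual National Phenology Database implementation:

```python
# Sketch of automated QC flagging for observation records, in the spirit
# of the measures listed above. Field names and ranges are hypothetical.

def qc_flags(record):
    """Return a list of flag strings for a single observation record (dict)."""
    flags = []
    lat, lon = record.get("latitude"), record.get("longitude")
    # (1) Validate site location coordinates
    if lat is None or not -90 <= lat <= 90:
        flags.append("bad_latitude")
    if lon is None or not -180 <= lon <= 180:
        flags.append("bad_longitude")
    # Validate the observation date
    doy = record.get("day_of_year")
    if doy is not None and not 1 <= doy <= 366:
        flags.append("bad_day_of_year")
    # (6) Flag phenophases observed outside a plausible season window
    season = record.get("plausible_season")  # (start_doy, end_doy), if known
    if season and doy is not None and not season[0] <= doy <= season[1]:
        flags.append("outside_plausible_season")
    return flags
```

Running every incoming record through checks like these, and storing the resulting flags alongside the data, lets downstream users filter or weight observations by quality.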

  5. Sensors for process control Focus Team report

    NASA Astrophysics Data System (ADS)

    At the Semiconductor Technology Workshop, held in November 1992, the Semiconductor Industry Association (SIA) convened 179 semiconductor technology experts to assess the 15-year outlook for the semiconductor manufacturing industry. The output of the Workshop, a document entitled 'Semiconductor Technology: Workshop Working Group Reports,' contained an overall roadmap for the technology characteristics envisioned in integrated circuits (IC's) for the period 1992-2007. In addition, the document contained individual roadmaps for numerous key areas in IC manufacturing, such as film deposition, thermal processing, manufacturing systems, exposure technology, etc. The SIA Report did not contain a separate roadmap for contamination free manufacturing (CFM). A key component of CFM for the next 15 years is the use of sensors for (1) defect reduction, (2) improved product quality, (3) improved yield, (4) improved tool utilization through contamination reduction, and (5) real time process control in semiconductor fabrication. The objective of this Focus Team is to generate a Sensors for Process Control Roadmap. Implicit in this objective is the identification of gaps in current sensor technology so that research and development activity in the sensor industry can be stimulated to develop sensor systems capable of meeting the projected roadmap needs. Sensor performance features of interest include detection limit, specificity, sensitivity, ease of installation and maintenance, range, response time, accuracy, precision, ease and frequency of calibration, degree of automation, and adaptability to in-line process control applications.

  6. Application of Advanced Process Control techniques to a pusher type reheating furnace

    NASA Astrophysics Data System (ADS)

    Zanoli, S. M.; Pepe, C.; Barboni, L.

    2015-11-01

    In this paper an Advanced Process Control system aimed at controlling and optimizing a pusher type reheating furnace located in an Italian steel plant is proposed. The designed controller replaced the previous control system, based on PID controllers manually conducted by process operators. A two-layer Model Predictive Control architecture has been adopted that, exploiting a chemical, physical and economic modelling of the process, overcomes the limitations of plant operators’ mental model and knowledge. In addition, an ad hoc decoupling strategy has been implemented, allowing the selection of the manipulated variables to be used for the control of each single process variable. Finally, in order to improve the system flexibility and resilience, the controller has been equipped with a supervision module. A profitable trade-off between conflicting specifications, e.g. safety, quality and production constraints, energy saving and pollution impact, has been guaranteed. Simulation tests and real plant results demonstrated the soundness and the reliability of the proposed system.

  7. Quality Control Technician.

    ERIC Educational Resources Information Center

    Ohio State Univ., Columbus. Center on Education and Training for Employment.

    This document contains 18 units to consider for use in a tech prep competency profile for the occupation of quality control technician. All the units listed will not necessarily apply to every situation or tech prep consortium, nor will all the competencies within each unit be appropriate. Several units appear within each specific occupation and…

  8. 40 CFR 81.77 - Puerto Rico Air Quality Control Region.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 17 2011-07-01 2011-07-01 false Puerto Rico Air Quality Control Region... PROGRAMS (CONTINUED) DESIGNATION OF AREAS FOR AIR QUALITY PLANNING PURPOSES Designation of Air Quality Control Regions § 81.77 Puerto Rico Air Quality Control Region. The Puerto Rico Air Quality Control Region...

  9. 40 CFR 81.77 - Puerto Rico Air Quality Control Region.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 17 2010-07-01 2010-07-01 false Puerto Rico Air Quality Control Region... PROGRAMS (CONTINUED) DESIGNATION OF AREAS FOR AIR QUALITY PLANNING PURPOSES Designation of Air Quality Control Regions § 81.77 Puerto Rico Air Quality Control Region. The Puerto Rico Air Quality Control Region...

  10. 40 CFR 81.77 - Puerto Rico Air Quality Control Region.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 18 2012-07-01 2012-07-01 false Puerto Rico Air Quality Control Region... PROGRAMS (CONTINUED) DESIGNATION OF AREAS FOR AIR QUALITY PLANNING PURPOSES Designation of Air Quality Control Regions § 81.77 Puerto Rico Air Quality Control Region. The Puerto Rico Air Quality Control Region...

  11. 40 CFR 81.77 - Puerto Rico Air Quality Control Region.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 18 2013-07-01 2013-07-01 false Puerto Rico Air Quality Control Region... PROGRAMS (CONTINUED) DESIGNATION OF AREAS FOR AIR QUALITY PLANNING PURPOSES Designation of Air Quality Control Regions § 81.77 Puerto Rico Air Quality Control Region. The Puerto Rico Air Quality Control Region...

  12. [Quality Management and Quality Specifications of Laboratory Tests in Clinical Studies--Challenges in Pre-Analytical Processes in Clinical Laboratories].

    PubMed

    Ishibashi, Midori

    2015-01-01

    The cost, speed, and quality are the three important factors recently indicated by the Ministry of Health, Labour and Welfare (MHLW) for the purpose of accelerating clinical studies. Based on this background, the importance of laboratory tests is increasing, especially in the evaluation of clinical study participants' entry and safety, and drug efficacy. To assure the quality of laboratory tests, providing high-quality laboratory tests is mandatory. For providing adequate quality assurance in laboratory tests, quality control in the three fields of pre-analytical, analytical, and post-analytical processes is extremely important. There are, however, no detailed written requirements concerning specimen collection, handling, preparation, storage, and shipping. Most laboratory tests for clinical studies are performed onsite in a local laboratory; however, a part of laboratory tests is done in offsite central laboratories after specimen shipping. As factors affecting laboratory tests, individual and inter-individual variations are well-known. Besides these factors, standardizing the factors of specimen collection, handling, preparation, storage, and shipping, may improve and maintain the high quality of clinical studies in general. Furthermore, the analytical method, units, and reference interval are also important factors. It is concluded that, to overcome the problems derived from pre-analytical processes, it is necessary to standardize specimen handling in a broad sense.

  13. Quality control for quantitative PCR based on amplification compatibility test.

    PubMed

    Tichopad, Ales; Bar, Tzachi; Pecen, Ladislav; Kitchen, Robert R; Kubista, Mikael; Pfaffl, Michael W

    2010-04-01

    Quantitative PCR (qPCR) is a routinely used method for the accurate quantification of nucleic acids. Yet it may generate erroneous results if the amplification process is obscured by inhibition or by the generation of aberrant side-products such as primer dimers. Several methods have been established to control for pre-processing performance that rely on the introduction of a co-amplified reference sequence; however, there is currently no method that allows reliable control of the amplification process without directly modifying the sample mix. Herein we present a statistical approach based on multivariate analysis of the amplification response data generated in real time. The amplification trajectory in its most resolved and dynamic phase is fitted with a suitable model. Two parameters of this model, related to amplification efficiency, are then used to calculate Z-score statistics. Each studied sample is compared to a predefined reference set of reactions, typically calibration reactions. A probabilistic decision for each individual Z-score is then used to identify the majority of inhibited reactions in our experiments. We compare this approach to univariate methods using only the sample-specific amplification efficiency as a reporter of compatibility, and demonstrate improved identification performance with the multivariate approach. Finally, we stress that the performance of the amplification compatibility test as a quality control procedure depends on the quality of the reference set. Copyright 2010 Elsevier Inc. All rights reserved.
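    The Z-score screening step can be sketched as follows, assuming each reaction has already been reduced to a small vector of fitted model parameters (the paper's curve model and parameter choices are not reproduced here): each parameter is standardized against the calibration reactions, and a reaction whose scores exceed a chosen limit is flagged as potentially inhibited.

```python
import math

def z_scores(sample, reference):
    """Z-score of each fitted model parameter of `sample` against a
    reference set of calibration reactions.

    `reference` is a list of parameter tuples, e.g. (slope, curvature)
    of the fitted amplification model; names are illustrative."""
    n = len(reference)
    means = [sum(r[i] for r in reference) / n for i in range(len(sample))]
    sds = [math.sqrt(sum((r[i] - means[i]) ** 2 for r in reference) / (n - 1))
           for i in range(len(sample))]
    return [(sample[i] - means[i]) / sds[i] for i in range(len(sample))]

def is_inhibited(sample, reference, limit=3.0):
    """Flag a reaction whose parameters deviate beyond `limit` SDs
    from the calibration reactions."""
    return any(abs(z) > limit for z in z_scores(sample, reference))
```

As the abstract stresses, the usefulness of the flag depends entirely on how representative the reference set is.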

  14. Quality Control Pathways for Nucleus-Encoded Eukaryotic tRNA Biosynthesis and Subcellular Trafficking

    PubMed Central

    Huang, Hsiao-Yun

    2015-01-01

    tRNAs perform an essential role in translating the genetic code. They are long-lived RNAs that are generated via numerous posttranscriptional steps. Eukaryotic cells have evolved numerous layers of quality control mechanisms to ensure that the tRNAs are appropriately structured, processed, and modified. We describe the known tRNA quality control processes that check tRNAs and correct or destroy aberrant tRNAs. These mechanisms employ two types of exonucleases, CCA end addition, tRNA nuclear aminoacylation, and tRNA subcellular traffic. We arrange these processes in order of the steps that occur from generation of precursor tRNAs by RNA polymerase (Pol) III transcription to end maturation and modification in the nucleus to splicing and additional modifications in the cytoplasm. Finally, we discuss the tRNA retrograde pathway, which allows tRNA reimport into the nucleus for degradation or repair. PMID:25848089

  15. High performance liquid chromatography fingerprint analysis for quality control of brotowali (Tinospora crispa)

    NASA Astrophysics Data System (ADS)

    Syarifah, V. B.; Rafi, M.; Wahyuni, W. T.

    2017-05-01

    Brotowali (Tinospora crispa) is widely used in Indonesia as an ingredient of herbal medicine formulations. To ensure the quality, safety, and efficacy of herbal medicine products, their chemical constituents should be continuously evaluated. High performance liquid chromatography (HPLC) fingerprinting is a powerful technique for this quality control process. In this study, an HPLC fingerprint analysis method was developed for quality control of brotowali. HPLC analysis was performed on a C18 column with detection by a photodiode array detector. The optimum mobile phase for the brotowali fingerprint was acetonitrile (ACN) and 0.1% formic acid in gradient elution mode at a flow rate of 1 mL/min. The number of peaks detected in the HPLC fingerprint of brotowali was 32 for stems and 23 for leaves. Berberine, the marker compound, was detected at a retention time of 20.525 minutes. Evaluation of analytical performance, including precision, reproducibility, and stability, proved that this HPLC fingerprint analysis was reliable and could be applied for quality control of brotowali.

  16. Data Quality Control of the French Permanent Broadband Network in the RESIF Framework.

    NASA Astrophysics Data System (ADS)

    Grunberg, M.; Lambotte, S.; Engels, F.

    2014-12-01

    In the framework of the RESIF (Réseau Sismologique et géodésique Français) project, a new information system is being set up to improve the management and distribution of high-quality data from the different elements of RESIF. Within this information system, EOST (in Strasbourg) is in charge of collecting real-time permanent broadband seismic waveforms and performing quality control on these data. The real-time and validated data sets are pushed to the French National Distribution Center (Isterre/Grenoble) to make them publicly available. Furthermore, EOST hosts the BCSF-ReNaSS, in charge of the French metropolitan seismic bulletin, which allows it to benefit from high-end quality control based on the national and worldwide seismicity. Here we present the real-time seismic data flow from the stations of the French National Broadband Network to EOST, and then the data quality control procedures that were recently installed, including some new developments. The data quality control consists in applying a variety of processes to check the consistency of the whole system from the stations to the data center. This allows us to verify that instruments and data transmission are operating correctly. Moreover, time quality is critical for most scientific data applications. To face this challenge and check the consistency of polarities and amplitudes, we deployed several high-end processes, including a noise correlation procedure to check timing accuracy (instrumental time errors result in a time-shift of the whole cross-correlation, clearly distinct from shifts due to changes in medium physical properties), and a systematic comparison of synthetic and real data for teleseismic earthquakes of magnitude larger than 6.5 to detect timing errors as well as polarity and amplitude problems.
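    The timing check rests on a simple property: a station clock error shifts the whole noise cross-correlation by the same lag. A minimal sketch of lag estimation by brute-force cross-correlation (illustrative only; production codes work on long, preprocessed noise records and track the lag over time):

```python
def xcorr_lag(a, b, max_lag):
    """Lag (in samples) at which trace `b` best matches trace `a`,
    found by scanning the cross-correlation over +/- max_lag.

    A clock error at one station appears as a stable nonzero lag of
    the whole correlation, unlike medium changes, which distort it."""
    best_lag, best = 0, float("-inf")
    n = len(a)
    for lag in range(-max_lag, max_lag + 1):
        s = sum(a[i] * b[i + lag] for i in range(n)
                if 0 <= i + lag < n)        # zero-padded overlap
        if s > best:
            best, best_lag = s, lag
    return best_lag
```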

  17. Signal quality and Bayesian signal processing in neurofeedback based on real-time fMRI.

    PubMed

    Koush, Yury; Zvyagintsev, Mikhail; Dyck, Miriam; Mathiak, Krystyna A; Mathiak, Klaus

    2012-01-02

    Real-time fMRI allows analysis and visualization of the brain activity online, i.e. within one repetition time. It can be used in neurofeedback applications where subjects attempt to control an activation level in a specified region of interest (ROI) of their brain. The signal derived from the ROI is contaminated with noise and artifacts, namely with physiological noise from breathing and heart beat, scanner drift, motion-related artifacts and measurement noise. We developed a Bayesian approach to reduce noise and to remove artifacts in real-time using a modified Kalman filter. The system performs several signal processing operations: subtraction of constant and low-frequency signal components, spike removal and signal smoothing. Quantitative feedback signal quality analysis was used to estimate the quality of the neurofeedback time series and performance of the applied signal processing on different ROIs. The signal-to-noise ratio (SNR) across the entire time series and the group event-related SNR (eSNR) were significantly higher for the processed time series in comparison to the raw data. Applied signal processing improved the t-statistic increasing the significance of blood oxygen level-dependent (BOLD) signal changes. Accordingly, the contrast-to-noise ratio (CNR) of the feedback time series was improved as well. In addition, the data revealed increase of localized self-control across feedback sessions. The new signal processing approach provided reliable neurofeedback, performed precise artifacts removal, reduced noise, and required minimal manual adjustments of parameters. Advanced and fast online signal processing algorithms considerably increased the quality as well as the information content of the control signal which in turn resulted in higher contingency in the neurofeedback loop. Copyright © 2011 Elsevier Inc. All rights reserved.
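    A minimal sketch of the filtering idea, assuming a one-dimensional constant-level model with innovation gating standing in for spike removal (the authors' modified Kalman filter is more elaborate and also handles drift and physiological noise):

```python
import math

def kalman_smooth(series, q=0.01, r=1.0, spike_sd=4.0):
    """1-D Kalman filter with innovation gating, as a sketch of
    real-time smoothing plus spike removal of an ROI time series.

    q: assumed process variance, r: assumed measurement variance.
    Innovations larger than spike_sd * sqrt(S) are treated as spikes
    and clipped before the update, so a single outlier sample cannot
    drag the feedback signal."""
    x, p = series[0], 1.0
    out = []
    for z in series:
        p += q                          # predict: variance grows
        s = p + r                       # innovation variance
        innov = z - x
        lim = spike_sd * math.sqrt(s)
        if abs(innov) > lim:            # spike removal: clip innovation
            innov = math.copysign(lim, innov)
        k = p / s                       # Kalman gain
        x += k * innov                  # update state estimate
        p *= (1 - k)
        out.append(x)
    return out
```

With these (assumed) noise parameters the filter tracks slow BOLD changes while a single spiked sample barely moves the output.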

  18. A Total Quality-Control Plan with Right-Sized Statistical Quality-Control.

    PubMed

    Westgard, James O

    2017-03-01

    A new Clinical Laboratory Improvement Amendments option for risk-based quality-control (QC) plans became effective in January 2016. Called an Individualized QC Plan, this option requires the laboratory to perform a risk assessment, develop a QC plan, and implement a QC program to monitor ongoing performance of the QC plan. Difficulties in performing a risk assessment may limit the validity of an Individualized QC Plan. A better alternative is to develop a Total QC Plan including a right-sized statistical QC procedure to detect medically important errors. Westgard Sigma Rules provide a simple way to select the right control rules and the right number of control measurements. Copyright © 2016 Elsevier Inc. All rights reserved.
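    The rule-checking part of such a plan can be sketched with three classic Westgard rules applied to control-measurement z-scores. This is only the detection logic; Westgard Sigma Rules additionally select which rules and how many control measurements to apply from the assay's sigma metric, which is not modeled here:

```python
def westgard_violations(z):
    """Check a sequence of control z-scores (deviation from the control
    mean, in SD units) against three classic Westgard rules.

    Returns (index, rule) pairs for every violation found."""
    v = []
    for i, zi in enumerate(z):
        if abs(zi) > 3:
            v.append((i, "1_3s"))            # one control beyond 3 SD
        if i > 0 and z[i - 1] > 2 and zi > 2:
            v.append((i, "2_2s"))            # two consecutive beyond +2 SD
        if i > 0 and z[i - 1] < -2 and zi < -2:
            v.append((i, "2_2s"))            # two consecutive beyond -2 SD
        if i > 0 and abs(zi - z[i - 1]) > 4:
            v.append((i, "R_4s"))            # 4 SD range, consecutive controls
    return v
```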

  19. Control and monitoring method and system for electromagnetic forming process

    DOEpatents

    Kunerth, Dennis C.; Lassahn, Gordon D.

    1990-01-01

    A process, system, and improvement for a process for electromagnetic forming of a workpiece in which characteristics of the workpiece such as its geometry, electrical conductivity, quality, and magnetic permeability can be determined by monitoring the current and voltage in the workcoil. In an electromagnetic forming process in which a power supply provides current to a workcoil and the electromagnetic field produced by the workcoil acts to form the workpiece, the dynamic interaction of the electromagnetic fields produced by the workcoil with the geometry, electrical conductivity, and magnetic permeability of the workpiece provides information pertinent to the physical condition of the workpiece that is available for determination of quality and process control. This information can be obtained by deriving in real time the first several time derivatives of the current and voltage in the workcoil. In addition, the process can be extended by injecting test signals into the workcoil during the electromagnetic forming and monitoring the response to the test signals in the workcoil.
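    The monitoring signal described above amounts to successive time derivatives of the sampled workcoil current or voltage. A finite-difference sketch of that derivation (illustrative only; a real-time system would use a noise-robust differentiator rather than raw differences):

```python
def derivatives(samples, dt, order=2):
    """Successive finite-difference time derivatives of a sampled
    workcoil current or voltage trace.

    Returns [samples, d/dt, d2/dt2, ...] up to `order`; each
    differentiation shortens the trace by one sample."""
    out = [samples]
    cur = samples
    for _ in range(order):
        cur = [(cur[i + 1] - cur[i]) / dt for i in range(len(cur) - 1)]
        out.append(cur)
    return out
```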

  20. Math Problems for Water Quality Control Personnel, Student Workbook. Second Edition.

    ERIC Educational Resources Information Center

    Delvecchio, Fred; Brutsch, Gloria

    This document is the student workbook for a course in mathematics for water quality control personnel. This version contains complete problems, answers and references. Problems are arranged alphabetically by treatment process. Charts, graphs, and drawings represent data forms an operator might see in a plant containing information necessary for…

  1. Statistical process control based chart for information systems security

    NASA Astrophysics Data System (ADS)

    Khan, Mansoor S.; Cui, Lirong

    2015-07-01

    Intrusion detection systems have a highly significant role in securing computer networks and information systems. To assure the reliability and quality of computer networks and information systems, it is highly desirable to develop techniques that detect intrusions into information systems. We put forward the concept of statistical process control (SPC) for intrusions in computer networks and information systems. In this article we propose an exponentially weighted moving average (EWMA) type quality monitoring scheme. Our proposed scheme has only one parameter, which differentiates it from past versions. We construct the control limits for the proposed scheme and investigate their effectiveness. We provide an industrial example for the sake of clarity for practitioners. We give a comparison of the proposed scheme with EWMA schemes and the p chart; finally, we provide some recommendations for future work.
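    For reference, a textbook EWMA chart with exact (time-varying) control limits looks like the sketch below. The paper's single-parameter scheme differs in its design, so this is the baseline it is compared against, not the proposed chart itself:

```python
import math

def ewma_chart(x, mean, sigma, lam=0.2, L=3.0):
    """Textbook EWMA control chart.

    Smooths observations with z_i = lam*x_i + (1-lam)*z_{i-1} and
    compares them against exact control limits that widen toward
    their asymptote. Returns (ewma, ucl, lcl, alarm) per observation."""
    z = mean
    out = []
    for i, xi in enumerate(x, start=1):
        z = lam * xi + (1 - lam) * z
        w = sigma * math.sqrt(lam / (2 - lam) * (1 - (1 - lam) ** (2 * i)))
        ucl, lcl = mean + L * w, mean - L * w
        out.append((z, ucl, lcl, z > ucl or z < lcl))
    return out
```

A small sustained shift, invisible to a Shewhart chart, accumulates in the EWMA statistic and eventually crosses the limit.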

  2. Quality Control of Structural MRI Images Applied Using FreeSurfer—A Hands-On Workflow to Rate Motion Artifacts

    PubMed Central

    Backhausen, Lea L.; Herting, Megan M.; Buse, Judith; Roessner, Veit; Smolka, Michael N.; Vetter, Nora C.

    2016-01-01

    In structural magnetic resonance imaging motion artifacts are common, especially when not scanning healthy young adults. It has been shown that motion affects the analysis with automated image-processing techniques (e.g., FreeSurfer). This can bias results. Several developmental and adult studies have found reduced volume and thickness of gray matter due to motion artifacts. Thus, quality control is necessary in order to ensure an acceptable level of quality and to define exclusion criteria of images (i.e., determine participants with most severe artifacts). However, information about the quality control workflow and image exclusion procedure is largely lacking in the current literature and the existing rating systems differ. Here, we propose a stringent workflow of quality control steps during and after acquisition of T1-weighted images, which enables researchers dealing with populations that are typically affected by motion artifacts to enhance data quality and maximize sample sizes. As an underlying aim we established a thorough quality control rating system for T1-weighted images and applied it to the analysis of developmental clinical data using the automated processing pipeline FreeSurfer. This hands-on workflow and quality control rating system will aid researchers in minimizing motion artifacts in the final data set, and therefore enhance the quality of structural magnetic resonance imaging studies. PMID:27999528

  3. Controlled atmosphere stunning of broiler chickens. II. Effects on behaviour, physiology and meat quality in a commercial processing plant.

    PubMed

    McKeegan, D E F; Abeyesinghe, S M; McLeman, M A; Lowe, J C; Demmers, T G M; White, R P; Kranen, R W; van Bemmel, H; Lankhaar, J A C; Wathes, C M

    2007-08-01

    1. The effects of controlled atmosphere stunning on behavioural and physiological responses, and carcase and meat quality of broiler chickens were studied experimentally in a full scale processing plant. 2. The gas mixtures tested were a single phase hypercapnic anoxic mixture of 60% Ar and 30% CO(2) in air with <2% O(2), and a biphasic hypercapnic hyperoxygenation mixture, comprising an anaesthetic phase, 40% CO(2), 30% O(2), 30% N(2), followed by a euthanasia phase, 80% CO(2), 5% O(2), 15% N(2). 3. Birds stunned with Ar + CO(2) were more often observed to flap their wings earlier, jump, paddle their legs, twitch and lie dorsally (rather than ventrally) than those stunned with CO(2) + O(2). These behaviours indicate a more agitated response with more severe convulsions during hypercapnic anoxia, thereby introducing greater potential for injury. 4. Heart rate during the first 100 s of gas stunning was similar for both gases, after which it remained constant at approximately 230 beats/min for CO(2) + O(2) birds whereas it declined gently for Ar + CO(2) birds. 5. In terms of carcase and meat quality, there appeared to be clear advantages to the processor in using CO(2) + O(2) rather than Ar + CO(2) to stun broiler chickens, for example, a much smaller number of fractured wings (1.6 vs. 6.8%) with fewer haemorrhages of the fillet. 6. This study supports the conclusions of both laboratory and pilot scale experiments that controlled atmosphere stunning of broiler chickens based upon a biphasic hypercapnic hyperoxygenation approach has advantages, in terms of welfare and carcase and meat quality, over a single phase hypercapnic anoxic approach employing 60% Ar and 30% CO(2) in air with <2% O(2).

  4. The loss of SMG1 causes defects in quality control pathways in Physcomitrella patens

    PubMed Central

    Lang, Daniel; Zimmer, Andreas D; Causier, Barry

    2018-01-01

    Abstract Nonsense-mediated mRNA decay (NMD) is important for RNA quality control and gene regulation in eukaryotes. NMD targets aberrant transcripts for decay and also directly influences the abundance of non-aberrant transcripts. In animals, the SMG1 kinase plays an essential role in NMD by phosphorylating the core NMD factor UPF1. Despite SMG1 being ubiquitous throughout the plant kingdom, little is known about its function, probably because SMG1 is atypically absent from the genome of the model plant, Arabidopsis thaliana. By combining our previously established SMG1 knockout in moss with transcriptome-wide analysis, we reveal the range of processes involving SMG1 in plants. Machine learning assisted analysis suggests that 32% of multi-isoform genes produce NMD-targeted transcripts and that splice junctions downstream of a stop codon act as the major determinant of NMD targeting. Furthermore, we suggest that SMG1 is involved in other quality control pathways, affecting DNA repair and the unfolded protein response, in addition to its role in mRNA quality control. Consistent with this, smg1 plants have increased susceptibility to DNA damage, but increased tolerance to unfolded protein inducing agents. The potential involvement of SMG1 in RNA, DNA and protein quality control has major implications for the study of these processes in plants. PMID:29596649

  5. 18 CFR 12.40 - Quality control programs.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... PROJECT WORKS Other Responsibilities of Applicant or Licensee § 12.40 Quality control programs. (a) General rule. During any construction, repair, or modification of project works, including any corrective... 18 Conservation of Power and Water Resources 1 2014-04-01 2014-04-01 false Quality control...

  6. 18 CFR 12.40 - Quality control programs.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... PROJECT WORKS Other Responsibilities of Applicant or Licensee § 12.40 Quality control programs. (a) General rule. During any construction, repair, or modification of project works, including any corrective... 18 Conservation of Power and Water Resources 1 2013-04-01 2013-04-01 false Quality control...

  7. 18 CFR 12.40 - Quality control programs.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... PROJECT WORKS Other Responsibilities of Applicant or Licensee § 12.40 Quality control programs. (a) General rule. During any construction, repair, or modification of project works, including any corrective... 18 Conservation of Power and Water Resources 1 2012-04-01 2012-04-01 false Quality control...

  8. Ultrasonic Real-Time Quality Monitoring Of Aluminum Spot Weld Process

    NASA Astrophysics Data System (ADS)

    Perez Regalado, Waldo Josue

    The real-time ultrasonic spot weld monitoring system, introduced by our research group, has been designed for the unsupervised quality characterization of the spot welding process. It comprises the ultrasonic transducer (probe) built into one of the welding electrodes and an electronics hardware unit which gathers information from the transducer, performs real-time weld quality characterization and communicates with the robot programmable logic controller (PLC). The system has been fully developed for the inspection of spot welds manufactured in steel alloys, and has been mainly applied in the automotive industry. In recent years, a variety of materials have been introduced to the automotive industry. These include high strength steels, magnesium alloys, and aluminum alloys. Aluminum alloys have been of particular interest due to their high strength-to-weight ratio. Resistance spot welding requirements for aluminum vary greatly from those of steel. Additionally, the oxide film formed on the aluminum surface increases the heat generation between the copper electrodes and the aluminum plates leading to accelerated electrode deterioration. Preliminary studies showed that the real-time quality inspection system was not able to monitor spot welds manufactured with aluminum. The extensive experimental research, finite element modelling of the aluminum welding process and finite difference modeling of the acoustic wave propagation through the aluminum spot welds presented in this dissertation, revealed that the thermodynamics and hence the acoustic wave propagation through an aluminum and a steel spot weld differ significantly. For this reason, the hardware requirements and the algorithms developed to determine the welds quality from the ultrasonic data used on steel, no longer apply on aluminum spot welds. 
After updating the system and designing the required algorithms, parameters such as liquid nugget penetration and nugget diameter became available in the ultrasonic data.

  9. 46 CFR 164.019-13 - Production quality control requirements.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 46 Shipping 6 2011-10-01 2011-10-01 false Production quality control requirements. 164.019-13....019-13 Production quality control requirements. (a) General. Each component manufacturer shall establish procedures for maintaining quality control of the materials used in production, manufacturing...

  10. 46 CFR 164.019-13 - Production quality control requirements.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 46 Shipping 6 2012-10-01 2012-10-01 false Production quality control requirements. 164.019-13....019-13 Production quality control requirements. (a) General. Each component manufacturer shall establish procedures for maintaining quality control of the materials used in production, manufacturing...

  11. 46 CFR 164.019-13 - Production quality control requirements.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 46 Shipping 6 2010-10-01 2010-10-01 false Production quality control requirements. 164.019-13....019-13 Production quality control requirements. (a) General. Each component manufacturer shall establish procedures for maintaining quality control of the materials used in production, manufacturing...

  12. 46 CFR 164.019-13 - Production quality control requirements.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 46 Shipping 6 2014-10-01 2014-10-01 false Production quality control requirements. 164.019-13....019-13 Production quality control requirements. (a) General. Each component manufacturer shall establish procedures for maintaining quality control of the materials used in production, manufacturing...

  13. 46 CFR 164.019-13 - Production quality control requirements.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 46 Shipping 6 2013-10-01 2013-10-01 false Production quality control requirements. 164.019-13....019-13 Production quality control requirements. (a) General. Each component manufacturer shall establish procedures for maintaining quality control of the materials used in production, manufacturing...

  14. A quality control circle process to improve implementation effect of prevention measures for high-risk patients.

    PubMed

    Feng, Haixia; Li, Guohong; Xu, Cuirong; Ju, Changping; Suo, Peiheng

    2017-12-01

    The aim of the study was to analyse the influence of prevention measures on pressure injuries for high-risk patients and to establish the most appropriate methods of implementation. Nurses assessed patients using a checklist, and factors influencing the prevention of pressure injury were determined by brainstorming. A specific series of measures was drawn up, and an estimate of the risk of pressure injury was determined using the Braden Scale, analysis of nursing documents, implementation of prevention measures for pressure sores and awareness of the system, both before and after carrying out a quality control circle (QCC) process. The overall scores of implementation of prevention measures ranged from 74.86 ± 14.24 to 87.06 ± 17.04, a result that was statistically significant (P < 0.0025). The Braden Scale scores ranged from 8.53 ± 3.21 to 13.48 ± 3.57. The nursing document scores ranged from 7.67 ± 3.98 to 10.12 ± 1.63; prevention measure scores ranged from 11.48 ± 4.18 to 13.96 ± 3.92. Differences in all of the above results are statistically significant (P < 0.05). Implementation of a QCC can standardise and improve the prevention measures for patients who are vulnerable to pressure sores and is of practical importance to their prevention and control. © 2017 Medicalhelplines.com Inc and John Wiley & Sons Ltd.

  15. Quality control and quality assurance plan for bridge channel-stability assessments in Massachusetts

    USGS Publications Warehouse

    Parker, Gene W.; Pinson, Harlow

    1993-01-01

    A quality control and quality assurance plan has been implemented as part of the Massachusetts bridge scour and channel-stability assessment program. This program is being conducted by the U.S. Geological Survey, Massachusetts-Rhode Island District, in cooperation with the Massachusetts Highway Department. Project personnel training, data-integrity verification, and new data-management technologies are being utilized in the channel-stability assessment process to improve current data-collection and management techniques. An automated data-collection procedure has been implemented to standardize channel-stability assessments on a regular basis within the State. An object-oriented data structure and new image management tools are used to produce a data base enabling management of multiple data object classes. Data will be reviewed by assessors and data base managers before being merged into a master bridge-scour data base, which includes automated data-verification routines.

  16. Analytical approaches to quality assurance and quality control in rangeland monitoring data

    USDA-ARS?s Scientific Manuscript database

    Producing quality data to support land management decisions is the goal of every rangeland monitoring program. However, the results of quality assurance (QA) and quality control (QC) efforts to improve data quality are rarely reported. The purpose of QA and QC is to prevent and describe non-sampling...

  17. Employee empowerment through team building and use of process control methods.

    PubMed

    Willems, S

    1998-02-01

    The article examines the use of statistical process control and performance improvement techniques in employee empowerment. The focus is how these techniques provide employees with information to improve their productivity and become involved in the decision-making process. Findings suggest that at one Mississippi hospital employee improvement has had a positive effect on employee productivity, morale, and quality of work.

  18. Professional Development for Water Quality Control Personnel.

    ERIC Educational Resources Information Center

    Shepard, Clinton Lewis

    This study investigated the availability of professional development opportunities for water quality control personnel in the midwest. The major objective of the study was to establish a listing of educational opportunities for the professional development of water quality control personnel and to compare these with the opportunities technicians…

  19. [Monitoring method of extraction process for Schisandrae Chinensis Fructus based on near infrared spectroscopy and multivariate statistical process control].

    PubMed

    Xu, Min; Zhang, Lei; Yue, Hong-Shui; Pang, Hong-Wei; Ye, Zheng-Liang; Ding, Li

    2017-10-01

    To establish an on-line monitoring method for the extraction process of Schisandrae Chinensis Fructus, a component medicinal material of Yiqi Fumai lyophilized injection, near infrared spectroscopy was combined with multivariate data analysis. A multivariate statistical process control (MSPC) model was established based on 5 normal production batches, and 2 test batches were monitored by PC score, DModX and Hotelling T2 control charts. The results showed that the MSPC model had a good monitoring ability for the extraction process. Application of the MSPC model to the actual production process could effectively achieve on-line monitoring of the extraction process of Schisandrae Chinensis Fructus and reflect changes in material properties during production in real time. The established process monitoring method could provide a reference for the application of process analytical technology in the process quality control of traditional Chinese medicine injections. Copyright© by the Chinese Pharmaceutical Association.
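    The Hotelling T2 statistic used in such MSPC charts can be sketched for two variables as below. In the paper it is computed on principal-component scores of NIR spectra rather than raw variables, and the control limit comes from an F-distribution; both are omitted here:

```python
def hotelling_t2(obs, reference):
    """Hotelling T^2 of a 2-variable observation against reference
    batches: squared Mahalanobis distance from the reference mean.

    A minimal sketch of the MSPC statistic for two variables only;
    `reference` is a list of (v1, v2) tuples from normal batches."""
    n = len(reference)
    m = [sum(r[i] for r in reference) / n for i in (0, 1)]
    # sample covariance matrix of the reference batches
    c = [[sum((r[i] - m[i]) * (r[j] - m[j]) for r in reference) / (n - 1)
          for j in (0, 1)] for i in (0, 1)]
    det = c[0][0] * c[1][1] - c[0][1] * c[1][0]
    inv = [[c[1][1] / det, -c[0][1] / det],
           [-c[1][0] / det, c[0][0] / det]]
    d = [obs[0] - m[0], obs[1] - m[1]]
    return sum(d[i] * inv[i][j] * d[j] for i in (0, 1) for j in (0, 1))
```

An observation inside the cloud of normal batches yields a small T2; one outside yields a large T2 and would trigger the chart.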

  20. Application of QC_DR software for acceptance testing and routine quality control of direct digital radiography systems: initial experiences using the Italian Association of Physicist in Medicine quality control protocol.

    PubMed

    Nitrosi, Andrea; Bertolini, Marco; Borasi, Giovanni; Botti, Andrea; Barani, Adriana; Rivetti, Stefano; Pierotti, Luisa

    2009-12-01

    Ideally, medical x-ray imaging systems should be designed to deliver maximum image quality at an acceptable radiation risk to the patient. Quality assurance procedures are employed to ensure that these standards are maintained. A quality control protocol for direct digital radiography (DDR) systems is described and discussed. Software to automatically process and analyze the required images was developed. In this paper, the initial results obtained on equipment from different DDR manufacturers are reported. The protocol was developed to highlight even small discrepancies in standard operating performance.

  1. 75 FR 60013 - Approval and Promulgation of Air Quality Implementation Plans; Maryland; Control of Volatile...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-29

    ... Promulgation of Air Quality Implementation Plans; Maryland; Control of Volatile Organic Compounds Emissions... Maryland's Volatile Organic Compounds from Specific Processes Regulation. Maryland has adopted standards... (RACT) requirements for sources of volatile organic compounds (VOCs) covered by control techniques...

  2. Fuel quality processing study, volume 1

    NASA Astrophysics Data System (ADS)

    Ohara, J. B.; Bela, A.; Jentz, N. E.; Syverson, H. T.; Klumpe, H. W.; Kessler, R. E.; Kotzot, H. T.; Loran, B. L.

    1981-04-01

    A fuel quality processing study was conducted to provide a data base for an intelligent tradeoff between advanced turbine technology and liquid fuel quality, and to guide the development of specifications for the synthetic fuels anticipated for use in the period 1985 to 2000. Four technical performance tasks are discussed: on-site pretreating, existing refineries to upgrade fuels, new refineries to upgrade fuels, and data evaluation. The base case refinery is a modern Midwest refinery processing 200,000 BPD of a 60/40 domestic/import petroleum crude mix. The synthetic crudes used for upgrading to marketable products and turbine fuel are shale oil and coal liquids. Of these syncrudes, 50,000 BPD are processed in the existing petroleum refinery, requiring additional process units and reducing petroleum feed, and in a new refinery designed for processing each syncrude to produce gasoline, distillate fuels, resid fuels, turbine fuel, LPGs, and coke. An extensive collection of synfuel properties and upgrading data was prepared for the application of a linear program model to investigate the most economical production slate meeting petroleum product specifications and turbine fuels of various quality grades. Technical and economic projections were developed for 36 scenarios, based on 4 different crude feeds to either modified existing or new refineries operated in 2 different modes to produce 7 differing grades of turbine fuels. A required product selling price of turbine fuel for each processing route was calculated. Procedures and projected economics were developed for on-site treatment of turbine fuel to meet limitations on impurities and emission of pollutants.

  3. Fuel quality processing study, volume 1

    NASA Technical Reports Server (NTRS)

    Ohara, J. B.; Bela, A.; Jentz, N. E.; Syverson, H. T.; Klumpe, H. W.; Kessler, R. E.; Kotzot, H. T.; Loran, B. L.

    1981-01-01

    A fuel quality processing study was conducted to provide a data base for an intelligent tradeoff between advanced turbine technology and liquid fuel quality, and to guide the development of specifications for the synthetic fuels anticipated for use in the period 1985 to 2000. Four technical performance tasks are discussed: on-site pretreating, existing refineries to upgrade fuels, new refineries to upgrade fuels, and data evaluation. The base case refinery is a modern Midwest refinery processing 200,000 BPD of a 60/40 domestic/import petroleum crude mix. The synthetic crudes used for upgrading to marketable products and turbine fuel are shale oil and coal liquids. Of these syncrudes, 50,000 BPD are processed in the existing petroleum refinery, requiring additional process units and reducing petroleum feed, and in a new refinery designed for processing each syncrude to produce gasoline, distillate fuels, resid fuels, turbine fuel, LPGs, and coke. An extensive collection of synfuel properties and upgrading data was prepared for the application of a linear program model to investigate the most economical production slate meeting petroleum product specifications and turbine fuels of various quality grades. Technical and economic projections were developed for 36 scenarios, based on 4 different crude feeds to either modified existing or new refineries operated in 2 different modes to produce 7 differing grades of turbine fuels. A required product selling price of turbine fuel for each processing route was calculated. Procedures and projected economics were developed for on-site treatment of turbine fuel to meet limitations on impurities and emission of pollutants.

  4. Ride quality sensitivity to SAS control law and to handling quality variations

    NASA Technical Reports Server (NTRS)

    Roberts, P. A.; Schmidt, D. K.; Swaim, R. L.

    1976-01-01

    The RQ trends which large flexible aircraft exhibit under various parameterizations of control laws and handling qualities are discussed. A summary of the assumptions and solution technique, a control law parameterization review, a discussion of ride sensitivity to handling qualities, and the RQ effects generated by implementing relaxed static stability configurations are included.

  5. 14 CFR 21.147 - Changes in quality control system.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 14 Aeronautics and Space 1 2011-01-01 2011-01-01 false Changes in quality control system. 21.147 Section 21.147 Aeronautics and Space FEDERAL AVIATION ADMINISTRATION, DEPARTMENT OF TRANSPORTATION... quality control system. After the issue of a production certificate, each change to the quality control...

  6. 14 CFR 21.147 - Changes in quality control system.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 14 Aeronautics and Space 1 2010-01-01 2010-01-01 false Changes in quality control system. 21.147 Section 21.147 Aeronautics and Space FEDERAL AVIATION ADMINISTRATION, DEPARTMENT OF TRANSPORTATION... quality control system. After the issue of a production certificate, each change to the quality control...

  7. Systems and processes that ensure high quality care.

    PubMed

    Bassett, Sally; Westmore, Kathryn

    2012-10-01

    This is the second in a series of articles examining the components of good corporate governance. It considers how the structures and processes for quality governance can affect an organisation's ability to be assured about the quality of care. Complex information systems and procedures can lead to poor quality care, but sound structures and processes alone are insufficient to ensure good governance, and behavioural factors play a significant part in making sure that staff are enabled to provide good quality care. The next article in this series looks at how the information reporting of an organisation can affect its governance.

  8. A Quality Process Approach to Electronic System Reliability: Supplier Quality Assessment Procedure. Volume 2

    DTIC Science & Technology

    1993-11-01

    [Abstract unavailable; the record contains table-of-contents fragments: 3.5.6 Business Process and Support Service Quality; 3.5.7 Supplier Quality; 3.6 Results; 3.6.1 Product and Service Quality Results; and assessment point values, e.g. 5.6 Business Process and Support Service Quality (20), 5.7 Supplier Quality (20), 6.0 Results (180), 6.1 Product and Service Quality Results (90), 6.2 Business...]

  9. Statistical process control and verifying positional accuracy of a cobra motion couch using step-wedge quality assurance tool.

    PubMed

    Binny, Diana; Lancaster, Craig M; Trapp, Jamie V; Crowe, Scott B

    2017-09-01

    This study utilizes process control techniques to identify action limits for TomoTherapy couch positioning quality assurance tests. A test was introduced to monitor the accuracy of applied couch offset detection in the TomoTherapy Hi-Art treatment system using the TQA "Step-Wedge Helical" module and MVCT detector. Individual X-charts, process capability (cp), probability (P), and acceptability (cpk) indices were used to monitor 4 years of couch IEC offset data to detect systematic and random errors in couch positional accuracy at different action levels. Process capability tests were also performed on the retrospective data to define tolerances based on user-specified levels. In a second study, physical couch offsets were applied using the TQA module and the MVCT detector was used to detect the observed variations. Random and systematic variations were observed relative to the SPC-based upper and lower control limits, and investigations were carried out to maintain the ongoing stability of the process over 4-year and three-monthly periods. Local trend analysis showed mean variations up to ±0.5 mm in the three-monthly analysis period for all IEC offset measurements. Variations were also observed between detected and applied offsets using the MVCT detector in the second study, largely in the vertical direction, and actions were taken to remediate this error. Based on the results, it was recommended that imaging shifts in each coordinate direction be applied only after assessing the machine for applied versus detected test results using the step helical module. User-specified tolerance levels of at least ±2 mm were recommended for a test frequency of once every 3 months to improve couch positional accuracy. SPC enables detection of systematic variations before machine tolerance levels are reached. Couch encoding system recalibrations reduced variations to user-specified levels and a monitoring period of 3 months using SPC facilitated in detecting
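
The individuals X-chart limits and capability index named above follow standard SPC formulas. A generic sketch (the offset values are invented; only the ±2 mm tolerance echoes the study, and the d2 = 1.128 constant is the textbook value for moving ranges of span 2):

```python
import numpy as np

def individuals_chart_limits(x):
    """Shewhart individuals (X) chart limits from the average moving range."""
    x = np.asarray(x, dtype=float)
    mr = np.abs(np.diff(x)).mean()   # average moving range of successive points
    sigma = mr / 1.128               # d2 constant for subgroups of size 2
    center = x.mean()
    return center - 3 * sigma, center, center + 3 * sigma

def cpk(x, lsl, usl):
    """Process capability index against lower/upper specification limits."""
    x = np.asarray(x, dtype=float)
    s = x.std(ddof=1)
    return min((usl - x.mean()) / (3 * s), (x.mean() - lsl) / (3 * s))

# Hypothetical couch offset measurements (mm) from routine QA
offsets = np.array([0.1, -0.2, 0.0, 0.3, -0.1, 0.2, 0.0, -0.3, 0.1, 0.0])
lcl, center, ucl = individuals_chart_limits(offsets)
capability = cpk(offsets, lsl=-2.0, usl=2.0)  # ±2 mm user-specified tolerance
```

Points outside (lcl, ucl) signal a special cause before the ±2 mm tolerance is breached, which is the sense in which SPC detects drift early.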

  10. Improving NGDC Track-line Data Quality Control

    NASA Astrophysics Data System (ADS)

    Chandler, M. T.; Wessel, P.

    2004-12-01

    Ship-board gravity, magnetic, and bathymetry data archived at the National Geophysical Data Center (NGDC) represent decades of seagoing research, comprising over 4,500 cruises. Cruise data remain relevant despite the prominence of satellite altimetry-derived global grids because many geologic processes remain resolvable only by shipboard observation. Given the tremendous investment by scientists and taxpayers to compile this vast archive, and the significant errors found within it, additional quality assessment and corrections are warranted. These can best be accomplished by adding to the existing quality control measures at NGDC. We are currently developing open source software to provide this additional quality control. Along with NGDC's current sanity checking, new data at NGDC will also be subjected to an along-track "sniffer" that will detect and flag suspicious data for later graphical inspection using a visual editor. If new data pass these tests, they will undergo further scrutiny by a crossover error (COE) calculator, which will compare new data values to existing values at points of intersection within the archive. Data passing these tests will be deemed "quality data" suitable for permanent addition to the archive, while data that fail will be returned to the source institution for correction. Crossover errors will be stored and an online COE database will be made available. The COE database will allow users to apply corrections to the NGDC track-line database to produce corrected data files. At no time will the archived data itself be modified. An attempt will also be made to reduce navigational errors for pre-GPS navigated cruises. Upon completion, these programs will be used to explore and model systematic errors within the archive, generate correction tables for all cruises, and quantify the error budget in marine geophysical observations. Software will be released and these procedures will be implemented in cooperation with NGDC
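
At its core, a crossover error is the difference between two interpolated along-track values where two track segments intersect. A minimal sketch (this is not the NGDC software; the coordinates and measured values below are invented, and real implementations work on great-circle tracks, not planar segments):

```python
def crossover_error(seg_a, seg_b):
    """Crossover error between two straight track segments.

    Each segment is ((x0, y0, v0), (x1, y1, v1)): endpoints with a measured
    value v (e.g. free-air gravity). Returns ((x, y), v_a - v_b) at the
    intersection, or None if the segments do not cross.
    """
    (ax0, ay0, av0), (ax1, ay1, av1) = seg_a
    (bx0, by0, bv0), (bx1, by1, bv1) = seg_b
    # 2-D cross product of the direction vectors; zero means parallel tracks
    d = (ax1 - ax0) * (by1 - by0) - (ay1 - ay0) * (bx1 - bx0)
    if d == 0:
        return None
    t = ((bx0 - ax0) * (by1 - by0) - (by0 - ay0) * (bx1 - bx0)) / d
    u = ((bx0 - ax0) * (ay1 - ay0) - (by0 - ay0) * (ax1 - ax0)) / d
    if not (0 <= t <= 1 and 0 <= u <= 1):
        return None  # lines cross outside the surveyed segments
    point = (ax0 + t * (ax1 - ax0), ay0 + t * (ay1 - ay0))
    va = av0 + t * (av1 - av0)   # linear interpolation along track A
    vb = bv0 + u * (bv1 - bv0)   # linear interpolation along track B
    return point, va - vb

point, err = crossover_error(((0, 0, 10.0), (2, 2, 14.0)),
                             ((0, 2, 11.0), (2, 0, 13.4)))
```

Accumulating such differences over all intersections in the archive is what lets a COE database expose systematic offsets in individual cruises.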

  11. Colorado Air Quality Control Regulations and Ambient Air Quality Standards.

    ERIC Educational Resources Information Center

    Colorado State Dept. of Health, Denver. Div. of Air Pollution Control.

    Regulations and standards relative to air quality control in Colorado are defined in this publication. Presented first are definitions of terms, a statement of intent, and general provisions applicable to all emission control regulations adopted by the Colorado Air Pollution Control Commission. Following this, three regulations are enumerated: (1)…

  12. 46 CFR 164.120-11 - Production quality control requirements.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 46 Shipping 6 2014-10-01 2014-10-01 false Production quality control requirements. 164.120-11... Rescue Boats § 164.120-11 Production quality control requirements. The resin manufacturer must institute a quality control procedure to ensure that all Coast Guard-accepted resin is produced to the same...

  13. 46 CFR 164.120-11 - Production quality control requirements.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 46 Shipping 6 2012-10-01 2012-10-01 false Production quality control requirements. 164.120-11... Rescue Boats § 164.120-11 Production quality control requirements. The resin manufacturer must institute a quality control procedure to ensure that all Coast Guard-accepted resin is produced to the same...

  14. 46 CFR 164.120-11 - Production quality control requirements.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 46 Shipping 6 2013-10-01 2013-10-01 false Production quality control requirements. 164.120-11... Rescue Boats § 164.120-11 Production quality control requirements. The resin manufacturer must institute a quality control procedure to ensure that all Coast Guard-accepted resin is produced to the same...

  15. Math Problems for Water Quality Control Personnel, Instructor's Manual. Second Edition.

    ERIC Educational Resources Information Center

    Delvecchio, Fred; Brutsch, Gloria

    This document is the instructor's manual for a course in mathematics for water quality control personnel. It is designed so a program may be designed for a specific facility. The problem structures are arranged alphabetically by treatment process. Charts, graphs and/or drawings representing familiar data forms contain the necessary information to…

  16. System-wide hybrid MPC-PID control of a continuous pharmaceutical tablet manufacturing process via direct compaction.

    PubMed

    Singh, Ravendra; Ierapetritou, Marianthi; Ramachandran, Rohit

    2013-11-01

    The next generation of QbD-based pharmaceutical products will be manufactured through continuous processing. This will allow the integration of online/inline monitoring tools, coupled with efficient advanced model-based feedback control systems, to achieve precise control of process variables, so that the predefined product quality can be achieved consistently. The direct compaction process considered in this study is highly interactive and involves time delays for a number of process variables due to sensor placements, process equipment dimensions, and the flow characteristics of the solid material. A simple feedback regulatory control system (e.g., PI(D)) by itself may not be sufficient to achieve the tight process control mandated by regulatory authorities. The process presented herein comprises coupled dynamics involving slow and fast responses, indicating the need for a hybrid control scheme such as combined MPC-PID control. In this manuscript, an efficient system-wide hybrid control strategy for an integrated continuous pharmaceutical tablet manufacturing process via direct compaction has been designed. The designed control system is a hybrid MPC-PID scheme. An effective controller parameter tuning strategy, involving an ITAE method coupled with an optimization strategy, has been used to tune both the MPC and PID parameters. The designed hybrid control system has been implemented in a first-principles model-based flowsheet simulated in gPROMS (Process Systems Enterprise). Results demonstrate enhanced performance of critical quality attributes (CQAs) under the hybrid control scheme compared to PID-only or MPC-only schemes, illustrating the potential of a hybrid control scheme for improving pharmaceutical manufacturing operations. Copyright © 2013 Elsevier B.V. All rights reserved.
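
The fast regulatory layer of such a hybrid scheme is an ordinary discrete PID loop. A toy sketch driving a first-order process (this is not the paper's gPROMS flowsheet or its ITAE-tuned gains; the plant, gains, and time constants are all assumptions, and the MPC layer that would supply the setpoint is omitted):

```python
class PID:
    """Discrete PID controller: a generic sketch, not the paper's tuning."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Hypothetical first-order process, dy/dt = (u - y) / tau
tau, dt, y = 5.0, 0.1, 0.0
pid = PID(kp=2.0, ki=0.5, kd=0.1, dt=dt)
for _ in range(1000):                      # 100 s of simulated time
    u = pid.step(setpoint=1.0, measurement=y)
    y += dt * (u - y) / tau                # Euler step of the plant
```

In the hybrid scheme, an MPC layer would update `setpoint` at a slower rate based on a process model, while loops like this one reject fast disturbances.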

  17. Fermentanomics: Relating quality attributes of a monoclonal antibody to cell culture process variables and raw materials using multivariate data analysis.

    PubMed

    Rathore, Anurag S; Kumar Singh, Sumit; Pathak, Mili; Read, Erik K; Brorson, Kurt A; Agarabi, Cyrus D; Khan, Mansoor

    2015-01-01

    Fermentanomics is an emerging field of research that involves understanding the underlying controlled process variables and their effect on process yield and product quality. Although major advancements have occurred in process analytics over the past two decades, accurate real-time measurement of significant quality attributes for a biotech product during production culture is still not feasible. Researchers have used an amalgam of process models and analytical measurements for monitoring and process control during production. This article focuses on using multivariate data analysis as a tool for monitoring internal bioreactor dynamics, the metabolic state of the cells, and interactions among them during culture. Quality attributes of the monoclonal antibody product that were monitored include the glycosylation profile of the final product, along with process attributes such as viable cell density and level of antibody expression. These were related to process variables, raw material components of the chemically defined hybridoma media, concentrations of metabolites formed during the course of the culture, aeration-related parameters, and supplemented raw materials such as glucose, methionine, threonine, tryptophan, and tyrosine. This article demonstrates the utility of multivariate data analysis for correlating product quality attributes (especially glycosylation) to process variables and raw materials (especially amino acid supplements in cell culture media). The proposed approach can be applied to process optimization to increase product expression, improve consistency of product quality, and target the desired quality attribute profile. © 2015 American Institute of Chemical Engineers.
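
Multivariate analysis of many correlated process variables across batches typically starts from principal component analysis. A minimal PCA-by-SVD sketch on invented batch data (not the study's hybridoma dataset; the variable names in the comments are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical batch records: rows = culture runs, columns = process
# variables (e.g. glucose feed, viable cell density, supplement levels)
data = rng.normal(size=(20, 5))
data[:, 1] = 0.9 * data[:, 0] + 0.1 * rng.normal(size=20)  # correlated pair

centered = data - data.mean(axis=0)
U, s, Vt = np.linalg.svd(centered, full_matrices=False)
scores = U * s                        # batch scores on each component
explained = s**2 / (s**2).sum()      # fraction of variance per component
```

Plotting batch scores on the first components, and regressing quality attributes such as glycosylation against them, is the usual next step in relating process variables to product quality.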

  18. Study on the Quality Management of Building Electricity Engineering Construction in the Whole Process

    NASA Astrophysics Data System (ADS)

    Qin, Minwu

    2018-05-01

    With the progress of science and technology, the types of electrical equipment in use have multiplied and their functions have grown more complex, placing higher demands on the quality of building electrical construction. Ignoring necessary quality requirements or violating operating specifications during building electrical construction brings great safety risks, resulting in huge economic losses and even endangering personal safety. Managing and controlling the construction quality of building electrical work must be carried out throughout the whole construction process. According to the construction characteristics of building electrical engineering, this article analyzes construction details that are easily ignored but very important, and, based on management theory, puts forward methods of quality management for the whole process of building electrical construction.

  19. Drop-on-Demand System for Manufacturing of Melt-based Solid Oral Dosage: Effect of Critical Process Parameters on Product Quality.

    PubMed

    Içten, Elçin; Giridhar, Arun; Nagy, Zoltan K; Reklaitis, Gintaras V

    2016-04-01

    The features of a drop-on-demand-based system developed for the manufacture of melt-based pharmaceuticals have been previously reported. In this paper, a supervisory control system designed to ensure reproducible production of high-quality melt-based solid oral dosages is presented. This control system enables the production of individual dosage forms with the desired critical quality attributes, namely the amount of active ingredient and drug morphology, by monitoring and controlling critical process parameters such as drop size and product and process temperatures. The effects of these process parameters on final product quality are investigated, and the properties of the produced dosage forms are characterized using various techniques, such as Raman spectroscopy, optical microscopy, and dissolution testing. A crystallization temperature control strategy, including controlled temperature cycles, is presented to tailor the crystallization behavior of drug deposits and achieve consistent drug morphology. This control strategy can be used to achieve the desired bioavailability of the drug by mitigating variations in the dissolution profiles. The supervisory control strategy enables the application of the drop-on-demand system to the production of the individualized dosages required for personalized drug regimens.

  20. Quality Control Technician Curriculum. An Elusive Butterfly.

    ERIC Educational Resources Information Center

    Holler, Michael

    Defining and developing a quality control technician curriculum for an associate degree program is a difficult and puzzling job. There are as many definitions of quality control and curriculum ideas as there are educators asked. However, one could start by dividing the field into its major areas--heavy manufacturing, maintenance, research, and…

  1. Interpreting the handling qualities of aircraft with stability and control augmentation

    NASA Technical Reports Server (NTRS)

    Hodgkinson, J.; Potsdam, E. H.; Smith, R. E.

    1990-01-01

    The general process of designing an aircraft for good flying qualities is first discussed. Lessons learned are pointed out, with piloted evaluation emerging as a crucial element. Two sources of rating variability in performing these evaluations are then discussed. First, the finite endpoints of the Cooper-Harper scale do not bias parametric statistical analyses unduly. Second, the wording of the scale does introduce some scatter. Phase lags generated by augmentation systems, as represented by equivalent time delays, often cause poor flying qualities. An analysis is introduced which allows a designer to relate any level of time delay to a probability of loss of aircraft control. This view of time delays should, it is hoped, allow better visibility of the time delays in the design process.

  2. Quality Assessment of College Admissions Processes.

    ERIC Educational Resources Information Center

    Fisher, Caroline; Weymann, Elizabeth; Todd, Amy

    2000-01-01

    This study evaluated the admissions process for a Master's in Business Administration Program using such quality improvement techniques as customer surveys, benchmarking, and gap analysis. Analysis revealed that student dissatisfaction with the admissions process may be a factor influencing declining enrollment. Cycle time and number of student…

  3. Statistical Process Control Charts for Measuring and Monitoring Temporal Consistency of Ratings

    ERIC Educational Resources Information Center

    Omar, M. Hafidz

    2010-01-01

    Methods of statistical process control were briefly investigated in the field of educational measurement as early as 1999. However, only the use of a cumulative sum chart was explored. In this article other methods of statistical quality control are introduced and explored. In particular, methods in the form of Shewhart mean and standard deviation…

  4. Quality by Design approach for studying the impact of formulation and process variables on product quality of oral disintegrating films.

    PubMed

    Mazumder, Sonal; Pavurala, Naresh; Manda, Prashanth; Xu, Xiaoming; Cruz, Celia N; Krishnaiah, Yellela S R

    2017-07-15

    The present investigation was carried out to understand the impact of formulation and process variables on the quality of oral disintegrating films (ODF) using the Quality by Design (QbD) approach. Lamotrigine (LMT) was used as a model drug. The formulation variable was the plasticizer-to-film-former ratio; the process variables were drying temperature, air flow rate in the drying chamber, drying time, and wet coat thickness of the film. A Definitive Screening Design of Experiments (DoE) was used to identify and classify the critical formulation and process variables impacting critical quality attributes (CQA). A total of 14 laboratory-scale DoE formulations were prepared and evaluated for mechanical properties (% elongation at break, yield stress, Young's modulus, folding endurance) and other CQA (dry thickness, disintegration time, dissolution rate, moisture content, moisture uptake, drug assay, and drug content uniformity). The main factors affecting mechanical properties were the plasticizer-to-film-former ratio and drying temperature. Dissolution rate was found to be sensitive to the air flow rate during drying and the plasticizer-to-film-former ratio. Data were analyzed to elucidate interactions between variables, rank order the critical material attributes (CMA) and critical process parameters (CPP), and provide a predictive model for the process. Results suggested that the plasticizer-to-film-former ratio and process controls on drying are critical to manufacturing LMT ODF with the desired CQA. Published by Elsevier B.V.
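
A screening design like the one described boils down to regressing responses on coded factor levels and ranking the coefficients. A simplified sketch using a 3x3 full factorial instead of the study's 14-run Definitive Screening Design, with hypothetical responses for two of the factors:

```python
import numpy as np

# Coded levels (-1, 0, +1) for two factors: plasticizer-to-film-former
# ratio and drying temperature (hypothetical design, not the study's runs)
X = np.array([
    [-1, -1], [-1, 0], [-1, 1],
    [ 0, -1], [ 0, 0], [ 0, 1],
    [ 1, -1], [ 1, 0], [ 1, 1],
], dtype=float)
# Hypothetical %elongation-at-break responses for the nine runs
y = np.array([12.1, 13.0, 13.8, 15.2, 16.1, 16.9, 18.0, 19.1, 19.8])

# Main-effects model: y = b0 + b1*ratio + b2*temperature
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
b0, b_ratio, b_temp = coef
# |b_ratio| > |b_temp| would rank the ratio as the dominant factor,
# mirroring the abstract's conclusion for mechanical properties.
```

Because the coded columns are orthogonal, each coefficient here is just a scaled contrast of high-level versus low-level runs, which is what makes screening designs cheap to analyze.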

  5. SIMBAD quality-control

    NASA Technical Reports Server (NTRS)

    Lesteven, Soizick

    1992-01-01

    bibliographic information in the SIMBAD quality control process. Furthermore, it is possible to connect the physical measurements of stars from SIMBAD to literature concerning these stars from the NASA-STI abstracts. The physical properties of stars (e.g. UBV colors) are not randomly distributed: stars fall into different clusters in a physical parameter space. The authors show that there are relations between this classification and the literature concerning these object clusters in a factor space. They investigate the nature of the relationship between the SIMBAD measurements and the bibliography. These would be new relationships, not pre-established by an astronomer. In addition, the bibliography could serve as neutral information to be used in combination with the measured parameters.

  6. Chip Design Process Optimization Based on Design Quality Assessment

    NASA Astrophysics Data System (ADS)

    Häusler, Stefan; Blaschke, Jana; Sebeke, Christian; Rosenstiel, Wolfgang; Hahn, Axel

    2010-06-01

    Nowadays, managing product development projects is increasingly challenging. The IC design of ASICs with both analog and digital components (mixed-signal design) in particular is becoming more and more complex, while the time-to-market window narrows at the same time. Still, high quality standards must be fulfilled. Projects and their status are becoming less transparent due to this complexity, which makes the planning and execution of projects difficult. Therefore, there is a need for efficient project control. A main challenge is the objective evaluation of the current development status: Are all requirements successfully verified? Are all intermediate goals achieved? Companies often develop special solutions that are not reusable in other projects, which makes the quality measurement process itself less efficient and produces too much overhead. The method proposed in this paper is a contribution to solving these issues. It is applied at a German design house for analog mixed-signal IC design. This paper presents the results of a case study and introduces optimized project scheduling on the basis of quality assessment results.

  7. 7 CFR 58.523 - Laboratory and quality control tests.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 7 Agriculture 3 2014-01-01 2014-01-01 false Laboratory and quality control tests. 58.523 Section... Service 1 Operations and Operating Procedures § 58.523 Laboratory and quality control tests. (a) Quality control tests shall be made on samples as often as necessary to determine the shelf-life and stability of...

  8. 7 CFR 58.523 - Laboratory and quality control tests.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 3 2010-01-01 2010-01-01 false Laboratory and quality control tests. 58.523 Section... Service 1 Operations and Operating Procedures § 58.523 Laboratory and quality control tests. (a) Quality control tests shall be made on samples as often as necessary to determine the shelf-life and stability of...

  9. 7 CFR 58.523 - Laboratory and quality control tests.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 3 2013-01-01 2013-01-01 false Laboratory and quality control tests. 58.523 Section... Service 1 Operations and Operating Procedures § 58.523 Laboratory and quality control tests. (a) Quality control tests shall be made on samples as often as necessary to determine the shelf-life and stability of...

  10. 7 CFR 58.523 - Laboratory and quality control tests.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 3 2011-01-01 2011-01-01 false Laboratory and quality control tests. 58.523 Section... Service 1 Operations and Operating Procedures § 58.523 Laboratory and quality control tests. (a) Quality control tests shall be made on samples as often as necessary to determine the shelf-life and stability of...

  11. 7 CFR 58.523 - Laboratory and quality control tests.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 7 Agriculture 3 2012-01-01 2012-01-01 false Laboratory and quality control tests. 58.523 Section... Service 1 Operations and Operating Procedures § 58.523 Laboratory and quality control tests. (a) Quality control tests shall be made on samples as often as necessary to determine the shelf-life and stability of...

  12. 21 CFR 640.56 - Quality control test for potency.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... quality control test for potency may be performed by a clinical laboratory which meets the standards of... 21 Food and Drugs 7 2010-04-01 2010-04-01 false Quality control test for potency. 640.56 Section...) BIOLOGICS ADDITIONAL STANDARDS FOR HUMAN BLOOD AND BLOOD PRODUCTS Cryoprecipitate § 640.56 Quality control...

  13. Web quality control for lectures: Supercourse and Amazon.com.

    PubMed

    Linkov, Faina; LaPorte, Ronald; Lovalekar, Mita; Dodani, Sunita

    2005-12-01

    Peer review has been the cornerstone of quality control for biomedical journals for the past 300 years. With the emergence of the Internet, new models of quality control and peer review are emerging, but such models are poorly investigated. We would argue that the popular quality control system used by Amazon.com offers a way to ensure continuous quality improvement in the area of research communications on the Internet. Such a system provides an interesting alternative to the traditional peer review approaches used in biomedical journals and challenges the traditional paradigms of scientific publishing. This idea is being explored in the context of Supercourse, a library of 2,350 prevention lectures shared for free by faculty members from over 150 countries. Supercourse is successfully utilizing quality control approaches similar to the Amazon.com model. Clearly, the existing approaches and emerging alternatives for quality control in scientific communications need to be assessed scientifically. The rapid explosion of Internet technologies could be leveraged to produce better, more cost-effective systems for quality control in biomedical publications and across all sciences.

  14. 40 CFR 81.107 - Greenwood Intrastate Air Quality Control Region.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Quality Control Regions § 81.107 Greenwood Intrastate Air Quality Control Region. The Greenwood Intrastate Air Quality Control Region (South Carolina) consists of the territorial area encompassed by the... 40 Protection of Environment 17 2010-07-01 2010-07-01 false Greenwood Intrastate Air Quality...

  15. 40 CFR 81.108 - Columbia Intrastate Air Quality Control Region.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Quality Control Regions § 81.108 Columbia Intrastate Air Quality Control Region. The Columbia Intrastate Air Quality Control Region (South Carolina) consists of the territorial area encompassed by the... 40 Protection of Environment 17 2011-07-01 2011-07-01 false Columbia Intrastate Air Quality...

  16. 40 CFR 81.108 - Columbia Intrastate Air Quality Control Region.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Quality Control Regions § 81.108 Columbia Intrastate Air Quality Control Region. The Columbia Intrastate Air Quality Control Region (South Carolina) consists of the territorial area encompassed by the... 40 Protection of Environment 17 2010-07-01 2010-07-01 false Columbia Intrastate Air Quality...

  17. 40 CFR 81.111 - Georgetown Intrastate Air Quality Control Region.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Quality Control Regions § 81.111 Georgetown Intrastate Air Quality Control Region. The Georgetown Intrastate Air Quality Control Region (South Carolina) consists of the territorial area encompassed by the... 40 Protection of Environment 17 2011-07-01 2011-07-01 false Georgetown Intrastate Air Quality...

  18. 40 CFR 81.109 - Florence Intrastate Air Quality Control Region.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Quality Control Regions § 81.109 Florence Intrastate Air Quality Control Region. The Florence Intrastate Air Quality Control Region (South Carolina) consists of the territorial area encompassed by the... 40 Protection of Environment 17 2010-07-01 2010-07-01 false Florence Intrastate Air Quality...

  19. 40 CFR 81.111 - Georgetown Intrastate Air Quality Control Region.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Quality Control Regions § 81.111 Georgetown Intrastate Air Quality Control Region. The Georgetown Intrastate Air Quality Control Region (South Carolina) consists of the territorial area encompassed by the... 40 Protection of Environment 17 2010-07-01 2010-07-01 false Georgetown Intrastate Air Quality...

  20. 40 CFR 81.109 - Florence Intrastate Air Quality Control Region.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Quality Control Regions § 81.109 Florence Intrastate Air Quality Control Region. The Florence Intrastate Air Quality Control Region (South Carolina) consists of the territorial area encompassed by the... 40 Protection of Environment 17 2011-07-01 2011-07-01 false Florence Intrastate Air Quality...

  1. 40 CFR 81.107 - Greenwood Intrastate Air Quality Control Region.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Quality Control Regions § 81.107 Greenwood Intrastate Air Quality Control Region. The Greenwood Intrastate Air Quality Control Region (South Carolina) consists of the territorial area encompassed by the... 40 Protection of Environment 17 2011-07-01 2011-07-01 false Greenwood Intrastate Air Quality...

  2. Controlling the digital transfer process

    NASA Astrophysics Data System (ADS)

    Brunner, Felix

    1997-02-01

    The accuracy of today's color management systems fails to satisfy the requirements of the graphic arts market. A first explanation is that the color calibration charts on which these systems rely are, for technical printing reasons, subject to color deviations and inconsistencies. A second reason is that colorimetry describes the human visual perception of color differences and has no direct relation to the rendering technology itself of a proofing or printing device. The author explains that only firm process control of the many parameters in offset printing, by means of a system such as the EUROSTANDARD System Brunner, can lead to accurate and consistent calibration of scanner, display, proof, and print. The same principles hold for the quality management of digital presses.

  3. E-Learning Quality Assurance: A Process-Oriented Lifecycle Model

    ERIC Educational Resources Information Center

    Abdous, M'hammed

    2009-01-01

    Purpose: The purpose of this paper is to propose a process-oriented lifecycle model for ensuring quality in e-learning development and delivery. As a dynamic and iterative process, quality assurance (QA) is intertwined with the e-learning development process. Design/methodology/approach: After reviewing the existing literature, particularly…

  4. Contractor Performed Quality Control on KyTC Projects.

    DOT National Transportation Integrated Search

    2002-08-01

    This report addresses issues related to transferring the responsibility for quality control from the Kentucky Transportation Cabinet (KyTC) to construction contractors. : Several key topics related to Contractor Performed Quality Control (CPQC) are p...

  5. Software quality: Process or people

    NASA Technical Reports Server (NTRS)

    Palmer, Regina; Labaugh, Modenna

    1993-01-01

    This paper presents data related to software development processes and personnel involvement from the perspective of software quality assurance. We examine eight years of data collected from six projects. Data collected varied by project but usually included defect and fault density, with limited use of code metrics, schedule adherence, and budget growth information. The data are a blend of AFSCP 800-14 and the productivity measures suggested in Software Metrics: A Practitioner's Guide to Improved Product Development. A software quality assurance database tool, SQUID, was used to store and tabulate the data.

  6. Preliminary Retrospective Analysis of Daily Tomotherapy Output Constancy Checks Using Statistical Process Control.

    PubMed

    Mezzenga, Emilio; D'Errico, Vincenzo; Sarnelli, Anna; Strigari, Lidia; Menghi, Enrico; Marcocci, Francesco; Bianchini, David; Benassi, Marcello

    2016-01-01

    The purpose of this study was to retrospectively evaluate results from a Helical TomoTherapy Hi-Art treatment system relating to quality controls based on daily static and dynamic output checks, using statistical process control methods. Individual value X-charts, exponentially weighted moving average (EWMA) charts, and process capability and acceptability indices were used to monitor treatment system performance. Daily output values measured from January 2014 to January 2015 were considered. The results showed that, although the process was in control, an out-of-control situation occurred at the principal maintenance intervention for the treatment system. In particular, process capability indices showed a decreasing percentage of points in control, which was nevertheless acceptable according to AAPM TG148 guidelines. Our findings underline the importance of restricting the acceptable range of daily output checks and suggest a future line of investigation toward detailed process control of daily output checks for the Helical TomoTherapy Hi-Art treatment system.
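The EWMA chart mentioned in the abstract can be sketched in a few lines. The following is an illustrative Python version only; the smoothing parameter, the 3-sigma band, and the baseline mean and sigma are generic SPC defaults, not the study's actual values.

```python
# Illustrative EWMA control chart (generic SPC defaults, not the study's
# actual parameters). mu and sigma are Phase I baseline estimates.
def ewma_flags(values, mu, sigma, lam=0.1, nsigma=3.0):
    """Return indices where the EWMA statistic leaves its control limits."""
    # Steady-state half-width of the EWMA control band.
    half_width = nsigma * sigma * (lam / (2.0 - lam)) ** 0.5
    z = mu  # the EWMA statistic starts at the baseline mean
    flags = []
    for i, x in enumerate(values):
        z = lam * x + (1.0 - lam) * z
        if abs(z - mu) > half_width:
            flags.append(i)
    return flags

# A sustained small shift in daily output is caught after a few smoothed points:
daily_output = [10.0, 10.1, 9.9, 10.5, 10.5, 10.5, 10.5]
print(ewma_flags(daily_output, mu=10.0, sigma=0.15))  # [5, 6]
```

Because the EWMA statistic averages over recent points, it detects small persistent shifts that an individual-value X-chart with 3-sigma limits would miss.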

  7. [Pharmaceutical product quality control and good manufacturing practices].

    PubMed

    Hiyama, Yukio

    2010-01-01

    This report describes the roles of Good Manufacturing Practices (GMP) in pharmaceutical product quality control. There are three keys to pharmaceutical product quality control: specifications, thorough product characterization during development, and adherence to GMP, for which the ICH Q6A guideline on specifications provides the most important principles in its background section. The impacts on product quality control of the revised Pharmaceutical Affairs Law (rPAL), which became effective in 2005, are discussed. Progress of the ICH discussions on Pharmaceutical Development (Q8), Quality Risk Management (Q9), and Pharmaceutical Quality System (Q10) is reviewed. In order to reconstruct the GMP guidelines and the GMP inspection system in the regulatory agencies under the new paradigm set by the rPAL and the ICH, a series of Health Science studies was conducted. For the GMP guidelines, a product GMP guideline, a technology transfer guideline, a laboratory control guideline, and a change control system guideline were written. For the GMP inspection system, an inspection checklist, an inspection memo, and an inspection scenario were also proposed by the Health Science study groups. Because pharmaceutical products and their raw materials are manufactured and distributed internationally, collaboration with other national authorities is highly desirable. To enhance international collaboration, consistent establishment of a GMP inspection quality system throughout Japan will be essential.

  8. Providing leadership to a decentralized total quality process.

    PubMed

    Diederich, J J; Eisenberg, M

    1993-01-01

    Integrating total quality management into the culture of an organization and the daily work of employees requires a decentralized leadership structure that encourages all employees to become involved. This article, based upon the experience of the University of Michigan Hospitals Professional Services Divisional Lead Team, outlines a process for decentralizing the total quality management process.

  9. Adverse effect versus quality control of the Fuenzalida-Palacios antirabies vaccine.

    PubMed

    Nogueira, Y L

    1998-01-01

    We evaluated the components of the Fuenzalida-Palacios antirabies vaccine, which is still used in most developing countries for human immunization in treatment and prophylaxis. This vaccine is prepared from newborn mouse brains at 1% concentration. Even though the vaccine is considered to have a low myelin content, it is not fully free of myelin or of other undesirable components that might trigger adverse effects after vaccination. The most severe effect is a post-vaccination neuroparalytic accident associated with Guillain-Barré syndrome. In the present study we demonstrate how vaccines produced and distributed by different laboratories show different component patterns, with different degrees of impurity and varying protein concentrations, indicating that production processes can vary from one laboratory to another. These differences, which could be resolved with a better quality control process, may affect and impair immunization, with consequent risks and adverse effects after vaccination. We used crossed immunoelectrophoresis to evaluate and demonstrate the possibility of quality control in vaccine production, reducing the risk factors possibly involved in these immunizing products.

  10. ISO 9002 as Literacy Practice: Coping with Quality-Control Documents in a High-Tech Company

    ERIC Educational Resources Information Center

    Kleifgen, Jo Anne

    2005-01-01

    This study describes the process by which a circuit board manufacturing company became certified in an international quality control program known as ISO 9002. Particular attention is paid to how quality documents were made and used in actual practice and to the relationship between these standardized procedures (official literacies) and…

  11. QCScreen: a software tool for data quality control in LC-HRMS based metabolomics.

    PubMed

    Simader, Alexandra Maria; Kluger, Bernhard; Neumann, Nora Katharina Nicole; Bueschl, Christoph; Lemmens, Marc; Lirk, Gerald; Krska, Rudolf; Schuhmacher, Rainer

    2015-10-24

    Metabolomics experiments often comprise large numbers of biological samples, resulting in huge amounts of data. These data need to be inspected for plausibility before evaluation in order to detect putative sources of error, e.g. retention time or mass accuracy shifts. Especially in liquid chromatography-high resolution mass spectrometry (LC-HRMS) based metabolomics research, proper quality control checks (e.g. for precision, signal drifts or offsets) are crucial prerequisites for achieving reliable and comparable results within and across experimental measurement sequences. Software tools can support this process. The software tool QCScreen was developed to offer a quick and easy data quality check of LC-HRMS derived data. It allows a flexible investigation and comparison of basic quality-related parameters within user-defined target features, and the possibility to automatically evaluate multiple sample types within or across different measurement sequences in a short time. It offers a user-friendly interface that allows an easy selection of processing steps and parameter settings. The generated results include a coloured overview plot of data quality across all analysed samples and targets and, in addition, detailed illustrations of the stability and precision of the chromatographic separation, the mass accuracy and the detector sensitivity. The use of QCScreen is demonstrated with experimental data from metabolomics experiments using selected standard compounds in pure solvent. The application of the software identified problematic features, samples and analytical parameters and suggested which data files or compounds required closer manual inspection. QCScreen is an open source software tool which provides a useful basis for assessing the suitability of LC-HRMS data prior to time-consuming, detailed data processing and subsequent statistical analysis. 
It accepts the generic mzXML format and thus can be used with many different LC-HRMS platforms to process both multiple
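    As an illustration of the kind of per-feature check described above, here is a hypothetical sketch (not QCScreen's actual code): it flags target features whose measured m/z drifts from the expected value by more than a ppm tolerance, one of the mass accuracy checks the abstract mentions.

```python
# Hypothetical mass-accuracy QC check in the spirit of QCScreen (not its
# actual code): flag measurements that deviate by more than tol_ppm.
def mass_accuracy_flags(measured_mz, expected_mz, tol_ppm=5.0):
    """Return (index, ppm deviation) for every out-of-tolerance measurement."""
    flags = []
    for i, mz in enumerate(measured_mz):
        ppm = (mz - expected_mz) / expected_mz * 1e6
        if abs(ppm) > tol_ppm:
            flags.append((i, round(ppm, 1)))
    return flags

# Three injections of a target with expected m/z 200.0; the third drifts:
print(mass_accuracy_flags([200.0000, 200.0005, 200.0030], 200.0))  # [(2, 15.0)]
```

The same pattern applies to retention time or intensity drift: compute a per-sample deviation against a reference value and flag anything outside a user-defined tolerance.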

  12. Quality control in the development of coagulation factor concentrates.

    PubMed

    Snape, T J

    1987-01-01

    Limitation of process change is a major factor contributing to assurance of quality in pharmaceutical manufacturing. This is particularly true in the manufacture of coagulation factor concentrates, for which presumptive testing for poorly defined product characteristics is an integral feature of finished product quality control. The development of new or modified preparations requires that this comfortable position be abandoned, and that the effect on finished product characteristics of changes to individual process steps (and components) be assessed. The degree of confidence in the safety and efficacy of the new product will be determined by, amongst other things, the complexity of the process alteration and the extent to which the results of finished product tests can be considered predictive. The introduction of a heat-treatment step for inactivation of potential viral contaminants in coagulation factor concentrates presents a significant challenge in both respects, quite independent of any consideration of assessment of the effectiveness of the viral inactivation step. These interactions are illustrated by some of the problems encountered with terminal dry heat-treatment (72 h. at 80 degrees C) of factor VIII and prothrombin complex concentrates manufactured by the Blood Products Laboratory.

  13. Executing Quality: A Grounded Theory of Child Care Quality Improvement Engagement Process in Pennsylvania

    ERIC Educational Resources Information Center

    Critchosin, Heather

    2014-01-01

    Executing Quality describes the perceived process experienced by participants while engaging in Keystone Standards, Training, Assistance, Resources, and Support (Keystone STARS) quality rating improvement system (QRIS). The purpose of this qualitative inquiry was to understand the process of Keystone STARS engagement in order to generate a…

  14. Web-based X-ray quality control documentation.

    PubMed

    David, George; Burnett, Lou Ann; Schenkel, Robert

    2003-01-01

    The department of radiology at the Medical College of Georgia Hospital and Clinics has developed an equipment quality control web site. Our goal is to provide immediate access to virtually all medical physics survey data. The web site is designed to assist equipment engineers, department management and technologists. By improving communications and access to equipment documentation, we believe productivity is enhanced. The creation of the quality control web site was accomplished in three distinct steps. First, survey data had to be placed in a computer format. The second step was to convert these various computer files to a format supported by commercial web browsers. Third, a comprehensive home page had to be designed to provide convenient access to the multitude of surveys done in the various x-ray rooms. Because we had spent years previously fine-tuning the computerization of the medical physics quality control program, most survey documentation was already in spreadsheet or database format. A major technical decision was the method of conversion of survey spreadsheet and database files into documentation appropriate for the web. After an unsatisfactory experience with a HyperText Markup Language (HTML) converter (packaged with spreadsheet and database software), we tried creating Portable Document Format (PDF) files using Adobe Acrobat software. This process preserves the original formatting of the document and takes no longer than conventional printing; therefore, it has been very successful. Although the PDF file generated by Adobe Acrobat is a proprietary format, it can be displayed through a conventional web browser using the freely distributed Adobe Acrobat Reader program that is available for virtually all platforms. Once a user installs the software, it is automatically invoked by the web browser whenever the user follows a link to a file with a PDF extension. Although no confidential patient information is available on the web site, our legal

  15. Standardization and quality control parameters for Muktā Bhasma (calcined pearl)

    PubMed Central

    Joshi, Namrata; Sharma, Khemchand; Peter, Hema; Dash, Manoj Kumar

    2015-01-01

    Background: Muktā Bhasma (MB) is a traditional Ayurvedic preparation for cough, breathlessness, and eye disorders; it is regarded as a powerful cardiac tonic and mood elevator and is known to promote strength, intellect, and semen production. Objectives: The present research work was conducted to generate a fingerprint for raw and processed MB for quality assessment and standardization using classical and other techniques. Setting and Design: Three samples of MB were prepared by purification (śodhana) of Muktā (pearl) followed by repeated calcinations (Māraṇa). The resultant product was subjected to organoleptic tests and Ayurvedic quality control tests such as rekhāpūrṇatā, vāritaratva, and nirdhūmatva. Materials and Methods: For quality control, physicochemical parameters such as loss on drying, total ash value, acid-insoluble ash, specific gravity, and pH value were determined, and further tests using techniques such as elemental analysis with energy dispersive X-ray analysis (EDAX), structural study with powder X-ray diffraction, and particle size analysis with scanning electron microscopy (SEM) were carried out on raw Muktā, Śodhita Muktā, and triplicate batches of MB. Results: The study showed that the raw material Muktā was calcium carbonate in aragonite form, which on repeated calcination was converted into the more stable calcite form. SEM studies revealed that in the raw and purified materials the particles were scattered and unevenly arranged in the range of 718.7–214.7 nm, while in the final product uniformly arranged, stable, rod-shaped, and rounded particles with more agglomerates were observed in the range of 279.2–79.93 nm. EDAX analysis revealed calcium as a major ingredient in MB (average 46.32%), which increased gradually through the stages of processing (raw 34.11%, Śodhita 37.5%). Conclusion: Quality control parameters have been quantified for fingerprinting of MB prepared using a particular method. PMID:26600667

  16. 21 CFR 862.1660 - Quality control material (assayed and unassayed).

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 8 2011-04-01 2011-04-01 false Quality control material (assayed and unassayed... Test Systems § 862.1660 Quality control material (assayed and unassayed). (a) Identification. A quality... that may arise from reagent or analytical instrument variation. A quality control material (assayed and...

  17. QUALITY ASSURANCE AND QUALITY CONTROL FOR WASTE CONTAINMENT FACILITIES. Project Summary

    EPA Science Inventory

    It is generally agreed that both quality assurance (QA) and quality control (QC) are essential to the proper installation and eventual performance of environmentally safe and secure waste containment systems. Even further, there are both manufacturing and construction aspects to...

  18. [Quality control an assessment system. Its location within a program for food, nutrition and metabolic intervention].

    PubMed

    Santana Porbén, S

    2012-01-01

    A design proposal for an HQCAS (Hospital Quality Control and Assessment System) covering the nutritional and feeding care processes conducted in a hospital environment is presented in this article. The design proposal is accompanied by the results of inspections conducted by the hospital NST (Nutritional Support Group) between 2005 and 2010. The system design includes the quality policies that should govern the useful and safe conduction of such processes, the recording and documentary foundations of the system, and the quality control and assessment exercises for continuous verification of the established policies. The current state of these processes was documented from secondary records opened by the NST after satisfying consultation requests from the medical care teams of the institution. Inspections conducted by the NST revealed that fewer than half of clinical charts contained information minimally sufficient for elaborating nutritional judgments, that almost one fifth of the assisted patients were nil per os without any prescribed nutritional support scheme, and that prescription and use of artificial nutrition schemes were low. Corrective measures adopted by the NST served to significantly increase the rates of successful completion of inspected processes. Quality assurance of feeding and nutritional care processes is a practical as well as an intellectual activity subject to constant remodeling, in order to always warrant the fulfillment of the quality policies advanced by the NST and, thus, that the patient benefits from the prescribed nutritional intervention strategy.

  19. Robot welding process control

    NASA Technical Reports Server (NTRS)

    Romine, Peter L.

    1991-01-01

    This final report documents the development and installation of software and hardware for Robotic Welding Process Control. Primary emphasis is on serial communications between the CYRO 750 robotic welder, Heurikon minicomputer running Hunter & Ready VRTX, and an IBM PC/AT, for offline programming and control and closed-loop welding control. The requirements for completion of the implementation of the Rocketdyne weld tracking control are discussed. The procedure for downloading programs from the Intergraph, over the network, is discussed. Conclusions are made on the results of this task, and recommendations are made for efficient implementation of communications, weld process control development, and advanced process control procedures using the Heurikon.

  20. Improved process robustness by using closed loop control in deep drawing applications

    NASA Astrophysics Data System (ADS)

    Barthau, M.; Liewald, M.; Held, Christian

    2017-09-01

    The production of irregularly shaped deep-drawing parts with high quality requirements, common in today's automotive production, permanently challenges production processes. High requirements on the lightweight construction of passenger car bodies, following European regulations up to 2020, have been substantially increasing the use of high-strength steels for years and are leading to bigger challenges in sheet metal part production. The increasingly complex shapes of today's car body shells, driven by modern and future design criteria, intensify the issue further. Metal forming technology tries to meet these challenges with a highly sophisticated layout of deep-drawing dies that considers part quality requirements, process robustness, and controlled material flow during the deep or stretch drawing process phase. A new method for controlling material flow using a closed-loop system was developed at the IFU Stuttgart. In contrast to previous approaches, this new method allows a control intervention during the deep-drawing stroke. The blank holder force around the outline of the drawn part is used as the control variable. The closed loop is designed as a trajectory follow-up with feedforward control. The command variable used is the part-wall stress, measured with a piezoelectric measuring pin. In this paper the control loop is described in detail. The experimental tool built for testing the new control approach is explained together with its features. A method for obtaining the follow-up trajectories from simulation is also presented. Furthermore, experimental results concerning the robustness of the deep-drawing process and the gain in process performance achieved with the developed control loop are shown. Finally, a new procedure for the industrial application of the new control method is presented, using a new kind of active element to influence the local blank holder pressure onto part
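    The trajectory follow-up with feedforward described in the abstract can be illustrated schematically. The sketch below is not the authors' controller: the first-order plant and the PI gains are invented, and it shows only the general structure of combining a feedforward command with feedback correction of the measured part-wall stress.

```python
# Schematic trajectory follow-up: feedforward command plus PI feedback.
# The plant model and gains are invented for illustration.
def follow_trajectory(setpoints, feedforward, plant, kp=0.5, ki=0.1):
    """Track a part-wall-stress trajectory by adjusting blank holder force."""
    integral = 0.0
    measured = 0.0
    outputs = []
    for sp, ff in zip(setpoints, feedforward):
        error = sp - measured                # deviation from commanded stress
        integral += error
        u = ff + kp * error + ki * integral  # feedforward + PI correction
        measured = plant(u)                  # plant maps holder force to stress
        outputs.append(measured)
    return outputs

# Invented static plant: measured stress is 80% of the applied force.
trace = follow_trajectory([10.0] * 30, [12.5] * 30, plant=lambda u: 0.8 * u)
```

The feedforward term carries most of the command (here, the force that would produce the setpoint under a perfect plant), so the feedback loop only has to correct residual deviations, which is what makes intervention during the drawing stroke practical.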

  1. Quality control mechanisms exclude incorrect polymerases from the eukaryotic replication fork

    PubMed Central

    Schauer, Grant D.; O’Donnell, Michael E.

    2017-01-01

    The eukaryotic genome is primarily replicated by two DNA polymerases, Pol ε and Pol δ, that function on the leading and lagging strands, respectively. Previous studies have established recruitment mechanisms whereby Cdc45-Mcm2-7-GINS (CMG) helicase binds Pol ε and tethers it to the leading strand, and PCNA (proliferating cell nuclear antigen) binds tightly to Pol δ and recruits it to the lagging strand. The current report identifies quality control mechanisms that exclude the improper polymerase from a particular strand. We find that the replication factor C (RFC) clamp loader specifically inhibits Pol ε on the lagging strand, and CMG protects Pol ε against RFC inhibition on the leading strand. Previous studies show that Pol δ is slow and distributive with CMG on the leading strand. However, Saccharomyces cerevisiae Pol δ–PCNA is a rapid and processive enzyme, suggesting that CMG may bind and alter Pol δ activity or position it on the lagging strand. Measurements of polymerase binding to CMG demonstrate Pol ε binds CMG with a Kd value of 12 nM, but Pol δ binding CMG is undetectable. Pol δ, like bacterial replicases, undergoes collision release upon completing replication, and we propose Pol δ–PCNA collides with the slower CMG, and in the absence of a stabilizing Pol δ–CMG interaction, the collision release process is triggered, ejecting Pol δ on the leading strand. Hence, by eviction of incorrect polymerases at the fork, the clamp machinery directs quality control on the lagging strand and CMG enforces quality control on the leading strand. PMID:28069954

  2. Multi-Agent Architecture with Support to Quality of Service and Quality of Control

    NASA Astrophysics Data System (ADS)

    Poza-Luján, Jose-Luis; Posadas-Yagüe, Juan-Luis; Simó-Ten, Jose-Enrique

    Multi-agent systems (MAS) are one of the most suitable frameworks for the implementation of intelligent distributed control systems. Agents provide the flexibility needed to support the heterogeneity inherent in cyber-physical systems. Quality of Service (QoS) and Quality of Control (QoC) parameters are commonly used to evaluate the efficiency of the communications and of the control loop. Agents can use these quality measures to take a wide range of decisions, such as suitable placement on a control node or changing the workload to save energy. This article describes the architecture of a multi-agent system that provides support for QoS and QoC parameters in order to optimize the system. The architecture uses a Publish-Subscribe model, based on the Data Distribution Service (DDS), to send the control messages. Owing to the nature of the Publish-Subscribe model, the architecture is well suited to implementing event-based control (EBC) systems. The architecture has been called FSACtrl.
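    The Publish-Subscribe pattern with a QoS-style indicator can be sketched as follows. The broker class and the queue-depth metric are invented for illustration and are not taken from FSACtrl or DDS; real DDS implementations expose far richer QoS policies.

```python
# Minimal publish-subscribe broker with one QoS-style indicator (deepest
# unconsumed subscriber queue); names are invented, not taken from FSACtrl.
from collections import defaultdict, deque

class Broker:
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, queue):
        """Register a subscriber queue for a topic."""
        self.subscribers[topic].append(queue)

    def publish(self, topic, message):
        """Deliver a message to every queue subscribed to the topic."""
        for q in self.subscribers[topic]:
            q.append(message)

    def qos_queue_depth(self, topic):
        """QoS indicator: length of the deepest unconsumed queue on the topic."""
        return max((len(q) for q in self.subscribers[topic]), default=0)

# A control agent could watch qos_queue_depth to detect a lagging consumer:
broker = Broker()
sensor_queue = deque()
broker.subscribe("temperature", sensor_queue)
broker.publish("temperature", 21.5)
```

Because delivery is driven by publish events rather than polling, a metric like queue depth directly reflects how far a consumer lags behind the event stream, which is the kind of signal an event-based control architecture can act on.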

  3. 40 CFR 81.87 - Metropolitan Boise Intrastate Air Quality Control Region.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Quality Control Region. 81.87 Section 81.87 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Air Quality Control Regions § 81.87 Metropolitan Boise Intrastate Air Quality Control Region. The Metropolitan Boise Intrastate Air Quality Control Region (Idaho) consists of the territorial area encompassed...

  4. 40 CFR 81.89 - Metropolitan Cheyenne Intrastate Air Quality Control Region.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Quality Control Region. 81.89 Section 81.89 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Air Quality Control Regions § 81.89 Metropolitan Cheyenne Intrastate Air Quality Control Region. The Metropolitan Cheyenne Intrastate Air Quality Control Region (Wyoming) consists of the territorial area...

  5. 40 CFR 81.101 - Metropolitan Dubuque Interstate Air Quality Control Region.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Quality Control Region. 81.101 Section 81.101 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Air Quality Control Regions § 81.101 Metropolitan Dubuque Interstate Air Quality Control Region. The Metropolitan Dubuque Interstate Air Quality Control Region (Illinois-Iowa-Wisconsin) consists of the...

  6. 40 CFR 81.104 - Central Pennsylvania Intrastate Air Quality Control Region.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Quality Control Region. 81.104 Section 81.104 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Air Quality Control Regions § 81.104 Central Pennsylvania Intrastate Air Quality Control Region. The Central Pennsylvania Intrastate Air Quality Control Region consists of the territorial area encompassed by...

  7. 40 CFR 81.106 - Greenville-Spartanburg Intrastate Air Quality Control Region.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Quality Control Region. 81.106 Section 81.106 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Air Quality Control Regions § 81.106 Greenville-Spartanburg Intrastate Air Quality Control Region. The Greenville-Spartanburg Intrastate Air Quality Control Region (South Carolina) consists of the territorial...

  8. 40 CFR 81.120 - Middle Tennessee Intrastate Air Quality Control Region.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Quality Control Region. 81.120 Section 81.120 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Air Quality Control Regions § 81.120 Middle Tennessee Intrastate Air Quality Control Region. The Middle Tennessee Intrastate Air Quality Control Region consists of the territorial area encompassed by...

  9. 40 CFR 81.75 - Metropolitan Charlotte Interstate Air Quality Control Region.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Quality Control Region. 81.75 Section 81.75 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Air Quality Control Regions § 81.75 Metropolitan Charlotte Interstate Air Quality Control Region. The Metropolitan Charlotte Interstate Air Quality Control Region (North Carolina-South Carolina) has been revised...

  10. 40 CFR 81.79 - Northeastern Oklahoma Intrastate Air Quality Control Region.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Quality Control Region. 81.79 Section 81.79 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Air Quality Control Regions § 81.79 Northeastern Oklahoma Intrastate Air Quality Control Region. The Metropolitan Tulsa Intrastate Air Quality Control Region has been renamed the Northeastern Oklahoma Intrastate...

  11. 40 CFR 81.119 - Western Tennessee Intrastate Air Quality Control Region.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Quality Control Region. 81.119 Section 81.119 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Air Quality Control Regions § 81.119 Western Tennessee Intrastate Air Quality Control Region. The Western Tennessee Intrastate Air Quality Control Region consists of the territorial area encompassed by...

  12. 40 CFR 81.104 - Central Pennsylvania Intrastate Air Quality Control Region.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Quality Control Region. 81.104 Section 81.104 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Air Quality Control Regions § 81.104 Central Pennsylvania Intrastate Air Quality Control Region. The Central Pennsylvania Intrastate Air Quality Control Region consists of the territorial area encompassed by...

  13. 40 CFR 81.87 - Metropolitan Boise Intrastate Air Quality Control Region.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Quality Control Region. 81.87 Section 81.87 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Air Quality Control Regions § 81.87 Metropolitan Boise Intrastate Air Quality Control Region. The Metropolitan Boise Intrastate Air Quality Control Region (Idaho) consists of the territorial area encompassed...

  14. 40 CFR 81.79 - Northeastern Oklahoma Intrastate Air Quality Control Region.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Quality Control Region. 81.79 Section 81.79 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Air Quality Control Regions § 81.79 Northeastern Oklahoma Intrastate Air Quality Control Region. The Metropolitan Tulsa Intrastate Air Quality Control Region has been renamed the Northeastern Oklahoma Intrastate...

  15. 40 CFR 81.101 - Metropolitan Dubuque Interstate Air Quality Control Region.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Quality Control Region. 81.101 Section 81.101 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Air Quality Control Regions § 81.101 Metropolitan Dubuque Interstate Air Quality Control Region. The Metropolitan Dubuque Interstate Air Quality Control Region (Illinois-Iowa-Wisconsin) consists of the...

  16. 40 CFR 81.106 - Greenville-Spartanburg Intrastate Air Quality Control Region.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Quality Control Region. 81.106 Section 81.106 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Air Quality Control Regions § 81.106 Greenville-Spartanburg Intrastate Air Quality Control Region. The Greenville-Spartanburg Intrastate Air Quality Control Region (South Carolina) consists of the territorial...

  17. 40 CFR 81.89 - Metropolitan Cheyenne Intrastate Air Quality Control Region.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Quality Control Region. 81.89 Section 81.89 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Air Quality Control Regions § 81.89 Metropolitan Cheyenne Intrastate Air Quality Control Region. The Metropolitan Cheyenne Intrastate Air Quality Control Region (Wyoming) consists of the territorial area...

  18. 40 CFR 81.62 - Northeast Mississippi Intrastate Air Quality Control Region.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Quality Control Region. 81.62 Section 81.62 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Air Quality Control Regions § 81.62 Northeast Mississippi Intrastate Air Quality Control Region. The Alabama-Mississippi-Tennessee Interstate Air Quality Control Region has been renamed the Northeast...

  19. 40 CFR 81.78 - Metropolitan Portland Intrastate Air Quality Control Region.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Quality Control Region. 81.78 Section 81.78 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Air Quality Control Regions § 81.78 Metropolitan Portland Intrastate Air Quality Control Region. The Metropolitan Portland Intrastate Air Quality Control Region (Maine) consists of the territorial area...

  20. 40 CFR 81.77 - Puerto Rico Air Quality Control Region.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 18 2014-07-01 2014-07-01 false Puerto Rico Air Quality Control Region... Control Regions § 81.77 Puerto Rico Air Quality Control Region. The Puerto Rico Air Quality Control Region... delimited): The entire Commonwealth of Puerto Rico: Puerto Rico and surrounding islands, Vieques and...

  1. HPLC for quality control of polyimides

    NASA Technical Reports Server (NTRS)

    Young, P. R.; Sykes, G. F.

    1979-01-01

    The use of High Pressure Liquid Chromatography (HPLC) as a quality control tool for polyimide resins and prepregs is presented. A database to help establish accept/reject criteria for these materials was developed. This work is intended to supplement, not replace, the standard quality control tests normally conducted on incoming resins and prepregs. To help achieve these objectives, the HPLC separation of LARC-160 polyimide precursor resin was characterized. Room-temperature resin aging effects were studied. Graphite-reinforced composites made from fresh and aged resin were fabricated and tested to determine whether the changes observed by HPLC were significant.
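
    The accept/reject idea above can be sketched as a simple screen of peak-area fractions against a fresh-resin reference profile. The peak names, values, and tolerance below are hypothetical illustrations, not data from the study.

```python
# Illustrative accept/reject screen built on an HPLC reference profile:
# compare a batch's normalized peak-area fractions against a fresh-resin
# baseline. Peak names, areas, and the tolerance are invented.

def accept_batch(batch_areas, reference_areas, tolerance=0.05):
    """Reject if any normalized peak-area fraction drifts past tolerance."""
    total_b = sum(batch_areas.values())
    total_r = sum(reference_areas.values())
    for peak, area in reference_areas.items():
        drift = abs(batch_areas.get(peak, 0.0) / total_b - area / total_r)
        if drift > tolerance:
            return False
    return True

reference = {"monomer": 0.60, "dimer": 0.25, "oligomer": 0.15}
aged = {"monomer": 0.45, "dimer": 0.30, "oligomer": 0.25}  # aged resin
print(accept_batch(aged, reference))  # drifted profile fails the screen
```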

  2. Latest processing status and quality assessment of the GOMOS, MIPAS and SCIAMACHY ESA dataset

    NASA Astrophysics Data System (ADS)

    Niro, F.; Brizzi, G.; Saavedra de Miguel, L.; Scarpino, G.; Dehn, A.; Fehr, T.; von Kuhlmann, R.

    2011-12-01

    The GOMOS, MIPAS and SCIAMACHY instruments have been successfully observing the changing Earth's atmosphere since the launch of the ESA ENVISAT platform in March 2002. The measurements recorded by these instruments are relevant to the atmospheric chemistry community both for their time extent and for their variety of observing geometries and techniques. To fully exploit these measurements, it is crucial to maintain reliable data processing and distribution and to continuously improve the scientific output, so as to meet the evolving needs of both near-real-time and research applications. Within this frame, the ESA operational processor remains the reference code, although many scientific algorithms are now available to users: the ESA algorithm has a well-established calibration and validation scheme, a certified quality assessment process, and the ability to reach a wide user community. Moreover, the ESA algorithm upgrade procedures and re-processing performance have improved considerably over the last two years, thanks to recent updates of the Ground Segment infrastructure and overall organization. The aim of this paper is to promote the usage and stress the quality of the ESA operational dataset for the GOMOS, MIPAS and SCIAMACHY missions. The recent upgrades to the ESA processors (GOMOS V6, MIPAS V5 and SCIAMACHY V5) will be presented, with detailed information on improvements in the scientific output and preliminary validation results. The planned algorithm evolution and on-going re-processing campaigns will be discussed, including the adoption of advanced set-ups such as the MIPAS V6 re-processing on a cloud-computing system. Finally, the quality control process that guarantees a consistent standard of quality to users will be illustrated: the operational ESA algorithm is carefully tested before being switched into operations, and the near-real-time and off-line production is thoroughly verified via the

  3. COMMUNITY MULTISCALE AIR QUALITY ( CMAQ ) MODEL - QUALITY ASSURANCE AND VERSION CONTROL

    EPA Science Inventory

    This presentation will be given to the EPA Exposure Modeling Workgroup on January 24, 2006. The quality assurance and version control procedures for the Community Multiscale Air Quality (CMAQ) Model are presented. A brief background of CMAQ is given, then issues related to qual...

  4. Implementing self sustained quality control procedures in a clinical laboratory.

    PubMed

    Khatri, Roshan; K C, Sanjay; Shrestha, Prabodh; Sinha, J N

    2013-01-01

    Quality control is an essential component of every clinical laboratory: it maintains the excellence of laboratory standards, supports proper disease diagnosis and patient care, and strengthens the health care system overall. Numerous quality control schemes are available, with combinations of procedures, most of which are tedious, time consuming and can be "too technical", whereas commercially available quality control materials can be expensive, especially for laboratories in developing nations like Nepal. Here, we present a procedure performed at our centre with self-prepared control serum and simple statistical tools for quality assurance. The pooled serum was prepared as per guidelines for the preparation of stabilized liquid quality control serum from human sera. Internal quality assessment was performed on this sample on a daily basis and included measurement of 12 routine biochemical parameters. The results were plotted on Levey-Jennings charts and analysed with quality control rules over a period of one month. The mean levels of biochemical analytes in the self-prepared control serum were within the normal physiological range. This serum was evaluated every day along with patients' samples. The results obtained were plotted on control charts and analysed using common quality control rules to identify possible systematic and random errors. Immediate mitigation measures were taken and the dispatch of erroneous reports was avoided. In this study we highlight a simple internal quality control procedure that laboratories can perform with minimal technology, expenditure and expertise to improve the reliability and validity of their test reports.
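
    The daily routine described above, plotting control results on a Levey-Jennings chart and applying common control rules, can be sketched as follows. The 1-2s, 2-2s and 1-3s rules are standard Westgard rules; the target mean, SD and glucose values are hypothetical illustrations.

```python
# Sketch of daily internal QC using a Levey-Jennings approach with two
# common Westgard rejection rules (1-3s and 2-2s) plus the 1-2s warning.
# Target mean/SD and the control values are hypothetical.

def qc_flags(values, mean, sd):
    """Flag control-serum results against Westgard 1-3s, 2-2s, 1-2s rules."""
    z = [(v - mean) / sd for v in values]
    flags = []
    for i, zi in enumerate(z):
        if abs(zi) > 3:
            flags.append((i, "1-3s reject"))   # one point beyond +/-3 SD
        elif i > 0 and z[i - 1] > 2 and zi > 2:
            flags.append((i, "2-2s reject"))   # two consecutive beyond +2 SD
        elif i > 0 and z[i - 1] < -2 and zi < -2:
            flags.append((i, "2-2s reject"))   # two consecutive beyond -2 SD
        elif abs(zi) > 2:
            flags.append((i, "1-2s warning"))  # one point beyond +/-2 SD
    return flags

# Example: glucose control serum, target 5.0 mmol/L, SD 0.2 (hypothetical)
daily = [5.1, 4.9, 5.5, 5.45, 5.0, 5.8]
print(qc_flags(daily, mean=5.0, sd=0.2))
```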

  5. 40 CFR 81.118 - Southwest Missouri Intrastate Air Quality Control Region.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Quality Control Region. 81.118 Section 81.118 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Air Quality Control Regions § 81.118 Southwest Missouri Intrastate Air Quality Control Region. The Southwest Missouri Intrastate Air Quality Control Region consists of the territorial area encompassed by the...

  6. 40 CFR 81.116 - Northern Missouri Intrastate Air Quality Control Region.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Quality Control Region. 81.116 Section 81.116 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Air Quality Control Regions § 81.116 Northern Missouri Intrastate Air Quality Control Region. The Northern Missouri Intrastate Air Quality Control Region consists of the territorial area encompassed by the...

  7. 40 CFR 81.97 - Southwest Florida Intrastate Air Quality Control Region.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Quality Control Region. 81.97 Section 81.97 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Air Quality Control Regions § 81.97 Southwest Florida Intrastate Air Quality Control Region. The Southwest Florida Intrastate Air Quality Control Region consists of the territorial area encompassed by the...

  8. 40 CFR 81.117 - Southeast Missouri Intrastate Air Quality Control Region.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Quality Control Region. 81.117 Section 81.117 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Air Quality Control Regions § 81.117 Southeast Missouri Intrastate Air Quality Control Region. The Southeast Missouri Intrastate Air Quality Control Region consists of the territorial area encompassed by the...

  9. 40 CFR 81.98 - Burlington-Keokuk Interstate Air Quality Control Region.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Quality Control Region. 81.98 Section 81.98 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Air Quality Control Regions § 81.98 Burlington-Keokuk Interstate Air Quality Control Region. The Burlington-Keokuk Interstate Air Quality Control Region (Illinois-Iowa) is revised to consist of the...

  10. 40 CFR 81.118 - Southwest Missouri Intrastate Air Quality Control Region.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Quality Control Region. 81.118 Section 81.118 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Air Quality Control Regions § 81.118 Southwest Missouri Intrastate Air Quality Control Region. The Southwest Missouri Intrastate Air Quality Control Region consists of the territorial area encompassed by the...

  11. 40 CFR 81.115 - Northwest Nevada Intrastate Air Quality Control Region.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Quality Control Region. 81.115 Section 81.115 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Air Quality Control Regions § 81.115 Northwest Nevada Intrastate Air Quality Control Region. The Northwest Nevada Intrastate Air Quality Control Region consists of the territorial area encompassed by the...

  12. 40 CFR 81.116 - Northern Missouri Intrastate Air Quality Control Region.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Quality Control Region. 81.116 Section 81.116 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Air Quality Control Regions § 81.116 Northern Missouri Intrastate Air Quality Control Region. The Northern Missouri Intrastate Air Quality Control Region consists of the territorial area encompassed by the...

  13. 40 CFR 81.123 - Southeastern Oklahoma Intrastate Air Quality Control Region.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Quality Control Region. 81.123 Section 81.123 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Air Quality Control Regions § 81.123 Southeastern Oklahoma Intrastate Air Quality Control Region. The Southeastern Oklahoma Intrastate Air Quality Control Region consists of the territorial area encompassed by the...

  14. 40 CFR 81.67 - Lake Michigan Intrastate Air Quality Control Region.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Quality Control Regions § 81.67 Lake Michigan Intrastate Air Quality Control Region. The Menominee-Escanaba (Michigan)-Marinette (Wisconsin) Interstate Air Quality Control Region has been renamed the Lake Michigan Intrastate Air Quality Control Region (Wisconsin) and revised to consist of the territorial area...

  15. 40 CFR 81.115 - Northwest Nevada Intrastate Air Quality Control Region.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Quality Control Region. 81.115 Section 81.115 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Air Quality Control Regions § 81.115 Northwest Nevada Intrastate Air Quality Control Region. The Northwest Nevada Intrastate Air Quality Control Region consists of the territorial area encompassed by the...

  16. 40 CFR 81.97 - Southwest Florida Intrastate Air Quality Control Region.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Quality Control Region. 81.97 Section 81.97 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Air Quality Control Regions § 81.97 Southwest Florida Intrastate Air Quality Control Region. The Southwest Florida Intrastate Air Quality Control Region consists of the territorial area encompassed by the...

  17. 40 CFR 81.117 - Southeast Missouri Intrastate Air Quality Control Region.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Quality Control Region. 81.117 Section 81.117 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Air Quality Control Regions § 81.117 Southeast Missouri Intrastate Air Quality Control Region. The Southeast Missouri Intrastate Air Quality Control Region consists of the territorial area encompassed by the...

  18. 40 CFR 81.122 - Mississippi Delta Intrastate Air Quality Control Region.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Quality Control Region. 81.122 Section 81.122 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Air Quality Control Regions § 81.122 Mississippi Delta Intrastate Air Quality Control Region. The Mississippi Delta Intrastate Air Quality Control Region consists of the territorial area encompassed by the...

  19. 40 CFR 81.98 - Burlington-Keokuk Interstate Air Quality Control Region.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Quality Control Region. 81.98 Section 81.98 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... Air Quality Control Regions § 81.98 Burlington-Keokuk Interstate Air Quality Control Region. The Burlington-Keokuk Interstate Air Quality Control Region (Illinois-Iowa) is revised to consist of the...

  20. Yeast prions are useful for studying protein chaperones and protein quality control.

    PubMed

    Masison, Daniel C; Reidy, Michael

    2015-01-01

    Protein chaperones help proteins adopt and maintain native conformations and play vital roles in cellular processes where proteins are partially folded. They comprise a major part of the cellular protein quality control system that protects the integrity of the proteome. Many disorders are caused when proteins misfold despite this protection. Yeast prions are fibrous amyloid aggregates of misfolded proteins. The normal action of chaperones on yeast prions breaks the fibers into pieces, which results in prion replication. Because this process is necessary for propagation of yeast prions, even small differences in activity of many chaperones noticeably affect prion phenotypes. Several other factors involved in protein processing also influence formation, propagation or elimination of prions in yeast. Thus, in much the same way that the dependency of viruses on cellular functions has allowed us to learn much about cell biology, the dependency of yeast prions on chaperones presents a unique and sensitive way to monitor the functions and interactions of many components of the cell's protein quality control system. Our recent work illustrates the utility of this system for identifying and defining chaperone machinery interactions.

  1. Dynamic control of remelting processes

    DOEpatents

    Bertram, Lee A.; Williamson, Rodney L.; Melgaard, David K.; Beaman, Joseph J.; Evans, David G.

    2000-01-01

    An apparatus and method of controlling a remelting process by providing measured process variable values to a process controller; estimating process variable values using a process model of a remelting process; and outputting estimated process variable values from the process controller. Feedback and feedforward control devices receive the estimated process variable values and adjust inputs to the remelting process. Electrode weight, electrode mass, electrode gap, process current, process voltage, electrode position, electrode temperature, electrode thermal boundary layer thickness, electrode velocity, electrode acceleration, slag temperature, melting efficiency, cooling water temperature, cooling water flow rate, crucible temperature profile, slag skin temperature, and/or drip short events are employed, as are parameters representing physical constraints of electroslag remelting or vacuum arc remelting, as applicable.
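
    A minimal sketch of the scheme's core loop, blending a model prediction of a process variable with a noisy measurement and feeding the estimate back to a process input. The melt-rate model, filter weight, gain and setpoint are invented placeholders, not values from the patent.

```python
# Sketch of model-based estimation plus feedback control in the spirit of
# the patent's scheme: a complementary filter stands in for the process
# model, and a proportional term adjusts the process current. All numbers
# (gains, setpoint, measurements) are hypothetical.

def control_step(est_rate, measured_rate, current, setpoint,
                 alpha=0.3, kp=50.0):
    """Blend model estimate with a noisy measurement, then adjust current."""
    # Estimator: weighted blend of model prediction and sensor reading
    est_rate = (1 - alpha) * est_rate + alpha * measured_rate
    # Feedback: proportional correction of process current toward setpoint
    current += kp * (setpoint - est_rate)
    return est_rate, current

est, amps = 1.0, 5000.0                 # melt-rate estimate (kg/min), current (A)
for meas in [0.9, 0.95, 1.1, 1.05]:     # noisy melt-rate measurements
    est, amps = control_step(est, meas, amps, setpoint=1.0)
print(round(est, 3), round(amps, 1))
```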

  2. ChiLin: a comprehensive ChIP-seq and DNase-seq quality control and analysis pipeline.

    PubMed

    Qin, Qian; Mei, Shenglin; Wu, Qiu; Sun, Hanfei; Li, Lewyn; Taing, Len; Chen, Sujun; Li, Fugen; Liu, Tao; Zang, Chongzhi; Xu, Han; Chen, Yiwen; Meyer, Clifford A; Zhang, Yong; Brown, Myles; Long, Henry W; Liu, X Shirley

    2016-10-03

    Transcription factor binding, histone modification, and chromatin accessibility studies are important approaches to understanding the biology of gene regulation. ChIP-seq and DNase-seq have become the standard techniques for studying protein-DNA interactions and chromatin accessibility respectively, and comprehensive quality control (QC) and analysis tools are critical to extracting the most value from these assay types. Although many analysis and QC tools have been reported, few combine ChIP-seq and DNase-seq data analysis and quality control in a unified framework with a comprehensive and unbiased reference of data quality metrics. ChiLin is a computational pipeline that automates the quality control and data analysis of ChIP-seq and DNase-seq data. It is developed using a flexible and modular software framework that can be easily extended and modified. ChiLin is ideal for batch processing of many datasets and is well suited for large collaborative projects involving ChIP-seq and DNase-seq from different designs. ChiLin generates comprehensive quality control reports that include comparisons with historical data derived from over 23,677 public ChIP-seq and DNase-seq samples (11,265 datasets) from eight literature-based classified categories. To the best of our knowledge, this atlas represents the most comprehensive ChIP-seq and DNase-seq related quality metric resource currently available. These historical metrics provide useful heuristic quality references for experiments across all commonly used assay types. Using representative datasets, we demonstrate the versatility of the pipeline by applying it to different assay types of ChIP-seq data. The pipeline software is available open source at https://github.com/cfce/chilin . ChiLin is a scalable and powerful tool to process large batches of ChIP-seq and DNase-seq datasets. The analysis output and quality metrics have been structured into user-friendly directories and reports. We have successfully compiled 23
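
    The comparison of a new sample's metric against a historical reference distribution can be sketched as a percentile-rank lookup; the FRiP values below are invented for illustration and are not ChiLin's actual atlas data.

```python
from bisect import bisect_left

# Sketch of judging a new sample's QC metric (e.g. FRiP, fraction of reads
# in peaks) against a historical reference distribution, the general idea
# behind ChiLin's comparisons to public samples. Values are invented.

def percentile_rank(historical, value):
    """Fraction of historical samples whose metric falls below `value`."""
    ranked = sorted(historical)
    return bisect_left(ranked, value) / len(ranked)

historical_frip = [0.01, 0.02, 0.05, 0.08, 0.12, 0.20, 0.30, 0.45]
print(percentile_rank(historical_frip, 0.10))  # new sample's FRiP score
```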

  3. Data Quality Control of the French Permanent Broadband Network in the RESIF Framework

    NASA Astrophysics Data System (ADS)

    Grunberg, Marc; Lambotte, Sophie; Engels, Fabien; Dretzen, Remi; Hernandez, Alain

    2014-05-01

    In the framework of the RESIF (Réseau Sismologique et géodésique Français) project, a new information system is being set up to improve the management and distribution of high-quality data from the different elements of RESIF and the associated networks. Within this information system, EOST (in Strasbourg) is in charge of collecting real-time permanent broadband seismic waveforms and performing quality control on these data. The real-time and validated data sets are pushed to the French National Distribution Center (Isterre/Grenoble) to make them publicly available. Furthermore, EOST hosts the BCSF-ReNaSS, in charge of the French metropolitan seismic bulletin, which allows the quality control to benefit from high-end checks based on the national and worldwide seismicity. Here we present first the real-time seismic data flow from the stations of the French National Broadband Network to EOST, and then the data quality control procedures that were recently installed, including some new developments. The data quality control consists of applying a variety of subprocesses to check the consistency of the whole system and processing chain from the stations to the data center. This allows us to verify that instruments and data transmission are operating correctly. Moreover, analysis of the ambient noise helps to characterize the intrinsic seismic quality of the stations and to identify other kinds of disturbances. The deployed quality control is a pipeline that starts with low-level procedures: checking the real-time miniSEED data files (file naming convention, data integrity), checking for inconsistencies between waveforms and metadata (channel name, sample rate, etc.), and computing waveform statistics (data availability, gaps/overlaps, mean, rms, time quality, spikes). It is followed by high-level procedures such as power spectral density (PSD) computation, STA/LTA computation to be correlated with the seismicity, phase picking and station magnitudes
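
    The STA/LTA step among the high-level procedures can be sketched as follows; window lengths, amplitudes and the trigger threshold are illustrative choices, not the EOST configuration.

```python
# Sketch of a classic STA/LTA detector of the kind used in the high-level
# QC stage: the ratio of a short-term to a long-term average of absolute
# amplitude rises sharply at an energy burst. All numbers are illustrative.

def sta_lta(trace, sta_len, lta_len):
    """Short-term over long-term average of absolute amplitude."""
    ratios = []
    for i in range(lta_len, len(trace)):
        sta = sum(abs(x) for x in trace[i - sta_len:i]) / sta_len
        lta = sum(abs(x) for x in trace[i - lta_len:i]) / lta_len
        ratios.append(sta / lta)
    return ratios

# Quiet noise followed by a burst, a crude stand-in for a seismic arrival
trace = [0.1] * 20 + [5.0] * 5
ratios = sta_lta(trace, sta_len=3, lta_len=15)
print(max(ratios) > 3.0)   # the burst lifts the ratio past a trigger level
```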

  4. Evaluation of statistical protocols for quality control of ecosystem carbon dioxide fluxes

    Treesearch

    Jorge F. Perez-Quezada; Nicanor Z. Saliendra; William E. Emmerich; Emilio A. Laca

    2007-01-01

    The process of quality control of micrometeorological and carbon dioxide (CO2) flux data can be subjective and may lack repeatability, which would undermine the results of many studies. Multivariate statistical methods and time series analysis were used together and independently to detect and replace outliers in CO2 flux...
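
    A simple univariate stand-in for the detect-and-replace step is a moving-window z-score despiking pass; the multivariate and time-series methods in the paper are more elaborate, and the window and threshold here are illustrative.

```python
import statistics

# Sketch of outlier screening for a flux time series: flag points more
# than z_max local standard deviations from the mean of their neighbours
# and replace them. Window, threshold, and flux values are illustrative.

def despike(series, window=5, z_max=3.0):
    """Replace points more than z_max local SDs from their local mean."""
    cleaned = list(series)
    half = window // 2
    for i in range(half, len(series) - half):
        neigh = series[i - half:i] + series[i + 1:i + half + 1]
        mu = statistics.mean(neigh)
        sd = statistics.stdev(neigh)
        if sd > 0 and abs(series[i] - mu) > z_max * sd:
            cleaned[i] = mu          # replace the outlier with the local mean
    return cleaned

flux = [-2.1, -2.0, -2.2, -15.0, -2.1, -1.9, -2.0]   # spike at index 3
print(despike(flux))
```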

  5. A Total Quality Leadership Process Improvement Model

    DTIC Science & Technology

    1993-12-01

    A Total Quality Leadership Process Improvement Model, by Archester Houston, Ph.D., and Steven L. Dockstader, Ph.D. Performing organization: Total Quality Leadership Office.

  6. AstroCloud, a Cyber-Infrastructure for Astronomy Research: Data Archiving and Quality Control

    NASA Astrophysics Data System (ADS)

    He, B.; Cui, C.; Fan, D.; Li, C.; Xiao, J.; Yu, C.; Wang, C.; Cao, Z.; Chen, J.; Yi, W.; Li, S.; Mi, L.; Yang, S.

    2015-09-01

    AstroCloud is a cyber-infrastructure for astronomy research initiated by the Chinese Virtual Observatory (China-VO) under funding support from the NDRC (National Development and Reform Commission) and CAS (Chinese Academy of Sciences) (Cui et al. 2014). To archive astronomical data in China, we present the implementation of the astronomical data archiving system (ADAS). Data archiving and quality control form the infrastructure of AstroCloud. Throughout the entire data life cycle, the archiving system standardizes data, transfers data, logs observational data, archives ambient data, and stores these data and metadata in a database. Quality control covers the whole process and all aspects of data archiving.

  7. 20 CFR 602.41 - Proper expenditure of Quality Control granted funds.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 20 Employees' Benefits 3 2011-04-01 2011-04-01 false Proper expenditure of Quality Control granted... LABOR QUALITY CONTROL IN THE FEDERAL-STATE UNEMPLOYMENT INSURANCE SYSTEM Quality Control Grants to States § 602.41 Proper expenditure of Quality Control granted funds. The Secretary may, after reasonable...

  8. 20 CFR 602.41 - Proper expenditure of Quality Control granted funds.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 20 Employees' Benefits 3 2010-04-01 2010-04-01 false Proper expenditure of Quality Control granted... LABOR QUALITY CONTROL IN THE FEDERAL-STATE UNEMPLOYMENT INSURANCE SYSTEM Quality Control Grants to States § 602.41 Proper expenditure of Quality Control granted funds. The Secretary may, after reasonable...

  9. Japanese Quality Control Circles.

    ERIC Educational Resources Information Center

    Nishiyama, Kazuo

    In recent years, United States scholars with an interest in international business and organizational communication have begun to notice the success of Japanese "quality control circles." These are small groups, usually composed of seven to ten workers, who are organized at the production levels within most large Japanese factories. A…

  10. Data Quality Assurance and Control for AmeriFlux Network at CDIAC, ORNL

    NASA Astrophysics Data System (ADS)

    Shem, W.; Boden, T.; Krassovski, M.; Yang, B.

    2014-12-01

    The Carbon Dioxide Information Analysis Center (CDIAC) at Oak Ridge National Laboratory (ORNL) serves as the long-term data repository for the AmeriFlux network. Datasets currently available include hourly or half-hourly meteorological and flux observations, biological measurement records, and synthesis data products. There is currently a lack of standardized nomenclature and of specifically designed procedures for quality assurance/control in processing and handling micrometeorological and ecological data at individual flux sites. CDIAC has bridged this gap by providing efficient and accurate procedures for data quality control and for standardization of the results, easing their assimilation by the models used in climate science. In this presentation we highlight the procedures we have put in place to scrutinize continuous flux and meteorological data within the AmeriFlux network. We itemize some basic data quality issues observed over the past years and include examples of typical problems. Such issues, e.g., incorrect time-stamping, poor calibration or maintenance of instruments, and missing or incomplete metadata, are commonly overlooked by PIs and invariably impact the time-series observations.
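
    Two of the issues itemized above, incorrect time-stamping and out-of-range values, lend themselves to simple automated screens; the record format and limits below are hypothetical, not CDIAC's actual schema.

```python
# Sketch of two basic screening checks for half-hourly flux records:
# timestamp continuity at the nominal step, and physical range limits.
# The (timestamp, value) format and the limits are hypothetical.

def qc_report(records, step=1800, limits=(-50.0, 50.0)):
    """records: list of (unix_timestamp, value) at a nominal 30-min step."""
    issues = []
    lo, hi = limits
    for i, (t, v) in enumerate(records):
        if i > 0 and t - records[i - 1][0] != step:
            issues.append((i, "timestamp gap/irregular"))
        if not (lo <= v <= hi):
            issues.append((i, "value out of range"))
    return issues

recs = [(0, 1.2), (1800, 2.5), (5400, 3.1), (7200, 99.0)]
print(qc_report(recs))   # flags the missing half-hour and the bad value
```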

  11. Evaluating the quality of a cell counting measurement process via a dilution series experimental design.

    PubMed

    Sarkar, Sumona; Lund, Steven P; Vyzasatya, Ravi; Vanguri, Padmavathy; Elliott, John T; Plant, Anne L; Lin-Gibson, Sheng

    2017-12-01

    Cell counting measurements are critical in the research, development and manufacturing of cell-based products, yet determining cell quantity with accuracy and precision remains a challenge. Validating and evaluating a cell counting measurement process can be difficult because of the lack of appropriate reference material. Here we describe an experimental design and statistical analysis approach to evaluate the quality of a cell counting measurement process in the absence of appropriate reference materials or reference methods. The experimental design is based on a dilution series study with replicate samples and observations as well as measurement process controls. The statistical analysis evaluates the precision and proportionality of the cell counting measurement process and can be used to compare the quality of two or more counting methods. As an illustration of this approach, cell counting measurement processes (automated and manual methods) were compared for a human mesenchymal stromal cell (hMSC) preparation. For the hMSC preparation investigated, results indicated that the automated method performed better than the manual counting methods in terms of precision and proportionality. By conducting well controlled dilution series experimental designs coupled with appropriate statistical analysis, quantitative indicators of repeatability and proportionality can be calculated to provide an assessment of cell counting measurement quality. This approach does not rely on the use of a reference material or comparison to "gold standard" methods known to have limited assurance of accuracy and precision. The approach presented here may help the selection, optimization, and/or validation of a cell counting measurement process. Published by Elsevier Inc.
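
    The precision and proportionality evaluation can be sketched as replicate CVs plus a zero-intercept fit of mean count against dilution fraction; the counts below are invented for illustration and the statistical model in the paper is more complete.

```python
import statistics

# Sketch of a dilution-series evaluation: replicate counts at each
# dilution fraction give a CV (precision), and a least-squares fit
# through the origin tests proportionality. Counts are invented.

def dilution_qc(fractions, replicate_counts):
    means = [statistics.mean(r) for r in replicate_counts]
    cvs = [statistics.stdev(r) / m for r, m in zip(replicate_counts, means)]
    # least-squares slope through the origin: count ~ slope * fraction
    slope = (sum(f * m for f, m in zip(fractions, means))
             / sum(f * f for f in fractions))
    residuals = [m - slope * f for f, m in zip(fractions, means)]
    return slope, cvs, residuals

fractions = [1.0, 0.75, 0.5, 0.25]
counts = [[100, 104, 96], [76, 74, 78], [49, 51, 50], [24, 26, 25]]
slope, cvs, residuals = dilution_qc(fractions, counts)
# small residuals relative to the counts indicate good proportionality
print(round(slope, 1), max(abs(r) for r in residuals) < 2.0)
```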

  12. 14 CFR 21.143 - Quality control data requirements; prime manufacturer.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 14 Aeronautics and Space 1 2011-01-01 2011-01-01 false Quality control data requirements; prime... describing assigned responsibilities and delegated authority of the quality control organization, together with a chart indicating the functional relationship of the quality control organization to management...

  13. 40 CFR 81.51 - Portland Interstate Air Quality Control Region.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Control Region. 81.51 Section 81.51 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) DESIGNATION OF AREAS FOR AIR QUALITY PLANNING PURPOSES Designation of Air Quality Control Regions § 81.51 Portland Interstate Air Quality Control Region. The Portland Interstate...

  14. 40 CFR 81.51 - Portland Interstate Air Quality Control Region.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... Control Region. 81.51 Section 81.51 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) DESIGNATION OF AREAS FOR AIR QUALITY PLANNING PURPOSES Designation of Air Quality Control Regions § 81.51 Portland Interstate Air Quality Control Region. The Portland Interstate...

  15. 40 CFR 81.51 - Portland Interstate Air Quality Control Region.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... Control Region. 81.51 Section 81.51 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) DESIGNATION OF AREAS FOR AIR QUALITY PLANNING PURPOSES Designation of Air Quality Control Regions § 81.51 Portland Interstate Air Quality Control Region. The Portland Interstate...

  16. Assessing the quality of radiographic processing in general dental practice.

    PubMed

    Thornley, P H; Stewardson, D A; Rout, P G J; Burke, F J T

    2006-05-13

    Objective: To determine whether a commercial device (Vischeck) for monitoring film processing quality was a practical option in general dental practice, and to assess processing quality among a group of GDPs in the West Midlands with this device. Design: Clinical evaluation. Setting: General dental practice, UK, 2004. Methods: Ten GDP volunteers from a practice-based research group processed Vischeck strips (a) when chemicals were changed, (b) one week later, and (c) immediately before the next change of chemicals. These were compared with strips processed under ideal conditions. Additionally, a series of duplicate radiographs was produced and processed together with Vischeck strips in progressively more dilute developer solutions to compare the change in radiograph quality assessed clinically with that derived from the Vischeck. Results: The Vischeck strips suggested that, at the time chosen for change of processing chemicals, eight dentists had been processing films well beyond the point indicated for replacement. Solutions were changed after a wide range of time periods and numbers of films processed. The calibration of the Vischeck strip correlated closely with a clinical assessment of acceptable film quality. Conclusions: Vischeck strips are a useful aid to monitoring processing quality in automatic developers in general dental practice. Most of this group of GDPs were using chemicals beyond the point at which diagnostic yield would be affected.

  17. 30 CFR 28.32 - Proposed quality control plans; approval by MSHA.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 30 Mineral Resources 1 2011-07-01 2011-07-01 false Proposed quality control plans; approval by...-CIRCUIT PROTECTION FOR TRAILING CABLES IN COAL MINES Quality Control § 28.32 Proposed quality control plans; approval by MSHA. (a) Each proposed quality control plan submitted in accordance with this...

  18. 30 CFR 28.32 - Proposed quality control plans; approval by MSHA.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 30 Mineral Resources 1 2012-07-01 2012-07-01 false Proposed quality control plans; approval by...-CIRCUIT PROTECTION FOR TRAILING CABLES IN COAL MINES Quality Control § 28.32 Proposed quality control plans; approval by MSHA. (a) Each proposed quality control plan submitted in accordance with this...

  19. 30 CFR 28.32 - Proposed quality control plans; approval by MSHA.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 30 Mineral Resources 1 2013-07-01 2013-07-01 false Proposed quality control plans; approval by...-CIRCUIT PROTECTION FOR TRAILING CABLES IN COAL MINES Quality Control § 28.32 Proposed quality control plans; approval by MSHA. (a) Each proposed quality control plan submitted in accordance with this...

  20. 30 CFR 28.32 - Proposed quality control plans; approval by MSHA.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 30 Mineral Resources 1 2014-07-01 2014-07-01 false Proposed quality control plans; approval by...-CIRCUIT PROTECTION FOR TRAILING CABLES IN COAL MINES Quality Control § 28.32 Proposed quality control plans; approval by MSHA. (a) Each proposed quality control plan submitted in accordance with this...